State of Change, Chapter 14: Engineering and Construction

See if any of this sounds familiar to you:  The productivity of a major American industry is said to be in perpetual decline.  An antidote turns up in the form of a proposal first made back in the 1970s by academics.  Naturally, skepticism ensues.  According to this proposal, technology makes feasible a new type of business interaction that could single-handedly revolutionize the industry and potentially cure its productivity problem.

For the remainder of the twentieth century, the bandwidth and computing power necessary to make the proposal feasible never exist in one place.  Ironically, high-powered workstations serve as clients for creating on-premise 3D visualizations of the industry's critical data.  So the computing power does exist, just in the wrong spot.  What's more, it's being used inefficiently: underutilized, wasted.  In 2007, along comes a cloud computing-based distribution model.  Suddenly it's possible for all that computing power to be pooled into a single data center, where heavy-duty processing power can be harnessed by thinner, lighter-weight clients.  A new class of software emerges to enable this new model.  And the academics who for thirty-plus years were stuck preaching to the choir now find themselves the exalted heroes of a new and emerging industry.

Within just two years, the industry's market saturation rate for this new class of software catapults from zero to 89 percent, according to an industry survey.  But a realization emerges: many of the firms that have invested in this new SaaS model and in emerging delivery platforms don't actually know what these tools are or how they're supposed to work.  And many are equally bewildered as to how they can possibly implement the changes to their business and to their culture that this new system of work demands, especially since equivalent strategy shifts are required of every partner with whom they do business.

Eventually, the roadblock is described like this:  The strategy of the business and the interests of information technology fail to align.

Perhaps your hand is raised sky-high, and you’re eager to claim this story for yourself.  And to be fair, this truly is your story to at least some degree, no matter where you work.  For a quarter-century we’ve had the ingredients for a modern computing network at our disposal, and we merely disposed of them. But perhaps nowhere else in America are the truths of this particular scenario more pronounced, more clearly black-and-white, than in construction and building engineering.  There is nothing the least bit vague about what is happening here.

Curing the Productivity Plague

The backdrop for any story regarding the construction industry is painted with a very dark ink provided by the U.S. Bureau of Labor Statistics, using data compiled by the U.S. Census Bureau.  Recently, Stanford University Professor Paul Teicholz compiled data for U.S. labor productivity in various industry segments, dating back to 1964.  After adjusting for inflation’s effect on these industries’ respective producer price indices, Teicholz demonstrated that, if the overall productivity of all U.S. non-farm labor for 1964 for each dollar invested was assessed at an index value of 100, the relative productivity for 2012 would eclipse 250.  That means that, for a unit of 2012 money that’s as valuable as $1 was in 1964, a business was likely to reap two-and-a-half times the reward in productivity.

But extract the construction industry from the rest of the data, and the dark ink becomes foreboding.  A 2012 unit of money that's as valuable as $1 was in 1964 would buy you about 92¢ of construction labor today, not $2.50.
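Teicholz's index comparison reduces to simple arithmetic; here is a quick sketch using the approximate index values cited above (100 for the 1964 baseline, roughly 250 for 2012 non-farm labor overall, roughly 92 for construction):

```python
# Productivity indexed to 1964 = 100, per the Teicholz figures cited in the text.
BASE_1964 = 100

nonfarm_2012 = 250        # all U.S. non-farm labor, 2012 (approximate)
construction_2012 = 92    # construction labor, 2012 (approximate)

def payoff_per_1964_dollar(index_2012, base=BASE_1964):
    """Productivity reaped in 2012 per unit of inflation-adjusted 1964 money."""
    return index_2012 / base

print(payoff_per_1964_dollar(nonfarm_2012))       # 2.5  -> $2.50 of output
print(payoff_per_1964_dollar(construction_2012))  # 0.92 -> about 92 cents
```

The same 1964 dollar that grew into $2.50 of productivity across the economy shrank to 92¢ in construction.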


It's amid this exact backdrop, this very element of Labor Statistics data, that Teicholz and the other authors of Wiley & Sons' BIM Handbook present their case for completely replacing present-day construction engineering management with a way of working that the twentieth century would have welcomed, had it been at all possible then.  BIM in this case is Building Information Modeling, and its premise is devastatingly simple:  An information system should contain every element of data about the architecture, construction, and maintenance of a building throughout its lifecycle, from the idea phase to the present day, including any and all known data about its future.

The book's principal author is Prof. Chuck Eastman, who directs the Digital Building Lab at Georgia Tech.  Eastman describes BIM as a process for producing a machine-readable model of a building that can be utilized by anyone involved in building or maintaining it, whether the original architect or an HVACR (heating, ventilation, air conditioning, and refrigeration) technician.  Rather than simply virtualizing existing data from scanned-in drawings, BIM utilizes databases of the form created for 3D computer-aided design, so that all of a building's component parts have parameters and rules in relation to one another.

Figure 14.1.  The layered components of the Building Information Model (BIM) as originally conceived by its creators at WSP Group.

The upshot is that when a change is made to one component, the alteration is reflected in all the others automatically.  It's the type of freedom that financial planners have enjoyed with spreadsheets ever since VisiCalc, where the effects of changes are automatically reflected in the results.  The initial digital model of a building as conceived by architects evolves directly into the working digital model of the building as it is being constructed, and then into the active model of the building for maintenance and mechanical engineers.
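The spreadsheet analogy can be made concrete with a toy parametric model (a deliberate simplification for illustration, not Revit's actual data model): components carry parameters, rules derive values from them, and a change to one component automatically shows up in everything that depends on it.

```python
# Toy illustration of BIM-style parametric rules: components store parameters,
# derived values are recomputed on demand from those parameters, so a change
# to one component is reflected everywhere that depends on it.

class Component:
    def __init__(self, **params):
        self.params = params

class Model:
    def __init__(self):
        self.components = {}
        self.rules = {}  # name of a derived value -> function over the model

    def add(self, name, **params):
        self.components[name] = Component(**params)

    def rule(self, name, fn):
        self.rules[name] = fn

    def derive(self, name):
        # Recomputed from the components' *current* parameters every time.
        return self.rules[name](self)

m = Model()
m.add("wall", length_m=12.0, height_m=3.0)
m.add("window", width_m=1.0, height_m=1.5, count=4)

# Rule: net wall area = gross wall area minus the window openings.
m.rule("net_wall_area_m2", lambda mdl:
       mdl.components["wall"].params["length_m"] * mdl.components["wall"].params["height_m"]
       - mdl.components["window"].params["width_m"] * mdl.components["window"].params["height_m"]
       * mdl.components["window"].params["count"])

print(m.derive("net_wall_area_m2"))             # 30.0
m.components["wall"].params["length_m"] = 15.0  # change one component...
print(m.derive("net_wall_area_m2"))             # 39.0 ...and the derived value follows
```

A real BIM database applies the same principle across thousands of interdependent components, which is what lets an architect's change ripple through the contractor's and engineer's views of the same building.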

The repercussions of adopting such a system may not be obvious to anyone outside of construction and engineering:  Up until now, a huge part of the ongoing construction planning process has involved the drafting of drawings depicting the changes made, on the premise that a picture is worth a thousand words (or, in the case of government-mandated document compliance, fifty thousand).  The adoption of BIM by all the stakeholders in building construction and maintenance could, theoretically, eliminate the need for all of this drafting work... 19th century laws notwithstanding.

BIM presents perhaps the perfect use-case scenario for one key element of cloud dynamics: the pooling together of multiple work processes into a single, shorter sequence.  Prof. Eastman, in a recent video promoting the latest edition of his book, describes it this way:

Building Information Modeling provides the opportunity for creating the iconic buildings of the 21st century.  Putting a building together in the computer before it’s put together on-site allows one to catch errors, understand how things are actually fabricated, encourages much closer collaboration between the fabrication team and the architects, and... provides the communication vehicle for allowing their efforts to be integrated.

But just as science fiction asks its audience to presume that the light-speed barrier has already been broken, BIM idealists make a plea to their fellow practitioners on a similarly fantastic scale:  BIM begs you to imagine a world where all the bartering, negotiation, bidding, reconciliation, and politics of managing a construction project are miraculously replaced with a nucleus of free-flowing, abundant logic.  It assumes that several of the laws mandating how contractual work is bid for, assessed, and approved can just wink out of existence.  And it assumes that something contractors still refer to as the construction supply chain, the fundamental order of cause-and-effect upon which the construction universe is based, can be reduced to a single link.

The Thin Veneer

The rules of everyday work are not as easily disrupted as the foundations of application platforms.  Nevertheless, a 2012 McGraw-Hill survey revealed that as many as 74 percent of general contractors (GCs) may use BIM applications, compared with 70 percent of architects and 67 percent of engineers.  The suggestion is that, even though theoretically it's the architects who create the data in the first place, it's the building contractors who have displayed the greatest willingness to make better use of it.

That said, all those who have become BIM users in this first decade of its existence may not yet be actively subscribing to BIM’s principles.  They may be using BIM’s design tools internally, perhaps as a way to show off their designs to potential clients, but then reverting to conventional 2D plotted documents in order to meet the conventions of everyday business transactions.

There's actually a catch-phrase for this kind of process throughout the construction industry: "Hollywood BIM," they call it, a kind of façade like the false sets of a Western movie lot.  It could also be a way for certain of BIM's advocates to poke fun at BIM application users who "don't get it," and perhaps to justify the notion that re-educating these folks about higher-order methods of 3D modeling and architecture will eventually cure them.

What's hindering them is by no means a lack of architectural skill.  Real estate management consulting firm Brookwood Group recently published this recommendation for modern best practices (PDF available here) for the traditional delivery method for construction, called design-bid-build.  It's a process that was developed and formalized in the 19th century in response to the need to break the monopoly that architectural firms held over construction.  Brookwood divides this process into three discrete stages, and from there into a multitude of sub-stages.  Summarized (since you only have so much time in your day), it works like this:

  • In the pre-design phase, the project owner selects a construction site, and consults with a management service to determine the legal and technical constraints inherent in building on that site.  Once those legal hurdles are cleared, a database is created called the Program Management Information System (PMIS).  Using that system, the project owner produces a document called the Program of Facility Requirements, which warns all the stakeholders about the resources that will be involved prior to drawing up the project budget.  With the budget drawn up, a Master Project Schedule is worked out, prior to the opening of the project to bids from architects and engineers.

  • The design phase begins after contracts are awarded.  The various contractors must produce their cost estimates in keeping with a multi-track costing program, which may be prescribed by law depending on the area.  All financing that is done during this phase must take place using a coordinated set of documents that are in keeping with the formats of all the contractors’ respective financial agencies of choice.

  • In the construction phase, competitive bids are received from contractors, and the laws of respective states and municipalities may regulate how bids must be reviewed and eventually awarded, including the format in which those bids are presented.

Back in the 19th century (which, for the record, somewhat pre-dates the first version of ArchiCAD), the multitude of documents exchanged between business partners, both prospective and awarded, involved draftsmanship.  Especially for public works projects, state laws — some of which have literally been in place since that time — have specified the formats of those documents.  In many cases today, paper documents are drafted (certainly not by hand) as a formality, as a way for firms to demonstrate they’re abiding by the law and adhering to standards of fairness and transparency. 

Business advisory firm KPMG studied the problems of 19th century delivery models in public works and infrastructure projects (PDF available here), such as the desperately needed reconstruction of the nation’s bridges and public roadways.  A team led by Virginia Tech Professor Michael J. Garvin drew this conclusion:

One of the limitations of the design-bid-build project delivery system is that the owner only sees the architectural and engineering solution of one service provider, with one combination of cost, quality, and time attributes. But a single provider is never in a position — technically or financially — to fully consider and compare all alternatives for design, technology, initial costs, or lifecycle costs. As a result, the service provider’s limitations become the owner’s.

The ideal for BIM is the integration of all the data utilized by all the stakeholders in a project into one unit, thereby giving the project owner visibility into every part of the project’s prospective lifecycle costs, and also giving prospective GCs extensive insight as to the architects’ intentions.  Furthermore, the generation of the 3D model at the beginning of the pre-design phase should enable the automated production of any supplementary drawings and models as existing laws and building standards may require.  But this ideal is not what’s happening.

In the March-April 2013 edition of the International Journal of Engineering Research and Applications, two researchers from Sweden’s Chalmers University of Technology drew a conclusion (PDF available here) that readers of this series up to now will find ominously familiar:  The “Hollywood BIM” phenomenon, among a few others they note, arises when each of the separate components of the construction value chain utilizes BIM independently of the other, not to integrate data and make the construction process easier, but instead, the researchers state, “to win the jobs.”  The phenomenon that arises as a result of “Hollywood BIM,” however, is just as starkly black-and-white as any other aspect of this story:  Different executives from the separate components develop unique sets of expectations for BIM.  And therefore software vendors subdivide their marketing campaigns to address those unique sets separately.  As the Chalmers team writes (translated from Swedish):

Having a thorough review of empirical studies that have been undertaken in the area of BIM, it is evident that not only do our perceptions of BIM differ greatly from one person to another, but also our expected outcomes of using BIM differ for many reasons; having different definitions, or unfamiliarity with all of BIM's potential uses, may result in different expectations.  These different expectations may hinder full use of BIM's abilities; therefore, BIM's potential abilities hide under the users' expectations, and only some parts of BIM's potential may be used.  Some authors [of researched works on BIM] named this phenomenon "BIM Consideration"...  In this phenomenon, construction managers may not be able to understand the full potential of BIM, so its uses could be isolated due to misunderstanding of its meanings and uses by top managers.

There are various stakeholders that interact when BIM is utilized.  Evidently different organizations and people create their own definitions of BIM, based on the specific way they work with BIM.  Thus, it is evident that there are differences in the way BIM is perceived by both different individuals and organizations within the construction industry.  As a result, it might be difficult to come up with a common definition of BIM for the entire construction industry.

Here is where a concept as simple as "open access" becomes bifurcated to the point of unrecognizability.  To counter this trend, an industry-wide effort to standardize BIM practices and formats is already under way in the U.S., in the form of version 2 of the National Institute of Building Sciences' (NIBS) National BIM Standard.

The lifecycle of building information has taken on a serpentine path, as it's translated from 3D to 2D to 3D to 2D, over and over indefinitely.  The last genuine estimate of the building industry's actual cost of waste on account of redundant processes and unnecessary paperwork came in 2004, when the Construction Industry Institute projected that as much as 57% of all dollars invested in construction are non-value-added: as much as $600 billion per year.  Although forecasts for the growth of wasted spending were dire at that time, the proliferation of even newer sources of redundancy and waste has made any rational estimate of today's cost of waste impossible.  Charged with the task of determining where those wasted dollars are going, the National Institute of Standards and Technology ended up saying nothing conclusive at all, as this NIBS document (PDF available here) explains:

While the NIST study and others have identified the loss of billions of dollars a year from inefficient business practices, we have not been able to identify the specific sources of those dollars in order to be able to redirect them to solve the problem.  The primary reasons are that the dollars are widely distributed and that most practitioners have an accepted way of doing business such that the imbedded waste and ways to improve are not readily seen.  Hence, the industry makes small incremental improvements to inefficient processes instead of the substantive changes required that involve the entire capital facilities industry.

Put another way, the same forces that are preventing BIM from being used for the purpose for which it was originally intended are preserving the partitions of the 19th century business model... with the implication that some unseen force somewhere is profiting from this.

Fixing the Arrows

Analysts believe the market share leader in BIM software to be Autodesk, the firm that led the CAD revolution of the 1980s with AutoCAD.  Its BIM application is called Revit, and as a result, the electronic documents the application produces to describe building models are called "Revits."  These "documents" are, in fact, colossal databases, which even after being optimized for distribution can exceed 1.5 GB each.  Revit achieved prominence in 2004 when it was selected for the management of New York City's Freedom Tower project.  One can only imagine the size of the Revit for that building.

Only in recent years have flash cards grown large enough, and cheap enough, to make it feasible for a project owner to submit the requisite design-bid-build documents to the designated stakeholders and authorities with flash cards attached to them by bits of masking tape.  Before then, architectural and construction firms had been using FTP to shuttle multiple, evolving versions of Revit databases in monstrous overnight or even weekend uploads.  By 2009, the problem of the massive scale of Revits, which only grew with each new point release of the application, was being tackled by IT experts.  They had begun serious investigations into enabling business partners to co-own the data warehouses, and to actively integrate their respective Revits through regular and intense synchronization processes.

It was then that a pair of Las Vegas casino architects considered this simple prospect:  What if the telecommunications lines they used to connect their firm’s nationwide branch offices could be replaced with a central cloud platform?  Even with WAN accelerators in place, the time these offices consumed in merely opening a Revit could average a whole hour.

The solution these architects devised for themselves was a leased private cloud — a setup where public and private resources are pooled together, but the firewall remains contiguous and administration remains on-premise.  But following the principle that any true cloud services consumer may become a cloud services provider, they ended up selling their private BIM architecture as a model for other firms.  Their new firm is called BIM9, and its selling point is what BIM was supposed to have been to begin with: across-the-board integration.

“The philosophy became, instead of trying to move the data to the end-user with a high-speed workstation,” explains BIM9 marketing representative Brian K. Smith in a company video, “they would move the user to the data on a private BIM cloud.  This way, full-time employees, remote part-time employees, and contractors would all benefit for a fraction of the cost of the old leased-line concept.”

Smith goes on to explicitly show competitors' diagrams of the separate partitions of the building value chain, all joined to the centralized BIM axis like dancers around a maypole.  These diagrams are intended to demonstrate BIM lifecycle management, he suggests, as if the existing lifecycle were something we really wanted or needed to manage.  The alternative Smith presents blows the lifecycle model out of the water.  "No one really talks about how everyone can really all connect to, and benefit from, the same model at the same time," he says.

The process of reforming the existing lifecycle model into a modern usage model is what BIM9’s Smith calls “fixing the arrows” — specifically, the ones in those diagrams.  They’re intended to represent all the components of the architectural/engineering/construction (AEC) supply chain being brought together by one centralized component, BIM — which, you might think, is the right idea.  But what they may end up representing is the sequence of exchanges between stakeholders (usually building owners, architects, mechanical engineers, electrical engineers, construction managers, civil engineers, interior designers, and general contractors), with a depiction of BIM as a kind of broker for the shuttling of data — including those transferred Revit files — between these various silos.

It’s not the stakeholders that are the problem in this model, but the existence of a sequence that’s applied to BIM.  It implies that the lifecycle of a building should be reflected in the lifecycle of the data.  If healthcare worked this way, a detailed study of your physical condition would never be generated until you had major surgery, and the end result of every sequence would be the part where the mortician enters the picture.

The ideal situation that modern BIM proponents advocate is an emerging model they call integrated project delivery (IPD).  The integration removes the sequence arrows from the diagram, introducing in their place a collaborative process where any stakeholder is involved at any time when it’s appropriate, and the building owner is involved perpetually.

One case study for the potential benefits from involving building owners in IPD comes from 26-year veteran architect and Autodesk industry programs manager Robert E. Middlebrooks.  The nation’s call for greater energy efficiency is leading building owners, Middlebrooks points out, to choose upgrading their existing buildings over replacing them.  The data that renewable energy engineers, for example, need to be able to model reliable improvements to existing structures can only come from two sources: maintenance engineers, and the building owners who hire them.  And in many cases, especially with decades-old buildings, the former party no longer exists.

Even in the absence of reliable data, however, building owners can still undertake their own efforts to supply information that engineers and re-architects may require, including simple digital photography of every surface.  Integrating the data that the building owner can provide with laser scans supplied by contracted engineers, adds Middlebrooks, can multiply the opportunities for discovering energy savings, as he notes in the case of one hospital expansion project in Katy, Texas.

Tectonic Shifts

Implementing data integration among multiple stakeholders on the scale that IPD demands implies two fundamental overhauls in the infrastructure of the data center.  This may be true to some extent for other industries besides AEC.

The first overhaul is formidable enough:  Laws in place throughout the country, on levels ranging from state to municipal, mandate the exchange of documents between specifically identified stakeholders.  States including California have debated, and in some cases passed, resolutions granting authority for certain public works projects to allow a streamlined supply chain model called design-build, which enables a single architect firm to maintain control over the building of the project.  But that type of streamlining, opponents point out, not only counteracts the goals of BIM and IPD but moves America from a 19th to a 17th century business model.  A 2009 public document from California’s State Senate Committee on Local Government points out the following:

Design-build is not without its disadvantages.  Because the owner does not fully define the project upon entering into a contract, the owner gives up control over design and construction quality.  Furthermore, because the designer and builder are on the same team, they share a financial incentive to reduce quality to increase their profits.  Critics also say design-build results in more expensive change orders and opens the door to favoritism in the selection process.

Opening up the debate for streamlining the legal process for architects and engineers has triggered, and will trigger again, a subsequent debate on the merits of embracing once again a business model that some say favors monopoly over efficiency.

The second fundamental infrastructure shift deals with the ownership of BIM data.  In previous articles in this series, I discussed businesses, including one pharmaceutical firm, that realized substantial benefits from surrendering the idea that it must own the systems and the data that comprise its information services.  A truly effective IPD system — which would, no doubt, be cloud-based — could transfer the ownership of the entire BIM model, throughout its entire lifecycle, to a kind of escrow.  In that escrow, individual stakeholders would be assigned separate and distinct roles.  The level and extent of their access to that data would be managed independently by an administrator.  And in a co-ownership model, that admin may very well be outside of any of the stakeholder firms.

Eric Chiu is the president of HyTrust, an IT security services provider that specializes in access control, a relatively new model of security in which no single role has unfettered access to data.  A theoretical, cloud-based co-ownership system for managing BIM data could utilize access control to give all stakeholders precisely the access they need to building data at the appropriate times in the building's lifecycle, without mutating the data with exports and imports in an effort to follow that lifecycle.

In an interview with me, Chiu explains one of access control’s fundamental concepts:

The only way you can secure against breaches and data center disasters is by making sure that you limit access to sensitive operations and sensitive data, and you make that automated in technology — and you also have the right level of what we call role-based monitoring, that tracks all of the administrator activity and compares it against what they should be doing, alerting you as to when potentially anomalous or bad things are happening.
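What Chiu describes can be sketched minimally: permissions scoped to a role and a lifecycle phase, with every access attempt logged for role-based monitoring.  The roles, phases, and permission table below are hypothetical illustrations for a shared BIM model, not HyTrust's product.

```python
# Minimal sketch of role-based access control over a shared BIM model.
# Each role is granted specific operations per lifecycle phase, every
# request is checked against the grant table, and every attempt is
# logged -- so no single role holds unfettered access to the data.

PERMISSIONS = {
    # (role, phase) -> operations allowed on the shared model (hypothetical)
    ("architect",   "design"):       {"read", "write"},
    ("architect",   "construction"): {"read"},
    ("contractor",  "construction"): {"read", "annotate"},
    ("maintenance", "operation"):    {"read", "annotate"},
    ("owner",       "design"):       {"read"},
    ("owner",       "construction"): {"read"},
    ("owner",       "operation"):    {"read"},
}

audit_log = []  # role-based monitoring: every access attempt is recorded

def check_access(role, phase, operation):
    """Return True if the (role, phase) pair is granted the operation."""
    allowed = operation in PERMISSIONS.get((role, phase), set())
    audit_log.append((role, phase, operation, "ALLOW" if allowed else "DENY"))
    return allowed

print(check_access("architect", "design", "write"))         # True
print(check_access("contractor", "construction", "write"))  # False: logged for review
```

The denied attempt stays in the audit log, which is the hook for the anomaly alerting Chiu describes: a monitoring process compares logged activity against what each role should be doing.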

We’ll focus on the new security models in a future article in this series.  For now, it’s important to note that cloud technologies are making feasible systems which effectively make data management more efficient and more secure by coaxing their clients to cede the exclusive ownership of that data.  It sounds preposterous on its face, but the idea is essentially the same as a bank that uses vaults.  When data is placed in a trust, and the keys to that trust are delegated by access control, the documents and models that are critical to multiple organizations simultaneously can be shared not by transfer or export or attaching to an e-mail message (the three most dangerous processes in IT), but through true collaboration.

There are many businesses — and building construction is just one — where existing laws consider collaboration dangerous, if not forbidden.  The walls of many organizations’ silos truly are made of steel.

The case of the construction and engineering industries is the clearest example available of technology failing to change the world by itself.  In reality, technological evolution seldom translates into cultural or business or economic evolution.  When such changes do happen, some other force always serves as the catalyst.  And in this case, there are obvious forces which serve as inhibitors.

Previous: State of Change, Chapter 13: Manufacturing
Next: State of Change, Chapter 15: Retail