CHAPTER EIGHT

INTEGRATED PROCESS MODEL

INTEGRATED WORK PROCESS FOR PROJECT DELIVERY

The concept of the IPM evolves from all three investigative aspects of the synthesis study (see Figure 13).

1. From the case studies, several of the agencies studied reveal innovative concepts in various functional areas and stages of maturity that can be combined to form a complete project life-cycle model. Most significant are the New York processes that initiate 3D modeling in the planning and design functional areas. This process change, from two-dimensional (2D) survey and design, is also a quickly growing technique in the vertical construction industry segments (commercial, plant/process/manufacturing) known as building information modeling (BIM). To date, however, the BIM concept has not been documented in the literature as having functions in the construction, operations, or maintenance life-cycle stages, although one theory is to encapsulate all project data within the BIM. FDOT has taken existing, reliable technology (CD-ROM media, digital signatures, etc.) and created a repository and archival mechanism that collects data throughout all of the functional areas or life-cycle stages. By melding these two concepts, the IPM is the 3D modeling concept where the model also becomes the vessel of storage, retrieval, sharing, and archiving.

2. From the literature review, it is apparent that technology now exists to realize an IPM or TIM. One of the major barriers to realization is the myriad of differing (proprietary) data formats that must contribute to and retrieve from the TIM. It is not practical that a TIM software application would have the ability (or knowledge of the proprietary trade-secret code) to read literally hundreds of different data formats of programs that require access to the model (e.g., CAD, design, scheduling, financial, and document applications). For this reason it is imperative that all software applications accessing the model do so in just a few different data formats. Standardization of data formats guarantees that all contractual parties to the construction contract can communicate with the model. One can think of the TIM as a business office comprised of a dozen or more persons. It takes all of them to successfully run the office. If each person in the office spoke a different language, how efficient would the office be with a dozen translators added to make communication possible? In the same way, the TIM needs a universal language with which all of the participants can communicate digitally. Our literature review revealed several national efforts to standardize the data formats as well as the structure and meaning of the data.

3. From the initial survey, it is apparent that DOTs are in various stages of adopting digital data exchange across functional areas.

FIGURE 13 Integrated process model.

GAPS AND SOLUTIONS

According to our case studies, besides software application interoperability in general, the primary gap in the realization of TIM delivery is that currently the methodology begins to atrophy during and after the construction (project) life-cycle stage. That is to say, there was no evidence of a TIM being utilized beyond the procurement life-cycle stage. This phenomenon was reinforced by the literature review pertaining to BIM project delivery.

Figure 14 displays TIM datasets and their migration and use across functional areas. No extensions of the data in the successive project life-cycle stages or functional areas were found. The exception is FDOT, which uses datasets in file structures burned onto CD-ROMs throughout all of the life-cycle stages; however, this is not the true TIM paradigm.

FIGURE 14 TIM model data shown across partial project life-cycle stages (NYSDOT).

Software Interoperability

Software interoperability is currently possible in three main ways at the data file level:

1. Bits of data (binary digits of 0 and 1) can be transferred between files by machine (hardware) interpretation.
2. Data can be converted between differing file structures by means of a common data map, which requires export/import between the software applications (data exchange).
3. All parties to a work process use the same software applications or data file formats, whether standardized or proprietary. This is not typically feasible in most construction delivery scenarios, as no proprietary vendor markets software applications that satisfy all requirements.

Regarding method two, some basics in software application data exchange are useful. Until the last five years or so, data were exchanged between software with differing file formats by exporting from one program and importing into another utilizing a standard file format called ASCII text. This is simply alphanumeric and text characters that can be viewed in a text editor (e.g., Microsoft WordPad). Word processors also allow the data to be saved in a text format with a .txt file extension. Data fields in ASCII text format can be separated in several ways using spaces, commas, and other types of separators. For an export/import to occur between two separate software applications, the ASCII text fields must be of the same order and type between the two programs.

Figure 15 displays a typical database table consisting of field names ordered from left to right. Each intersection of a field name and record represents a field of data. Data fields consist of text characters of either alphabetic, numeric, or alphanumeric types and a defined number of characters (letters or numbers). In Figure 15, the lower table has some familiar field names, and it is easy to imagine the data values inside the fields.
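The ASCII export/import constraint just described, that field order and type must be identical on both sides of the exchange, can be sketched in a few lines. This is an illustrative sketch only; the field names and the comma separator are assumptions, not taken from the chapter's figures.

```python
import csv
import io

# Hypothetical shared field layout; both applications must agree on this
# exact column order for a positional ASCII exchange to work.
FIELDS = ["item_no", "description", "quantity", "unit_price"]

def export_ascii(records):
    """Sending application: write records as comma-separated ASCII text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for rec in records:
        # Fixed field order: the receiver assumes exactly this sequence.
        writer.writerow([rec[f] for f in FIELDS])
    return buf.getvalue()

def import_ascii(text):
    """Receiving application: map columns back to fields by position."""
    records = []
    for row in csv.reader(io.StringIO(text)):
        # Positional mapping: column i is blindly assigned to FIELDS[i].
        # If the sender used a different column order, values land in the
        # wrong fields -- the failure mode the text describes.
        records.append(dict(zip(FIELDS, row)))
    return records

data = [{"item_no": "101", "description": "Guardrail",
         "quantity": "250", "unit_price": "18.50"}]
round_tripped = import_ascii(export_ascii(data))
```

Because the mapping is purely positional, the round trip succeeds only while both programs keep the same field order; reordering either side silently corrupts the exchange.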
OCR for page 30
FIGURE 15 Database tables.

FIGURE 16 ASCII text data exchange.
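The table structure described with Figure 15, typed data fields of a defined width intersecting records, can be sketched as a small validation routine. All field names, types, and widths below are illustrative assumptions, not the contents of the actual figure.

```python
# A field definition as described for Figure 15: a name, a character type
# (alphabetic, numeric, or alphanumeric), and a defined number of characters.
FIELD_DEFS = [
    ("pay_item", "alphanumeric", 10),
    ("county",   "alphabetic",   20),
    ("quantity", "numeric",       8),
]

def check_field(value, ftype, width):
    """Validate one data field against its declared type and width."""
    if len(value) > width:
        return False
    if ftype == "numeric":
        return value.isdigit()
    if ftype == "alphabetic":
        return value.isalpha()
    return value.isalnum()  # alphanumeric

def check_record(record):
    """A record is one table row: every value must satisfy its field definition."""
    return all(check_field(record[name], ftype, width)
               for name, ftype, width in FIELD_DEFS)

ok = check_record({"pay_item": "550A1020", "county": "Leon", "quantity": "1200"})
bad = check_record({"pay_item": "550A1020", "county": "Leon4", "quantity": "1200"})
```

The second record fails because "Leon4" is not purely alphabetic, which is exactly the kind of type mismatch that breaks an exchange between programs.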
OCR for page 31
For an ASCII text data transfer to be successful, the fields must be of the same name, order, and type, as shown in Figure 16. If the mapping is not exact, the exchange will not occur correctly, if at all.

Approximately five years ago a data exchange technique using XML came into mainstream use as a more efficient method of exchanging data than ASCII text. One of the reasons for this popularity is that the data field size and types must still be matched, but the data order or configuration does not have to match the destination data repository, which is to say that the data can be structured in any manner and still be imported (the data are independent of format and of tabular order). Figure 17 attempts to display the difference between this exchange and the ASCII methodology of Figure 16.

With XML, as long as the two software applications share a common field name or label, and the data fields are structured similarly, exchange will occur. As long as the separate software programs share the same field names (or schema), the data will find their appropriate places in the database on exchange.

The following text structure displays schedule data in XML format as a separate example. [The XML markup of the sample file test.xml did not survive the machine reading; its surviving data values include 2006 MDOT Spread 2; As Bid, 11/25/06, ACME, Inc., Ricky N. Dyess, a series of activity start and finish timestamps, and workday/calendar settings.]
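The order-independence of an XML exchange can be sketched as follows. Because the markup of the chapter's test.xml example was lost, the tag names here are assumptions chosen for illustration only.

```python
import xml.etree.ElementTree as ET

# Hypothetical shared schema labels; the actual tags of test.xml are unknown.
SCHEMA = {"Project", "Contractor", "Start", "Finish"}

def import_xml(text):
    """Match data to fields by tag name (schema label), ignoring element order."""
    record = {}
    for element in ET.fromstring(text):
        if element.tag in SCHEMA:  # a parser directs data to the proper field
            record[element.tag] = element.text
    return record

# The same schedule data serialized by two applications in different orders.
a = ("<Schedule><Project>Spread 2</Project><Contractor>ACME, Inc.</Contractor>"
     "<Start>2007-11-01</Start><Finish>2008-08-13</Finish></Schedule>")
b = ("<Schedule><Finish>2008-08-13</Finish><Start>2007-11-01</Start>"
     "<Contractor>ACME, Inc.</Contractor><Project>Spread 2</Project></Schedule>")

same = import_xml(a) == import_xml(b)  # element order does not matter
```

Unlike the positional ASCII transfer, both documents import to the identical record because the data are located by label rather than by column position.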
OCR for page 32
The field names, or tags, are bracketed and encapsulate the data to be exchanged. As long as the separate software applications share the same schema (labels), the data are easily imported and exported between programs with the use of built-in parsers (which read and direct the data to the proper fields).

FIGURE 17 Order-independent data exchange possible with XML.

Hopefully from this demonstration it is possible to envision all transportation software applications sharing a common data scheme, therefore allowing data to be exchanged from any software program utilized in the delivery of construction projects or in any of the agency functional areas.

The largest barrier to experiencing this method of interoperability is convincing industry-segment participants to agree on a standardized (universal) schema, which has been and is being attempted in several industry segments with limited success. NCHRP has funded research for the development of TransXML, an intended universal schema for transportation-related data.

The magnitude and detail of the datasets required by transportation agency specifications require interpretation of these schemas by differing software applications. Dean Bowman, Director of Research and Development at Bentley Systems, states:

While quite true that TransXML and other efforts to create a standardized schema are important, it is equally important that a "common interpretation" of this schema be established. This is a much more difficult but vital part of the standardization process. Otherwise, the way one software package development group views a given schema attribute can be quite different than the way another software package developer interprets the same identical schema attribute (D. Bowman, Bentley Systems, personal communication, 2007).

Bowman goes on to state that

At Bentley, we ran into interpretation issues upon acquiring InRoads, GEOPAK, and MX. Although each product had independently implemented LandXML, the standard offered multiple ways to store "similar" geometry that led to widely varying interpretations that, in turn, led to unsatisfactory interoperability results. An extensive and time-consuming project ensued to guarantee that all three Bentley design systems interpreted the LandXML standard in an identical manner.

Refining and documenting a schema to provide sufficient documentation so that implementers can appreciate various schema subtleties actually is more work than creating the schema itself. However, such consistency analyses represent the essence of true standardization, so it cannot be ignored (D. Bowman, Bentley Systems, personal communication, 2007).

Software interoperability is defined as the ability of two or more systems or components to exchange information and to use the information that has been exchanged (Teague 2005b). Given the following:

· Software applications store information in their own unique way, optimized to support the application usage scenarios and functions.
· A data map is always required to share or exchange information among applications.
· A database or data repository is an application too, requiring a data mapping to the database or repository just like any other application.
· A typical large company uses several hundred software applications. No single commercial software supplier provides all needed tools.
· Software users apply internally developed software in addition to commercial software.
· Owner companies approach construction and support of capital facilities as a collaborative effort with multiple
OCR for page 33
other business entities, including a number of service providers and product suppliers.
· The size of the collaborating business entities that construct and support capital facilities ranges from large, multibillion-dollar global corporations with a high level of IT expertise to very small service and product suppliers with a relatively low level of IT expertise.
· Each collaborating business entity has its own preferred tool set of commercial and internal software that, in general, is different from those their collaborators use.

FIATECH concluded that

. . . there are thousands of application interfaces required to achieve widespread software interoperability, especially when attempting to achieve interoperability across organizational boundaries to support electronic collaboration. Therefore, because of the sheer number of applications to be interfaced, software application mapping is the largest single cost associated with achieving widespread data interoperability (Teague 2005a).

FIATECH continued to explain that the method an organization chooses to achieve interoperability (data mapping methodology) depends on whether the organization is attempting interoperability internally or externally. Teague then explains five approaches to achieving interoperability (known today):

1. Human interpretation,
2. Developing internal database integration solutions,
3. Purchasing commercial database integration solutions,
4. Integrating product suites purchased from the same vendor, and
5. Creating software mapping interfaces.

Expanding on number 5, creating software mapping interfaces, two methodologies are contrasted as follows:

1. Point-to-point software maps. In this methodology, datasets are matched between applications (field formats, column order, etc.). The data are then exported from one application for import into another. Each time datasets are migrated from one application to another, the field mapping and column ordering must be restructured and the import/export process performed. According to Teague, this approach becomes inefficient as the number of applications needing to exchange data exceeds about five; this method does not scale well.
2. Consensus-based common industry format software maps. When application interfaces are designed to a common data format, only one mapping operation needs to be done regardless of the software applications involved. This operation scales linearly and can be successfully accomplished with XML schema. Current XML schemas developed for the transportation industry are LandXML and TransXML.

Standardization

Although interoperability issues such as data exchange can be remedied in part through the usage of XML, a special type of standardization is required when representing objects in the real world with digital data. When digital 3D models are created, digital objects of a certain type must consistently represent the exact same real-life object in every model.

Several standardization efforts of this type have started, including a collaborative effort led by FIATECH to establish a universal platform called Accelerated Deployment of International Standards Organization (ISO) 15926 (ADI). ADI begins by using an established, universal language and knowledge base. It implements the ISO Standard 15926 to integrate project life-cycle information about plant facilities. Using ISO 15926, each part used to build the plant is associated with a unique ID from the Reference Data Library. A generic data model defines how these parts should interact with others. All of this is then integrated into a universal language called the Web Ontology Language (OWL) of the World Wide Web Consortium (W3C).

Therefore, using the example of a pipe being connected to a tank, we could say, "This is a TANK. It has ID 'TK-1001.' This is a PIPE. It has ID '50-11015.' 'TK-1001' is connected to '50-11015' with a FLANGED CONNECTION since May 1st, 2007." The words shown in capital letters are resident in the central Reference Data Library. The expressions between them are modeled in the generic data model, and the total is implemented in the OWL language.

This standardizes not only the data exchange between design and construction processes of different companies, but also allows firms maintaining the plant several years down the road to understand how it was built. Even if operators speak different languages, they can access and understand this information, making things much easier for everyone. Indeed, preliminary estimates predict a 30% productivity improvement in the engineering, construction, supply-chain, operations, and maintenance phases of plants that implement interoperability.

An additional benefit of this project is that the Reference Data Library continuously grows with information added by contractors and owners. This work-in-progress Reference Data Library allows users to add information about parts, activities, and processes, and use them immediately. The continuous expansion of the Reference Data Library increases the capacity and promise of this project (Fornes 2007).

Another standardization project, the Industry Foundation Classes (IFC) data model, is a neutral and open specification that is not controlled by a singular vendor or group of vendors. It is an object-oriented file format with a data model developed by the International Alliance for Interoperability (IAI) to facilitate interoperability in the building industry,
OCR for page 34
and is a commonly used format for BIM. The IFC model specification is open and available. Because of its focus on ease of interoperability between software platforms, the Danish government has made the use of IFC format(s) compulsory for publicly aided building projects (Open Format 2007).

For software applications to achieve interoperability, consensus must be achieved in the order and classification of data. Webster's Online Dictionary defines standard as "something set up and established by authority as a rule for the measure of quantity, weight, extent, value, or quality." The online FIATECH Data Standards Clearing House adds the term consensus standards, "which are a voluntary consensus by various joint industry groups to use/adopt agreed on consensus standards and that may not be a true standard in the sense of being 'established by authority' as the dictionary definition suggests" (Teague 2005a).

Teague lists several key characteristics for comparing and understanding industry efforts:

· Domain: Subject domains addressed by the interoperability standard; that is, industrial facilities versus transportation facilities.
· Traditional Type Versus De Facto Type: Traditional standards efforts refer to those sponsored by an official standards body through an official creation process. De facto refers to voluntary adoption and usage by consensus of participating member organizations without having gone through rigorous official creation processes.
· Intellectual Property (IP)--Open Source or Proprietary: Refers to ownership of the software code as well as the ability or freedom to view it and/or modify it.
· File Format: Describes the data's file exchange format.
· Usage-Driven Versus Comprehensive Focus: Usage-driven refers to interoperability standards developed only for information that multiple organizations agree should be exchanged across organizational boundaries. Comprehensive focus refers to standards completely open and flexible to any data exchange.
· Multiple-Domain or Single-Domain Scope.
· Scalability: A measure of how easily the standard can be utilized by very large, complex organizations.
· Extensible Versus Fixed: Refers to the ability, or not, for two parties to extend the standard for purposes of a particular commercial exchange without having to wait for a full standardization process.
· Exchange or Repository Sharing: Relating to broad or focused scope, whether the standard applies application to application or is shared through a central repository (application).
· Exchange File Content: Data must be exchanged with defined data names, data types, and structural relationships.
· Data-Only Contained in the Exchange File: The exchange file contains only data; the data model is provided to applications in separate data model definition files.
· Data-Model + Data: Employs a data exchange file that includes both the data model and the data.

Abstract Domain Versus Partitioned Domain

The abstract domain layered data model approach has been adopted by several standards efforts (e.g., STEP and ISO 15926) to address the need to support multiple domains. This approach meets the characteristic of supporting multiple subject domains, but with an added cost and complexity for application owners, who must construct software maps that match up the application domain terminology with the abstract data model terminology required by the standard. This requires in-depth knowledge and expertise of both the abstract data model and the domain data model. Unfortunately, there are few people who have this combination of expertise.

Ben Nelson, Kansas DOT and Synthesis Panel member, states:

[O]ne of the great complicating factors of interoperable software is having well-defined metadata of the data that are being exchanged. Processes that are able to move one piece of data from one system to another could have great flaws if the data have different metadata. That is, if the data do not satisfy the characteristics of the receiving system, great harm could be done in the engineering field. This leads back to needing detailed definitions and even a knowledge of the data ontology . . . this need is met by having a universally defined set of objects as discussed. Fortunately, the DOTs, as the paper suggests, have gone a long way in defining objects such as the pay items listed in contracts using AASHTO definitions--as well as design objects contained in the AASHTO design guide (and other guides such as the Manual of Uniform Traffic Control Devices). Further, private-sector companies that dominate in this area have made progress in defining objects in enough detail that the objects can be considered the same object in terms of the input and output of the systems in which they are shared (B. Nelson, Kansas DOT, personal communication, 2007).

The partitioned domain approach accommodates multiple domains by partitioning them into some domains that are general-purpose and reusable and others that are very subject-specific. However, both the general-purpose and the subject-specific domains use concrete, commonly understood "domain terminology." For example, units of measurement are a general-purpose domain, but are expressed using commonly understood terminology. For XML-based standards, the use of XML namespaces allows sufficient partitioning of the domain data model to enable multiple disciplines or groups to work independently of each other, while providing a common core set of reusable data models that can be applied across multiple subject domains.

The most important advantage of the partitioned domain approach is that application software interface mappings to domain-based applications are easier to match up with the domain-based terminology in the standard. Therefore, building software maps is less complicated and less expensive
OCR for page 35
than matching domain terminology to abstract terminology (Teague 2005a).

Business Practices

Business practices (and mindsets) requiring change or re-engineering appear to be as follows:

· The practice of designing in 3D versus 2D.
· Standardized CAD practices (i.e., defining data layers so participants can easily find objects and data from model to model).
· The practice of involving project stakeholders earlier in the project life cycle. With BIM, the designers and builders collaborate early in the life-cycle process to virtually construct the facility, perform constructability reviews, and solve problems with the model before they occur in physical reality. This hurdle may require creative and innovative solutions to realize the benefits. Whereas TIM would lend itself well to design-build project delivery, the predominant design-bid-build delivery is another story, as contractors are used to getting 2D drawings at theoretical 100% design completion.
· Legal issues in which innovative cooperation will be required to solve potential liability issues when sharing and relying on digital data supplied by others. Insurance mechanisms and creative agreements must be put in place for the laws to evolve to this technology. The delivery system does not fit current precedent in construction law.

Human Resources and Skills

Agency human capital resources, the information workers, personnel, and staff involved in the TIM delivery methodology, will need to acquire additional skill sets. Architecture, engineering, and construction management colleges and universities in the United States are currently in the process of embedding BIM curricula into their programs; this should eventually benefit the DOT agencies. As a result of BIM curricula, higher education programs are finding that designers require more construction management skills and that construction students are required to learn design principles they previously could ignore. This phenomenon will no doubt also be true of professional stakeholders involved in the BIM/TIM delivery methodologies. New skills and knowledge will include:

· Cross-discipline knowledge (design-construction);
· New software application training;
· IT skills (data exchange, schema and mapping, hardware and networking); and
· Collaboration, cooperation, and communication skills.
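The schema-and-mapping skills listed under IT exist largely because of the interface-count arithmetic behind the point-to-point versus consensus-format comparison made earlier in this chapter. The sketch below quantifies that comparison; counting one directional export/import map per ordered application pair is an assumed convention for illustration, not a formula taken from Teague.

```python
def point_to_point_maps(n):
    """Directional export/import maps needed when every pair of the n
    applications exchanges data directly (point-to-point methodology)."""
    return n * (n - 1)

def common_format_maps(n):
    """Maps needed when every application reads and writes one consensus
    industry format (e.g., an XML schema): one map per application."""
    return n

# A modest agency tool set of 20 applications:
p2p = point_to_point_maps(20)
hub = common_format_maps(20)
```

For 20 applications the point-to-point approach needs 380 maps against 20 for a common format, which is why the consensus-based approach is said to scale linearly while point-to-point does not scale well.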