
Information Technology for Efficient Project Delivery (2008)

Chapter: Chapter Eight - Integrated Process Model

« Previous: Chapter Seven - Information Technology for Operations and Maintenance
Suggested Citation: "Chapter Eight - Integrated Process Model." National Academies of Sciences, Engineering, and Medicine. 2008. Information Technology for Efficient Project Delivery. Washington, DC: The National Academies Press. doi: 10.17226/14213.


CHAPTER EIGHT
INTEGRATED PROCESS MODEL

INTEGRATED WORK PROCESS FOR PROJECT DELIVERY

The concept of the IPM evolves from all three investigative aspects of the synthesis study (see Figure 13).

FIGURE 13 Integrated process model.

1. From the case studies, several of the agencies studied reveal innovative concepts in various functional areas and stages of maturity that can be combined to form a complete project life-cycle model. Most significant are the New York processes that initiate 3D modeling in the planning and design functional areas. This process change, from two-dimensional (2D) survey and design, is also a quickly growing technique in the vertical construction industry segments (commercial, plant/process/manufacturing), where it is known as building information modeling (BIM). To date, however, the BIM concept has not been documented in the literature as having functions in the construction, operations, or maintenance life-cycle stages, although one theory is to encapsulate all project data within the BIM. FDOT has taken existing, reliable technology (CD-ROM media, digital signatures, etc.) and created a repository and archival mechanism that collects data throughout all the functional areas or life-cycle stages. Melding these two concepts, the IPM is the 3D modeling concept in which the model also becomes the vessel of storage, retrieval, sharing, and archiving.

2. From the literature review, it is apparent that the technology now exists to realize an IPM or TIM. One of the major barriers to realization is the myriad of differing (proprietary) data formats that must contribute data to, and retrieve data from, the TIM. It is not practical for a TIM software application to have the ability (or knowledge of the proprietary trade-secret code) to read literally hundreds of different data formats of the programs that require access to the model (e.g., CAD, design, scheduling, financial, and document applications). For this reason it is imperative that all software applications accessing the model do so in just a few different data formats. Standardization of data formats guarantees that all contractual parties to the construction contract can communicate with the model. One can think of the TIM as a business office comprising a dozen or more persons. It takes all of them to run the office successfully. If each person in the office spoke a different language, how efficient would the office be, even with a dozen translators added to make communication possible? In the same way, the TIM needs a universal language with which all of the participants can communicate digitally. Our literature review revealed several national efforts to standardize the data formats as well as the structure and meaning of the data.

3. From the initial survey, it is apparent that DOTs are in various stages of adopting digital data exchange across functional areas.

GAPS AND SOLUTIONS

According to our case studies, besides software application interoperability in general, the primary gap in the realization of TIM delivery is that the methodology currently begins to atrophy during and after the construction (project) life-cycle stage. That is to say, there was no evidence of a TIM being utilized beyond the procurement life-cycle stage. This phenomenon was reinforced by the literature review pertaining to BIM project delivery. Figure 14 displays TIM datasets and their migration and use across functional areas. No extensions of the data into the successive project life-cycle stages or functional areas were found.

FIGURE 14 TIM model data shown across partial project life-cycle stages (NYSDOT).
The exception is FDOT, which uses datasets in file structures burned onto CD-ROMs throughout all of the life-cycle stages; however, this is not the true TIM paradigm.

Software Interoperability

Software interoperability is currently possible in three main ways at the data file level:

1. Bits of data (binary digits of 0 and 1) can be transferred between files by machine (hardware) interpretation.
2. Data can be converted between differing file structures by means of a common data map, which requires export-import between the software applications (data exchange); a minimal sketch follows this list.
3. All parties to a work process use the same software applications or data file formats, whether standardized or proprietary. This is not typically feasible in most construction delivery scenarios, as no proprietary vendor markets software applications that satisfy all requirements.
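A minimal sketch of the second method's common data map, in Python. The record layout, field names, and mapping below are hypothetical illustrations, not drawn from any actual DOT application:

# A common data map relating application A's field names to application B's.
# All names here are hypothetical; a real map would also reconcile types and units.

record_a = {"ItemNo": "401-1", "Descr": "Asphalt Pavement", "Qty": 1200.0}  # as exported by app A

a_to_b_map = {"ItemNo": "PayItemID", "Descr": "Description", "Qty": "Quantity"}

def convert(record, field_map):
    """Re-key a record from the source schema to the destination schema."""
    return {field_map[name]: value for name, value in record.items()}

print(convert(record_a, a_to_b_map))
# {'PayItemID': '401-1', 'Description': 'Asphalt Pavement', 'Quantity': 1200.0}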

Regarding method two, some basics of software application data exchange are useful. Until the last five years or so, data were exchanged between software with differing file formats by exporting from one program and importing into another using a standard file format called ASCII text. This is simply alphanumeric and text characters that can be viewed in a text editor (e.g., Microsoft WordPad). Word processors also allow data to be saved in a text format with a .txt file extension. Data fields in ASCII text format can be separated in several ways, using spaces, commas, and other types of separators. For an export-import to occur between two separate software applications, the ASCII text fields must be of the same order and type in the two programs.

Figure 15 displays a typical database table consisting of field names ordered from left to right. Each intersection of a field name and record represents a field of data. Data fields consist of text characters of alphabetic, numeric, or alphanumeric types and a defined number of characters (letters or numbers). In Figure 15, the lower table has some familiar field names, and it is easy to imagine the data values inside the fields.
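To see why the ASCII approach is order-sensitive, consider the following minimal Python sketch; the comma-separated field layout is a hypothetical illustration:

# Importer that assumes the fixed field order (PayItemID, Description, Quantity).
csv_line = "401-1,Asphalt Pavement,1200.0"

def import_record(line):
    pay_item_id, description, quantity = line.split(",")
    return {"PayItemID": pay_item_id,
            "Description": description,
            "Quantity": float(quantity)}

print(import_record(csv_line))
# A line exported in a different order, e.g. "Asphalt Pavement,1200.0,401-1",
# would be mis-imported or raise ValueError at float(), because the positional
# mapping between the two programs no longer matches.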

FIGURE 15 Database tables.
FIGURE 16 ASCII text data exchange.

For an ASCII text data transfer to be successful, the fields must be of the same name, order, and type, as shown in Figure 16. If the mapping is not exact, the exchange will not occur correctly, if at all.

Approximately five years ago, a data exchange technique using XML came into mainstream use as a more efficient method of exchanging data than ASCII text. One reason for this popularity is that, although the data field sizes and types must still be matched, the data order or configuration does not have to match the destination data repository; that is, the data can be structured in any manner and still be imported (the data are independent of format and of tabular order). Figure 17 displays how this exchange differs from the ASCII methodology of Figure 16.

FIGURE 17 Order-independent data exchange possible with XML.

With XML, as long as the two software applications share a common field name or label, and the data fields are structured similarly, the exchange will occur. As long as the separate software programs share the same field names (or schema), the data will find their appropriate places in the database on exchange. The following text structure displays schedule data in XML format as a separate example:

<?xml version="1.0" ?>
<Project xmlns="http://schemas.microsoft.com/project">
  <Name>test.xml</Name>
  <Title>2006 MDOT Spread 2; As Bid 11/25/06</Title>
  <Company>ACME, Inc.</Company>
  <Author>Ricky N. Dyess</Author>
  <CreationDate>1997-10-01T13:42:00</CreationDate>
  <LastSaved>2007-03-31T20:59:00</LastSaved>
  <ScheduleFromStart>1</ScheduleFromStart>
  <StartDate>2007-11-01T07:00:00</StartDate>
  <FinishDate>2008-08-13T17:00:00</FinishDate>
  <FYStartDate>1</FYStartDate>
  <CriticalSlackLimit>0</CriticalSlackLimit>
  <CurrencyDigits>2</CurrencyDigits>
  <CurrencySymbol>$</CurrencySymbol>
  <CurrencySymbolPosition>0</CurrencySymbolPosition>
  <CalendarUID>1</CalendarUID>
  <DefaultStartTime>07:00:00</DefaultStartTime>
  <DefaultFinishTime>17:00:00</DefaultFinishTime>
  <MinutesPerDay>600</MinutesPerDay>
  <MinutesPerWeek>3600</MinutesPerWeek>
  <DaysPerMonth>26</DaysPerMonth>
  <DefaultTaskType>0</DefaultTaskType>
  <DefaultFixedCostAccrual>2</DefaultFixedCostAccrual>
  <DefaultStandardRate>0</DefaultStandardRate>
  <DefaultOvertimeRate>0</DefaultOvertimeRate>
  <DurationFormat>7</DurationFormat>
  <WorkFormat>3</WorkFormat>
  <EditableActualCosts>0</EditableActualCosts>
  <HonorConstraints>0</HonorConstraints>
  <InsertedProjectsLikeSummary>0</InsertedProjectsLikeSummary>
  <MultipleCriticalPaths>0</MultipleCriticalPaths>
  <NewTasksEffortDriven>0</NewTasksEffortDriven>
  <NewTasksEstimated>1</NewTasksEstimated>
  <SplitsInProgressTasks>1</SplitsInProgressTasks>
  <SpreadActualCost>0</SpreadActualCost>
  <SpreadPercentComplete>0</SpreadPercentComplete>
  <TaskUpdatesResource>1</TaskUpdatesResource>
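As a minimal sketch of how such a file is read by tag name rather than by position, the following example uses only the Python standard library and a reordered fragment of the schedule listing above; it is an illustration, not the parser any scheduling package actually uses:

import xml.etree.ElementTree as ET

# Element order here deliberately differs from the listing above.
fragment = """<?xml version="1.0" ?>
<Project xmlns="http://schemas.microsoft.com/project">
  <FinishDate>2008-08-13T17:00:00</FinishDate>
  <Name>test.xml</Name>
  <StartDate>2007-11-01T07:00:00</StartDate>
</Project>"""

ns = {"p": "http://schemas.microsoft.com/project"}
root = ET.fromstring(fragment)

# Fields are located by tag name, so their order in the file does not matter.
for field in ("Name", "StartDate", "FinishDate"):
    print(field, "=", root.find("p:" + field, ns).text)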

The field names, or tags, are bracketed and encapsulate the data to be exchanged. As long as the separate software applications share the same schema (labels), the data are easily imported and exported between programs with the use of built-in parsers (which read the data and direct them to the proper fields).

From this demonstration it should be possible to envision all transportation software applications sharing a common data schema, allowing data to be exchanged among any of the software programs utilized in the delivery of construction projects or in any of the agency functional areas. The largest barrier to achieving this method of interoperability is convincing industry-segment participants to agree on a standardized (universal) schema, which has been and is being attempted in several industry segments with limited success. NCHRP has funded research for the development of TransXML, intended as a universal schema for transportation-related data exchange.

The magnitude and detail of the datasets required by transportation agency specifications mean that these schemas must be interpreted consistently by differing software applications. Dean Bowman, Director of Research and Development at Bentley Systems, states:

While quite true that TransXML and other efforts to create a standardized schema are important, it is equally important that a "common interpretation" of this schema be established. This is a much more difficult but vital part of the standardization process. Otherwise, the way one software package development group views a given schema attribute can be quite different than the way another software package developer interprets the same identical schema attribute (D. Bowman, Bentley Systems, personal communication, 2007).

Bowman goes on to state that

At Bentley, we ran into interpretation issues upon acquiring InRoads, GEOPAK, and MX. Although each product had independently implemented LandXML, the standard offered multiple ways to store "similar" geometry, which led to widely varying interpretations that, in turn, led to unsatisfactory interoperability results. An extensive and time-consuming project ensued to guarantee that all three Bentley design systems interpreted the LandXML standard in an identical manner.

Refining and documenting a schema to provide sufficient documentation so that implementers can appreciate various schema subtleties actually is more work than creating the schema itself. However, such consistency analyses represent the essence of true standardization, so they cannot be ignored (D. Bowman, Bentley Systems, personal communication, 2007).

Software interoperability is defined as the ability of two or more systems or components to exchange information and to use the information that has been exchanged (Teague 2005b). Given the following:

• Software applications store information in their own unique ways, optimized to support each application's usage scenarios and functions.
• A data map is always required to share or exchange information among applications.
• A database or data repository is an application too, requiring a data mapping to the database or repository just like any other application.
• A typical large company uses several hundred software applications. No single commercial software supplier provides all of the needed tools.
• Software users apply internally developed software in addition to commercial software.
• Owner companies approach construction and support of capital facilities as a collaborative effort with multiple other business entities, including a number of service providers and product suppliers.
• The size of the collaborating business entities that construct and support capital facilities ranges from large, multibillion-dollar global corporations with a high level of IT expertise to very small service and product suppliers with a relatively low level of IT expertise.
• Each collaborating business entity has its own preferred tool set of commercial and internal software that, in general, is different from those its collaborators use.

FIATECH concluded that

. . . there are thousands of application interfaces required to achieve widespread software interoperability, especially when attempting to achieve interoperability across organizational boundaries to support electronic collaboration. Therefore, because of the sheer number of applications to be interfaced, software application mapping is the largest single cost associated with achieving widespread data interoperability (Teague 2005a).

FIATECH continued to explain that the method an organization chooses to achieve interoperability (its data mapping methodology) depends on whether the organization is attempting interoperability internally or externally. Teague then explains five approaches to achieving interoperability (as known today):

1. Human interpretation,
2. Developing internal database integration solutions,
3. Purchasing commercial database integration solutions,
4. Integrating product suites purchased from the same vendor, and
5. Creating software mapping interfaces.

Expanding on number 5, creating software mapping interfaces, two methodologies are contrasted as follows:

1. Point-to-point software maps. In this methodology, datasets are matched between applications (field formats, column order, etc.). The data are then exported from one application for import into another. Each time datasets are migrated from one application to another, the field mapping and column ordering must be restructured and the import/export process performed. According to Teague, this approach becomes inefficient as the number of applications needing to exchange data exceeds five; the method does not scale well (see the sketch following this list).
2. Consensus-based common industry format software maps. When application interfaces are designed to a common data format, only one mapping operation needs to be done per application, regardless of which other software applications are involved. This approach scales linearly and can be accomplished with XML schemas. Current XML schemas developed for the transportation industry are LandXML and TransXML.
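The scaling difference can be made concrete with a little arithmetic. The Python sketch below assumes one software map per ordered pair of applications for the point-to-point case (counting undirected pairs instead would halve those numbers without changing the conclusion):

# Point-to-point needs a map for every ordered pair of applications;
# a consensus-based common format needs one map per application.

def point_to_point_maps(n):
    return n * (n - 1)

def common_format_maps(n):
    return n

for n in (3, 5, 10, 50):
    print(n, "apps:", point_to_point_maps(n), "point-to-point maps vs.",
          common_format_maps(n), "common-format maps")
# 5 apps already need 20 point-to-point maps; 50 apps would need 2,450.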
Standardization

Although interoperability issues such as data exchange can be remedied in part through the use of XML, a special type of standardization is required when representing real-world objects with digital data. When digital 3D models are created, digital objects of a certain type must consistently represent the exact same real-life object in every model. Several standardization efforts of this type have started, including a collaborative effort led by FIATECH to establish a universal platform called Accelerated Deployment of International Standards Organization (ISO) 15926 (ADI).

ADI begins by using an established, universal language and knowledge base. It implements ISO Standard 15926 to integrate project life-cycle information about plant facilities. Using ISO 15926, each part used to build the plant is associated with a unique ID from the Reference Data Library. A generic data model defines how these parts should interact with others. All of this is then integrated into a universal language, the Web Ontology Language (OWL) of the World Wide Web Consortium (W3C). Therefore, using the example of a pipe being connected to a tank, we could say, "This is a TANK. It has ID 'TK-1001'. This is a PIPE. It has ID '50-11015'. 'TK-1001' is connected to '50-11015' with a FLANGED CONNECTION since May 1st, 2007." The words shown in capital letters are resident in the central Reference Data Library. The expressions between them are modeled in the generic data model, and the total is implemented in the OWL language.
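As a rough illustration of how such a statement looks as machine-readable triples, the following Python sketch builds and serializes the tank-and-pipe example with the rdflib package. The namespace, class, and property names are hypothetical stand-ins, not actual ISO 15926 Reference Data Library identifiers:

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

PLANT = Namespace("http://example.org/plant#")  # hypothetical reference data

g = Graph()
g.bind("plant", PLANT)
g.add((PLANT["TK-1001"], RDF.type, PLANT.Tank))
g.add((PLANT["50-11015"], RDF.type, PLANT.Pipe))
g.add((PLANT["TK-1001"], PLANT.connectedTo, PLANT["50-11015"]))
g.add((PLANT["TK-1001"], PLANT.connectionType, PLANT.FlangedConnection))
g.add((PLANT["TK-1001"], PLANT.connectedSince,
       Literal("2007-05-01", datatype=XSD.date)))

print(g.serialize(format="xml"))  # emits the graph as RDF/XML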

This standardizes not only the data exchange between the design and construction processes of different companies, but also allows the firms maintaining the plant several years down the road to understand how it was built. Even if operators speak different languages, they can access and understand this information, making things much easier for everyone. Indeed, preliminary estimates predict a 30% productivity improvement in the engineering, construction, supply-chain, operations, and maintenance phases of plants that implement interoperability.

An additional benefit of this project is that the Reference Data Library continuously grows with information added by contractors and owners. This work-in-progress Reference Data Library allows users to add information about parts, activities, and processes, and to use it immediately. The continuous expansion of the Reference Data Library increases the capacity and promise of this project (Fornes 2007).

Another standardization project, the Industry Foundation Classes (IFC) data model, is a neutral and open specification that is not controlled by a single vendor or group of vendors. It is an object-oriented file format with a data model developed by the International Alliance for Interoperability (IAI) to facilitate interoperability in the building industry, and it is a commonly used format for BIM. The IFC model specification is open and available. Because of its focus on ease of interoperability between software platforms, the Danish government has made the use of IFC format(s) compulsory for publicly aided building projects (Open Format 2007).

For software applications to achieve interoperability, consensus must be reached on the order and classification of data. Webster's Online Dictionary defines a standard as "something set up and established by authority as a rule for the measure of quantity, weight, extent, value, or quality." The online FIATECH Data Standards Clearing House adds the term consensus standards, "which are a voluntary consensus by various joint industry groups to use/adopt agreed on consensus standards and that may not be a true standard in the sense of being 'established by authority' as the dictionary definition suggests" (Teague 2005a).

Teague lists several key characteristics for comparing and understanding industry efforts:

• Domain: Subject domains addressed by the interoperability standard; that is, industrial facilities versus transportation facilities.
• Traditional Type Versus De Facto Type: Traditional standards efforts are those sponsored by an official standards body through an official creation process. De facto refers to voluntary adoption and usage by consensus of participating member organizations without having gone through rigorous official creation processes.
• Intellectual Property (IP), Open Source or Proprietary: Refers to ownership of the software code as well as the ability or freedom to view and/or modify it.
• File Format: Describes the data's file exchange format.
• Usage-Driven Versus Comprehensive Focus: Usage-driven refers to interoperability standards developed only for information that multiple organizations agree should be exchanged across organizational boundaries. Comprehensive focus refers to standards completely open and flexible to any data exchange.
• Multiple-Domain or Single-Domain Scope.
• Scalability: A measure of how easily the standard can be utilized by very large, complex organizations.
• Extensible Versus Fixed: Refers to the ability, or not, for two parties to extend the standard for purposes of a particular commercial exchange without having to wait for a full standardization process.
• Exchange or Repository Sharing: Related to broad or focused scope; whether the standard is applied application to application or shared through a central repository (application).
• Exchange File Content: Data must be exchanged with defined data names, data types, and structural relationships.
• Data-Only Exchange File: The exchange file contains only the data; the data model is provided to applications in separate data model definition files.
• Data-Model + Data: Employs a data exchange file that includes both the data model and the data.

Abstract Domain Versus Partitioned Domain

The abstract domain layered data model approach has been adopted by several standards efforts (e.g., STEP and ISO 15926) to address the need to support multiple domains. This approach meets the characteristic of supporting multiple subject domains, but with an added cost and complexity for application owners, who must construct software maps that match the application domain terminology with the abstract data model terminology required by the standard. This requires in-depth knowledge and expertise of both the abstract data model and the domain data model. Unfortunately, there are few people who have this combination of expertise.
Ben Nelson, Kansas DOT and Synthesis Panel member, states:

[O]ne of the great complicating factors of interoperable software is having well-defined metadata for the data that are being exchanged. Processes that are able to move one piece of data from one system to another could have great flaws if the data have different metadata. That is, if the data do not satisfy the characteristics of the receiving system, great harm could be done in the engineering field. This leads back to needing detailed definitions and even a knowledge of the data ontology . . . this need is met by having a universally defined set of objects as discussed. Fortunately, the DOTs, as the paper suggests, have gone a long way in defining objects such as the pay items listed in contracts using AASHTO definitions—as well as design objects contained in the AASHTO design guide (and other guides such as the Manual of Uniform Traffic Control Devices). Further, private-sector companies that dominate in this area have made progress in defining objects in enough detail that the objects can be considered the same object in terms of the input and output of the systems in which they are shared (B. Nelson, Kansas DOT, personal communication, 2007).

The partitioned domain approach accommodates multiple domains by partitioning them into some domains that are general-purpose and reusable and others that are very subject-specific. Both kinds of domain, however, use concrete, commonly understood domain terminology. For example, units of measurement form a general-purpose domain, but they are expressed using commonly understood terminology. For XML-based standards, the use of XML namespaces allows sufficient partitioning of the domain data model to enable multiple disciplines or groups to work independently of one another, while providing a common core set of reusable data models that can be applied across multiple subject domains (see the sketch below).

The most important advantage of the partitioned domain approach is that application software interface mappings are easier to write when domain-based applications can be matched to domain-based terminology in the standard. Building software maps is therefore less complicated and less expensive than matching domain terminology to abstract terminology (Teague 2005a).
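A minimal sketch of namespace-based partitioning, using only the Python standard library; the namespace URIs and element names are hypothetical illustrations, not part of LandXML, TransXML, or any published schema:

import xml.etree.ElementTree as ET

# Two partitions: a general-purpose units domain and a subject-specific
# roadway domain, kept separate by XML namespaces.
doc = """<Project xmlns:units="http://example.org/units"
         xmlns:rdwy="http://example.org/roadway">
  <rdwy:Alignment name="CL-Main">
    <units:Length unit="m">1524.0</units:Length>
  </rdwy:Alignment>
</Project>"""

ns = {
    "units": "http://example.org/units",   # general-purpose, reusable
    "rdwy": "http://example.org/roadway",  # subject-specific
}

root = ET.fromstring(doc)
alignment = root.find("rdwy:Alignment", ns)
length = alignment.find("units:Length", ns)
print(alignment.get("name"), length.get("unit"), length.text)  # CL-Main m 1524.0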

Business Practices

Business practices (and mindsets) that appear to require change or re-engineering are as follows:

• The practice of designing in 3D versus 2D.
• Standardized CAD practices (i.e., defining data layers so that participants can easily find objects and data from model to model).
• The practice of involving project stakeholders earlier in the project life cycle. With BIM, the designers and builders collaborate early in the life-cycle process to virtually construct the facility, perform constructability reviews, and solve problems with the model before they occur in physical reality. This hurdle may require creative and innovative solutions to realize the benefits. Whereas TIM would lend itself well to design-build project delivery, the predominant design-bid-build delivery is another story, as contractors are used to receiving 2D drawings at theoretical 100% design completion.
• Legal issues, for which innovative cooperation will be required to resolve potential liability questions when sharing and relying on digital data supplied by others. Insurance mechanisms and creative agreements must be put in place for the law to evolve with this technology; the delivery system does not fit current construction-law precedent.

Human Resources and Skills

Agency human capital resources (the information workers, personnel, and staff involved in the TIM delivery methodology) will need to acquire additional skill sets. Architecture, engineering, and construction management colleges and universities in the United States are currently embedding BIM curricula into their programs; this should eventually benefit the DOT agencies. As a result of BIM curricula, higher education programs are finding that design students require more construction management skills and that construction students are required to learn design principles they could previously ignore. This phenomenon will no doubt also hold true for the professional stakeholders involved in BIM/TIM delivery methodologies. New skills and knowledge will include:

• Cross-discipline knowledge (design-construction);
• New software application training;
• IT skills (data exchange, schema and mapping, hardware and networking); and
• Collaboration, cooperation, and communication skills.

Next: Chapter Nine - Conclusions and Suggestions for Further Research »