Cracking the Code: Putting Data at the Center of Digital Project Delivery
Data-centric project delivery has huge advantages over the traditional, document-driven method. It helps ensure clarity, efficiency, communication, and accountability.
Traditional project delivery relies on documents and drawings – often on paper – whose data is siloed and cannot be seamlessly integrated or shared with other applications. Conversely, data-centric project delivery – carried out digitally – produces data-rich intelligent models and containers that allow for real-time integration and connectivity between multiple functions and stakeholders. This in turn allows for the effective evaluation of the quality and rigor of all data sources.
Even so, there is currently far too much emphasis on digital tools and information modeling software, and not enough on work processes – the steps involved in the creation, management, and use of data throughout a project’s lifecycle.
More emphasis needs to be placed on developing a wider process that empowers people to extract, aggregate, and analyze data quickly and effectively. To harness the true power of digital project delivery, we need a paradigm shift in the way we think and work together.
To get meaningful benefits with quality outcomes, the data needs to be interoperable, so that stakeholders can consume it with minimal interpretation or manipulation.
Getting to a point of interoperable data between stakeholders requires up-front engagement and planning to:
- Define the practical stakeholder use cases of the data.
- Prepare and agree on the work processes.
- Prepare and agree on the data exchange requirements, formats, and properties.
- Complete early testing of the data exchange between stakeholders.
- Monitor the quality of data exchange.
- Make the necessary corrections.
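The early testing and monitoring steps above can be sketched in code. The following is a minimal, hypothetical example – the property names, types, and record structure are invented for illustration – of checking an exchanged record against a set of agreed data exchange requirements:

```python
# Minimal sketch: validating an exchanged data record against agreed
# exchange requirements. Property names and formats are hypothetical
# placeholders for whatever the stakeholders agree on up front.

REQUIRED_PROPERTIES = {
    "tag": str,          # asset tag identifier, e.g. "P-101"
    "description": str,  # short description of the asset
    "discipline": str,   # owning engineering discipline
    "revision": str,     # revision of the source data
}

def validate_record(record: dict) -> list[str]:
    """Return a list of issues found in one exchanged record."""
    issues = []
    for name, expected_type in REQUIRED_PROPERTIES.items():
        if name not in record:
            issues.append(f"missing property: {name}")
        elif not isinstance(record[name], expected_type):
            issues.append(f"wrong type for {name}: expected {expected_type.__name__}")
    return issues

# Example payload: one complete record and one faulty record
records = [
    {"tag": "P-101", "description": "Feed pump", "discipline": "Mechanical", "revision": "B"},
    {"tag": "V-201", "description": "Isolation valve", "discipline": "Piping"},  # missing revision
]

for rec in records:
    print(rec.get("tag"), validate_record(rec))
```

Running a check like this on the first test exchanges surfaces gaps while they are still cheap to correct, before downstream stakeholders depend on the data.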
The benefits of this up-front work will be felt by all the involved stakeholders and will reduce risks to the quality and timeliness of delivery. Besides the obvious reporting benefits, interoperable data may also unlock innovative use cases that bring further value, such as forward-looking performance indicators, insights, and predictive analytics.
To ensure quality rigor, every data-centric project requires accurate source data (digital information), code (programming instructions), and tags (types of assets), which are used to track, review, and share progress. That said, the best source of work-process data is the engineering disciplines themselves, as they possess the knowledge and experience to ensure proper data development, formatting, accuracy, completeness, consistency, and validation.
When planning a data-centric digital delivery project, one must consider the full lifecycle of an asset. It is not just about engineering and construction, but about designing with the end in mind. The benefit of this approach comes downstream, as it leads to predictive analysis, optimized costs, and the ability to identify and quickly mitigate risks and safety issues in operations. A real-world example is the ability to predict when a pump in operation is going to fail, take it offline during a planned maintenance activity, and effect repairs well ahead of failure.
The information generated from the predictive analysis prevents unplanned downtime, reducing idle time and maximizing productivity, thus resulting in sustainable and predictable operations.
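As a toy illustration of the pump example, a rising vibration trend can be extrapolated to estimate when an alarm level will be reached, so the repair can be scheduled inside a planned maintenance window. The readings, alarm level, and linear model below are all invented assumptions; real predictive analytics would use richer condition data and models:

```python
# Illustrative sketch of trend-based failure prediction for a pump,
# using an ordinary least-squares linear fit on vibration readings.
# The data and alarm threshold are made up for this example.

def linear_fit(xs, ys):
    """Return (slope, intercept) of the least-squares line through the points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x

days = [0, 7, 14, 21, 28]                   # inspection days
vibration_mm_s = [2.0, 2.3, 2.7, 3.0, 3.4]  # steadily rising readings
ALARM_LEVEL = 4.5                           # hypothetical vibration limit

slope, intercept = linear_fit(days, vibration_mm_s)
days_to_alarm = (ALARM_LEVEL - intercept) / slope
print(f"Projected to reach alarm level in about {days_to_alarm:.0f} days")
```

With roughly fifty days of warning in this toy case, the repair can be folded into the next planned shutdown rather than forced by a failure.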
A vitally important initial step in a successful data-centric delivery project is the development of master data, the single source of truth that can be published to other environments. For example, if you have trusted master data in a piping and instrumentation diagram (P&ID), you can confidently push that information to other data fields in the project. This is critical, as incorrect or inconsistent source data, if published to the common data environment (CDE), will populate multiple data fields and create errors that could lead to sub-optimal project delivery (inefficiencies) or, even worse, significant injuries in construction and operations.
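One way to guard the CDE against bad source data is to gate the publish step so that only verified, complete master records are pushed through. This is a minimal sketch under assumed structures – the `pid_master` and `cde` dictionaries and their fields are hypothetical stand-ins for real P&ID master data and a real CDE:

```python
# Sketch of a gated publish step: only master records that pass basic
# completeness and verification checks are copied to the common data
# environment (CDE). All names and fields here are illustrative.

pid_master = {
    "P-101": {"service": "Feed pump", "line": "L-1001", "verified": True},
    "P-102": {"service": "", "line": "L-1002", "verified": False},  # incomplete, unverified
}

cde: dict[str, dict] = {}

def publish(master: dict, target: dict) -> list[str]:
    """Copy verified, complete records into the target; return rejected tags."""
    rejected = []
    for tag, data in master.items():
        if data["verified"] and all(data.values()):
            target[tag] = data
        else:
            rejected.append(tag)
    return rejected

rejected = publish(pid_master, cde)
print("published:", sorted(cde), "rejected:", rejected)
```

Rejected records go back to the owning discipline for correction instead of silently propagating errors into every downstream data field.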
Without well-defined, quality work processes in digital project delivery, you risk several potential issues, including:
- Lack of clarity: It can be difficult to establish clear roles and responsibilities for team members, leading to confusion and delays in the project.
- Inefficiency: Team members may not know how to approach tasks and may waste time trying to figure out what to do, leading to decreased productivity and increased project costs.
- Quality issues: Team members may not know what standards to follow or what steps to take to ensure quality, leading to subpar work and potential rework.
- Communication issues: Team members may have difficulty communicating new ways of working with one another in the data-centric environment, leading to misunderstandings and errors.
- Lack of accountability: Difficulty holding team members accountable for their work, leading to potential finger-pointing and lack of ownership over project outcomes.
The key to all digital delivery is accuracy and consistency, which ultimately drives efficiency and leads to satisfied clients. Clients want projects delivered with a high degree of accuracy from start to completion. They want minimal errors and a project that is delivered on schedule and on budget.
Digital project delivery can provide all that and more. One of its key advantages over the traditional method is that clients can virtually follow the design process through interactive design reviews and self-serve access to the CDE. A long-standing challenge for clients has been understanding operability and maintainability requirements through progressive design reviews. Design reviews using virtual representations allow clients, wearing virtual reality headsets, to conduct "walk-throughs" of an operating facility and get a simulated, lifelike "feel" for the physical parameters of the valves and pumps from a maintainability and operability perspective. This leads to a better appreciation of the maintainability, accessibility, and reliability of the final deliverable, and thus reduces regret work and unrealized production ramp-up targets.
There is ever-increasing demand from clients to develop a digital representation of their assets (a digital twin) to gain higher levels of efficiency, predictability, and optimization. This can only be realized through the development of integrated operations, underpinned by digital asset management and advanced process control, to achieve higher levels of automation, digitization, and predictive maintenance.
The common thread that weaves through all levels of automation and control is the data correlated to the tag (asset) and propagated across the CDE.
The CDE is the integrator and consolidator of project data (one source of truth). The relationships developed in the CDE provide a single point of access to the digital assets, which improves integration with construction and operations environments via cloud-based infrastructure. The digital asset is enabled by downstream data models developed in engineering by correlating design components (tags) to documents.
Creating an engineering data model in this way forms the foundation of the digital asset and enables downstream building information modeling (BIM) processes (3D to 7D) to be utilized. This allows clients to gain real-time insights into all areas of a project throughout its lifecycle.
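At its core, the tag-to-document correlation described above is a simple relationship that can be sketched as a data structure. The class and the tag and document numbers below are invented for illustration, not a real engineering data model:

```python
# Minimal sketch of the tag-to-document correlation that underpins an
# engineering data model: each asset tag knows which documents describe it.
# Tag and document numbers are hypothetical examples.

from dataclasses import dataclass, field

@dataclass
class Tag:
    number: str                              # asset tag, e.g. "P-101"
    documents: set[str] = field(default_factory=set)

    def link(self, doc_number: str) -> None:
        """Correlate a design document to this asset."""
        self.documents.add(doc_number)

pump = Tag("P-101")
pump.link("PID-0001")  # P&ID showing the pump
pump.link("DS-0456")   # pump datasheet

print(pump.number, sorted(pump.documents))
```

Scaled up across a project, these relationships are what let a CDE answer, for any asset, "which documents define this?" and, for any document, "which assets does this affect?"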
At Hatch, our global staff of cross-sector experts leverage digital tools and collaborative work processes to provide innovative solutions that exceed our clients’ expectations. We’re eager to share our unparalleled knowledge and expertise with clients who want to optimize all areas of their digital project delivery, including design, construction, scheduling, cost, sustainability, and facility management, and most importantly, with a higher degree of quality and safety.
For more information about Hatch's Digital Project Delivery services, see Disruptive innovation today for the digital future.
Global Discipline Director, Information Management
With 32 years of experience in the mining and metals industry, Sherwin is a respected leader. He spearheaded the engineering development of the first digital underground potash mine, achieving unprecedented integration. His expertise drives successful projects. He holds an Executive MBA from Queen's University and a bachelor's degree in mechanical engineering from the University of the West Indies.