In the world of large industrial facilities, chemical plant owners lose roughly $6 billion every year because of how data is managed. Given the current state of technology, one would expect this to be a problem of the past.
“Digital Twin” is the industry’s new buzzword: a set of technologies that provide an exact, finely detailed digital replica of a facility. The technology is often touted as the magic that will solve every process problem in a plant.
The challenge, however, is that to utilize this technology, data needs to be aligned throughout the enterprise. The reality is that how we create, consume, and manage data is a problem that grows more complicated every year.
Aging technology and facilities, poorly adopted standards and system integrations, and antiquated regulations and requirements all hamper effective and efficient processes. A Best Documented Asset strategy seeks to eliminate these unnecessary costs with a holistic and realistic approach.
What is poor data?
To understand the impact on a facility, let us first take a basic look at the different cost aspects of having poor data. The topic is often discussed in terms of capital projects and the related challenges of how data is exchanged. It is not easy to precisely quantify the cost of poor data, and this is where a classification framework can help companies understand the sources of these costs from different aspects.
One of the aspects is “data quality dimensions.” These describe attributes such as accuracy, reliability, relevance, and completeness, among others. This aspect can be used to rate a dataset in terms of quality. There are 27 dimensions, which, combined, provide an intrinsic measure of the quality of a dataset. An organization can use this to set priorities for actioning the data.
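To make the idea concrete, the rating can be pictured as a simple per-dimension score rolled up into one number. The sketch below is purely illustrative: the four dimensions shown, the 0–5 scale, and the equal-weight average are assumptions for this example, not part of any formal framework.

```python
# Hypothetical sketch: rating a dataset against a few quality dimensions.
# Dimension names and the 0-5 scale are illustrative assumptions.

def rate_dataset(scores: dict) -> float:
    """Average the per-dimension scores (0-5) into one overall rating."""
    return sum(scores.values()) / len(scores)

# Example: an equipment list kept in a spreadsheet.
equipment_list_scores = {
    "accuracy": 2,      # values are often out of date
    "reliability": 2,   # multiple uncontrolled editors
    "relevance": 5,     # directly drives maintenance decisions
    "completeness": 3,  # some certification dates are missing
}

rating = rate_dataset(equipment_list_scores)
print(f"Overall quality rating: {rating:.1f} / 5")  # → 3.0 / 5
```

In practice an organization would weight the dimensions differently per dataset, but even a crude roll-up like this makes it possible to rank datasets and prioritize remediation.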
A second aspect describes the cost of data quality. First, costs are incurred because of poor quality, which creates the need for verification, re-entry, compensation, wrong decisions, and so on. Second, costs are incurred to prevent poor quality, such as training, monitoring, analysis, and reporting.
This may seem abstract, so I’ll use the example of an owner-operator managing their critical equipment in Microsoft Excel. The spreadsheet lists equipment such as pumps, along with details including maintenance schedules, purchase dates, and certification requirements.
The relevance of this data is very high; however, due to the nature of the tool, reliability and accuracy will likely be low, because multiple people have access to and maintain this information. The cost impact is significant, with mistakes likely waiting to happen. It is easy to imagine a missed maintenance schedule leading to repairs or replacement, which in turn can cause production outages. Larger facilities can have dozens of intertwined data sources, compounding the problem.
A strategy for change
Technical data for facilities are, to a great extent, centered around the reliability of equipment and regulatory requirements. For the people who maintain the facility, accurate, complete, relevant, and reliable information is of vital importance. We see in our daily work that organizations are consistently trying to catch up. The industry is curbing this trend through practical implementations of ISO 15926 (e.g., CFIHOS).
By standardizing the information needed for each type of equipment, organizations can use modern tools to hunt for missing information and gain a more holistic picture of the state of their equipment. Knowing what you do not have has become as important as knowing what you do have. With the supply chain and owners agreeing on this standard, the quality and completeness of the information can be assured upfront and during design, rather than being an afterthought once the project is completed.
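The “hunt for missing information” described above can be sketched as a check of each equipment record against a required-attribute template for its class. The class name and required fields below are assumptions for illustration, not actual CFIHOS reference data.

```python
# Illustrative sketch: finding missing attributes on an equipment record
# by comparing it to a per-class template. The "centrifugal pump" class
# and its required fields are hypothetical, not CFIHOS reference data.

REQUIRED_ATTRIBUTES = {
    "centrifugal pump": [
        "tag", "manufacturer", "design pressure",
        "maintenance schedule", "certification date",
    ],
}

def missing_attributes(record: dict) -> list:
    """Return the required attributes that are absent or empty."""
    required = REQUIRED_ATTRIBUTES.get(record.get("class", ""), [])
    return [attr for attr in required if not record.get(attr)]

pump = {
    "class": "centrifugal pump",
    "tag": "P-101",
    "manufacturer": "Acme",
    "design pressure": "16 bar",
}
print(missing_attributes(pump))
# → ['maintenance schedule', 'certification date']
```

Run across an entire equipment register, a check like this turns “knowing what you do not have” into a concrete, prioritizable work list.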
The $6 billion question
In the introduction, I referred to billions of dollars lost because of poor data.
Research has shown that during the execution of projects, large and small, many facility owners struggle with effectively communicating with their project partners (EPC firms).
On average, the cost of handling documentation is around 0.3% of the total project cost for the EPC firm. However, managing the exact same information on the owner’s side runs around 2 to 4% of the project cost for each project. The cost an owner incurs from receiving incomplete data, drawings, certifications, and more skyrockets as soon as they take ownership. Re-creating or retrieving a document after the project is finished is easily eight times more expensive than during the project.
With a global capital expenditure of around $200 billion for the chemical industry in 2020, a substantial amount is lost to project execution alone. Upon closure of a project, much of the data needs to be integrated with the existing tools in use. However, since these systems are often outdated and poorly aligned, there is an additional cost for integrating the data. The result is often that only the bare minimum is transferred, leaving behind the operations teams who need this data.
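A back-of-the-envelope calculation shows where the headline figure plausibly comes from: applying the 2–4% owner-side information-management cost to roughly $200 billion of annual capex yields $4–8 billion per year, with a midpoint of $6 billion. All inputs are the article’s own rough estimates, not audited data.

```python
# Rough check of the "$6 billion" figure using the article's numbers.
# These are order-of-magnitude estimates, not audited figures.

capex = 200e9                          # global chemical-industry capex, USD (2020)
owner_rate_low, owner_rate_high = 0.02, 0.04   # 2-4% of project cost

low = capex * owner_rate_low           # $4 billion
high = capex * owner_rate_high         # $8 billion
midpoint = (low + high) / 2            # $6 billion

print(f"Estimated annual owner data-handling cost: "
      f"${low/1e9:.0f}B to ${high/1e9:.0f}B (midpoint ${midpoint/1e9:.0f}B)")
```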
With the industry adopting the CFIHOS (Capital Facilities Information Handover Specification) standard, handover of projects should become a gentler ride.
Much work remains for owners to find methods to integrate data end to end and to ensure their asset records represent reality as closely as possible. This, however, is a necessary step to utilize Digital Twin technology in the new normal.
Brian Sallade is the president and CEO of Kinsmen Group. He has built his 30-year career in leadership as CEO, spearheading multiple acquisitions by SNC-Lavalin Group and BlueCielo ECM Solutions, where he also grew revenue more than fivefold.
© 2021 Newsmax Finance. All rights reserved.