Data Quality: Disrupting the Disruption

Utopia Marketing | March 08, 2021 | 3 min read

These are times of radical change in the industry, driven on one hand by the traditional factors we have grown accustomed to adapting to, and on the other by new environmental factors caused by a global pandemic that requires every organization to change, adapt, and pivot. The quality of your data assets is more critical than ever. Highly accurate and complete master data allows an organization to make faster, smarter decisions, cut costs, increase efficiency, and ultimately keep the lights on during tumultuous times.

Often the term data quality is used as a synonym for customer data quality. People think of CRM systems and the flood of duplicate organizations or individual customers in their database that makes daily life in marketing and sales so hard. But customer data quality is just one component of an organization’s data landscape. Modern ERP systems like SAP S/4HANA contain a host of data objects that drive your critical business processes across your organization and beyond, across the whole supply chain.

If you are an asset-intensive organization, customer data is of course important, but even more critical are your production assets: equipment, functional locations, maintenance plans, task lists, and bills of materials.

In large global organizations you find historical, so-called brownfield assets that have been in operation for decades. Often the information on these assets is not documented anywhere or, in the best case, can only be found on paper somewhere in a drawer. To operate such critical assets, you need to ensure the health and safety of your workers, ensure that only trained personnel perform the regular maintenance, and maintain a holistic view to support shutdowns and turnarounds. This is only feasible with a central asset system of record containing up-to-date, complete, and accurate information. Think how easily you can make the wrong decision if your data is missing or inaccurate!

Take a pump with all of its spare parts and technical characteristics. If you don’t know the operating conditions, such as humidity, temperature, or whether it can be operated in a chemical environment, then your support and maintenance engineers have to go onsite (often to remote locations like wind parks) and tediously check what is actually being operated.

A better approach is to collect the data with an intelligent process, e.g. taking pictures of the asset’s nameplate with a mobile device and then deriving all necessary information automatically through machine learning, enriched with web scraping and anonymized crowdsourcing. Another key step to enhance data quality after initial capture is to keep your asset registry clean. This requires that any changes made are consistent, standardized, correct, and approved by your domain experts. With a data governance system like SAP Master Data Governance you can create and maintain all of your critical assets with high data quality. The system enforces data consistency and accuracy through real-time checks, business rules, duplicate checks, and, finally, workflow-driven approval processes.
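To make the idea concrete, here is a minimal sketch of the kind of checks such a governance layer applies before a new asset record is approved. The field names, threshold, and similarity metric are illustrative assumptions, not SAP MDG’s actual rules or APIs:

```python
from difflib import SequenceMatcher

# Hypothetical required attributes for an equipment master record.
REQUIRED_FIELDS = {"equipment_id", "description", "functional_location", "abc_indicator"}


def missing_fields(record: dict) -> set:
    """Return the required fields that are absent or empty in a record."""
    return {f for f in REQUIRED_FIELDS if not record.get(f)}


def is_probable_duplicate(record: dict, registry: list, threshold: float = 0.85) -> bool:
    """Flag a record whose description closely matches an existing entry.

    Uses simple string similarity as a stand-in for a real fuzzy
    duplicate check; production systems use far richer matching rules.
    """
    for existing in registry:
        ratio = SequenceMatcher(
            None,
            record["description"].lower(),
            existing["description"].lower(),
        ).ratio()
        if ratio >= threshold:
            return True
    return False


# Usage: a new record with a near-identical description would be held
# back for review rather than silently creating a duplicate asset.
registry = [{"description": "Centrifugal pump, KSB Etanorm 50-200"}]
new_record = {
    "equipment_id": "EQ-1001",
    "description": "Centrifugal Pump KSB Etanorm 50-200",
    "functional_location": "PLANT-A/PUMPS",
    "abc_indicator": "A",
}
print(missing_fields(new_record))                      # no gaps in this record
print(is_probable_duplicate(new_record, registry))     # likely duplicate
```

In a real governance workflow, a record that fails either check would be routed to a domain expert for approval instead of being written directly to the asset registry.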

Given the dramatic changes across industries worldwide, organizations are evaluating their processes, how they operate, and how tasks can be made contactless and executed remotely. This is only feasible if your data foundation gives you a holistic view of the reality of your operating environment and assets. If you find empty fields in your database, for example the ABC indicator, how can you determine whether the asset you are looking at is critical or not? If you want to do sophisticated predictive maintenance to find out whether an asset is about to fail, you need to know all the details of that equipment: the material type, technical specifications, operating times, maintenance cycles, historical failures, environmental conditions, and much more. Without this data foundation in place, how can we expect sophisticated machine learning processes to derive correct predictions? How can we implement next-generation automation initiatives like Industrie 4.0 or IIoT? Simply put, we can’t.
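The dependency of predictive maintenance on complete data can be sketched as a simple completeness gate: before an asset is handed to a prediction model, check that its required attributes are actually populated. The feature list and threshold below are hypothetical examples, not a real predictive-maintenance feature set:

```python
# Hypothetical attributes a failure-prediction model would need per asset.
FEATURES = [
    "material_type",
    "operating_hours",
    "maintenance_cycle_days",
    "historical_failures",
    "ambient_temperature",
]


def completeness(asset: dict) -> float:
    """Fraction of required features that are populated (not None)."""
    present = sum(1 for f in FEATURES if asset.get(f) is not None)
    return present / len(FEATURES)


def ready_for_prediction(asset: dict, minimum: float = 1.0) -> bool:
    """Only fully described assets are passed to the prediction model."""
    return completeness(asset) >= minimum


# Usage: one empty field is enough to disqualify an asset.
pump = {
    "material_type": "centrifugal pump",
    "operating_hours": 12000,
    "maintenance_cycle_days": 90,
    "historical_failures": 2,
    "ambient_temperature": 18.5,
}
print(ready_for_prediction(pump))  # complete record passes the gate
```

The design point is the same one the paragraph above makes: no amount of model sophistication compensates for an asset record with empty fields, so the gate belongs in front of the model, not behind it.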

In times of change, organizations need to convert to a robust and resilient operating mode to survive. Digital transformation is required not only to assess current processes but equally to clean up historically grown, distributed, and cluttered data systems. It has always been important to invest in data quality, but now it is the foundation of automation, innovation, and the ability to stay ahead of the competition. If you don’t have your data under control, then you don’t have your organization under control.

Contact us today for a 15-minute discussion with one of our subject-matter experts.

SCHEDULE A CALL