The value of real-time and historic data in manufacturing

Thomas Degen, solutions engineer industries at KX, discusses how combining real-time and historic data can improve the manufacturing process

Do you remember when you first saw high-definition television? The richness of colour, the greater depth and contrast of images? Manufacturing is undergoing its own 4K moment, driven by the ability to better capture and analyse the petabytes of data that are being created daily in a modern manufacturing environment.

To gain this deeper understanding of the manufacturing process, the quality of data is critical. Yet challenges exist in how to capture, process, analyse and act on the insights contained within massively increased volumes of data.

An ever-growing IoT datasphere

It’s clear that manufacturers are facing a deluge of data. The global datasphere from Internet of Things (IoT) and Industrial Internet of Things (IIoT) devices is predicted to grow to 79.4 zettabytes by 2025. Manufacturers are looking to monitor processes, equipment and their environment to understand both what is happening now and what has occurred in the past.

When processes involve conditions that change in milliseconds, such as fault detection and classification (FDC) in modern semiconductor manufacturing, you need to observe at much higher frequency and resolution and, crucially, take the right action in as short a time window as possible. The challenge for many manufacturers is that existing technologies, infrastructure and solutions are simply not up to the task.
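To illustrate the kind of time-sensitive check involved, a minimal sketch (hypothetical sensor values and thresholds, not any vendor's actual FDC implementation) might flag readings that drift outside a rolling statistical band:

```python
from collections import deque
from statistics import mean, stdev

def detect_faults(readings, window=50, sigma=3.0):
    """Flag readings deviating more than `sigma` standard deviations
    from the rolling mean of the last `window` samples."""
    history = deque(maxlen=window)
    faults = []
    for i, value in enumerate(readings):
        if len(history) >= 2:
            mu, s = mean(history), stdev(history)
            if s > 0 and abs(value - mu) > sigma * s:
                faults.append((i, value))
        history.append(value)
    return faults

# A steady signal with one spike: only the spike is flagged.
signal = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 5.0, 1.0, 1.1]
print(detect_faults(signal, window=5))  # [(6, 5.0)]
```

In a real FDC pipeline this logic would run at the edge, against millisecond-resolution streams rather than a Python list, but the shape of the check is the same.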

Updating legacy systems

The first step organisations should take is to audit their existing infrastructure, as well as their data management and analytics systems. With increased volumes of data expected to flow through those systems via multiple new touchpoints, outdated tools will need to be updated or replaced.

Another point to consider is the type of data expected to be generated and analysed. For example, if a manufacturing firm is looking to use machine learning to run complex automated actions, it will require as much rich historical data as possible. This data is used to train the underlying artificial intelligence (AI) models and algorithms needed to accurately inform the automation at runtime. Comparing newly generated data with this historical context is also required to produce true operational intelligence.

Automate to accelerate

Streaming analytics is the technology that enables this automation, allowing manufacturers to collect and analyse data in real time at the edge of their network as well as in the data centre, while comparing it to historical records and context. The potential benefits such technologies can afford manufacturers should not be overlooked. Not only can AI be used reactively, it can also power trained models running in real time to identify events and anomalies before they occur.
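The core pattern here, comparing a live reading against a baseline built from historical records, can be sketched in a few lines of Python. The sensor names, bands and margin below are illustrative assumptions, not a description of any particular streaming-analytics product:

```python
def build_baseline(history):
    """Summarise historical readings per sensor as (min, max) operating bands."""
    baseline = {}
    for sensor, value in history:
        lo, hi = baseline.get(sensor, (value, value))
        baseline[sensor] = (min(lo, value), max(hi, value))
    return baseline

def check_event(event, baseline, margin=0.1):
    """Return True if a live reading falls outside its historical band
    (widened by `margin`), signalling a possible anomaly."""
    sensor, value = event
    if sensor not in baseline:
        return True  # a sensor with no history at all is itself suspicious
    lo, hi = baseline[sensor]
    span = (hi - lo) or 1.0
    return value < lo - margin * span or value > hi + margin * span

history = [("temp", 60.0), ("temp", 65.0), ("temp", 62.0)]
baseline = build_baseline(history)
print(check_event(("temp", 63.0), baseline))  # False: within the historical band
print(check_event(("temp", 90.0), baseline))  # True: far outside it
```

In production the baseline would be far richer (per-shift, per-recipe, per-machine), but the principle is the same: the historical context gives the real-time check its meaning.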

However, when implementing automation, operators need to be careful. Human-machine collaboration is key: without it, the model being applied may start making incorrect recommendations based on new runtime conditions that were not included in its original scope. Hooks for human intervention should be built into every piece of automation put in place.

The correct checks and processes should be put in place when building a trained model, and when deciding where it should be applied. These fail-safes are needed not just to ensure efficiency and product quality, but also to protect the health and safety of those working within the business.
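One way such a hook can work, sketched here with hypothetical names and a made-up confidence threshold, is to auto-apply a recommendation only when the model is confident and the inputs resemble its training conditions, and otherwise escalate to an operator:

```python
def act_or_escalate(recommendation, confidence, in_training_range, threshold=0.9):
    """Apply a model's recommendation automatically only when it is confident
    AND the inputs resemble what the model was trained on; otherwise route
    the decision to a human operator."""
    if confidence >= threshold and in_training_range:
        return "auto-apply"
    return "escalate-to-human"

print(act_or_escalate("reduce_speed", 0.95, in_training_range=True))   # auto-apply
print(act_or_escalate("reduce_speed", 0.95, in_training_range=False))  # escalate-to-human
```

The escalation branch is the fail-safe the text describes: conditions outside the model's original scope never trigger an automated action unreviewed.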

Predict the future

The availability of rich data pools, coupled with streaming analytics software, enables ‘microsecond’ decision-making that lets high-tech manufacturing organisations implement automation for better control of manufacturing processes, increased uptime, yield and revenue.

The opportunity is huge. To truly benefit, however, manufacturers will need the right data quality and the skills to use these technologies effectively as they continue to tackle the ongoing data deluge and drive continuous operational intelligence.

