Industry 4.0 has gone quickly from a buzzword to a standard operating strategy for businesses, including manufacturers. If you aren’t at least thinking about how to leverage data to improve your manufacturing processes, you risk being left behind.
But the implementation of Industry 4.0 in a plant or factory is not an end-point or checklist exercise. Rather, it is an ever-evolving philosophy that will continue to incorporate new concepts and to branch out into different, unexpected directions. So what comes next?
First, let’s recap where we are now. Different organizations are at different stages, but one thing is true for all those on an Industry 4.0 journey: they are trying to sift, sort, discard, rank and, most importantly, analyze the data coming from their production lines.
Within the process control environment, a huge amount of data is being produced – frankly, too much for any one person to make sense of. As a result, there is a real danger that operators are blinded by a blizzard of information and miss real opportunities for process and quality improvement.
The answer to this problem lies in augmenting human expertise with artificial intelligence (AI), not only to analyze but also to predict and ultimately prescribe actions. Correctly predicting what will happen next in any given situation, and prescribing corrective action to prevent it, is extremely valuable if it can be done affordably, at scale, and early enough to meaningfully change outcomes, such as eliminating wastage and scrap. In short, the goal of AI is to leverage Big Data to radically lower the cost of “non-quality” by prescribing optimal parameter settings to operators.
This commoditization of our ability to predict the future at scale is, unsurprisingly, revolutionizing every discipline it touches. The most obvious examples are Google, Facebook and Amazon, where AI has radically improved the applicability and efficacy of those platforms’ mass-market advertising.
In this example, AI sifts through massive quantities of online, seemingly unrelated behavioral data generated by billions of users. It then predicts, in real time, the best answers to search queries or the most useful ads to display for each individual user.
The same process can be applied to manufacturing data. Real-world production lines always include multiple process flows, some in parallel, some sequential, all more or less impacting each other and each requiring a specific combination of production variables to achieve optimal efficiency and lowest wastage.
In these complex manufacturing environments, AI ingests historical production and quality data from multiple sources to provide a unified view. It is then possible to process this data in real time, through AI algorithms, to pre-empt potential quality problems downstream and proactively prescribe corrective actions to prevent variance in quality.
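To make that pattern concrete, here is a minimal Python sketch of the ingest-unify-score loop. All file and column names are hypothetical, and this is an illustrative simplification, not any vendor’s actual pipeline:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical sources: per-part sensor logs and quality inspections.
sensors = pd.read_csv("sensor_readings.csv")      # part_id, temp, pressure, speed
quality = pd.read_csv("quality_inspections.csv")  # part_id, defective (0/1)

# Unified view: one row per part, process variables joined to outcomes.
history = sensors.merge(quality, on="part_id")

features = ["temp", "pressure", "speed"]
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(history[features], history["defective"])

# Near-real-time scoring: estimate defect risk for parts still in
# production, so operators can act before downstream inspection.
in_flight = pd.read_csv("current_readings.csv")
in_flight["defect_risk"] = model.predict_proba(in_flight[features])[:, 1]
print(in_flight.sort_values("defect_risk", ascending=False).head())
```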
This is not theoretical. It is now possible to use advanced forms of supervised and unsupervised machine learning to create a digital twin of a plant and to discover the optimal operating regime for complex, multi-step industrial processes.
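A toy illustration of combining the two learning modes, on synthetic data with hypothetical variable names: unsupervised clustering groups historical readings into operating regimes, and a supervised quality model then ranks those regimes.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic history: two process variables and a measured quality score,
# with a single best operating point at roughly (0.5, -0.3).
X = rng.normal(size=(2000, 2))  # e.g. furnace temp, line speed (scaled)
quality = -(X[:, 0] - 0.5) ** 2 - (X[:, 1] + 0.3) ** 2 + rng.normal(0, 0.1, 2000)

# Unsupervised step: group historical readings into operating regimes.
regimes = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X)

# Supervised step: learn how process settings map to quality.
model = GradientBoostingRegressor(random_state=0).fit(X, quality)

# The recommended regime is the cluster whose centre the model scores best.
centres = regimes.cluster_centers_
best = centres[np.argmax(model.predict(centres))]
print("Recommended operating point (temp, speed):", best.round(2))
```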
Traditional statistical methods struggle with the quantity, rate and diversity of data in large-scale manufacturing processes. Where statistical process control strains to account for the unintended consequences that typically propagate through large, complex production lines, AI algorithms can analyze these cascade effects, make successful predictions, and suggest operating parameters that avoid unintended consequences and optimize quality.
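The synthetic sketch below illustrates why: each variable looks unremarkable on its own, yet defects are driven by an interaction between an upstream and a downstream setting that a multivariate model can learn.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Two stages of a line; each variable individually looks "in control".
upstream_temp = rng.normal(0, 1, 5000)
downstream_speed = rng.normal(0, 1, 5000)

# Defects arise only from the combination of the two stages.
defect = (upstream_temp * downstream_speed > 1.2).astype(int)

# Univariate view: neither variable's mean shifts noticeably on defects,
# so per-variable control charts would see nothing unusual.
for name, x in [("temp", upstream_temp), ("speed", downstream_speed)]:
    print(name, "mean on defective parts:", round(x[defect == 1].mean(), 3))

# A multivariate model captures the cross-stage interaction directly.
X = np.column_stack([upstream_temp, downstream_speed])
scores = cross_val_score(RandomForestClassifier(random_state=0), X, defect, cv=3)
print("defect-prediction accuracy:", round(scores.mean(), 3))
```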
The fine-tuning of production parameters has traditionally been performed at cell or machine level, in relative isolation from upstream and downstream variables, which has made it almost impossible to identify the impact of upstream changes or halt the propagation of problems further down the line.
AI routines enable controls to be set based on the relative impact of every cell or machine in a production line. By simultaneously leveraging data from multiple sources, it is possible to achieve a uniquely holistic perspective against which to set production parameters. The goal is the optimal operating state for a production line, effectively reducing the chance of defects occurring at every point in the production process.
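One hedged way to sketch both steps, ranking the relative impact of each cell and then searching for a holistic operating point, is shown below; the table and column names are hypothetical:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Hypothetical unified table: one settings column per cell/machine,
# plus the final quality outcome for each unit.
data = pd.read_csv("line_history.csv")  # cell1_temp, cell2_pressure, cell3_speed, defective
features = ["cell1_temp", "cell2_pressure", "cell3_speed"]

model = RandomForestClassifier(random_state=0)
model.fit(data[features], data["defective"])

# Relative impact of each cell on final quality.
imp = permutation_importance(model, data[features], data["defective"], random_state=0)
for name, score in zip(features, imp.importances_mean):
    print(f"{name}: relative impact {score:.3f}")

# Holistic setting: score candidate combinations across all cells at once
# and keep the one with the lowest predicted defect probability.
candidates = [np.linspace(data[f].min(), data[f].max(), 10) for f in features]
grid = pd.DataFrame(
    np.array(np.meshgrid(*candidates)).reshape(len(features), -1).T,
    columns=features,
)
grid["risk"] = model.predict_proba(grid)[:, 1]
print(grid.nsmallest(1, "risk"))
```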
However, the proof of AI’s ability to revolutionize manufacturing processes comes from real-world cases. For example, at DataProphet we have been working in recent years with one of the largest automotive suppliers in the Southern Hemisphere, a manufacturer of Daimler engine blocks. The plant had significant problems with high scrap and rework rates, which ultimately drove up its costs.
The solution was to gather 15 months of production data from all parts of the organization, often in varying formats, from Excel files to Access databases, and then use predictive modelling to identify optimal operating parameters and flag engine blocks that would go on to be defective.
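The project’s actual data and models are proprietary, but the general shape of such a pipeline might look like the following sketch. File and column names are hypothetical, and reading Access databases directly typically requires an ODBC driver, so a CSV export is assumed here:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Hypothetical exports covering 15 months of production history.
casting = pd.read_excel("casting_parameters.xlsx")  # block_id + process variables
results = pd.read_csv("inspection_export.csv")      # block_id, scrapped (0/1)

history = casting.merge(results, on="block_id")
X = history.drop(columns=["block_id", "scrapped"])
y = history["scrapped"]

# Hold out data to check the model generalizes before trusting its flags.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Flag blocks likely to be scrapped before they reach final machining.
print("held-out accuracy:", round(model.score(X_test, y_test), 3))
```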
The end result was a 50% scrap-rate reduction in the first month of operation; a 0% external scrap rate within three months; and an annualized saving of $1 million. In fact, following the deployment, for the first time in the history of the company not a single defective casting was produced.
AI solutions allow users to map an entire manufacturing process by analyzing multi-source historical and real-time production-line data, offering operators a holistic view of the production and quality data relating to every unit produced. Consequently, users can review reports prescribing optimal parameter changes and compare the efficacy of current operating parameters with past parameters.
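As a simple illustration of such a report, assuming a hypothetical unified history table, one could compare scrap rates across past operating windows alongside the average settings that produced them:

```python
import pandas as pd

# Hypothetical unified history: per-unit settings, outcome and time window.
history = pd.read_csv("line_history.csv")  # week, cell1_temp, ..., defective

# Efficacy of current vs. past parameters: scrap rate per operating window,
# shown alongside the average settings that produced it.
report = history.groupby("week").agg(
    scrap_rate=("defective", "mean"),
    avg_temp=("cell1_temp", "mean"),
)
print(report.sort_values("scrap_rate").head())
```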
Frans Cronje is the managing director of DataProphet, the developer of the OMNI artificial intelligence platform for manufacturing. Contact him via LinkedIn.