It has been reported that around 90 per cent of the data generated in a manufacturing plant typically doesn’t get used to build insights that could really help the business. While this is concerning, it’s not surprising, since data is often siloed and unavailable to the people who could benefit from it most. In this article, Beth Ragdale, Product Manager at automation specialist Beckhoff UK, explains the value of consolidating production line data.
We are already in an era of big data, yet the total volume of data is estimated to double every two years. This colossal growth threatens to undermine our capacity to derive value from the data, especially in the world of manufacturing, where companies expect data volumes to grow more than in any other industry.
The Era of Big Data
The potential of this data growth is enormous. Sensors can detect recurrent failure patterns; models can resolve bottlenecks and optimise processes; analytics insights can improve sustainability. A joint study carried out by Honeywell and KRC found that harnessing big data analytics can lower breakdowns by up to 26 per cent and reduce unscheduled downtime by nearly a quarter.
However, without the correct data architecture in place, the opportunity is lost. This is why many forward-thinking manufacturers are already moving away from the era of big data and talking instead about smart data.
While big data in manufacturing comes with a number of challenges, perhaps the leading issue is that, for as long as the data is captured in silos, nobody is able to grasp the bigger picture. This is why, according to an often-cited report from Forrester Research, 73 per cent of all data collected in an organisation goes unused.
Data silos result in fragmented information, preventing a comprehensive view of operations. When data is confined to a specific department or system, it becomes challenging to gain a holistic understanding of the entire manufacturing process. All the elements of a manufacturing process are interconnected, so isolating data in silos hinders the ability to optimise processes and identify areas for improvement.
Implementation of Data Governance Practices
Overcoming this challenge requires a strategic approach involving the integration of systems, the adoption of standardised communication protocols where possible and the implementation of comprehensive data governance practices. Technology like TwinCAT is key to this, as it allows all hardware and software on a production line to communicate via an open range of protocols, enabling everyone to see the data they need.
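The practical payoff of breaking down silos can be sketched in plain Python. In this illustrative example (the department names and field names are hypothetical, not drawn from any real Beckhoff system), records exported separately by production and quality systems are merged on a shared batch identifier so that one unified view exists per batch:

```python
from collections import defaultdict

def consolidate(*silos):
    """Merge per-department record lists (each a list of dicts sharing a
    'batch_id' key) into one combined view per batch."""
    merged = defaultdict(dict)
    for silo in silos:
        for record in silo:
            merged[record["batch_id"]].update(record)
    return dict(merged)

# Hypothetical exports from two isolated systems
production = [{"batch_id": "B1", "units": 480}, {"batch_id": "B2", "units": 455}]
quality = [{"batch_id": "B1", "defects": 3}, {"batch_id": "B2", "defects": 9}]

view = consolidate(production, quality)
# view["B2"] now combines both departments' data in one place
```

Only once the figures sit side by side can anyone ask cross-departmental questions, such as whether defect rates track output volume.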
In the shift from the era of big data to the world of smart data, the ability to use real-time data insight will also be key. Previously, there has been more of an emphasis on spotting patterns in historical data. However, the future of manufacturing will belong to those who can gather and utilise real-time data. This becomes a greater competitive advantage as stable product lines become less common and the demand for flexibility in manufacturing increases.
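The difference between historical and real-time analysis can be illustrated with a minimal monitor that reacts to each reading as it arrives, rather than mining a stored dataset later. This is a generic sketch (the window size and drift limit are arbitrary assumptions), not any vendor's implementation:

```python
from collections import deque

def make_monitor(window=5, limit=2.0):
    """Return a callback that flags a reading when it drifts more than
    `limit` from the rolling mean of the previous `window` readings."""
    history = deque(maxlen=window)

    def on_reading(value):
        # Compare against the rolling mean of past readings, if any exist
        drift = abs(value - sum(history) / len(history)) if history else 0.0
        history.append(value)
        return drift > limit

    return on_reading

monitor = make_monitor(window=3, limit=2.0)
alerts = [monitor(v) for v in [10.0, 10.2, 9.9, 15.0, 10.1]]
# The 15.0 reading drifts well beyond the rolling mean and is flagged
```

Because the decision is made per reading, an operator can intervene during the run; batch analysis of the same numbers would only explain the fault after the fact.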
Technologies like EtherCAT will be key to fulfilling this promise. EtherCAT has been widely adopted in the automation world thanks to its speed, simplicity and efficiency, with the EtherCAT Technology Group now counting over 7,000 members. All devices on an EtherCAT network can be connected, collect information and respond in real time over the Ethernet backbone, enabling seamless data flow within a flexible network. Real-time insight is the holy grail of industrial connectivity, and EtherCAT architecture facilitates that.
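The principle behind EtherCAT's efficiency can be pictured with a toy model: a single frame travels through every device in the chain, and each device reads its commands from, and writes its sensor data into, its own slice of the shared process image. The code below is purely illustrative (the offsets, sizes and values are invented), not a real protocol implementation:

```python
# Toy model of EtherCAT-style "processing on the fly": one frame visits
# every device, and each device exchanges only its own slice of it.

def cyclic_exchange(frame, devices):
    """Pass one frame through a chain of devices; each device copies its
    inputs out of the frame and writes its outputs back into it."""
    for dev in devices:
        start, end = dev["offset"], dev["offset"] + dev["size"]
        dev["inputs"] = frame[start:end]   # commands from the master
        frame[start:end] = dev["outputs"]  # device data back to the master
    return frame

devices = [
    {"offset": 0, "size": 2, "outputs": [7, 8], "inputs": None},
    {"offset": 2, "size": 2, "outputs": [3, 4], "inputs": None},
]
frame = [0, 1, 0, 1]  # master's outgoing process image
returned = cyclic_exchange(frame, devices)
# One pass served every device: returned == [7, 8, 3, 4]
```

Because everything rides in one frame per cycle, bandwidth overhead stays low and every device's data arrives at the master together, which is what makes deterministic, real-time data flow practical.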
Consolidating production line data will be a major focus for investment in the coming years. Analysis from ABI Research estimates that manufacturers will spend $20 billion to transform and support data analytics by 2026. With the right data architecture in place, the continuous growth in manufacturing data can be a strategic asset. The key will be utilising technologies that prevent data from becoming trapped in silos.
For more information: www.beckhoff.com