The image of a needle in a haystack doesn’t begin to portray the challenge of big data in manufacturing. The depth and breadth of data produced in even small manufacturing environments is as exciting as it is overwhelming. Every insight potentially has value, but finding the right one at the right time within an always-expanding volume of data often proves challenging.
A better analogy is to think of useful data as a signal amid a sea of noise. Just one or two metrics can provide invaluable business intelligence, but they often get buried by the irrelevant information that surrounds them. As a result, effective data management in a manufacturing environment is a time- and labor-intensive process that frequently produces underwhelming outcomes or ongoing disruptions.
At this point, no manufacturer can afford to dismiss the potential of big data. According to the McKinsey Global Institute, systems based on the Industrial Internet of Things (IIoT) are estimated to create $4 trillion to $11 trillion in new economic value for manufacturers by 2025. Growth on that scale has few parallels since the Industrial Revolution, underscoring just how transformative this technology is.
In a study of manufacturers conducted by the MPI Group, 76% of respondents reported plans to increase their use of smart devices over the next two years, while 66% planned to increase their investment in IIoT-enabled products over the same period. With the scale of big data poised to grow significantly, manufacturers will need to be proactive about separating the critical signal from the noise.
Improving Data Management in 4 Steps
By rethinking how they handle data management, manufacturers of all sizes — and across all industries — can find more of the in-depth, on-time insights big data is supposed to reveal.
Here are four steps that will jump-start this transformation:
- Recognize human limits and the burden of isolation. Any revision to a data management strategy has to acknowledge the most common and consequential challenges.
First and foremost are the limits of human scale. Given the volume and velocity of data that modern plants produce, it’s simply not possible for a human data team to glean actionable insights in a timely manner. Monitoring data in isolation further limits its utility: large-scale interconnected processes need to be monitored holistically, in the context of the entire system, rather than stream by stream.
- Forget the traditional supply chain cycle. In the past, complex supply chains were typically seen more as impediments than advantages. Thanks to the IIoT, however, that complexity can now be leveraged to produce actionable data from every step of the cycle.
Locating materials, servicing equipment, and monitoring the productivity and efficiency of the system as a whole are all improved significantly thanks to new data sources. Today’s data managers need to realize that the complexity of data-driven systems is exactly what gives them order and function.
- Embrace the cognitive potential of analytics. Production programs like Lean and Six Sigma have successfully reduced the uncertainty in manufacturing, but they have failed to eliminate it entirely. Making timely, targeted corrections remains a major challenge that directly degrades yields.
Cognitive predictive maintenance provides the kind of granular, up-to-date information that’s essential for optimizing processes, streamlining workflows, and predicting maintenance needs. If and when a problem is detected, it can be corrected in a way that maximizes improvements while minimizing disruptions and downtime.
- Focus on the cost of warranty claims. Standardizing production processes in order to limit the number of warranty claims is a challenge for all manufacturers. Hortonworks estimates that up to 3% of the annual revenue in automobile manufacturing is spent paying warranty claims alone.
Focusing on the metrics that reveal where standardization breaks down can dramatically reduce this expense while improving the reputation of the brand and the loyalty of the customer base. Rather than hemorrhaging money on a reactive approach to defects, manufacturers can save on warranty claims through predictive maintenance.
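The cognitive and predictive techniques described above come in many forms, but the core mechanic — flagging sensor readings that fall outside their recent pattern so maintenance can be scheduled before a failure — can be sketched with a simple rolling statistical check. This is a minimal illustration, not a description of any particular product; the window size, threshold, and vibration figures are all assumptions for the example.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Flag readings that deviate from the trailing window's mean
    by more than `threshold` standard deviations."""
    history = deque(maxlen=window)  # rolling window of recent readings
    flagged = []
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                flagged.append(i)  # candidate for inspection before failure
        history.append(value)
    return flagged

# Hypothetical vibration readings: a stable pattern with one spike at index 30.
data = [1.0 + 0.01 * (i % 5) for i in range(30)] + [5.0] + [1.0] * 10
print(detect_anomalies(data))  # [30] — only the spike is flagged
```

Real deployments apply the same idea per sensor and per machine with far more sophisticated models than a z-score, but the principle is the one described above: the correction is made when the deviation is detected, before it becomes downtime.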
Engineering a New Approach to Data Management
In preparation for making sweeping changes to the way a manufacturer gathers, stores, and analyzes data, it’s essential to lay a foundation. When gearing up to take the four steps described above, be sure you know the answers to the following questions:
- What is the business case? Big data only has value when it’s leveraged to achieve specific strategic objectives. A detailed business case for the IIoT and smart data management practices creates a direct link between actions and outcomes.
- Where are the problems? Finding the signal in the noise is only possible by limiting the scope of data analysis. Identifying the metrics that lead directly to improvements in productivity, efficiency, predictive analysis, and problem-solving empowers data managers to separate the relevant from the irrelevant.
- Who are the experts? Machine learning allows systems to evolve and improve through efficient automated processes. Making these processes as effective as possible requires domain experts who can make customized corrections while ensuring that insights can be operationalized.
- When is optimization required? To keep the cost of data management from spiraling out of control, manufacturers must identify the most ineffective or undervalued aspects of the system and direct their efforts accordingly. Focusing on automated closed-loop systems tends to have the greatest impact.
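Limiting the scope of analysis, as the second question suggests, often starts with ranking candidate metrics by how strongly they track an outcome the business cares about. The sketch below uses Pearson correlation against a per-shift defect rate; the metric names and figures are hypothetical.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rank_metrics(metrics, outcome):
    """Order candidate metrics by how strongly they track the outcome."""
    return sorted(metrics,
                  key=lambda name: abs(pearson(metrics[name], outcome)),
                  reverse=True)

# Hypothetical shop-floor metrics recorded over six shifts.
defect_rate = [2, 4, 3, 8, 6, 9]
metrics = {
    "spindle_temp":  [40, 45, 43, 60, 52, 63],  # rises and falls with defects
    "ambient_noise": [70, 68, 71, 69, 70, 72],  # nearly flat, uninformative
}
print(rank_metrics(metrics, defect_rate))  # spindle_temp ranks first
```

The “signal” here is the handful of metrics that land at the top of the ranking; everything below them is the noise the article describes.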
The physical and digital worlds are converging today in ways we only imagined in our wildest dreams. Machines and manual processes that have been part of factories for centuries are disappearing before our eyes, turning into algorithms and automation.
As a result, today’s manufacturers must make a concerted effort to separate the signal from the noise. Doing so will effectively lower costs, improve outcomes, and create a level of predictability and stability that has never before been possible.
Sundeep Sanghavi is a highly accomplished data junkie, innovator, and entrepreneur with more than 20 years of experience in using data as the currency to perform advanced analytics. He has learned to productize data in day-to-day business workflows, which has led to multibillion-dollar savings. Sundeep founded Razorsight, a leading provider of cloud-based analytics solutions for communications service providers. He previously served in management roles at industry leaders, including Cable & Wireless Worldwide and Arthur Andersen, and co-founded DataRPM to tackle the business problem of maintenance inefficiencies in the industrial IoT.