Managers need access to the right information at the right time in order to solve today's most pressing business challenges. Without it, they are flying blind. But simply having this information is not enough. Managers need information in the right business context, as well as the ability to add, refine or "slice and dice" intelligence in any number of ways.
In a perfect world, drilling down into a problem would be possible from the convenience of your own desktop or laptop. In reality, this type of exercise typically requires a formal request to corporate IT, which is often overloaded with similar report-change requests from every other user. The time required to access this information might be days, weeks or even months. What makes it worse is that this process must be repeated until the root cause is found, creating a huge obstacle to quickly addressing the intelligence requirements of today's managers and potentially leading to downtime or lost productivity.
For example, a process engineer might need to know why a production line is suddenly generating more scrap material. This change could be attributed to any number of factors, such as out-of-specification raw materials, resource availability, equipment maintenance, poorly performing employees or simply a process execution issue. With the right data and analytics available to quickly extract meaningful intelligence, the process engineer can investigate all of these factors -- and different combinations thereof -- to find exactly which ones are driving the excess scrap and its variability.
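To make the idea concrete, here is a minimal sketch of that kind of factor-by-factor slicing, using made-up production records (the field names and scrap figures are illustrative, not from any real Apriso system). Averaging scrap across each candidate factor quickly shows which one separates the good runs from the bad ones:

```python
from collections import defaultdict

# Hypothetical scrap records: each production run is tagged with the
# factors the engineer might slice by (all values are invented).
runs = [
    {"material_lot": "A", "shift": "day",   "machine": "M1", "scrap_pct": 2.1},
    {"material_lot": "A", "shift": "night", "machine": "M2", "scrap_pct": 2.4},
    {"material_lot": "B", "shift": "day",   "machine": "M1", "scrap_pct": 6.8},
    {"material_lot": "B", "shift": "night", "machine": "M2", "scrap_pct": 7.2},
]

def mean_scrap_by(factor, records):
    """Average scrap percentage for each value of one factor."""
    groups = defaultdict(list)
    for r in records:
        groups[r[factor]].append(r["scrap_pct"])
    return {value: sum(v) / len(v) for value, v in groups.items()}

# Slice the same data by each candidate factor; the factor whose groups
# differ the most is the leading suspect for the excess scrap.
for factor in ("material_lot", "shift", "machine"):
    print(factor, mean_scrap_by(factor, runs))
```

In this toy data set, slicing by material lot splits the runs cleanly (lot B scraps roughly three times as much as lot A), while shift and machine barely differ -- exactly the kind of conclusion that is easy to reach in minutes with self-service access, and painful to reach through a report-request queue.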
Without systems support to enable better, more detailed understanding of operational performance, managers might simply "go with their gut." In other words, if getting the data requires following a formal process to submit a request for a new query, justify the action by filling out a form, and so forth, some may elect to bypass this extra work.
More resourceful plant engineers might elect to implement a few of their own "decision support" tools, sometimes with the help of a "shadow IT" consultant and typically without the support of their corporate IT department. Raw data can then be massaged in an Excel file, often yielding enough information for more informed decision making. This workaround approach sometimes works and sometimes doesn't - but failures can rarely be understood and successes can seldom be replicated. Even worse, knowledge gained during the process is rarely shared across the enterprise.
A Framework for Success
One way to more efficiently and cost effectively address these challenges is to empower shop floor employees and engineers with greater visibility into manufacturing intelligence and ad hoc reporting from within a tool set that is readily available. With this capability, greater intelligence can be extracted faster, contributing to improved productivity and throughput across the factory. Fortunately, modern IT systems now offer this capability. If you are considering such an upgrade, here are four attributes you should consider in order to maximize the benefits.
- Standardize your business processes and data collection methodologies: Standardized data can be used faster, as no "cleansing" is necessary. And when data can be used instantly, real-time analysis becomes achievable, further accelerating time-to-decision. Clean, readily available data is more intuitive, easier to use and faster to analyze between and among departments. This is especially important when data comes from multiple sources, so you can be confident you are comparing business processes equally and fairly (an apples-to-apples comparison).
- Make sure the analysis is based on your business processes and problems: Data being fed from your execution systems needs to be "transformed" into a structure in which the right information is tracked, and these business variables are linked to the actual transactions being performed. Examples include reported quantities, employee and shift information, product and product-group hierarchies and any other "factors" critical to performing day-to-day activities.
- Embed analytics in your standardized business processes: By embedding analytics within common processes, intelligence gathering can be performed instantly, including the automatic triggering of alerts for common events to further streamline and automate business processes. Notably, this will better synchronize your reporting capabilities, resulting in simplified and accelerated issue resolution. What is the point of showing employees that they performed poorly at the end of their shift? This information needs to be presented as they perform their daily operations, in real or near-real time.
- Expand business processes beyond single departments or locations to be more holistic: If one plant is experiencing an issue, chances are others might be too. By expanding notification and resolution of issues to others in related roles, you can continue the streamlining process to other locations. Everyone should be able to share the same analysis to ensure the knowledge and intelligence is shared, including the vision for how the factory needs to operate and perform in order to best meet future challenges.
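The embedded-analytics point above can be sketched in a few lines. This is purely illustrative -- the threshold, function names and message format are assumptions, not any vendor's API -- but it shows the difference between reporting after the fact and evaluating a rule inline as each transaction is recorded, so the operator is alerted during the shift:

```python
# Illustrative sketch only: an alert rule evaluated as each transaction
# is recorded, rather than in an end-of-shift report.
ALERT_THRESHOLD = 5.0  # assumed scrap-percentage limit; a real system would make this configurable

alerts = []

def record_transaction(line, scrap_pct, notify=alerts.append):
    """Record a production transaction and fire an alert inline if scrap is out of bounds."""
    # ... persist the transaction to the execution system here ...
    if scrap_pct > ALERT_THRESHOLD:
        notify(f"Line {line}: scrap at {scrap_pct}% exceeds {ALERT_THRESHOLD}% limit")

record_transaction("L1", 2.3)  # within limits, nothing fires
record_transaction("L1", 6.8)  # out of bounds, alert fires immediately
print(alerts)
```

Because the check runs inside the transaction itself, the same rule can notify related roles at other plants as well, which is exactly the cross-location sharing the last point argues for.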
In today's data-rich manufacturing environments, the challenge is how to best make use of all the data that is now readily available. Data does not mean information. Simply having it doesn't lead to performance improvement; it must be analyzed, evaluated and then distributed in order to maximize the potential for performance improvement. Lots of data isn't necessarily great. In fact, too much of it can cause confusion. Getting this data at the right level, with the right "context," is fantastic, especially if it can be made readily available to then let you act (and look) smarter, even if you might have a funny Australian accent when you speak!
James Montgomery is a product manager at Apriso, a provider of manufacturing software solutions. He happens to be Australian.