A Ford vehicle without brakes could roll away from drivers. A big data project without limits could roll over Ford.
Ford partnered with IT service management company GFT Technologies, which leverages Google Cloud technology to develop an application Ryska first conceived back in 2004, before the widespread use of edge and cloud computing. The application monitors material inputs and the most significant process parameters on a high-volume metal stamping line that produces outer panels for F-150 pickups and Transit commercial vans.
The stamping press runs at 900 parts per hour and generates a flood of data, including an image of every part produced on the line, that Ford hosts in a Google Cloud environment. Through visualization tools and analytics, engineers can understand the distribution and variation of material inputs on the line to ensure part quality.
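As a rough illustration of the kind of analytics the article describes (this is not Ford's actual system; the parameter name, readings, and specification limits are all hypothetical), a minimal sketch might summarize the distribution of one process parameter and flag out-of-spec parts:

```python
import statistics

def summarize_parameter(readings, lo_spec, hi_spec):
    """Summarize the distribution of one process parameter and
    flag readings outside the specification window."""
    return {
        "mean": statistics.fmean(readings),
        "stdev": statistics.stdev(readings),
        "out_of_spec": [r for r in readings if not lo_spec <= r <= hi_spec],
    }

# Hypothetical sheet-thickness readings (mm) from one stamping run
readings = [1.98, 2.01, 2.00, 1.97, 2.12, 1.99, 2.02]
print(summarize_parameter(readings, lo_spec=1.95, hi_spec=2.05))
```

At 900 parts per hour, summaries like this would be computed continuously over a rolling window rather than a fixed list, but the underlying distribution-and-variation math is the same.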
Any manufacturer facing a sizeable number of pilot applications—each with potentially huge data sets to capture, store and process—can easily feel overwhelmed by a big data project. Ryska has a methodology to get the data truck rolling without having to ride the brakes.
Let the Experts Choose the Use Case
Ryska has 28 years’ experience in automotive manufacturing, much of it in metal stamping. That background gave him a leg up when deciding which processes to analyze. Domain experts, he believes, play a key role in determining what a big data pilot application should focus on.
“Don’t let non-domain experts or don’t let the industry define the problem for you, because you’re going to get lost in the implementation and data collection,” Ryska says. “You’re going to try to perhaps implement things that aren’t necessary. Engage the domain experts and let them define the problem.
“Having that inherent practical knowledge or process knowledge is helpful to guide the process through implementation, because somebody has to make sense of the data,” Ryska continues. “Once you create the visualizations and you want to do the analytical models, you need that gut check of does this make sense? And how do we point it in the direction to solve a particular problem?”
Specific Benefits Can Generate Wide-Scale Improvements
When determining the scale of the pilot, Ryska believes manufacturers should consider whether the insights they want to glean apply only to the specific process in question or whether those insights could extend to other processes.
“Some manufacturers may be looking for implementing a point solution. If you take an automotive manufacturer, are we focused on a solution that is in stamping?” Ryska says. “Are we focused on something that can be applied only to assembly or machining operations? Or are we engaging in this space looking to learn something that can be applied across a multitude of manufacturing processes globally?
“Some manufacturers may be looking to leverage the capabilities of data collection and analytics to drive efficiency in only one part of their process. Maybe it’s one plant they’re starting up that has high importance within the company, or it’s a problem within their manufacturing footprint,” Ryska continues. “So, understanding the scale of what you want to accomplish is important. And then from there, you pick a relevant problem that is going to apply or give you learnings to scale appropriately.
“So, say if it’s broad based, you want to go to a more generalized solution. How do I improve quality? How do I understand cycle time and improve OEE? If it’s a very pointed solution, you’re going to get into the specifics of that particular manufacturing process. And maybe it’s the processing point of where the value is added,” Ryska says.
Determine Parameters for Success Early
If other companies have already implemented a specific technology and documented the results, Ryska believes a manufacturer is well served to do that research before pursuing the same technology.
“If it's a technology where other companies or other industries are implementing with success, and it’s documented, then you should be able to set up clear metrics along the way to guide the team and ensure that you’re achieving similar or better results,” Ryska says.
That said, when dealing with a brand-new technology for which prior use cases do not exist, the domain experts should guide the vision of what’s possible. “If we don't take the chance and push technology to the edge or try to apply it in ways that we haven’t done before then we’re simply following the industry and perhaps we wouldn’t maintain our competitive advantage,” Ryska says.
You Don't Have to Measure Everything
Leveraging Google Cloud technology allows Ford “elasticity” of data storage. Companies leveraging the cloud face the temptation to measure every data point available. If you have a tremendous amount of storage space, why not collect huge datasets and decide later which to keep and which to throw out?
Ryska believes that if a manufacturer really understands the problem it’s trying to solve with analytics, that manufacturer also understands the math involved and which data are actually important.
“Start with that domain knowledge and collect realistically what you know with your expertise are the significant variables and parameters,” Ryska says. “If the inherent knowledge says some data is not a relevant part of the information, put that other data into a different problem statement and perhaps an expanded scope.”
The difference between necessary and extraneous data is not always easy to determine, however. “Some variables or process parameters, we’ll call them on the bubble,” Ryska says. “There are those that are binary. Yes, this absolutely impacts the dependent variable, the output. This one is absolutely no. And there are some that could have influence, but we don’t know. We consciously said, we don’t need to start with those [maybes].”
Not capturing extraneous data in the first place makes the cleansing process—throwing out data unnecessary to solve the specific problem—much easier. “There’s so much to be learned from a process standpoint of collecting the data, getting it linked together in series format so that you can draw the correlations, cleansing the data, visualizing the data and doing the analytics,” Ryska says. “There’s enough starting with a subset of variables. There’s a substantial amount of work in that to prove success. Omitting some data or not including data does not mean failure.”
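The workflow Ryska describes—link the records together, cleanse them down to the expert-selected variables, then draw correlations—can be sketched in a few lines. This is an illustrative toy, not Ford's pipeline; the variable names (`blank_thickness`, `press_tonnage`, `springback`) and values are invented for the example:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical linked records: each row is one stamped part.
records = [
    {"blank_thickness": 2.00, "press_tonnage": 910, "springback": 0.41},
    {"blank_thickness": 1.97, "press_tonnage": 905, "springback": 0.44},
    {"blank_thickness": 2.03, "press_tonnage": 918, "springback": 0.38},
    {"blank_thickness": 1.95, "press_tonnage": 902, "springback": 0.47},
]

# Cleansing step: keep only the expert-selected variables,
# dropping everything the domain experts judged irrelevant.
keep = ("blank_thickness", "springback")
series = {k: [r[k] for r in records] for k in keep}

r = pearson(series["blank_thickness"], series["springback"])
print(round(r, 3))
```

The design choice mirrors the article's point: the hard work is in deciding which variables belong in `keep` before collection, not in the correlation math itself.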
You're Testing More Than Your Data
Ryska's team ensures technology initiatives mesh with Ford’s larger corporate goals and enable financial performance or product launch cycles. In this case, the big data project fits into Ford’s connectivity initiative, pursuing the idea of “always on factories” with connected equipment, vehicles and people. It requires technology partners like GFT Technologies and Google Cloud. Unless a company has all the necessary capabilities in-house, technology initiatives are pilots for partnerships as well as applications.
“This pilot isn’t just about what’s inherent to sheet metal forming. It’s the entire process of connectivity. And it’s the relationship and collaboration with Google and their engineers and scientists and Ford engineers and scientists. So, it’s that collective process that we wanted to pilot and prove out before we go to scale,” Ryska says.
“And, I think we have a better understanding of both sides of, if you turn this on and if you have to do more, what does that look like?” Ryska continues. “What are the resources and what’s a reasonable timing and what are the engineering activities or issues that we can predict that will need to be solved as we approach different applications?”
And always remember that if you’re having trouble figuring out how to scale a big data project, you are not alone. “Everybody is susceptible to chasing shiny objects,” Ryska says. “The key thing is to have strategic presence or leaders within the organization that can navigate that territory.”