Morgan Zimmermann is chief executive officer of the EXALEAD brand at Dassault Systèmes. EXALEAD provides data discovery solutions that allow companies to manage their information assets for faster, smarter decision-making, real-time unified data access, and improved productivity. On the eve of Manufacturing in the Age of Experience, IndustryWeek spoke with Morgan about the changing role of data and analytics within manufacturing organizations—and how they are helping to drive decision-making and optimize operations across the enterprise.
Question 1. "Smart" manufacturing is something of a buzzword today. What do you think it means? Is analytics at the foundation of it?
Morgan Zimmermann: When it comes to smart manufacturing, it's not one buzzword but the aggregation of many buzzwords—industrial internet, IoT, digital twin, you name it—all of which are related to analytics and digital transformation. Thanks to new technology, we now have the ability to capture far more data, and far more insight, than we ever could before, starting with the way we use the data coming from sensors.
I would like to differentiate what we call the 3DExperience twin from what the industry calls the digital twin. Many companies and organizations are talking about digital twins or digital threads, and what they mean is the ability to map data—real-time data or historical data—onto an abstraction or representation of a physical product. You use sensor information to get a good visualization of the data and of the physical product—whether it is a machine in a factory, a manufacturing line, or a whole plant. Then, you can use advanced analytics and predictive capabilities to anticipate problems. We believe that this perspective is very limited because it leverages a single engine: a data science engine.
In the 3DExperience platform, we have positioned two engines: a data science engine and a modeling and simulation engine. We do not project the data onto a virtual representation of a physical product; instead, we map the data that we get onto a model-based virtual product. That makes a very big difference. With a traditional predictive digital twin, you can only predict what will happen and mitigate its impact on your future. But when you have modeling and simulation capabilities, you can play out and configure your future.
As an analogy, let's say you are a businessman and your predictive models tell you that it is going to rain tomorrow. The best you can do to mitigate the impact of the rain might be to take your umbrella so your head won't get wet. But your shoes will still get wet, and you will still be late to your meeting because of traffic. What we would do at Dassault Systèmes is remodel your whole agenda automatically because of the rain, so that you work from home in the morning and leave when the rain is over. When you have a modeling and simulation engine, you can learn from the data to understand how you want to rebuild your factory, reorganize your logistics, reorganize your product, and reorganize the way you manufacture it, so that you can adapt with far more agility.
Question 2. You recently said in an interview that traditional analytics don't work with product lifecycle management (PLM) because analytics is about transactions and PLM is about product structure. Can you elaborate? What is the solution?
MZ: When EXALEAD was acquired by Dassault Systèmes in 2010, I was asked whether I would be capable of using our technology to do advanced analytics in the PLM landscape. I thought "yes," as we were already doing analytics every day for logistics optimization, CRM, finance, and supply chain management. But the fact is that it took us many years to develop the technology stacks capable of tackling analytics in the PLM space.
Traditional analytics technologies were built to manage “transactions,” which are rather simple objects: they have a beginning and an end, and they are mostly characterized by numbers. Numbers are easy: you can add them, multiply them—and they fit very well into a table, a cube, or a hypercube. When you go into PLM, what you manipulate is different: it is graphs and semantics.
As an example, when you do weight-and-balance analytics, you need to contextualize the weight to a configured product; in practice, that requires technology that supports dynamic and configurable graphs. But it is even more complex than that: in PLM, at every node of your graph, you have a part or component to which you apply complex, collaborative business processes that are mostly characterized by status and very often described with text and semantics (issues, requirements, etc.).
You can try to model that in any type of cube, hypercube, supercube, or superhypercube—but it will never work. You need a whole new technology stack to do that: one that has the best graph analytics capabilities and the best semantic analytics capabilities and that, of course, supports high data volumes and large user scale.
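To make the table-versus-graph distinction concrete, here is a minimal sketch in Python. This is an editorial illustration with hypothetical part names and weights, not Dassault Systèmes code: transactional records aggregate with a simple sum over a flat table, whereas the weight of a configured product requires traversing a graph whose links depend on which configuration options are active.

```python
# Illustrative sketch only: flat transactional data vs. a configured
# product graph. All names and numbers are hypothetical.

# Transactions are flat records: aggregation is a simple sum over a table.
transactions = [
    {"order": "A1", "amount": 120.0},
    {"order": "A2", "amount": 80.0},
]
total_amount = sum(t["amount"] for t in transactions)  # 200.0

# A configured product is a graph: each node is a part with a weight,
# and each child link may apply only when a given option is selected.
product = {
    "car":     {"weight": 0.0,   "children": [("chassis", None), ("roof", "sunroof")]},
    "chassis": {"weight": 900.0, "children": []},
    "roof":    {"weight": 45.0,  "children": []},
}

def configured_weight(node, active_options, bom=product):
    """Total weight of the product as configured: traverse the graph,
    following a child link only if it is unconditional or its option
    is active in this configuration."""
    part = bom[node]
    total = part["weight"]
    for child, option in part["children"]:
        if option is None or option in active_options:
            total += configured_weight(child, active_options, bom)
    return total

base_weight = configured_weight("car", set())            # 900.0, no sunroof
sunroof_weight = configured_weight("car", {"sunroof"})   # 945.0
```

The same part graph yields a different answer for every configuration, which is why a fixed table or cube schema struggles to represent it.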
Question 3. We seem to be swimming in data, yet companies struggle with getting the data they need. Why is that and how can they overcome the challenges?
MZ: At Dassault Systèmes, we speak frequently about "digital continuity." One way to understand what we mean by that term is to compare an electronic boarding pass that you have downloaded as a PDF with a digitally connected boarding pass on your smartphone. While the PDF boarding pass is electronic, it is not digitally connected. If your seat or gate changes, your boarding pass will not reflect it. But if your boarding pass is digitally connected on your smartphone, then when you are upgraded or the gate or boarding time changes, your boarding pass will automatically reflect it. That is a very important concept for this question.
There is a large auto manufacturer for whom we are doing lots of analytics with the 3DExperience platform. Previously, when one of their executives wanted a report, he or she would ask a VP, who would ask a manager, who would ask a middle manager, who would ask a worker, who would get the relevant data from IT. Then, everyone would manipulate the data to make sure that the report represented what the boss wanted to see. Eventually, that report would be stored and circulated in Excel or PowerPoint—but no one actually knew whether the data was true or relevant to the stated purpose of the report. A nasty side effect was that no one was updating the data at the source.
Subsequently, that customer made the decision to have the analytics digitally connected. So now, when the CEO wants to know the status of a car program, he logs into the 3DExperience platform and accesses the data as it sits in all of the underlying systems—whether this is PLM or ERP. This is making a substantial impact. The first impact of digitally connected analytics is that people start to update the data in the underlying systems rather quickly. Now there is only one place to see the data, which is the platform, and there is no way to manipulate it. So the single source of truth suddenly becomes a reality because the analytics are digitally connected.
The second impact is even more interesting. The managers have told us they like this new system because they don't have to “massage” the data anymore—they're not spending half of their time preparing a report to make sure the data represents what their boss wants to see. The boss sees the data as it is in the system, and the result is that they are far more effective as an organization. So, by giving everyone a single point of access to all of the underlying data—through the 3DExperience platform—and by making sure that everyone has the same understanding of the key performance indicators the enterprise is looking at, we are greatly simplifying the handling of data and solving some of the challenges traditionally associated with data overload.
Question 4. Manufacturing professionals are pretty freaked out about sharing data with the enterprise. How can they put those concerns into perspective and protect their data?
MZ: Twenty years ago, Dassault Systèmes made the first digital mockup—for the Boeing 777. It was very revealing to workers from every discipline in the enterprise, who were now capable of looking at the mockup and understanding the impact of every decision on the aircraft, as well as on their peers working in other disciplines. It became the enabler of multi-discipline collaboration.
Now, we are at the point where all enterprise processes are fully digital. From resource allocation on programs to cost and logistics management, everything that exists in the systems is digital. In that context, what we are doing with smart analytics for manufacturing is using them in the same way we used the digital mockup: as the enabler of multi-discipline collaboration. That is, we give everyone who has a decision to make the ability to do so while understanding the full context of what is going on, whether that concerns prices, programs, costs, logistics, resources, or timing.
This is very important, because when professionals start to understand that data is the enabler of their personal performance and of the company's performance, their perception of sharing that data changes completely. When you empower everyone with smart analytics and give them the full context for any decision they have to make, you empower managers at the edge of the company to make local decisions while understanding their global impact. Everyone understands how to make a decision that fits the enterprise's key performance indicators or performance targets. For these reasons, the fears around sharing data with the enterprise are very limited compared to the attainable benefits.
Question 5. Can you give some examples of companies that are using analytics effectively within their manufacturing organizations?
MZ: Most of the customers using our DELMIA Apriso applications are already using analytics for manufacturing, and have been since their initial deployment. Beyond that, companies such as Ford and Dassault Aviation are deploying some very interesting analytics aimed at feeding quality data, customer feedback data, and warranty data back into manufacturing processes to improve manufacturing quality and performance.
Question 6. Are you optimistic about the future of manufacturing? Why or why not?
MZ: I am very optimistic about the future of manufacturing for one specific reason: I believe manufacturing is becoming both more important and a greater contributor to enterprise performance. If you look back at the history of manufacturing, it has gone from siloed optimization—local optimization at the factory level—to enterprise optimization, and it is now moving into integrated operations. By integrated operations, I mean that manufacturing is at the core of the convergence of engineering—manufacturing engineering, supply chain, quality, logistics, and manufacturing operations. And because manufacturing is now at the core of these ecosystems, it is capable of making companies infinitely elastic and agile. That is what not only drives top-line revenue for companies but, more importantly, gives them the agility to transform toward new business models or develop new types of offers. So, I am highly optimistic about both the future of manufacturing itself and the companies that benefit from it.