Manufacturing is going to find a new normal. And as manufacturers regain traction on their digital journeys, data creation, collection and manipulation will only intensify. How well manufacturers manage that data will play a pivotal role in determining the success of their digital pursuits.
Of course, prioritizing so much data has infrastructure consequences. Manufacturers now recognize the importance of real-time capabilities, which understandably increases interest in 5G. And as manufacturers continue to leverage data, the shape of the network infrastructure could change dramatically. After all, central cloud models were not designed to handle the rising tide of data, explains Veea Senior Vice President Kurt Michel.
“The pendulum shifts between smarter centralized processing and distributed processing, and each time the shift occurs, an imbalance is created between the data transport and compute capabilities in the center of the network and the edge,” says Michel. “The pendulum is currently swinging from the cloud-based data center model to one placing more demand on processing data closer to its creation point – in this round driven by 5G.”
Recognizing the Challenges
Storage. Although storage has become relatively cheap, if the price of storage falls by 50% while data creation doubles, the total bill has not improved. “Data is only as useful as the knowledge we can extract from it. Rather than shipping all of that data somewhere for storage and future processing, manufacturers need to process the data as close to real time as possible and compress it into meaningful knowledge they can store more efficiently,” says Michel.
Bandwidth. Yes, the deployment of 5G represents an opportunity to expand bandwidth dramatically, but moving mountains of collected data back to a central cloud has a cost. “By processing the data as close as possible to the sources, and only passing relevant events to the central clouds, manufacturers can realize tremendous savings not only in the cost of the bandwidth itself, but also in the need to continually build up infrastructure capacity to handle it,” says Michel.
Responsiveness. Some decision-making is real-time and must be completed in milliseconds. Robotic equipment controls, autonomous vehicles, access controls, human interaction systems and AR/VR systems often require localized processing to deliver the required level of responsiveness. Centralized clouds located miles away can suffer from network congestion, limited server capacity, or scaling issues. Localizing data processing and critical decision-making as close as feasible to the data sources meets this challenge.
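The storage and bandwidth points above can be sketched in a few lines: raw sensor samples are compressed locally into a compact summary, and only the threshold-crossing readings are forwarded upstream. The temperature readings, the alert threshold, and the summary fields here are illustrative assumptions, not part of any vendor's system.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Summary:
    """Compact knowledge extracted from a window of raw samples."""
    count: int
    mean_temp: float
    max_temp: float

TEMP_ALERT_C = 85.0  # hypothetical alarm threshold

def summarize(window: list[float]) -> Summary:
    # Compress a window of raw readings into a small, storable summary.
    return Summary(len(window), mean(window), max(window))

def relevant_events(window: list[float]) -> list[float]:
    # Only threshold-crossing readings are worth shipping to the cloud.
    return [t for t in window if t >= TEMP_ALERT_C]

window = [71.2, 70.8, 86.5, 72.0, 90.1]
summary = summarize(window)          # stored locally, cheap to keep
events = relevant_events(window)     # the only data sent to the central cloud
```

Five raw readings collapse into one three-field summary plus two alert events, which is the kind of trade Michel describes: knowledge travels, raw data stays local.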
On the Edge
Adding compute capabilities as close as possible to where devices generate data, and where control and decision-making are required, makes good sense. “Once we have more edge processing in place, new applications previously considered unrealistic will emerge. However, this will only happen if the industry considers the edge processing capacity as an addition to the core capacity – not a replacement,” says Michel. “That means our software architects will have to consider the combined processing capacity of the core and the edge as a whole and partition their systems to take advantage of the unique and complementary benefits of each.”
By designing solutions for easy, modular portability between edge and cloud resources, manufacturers make it possible to find an optimal mix that can shift dynamically with both short- and long-term demands. Microservices in container-based systems will be an important part of a successful edge computing strategy.
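One way to read that portability point: package each processing step as a small, stateless service whose placement is a deployment decision rather than a code change. The handler, the `TIER` environment variable, and the payload fields below are invented for illustration; in practice the same container image would simply be scheduled at either tier.

```python
import os

def process(payload: dict) -> dict:
    """Stateless processing step -- identical code whether it runs at the
    edge or in the central cloud, which is what makes it portable."""
    return {"device": payload["device"],
            "anomaly": payload["value"] > payload["limit"]}

# Placement is configuration, not code: the same container image is deployed
# to either tier, distinguished here only by a (hypothetical) env variable.
TIER = os.environ.get("TIER", "edge")
service_id = f"{TIER}:anomaly-detector"

result = process({"device": "press-7", "value": 101.3, "limit": 100.0})
```

Because `process` holds no local state, an orchestrator can move it between edge and cloud as demand shifts, which is the dynamic mix the paragraph above describes.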
Understandably, the industry cannot simply start creating smaller regional or local data centers and think that will address the challenges. “Racks of servers in a hyperscale datacenter make sense, but as we move processing closer to the data consumer, greater integration is necessary so that operators and their support teams are not overwhelmed by the number of different elements and physically distributed failure points that are being created,” says Michel.
By virtue of its higher frequency and shorter reach, 5G demands more physical access points than 4G. With this in mind, manufacturers will need to add “processing capabilities to the locations where the access infrastructure exists, whether that’s at the device edge where devices like smartphones connect, or the aggregation infrastructure edge,” he says. “Different types and amounts of processing will be needed at each stage.”
Not only will successful edge elements need to complement the central cloud elements, they will also need to complement other edge processing elements, essentially creating what Michel refers to as a “processing mesh” that can shift workload both vertically with the central cloud and horizontally with other edge processing elements.
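Michel's "processing mesh" suggests a simple dispatch policy: run the job locally if capacity allows, shift it horizontally to a peer edge node if not, and fall back vertically to the central cloud as a backstop. The node names and capacity figures below are invented; real schedulers weigh latency, data locality and cost as well.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    free_slots: int  # crude stand-in for available compute capacity

def dispatch(job: str, local: Node, peers: list[Node], cloud: Node) -> str:
    """Place a job in the mesh: local edge first, then peer edges
    (horizontal shift), then the central cloud (vertical shift)."""
    for node in [local, *peers]:
        if node.free_slots > 0:
            node.free_slots -= 1
            return node.name
    cloud.free_slots -= 1   # cloud assumed effectively unbounded here
    return cloud.name

local = Node("edge-a", 0)          # saturated local node
peers = [Node("edge-b", 1)]        # one peer with spare capacity
cloud = Node("cloud", 10_000)
placed = dispatch("vision-inference", local, peers, cloud)
```

With the local node full, the job lands on the peer edge node rather than traveling all the way to the cloud, keeping the latency benefits of edge placement.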
“Of course, security in these new systems is crucial. In hyperscale datacenters, the ability to maintain a strong degree of physical security is possible. That kind of physical protection will often not be feasible at the edge of the network, especially the device edge,” he says. “Therefore, strong data security measures will be required, not only for data at rest, but also for data in flight. We cannot afford to leave that as an afterthought.”
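As a minimal sketch of the at-rest concern, an edge node can attach a message authentication code to each stored record so tampering is detectable, since physical access to the device cannot be ruled out. This covers integrity only: confidentiality at rest would additionally require encryption (e.g. AES via a crypto library), and in-flight protection would come from TLS on the transport. The key and record contents are illustrative.

```python
import hashlib
import hmac
import json

# Shared key provisioned to the edge node out of band (illustrative value).
KEY = b"provisioned-device-key"

def seal(record: dict) -> dict:
    """Attach a MAC so tampering with stored ('at rest') data is detectable.
    Integrity only -- confidentiality would need encryption on top."""
    body = json.dumps(record, sort_keys=True)
    mac = hmac.new(KEY, body.encode(), hashlib.sha256).hexdigest()
    return {"body": body, "mac": mac}

def verify(sealed: dict) -> bool:
    expected = hmac.new(KEY, sealed["body"].encode(), hashlib.sha256).hexdigest()
    # Constant-time comparison avoids leaking the MAC via timing.
    return hmac.compare_digest(expected, sealed["mac"])

rec = seal({"device": "press-7", "temp": 86.5})
ok = verify(rec)                                           # untouched record
tampered = dict(rec, body=rec["body"].replace("86.5", "20.0"))
bad = verify(tampered)                                     # altered record
```

Designing this in from the start, rather than bolting it on, is exactly the "not an afterthought" point Michel makes.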