Commanding robots from the edge
In a Nokia factory in India, robots abound. These robots whiz back and forth across the crowded shop floor, carrying components and equipment. They travel between assembly lines, testing stations and loading docks – weaving their way between employees, vehicles and their robot counterparts in what seem to be chaotic patterns but, in reality, is an intricate dance choreographed in real time.
Nokia Bell Labs has developed a proof-of-concept cloud monitoring and control system that can coordinate the actions of robots from different manufacturers and with different functions, all operating under the same factory roof. These robots share the shop floor not only with one another, but also with factory workers, contractors and the machines and vehicles they operate. The system is intelligent enough to account for all those variables, reconfiguring the flow of robot traffic throughout the factory around people as they work and move within it.
In essence, the factory is aware of the location and movement of every robot and person in the building. In addition, the factory reacts to, and even anticipates, new situations, modifying robot routes and activities as conditions in the factory change. We believe this holistic approach to robot orchestration can enhance factory automation, producing quantifiable improvements in productivity and safety while utilizing industrial infrastructure far more effectively. What’s more, this kind of orchestration will give factory operators unprecedented insights into their factory operations, arming them with the tools to optimize their manufacturing processes to the fullest. Ultimately this technology could become a key element in creating the flexible factories of the future.
The bumpy road to automation
Factories around the world are making significant investments in automation equipment, but while the state of the art in these technologies is quite impressive, the automation world is still largely siloed. Each system makes its own decisions and does its own planning without knowledge of other systems’ decisions or plans. As a result, different operational technology (OT) systems are often at odds with one another.
As Nokia upgrades its own factories, we’ve come across many of these challenges, particularly with autonomous mobile robots (AMRs) and automated guided vehicles (AGVs). As robots move around the factory, their paths often overlap and intersect due to the highly interlinked nature of our manufacturing, assembly and testing lines. An often-used remedy is to constrain these robots to “one-way streets.” This prevents collisions, but it creates other problems, such as traffic back-ups on these one-way paths, since a single stopped robot can delay every robot behind it.
Many of our industrial customers trying to use robots to automate their factory operations have encountered similar challenges. Combined, those issues dramatically reduce – and sometimes even eliminate – the efficiency gains that robot-based logistics are intended to deliver. If the factory is an orchestra, then the strings, horns and percussion sections are each playing their own tune. What we need is a conductor, which is exactly what Nokia Bell Labs has set out to create.
Enter the robot conductor
Nokia Bell Labs’ robot orchestration research faced two major logistical challenges: scale and diversity. Any orchestration system suitable for industry needs to manage hundreds, if not thousands, of individual autonomous and automated-guided robots at any given factory or campus. And those robots, more often than not, come from multiple manufacturers, each with its own robot-control and automation solutions. We opted to create a centrally controlled system at the on-premises edge that could not only manage the activities of massive multi-robot systems but also handle the complex interplay between different vendors’ robot automation platforms. We built this prototype on top of Nokia’s MX Industrial Edge solution (MXIE) in the Nokia Digital Automation Cloud (NDAC), which provides the predictable wireless connectivity, local processing capabilities, low latency and capacity required to manage these interactions in real time while ensuring the privacy and security of our data.
We then needed to give the system awareness of every robot, machine, person and location in the factory. We did this through a sophisticated sensing network. Native sensors on the robots “see” their surroundings, while radio-frequency localization technology provides precise positioning of each robot on the shop floor. Video cameras within the factory supplement all this sensor data.
That sensing data is then sent to the on-premises edge via a private wireless network, where machine learning and AI techniques extract actionable information from the video and other sensor streams. Those algorithms give the orchestration system the ability to accurately interpret the factory environment. That knowledge is then used to dynamically issue commands and set routes for the mobile robots, both individually and in groups, so they can accomplish their tasks in the safest and most efficient way possible.
With the orchestration system in place, robots can effectively see around corners, allowing them to reroute their paths to avoid collisions and logjams. The system is also able to adjust for human presence by sensing the movements of factory workers and even visitors. For instance, if it detects a group of people congregating around a workstation, it can redirect the flow of robot traffic to avoid the station.
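As a rough illustration – not Nokia’s actual algorithm – this rerouting step can be thought of as shortest-path planning on a grid map of the shop floor, where cells occupied by people or stalled robots are temporarily marked impassable. The hypothetical `plan_route` function below sketches that idea:

```python
import heapq

def plan_route(grid, start, goal, congested):
    """Shortest path on a 4-connected grid, treating congested cells
    (people, stopped robots) as temporarily impassable.
    grid[r][c] == 0 means the cell is normally traversable.
    A toy stand-in for the orchestrator's routing step."""
    rows, cols = len(grid), len(grid[0])
    blocked = set(congested)
    frontier = [(0, start, [start])]  # (cost so far, cell, path)
    seen = {start}
    while frontier:
        cost, (r, c), path = heapq.heappop(frontier)
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0
                    and nxt not in blocked and nxt not in seen):
                seen.add(nxt)
                heapq.heappush(frontier, (cost + 1, nxt, path + [nxt]))
    return None  # no collision-free route currently exists

# When the sensing network reports people gathered at a workstation,
# the orchestrator would re-invoke the planner with those cells blocked:
grid = [[0] * 4 for _ in range(4)]
route = plan_route(grid, (0, 0), (3, 3), congested={(1, 1)})
```

In a real deployment the costs would also reflect traffic density and predicted human movement rather than simple cell blocking, but the principle – replan whenever the sensed map changes – is the same.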
Furthermore, the orchestration system doesn’t just optimize the status quo. It gives factory operators an enormous degree of flexibility to change things on the fly. If production volumes need to be increased or a portion of the factory needs to be reconfigured for a new production task, then a factory manager need only commission the new project at a high level. Our edge-based control system does the complex work, using algorithms to optimize resources, time and paths for every robot in the factory.
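To make that high-level commissioning concrete, here is a deliberately simplified sketch – the names and greedy logic are illustrative assumptions, not Nokia’s implementation – of how an edge controller might turn a batch of newly commissioned missions into per-robot assignments:

```python
def assign_missions(robots, missions):
    """Greedy allocation: each mission goes to the nearest idle robot
    (Manhattan distance to the pickup point). Illustrative only; a
    production optimizer would also weigh routes, load and timing.

    robots:   dict of robot_id -> (x, y) current position
    missions: list of (mission_id, (x, y) pickup point)
    returns:  dict of mission_id -> robot_id
    """
    idle = dict(robots)
    plan = {}
    for mission_id, (px, py) in missions:
        if not idle:
            break  # remaining missions wait for robots to free up
        nearest = min(idle, key=lambda r: abs(idle[r][0] - px)
                                          + abs(idle[r][1] - py))
        plan[mission_id] = nearest
        del idle[nearest]  # that robot is now busy
    return plan

# A factory manager commissions two missions at a high level;
# the controller works out who does what:
fleet = {"amr-1": (0, 0), "amr-2": (5, 5)}
jobs = [("restock-line-3", (1, 0)), ("move-pallet-7", (4, 5))]
schedule = assign_missions(fleet, jobs)
```

The real system optimizes resources, time and paths jointly across the whole fleet; the point of the sketch is only that operators state *what* should happen and the edge controller decides *who* and *how*.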
Finally, this technology could become a key component of the industrial metaverse. By sensing every aspect of the factory, the orchestration system builds a digital twin of the entire factory environment, which can be used not only to visualize operations but also in advanced planning and design.
The orchestra at work
Nokia Bell Labs implemented its robot orchestration prototype at a Nokia factory in Chennai, India, and we saw immediate benefits. The average distance traveled per robot mission fell by 28% after the trial began. In addition, the average time it took for a robot to complete a given task dropped by 40%. With the orchestration system planning traffic flows in a holistic manner, the robots were stuck in fewer traffic jams and traversed routes that avoided factory workers as well as other robots and vehicles. This shaved both meters and minutes off their journeys while making factory operations safer.
But the proof goes beyond the metrics. As our trial progressed, we found that this new factory awareness produced continuous improvements in production output, efficiency and quality. It freed workers from mundane tasks like restocking and resupply, allowing them to focus on monitoring, supervision and control of the manufacturing processes. What’s more, the factory workers participating in the trial began embracing the orchestration system, because it not only increased their productivity but also provided new insights that led to better decisions.
In short, the people vital to making the factory run began to rely on the robot orchestration system. Ultimately, we consider that to be one of the key steps to building large-scale robot management systems in the factories of the future. Human acceptance is the ultimate validation.