The lessons we learn from self-driving will drive our robotics future
Robotics is entering an exponential growth phase. Robots are being put to an increasingly diverse range of uses, from the inspiring to the mundane. Just within the context of the COVID-19 pandemic response, robots have been deployed in novel ways: disinfecting public spaces, handling infectious materials, and providing medical care to patients.
But the horizon for new robotics applications is ever expanding, and it is AV (autonomous vehicle) development that will further accelerate this growth. Why? Because the challenge that self-driving cars present is the same challenge that acts as a barrier for most other kinds of robots. The AV industry, with its concentration of talent, infrastructure, and capital, is primed to meet this challenge.
The autonomy challenge
Even as the use of robots has become more widespread, their applications have remained somewhat limited. For decades, one-armed giants performed highly scripted tasks and were built for a single purpose, like spot welding or adding threads to the end of a pipe. They were not flexible enough to perform a variety of tasks or respond well in unstructured environments. Even when deployed in less structured environments, such as surgical settings, or in the air as drones, robots have functioned primarily as remote-controlled extensions of a human operator, with limited autonomy.
AVs, on the other hand, inherently require a great deal of autonomy; there is literally no human being behind the wheel, and the stakes are high. AVs need the ability to sense, plan, and act in highly dynamic, unstructured environments such as the chaotic streets of San Francisco. They need to respond to humans — other drivers, pedestrians, cyclists, that guy on a motorized skateboard — and make collaborative decisions with them.
Consider one of the common yet more challenging traffic scenarios that humans regularly encounter: a four-way stop. Despite the laws that govern how drivers should stop and proceed in their turn, the reality is that most of the time, people navigate these intersections via nonverbal communication with each other. They make eye contact, nod, wave each other on. Without the capacity to communicate using these cues, an AV must still decipher the intent of other drivers and communicate its own — for instance, creeping forward slowly to convey its intent to proceed through the intersection — all while obeying traffic laws and making safety-critical decisions. This choreography cannot be scripted in advance. AV decision-making must conform to human-like social expectations in real time based on the current situation and potential evolution of all the relevant actors in the scene, including itself, for some time into the future.
The crux of the challenge involves making decisions under uncertainty; that is, choosing actions based on often imperfect observations and incomplete knowledge of the world. Autonomous robots have to observe the current state of the world (imperfect observations), understand how this is likely to evolve (incomplete knowledge), and make decisions about the best course of action to pursue in every situation. This cognitive capability is also essential to interpersonal interactions because human communications presuppose an ability to understand the motivations of the participants and subjects of the discussion. As the complexity of human–machine interactions increases and automated systems become more intelligent, we strive to provide computers with comparable communicative and decision-making capabilities. This is what takes robots from machines that humans supervise to machines with which humans can collaborate.
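The idea of choosing actions from imperfect observations can be sketched as a small expected-utility calculation. Everything below (the intent states, probabilities, and utility numbers) is a hypothetical illustration of the concept, not the method or values of any real AV stack:

```python
# Sketch of decision-making under uncertainty at a four-way stop.
# States, actions, probabilities, and utilities are all illustrative
# assumptions, not values from a real autonomy system.

def expected_utility(action, belief, utility):
    """Expected utility of an action under a belief over hidden states."""
    return sum(p * utility[(action, state)] for state, p in belief.items())

def choose_action(actions, belief, utility):
    """Pick the action with the highest expected utility."""
    return max(actions, key=lambda a: expected_utility(a, belief, utility))

# Belief about the other driver's hidden intent (imperfect observation).
belief = {"will_yield": 0.7, "will_go": 0.3}

# Utilities encode the safety/progress trade-off (hypothetical numbers).
utility = {
    ("creep", "will_yield"): 8,    # signals intent, makes progress
    ("creep", "will_go"): -2,      # mild risk, can still brake
    ("wait", "will_yield"): 1,     # safe but stalls the intersection
    ("wait", "will_go"): 5,        # correctly defers
    ("proceed", "will_yield"): 10,
    ("proceed", "will_go"): -50,   # collision risk dominates
}

best = choose_action(["creep", "wait", "proceed"], belief, utility)
```

With these numbers, creeping forward wins: it hedges between the two possible intents, which is exactly the "convey intent while staying safe" behavior described above. Real systems must also model how the belief itself evolves over time, which this one-shot sketch omits.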
Where human-robot collaboration can take us
As robotics has grown as an industry, costs have fallen, enabling adoption across a broad variety of contexts. In some cases, the technology is familiar but the application is novel. While drones aren't new, deploying them to inspect power lines or collect information for insurance claims is. The same goes for the one-armed giants now employed as hotel concierges or baristas instead of spot welders.
Commerce has benefited greatly from automation. Materials handling in particular has been ripe for automation via self-guided vehicles, largely because it’s such a dangerous sector for human workers. Robots equipped with lidar, cameras, and a bevy of other sensors — like those that enable AVs’ perception systems — can safely and quickly navigate loading docks and factory floors while avoiding collisions with workers. These robots, however, still rely on a fairly structured and predictable environment (markers on the ground help them navigate) and lack dynamic responsiveness. During the last few years, some have argued that injuries in some fulfillment centers have resulted from robots moving at a faster pace than the humans working alongside them.
Robotics in healthcare environments has become commonplace, too. Robot-assisted surgical systems like Intuitive's da Vinci are used in 90% of prostatectomies instead of traditional laparoscopic tools. But robots are increasingly valuable not just in the operating room but throughout hospitals and nursing homes, especially in the context of the COVID-19 pandemic. Robots are helping caregivers lift patients, performing other routine tasks, and providing social interaction to the elderly. Robots have increasingly been used with children as well, not just as trendy tech toys but as legitimate STEM educational tools. Research into the treatment of children with autism using emotive robots has gained traction in recent years.
AV development is key
With more players in the field and increasing adoption, the $100+ billion global robotics sector has been growing by leaps and bounds, and according to IDC is expected to triple by the end of 2021. Much of this can be attributed to driver-assistance technologies now common in new vehicles, especially those at the higher end of the market. Companies developing fully autonomous technology, however, are poised to push the robotics envelope in the automotive industry and beyond.
As AV companies meet the challenge of human-robot collaboration at the level required to bring self-driving vehicles to market, the horizon for leveraging these solutions for other robotics applications only expands. Like a chess grandmaster, an AV must consider multiple possible moves and countermoves both for itself and other traffic participants and then make safety-critical decisions in a noisy and rapidly changing environment. It needs to take into account context like traffic laws and local norms; driving in a city like Houston is not the same as navigating Hong Kong. And a successful AV has to communicate its goals and its intent to humans in a way that feels natural and intuitive.
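The chess analogy can be made concrete with a toy depth-limited lookahead in the spirit of expectimax search: the vehicle maximizes over its own actions while averaging over a probability distribution of other agents' responses. The world model, action sets, probabilities, and rewards here are invented purely for illustration:

```python
# Toy expectimax-style lookahead: maximize over own moves, take the
# expectation over other agents' countermoves. All model details are
# hypothetical, chosen only to keep the example self-contained.

def expectimax(state, depth, actions, responses, reward, transition):
    """Return (value, best_action) for a depth-limited search."""
    if depth == 0:
        return reward(state), None
    best_value, best_action = float("-inf"), None
    for action in actions(state):
        value = 0.0
        for response, prob in responses(state, action):
            next_state = transition(state, action, response)
            sub_value, _ = expectimax(next_state, depth - 1, actions,
                                      responses, reward, transition)
            value += prob * sub_value
        if value > best_value:
            best_value, best_action = value, action
    return best_value, best_action

# Toy world: state is the AV's progress. "advance" gains ground if the
# other agent yields, but loses ground in a near-conflict if it goes.
def actions(state):
    return ["advance", "wait"]

def responses(state, action):
    return [("goes", 0.2), ("yields", 0.8)]  # assumed distribution

def transition(state, action, response):
    if action == "advance" and response == "yields":
        return state + 1
    if action == "advance" and response == "goes":
        return state - 2  # penalty for a near-conflict
    return state

def reward(state):
    return state  # more progress is better

value, action = expectimax(0, 2, actions, responses, reward, transition)
```

Even this two-ply toy shows the structure of the problem: the value of advancing now depends on what both parties might do next. Production planners face the same structure with continuous trajectories, many agents, and hard real-time budgets.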
Developing the kind of decision-making needed for AVs to succeed will unlock complex "critical thinking" for other robotic applications, allowing a greater degree of autonomy and human-robot collaboration in both new and familiar use cases. Physical agents that can autonomously generate engaging, lifelike behavior will lead to safer and more responsive robots. The shift from humans supervising robots to collaborating with them is the way forward for both AVs and the sector at large.
This article was written by Rashed Haq and Cruise from VentureBeat and was legally licensed through the Industry Dive publisher network. Please direct all licensing questions to email@example.com.