Groups of collaborating robots complete tasks more efficiently than a robot or a person working alone in areas as diverse as search and rescue, aerial surveillance and data gathering, robotic assistants, and autonomous systems operating on land, at sea, in the air, or in outer space.
This area of human-robot collaboration, as with autonomous vehicles, often involves physical objects whose motions need to be controlled. Robot teams and swarms depend heavily on algorithms and software that control their interactions with the outside world. These algorithms enable planning and control, safety, and resilience under dynamic and uncertain conditions. That need becomes critical as unmanned robotic vehicles proliferate in the airspace of densely populated cities.
This work draws upon theoretical computer science to develop software that can assess whether a robotic system will behave as expected. Researchers also devise algorithms that will automatically generate code to make a robot perform a desired task without the need for a programming expert.
The goal is to create highly capable teams of networked robots that can safely interact with each other and with humans, as they navigate uncertain physical environments, encounter unpredictable human behavior, assess the reliability of incoming data that may have been compromised, and even face hostile opposition.
With support from the National Aeronautics and Space Administration, U-M roboticists are working to extend the capabilities of free-flying, cube-shaped robotic systems called Astrobees. Earlier versions of the Astrobees streamed mobile video of space station crews to ground controllers, who operated the robotic assistants independently of the astronauts. The next generation of Astrobees will be able to determine in real time where to get the best camera views of an astronaut moving back and forth among several tasks, letting the astronaut see himself, and the results of his work, via augmented reality.
This work starts with simple laboratory experiments in which a single drone watches a human counterpart performing a task and quickly assesses where it should direct its camera to transmit useful images to its human partner. With further advances, the work will incorporate multiple drones assisting a single human, and will eventually extend to multiple drones coordinating their work with multiple humans performing multiple tasks.
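The camera-placement step described above can be illustrated with a minimal sketch. The function names, the candidate-pose scoring, and the weighting below are illustrative assumptions for a simplified 2-D setting, not the researchers' actual method: the idea is simply that a drone ranks candidate camera positions by how close they are to an ideal standoff distance and how squarely they face the direction of the person's work.

```python
import math

# Hypothetical sketch of viewpoint selection for a camera drone.
# All names and weights here are illustrative assumptions.

def score_viewpoint(cam, person, task_dir, ideal_dist=2.0):
    """Score a candidate camera position (higher is better).

    cam, person: (x, y) positions in the plane.
    task_dir: unit vector for the direction the person is facing/working.
    """
    dx, dy = person[0] - cam[0], person[1] - cam[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return float("-inf")  # camera on top of the person: invalid view
    # Penalize deviation from an ideal standoff distance.
    dist_score = -abs(dist - ideal_dist)
    # Reward viewpoints roughly in front of the work direction:
    # the camera-to-person vector should oppose task_dir, so the
    # negated dot product is largest for a head-on view.
    view = (dx / dist, dy / dist)
    facing_score = -(view[0] * task_dir[0] + view[1] * task_dir[1])
    return dist_score + facing_score

def best_viewpoint(candidates, person, task_dir):
    """Pick the highest-scoring candidate camera position."""
    return max(candidates, key=lambda c: score_viewpoint(c, person, task_dir))

# Example: a person at the origin facing along +x; the pose two meters
# in front of them should win over poses behind, beside, or too far away.
candidates = [(2.0, 0.0), (-2.0, 0.0), (0.0, 2.0), (4.0, 0.0)]
print(best_viewpoint(candidates, (0.0, 0.0), (1.0, 0.0)))  # → (2.0, 0.0)
```

In a real system this scoring would also account for occlusions, the drone's motion limits, and safety constraints around the human, but the core loop (enumerate candidate poses, score them, fly to the best one) stays the same.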