Speaking like dolphins, a robot fleet takes on underwater tasks

December 17, 2018
An underwater autonomous robot built to inspect dams, bridges, and hulls of ships practices by inspecting the side of a pool. Courtesy of Joshua Mangelson.

In a Navy shipyard in San Diego, a new generation of underwater robots is learning to communicate and collaborate in order to inspect boats, bridges, pipelines, and other underwater structures. Developed by Joshua Mangelson, a University of Michigan doctoral student in Robotics, the autonomous vehicles overcome the many challenges of murky water by simplifying how the robots coordinate and communicate.

Water, while the basis of life for many, means death for wireless communication. “Below a meter or two of water, Wi-Fi cuts out completely,” Mangelson said. “Same with GPS signals. This is because water attenuates electromagnetic signals very quickly, which makes underwater exploration and mapping a very interesting problem.”

Continue reading ⇒

Decentralized air traffic control for drone-laden skies

December 17, 2018

Anticipating skies crowded with crisscrossing autonomous vehicles, University of Michigan researchers have developed a future air traffic control system that allows any number of autonomous planes to safely route around each other to their final destinations.

Kunal Garg, a University of Michigan graduate student, designed the system for autonomous fixed-wing aircraft, which require more time and space to turn than rotorcraft such as quadcopters. The work could also be extended to autonomous ground vehicles, which follow similar turning dynamics.

In this simulation, autonomous fixed-wing aircraft change goal points and flight modes to avoid collisions based on control laws developed by Kunal Garg.
Continue reading ⇒
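The article does not include Garg's control laws themselves, but the behavior it describes, aircraft that switch between a go-to-goal flight mode and an avoidance mode while respecting a bounded turn rate, can be illustrated with a rough, hypothetical sketch. Everything below (class names, the avoidance rule, and all constants) is invented for illustration and is not the actual system:

```python
import math

# Hypothetical sketch of mode-switching collision avoidance for
# constant-speed, bounded-turn-rate agents: a crude stand-in for
# fixed-wing dynamics, not the control laws from the article.

SPEED = 1.0          # constant forward speed (fixed-wing craft cannot hover)
MAX_TURN = 0.3       # maximum heading change per step, in radians
AVOID_RADIUS = 8.0   # separation threshold that triggers "avoid" mode

class Aircraft:
    def __init__(self, x, y, heading, goal):
        self.x, self.y, self.heading = x, y, heading
        self.goal = goal
        self.mode = "go-to-goal"

    def step(self, others):
        # Switch flight mode based on proximity to the nearest aircraft.
        nearest = min(others, key=lambda o: math.hypot(o.x - self.x, o.y - self.y))
        dist = math.hypot(nearest.x - self.x, nearest.y - self.y)
        self.mode = "avoid" if dist < AVOID_RADIUS else "go-to-goal"

        if self.mode == "avoid":
            # Steer directly away from the nearest aircraft.
            desired = math.atan2(self.y - nearest.y, self.x - nearest.x)
        else:
            # Steer toward the current goal point.
            desired = math.atan2(self.goal[1] - self.y, self.goal[0] - self.x)

        # Bounded turn rate: a fixed-wing aircraft cannot turn in place.
        err = math.atan2(math.sin(desired - self.heading),
                         math.cos(desired - self.heading))
        self.heading += max(-MAX_TURN, min(MAX_TURN, err))
        self.x += SPEED * math.cos(self.heading)
        self.y += SPEED * math.sin(self.heading)

# Two aircraft on nearly head-on courses along the x-axis.
a = Aircraft(0.0, 0.1, 0.0, goal=(40.0, 0.0))
b = Aircraft(40.0, -0.1, math.pi, goal=(0.0, 0.0))
min_sep = float("inf")
for _ in range(60):
    a.step([b])
    b.step([a])
    min_sep = min(min_sep, math.hypot(a.x - b.x, a.y - b.y))
print(f"minimum separation: {min_sep:.2f}")
```

The small vertical offsets break the head-on symmetry, so the two aircraft veer to opposite sides once they enter each other's avoidance radius, then resume flying toward their goals.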

Virtual reality job trials for collaborative robots

October 11, 2018
construction worker reads plans
A construction worker reads plans for the GG Brown Addition on North Campus. Photo: Joseph Xu, Michigan Engineering

Robots seem a perfect match for many of the exacting, tedious, and repetitive jobs that occur on a construction site, and they can free up human workers to take on more complex tasks. However, introducing robots capable of nailing drywall or laying loads of brick can introduce unfamiliar dangers to a worksite already full of hazards from heavy machinery, power tools, and cranes dangling steel beams.

To ensure humans feel safe working with and around new robots, researchers at the University of Michigan developed and tested a social theory using a platform that prototypes robots in immersive virtual reality.

Continue reading ⇒

Director’s Letter from Jessy Grizzle

August 23, 2016

What is Michigan Robotics all about?

In three words, we are ALGORITHMS IN MOTION.

We are a team of roboticists and faculty with related expertise who are deeply grounded in the science and fundamentals that produce breakthroughs in robotics, now and in the future. We are fully committed to the balance of theory and practice, where ideas are fleshed out on the whiteboard and in simulations, and then tested on real hardware. Before we share our work with you, our peers and the public, we've given it a thorough going-over in house. We believe that confronting both theory and hardware is critical in the training of the next generation of leaders.

All of us have seen the sped-up videos of robots picking up objects or walking across uneven terrain, a sure sign that the state of robotics is not yet up to the task of dealing with the real world. A driving force in our work is ROBOTS THAT MOVE AT THE SPEED OF LIFE. We are developing the science and technology for robots that work quickly, safely, and efficiently alongside humans, outside the laboratory, or "in the wild," as we like to say.

The core of autonomy is the ability to handle the unknown — explore previously unmapped environments, dexterously manipulate new objects, and recover from unexpected situations, accidents and malfunctions. We attack the problem from all angles, an approach we call FULL SPECTRUM AUTONOMY.

It is much easier to get the top layer in the autonomy spectrum — whether AI or Deep Learning methods — working on a machine that can already reliably execute a rich set of motion primitives. This lower level of the autonomy spectrum typically relies on modeling, optimization, and feedback control methods, integrated with active perception, be it vision or LIDAR or something else entirely.

But none of this makes much sense unless the machine can carry sufficient energy onboard to complete a task. Michigan Robotics researchers also explore completely different notions of autonomy, optimizing robot hardware for mechanical and electromechanical efficiency. Full spectrum autonomy covers the highest levels of reasoning to the lowest level servo loops, active perception, and robotic mechanisms themselves, with the bottom line being:

BOLDLY SENDING ROBOTS WHERE NO ROBOT HAS GONE BEFORE!

Jessy Grizzle
Director, Michigan Robotics
Elmer G. Gilbert Distinguished University Professor
Jerry W. and Carol L. Levin Professor of Engineering