‘Fake’ data helps robots learn the ropes faster

June 29, 2022

In a step toward robots that can learn on the fly like humans do, a new approach expands training data sets for robots that work with soft objects like ropes and fabrics, or in cluttered environments.

Developed by robotics researchers at the University of Michigan, it could cut learning time for new materials and environments down to a few hours rather than a week or two.

In simulations, the expanded training data set improved the success rate of a robot looping a rope around an engine block by more than 40% and nearly doubled the successes of a physical robot for a similar task.
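The core idea, generating additional "fake" but physically plausible training examples by transforming real ones, can be illustrated with a minimal sketch. This is illustrative only; the function names, transforms, and ranges below are assumptions, not the researchers' actual method:

```python
import numpy as np

def augment_trajectory(points, n_copies=10, rng=None):
    """Generate plausible variants of a recorded rope trajectory by
    applying random planar rotations and small translations.

    points: (T, 2) array of 2-D rope-point positions over time.
    Returns a list of n_copies transformed (T, 2) arrays.
    """
    rng = np.random.default_rng() if rng is None else rng
    variants = []
    for _ in range(n_copies):
        theta = rng.uniform(-np.pi, np.pi)          # random rotation angle
        rot = np.array([[np.cos(theta), -np.sin(theta)],
                        [np.sin(theta),  np.cos(theta)]])
        shift = rng.uniform(-0.1, 0.1, size=2)      # small random translation
        variants.append(points @ rot.T + shift)
    return variants

# One real demonstration becomes many training examples.
real = np.array([[0.0, 0.0], [0.1, 0.0], [0.2, 0.05]])
fake = augment_trajectory(real, n_copies=5)
print(len(fake), fake[0].shape)   # 5 copies, each the same shape as the original
```

Because rigid rotations and translations preserve the rope's geometry, each transformed copy is as physically valid as the original, which is what makes this kind of "fake" data useful for training.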

Rubble-roving robots use hands and feet to navigate treacherous terrain

August 13, 2021

Humans are adept at using their hands to keep their balance, whether grabbing a railing while climbing stairs, walking with the help of a cane, or gripping a strap on the subway. Now, University of Michigan researchers have enabled humanoid robots to use their hands in a similar way, so the robots can better travel across rough terrain, such as disaster areas or construction sites.

“In a collapsed building or on very rough terrain, a robot won’t always be able to balance itself and move forward with just its feet,” said Dmitry Berenson, professor of electrical and computer engineering and core faculty in the Robotics Institute. 

“You need new algorithms to figure out where to put both feet and hands. You need to coordinate all these limbs together to maintain stability, and what that boils down to is a very difficult problem.”

Helping robots learn what they can and can’t do in new situations

May 19, 2021

The models that robots use to do tasks work well in the structured environment of the laboratory. Outside the lab, however, even the most sophisticated models may prove inadequate in new situations or in difficult-to-model tasks, such as working with soft materials like rope and cloth.

To overcome this problem, University of Michigan researchers have created a way for robots to predict when they can’t trust their models, and to recover when they find that their model is unreliable. 

“We’re trying to teach the robot to make do with what it has,” said Peter Mitrano, Robotics PhD student.
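The general pattern, pairing a learned model with a learned estimate of when that model can be trusted, can be sketched as a toy example. Everything below is an assumption for illustration (placeholder dynamics, a hand-coded reliability test, made-up names), not the researchers' actual algorithm:

```python
import numpy as np

def dynamics_model(state, action):
    """Toy stand-in for a learned dynamics model: predicts the next state."""
    return state + action

def reliability_classifier(state, action):
    """Toy stand-in for a learned classifier that estimates whether the
    dynamics model can be trusted for this state-action pair.
    Here we simply distrust large actions, where a linear model breaks down."""
    return np.linalg.norm(action) < 1.0

def plan_step(state, proposed_action, recovery_action):
    """Use the model only where it is predicted to be reliable;
    otherwise fall back to a conservative recovery action."""
    if reliability_classifier(state, proposed_action):
        return dynamics_model(state, proposed_action), "model"
    return dynamics_model(state, recovery_action), "recovery"

state = np.zeros(2)
next_state, mode = plan_step(state, np.array([3.0, 0.0]), np.array([0.1, 0.0]))
print(mode)   # "recovery": the large action was flagged as unreliable
```

The design point is the separation of concerns: the robot does not need a better model everywhere, only a way to recognize where its current model fails and to act conservatively there, which is what "making do with what it has" amounts to.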

Using computer vision to track social distancing

April 15, 2020

With advanced computer vision models and live public street cam video, a University of Michigan startup is tracking social distancing behaviors in real time at some of the most visited places in the world.

Voxel51’s new tool shows—quite literally—an uptick in public gathering in Dublin on St. Patrick’s Day, for example, and at New Jersey’s Seaside Heights boardwalk during a recent weekend of unusually good weather.

A quicker eye for robotics to help in our cluttered, human environments

May 23, 2019

Chad Jenkins, seen here with a Fetch robot, leads the Laboratory for Progress, which aims to discover methods for computational reasoning and perception that will enable robots to effectively assist people in common human environments. Karthik Desingh, lead author on the paper, is a member of the lab. Photo: Joseph Xu/Michigan Engineering

In a step toward home-helper robots that can quickly navigate unpredictable and disordered spaces, University of Michigan researchers have developed an algorithm that lets machines perceive their environments orders of magnitude faster than similar previous approaches.
