Focus Areas

Autonomous Vehicles | Human-Robot Interaction | Legged Locomotion | Manipulation | Manufacturing and Construction | Mobile, Cooperative Robots | Perception | Rehabilitation Robotics and Brain-Machine Interfaces

Autonomous Vehicles

A roundabout at Mcity, a test site for autonomous vehicles that emulates the urban driving experience, just half a mile from the robotics building site.

Driverless cars steal the headlines, but autonomous vehicles in general can do much more to improve the safety of humans and help protect the environment. Autonomous aerial vehicles can observe farm crops, fish populations, wildfires and droughts. Pollutants, temperature and other indicators of the ocean’s health can be monitored at a variety of depths through the use of autonomous underwater and surface drones or gliders.

Autonomous cars could reduce crashes, congestion and emissions. In anticipation of future demand, all of the major automobile companies now conduct and sponsor research and development of systems that either assist the driver or serve as chauffeur. With the Mcity autonomous vehicle test site on campus, U-M is a natural leader in the field. Michigan Robotics research addresses a variety of technical concerns to make autonomous operation safe and reliable. Other outstanding questions include how autonomous or semi-autonomous systems should interact with humans and how they may affect society. In answering these questions and developing solutions, core robotics faculty benefit from collaboration with the strong psychology, social science and public policy departments at U-M.

Faculty doing autonomous vehicle research include: Ella Atkins, Kira Barton, Jason Corso, Ed Durfee, Tulga Ersal, Ryan Eustice, Anouck Girard, Matt Johnson-Roberson, Vineet Kamat, Edwin Olson, Necmiye Ozay, Dimitra Panagou, Huei Peng, Lionel Robert, Ram Vasudevan


Human-Robot Interaction 

Autonomous robots are rarely purely autonomous. Robots will need a skill set that allows them to interact effectively with humans. This area of robotics benefits from collaboration with social scientists, who can elucidate how teams of humans operate and how a robot might understand these teams and its own role.

Human-robot interaction includes physical elements, such as tele-operation and safety, as well as intellectual and emotional engagement, such as ensuring that humans and robots can accurately predict one another’s behaviors. It also explores how much autonomy should be given to a robot for a particular task, taking into account whether control could be passed to a human operator who would be alert and aware enough to handle a potentially dangerous situation.

Faculty doing human-robot interaction research include: Ella Atkins, Kira Barton, Dmitry Berenson, Cynthia Chestek, Jason Corso, Jia Deng, Brent Gillespie, Chad Jenkins, Ben Kuipers, Walter Lasecki, Gabor Orosz, Emily Mower Provost, Lionel Robert, Nadine Sarter, Xi Jessie Yang


Legged Locomotion

Michigan researchers are giving legged robots remarkable new capabilities. Human-sized bipedal robots are charging over unstructured terrain with unprecedented speed and agility, using advanced feedback control algorithms built on cutting-edge mathematics. Dog-sized quadrupedal robots are being designed to break through the energy-consumption barriers that threaten to keep legged machines curiosities rather than ubiquitous tools. Another effort focuses on novel multi-pedal mechanical designs and associated algorithms that give legged machines a level of adaptability seen today only in the animal world.

This research directly tackles one of the greatest challenges for mobile robots: negotiating human environments full of furniture, fallen debris, catwalks, discrete footholds, and other obstacles inhospitable to wheeled vehicles. U-M faculty engaged in research on new mathematical algorithms that give legged robots the ability to move freely in challenging environments, and on the design of mechanisms to make legged robots more adept and efficient, include: Jessy Grizzle, C. David Remy, Shai Revzen, Elliott Rouse


Manipulation
Can you imagine a robot with the dexterity of a human? Even the dexterity of an “ordinary” human would be incredible, but what if it had the skills of a surgeon or a pianist? Michigan roboticists are laying the foundational blocks with the goal of first meeting and then exceeding the manual dexterity of an ordinary human.

Manipulation – the study of physical interaction between robots and objects – has long been one of the principal challenges of robotics. To be useful for tasks like caring for the elderly, cleaning up a home, stocking a warehouse, assembling electronics, or exploring space, robots must be able to understand actions that we as humans often take for granted, such as “picking-and-placing” objects, using tools, and operating devices. This interaction involves reasoning about the properties of objects, as well as the constraints of the task and uncertainty in the robot’s sensing and motion. The lack of structure in environments like homes means that robots cannot execute a fixed program but instead must constantly adapt to their changing environment, synthesizing new plans and strategies as needed. The presence of humans in the environment also requires the robot to understand human expectations and reason about what humans are likely to do so that it can operate safely and efficiently.

Robots with simple grippers could do amazing things if they had human-level manipulation skills.

U-M faculty are developing leading computational methods that allow robots to perform manipulation in human workspaces and building integrated systems that allow robots to perceive, grasp, and plan to manipulate household objects. Advanced research in manipulation includes artificial intelligence and machine learning methods to teach robots new manipulation tasks and skills, computer vision and scene estimation for taskable goal-directed manipulation, and robust motion planning for manipulation of soft materials like cloth, string, and tissue.

Faculty doing manipulation research include: Dmitry Berenson, Chad Jenkins, Ben Kuipers and John Laird


Manufacturing and Construction Robotics

Michigan researchers in manufacturing and construction robotics are driving innovation to tackle important design challenges in sensing, actuation, and automation. U-M faculty are engaged in research on the use of advanced manufacturing tools and designs to fabricate morphable mechanisms and customizable sensors with unique materials and higher resolutions, and on hybrid modeling and agent-based reasoning and learning methodologies to enable more secure and efficient system-level manufacturing. U-M faculty are also creating collaborative autonomous robots for construction, assembly, inspection, and maintenance of civil infrastructure. Research specifically focuses on finding solutions to the robot localization, pose estimation, and scene understanding problems in unstructured and evolving environments.

Faculty doing manufacturing research include: Kira Barton, Dmitry Berenson, Vineet Kamat, Wesley McGee, Chinedum Okwudire, C. David Remy, Kazuhiro Saitou, Dawn Tilbury, A. Galip Ulsoy


Mobile, Cooperative Robots

Michigan researchers are engineering cooperative multi-robot and multi-vehicle systems to enable applications as diverse as autonomous surveillance and data gathering, search-and-rescue, and autonomous driving. Robots collaborating in teams can often complete tasks more safely and more efficiently than a robot or person working alone. Faculty and students at Michigan focus on the development of cutting-edge integrated systems for autonomous deployment in real-world, uncertain environments, and highly capable robotic systems that can safely interact with other robots and humans. Planning, navigation and control of multiple interacting robots or vehicles demands expertise in a range of technical disciplines, including path and motion planning, control and estimation, decision-making, queuing and logistics, distributed systems, and discrete event systems.

U-M faculty engaged in research on the development and deployment of advanced mathematical algorithms to make multiple robots coordinate and interact safely and efficiently include: Jason Corso, Ryan Eustice, Anouck Girard, Chad Jenkins, Matthew Johnson-Roberson, Yoram Koren, Stephane Lafortune, Walter Lasecki, Lauro Ojeda, Edwin Olson, Necmiye Ozay, Dimitra Panagou


Perception
Michigan researchers are endowing robots with exceptional capabilities for perceiving their environments and their place within them. Our systems integrate vision, sonar, and lidar with proprioceptive sensing such as encoders. State-of-the-art techniques for simultaneously mapping environments and localizing robots, both indoors and outdoors, have resulted from our mobile robotics programs. We are experts in air, land and sea—our robots already inspect ship hulls automatically for damage, and they already handle wildly cluttered indoor settings so a wheelchair can navigate autonomously. Exciting developments in not only visual perception but also audio perception are paving the way for more interactive robots.

Perception research at Michigan directly targets foundational questions:

  • What is the relationship between sensed signals?
  • What are the semantics and the language that humans use to describe actions?
  • How can modeling and the use of both active and passive algorithms enrich a robot’s awareness of the world around it?
  • How can modeling and fusion over many complementary sensors converge into a single overall representation?
  • How can the progressive learning of robotic perceptive intelligence leverage modern machine learning and signal processing methodologies?

Answering these questions will lead to major advances in robotics. Faculty engaged in perception research include: Jason Corso, Jia Deng, Ryan Eustice, Chad Jenkins, Matt Johnson-Roberson, Ben Kuipers, Lauro Ojeda, Ed Olson


Rehabilitation Robotics and Brain-Machine Interfaces

The University of Michigan is a nationwide leader in the scientific effort to design, control, and evaluate robots and other devices that assist people with disabilities. Our research in the field of rehabilitation robotics aims to develop machines that directly and physically interact with humans. These robots deliver physical therapy to assist with recovery after musculoskeletal or neurological injury, or they replace functionality that was lost irreversibly. To this end, we study how the brain and body work, develop powered prostheses and exoskeletons, and investigate novel ways to interact with the human body.

Oxygen measurements inform the design of an exoskeleton.

Many exciting projects are ongoing. U-M researchers work on brain-machine interfaces, such that users can control robots and other assistive devices with their mind; they design and control the next generation of active prostheses and exoskeletons to make them lighter, smarter, and more intuitive to control; they work on robotic exercise equipment that can challenge a patient in just the right way during therapy; and they investigate how the human brain learns and re-learns after neurological injury. Our robots help with all sorts of tasks from walking to manipulation, and they come in all shapes and forms, from large wearable devices to tiny sensors and actuators.

Since rehabilitation robotics is a truly multi-disciplinary and collaborative effort, our team spans the entire College of Engineering, and we further collaborate with specialists from the School of Kinesiology and the Medical School. The complementary expertise of our faculty creates an unrivaled environment that accelerates the development, testing, and effective use of devices that improve mobility and function in individuals with physical disabilities and limitations.

Faculty doing medical and rehabilitation robotics research include: Cynthia Chestek, Daniel Ferris, Deanna Gates, Brent Gillespie, Jessy Grizzle, Jane Huggins, Chandramouli Krishnan, Ben Kuipers, C. David Remy, Elliott Rouse, Rachel Seidler, Kathleen Sienko
