Dr. Jason Corso and Dr. Brent Griffin are extending prior work in bottom-up video segmentation to include depth information from RGBD video, which allows us to better train for specific tasks and to adaptively update representations of objects in complex environments. For robotics applications, we are incorporating this into a framework that guides the processing of RGBD video using a kinematic description of a robot’s actions, thereby increasing the quality of observations while reducing overall computational costs. Using kinematically-guided RGBD video, we can provide real-time feedback to a robot to identify task failure, detect external objects or agents moving into a workspace, and develop a better understanding of objects while interacting with them.
The concept of “formal methods”, also known as “correct by construction”, is applied to trajectory planning for mobile robots and small autonomous vehicles. The main challenges are (i) the presence of multiple moving objects (pedestrians, other robots/vehicles), and (ii) plant uncertainties. We aim to address both in our research.
The example publications show a formal design process for dealing with multiple moving objects (without considering plant uncertainties). Our method achieves a better balance between safety (zero collisions) and performance (the robot does not keep stopping to avoid collisions) compared with other methods from the literature.
Mr. Yuxiao Chen is a U-M graduate student co-advised by Professors Huei Peng and Jessy Grizzle.
CSE graduate students Qi Zhang and Shun Zhang will present exciting research papers at ICAPS 2017, the 27th International Conference on Automated Planning and Scheduling, taking place this June at Carnegie Mellon University in Pittsburgh, PA.
Here are the papers:
Professor Dimitra Panagou has been awarded the NASA Early Career Faculty Award, which enables Professor Panagou and her student team to develop AstroNet: a swarm of free-flying space co-robots conceived to interact with NASA crew members and assist them in Extra-Vehicular Activities (EVAs) for the inspection, maintenance, and repair of spacecraft exteriors. The 3-year project will focus on developing Guidance, Navigation, and Control algorithms that will enable the AstroNet to (i) safely surround a crew member during EVAs, (ii) perceive simple human commands (e.g., gestures), and (iii) respond to those commands by autonomously redistributing in space to dynamically and continuously enhance mission efficacy in a human-centric way.
Cooperative Quadrotors in Action (video).
Study.com – Top Robotics Graduate Programs
Grad School Hub – Top 20 Robotics Engineering Schools in the U.S.
John E. Laird, U-M
February 23, 2015