Presentation of collaborative project with Microsoft about Mixed Reality for Shared Autonomy

On 30 March 2022, I presented our recently started collaborative project with Microsoft on Mixed Reality for Shared Autonomy at the Applied Machine Learning Days (AMLD EPFL 2022). The presentation is available at https://youtu.be/Ybmt7sX3zYA

Our objective is to exploit recent developments in MR to enhance human capabilities with robotic assistance. Robots offer mobility and power but are not yet capable of performing complex tasks in challenging environments such as construction, contact-based inspection, cleaning, and maintenance. Humans, on the other hand, have excellent higher-order reasoning, and skilled workers have the experience and training to adapt to new circumstances quickly and effectively; however, they lack the mobility and power of robots. We aim to reduce this limitation by empowering human operators with the assistance and capabilities of a robot system. This requires a human-robot interface that fully leverages the strengths of both the human operator and the robot.

In this project we will explore the problem of shared autonomy for physical interaction tasks in shared physical workspaces. We will investigate how an operator can effectively command a robot system through an MR interface across a range of autonomy levels, from low-level direct teleoperation to high-level task specification. We will develop methods for estimating the intent and comfort level of an operator to provide an intuitive and effective interface. Finally, we will explore how to pass information from the robot system back to the human operator so that the operator can effectively understand the robot's plans. We will demonstrate the value of mixed reality interfaces by enhancing human capabilities with robot systems through effective, bilateral communication for a wide variety of complex tasks.