Successfully assimilating robotic systems into our everyday lives requires that they robustly perform physical interaction tasks in ever-changing human environments. One approach to improving how robots learn such tasks in the real world is to enhance their ability to interpret human behavior. This applies not only to wearable robots, such as exoskeletons, that must guide human movement, but also to autonomous robotic systems: learning by observing the actions and interactions of humans can be safer and more efficient than learning from random exploration alone. Adding a human in the loop remains a challenge, however, because little is understood about how the human neuromotor system controls movement and manages physical interaction. In this presentation, I will discuss my research into understanding how humans control physical interactions during complex tasks. I will also highlight recent work on how humans can perceive the dynamic properties of a limb by visually observing its motion.
Dr. Meghan E. Huber is a postdoctoral research associate in the Department of Mechanical Engineering at the Massachusetts Institute of Technology and a member of the Newman Laboratory for Biomechanics and Human Rehabilitation, led by Professor Neville Hogan. She received her B.S. degree in Biomedical Engineering from Rutgers University in 2009 and her M.S. degree in Biomedical Engineering from The University of Texas at Dallas in 2011. She received her Ph.D. in Bioengineering from Northeastern University in 2016, advised by Professor Dagmar Sternad. During her doctoral training, she was a Visiting Junior Scientist in the Autonomous Motion Department at the Max Planck Institute for Intelligent Systems in Tübingen, Germany, in 2014-2015.