Successful assimilation of robotic systems into our everyday lives requires that they be genuinely useful to humans. Whether performing an assembly-line task alongside a human in a factory or assisting with household chores in the home, robots must be capable of robustly performing physical interaction tasks in collaboration with humans. In the new Human Robot Systems (HRS) Laboratory at UMass Amherst, our mission is to (1) improve human-robot physical interaction using principles from human neuromotor control and perception and (2) advance how humans and robots learn to guide the physical interactive behavior of one another. To this end, I will first discuss our research on improving the interpretability of both human and robot motor behavior to facilitate human-robot physical interaction. In particular, I will highlight our recent work on human visual perception of mechanical impedance from motion. Second, I will discuss our work on implicitly guiding and shaping human behavior with robotic exoskeletons in order to robustly deliver physical assistance and/or rehabilitation in real-world environments.
Meghan E. Huber is an Assistant Professor in the Department of Mechanical and Industrial Engineering at the University of Massachusetts Amherst, where she leads the new Human Robot Systems (HRS) Laboratory. Prior to joining UMass Amherst, she was a postdoctoral research associate in the Department of Mechanical Engineering at the Massachusetts Institute of Technology from 2016 to 2020. She received her B.S. in Biomedical Engineering from Rutgers University in 2009, her M.S. in Biomedical Engineering from The University of Texas at Dallas in 2011, and her Ph.D. in Bioengineering from Northeastern University in 2016. During her doctoral training, she was also a Visiting Junior Scientist in the Autonomous Motion Department at the Max Planck Institute for Intelligent Systems in Tübingen, Germany, in 2014.