22/01/2020 Computer Science
DOI: 10.1109/ICRA40945.2020.9196665 SemanticScholar ID: 211858532 MAG: 3010467267

Human-like Planning for Reaching in Cluttered Environments

Publication Summary

Humans, in comparison to robots, are remarkably adept at reaching for objects in cluttered environments. The best existing robot planners are based on random sampling of configuration space, which becomes excessively high-dimensional as the number of objects grows. Consequently, such planners often fail to efficiently find object manipulation plans in these environments. We addressed this problem by identifying high-level manipulation plans in humans, and transferring these skills to robot planners. We used virtual reality to capture human participants reaching for a target object on a tabletop cluttered with obstacles. From this, we devised a qualitative representation of the task space to abstract the decision making, irrespective of the number of obstacles. Based on this representation, human demonstrations were segmented and used to train decision classifiers. Using these classifiers, our planner produced a list of waypoints in task space. These waypoints provided a high-level plan, which could be transferred to an arbitrary robot model and used to initialise a local trajectory optimiser. We evaluated this approach through testing on unseen human VR data, a physics-based robot simulation, and a real robot (dataset and code are publicly available). We found that the human-like planner outperformed a state-of-the-art standard trajectory optimisation algorithm, and was able to generate effective strategies for rapid planning, irrespective of the number of obstacles in the environment.
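The pipeline described above can be sketched in miniature. This is a hypothetical illustration, not the authors' implementation: the task space is reduced to a qualitative state (direction to target plus whether the direct path is locally blocked), a stand-in decision rule plays the role of the trained classifiers, and the resulting waypoint list is what would seed a local trajectory optimiser. All names (`TaskSpace`, `decide_step`, `plan_waypoints`) and the fixed detour rule are assumptions for the sketch.

```python
# Hypothetical sketch of the human-like planning pipeline: a qualitative
# abstraction of the task space feeds a decision rule that emits waypoints.
# In the paper the decision rule is a classifier trained on human VR
# demonstrations; here it is a fixed hand-written policy.
from dataclasses import dataclass
from typing import List, Set, Tuple

Cell = Tuple[int, int]  # discrete tabletop cell (x, y)

@dataclass
class TaskSpace:
    obstacles: Set[Cell]  # occupied cells
    target: Cell

    def qualitative_state(self, hand: Cell) -> Tuple[int, int, bool]:
        """Abstract the scene to a direction-to-target sign vector and a
        local-blockage flag, independent of how many obstacles exist."""
        dx = (self.target[0] > hand[0]) - (self.target[0] < hand[0])
        dy = (self.target[1] > hand[1]) - (self.target[1] < hand[1])
        blocked = (hand[0] + dx, hand[1] + dy) in self.obstacles
        return dx, dy, blocked

def decide_step(state: Tuple[int, int, bool]) -> Tuple[int, int]:
    """Stand-in for the trained decision classifier: head for the target;
    when the direct cell is blocked, take a fixed lateral detour."""
    dx, dy, blocked = state
    if not blocked:
        return dx, dy
    return (0, 1) if dx != 0 else (1, 0)

def plan_waypoints(space: TaskSpace, hand: Cell, max_steps: int = 50) -> List[Cell]:
    """Produce the high-level waypoint list that would be handed to an
    arbitrary robot model to initialise a local trajectory optimiser."""
    waypoints = [hand]
    for _ in range(max_steps):
        if hand == space.target:
            break
        dx, dy = decide_step(space.qualitative_state(hand))
        hand = (hand[0] + dx, hand[1] + dy)
        waypoints.append(hand)
    return waypoints
```

For example, with two obstacle cells between the hand and the target, `plan_waypoints(TaskSpace(obstacles={(1, 0), (1, 1)}, target=(4, 0)), (0, 0))` returns a waypoint list that sidesteps the clutter and ends at the target. Because the qualitative state has a fixed size, the decision step costs the same regardless of how many obstacles are on the table.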

CAER Authors

Dr. Faisal Mushtaq

University of Leeds - Associate Professor in Cognitive Neuroscience

Prof. Mark Mon-Williams

University of Leeds - Chair in Cognitive Psychology
