2020 Computer Science
SemanticScholar ID: 211572564 MAG: 3007009866

Introducing a Human-like Planner for Reaching in Cluttered Environments

Publication Summary

Humans, in comparison to robots, are remarkably adept at reaching for objects in cluttered environments. The best existing robot planners are based on random sampling in configuration space -- which becomes excessively high-dimensional with a large number of objects. Consequently, most of these planners suffer from limited object manipulation. We address this problem by learning high-level manipulation planning skills from humans and transferring these skills to robot planners. We used virtual reality to generate data from human participants whilst they reached for objects on a cluttered tabletop. From this, we devised a qualitative representation of the task space to abstract human decisions, irrespective of the number of objects in the way. Based on this representation, human demonstrations were segmented and used to train decision classifiers. Using these classifiers, our planner produced a list of waypoints in task space. These waypoints provide a high-level plan, which can be transferred to an arbitrary robot model and used to initialize a local trajectory optimiser. We evaluated this approach through testing on unseen human VR data, a physics-based robot simulation and real robot experiments. We find that this human-like planner outperforms a state-of-the-art standard trajectory optimisation algorithm and is able to generate effective strategies for rapid planning, irrespective of the number of objects in a cluttered environment. Our dataset and source code are publicly available.
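The pipeline described above — a decision classifier mapping a qualitative scene abstraction to the next task-space action, yielding waypoints that seed a local trajectory optimiser — can be sketched as follows. This is a minimal illustrative sketch only: the `QualitativeState` fields, the rule-based stand-in for the trained classifier, and all function names are assumptions, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class QualitativeState:
    """Abstracts the scene irrespective of how many objects clutter it.

    Fields are hypothetical examples of a qualitative task-space
    representation, not the representation used in the paper.
    """
    hand_region: str    # coarse region of the hand, e.g. "left", "centre"
    target_region: str  # coarse region of the target object
    blocked: bool       # is the straight-line approach obstructed?

def decide_action(state: QualitativeState) -> str:
    """Toy rule-based stand-in for a classifier trained on segmented
    human VR demonstrations."""
    if state.blocked:
        return "go_around"  # detour around the clutter
    if state.hand_region != state.target_region:
        return "move_toward_target"
    return "reach"

def plan_waypoints(start: Tuple[float, float, float],
                   target: Tuple[float, float, float],
                   state: QualitativeState) -> List[Tuple[float, float, float]]:
    """Produce a high-level list of task-space waypoints; a robot-specific
    local trajectory optimiser would then refine the path between them."""
    waypoints = [start]
    if decide_action(state) == "go_around":
        # insert a detour waypoint lifted above the clutter
        midpoint = ((start[0] + target[0]) / 2,
                    (start[1] + target[1]) / 2,
                    max(start[2], target[2]) + 0.15)
        waypoints.append(midpoint)
    waypoints.append(target)
    return waypoints

# Example: an obstructed reach gains an intermediate detour waypoint.
wps = plan_waypoints((0.0, 0.0, 0.1), (0.4, 0.2, 0.1),
                     QualitativeState("left", "centre", blocked=True))
```

Because the waypoints live in task space rather than configuration space, the same high-level plan can initialise an optimiser for any robot model, which is the transferability the summary describes.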

CAER Authors

Dr. Faisal Mushtaq

University of Leeds - Associate Professor in Cognitive Neuroscience

Prof. Mark Mon-Williams

University of Leeds - Chair in Cognitive Psychology
