A newly designed robot can navigate a cluttered space and retrieve objects from it, much as a human would carry out the task. In the future, it could be deployed in settings where workers require support.
The EPSRC-funded project investigates how the skills humans use when completing a fairly complex manipulation task, such as picking up an item from the back of a cluttered shelf, can be transferred to a robot. The aim is to ‘teach’ the robot to use this information so that it can carry out the task in a more human-like manner, and more efficiently.
Researchers collected data from humans manipulating objects in a virtual reality world. Using machine learning, the robot then extracts ‘rules’ that explain the choices the humans made whilst manipulating the objects in the virtual world. It uses these learned rules to produce a high-level plan for how best to manipulate objects when presented with a problem in the real world, overcoming limitations in current robotic technology in cluttered environments.
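The rule-extraction step described above can be illustrated with a toy sketch. Here each human demonstration is reduced to a (situation, chosen action) pair, and a ‘rule’ is simply the action humans chose most often in each situation. This is a deliberately simplified stand-in for the project's actual machine-learning approach; the situation and action names are invented for illustration.

```python
from collections import Counter, defaultdict

def learn_choice_rules(demonstrations):
    """Extract simple majority-vote 'rules' from human demonstrations.

    `demonstrations` is a list of (situation, action) pairs. The learned
    rule set maps each situation to the action humans chose most often.
    A toy stand-in for the project's machine-learning step, not its
    actual method.
    """
    votes = defaultdict(Counter)
    for situation, action in demonstrations:
        votes[situation][action] += 1
    # For each situation, keep the single most frequent human choice.
    return {s: counts.most_common(1)[0][0] for s, counts in votes.items()}
```

In practice the learned model would operate on richer scene descriptions (object poses, gaps between obstacles) rather than discrete labels, but the principle of generalising from observed human choices is the same.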
Finding solutions to problems using virtual reality data
Researchers from the Schools of Computing and Psychology, supported by the Centre for Immersive Technologies, are pioneering the investigation of ‘human-like planning’ in robots, and a prototype is in the early stages of testing. They are combining the AI techniques of ‘automated planning’ and ‘machine learning’ to overcome limitations in existing models.
Automated planning involves the robot exploring the different possible actions in its knowledge base, and the order in which they might be carried out to reach the item. The current state of the art tries different action sequences at random until it finds one that works; however, this process can take a long time. A novel aspect of Professor Cohn’s research is that it takes data from human movements in virtual reality, and enables the robot to use it to produce better plans.
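The random-search baseline described above can be sketched in a few lines: sample action sequences from a domain model until one reaches the goal. The function names (`goal_test`, `apply_action`) and the domain model are illustrative assumptions, not the project's actual planner interface.

```python
import random

def random_sequence_planner(initial_state, goal_test, actions, apply_action,
                            max_tries=1000, max_depth=8):
    """Baseline automated planner: sample random action sequences until
    one reaches the goal.

    `actions`, `apply_action`, and `goal_test` are placeholders for a
    domain model. This illustrates why unguided search can be slow:
    runtime grows with the number of sequences tried, which is where
    human-derived guidance can help.
    """
    for _ in range(max_tries):
        state, plan = initial_state, []
        for _ in range(max_depth):
            action = random.choice(actions)     # unguided random choice
            state = apply_action(state, action)
            plan.append(action)
            if goal_test(state):
                return plan                     # first working sequence wins
    return None                                 # no plan found in budget
```

Replacing `random.choice` with a model learned from human demonstrations biases the search toward sequences humans would try first, which is the intuition behind the approach described in the article.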
By learning how humans plan in cluttered environments, the robot is able to formulate a high-level plan to navigate the gaps and objects. This information is then fed into a traditional planning system to calculate detailed trajectories for the robot arm. When confronted with a new scenario, the robot uses the model it has learned to assess and choose a route, identifying the key points of its planned route in order to retrieve the target item.
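The two-stage pipeline described above can be sketched as follows: a learned model proposes a high-level route (a sequence of key waypoints), and a traditional motion planner fills in the detailed trajectory between consecutive waypoints. The interface names (`predict_route`, `lowlevel_planner`) are illustrative stand-ins, not the project's actual code.

```python
def plan_retrieval(scene, highlevel_model, lowlevel_planner):
    """Two-stage planning sketch.

    `highlevel_model` stands in for the model learned from VR human
    demonstrations; `lowlevel_planner` stands in for a traditional
    motion planner (e.g. a sampling-based planner) that computes the
    detailed arm trajectory between two waypoints.
    """
    # Stage 1: learned model picks the key points of the route.
    waypoints = highlevel_model.predict_route(scene)
    # Stage 2: traditional planner fills in each segment in detail.
    trajectory = []
    for start, end in zip(waypoints, waypoints[1:]):
        trajectory.extend(lowlevel_planner(start, end))
    return trajectory
```

Splitting the problem this way keeps the expensive low-level search short: it only has to connect nearby waypoints rather than solve the whole cluttered scene at once.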
The researchers collected data using sensors attached to three points on a person’s arm (the wrist, elbow, and shoulder) to build an accurate picture of how the human interacted with the virtual world and its dynamics. The team then used this information to learn how humans manipulate objects in such settings, and transferred this knowledge to the robot.
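A recording from the three arm sensors can be pictured as a stream of time-stamped 3-D positions. The record layout below is hypothetical (the project's actual data format is described in the published dataset), and `trajectory_length` shows one simple feature that might be extracted from such a stream.

```python
from dataclasses import dataclass
from typing import Tuple

Point3D = Tuple[float, float, float]

@dataclass
class ArmFrame:
    """One time-stamped sample of the three tracked arm points.

    A hypothetical record layout for illustration only.
    """
    t: float            # time in seconds
    wrist: Point3D      # 3-D position in VR coordinates
    elbow: Point3D
    shoulder: Point3D

def trajectory_length(frames):
    """Total wrist path length over a recording: one simple summary
    feature that could be computed from such sensor data."""
    total = 0.0
    for a, b in zip(frames, frames[1:]):
        # Euclidean distance between consecutive wrist positions.
        total += sum((x - y) ** 2 for x, y in zip(a.wrist, b.wrist)) ** 0.5
    return total
```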
Making logistics easier
Currently, robots that are deployed in factories can perform limited, simple tasks and often require long pauses between activities. Objects need to be carefully arranged so the robot can work with them, which requires human intervention.
The robotic technology being developed at the University of Leeds could, in the future, serve an important function where there is a need to explore an environment that is dangerous for humans to enter.
The technology could also be used in care settings, where elderly patients may benefit from having an assistant. A robot with the ability to find and bring items to a person who is unable to retrieve them could help a person who has limited mobility, for example.
Professor Tony Cohn, who is leading the EPSRC grant, said: “The idea is that the robots can assist humans and collaborate with them to make manual tasks easier and more efficient. The potential applications for flexible, multi-purpose robots that can use human-like planning are vast. Possibilities range from factories, to healthcare organisations, and even agricultural settings.”
He continued: “Society has been automating since the industrial revolution, and robotics will continue to add value to the role of technology in improving infrastructure. Wider impact is at the heart of the robotics community.
“Ultimately, research into human-like physics in robots could allow us to get closer to bridging the robot-human skill gap. By having robots that move in a ‘human-like way’, such collaborations will be easier for the human participants.”
Community of researchers
The University of Leeds is home to a vibrant, multidisciplinary community of engineers, computer scientists and roboticists, and offers a variety of funded PhD opportunities for postgraduate researchers like Wisdom Agboh. The research described here was carried out by Mohamed Hasan and Matthew Warburton, from the Schools of Computing and Psychology respectively, who were employed on the grant.
‘Human-like computing’ is a multidisciplinary collaboration comprising researchers who each bring a different area of expertise.
Dr He Wang, from the School of Computing, contributes to the virtual reality experimental design. Drs Mehmet Dogar and Matteo Leonetti, also based in Computing, lead on different aspects of physics-based robotic manipulation and machine learning.
Professor Mark Mon-Williams and Dr Faisal Mushtaq are leading the psychology and cognitive science aspects of the research, and bring expertise in cognitive neuroscience, reinforcement learning and decision-making, and experimental methodologies in virtual reality.
Read the paper: Human-like Planning for Reaching in Cluttered Environments
View the dataset: Learning manipulation planning from VR human demonstrations
Learn about the project: Human-like computing
If you would like to discuss this area of research in more detail, please contact Professor Anthony Cohn.