News Release

Fast and comfortable robot-to-human handover for mobile cooperation robot system

Peer-Reviewed Publication

Beijing Institute of Technology Press Co., Ltd

The proposed mobile robot handover system comprises five modules: Perception, Grasp Planning, Navigation, Hand Pose Generation, and Handover Execution.

Initially, the Perception module enables the robot to search for and identify the object within its environment. Leveraging prior knowledge, the Grasp Planning module predicts an appropriate hand pose for successful object grasping. Following this, the Navigation module guides the robot toward the human recipient, dynamically adjusting to their current posture. The Hand Pose Generation module is responsible for tracking the human hand and generating a detailed mesh model, which aids in pinpointing the optimal handover location. Finally, the robot completes the handover by transferring the object to the human using a responsive strategy that adapts to real-time human movements.

Credit: Tin Lun Lam, School of Science and Engineering, The Chinese University of Hong Kong

A research paper by scientists at The Chinese University of Hong Kong proposed a method that enables a mobile robot to hand over objects to humans efficiently and safely by combining mobile navigation with visual perception.

The new research paper, published on Aug. 13 in the journal Cyborg and Bionic Systems, introduced a comprehensive handover framework tailored for mobile robots, designed to manage the entire handover process seamlessly, from locating the object to grasping it and delivering it to the human.

Human-robot interaction has become a cornerstone of contemporary society, with applications spanning manufacturing, healthcare, and personal assistance. Nevertheless, achieving a handover process that matches the efficiency and fluidity of exchanges between humans remains a formidable challenge for the robotics community. “The impetus for a robot-to-human handover is derived from the human’s need to acquire an object for a specific task. The object in question may be situated within the robot’s immediate operational area, such as an operating table, or it may be located some distance away,” explained study author Tin Lun Lam, a professor at The Chinese University of Hong Kong. Consequently, the robot must execute a series of actions: navigate effectively and safely to the object’s location, secure the object, and then return to deliver it to the human collaborator.
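As a rough picture of how the five modules described above might be chained together in software, the sketch below walks a single object through that sequence. The module interfaces, function names, and poses are illustrative assumptions for this release, not the authors' implementation.

```python
# Hypothetical sketch of the five-stage handover pipeline; names are invented.
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float
    z: float


def perceive_object(scene: dict) -> Pose:
    """Perception: search the mapped environment for the requested object."""
    return scene["object_pose"]


def plan_grasp(object_pose: Pose) -> Pose:
    """Grasp Planning: choose a grasp that leaves room for a later human grasp."""
    return Pose(object_pose.x, object_pose.y, object_pose.z + 0.05)  # grasp from above


def navigate_to(target: Pose) -> None:
    """Navigation: drive the mobile base toward a target while avoiding obstacles."""
    print(f"navigating to ({target.x:.2f}, {target.y:.2f})")


def estimate_hand_pose(scene: dict) -> Pose:
    """Hand Pose Generation: track the human hand and pick a handover location."""
    return scene["human_hand_pose"]


def execute_handover(hand_pose: Pose) -> None:
    """Handover Execution: move the gripper to the hand and release the object."""
    print(f"releasing object at ({hand_pose.x:.2f}, {hand_pose.y:.2f}, {hand_pose.z:.2f})")


def handover_pipeline(scene: dict) -> None:
    obj = perceive_object(scene)
    grasp = plan_grasp(obj)
    navigate_to(obj)                      # go to the object and grasp it
    print(f"grasping at height {grasp.z:.2f}")
    hand = estimate_hand_pose(scene)
    navigate_to(hand)                     # return to the human recipient
    execute_handover(hand)


if __name__ == "__main__":
    handover_pipeline({"object_pose": Pose(2.0, 1.0, 0.8),
                       "human_hand_pose": Pose(0.0, 0.0, 1.1)})
```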

The handover navigation process divides into two sequential stages: the initial detection and acquisition of the object, followed by the robot’s traversal to the human recipient. This sequence encompasses four core components: localization, exploration, object grasping, and path planning. Model-based human body and hand reconstruction techniques support the second stage by allowing the robot to locate the recipient and the handover point. “Our robotic system can map its environment in real time and locate objects to pick up. It uses advanced algorithms to grasp objects in a way that suits human preference and employs path planning and obstacle avoidance to navigate back to the human user. The robot adjusts its movements during handover by analyzing the human’s posture and movements through visual sensors, ensuring a smooth and collision-free handover. Tests of our system show that it can successfully hand over various objects to humans and adapt to changes in the human’s hand position, highlighting improvements in safety and versatility for robotic handovers,” said Chongxi Meng.
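The behavior Meng describes, adjusting to the human's hand through visual sensors, can be imagined as a simple closed loop: re-estimate the hand position every frame and nudge the gripper toward it until it is close enough to release. The snippet below is a minimal sketch under that assumption; the tracker stub, step size, and tolerance are invented for illustration and are not the paper's algorithm.

```python
# Minimal closed-loop "responsive handover" sketch; all values are illustrative.
import math
import random


def get_hand_position():
    """Stand-in for a visual hand tracker; returns (x, y, z) in metres."""
    # Simulate a hand that drifts slightly between frames.
    return (0.5 + random.uniform(-0.01, 0.01),
            0.2 + random.uniform(-0.01, 0.01),
            1.1)


def move_gripper_toward(current, target, step=0.05):
    """Move the gripper a small, per-axis-clamped step toward the target."""
    return tuple(c + max(-step, min(step, t - c)) for c, t in zip(current, target))


def responsive_handover(tolerance=0.02, max_steps=200):
    gripper = (0.0, 0.0, 1.0)
    for _ in range(max_steps):
        hand = get_hand_position()          # re-estimate the hand every frame
        if math.dist(gripper, hand) < tolerance:
            print("releasing object")       # close enough: hand over the object
            return True
        gripper = move_gripper_toward(gripper, hand)
    return False


if __name__ == "__main__":
    responsive_handover()
```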

A prominent feature of the framework is its vision-based system, which recognizes and interprets a wide range of human hand postures through a specialized detection and reconstruction approach, allowing accurate hand pose estimation in varied configurations. The framework’s ability to identify the optimal grasp type is crucial: it ensures that the robot can determine both a safe hand posture for the human recipient and a successful grasp configuration for the object itself. To demonstrate the system’s versatility across different scenarios, the researchers tested the robot-to-human handover algorithm on both single-arm and dual-arm robots. The results, showing the algorithm’s adaptability and performance in varied settings, are presented in a video in the paper’s Supplementary Materials, further supporting the robustness and general applicability of the algorithms that make up the handover system.
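One way to picture the "optimal grasp type" decision is as a score over candidate grasps that trades off the robot's chance of a secure grasp against how much of the object remains free and safely oriented for the human hand. The toy scoring below is purely illustrative; the criteria, weights, and candidates are assumptions for this release, not the paper's method.

```python
# Toy grasp-selection sketch: balance grasp reliability against human safety/comfort.
from dataclasses import dataclass


@dataclass
class GraspCandidate:
    name: str
    robot_success_prob: float   # estimated probability the robot grasp succeeds
    human_clearance: float      # fraction of the object left free for the human hand


def score(candidate: GraspCandidate, w_success=0.5, w_clearance=0.5) -> float:
    """Weighted trade-off between robot grasp reliability and human-side clearance."""
    return w_success * candidate.robot_success_prob + w_clearance * candidate.human_clearance


def choose_grasp(candidates):
    return max(candidates, key=score)


if __name__ == "__main__":
    candidates = [
        GraspCandidate("grip near the tip, leaving the handle for the human", 0.7, 0.9),
        GraspCandidate("grip the middle of the handle", 0.9, 0.3),
    ]
    best = choose_grasp(candidates)
    print(f"selected grasp: {best.name} (score {score(best):.2f})")
```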

Authors of the paper include Chongxi Meng, Tianwei Zhang, Da Zhao, and Tin Lun Lam.

This work was supported by the National Natural Science Foundation of China (grant nos. 62306185 and 62073274), the Guangdong Basic and Applied Basic Research Foundation (grant no. 2023B1515020089), and the Shenzhen Science and Technology Program (grant no. JSGGKQTD20221101115656029).

The paper, “Fast and Comfortable Robot-to-Human Handover for Mobile Cooperation Robot System,” was published in the journal Cyborg and Bionic Systems on Aug. 13, 2024, at DOI: 10.34133/cbsystems.0120.


Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.