Engineers at the University of California, San Diego have made significant strides in humanoid robotics by training a robot to perform a wide range of expressive movements effortlessly. This includes simple dance routines and gestures like waving, high-fiving, and hugging, all while maintaining a steady gait on various terrains. The enhanced expressiveness and agility of this humanoid robot hold promise for improving human-robot interactions in diverse settings, such as factory assembly lines, hospitals, homes, and hazardous environments like laboratories or disaster sites.

“Through expressive and more human-like body motions, we aim to build trust and showcase the potential for robots to coexist harmoniously with humans,” said Xiaolong Wang, a professor in the Department of Electrical and Computer Engineering at the UC San Diego Jacobs School of Engineering. “We are working to help reshape public perceptions of robots as friendly and collaborative rather than terrifying like The Terminator.”

Wang and his team will present their groundbreaking work at the 2024 Robotics: Science and Systems Conference, scheduled to take place from July 15 to 19 in Delft, Netherlands.

What sets this humanoid robot apart is its ability to learn from a diverse array of human body motions, enabling it to generalize to new movements and mimic them with ease. Like a quick-learning dance student, the robot can swiftly pick up new routines and gestures. The team achieved this by using an extensive collection of motion capture data and dance videos. Their training technique involved separately training the robot’s upper and lower body: the upper body learns to replicate various reference motions, such as dancing and high-fiving, while the legs maintain a steady stepping motion to ensure balance and adaptability on different terrains.
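The article does not include the team’s implementation details, but as a rough illustration of this kind of decoupled objective, the sketch below combines an upper-body imitation term with a lower-body locomotion term in a single training reward. The function names, observation fields, and weights here are hypothetical placeholders, not the researchers’ actual formulation.

```python
import numpy as np

def imitation_reward(upper_joint_pos, ref_upper_joint_pos, sigma=0.5):
    """Reward the upper body for tracking a reference motion (e.g. a dance clip).

    Both arguments are arrays of upper-body joint angles in radians.
    """
    error = np.sum((upper_joint_pos - ref_upper_joint_pos) ** 2)
    return np.exp(-error / sigma**2)

def locomotion_reward(base_lin_vel, cmd_lin_vel, feet_air_time, sigma=0.25):
    """Reward the lower body for following a velocity command with a steady gait.

    base_lin_vel / cmd_lin_vel: actual vs. commanded planar velocity (m/s).
    feet_air_time: a simple gait-regularity bonus that encourages periodic stepping.
    """
    vel_error = np.sum((base_lin_vel - cmd_lin_vel) ** 2)
    return np.exp(-vel_error / sigma**2) + 0.1 * feet_air_time

def total_reward(obs, ref, cmd, w_upper=1.0, w_lower=1.0):
    """Combine both objectives so a single policy is trained on imitation and walking at once."""
    r_upper = imitation_reward(obs["upper_joint_pos"], ref["upper_joint_pos"])
    r_lower = locomotion_reward(obs["base_lin_vel"], cmd["lin_vel"], obs["feet_air_time"])
    return w_upper * r_upper + w_lower * r_lower
```

The point of splitting the reward this way is that the arms are free to chase expressive reference motions while the legs are judged only on keeping the robot upright and moving as commanded.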

“The main goal here is to show the robot’s ability to perform various tasks while walking from place to place without falling,” Wang explained.

Despite the separate training of the upper and lower body, the robot operates under a unified policy that governs its entire structure. This coordinated policy ensures that the robot can perform complex upper body gestures while walking steadily on surfaces like gravel, dirt, wood chips, grass, and inclined concrete paths.
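To make the idea of one coordinated controller concrete, here is a minimal PyTorch sketch of a single network that maps whole-body observations to targets for every joint at once. The observation size, joint count, and network shape are illustrative assumptions, not the team’s actual architecture.

```python
import torch
import torch.nn as nn

class WholeBodyPolicy(nn.Module):
    """One MLP that outputs targets for all joints, arms and legs together.

    Assumed observation layout: whole-body proprioception, the current
    upper-body reference pose, and the operator's velocity command.
    """
    def __init__(self, obs_dim=96, num_joints=19, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ELU(),
            nn.Linear(hidden, hidden), nn.ELU(),
            nn.Linear(hidden, num_joints),  # one target angle per joint
        )

    def forward(self, obs):
        return self.net(obs)

# The same forward pass drives gestures and stepping, so the expressive upper
# body and the balancing lower body always act as one coordinated controller.
policy = WholeBodyPolicy()
joint_targets = policy(torch.zeros(1, 96))
```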

The approach was first trained and tested in simulation on a virtual humanoid robot before being transferred to the real machine. The real-world robot demonstrated the ability to execute both learned and new movements in various conditions. Currently, the robot’s movements are directed by a human operator using a game controller, which dictates its speed, direction, and specific motions. However, the team envisions a future version equipped with a camera, enabling the robot to perform tasks and navigate terrains autonomously.
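As a rough sketch of how such a teleoperation interface might translate controller input into commands for the policy, consider the example below. The axis names, scaling factors, and command fields are hypothetical and stand in for whatever interface the team actually uses.

```python
def controller_to_command(stick_left_y, stick_left_x, stick_right_x, button_pressed,
                          max_speed=1.0, max_yaw_rate=1.0):
    """Map gamepad inputs to the command vector consumed by the walking policy.

    stick_left_y / stick_left_x: forward and lateral speed inputs in [-1, 1].
    stick_right_x: turning-rate input in [-1, 1].
    button_pressed: index of the upper-body motion to play (e.g. wave, high-five).
    """
    return {
        "lin_vel_x": max_speed * stick_left_y,     # forward/backward speed (m/s)
        "lin_vel_y": max_speed * stick_left_x,     # sideways speed (m/s)
        "yaw_rate": max_yaw_rate * stick_right_x,  # turning rate (rad/s)
        "motion_id": button_pressed,               # which reference gesture to track
    }

# Example: walk forward at half speed, turn slightly right, and wave (motion 3).
cmd = controller_to_command(0.5, 0.0, 0.2, button_pressed=3)
```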

The team is now focused on refining the robot’s design to tackle more intricate and fine-grained tasks. “By extending the capabilities of the upper body, we can expand the range of motions and gestures the robot can perform,” said Wang.

This groundbreaking work at UC San Diego represents a significant step towards more expressive and adaptable robots, which could revolutionize the way humans interact with machines in everyday and specialized environments.

By Impact Lab