The remote-controlled robot iCub3 can represent a human surprisingly well, even over long distances. In experiments, a researcher was able to see and hear through the two-legged machine, and haptic impressions from the robot's hands were transmitted to him as well. At the same time, the humanoid robot reproduced the movements of the researcher 300 kilometers away, imitated his facial expressions and relayed his voice.
The research group led by Stefano Dafarra of the Istituto Italiano di Tecnologia in Genoa, Italy, entered iCub3 in the international robot avatar competition ANA Avatar XPrize. The scientists have now described the technology and the tasks it accomplished in the journal “Science Robotics”.
Information exchange runs in both directions
“We present a versatile avatar system that allows people to be embodied by humanoid robots,” the researchers write. The exchange of information runs in both directions: the operator makes the avatar walk and move its arms and hands. Through the avatar, he communicates with people using his voice and facial expressions – particularly the eyebrows, mouth and eyes; iCub3 can even blink when exposed to bright light.
Conversely, the operator receives sensory impressions from the robot. He wears an augmented virtual reality headset that shows him the avatar's camera images and plays back the audio from its built-in microphone – all in real time. The headset also records the operator's facial expressions and transmits them to the avatar. A full-body suit and special gloves let the operator feel what the robot touches, and force-torque sensors register the weight when something is placed in the robot's hands.
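The description above amounts to a bidirectional data flow: commands, expressions and voice travel from operator to avatar, while video, audio, haptic cues and force-torque readings travel back. The following is a purely illustrative sketch of that exchange – the message types, field names and print-based transport are assumptions for illustration, not the team's actual teleoperation software.

```python
from dataclasses import dataclass, field

# Hypothetical message types; the field names are assumptions,
# not the real iCub3 teleoperation interface.
@dataclass
class OperatorCommand:
    walking_velocity: tuple    # (forward m/s, lateral m/s) from the walking platform
    joint_targets: dict        # arm/hand joint angles from the suit and gloves
    facial_expression: dict    # eyebrow/mouth/eyelid activations from the headset
    voice_chunk: bytes = b""   # operator audio to be replayed by the avatar

@dataclass
class AvatarFeedback:
    camera_frame: bytes = b""          # image shown in the operator's headset
    microphone_chunk: bytes = b""      # ambient sound picked up by the robot
    fingertip_forces: dict = field(default_factory=dict)  # haptic cues for the gloves
    hand_wrench: tuple = (0.0,) * 6    # force-torque reading, e.g. to convey weight

def teleoperation_step(command: OperatorCommand, feedback: AvatarFeedback) -> None:
    """One cycle of the two-way exchange: forward the command, render the feedback."""
    print("to avatar:", command.walking_velocity, list(command.facial_expression))
    print("to operator:", len(feedback.camera_frame), "bytes of video,",
          feedback.hand_wrench[2], "N measured at the hand")

# Minimal usage example with made-up values.
teleoperation_step(
    OperatorCommand((0.3, 0.0), {"r_elbow": 0.8}, {"eyebrows": 0.5, "smile": 0.2}),
    AvatarFeedback(camera_frame=b"\x00" * 1024, hand_wrench=(0, 0, 4.9, 0, 0, 0)),
)
```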
Such sensors are also built into the shoes that the operator wears on the walking platform. They measure the force and torque that the operator exchanges with the ground and thus convey the walking movement to the avatar. Because balance is always a challenge for two-legged robots, iCub3 also has an automatic balancing system that kicks in, for example, when the robot is carrying a load.
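How shoe force-torque readings might turn into stepping commands can be illustrated with a simple rule of thumb: when the vertical ground-reaction force measured under one foot drops well below the operator's weight, that foot is considered lifted and a step can be relayed to the robot. This is a minimal sketch under that assumption; the threshold value and function names are invented here, and the controller described in the paper (which also handles the automatic balancing) is considerably more sophisticated.

```python
def foot_lifted(vertical_force_n: float, body_weight_n: float,
                lift_ratio: float = 0.2) -> bool:
    """Assumed rule of thumb: the foot counts as lifted once the shoe's
    vertical ground-reaction force falls well below the operator's weight."""
    return vertical_force_n < lift_ratio * body_weight_n

def stepping_events(force_stream, body_weight_n: float):
    """Yield 'lift'/'plant' transitions that could be relayed as step commands."""
    lifted = False
    for force in force_stream:
        now_lifted = foot_lifted(force, body_weight_n)
        if now_lifted and not lifted:
            yield "lift"
        elif lifted and not now_lifted:
            yield "plant"
        lifted = now_lifted

# Toy readings (N) from one instrumented shoe of a roughly 700 N operator.
readings = [690, 640, 310, 90, 40, 120, 520, 700]
print(list(stepping_events(readings, 700.0)))  # ['lift', 'plant']
```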
Performing in front of large audiences
While iCub3 moved through the Venice Biennale art exhibition in November 2021, it was controlled from Genoa, 290 kilometers away. The operator took in the exhibition and chatted with a companion. At the end, the companion hugged the robot, and the operator, through his avatar, returned the hug. In June 2022, iCub3 performed in front of 2,000 spectators at a show at the We Make Future festival in Rimini, Italy. Again it was remotely controlled from Genoa, this time over a distance of 300 kilometers.
At the ANA Avatar XPrize competition that same year, iCub3 took second place in the semifinals. Among other things, it completed tasks such as placing puzzle pieces, feeling the surface of a vase and estimating its weight, and toasting with a cup. At the final in November 2022 in Los Angeles, iCub3 bumped into a doorpost, fell and was eliminated early. Still, Dafarra and colleagues are proud of their achievement: “Our avatar system was the only one that ultimately completed tasks using bipedal locomotion, with a lightweight set of operating devices consisting of commercial and custom wearables,” they write in the study.