Rollbody: A Telepresence Robot

Roboy Student Team SS2022 @ TU Munich, Devanthro
Team Photo

Rollbody allows a remote operator wearing a VR headset to have an immersive telepresence experience. This robot was built as a university project in the summer semester of 2022.

The Challenge

Our robot's big brother Roboy does not walk on his two legs yet. Our main challenge was therefore to build a mobile telepresence robot that is operated remotely from a virtual reality environment.

To develop an intuitive and immersive telepresence experience, we also researched related topics in computer vision, control, and more.

Final Video

Mechanical & Electrical Engineering

CAD

The body structure of Rollbody is built on top of a Segway Loomo and made of Bosch profiles. Although the robot was self-balancing at the beginning of development, it required support wheels after we added more cameras, a tablet, and the height adjustment mechanism.

Height Control

Rollbody's upper body features a telescopic height adjustment mechanism. It is driven by a stepper motor that is connected to a spindle drive.
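As a back-of-the-envelope illustration, the number of stepper pulses for a given height change follows from the spindle lead and the motor's step resolution. The lead and microstepping values below are placeholder assumptions, not our actual hardware parameters:

```python
def height_to_steps(delta_mm: float,
                    lead_mm: float = 2.0,
                    steps_per_rev: int = 200,
                    microsteps: int = 16) -> int:
    """Convert a desired height change into stepper pulses.

    One spindle revolution advances the carriage by lead_mm,
    so the pulse count is (distance / lead) * pulses-per-revolution.
    """
    revolutions = delta_mm / lead_mm
    return round(revolutions * steps_per_rev * microsteps)

# With these example values, raising the upper body by 100 mm
# takes height_to_steps(100) = 160000 pulses.
```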

Power

The Jetson Nano is powered from the Segway Loomo's hardware extension bay. Because of its high power draw, the height-adjustment motor is powered by a separate external battery.

Electrical

Software Engineering

Locomotion

We run a Java app on the Segway Loomo's Android OS that receives motor commands via ROS 2 and controls the robot's movement. Getting ROS 2 to work in Java was challenging, since Java is not natively supported.
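Stripped of the ROS 2 plumbing, the bridge boils down to receiving a Twist-style velocity command and forwarding it to the Loomo base with safety limits applied. A minimal sketch of that logic, in Python for brevity; the velocity limits are illustrative assumptions, not Loomo specifications:

```python
from dataclasses import dataclass


@dataclass
class VelocityCommand:
    linear: float   # m/s, forward positive
    angular: float  # rad/s, counter-clockwise positive


# Assumed safety limits for illustration only.
MAX_LINEAR = 1.0
MAX_ANGULAR = 1.5


def clamp(value: float, limit: float) -> float:
    """Restrict value to the symmetric range [-limit, limit]."""
    return max(-limit, min(limit, value))


def to_base_command(cmd: VelocityCommand) -> VelocityCommand:
    """Clamp an incoming Twist-style command before it is
    forwarded to the mobile base."""
    return VelocityCommand(clamp(cmd.linear, MAX_LINEAR),
                           clamp(cmd.angular, MAX_ANGULAR))
```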

Head Animation

Visual feedback from the remote operator enhances the quality of a conversation. For this reason, a Unity app animates the operator's avatar according to their speech and head movement. Because this app runs in C# on Android, we cannot use ROS 2 and instead rely on a TCP connection to drive the animation.
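Any simple framing works for such a link. One common choice, sketched here as an assumption rather than our exact wire format, is length-prefixed JSON carrying the head pose and a mouth-openness value:

```python
import json
import socket
import struct


def send_pose(sock: socket.socket, yaw: float, pitch: float,
              mouth: float) -> None:
    """Send one animation frame as a 4-byte length prefix + JSON body."""
    payload = json.dumps({"yaw": yaw, "pitch": pitch,
                          "mouth": mouth}).encode()
    sock.sendall(struct.pack(">I", len(payload)) + payload)


def recv_pose(sock: socket.socket) -> dict:
    """Read exactly one length-prefixed JSON message."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return json.loads(_recv_exact(sock, length))


def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Block until exactly n bytes have been received."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        buf += chunk
    return buf
```

The explicit length prefix matters because TCP is a byte stream: a single `recv` may return a partial message or several messages glued together.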

VR Unity App

The robot operator wears an Oculus Quest VR headset on which we run a Unity app. This app connects to the robot's vision, hearing, voice and locomotion over the internet.

180° Stereo Vision

Captured frames from the 180° fisheye cameras are projected onto half-spheres, and the operator's field of view is rendered from inside them. In practice, low bandwidth limits the robot to monocular vision, even though we experimented with image compression.
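Texturing a half-sphere this way amounts to mapping each viewing direction back to a pixel of the fisheye frame. Assuming an equidistant fisheye model (r = f·θ), which is a common approximation for such lenses rather than our calibrated camera model, the mapping looks like:

```python
import math


def direction_to_fisheye_pixel(x: float, y: float, z: float,
                               width: int, height: int):
    """Map a unit view direction (camera looks along +z) to pixel
    coordinates in a 180-degree equidistant fisheye image.

    Equidistant model: the radial distance from the image centre is
    proportional to the angle theta from the optical axis, r = f * theta,
    with f chosen so that theta = pi/2 lands on the image rim.
    """
    theta = math.acos(z)              # angle from the optical axis
    phi = math.atan2(y, x)            # azimuth in the image plane
    f = min(width, height) / math.pi  # 180 deg fills the shorter side
    r = f * theta
    u = width / 2 + r * math.cos(phi)
    v = height / 2 + r * math.sin(phi)
    return u, v
```

Looking straight ahead hits the image centre, while directions 90° off-axis land on the circular rim of the fisheye frame.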

Software Architecture

We run a Unity client app on the operator's VR headset (left side) which connects to the robot (right side) over the Animus cloud. On the robot side, animus_server is a ROS 2 node that connects to the client's vision, audio, and motor-command streams; it gathers information from the sensor nodes and distributes commands to the actuator nodes.
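Conceptually, the gather/distribute role of animus_server is that of a hub routing named streams between publishers and subscribers. A toy sketch of that pattern (class name and API are illustrative only, not the actual Animus interface):

```python
from collections import defaultdict
from typing import Any, Callable


class StreamHub:
    """Minimal stand-in for a gather/distribute node: sensor nodes
    publish to named streams, actuator nodes subscribe to them."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, stream: str, callback: Callable[[Any], None]) -> None:
        """Register a callback for every message on the given stream."""
        self._subscribers[stream].append(callback)

    def publish(self, stream: str, message: Any) -> None:
        """Deliver a message to all subscribers of the stream."""
        for callback in self._subscribers[stream]:
            callback(message)
```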


Acknowledgements

There is a lot of excellent work that our project relies on.

The cloud-streaming stack, the Unity app running on the VR headset, and the head animation of our robot are based on software used in Roboy. We also repurpose hardware components built by Roboy Student Teams in previous semesters.