
Interacting with the Robot via HoloLens 2

This repo contains the source code for the Interacting with the Robot via HoloLens 2 project. We implemented several intuitive and immersive methods for robot control, including a joystick, hand gestures, and Azure Spatial Anchors.

Setup

Environment preparation

We build our app with Unity 2020.3.40f1 and Visual Studio 2022.

Clone source code

git clone https://github.com/MixedRealityETHZ/Interact-with-robot-via-Hololens2.git

Import toolkits

Open the Mixed Reality Feature Tool and select the folder that you just cloned. Choose "Restore Features".

Import project

Open Unity Hub, select the drop-down next to "Open", and choose "Add project from disk".

After opening the project in Unity, drag the three scenes from the Scenes folder into the hierarchy and delete the original empty scene.

Select "File" -> "Build Settings", choose "Universal Windows Platform", click "Switch Platform", then "Build"; this generates a .sln file.

Open the .sln file with Visual Studio 2022, set the build configuration to "Release" and the target to "Device". Connect the HoloLens to the computer with a cable and wait for the build to deploy.

Detailed functions

Joystick

The joystick is a cube, and the Spot robot moves when the user interacts with it. Rotating the cube sends angular velocity commands to Spot, making it rotate; moving the cube up and down sends linear velocity commands, moving the robot forward and backward respectively. Watch the video
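A rough sketch of this pose-to-velocity mapping is below. The gains and clamping limits are illustrative assumptions, not the values used in the app, and the real implementation lives in the Unity scene rather than a standalone function.

```python
# Sketch: map the joystick cube's pose offset to Spot velocity commands.
# Gains and limits here are illustrative, not the app's actual values.

def cube_to_velocity(dz, yaw, linear_gain=2.0, angular_gain=1.5,
                     max_linear=1.0, max_angular=1.0):
    """dz: vertical displacement of the cube (m); yaw: cube rotation (rad).

    Returns (linear, angular) velocity for Spot, clamped to safe limits.
    """
    def clamp(v, lo, hi):
        return max(lo, min(hi, v))

    linear = clamp(linear_gain * dz, -max_linear, max_linear)       # up/down -> forward/back
    angular = clamp(angular_gain * yaw, -max_angular, max_angular)  # twist -> rotate
    return linear, angular
```

On the robot side, these two numbers would populate the linear and angular fields of a velocity command sent to the Spot computer.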

Hand-gestures

This is a system for detecting and recognizing hand gestures to control a Spot robot. We use the MRTK (Mixed Reality Toolkit) to detect hands and obtain hand joint data, which we then pass to a hand gesture recognition model. We convert the model output into velocity commands, which we send to the Spot computer. Watch the video
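The last stage of this pipeline (model output to velocity command) can be sketched as a simple lookup. The gesture labels and speed values below are illustrative assumptions; the real labels come from the trained recognition model in MR_model_training.

```python
# Sketch of the gesture -> velocity stage. Labels and speeds are
# illustrative; the actual labels are produced by our trained model.

GESTURE_TO_VELOCITY = {
    "forward":    (0.5, 0.0),   # (linear m/s, angular rad/s)
    "backward":   (-0.5, 0.0),
    "turn_left":  (0.0, 0.5),
    "turn_right": (0.0, -0.5),
    "stop":       (0.0, 0.0),
}

def gesture_to_command(label):
    """Map a recognized gesture label to a (linear, angular) command.
    Unknown or low-confidence labels fall back to a safe stop."""
    return GESTURE_TO_VELOCITY.get(label, (0.0, 0.0))
```

Falling back to a stop command for unrecognized labels is a deliberately conservative choice for a physical robot.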

Hand gesture recognition training code and README are in the MR_model_training folder.

Azure Spatial Anchors

Azure Spatial Anchors is an online service for mapping 3D spaces; we use it to co-localize Spot with the HoloLens so that each knows its position relative to the other. When a user places a spatial anchor (represented by a sphere) anywhere in the room, the robot moves to that location, oriented so that it always faces the user. Watch the video
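The goal pose can be sketched as: stand at the anchor, heading toward the user. The 2D simplification and argument names below are assumptions for illustration; the real computation happens in the shared frame established by the anchor.

```python
import math

# Sketch: given the anchor (sphere) position and the user's position in a
# shared frame, compute a goal pose for Spot that faces the user.
# The 2D (x, y) simplification is an assumption for illustration.

def goal_pose(anchor_xy, user_xy):
    """Return (x, y, yaw): stand at the anchor, facing the user."""
    ax, ay = anchor_xy
    ux, uy = user_xy
    yaw = math.atan2(uy - ay, ux - ax)  # heading from anchor toward user
    return ax, ay, yaw
```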

Using Spot

Run the Spot driver: roslaunch spot_driver driver.launch

Run the TCP endpoint: roslaunch ros_tcp_endpoint endpoint.launch

Run ASA: roslaunch asa_ros asa_ros.launch

Run go2anchor.py: python3 src/go2anchor/go2anchor.py

Create a dummy anchor in Spot's odom frame: rosservice call /asa_ros/create_anchor '{anchor_in_target_frame: {header: {frame_id: odom}}}'

Republish the hand-camera image as compressed for the joystick scene: rosrun image_transport republish raw in:=/spot/camera/hand_color/image compressed out:=/spot/camera/hand_color

For higher-quality video, capture from the HoloLens Device Portal.
