
Setup of the TX2, ZED and Host PC


ZED Stereo Camera Setup

While the original Redtail project uses a monocular camera, this project focuses on stereo depth estimation. Therefore, this wiki describes the use of the Stereolabs ZED camera. You may also use the ZED Mini camera, however all scripts assume the ZED by default. A prerequisite is having the Stereolabs SDK installed on your TX2, so follow these steps to download and install the SDK for the Jetpack 4.2 environment. Plug the camera into the USB 3.0 port, and run the test examples as described on the Stereolabs web page to ensure it is fully operational.
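As a quick sanity check (a sketch assuming the SDK's default install location; the tool names vary slightly between SDK versions), you can launch one of the bundled diagnostic tools and confirm the live stereo feed:

# the ZED SDK ships its diagnostic tools under the default install location
ls /usr/local/zed/tools/
# launch the Explorer tool listed there to verify that the camera streams over USB 3.0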

This project includes the 3D-printed parts needed to build a simple 1-axis gimbal. One axis is sufficient to level the camera around the pitch axis. We do not need roll or yaw stabilization since we won't create cinema-grade footage anyway, and we want to keep the weight as low as possible.

The ZED 1-axis gimbal

For high-performance operation of the DNNs, it is suggested to run the ZED in VGA mode once you have the ZED ROS wrapper installed (see below). To do so, edit the file:

vi $CATKIN_WS/src/zed-ros-wrapper/zed_wrapper/params/common.yaml

and set the resolution parameter to 3 (VGA).
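The relevant entry looks like this (a sketch of the general section of common.yaml; the surrounding keys may differ between wrapper versions):

general:
    resolution: 3    # 0=HD2K, 1=HD1080, 2=HD720, 3=VGA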

Jetson setup

The NVIDIA Jetson platform is used to run most of the components, such as DNN inference, the controller, and video streaming. Install Jetpack 4.2.x using the NVIDIA SDK Manager. If you are using the Auvidea J120 breakout board, you need to apply some kernel patches as described here.

Important: you need to have apsync installed in order to have all drone communication set up correctly. Code and instructions can be found here

The Arducopter wiki describes how to connect the Jetson with the Pixhawk here and how to configure Arducopter to communicate with the TX2.

Do not install the TX2 image file referenced in the Arducopter wiki since it is outdated.
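For reference, the Arducopter side typically boils down to a few serial parameters (a sketch assuming the TX2 is wired to the TELEM2 port; use the values from the linked wiki that match your wiring):

SERIAL2_PROTOCOL  2     # MAVLink2 on TELEM2
SERIAL2_BAUD      921   # 921600 baud
BRD_SER2_RTSCTS   0     # disable flow control if the RTS/CTS lines are not wired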

apsync creates a WiFi access point on the TX2 with both SSID and password set to ardupilot. Instead of using the bulky WiFi antennas supplied with the TX2 development board, you may add a small WiFi antenna to your Jetson board, such as a Taoglas FXP522.A.07.A.001 dual-channel 3dBi antenna: it is very small, lies flat on part of your drone's surface, and is still powerful enough to reach the Jetson module over WiFi from several meters away.

Connect to this WiFi access point from the computer you want to SSH into the Jetson module from. apsync creates an apsync user on the Jetson and sets it as the default login, asking you to set the password for this account during installation. It also assigns the module the static IP address 10.0.1.128. You can either use the IP address to SSH into the module or add the entry 10.0.1.128 apsync to /etc/hosts on your computer and use the hostname for SSH:
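# example from the host PC; the /etc/hosts entry is optional
echo "10.0.1.128 apsync" | sudo tee -a /etc/hosts
ssh apsync@10.0.1.128    # or, with the hosts entry: ssh apsync@apsync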

Finally, set the TX2 to max power mode with sudo nvpmodel -m 0 and then run jetson_clocks:
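sudo nvpmodel -m 0    # select the maximum performance power profile
sudo jetson_clocks    # lock the clocks to their maximum frequencies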

Redtail Installation on TX2

To install Redtail, please follow the steps as described here.

This shell script downloads and installs all the dependencies and builds the code for this project. Make sure that the apsync WiFi access point is set up and that the Jetson module has internet access through a LAN cable connected to the J120 board. The script requires a small amount of interaction, such as accepting licenses and entering the superuser password for installing dependencies such as ROS. It installs all the dependencies, clones a copy of this repository, creates a Catkin workspace in ~/catkin_ws using the Python Catkin tools, and builds all the required code.

The script also installs the ZED ROS wrapper code.

In order to communicate with MAVLink through mavros, make sure you have opened a UDP port in your mavlink-router. Edit ~/start_mavlink-router/mavlink-router.conf and add the following lines at the end:

[UdpEndpoint to_ros]
Mode = Normal
Address = 127.0.0.1
Port = 14855
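mavlink-router reads its configuration only at startup, so restart it (or simply reboot the TX2) for the new UDP endpoint to become active before launching mavros.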

Optional: YOLO object detection (does not work with OpenCV 4.x)

Jetpack 4.3 ships with OpenCV 4.0, which is not supported by ROS or by the YOLO implementation below.

In addition to the trail detection and follow capability of the original Redtail project, you may add a ROS node for YOLO (You Only Look Once) object detection. To do so, install darknet and the corresponding ROS wrapper:

cd $CATKIN_WS/src
git clone --recursive https://github.com/leggedrobotics/darknet_ros.git
cd ..
catkin build darknet_ros -DCMAKE_BUILD_TYPE=Release

Edit the darknet_ros/darknet_ros/config/ros.yaml file and insert the corresponding ZED image topic. You may also disable the image_view output by setting enable_opencv: false. The video stream of the ZED overlaid with the bounding boxes of the identified objects can be visualized by subscribing to /darknet_ros/detection_image.
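As a sketch (key names may differ between darknet_ros versions), the relevant parts of ros.yaml then look like this, with the camera subscriber pointed at the ZED left image topic:

subscribers:
  camera_reading:
    topic: /zed/zed_node/left/image_rect_color
    queue_size: 1

image_view:
  enable_opencv: false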

Stream video from the ROS image nodes

In order to stream video to your GCS, you may install a ROS image node streamer. I am using the ROS to RTSP streamer from here:

cd $CATKIN_WS/src
git clone https://github.com/CircusMonkey/ros_rtsp.git
catkin build ros_rtsp

Note: before running catkin build, replace the image2rtsp.cpp source file with the version included in the redtail/tools/ros2rtsp_patches directory. This patch includes the required changes to the gstreamer pipeline so that the video stream displays correctly on all RTSP players.
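A minimal sketch of that replacement step, assuming the Redtail repository and ros_rtsp both ended up in your Catkin workspace and that ros_rtsp keeps its sources under src/ (adjust the paths to your setup):

cp $CATKIN_WS/src/redtail/tools/ros2rtsp_patches/image2rtsp.cpp $CATKIN_WS/src/ros_rtsp/src/image2rtsp.cpp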

Edit the configuration file located at $CATKIN_WS/src/ros_rtsp/config/stream_setup.yaml and add the following example streams at the end of the file:

  stream-zed_camera:
    type: topic
    source: /zed/zed_node/left/image_rect_color
    mountpoint: /zedimage
    caps: video/x-raw,framerate=10/1,width=640,height=480
    bitrate: 100

  stream-zed_ros_depth:
    type: topic
    source: /zed/zed_node/confidence/confidence_image
    mountpoint: /zedconf
    caps: video/x-raw,framerate=10/1,width=640,height=480
    bitrate: 100

Set the following RTSP pipeline in your ground control station (e.g. QGC) on your PC or tablet computer:

rtsp://10.0.1.128:8554/<your mountpoint>

For the mountpoint, enter the name you defined in the stream_setup.yaml file above.
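For example, with the configuration above, the rectified left ZED image is available at:

rtsp://10.0.1.128:8554/zedimage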

Groundcontrol Station (GCS) Setup

Using a PC/Laptop as GCS

First, install ROS on your PC. For Ubuntu 16.04 (Xenial), install ROS Kinetic Desktop; for instructions see here. For Ubuntu 18.04, install ROS Melodic Desktop; see here. Then install the required joystick driver for your ROS version (kinetic or melodic): http://wiki.ros.org/joy and test that it works as described here. You need to set the following environment variables and add them to your .bashrc:

export ROS_MASTER_URI=http://10.0.1.128:11311  # the TX2 IP address should be 10.0.1.128 if you are using apsync
export ROS_IP=10.0.1.xxx                       # the IP of your host PC; use ifconfig to find it

Start the joystick node on the host PC:

rosrun joy joy_node _dev:=/dev/input/js0 & # check the correct device name of your joystick with ls -l /dev/input

To test whether the joystick is operating correctly with ROS and communicating with the ROS master on the TX2, execute the following commands in a terminal window on the TX2:

roscore &
rostopic echo /joy

Alternatively, you can install the joystick support in a docker container as described in the original Nvidia Redtail project on your host PC.

Note: It seems that Jetpack 4.2.x has an issue with the ROS joystick drivers. I was not able to get it running.

As ground control software, you may use QGroundControl, which runs on Windows, Linux, Android and iOS.

Use an Android device as GCS

Once you have set up and tested everything, you may use an Android device as GCS. The benefit is that you do not need to carry a laptop around. As GCS, you may use QGroundControl or Solex. Both apps can be found on Google Play. While QGroundControl can be downloaded for free, I strongly recommend Solex since it provides all the means to embed the various ROS controls to start/stop nodes, control cameras, and handle joystick input. See the chapter Usage of SolexCC on how Solex can be used in this project.

Joystick control

Control with the ROS Joystick Controller app

In order to use a joystick or gamepad attached to your tablet as a ROS joystick input device, instead of one attached to a laptop or PC, you can use this app: https://github.com/mtbsteve/ROSJoyController. Please follow the instructions there to install the app on your Android device.

On the TX2, you need to install the rosbridge server:

sudo apt-get install ros-melodic-rosbridge-server

To test whether the joystick is operating correctly with ROS and communicating with the ROS master on the TX2, execute the following commands in a terminal window on the TX2:

roslaunch rosbridge_server rosbridge_websocket.launch &
rostopic echo /joy

ROS Joy Controller App

Use Solex-CC as joystick controller

Another way, and actually my preferred method, is to use the Solex-CC built-in functionality for joystick control. For this, you need to install rosnodejs via npm.

Note: There is a known incompatibility between npm and ROS. Once you install ROS, it automatically uninstalls npm, and vice versa, due to a conflict in libssl1.0 which is used by both packages.

Workaround:

sudo apt-get install nodejs-dev node-gyp libssl1.0-dev
sudo apt-get install npm
npm install rosnodejs
# then reinstall ROS (melodic):
sudo apt install ros-melodic-desktop # you need to reinstall all needed ROS packages!
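After the reinstall, a quick sanity check that rosnodejs is usable (run it from the directory where you ran npm install, since npm places the module in the local node_modules folder):

node -e "require('rosnodejs'); console.log('rosnodejs OK')"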