Home
This project contains instructions, code and other artifacts to rebuild the Nvidia Redtail project with a drone running the Arducopter flight stack and the latest Nvidia JetPack 4.2.x on a TX2. I am using a stereo camera as the default since it provides more accurate depth estimation than the mono camera used in the original Redtail project.
This wiki does not cover the build instructions and setup for the drone itself, since this is covered in detail in the original Redtail wiki. The focus is on the changes made to run this project on the Arducopter flight stack and on the JetPack 4.2.x release.
In order to rebuild the project, the following components are needed:
- A Jetson TX2 and a small carrier board, such as the Auvidea J120, to mount the TX2 on a drone
- A drone with enough thrust and payload capacity to carry all the needed hardware components, e.g., a Tarot 650 frame
- A Pixhawk Cube flight controller with Arducopter installed
- Required sensors: an optical flow sensor for GPS-denied navigation and a lidar ground-distance sensor. I am using the original PX4FLOW along with a Garmin LIDAR-Lite.
- A ZED Stereo camera
- A joystick, e.g., a Logitech F710 gamepad
- A laptop running Ubuntu 16.04 or 18.04; this is a convenient way to run GCS software such as QGroundControl and to control the drone with a joystick.
Note: I have attached a 3.6 mm lens to my PX4FLOW sensor as described in the original Nvidia Redtail wiki, since it gives very good results when flying in FLOWHOLD mode. Unlike the setup described in the Nvidia Redtail wiki, you need to flash this firmware on the PX4FLOW sensor to run it with Arducopter. A short parameter-check sketch follows below.
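If you want to verify the optical-flow and rangefinder configuration from the companion computer, the sketch below uses pymavlink to read a few parameters over the telemetry link. This is a minimal sketch, assuming a serial connection on /dev/ttyTHS2 at 921600 baud and recent Copter parameter names (FLOW_TYPE, RNGFND1_TYPE, EK2_GPS_TYPE); adjust the connection string and parameter names to your wiring and firmware version, and take the correct values from the Arducopter wiki.

```python
# check_sensors.py -- hedged sketch: read optical-flow and rangefinder
# parameters from the flight controller over MAVLink (pymavlink).
# The port, baud rate and parameter names are assumptions for illustration.
from pymavlink import mavutil

# Serial or UDP link to the Pixhawk; '/dev/ttyTHS2' is just an example port.
master = mavutil.mavlink_connection('/dev/ttyTHS2', baud=921600)
master.wait_heartbeat()
print("Heartbeat from system %u" % master.target_system)

# Parameter names differ slightly between Copter releases
# (e.g. RNGFND_TYPE vs. RNGFND1_TYPE).
for name in (b'FLOW_TYPE', b'RNGFND1_TYPE', b'EK2_GPS_TYPE'):
    master.mav.param_request_read_send(
        master.target_system, master.target_component, name, -1)
    msg = master.recv_match(type='PARAM_VALUE', blocking=True, timeout=5)
    if msg is None:
        print("%s: no answer (check the parameter name)" % name.decode())
    else:
        print("%s = %s" % (msg.param_id, msg.param_value))
```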
In this chapter, we describe the necessary steps to set up the software stack on the TX2, the ZED camera, and the host PC. For the drone hardware configuration, please refer to the original Nvidia Redtail project wiki, which explains the wiring and configuration of the sensors in detail. Before you begin with the TX2 configuration and setup, you should have your drone built and thoroughly flight-tested. A minimal ZED camera check is sketched below.
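As a quick sanity check that the ZED SDK and the camera work on the TX2, the following minimal Python sketch opens the camera, grabs a single frame, and prints the depth at the image center. It assumes the pyzed wrapper is installed; the enum names follow the 3.x SDK Python API and differ slightly in the older 2.x SDKs.

```python
# zed_check.py -- minimal sanity check for the ZED camera (assumes pyzed and
# ZED SDK 3.x enum names; 2.x uses slightly different enum identifiers).
import pyzed.sl as sl

init = sl.InitParameters()
init.depth_mode = sl.DEPTH_MODE.PERFORMANCE
init.coordinate_units = sl.UNIT.METER

zed = sl.Camera()
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise SystemExit("Could not open the ZED camera -- check USB connection and SDK install")

runtime = sl.RuntimeParameters()
depth = sl.Mat()
if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
    zed.retrieve_measure(depth, sl.MEASURE.DEPTH)
    # Depth (in meters) at the image center.
    err, center_depth = depth.get_value(depth.get_width() // 2, depth.get_height() // 2)
    print("Depth at image center: %.2f m" % center_depth)

zed.close()
```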
The project's AI that enables autonomous navigation is based on a deep neural network (DNN) which can be trained from scratch using publicly available data. A few pre-trained DNNs are also available as part of this project, including TrailNet, YOLO, and several TensorRT stereo DNNs for use with the ZED camera (NVTiny, NVSmall, ResNet18, and ResNet18_2D).
Please see the original Nvidia Redtail repo for a more detailed description of the different models and instructions on how to train new models.
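The stereo DNNs run through TensorRT. If you build or export a serialized TensorRT engine for one of them, a minimal Python sketch for loading and inspecting it could look like the following; the engine file name is hypothetical, and the API calls match the TensorRT 5.x/6.x Python bindings shipped with JetPack 4.2.x.

```python
# inspect_engine.py -- hedged sketch: load a serialized TensorRT engine and
# list its input/output bindings. The engine file name below is hypothetical.
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def load_engine(path):
    # Deserialize an engine that was previously built and saved to disk.
    with open(path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
        return runtime.deserialize_cuda_engine(f.read())

engine = load_engine("stereo_dnn_nvsmall.engine")  # hypothetical file name
for i in range(engine.num_bindings):
    kind = "input" if engine.binding_is_input(i) else "output"
    print(kind, engine.get_binding_name(i), engine.get_binding_shape(i))
```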
Thorough testing of the end-to-end setup is mandatory before any real flight attempts. Follow these steps to test all features. Before you start testing the Redtail code, please verify that your drone flies properly without the TX2 connected. Please refer to the Arducopter wiki on how to tune the copter and perform its maiden flight. A minimal MAVLink link check that can be run from the TX2 is sketched below.
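Before the first flight with the companion computer attached, it is also worth confirming on the bench that the TX2 actually talks to the flight controller. The sketch below is a minimal link check using pymavlink, assuming the same serial connection as in the parameter-check sketch above; it waits for heartbeats and prints the current flight mode and arming state.

```python
# link_check.py -- hedged sketch: confirm the TX2 <-> flight controller
# MAVLink link before flying. Port and baud rate are assumptions.
from pymavlink import mavutil

master = mavutil.mavlink_connection('/dev/ttyTHS2', baud=921600)
master.wait_heartbeat(timeout=10)
print("Connected to system %u" % master.target_system)

for _ in range(5):
    hb = master.recv_match(type='HEARTBEAT', blocking=True, timeout=5)
    if hb is None:
        print("No heartbeat -- check the wiring and the SERIALx_PROTOCOL/BAUD parameters")
        break
    armed = bool(hb.base_mode & mavutil.mavlink.MAV_MODE_FLAG_SAFETY_ARMED)
    print("mode: %s  armed: %s" % (master.flightmode, armed))
```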
Once the hardware and software setup steps are complete, it's time to take off! Follow these steps to fly the drone.