Deep Reinforcement Learning applied to hybrid vehicles, using Gazebo with RotorS and the UUV Simulator as a simulated training environment.
We propose the use of Deep-RL to perform autonomous mapless navigation for hybrid unmanned aerial underwater vehicles (HUAUVs). We developed two approaches based on state-of-the-art Deep-RL algorithms: (i) Deep Deterministic Policy Gradient (DDPG) and (ii) Soft Actor-Critic (SAC). Our system uses the vehicle's relative localization data and simple sparse range data to train the intelligent agents. We compared our approaches against a traditional geometric tracking controller for mapless navigation.
All of the requirements are shown in the badges above. If you want to install all of them, enter the repository and execute the following line:
pip3 install -r requirements.txt
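If you prefer to keep these Python dependencies isolated from your system packages, one option (an optional sketch, not required by the repository) is to install them inside a virtual environment:

```shell
# Optional: create an isolated Python environment before installing
# (assumes python3 with the built-in venv module is available;
# ~/hydrone_env is an arbitrary location chosen for illustration)
python3 -m venv ~/hydrone_env
source ~/hydrone_env/bin/activate
pip3 install -r requirements.txt
```

Note that ROS tools generally expect the system Python, so you may want to deactivate the environment before running catkin_make.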
Before cloning the repository, you need to configure your catkin workspace. To do so, run the following commands in your terminal:
mkdir -p ~/hydrone/src
cd ~/hydrone/
catkin_make
Now that the workspace is configured, enter the src folder, clone the repository, and finally compile the project. To do so, run the following commands in your terminal:
cd ~/hydrone/src/
git clone https://github.com/ricardoGrando/hydrone_deep_rl_icra --recursive
cd ~/hydrone/
catkin_make
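As a quick sanity check (a simple sketch, assuming the build succeeded), you can verify that catkin_make generated the setup script that the next step relies on:

```shell
# A successful build creates devel/setup.bash in the workspace root
test -f ~/hydrone/devel/setup.bash && echo "workspace built OK"
```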
We now need to configure your terminal to accept the commands directed to the hydrone workspace. You can either copy the line of code below into your .bashrc (or .zshrc if you use zsh instead of bash), or run it directly in your terminal. Note that with the second option you will have to run the command again every time you open a new terminal.
For bash:
source ~/hydrone/devel/setup.bash
For zsh:
source ~/hydrone/devel/setup.zsh
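If you choose the .bashrc route, a small guarded append (a convenience sketch, not part of the repository) avoids duplicating the line each time you re-run it:

```shell
# Append the source line to ~/.bashrc only if it is not already present
line='source ~/hydrone/devel/setup.bash'
grep -qxF "$line" ~/.bashrc || echo "$line" >> ~/.bashrc
```

Running this more than once is harmless: grep -qxF matches the exact line, so the echo only fires when the line is missing.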
Okay, now your Hydrone is ready to run!
To do this, just execute the following command:
roslaunch hydrone_deep_rl_icra hydrone.launch
The way Hydrone works may seem a bit complex, and some parts indeed are. We recommend visualizing for yourself how Hydrone's nodes work (with Hydrone already running) by using the following command:
rosrun rqt_graph rqt_graph
We have posted the official simulation video on YouTube; to watch it, just click on the following hyperlink: Video
If you liked this repository, please don't forget to star it!