Bearnav is a simple teach-and-repeat visual navigation system robust to appearance changes induced by varying illumination and naturally occurring environment changes. Its core method is computationally efficient, does not require camera calibration, and can learn and autonomously traverse arbitrarily-shaped paths. During the teaching phase, when the robot is driven by a human operator, it stores its velocities and the image features visible from its on-board camera. During autonomous navigation, the method does not perform explicit robot localisation in 2D/3D space; it simply replays the velocities learned during the teaching phase while correcting its heading relative to the path based on its camera data. The experiments performed indicate that the proposed navigation system corrects the robot's position errors as it moves along the path. Therefore, the robot can repeatedly drive along the desired path previously taught by the human operator. Early versions of the system proved their ability to reliably traverse polygonal trajectories indoors and outdoors under adverse illumination conditions [1,2], in environments undergoing drastic appearance changes [2,3], and on flying robots [4]. The version presented here is described in [5,6]; it can learn arbitrary smooth paths, is fully integrated in the Robot Operating System (ROS), and is available online in this repository.
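The heading-correction principle can be sketched as follows. This is an illustrative simplification, not the actual Bearnav code: each matched feature votes with its horizontal displacement between the mapped view and the current view, the modal displacement approximates the heading offset, and a proportional steering correction is added to the replayed angular velocity. The function name, bin width and gain below are hypothetical.

```python
# Sketch of the heading-correction principle behind teach-and-repeat
# navigation (a simplified illustration, not the stroll_bearnav code).
from collections import Counter

def heading_correction(matches, bin_width=10, gain=0.01):
    """Estimate a steering correction from feature matches.

    matches: list of (map_x, current_x) horizontal pixel coordinates
    Returns a turning-rate offset proportional to the modal displacement.
    """
    # Quantise the horizontal displacement of each match into bins
    bins = Counter()
    for map_x, cur_x in matches:
        bins[round((cur_x - map_x) / bin_width)] += 1
    if not bins:
        return 0.0
    # The most-voted bin approximates the true heading offset;
    # outlier matches lose the vote instead of skewing an average
    modal_bin, _ = bins.most_common(1)[0]
    return -gain * modal_bin * bin_width

# Example: most features shifted ~20 px to the right -> steer left
matches = [(100, 120), (200, 221), (300, 318), (400, 50)]  # one outlier
print(heading_correction(matches))  # negative correction (turn left)
```

Histogram voting rather than averaging is what makes the correction tolerant to outlier matches caused by appearance changes.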
- Note: unsure whether OpenCV needs to be installed as described below on Ubuntu 18 (this was tested with Ubuntu 16).
- Teach
  - Start the core mapping node: `roslaunch stroll_bearnav mapping-core-miro.launch`
  - Start the mapping GUI node: `roslaunch stroll_bearnav mapping-gui-miro.launch`
  - Enter a map name between the quotation marks after `prefix`, e.g. `prefix: 'map-test'`
  - Click `SEND GOAL` (the button should read "start mapping").
  - Drive the robot around on the teach run, then click `CANCEL GOAL` (the button should read "finish mapping").
- Repeat
  - Start the core navigation node: `roslaunch stroll_bearnav navigation-core-miro.launch`
  - Start the navigation GUI node: `roslaunch stroll_bearnav navigation-gui-miro.launch`
  - Load the map: in the `stroll_bearnav/loadMap GUI Client` window, enter the map name after `prefix`, e.g. `prefix: 'map-test'`
  - Click `SEND GOAL` in that window (the button should read "load map").
  - In the `stroll_bearnav/navigator GUI Client` window, click `SEND GOAL` (the button should read "start repeat"). The repeat run will start.
  - It is best to reload the map before repeating the run again; otherwise the system tends to get confused when repeating control commands.
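The repeat run can be sketched as a simple control loop. This is a hypothetical illustration, not the actual stroll_bearnav implementation: the velocities recorded during teaching are replayed step by step, with the visually computed heading correction added to the angular component.

```python
# Hypothetical sketch of the repeat-phase control loop (illustrative
# names and structure, not the stroll_bearnav data format).

def repeat(recorded, correction_for_step):
    """Replay a taught path with visual heading corrections.

    recorded: list of (forward_velocity, angular_velocity) pairs
    correction_for_step: function step -> angular correction
    Returns the commands that would be sent to the robot.
    """
    commands = []
    for step, (v, omega) in enumerate(recorded):
        # The replayed angular velocity is biased by the visual correction
        commands.append((v, omega + correction_for_step(step)))
    return commands

# Example: straight teach run, constant visual correction to the left
cmds = repeat([(0.5, 0.0)] * 3, lambda s: -0.1)
print(cmds)  # [(0.5, -0.1), (0.5, -0.1), (0.5, -0.1)]
```

Because the forward velocities are replayed verbatim and only the heading is corrected, no explicit 2D/3D localisation is needed during the repeat run.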
- You should install Ubuntu 16 with ROS Kinetic or Ubuntu 18 with ROS Melodic.
- Also install the other prerequisites: `sudo apt install git`
Nothing special needs to be done here. You can continue with installation.
If you are using Ubuntu 18, you will need to compile OpenCV with opencv_contrib:

- Create a folder for the compilation and switch to it: `mkdir ~/opencv; cd ~/opencv`
- Download OpenCV: `git clone -b 3.4 --single-branch https://github.com/opencv/opencv.git`
- Download opencv_contrib: `git clone -b 3.4 --single-branch https://github.com/opencv/opencv_contrib.git`
- Go to the opencv folder, create a build folder and switch to it: `mkdir opencv/build; cd opencv/build`
- Tell OpenCV to compile with the contrib modules (the following is ONE command, do not copy/paste it as separate lines): `cmake -DOPENCV_ENABLE_NONFREE:BOOL=ON -DOPENCV_EXTRA_MODULES_PATH=~/opencv/opencv_contrib/modules ~/opencv/opencv`
- Compile it: `make -j5`
- Install it: `sudo make install`
Prepare your environment in the home folder:

- `cd`
- `mkdir -p ~/robotika_ws/src`
- `cd ~/robotika_ws/src`
- `catkin_init_workspace`
Make your USB camera work:

- Clone the usb_cam ROS driver: `git clone https://github.com/gestom/usb_cam.git`
- Compile it: `cd ..`, then `catkin_make`
- Source your environment: `source devel/setup.bash`
- Make your camera easy to access: `sudo chmod 777 /dev/video0`
- Run the camera node: `roslaunch usb_cam usb_cam-test.launch`
- Keep the camera running and open a new terminal to continue.
Make the `stroll_bearnav` package work:

- Switch to the workspace sources: `cd ~/robotika_ws/src`
- Clone the stroll_bearnav package: `git clone --branch robotika_sk_19 https://github.com/gestom/stroll_bearnav.git`
- Compile it: `cd ..`, then `catkin_make`
- Source your environment: `source devel/setup.bash`
- Run it: `roslaunch stroll_bearnav stroll-core.launch`
- Open a new terminal, source your environment and check the image features: `rosrun rqt_image_view rqt_image_view /image_with_features`
- Open a new terminal, source your environment and check the system structure: `rosrun rqt_graph rqt_graph`
- Run the operator GUIs: `roslaunch stroll_bearnav stroll-gui.launch`
- Find the `mapper` client GUI and create a map by entering its name (e.g. `A`) behind the `fileName`, click `Send goal`, wait for feedback and then click `Cancel goal`.
- Find the `loadMap` GUI, enter the map name in the `prefix` and click `Send goal`.
- Start the navigation by clicking `Send goal` in the `navigator` GUI.
Test how the matches between the image features in the map and those in the current view reflect the pan of the camera.
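As a rough sanity check for this test, assuming an idealised pinhole camera with focal length f (in pixels), panning the camera by an angle theta should shift the matched features horizontally by roughly f * tan(theta). The focal length used below is an assumed value purely for illustration; Bearnav itself requires no camera calibration.

```python
import math

# Assumed focal length in pixels (illustrative only; Bearnav needs no
# calibration, this just builds intuition for the feature-match test)
f = 600.0
theta = math.radians(5)   # a 5-degree camera pan
dx = f * math.tan(theta)  # expected horizontal feature shift
print(round(dx, 1))       # roughly 52.5 px
```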
A detailed system description is provided in [5].
1. T. Krajnik, L. Preucil: A simple visual navigation system with convergence property. In European Robotics Symposium, 2008. [bibtex]
2. T. Krajnik, J. Faigl et al.: Simple yet stable bearing-only navigation. Journal of Field Robotics, 2010. [bibtex]
3. T. Krajnik, S. Pedre, L. Preucil: Monocular navigation for long-term autonomy. In 16th International Conference on Advanced Robotics (ICAR), 2013. [bibtex]
4. T. Krajnik, M. Nitsche et al.: A simple visual navigation system for a UAV. In 9th International Multi-Conference on Systems, Signals and Devices (SSD), 2012. [bibtex]
5. F. Majer, L. Halodova, T. Krajnik: A precise teach and repeat visual navigation system based on the convergence theorem. In Student Conference on Planning in Artificial Intelligence and Robotics (PAIR), 2017 (in review). [bibtex]
6. T. Krajnik, F. Majer, L. Halodova, T. Vintr: Navigation without localisation: reliable teach and repeat based on the convergence theorem. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018. [bibtex]