CVR-LSE: Compact Vectorized Representation of Local Static Environments for Reliable Obstacle Detection
Motivated by the need for robust and compact local environment perception, this repository contains the code for CVR-LSE, a novel compact vectorized representation of local static environments for unmanned ground vehicles.
[2023-9] CVR-LSE has been accepted by IEEE Transactions on Industrial Electronics (TIE)!
To run this project in minutes, check Dependency and Quick Start. Please refer to README.md in each folder to learn more about the contents.
Please cite the following paper if you use this project in your research:
H. Gao, Q. Qiu, W. Hua, X. Zhang, Z. Su, and S. Zhang, "CVR-LSE: Compact Vectorized Representation of Local Static Environments for Reliable Obstacle Detection," IEEE Transactions on Industrial Electronics, 2023, DOI: 10.1109/TIE.2023.3322016.
- ROS (tested with Melodic)
- gtsam (Georgia Tech Smoothing and Mapping library, 4.0.0-alpha2)
wget -O ~/Downloads/gtsam.zip https://github.com/borglab/gtsam/archive/4.0.0-alpha2.zip
cd ~/Downloads/ && unzip gtsam.zip -d ~/Downloads/
cd ~/Downloads/gtsam-4.0.0-alpha2/
mkdir build && cd build
cmake ..
sudo make install
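To confirm that gtsam was installed correctly, a quick check (assuming the default /usr/local prefix used by sudo make install) is to look for the installed headers and library and refresh the linker cache:

$ ls /usr/local/include/gtsam | head
$ ls /usr/local/lib | grep -i gtsam
$ sudo ldconfig   # refresh the shared-library cache so the new library is found at runtime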
The project has been tested on Ubuntu 18.04 (ROS Melodic), and it should also work with other ROS versions (such as Kinetic on Ubuntu 16.04 and Noetic on Ubuntu 20.04). In the following, we take ROS Melodic as the example.
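Before building, it can help to confirm which ROS distribution the current shell has sourced (this simply prints the distribution name, e.g. melodic):

$ rosversion -d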
First, we modified the well-known LIO-SAM to obtain a new project called lidar_odo, which is used to obtain the real-time vehicle pose. Create and initialize a ROS workspace, then run the following commands to clone this repo and build it:
$ mkdir -p ~/catkin_ws/src
$ cd ~/catkin_ws/src
$ catkin_init_workspace
$ git clone https://github.com/ghm0819/lidar_odo
$ cd ..
$ catkin_make
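As a quick sanity check (assuming the package in that repository is indeed named lidar_odo), source the workspace and make sure ROS can locate the newly built package:

$ source devel/setup.bash
$ rospack find lidar_odo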
After the above preparation, run the following commands in the same workspace:
$ cd ~/catkin_ws/src
$ git clone https://github.com/ghm0819/cvr_lse
$ cd ..
$ catkin_make
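If catkin_make stops because of missing system dependencies, rosdep can usually resolve them from the package manifests (a hedged sketch that assumes the cloned packages declare their dependencies in package.xml):

$ cd ~/catkin_ws
$ rosdep install --from-paths src --ignore-src -r -y
$ catkin_make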
Finally, source the workspace and launch CVR-LSE:
$ source devel/setup.bash
$ roslaunch cvr_lse cvr_lse.launch
A dataset with LiDAR and IMU information is needed, and the calibration parameters between the LiDAR sensor and the vehicle should be prepared in advance; set the corresponding parameter values in the config file. Then change the corresponding topic names in the launch file and config file, open a new terminal, and play the bag:
$ rosbag play --clock your_bag_name
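Two practical notes (hedged suggestions rather than required steps): rosbag info shows which LiDAR and IMU topics the bag actually contains, which helps when editing the topic names above, and since the bag is replayed with --clock, the nodes should run on simulated time, so set use_sim_time before launching unless the launch file already does so:

$ rosbag info your_bag_name
$ rosparam set use_sim_time true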
We make use of linefit_ground_segmentation for ground segmentation, the grid_map library for maintaining the local grid map, and LIO-SAM for the real-time vehicle pose.