Repository Summary
| | |
|---|---|
| Description | Semantic LIDAR odometry and mapping for cylindrical objects (e.g. trees in forests) |
| Checkout URI | https://github.com/kumarrobotics/sloam.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2022-10-17 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Tags | mapping, ros, slam, odometry, semantic-slam |
| Contributing | Help Wanted (-), Good First Issues (-), Pull Requests to Review (-) |
Packages
| Name | Version |
|---|---|
| sloam | 0.0.0 |
| sloam_msgs | 0.0.1 |
README
Semantic Lidar Odometry and Mapping in Forests (SLOAM)
(Note: Trees in grey color do not represent all trees in the semantic map. Instead, they represent a submap centered around the robot that is used for data association.)
Set up a workspace
Even though we recommend using Docker to run SLOAM, we decided to keep the ROS workspace on the host machine and create a volume that maps the folder into the container. This way, we can discard the container once the execution stops, but still keep a cached version of the last compilation. To do this, you will need a workspace with the following structure on your host machine:
```
sloam_ws/
-> src/
   -> sloam (folder from this repo)
   -> sloam_msgs (folder from this repo)
   -> models (you have to create this)
```
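One way to create this layout (a minimal sketch; the `$HOME/ros/sloam_ws` location is just an example that matches the run script configuration shown later, and the clone-and-move step assumes `sloam` and `sloam_msgs` sit at the top level of the repository, as the listing above suggests):

```bash
# Create the workspace and the models folder on the host machine
mkdir -p "$HOME/ros/sloam_ws/src/models"
cd "$HOME/ros/sloam_ws/src"
# Clone the repository and keep its sloam and sloam_msgs folders in src/
git clone https://github.com/kumarrobotics/sloam.git sloam_repo
mv sloam_repo/sloam sloam_repo/sloam_msgs .
```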
Segmentation
You will also need a neural network model for tree segmentation. You can find the ones we used for pine trees here (we used the same model in real world and simulated experiments). Download the model and put it in the `models` folder.

We used RangeNet++ for segmentation. The trained model needs to be exported to ONNX format. Depending on your inputs/architecture you may need to change the file `sloam/src/segmentation/inference.cpp`. You may also need to change the `seg_model_path` in the file `sloam/params/sloam.yaml` to point to the trained model.
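After placing the exported ONNX model in the `models` folder, make sure `seg_model_path` points at it (a sketch; the example path in the comment assumes the `/opt/sloam_ws` container mount point described below, and the model file name is hypothetical):

```bash
# Check where seg_model_path currently points
grep -n "seg_model_path" sloam/params/sloam.yaml
# Edit it to point at your model, e.g.
#   seg_model_path: /opt/sloam_ws/src/models/your_model.onnx
```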
Docker Image
To build the Docker image locally, you can use the `docker/build_sloam_image.sh` script. This will create an image named `sloam/runtime`. WARNING: we use a multi-stage build to make sure the runtime image is as small as possible, but rebuilding the image will create an auxiliary image that is 26GB+.
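A typical invocation (a sketch; it assumes the script takes no arguments and is run from the `sloam` package folder):

```bash
# Build the sloam/runtime image from the sloam package folder
cd sloam
./docker/build_sloam_image.sh
```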
Alternatively, you can download the built image from Docker Hub:

```bash
docker pull gnardari/sloam:runtime
```
Running the container
Now that you have configured the workspace, it is time to configure the run script `sloam/docker/run_sloam_container.sh`. You will have to change the variables at the top of the file to point to the workspace you created on your host machine and to a folder with the ROS bag files that the container should access:
```bash
# Example
SLOAMWS="$HOME/ros/sloam_ws"
BAGS_DIR="$HOME/bags"
```
Also check if the image name in the `docker run` command matches the image you built/downloaded.
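You can list the SLOAM images available locally to confirm the name (a quick sanity check, not part of the original scripts):

```bash
# Should show sloam/runtime (local build) or gnardari/sloam (Docker Hub)
docker images | grep -i sloam
```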
Once inside the container, you can use `tmux` to create multiple terminal windows.

Make sure that the Docker volume mapping the host workspace to the container is working by running `cd /opt/sloam_ws/ && ls src`. This directory should not be empty.
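Putting the steps together (a sketch; it assumes the run script drops you into an interactive shell in the container):

```bash
# On the host: start the container
./sloam/docker/run_sloam_container.sh

# Inside the container: verify the workspace volume is mounted
cd /opt/sloam_ws/ && ls src   # should list sloam, sloam_msgs, models
```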
Local Installation
Instead of Docker, you can install all dependencies locally to run SLOAM. Please refer to the local installation README for instructions.
Build workspace
```bash
cd /path/to/sloam_ws/
catkin build -DCMAKE_BUILD_TYPE=Release
```
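If you are building outside the provided container, source your ROS environment first and overlay the workspace afterwards (a sketch; the distribution name `noetic` is an assumption, use whichever ROS 1 distro you have installed):

```bash
# Adjust 'noetic' to your installed ROS 1 distribution
source /opt/ros/noetic/setup.bash
cd /path/to/sloam_ws/
catkin build -DCMAKE_BUILD_TYPE=Release
# Overlay the freshly built workspace
source devel/setup.bash
```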
Starting the SLOAM Node
The launch file `sloam/launch/sloam.launch` contains the SLOAM parameters that you can tune. You can start SLOAM using the `run.launch` file for real world data (you may need to configure some parameters depending on your sensor) or use `run_sim.launch` to run SLOAM with simulated data. You can download an example bag here.
```bash
tmux
source devel/setup.bash
roslaunch sloam run_sim.launch # running sloam with sim data
ctrl+b % # create a new tmux pane
cd ../bags/
rosbag play example.bag # play bag
```
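For real-world data the flow is the same with the other launch file (a sketch; topic names and sensor parameters may still need adjusting in `run.launch`, and the bag name here is hypothetical):

```bash
source devel/setup.bash
roslaunch sloam run.launch # running sloam with real-world data
# in a second tmux pane:
rosbag play your_data.bag
```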
Odometry Backbone
This version of SLOAM requires an odometry backbone to provide an initial guess for pose estimation. The bags we provide already contain odometry messages, but for custom data you will need to run another state estimation algorithm whose output is used as the initial guess. Check out LLOL for a lidar odometry backbone and MSCKF for a stereo VIO option.
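To check whether a bag already contains odometry, inspect its topics (a quick check; the exact odometry topic name depends on your setup):

```bash
# Look for a topic of type nav_msgs/Odometry in the bag
rosbag info example.bag
```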
Parameter Tuning
Most of the SLOAM parameters can be viewed in the `sloam/launch/sloam.launch` file. There are also the `run.launch` and `run_sim.launch` files, where you should define the lidar point cloud and odometry topics; these files can also be used to change other parameters specifically for your scenario.
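To see which topics and parameters the launch files expose, you can list their XML entries (a sketch; the paths assume you are at the workspace root):

```bash
# List the <arg>, <param>, and <remap> entries in the launch files
grep -nE "<(arg|param|remap)" src/sloam/launch/run.launch src/sloam/launch/run_sim.launch
```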
Development
Code Structure
Here is a high-level diagram of the code structure.
- Input Manager will listen for Odometry and Point Cloud data and call SLOAM once the odometry estimates that the robot moved `minOdomDistance` from the previous keyframe.
(README truncated at 100 lines; see the full file in the repository.)