Repository Summary

| | |
|---|---|
| Description | Multi-Modal Sensor Fusion and Object Tracking for Autonomous Racing |
| Checkout URI | https://github.com/tumftm/fusiontracking.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2023-08-16 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Tags | python fusion kalman-filter ros2 ros2-node |
| Contributing | Help Wanted (-), Good First Issues (-), Pull Requests to Review (-) |
Packages

| Name | Version |
|---|---|
| tracking | 0.0.0 |
README
Multi-Modal Sensor Fusion and Object Tracking
The following figure outlines the high-level structure of the algorithm, which covers the tasks of multi-modal sensor fusion and object tracking. The algorithm was developed for the Indy Autonomous Challenge 2021 and the Autonomous Challenge at CES 2022 and is part of the software of TUM Autonomous Motorsport.
The sensor fusion handles multiple object lists that originate from different perception pipelines. The perception pipelines work independently of each other and output individual object lists. This algorithm combines the given information into a unified object list. This late fusion approach makes it possible to incorporate a variable number of perception pipelines without introducing dependencies between them. A minimal sketch of such a fusion step is shown below.
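For illustration only, the following sketch merges object lists from independent pipelines into one unified list via greedy nearest-neighbor association. The object representation, the association scheme, and the distance threshold are assumptions for this example, not the repository's implementation:

```python
# Illustrative late-fusion step (not the repository's implementation):
# merge object lists from independent perception pipelines into one
# unified list by greedy nearest-neighbor association on position.
import math


def fuse_object_lists(object_lists, match_dist_m=2.0):
    """Each object is a dict with at least 'x' and 'y' in a common frame."""
    fused = []
    for obj_list in object_lists:
        for obj in obj_list:
            match, best = None, match_dist_m
            for cand in fused:
                d = math.hypot(obj["x"] - cand["x"], obj["y"] - cand["y"])
                if d < best:
                    best, match = d, cand
            if match is None:
                fused.append(dict(obj))  # new object, no counterpart yet
            else:
                # naive position averaging; a real fusion would weight
                # the measurements by their covariances
                match["x"] = 0.5 * (match["x"] + obj["x"])
                match["y"] = 0.5 * (match["y"] + obj["y"])
    return fused
```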
The object tracking estimates the detected objects' dynamic states, which is realized by an Extended Kalman Filter (EKF) based on a constant turn-rate and velocity (CTRV) model.
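The sketch below illustrates a CTRV prediction step as it could be used inside an EKF. The state layout `[x, y, yaw, v, yaw_rate]` and the implementation details are assumptions for illustration, not taken from the repository:

```python
# Minimal sketch of a CTRV (constant turn-rate and velocity) prediction
# step; the state layout is an assumption for this example.
import numpy as np


def ctrv_predict(state, dt, eps=1e-6):
    x, y, yaw, v, yaw_rate = state
    if abs(yaw_rate) > eps:
        # circular-arc motion for a non-zero turn rate
        x += v / yaw_rate * (np.sin(yaw + yaw_rate * dt) - np.sin(yaw))
        y += v / yaw_rate * (np.cos(yaw) - np.cos(yaw + yaw_rate * dt))
    else:
        # straight-line motion as the turn rate approaches zero
        x += v * np.cos(yaw) * dt
        y += v * np.sin(yaw) * dt
    yaw += yaw_rate * dt
    # velocity and turn rate are assumed constant over the step
    return np.array([x, y, yaw, v, yaw_rate])
```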
Requirements

- OS: Ubuntu 22.04 LTS
- Docker: 20.10.17
- Docker Compose: v2.6.1
- Python: 3.8
- ROS2: galactic
Installation

Clone the repository:

```bash
git clone https://github.com/TUMFTM/FusionTracking.git
```

Set up a virtual environment and install the requirements:

```bash
python3.8 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

Install the TeX extensions, which are necessary to plot with the desired font:

```bash
sudo apt-get install texlive-latex-extra texlive-fonts-recommended dvipng cm-super
```
Data and Evaluation

The evaluation is conducted entirely with real-world data of team TUM Autonomous Motorsport from the AC@CES 2022. The recorded raw data of all tracking inputs, stored in rosbags, is available open-source (Link, uncompressed size: 16.4 GB). The data processing and evaluation procedure is described in the README. Follow the described steps to reproduce the evaluation.
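To get a first look at the recorded data, the rosbags can be iterated with the `rosbag2_py` API, for example as sketched below; the path and storage settings are placeholders, not values from the repository:

```python
# Sketch: iterate over the messages of a recorded rosbag2 file.
# The path and topic handling are placeholders for illustration.
import rosbag2_py

storage_options = rosbag2_py.StorageOptions(
    uri="path/to/rosbag", storage_id="sqlite3"
)
converter_options = rosbag2_py.ConverterOptions(
    input_serialization_format="cdr", output_serialization_format="cdr"
)

reader = rosbag2_py.SequentialReader()
reader.open(storage_options, converter_options)

while reader.has_next():
    topic, data, timestamp_ns = reader.read_next()
    print(topic, timestamp_ns)  # inspect which inputs were recorded when
```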
Docker Image

It is recommended to run the ROS2 node of the module in a Docker container. To build the related image, execute:

```bash
docker build --pull --tag <image_name>:<tag> .
# e.g. docker build --pull --tag tracking:0.0.1 .
```

To run the container and launch the ROS2 node, execute:

```bash
docker run <image_name>:<tag> ros2 launch tracking tracking.launch.py
# e.g. docker run tracking:0.0.1 ros2 launch tracking tracking.launch.py
```

It is recommended to mount a volume to save the logs written during node runs (see replay.yml for an example). Additional parameters can be added to the ros2 launch command if desired, see the section Parameter and Files below. For further details about Docker and ROS2, we refer to the official documentation.
Parameter and Files

Directory: tracking

The directory `tracking` contains the source code (subfolder: `tracking`) and the ROS2 launch configuration (subfolder: `launch`) of the module.
| Files | Description |
|---|---|
| `tracking/tracking_node.py` | ROS2 main file to apply the MixNet |
| `launch/tracking.launch.py` | ROS2 launch file with parameter definition |
The launch description contains the following parameters:

| Parameter | Type | Default | Description |
|---|---|---|---|
| `frequency` | float, int | 50.0 Hz | Cycle frequency of the ROS2 node |
| `max_delay_ego_s` | float | 0.15 s | Threshold for the ego state message delay |
| `checks_enabled` | boolean | False | If true, failed safety checks trigger the emergency state of the module |
| `track` | string | LVMS | Name of the used race track map |
| `use_sim_time` | boolean | False | Flag to use sim time instead of system time |
| `ego_raceline` | string | default | Used ego raceline of the motion planner (default, inner, outer, center) |
| `send_prediction` | boolean | True | If true, a prediction is published |
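For orientation, the sketch below shows how such arguments are typically declared in a ROS2 Python launch file. The executable name and the reduced parameter set are assumptions for illustration; the actual definitions live in `launch/tracking.launch.py`:

```python
# Sketch of a ROS2 Python launch file declaring overridable arguments;
# the executable name "tracking_node" is an assumption for this example.
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.substitutions import LaunchConfiguration
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        DeclareLaunchArgument("frequency", default_value="50.0"),
        DeclareLaunchArgument("checks_enabled", default_value="False"),
        Node(
            package="tracking",
            executable="tracking_node",
            parameters=[{
                # launch arguments are forwarded as node parameters
                "frequency": LaunchConfiguration("frequency"),
                "checks_enabled": LaunchConfiguration("checks_enabled"),
            }],
        ),
    ])
```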
Add them at the end of the `docker run` command. Example with modified frequency and enabled safety checks:

```bash
docker run tracking:0.0.1 ros2 launch tracking tracking.launch.py frequency:=100.0 checks_enabled:=True
```
Directory: tools

The directory `tools` contains the script to visualize logged data of the applied ROS2 node. To visualize logged data of the tracking node, run:

```bash
python tools/visualize_logfiles.py
```

Logs must be stored in `tracking/tracking/logs` to be considered. Enter the number of the desired log or hit enter to run the latest one. Add additional arguments if desired. Without any argument, the overall tracking process is shown (always recommended at first).
Additional Arguments:

- `--n_obs`: Specifies the number of objects to show filter values / states of (default: 5)
- `--filter`: Visualizes the filter values of the `n_obs` most-seen objects (default: False)
- `--states`: Visualizes the dynamic states of the `n_obs` most-seen objects (default: False)
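As an illustration of how such flags could be handled, a minimal argparse sketch follows; the exact semantics in `tools/visualize_logfiles.py` may differ:

```python
# Sketch of parsing the listed arguments; flag semantics are inferred
# from the descriptions above, not read from the actual script.
import argparse

parser = argparse.ArgumentParser(description="Visualize tracking logfiles")
parser.add_argument("--n_obs", type=int, default=5,
                    help="number of most-seen objects to show values of")
parser.add_argument("--filter", action="store_true",
                    help="visualize filter values of the n_obs most-seen objects")
parser.add_argument("--states", action="store_true",
                    help="visualize dynamic states of the n_obs most-seen objects")
args = parser.parse_args()
```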