rko_lio repository
Repository Summary

Field | Value
---|---
Description | A Robust Approach for LiDAR-Inertial Odometry Without Sensor-Specific Modelling
Checkout URI | https://github.com/PRBonn/rko_lio.git
VCS Type | git
VCS Version | master
Last Updated | 2025-10-08
Dev Status | DEVELOPED
Released | RELEASED
Tags | robotics mapping ros imu lidar inertial slam odometry ros2 lidar-inertial-odometry
Contributing | Help Wanted (-), Good First Issues (-), Pull Requests to Review (-)

Packages

Name | Version
---|---
rko_lio | 0.1.6
README
RKO LIO
Robust LiDAR-Inertial Odometry Without Sensor-Specific Modelling
Four different platforms, four different environments, one odometry system
Quick Start
Assuming you have a rosbag (ROS1 or ROS2) which contains a TF tree, you can inspect the results of our odometry system with the following two steps. First, install the packages:

```bash
pip install rko_lio rosbags rerun-sdk
```
Why these three packages?

- `rko_lio` -> our odometry package
- `rosbags` -> required for our rosbag dataloader. Both ROS1 and ROS2 bags are supported!
- `rerun-sdk` -> required for our optional visualizer (the `-v` flag)
Next, run:

```bash
# data path should be a directory with *.bag files (ROS1) or a metadata.yaml (ROS2)
rko_lio -v /path/to/data
```

and you should be good to go!
Some more details on how the above works and how to use RKO LIO:
The `-v` flag enables visualization. You can specify a dataloader to use with `-d`; if you don't, we try to guess the format based on the layout of the data. Our rosbag dataloader works with either ROS1 or ROS2 bags. Place split ROS1 bags in a single folder and pass the folder as the data path. Note that we don't support running RKO LIO on partial or incomplete bags, though you can try (and maybe raise an issue if you think we should support this). ROS2 bags especially will need a `metadata.yaml` file.

By default, we assume there is just one IMU topic and one LiDAR topic in the bag, in which case we automatically pick up the topic names and proceed. If there are multiple topics per sensor, you will be prompted to select one via the `--imu` or `--lidar` flags, which you can pass to `rko_lio`.

Next, we assume there is a (static) TF tree in the bag. If so, we take the frame ids from the message topics we just picked up, build a static TF tree, and then query it for the extrinsic from IMU to LiDAR. Our odometry estimates the robot pose with respect to a base frame, and by default we assume the LiDAR frame to be the base frame. If you would like to use a different frame, you can pass the frame id with `--base_frame` (note the other options available with `--help`). The TF tree will be queried for the appropriate transformations (if they exist in the bag!).

In case there is no TF tree in the bag, you will have to manually specify the extrinsics from IMU to base frame and from LiDAR to base frame, as these two are **required** parameters. Set one of the extrinsics to identity if you want that sensor's frame to be the base frame (you will still have to specify both parameters). You can specify the extrinsics via a config YAML file with the keys `extrinsic_imu2base_quat_xyzw_xyz` and `extrinsic_lidar2base_quat_xyzw_xyz`. You can dump a config with all the options set to default values by running `rko_lio --dump_config`. Modify it as you require, and pass the file to `rko_lio` using the `-c` flag. Please check `python/config` in the GitHub repository for example configurations. An example invocation would then be

```bash
# the config should have the sensor extrinsics if the rosbag doesn't
rko_lio -v -c config.yaml --imu imu_topic --lidar lidar_topic /path/to/rosbag_folder
```

(README truncated at 100 lines; [see the full file](https://github.com/PRBonn/rko_lio/tree/master/README.md).)
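To make the extrinsics configuration concrete, here is a minimal sketch of such a config YAML. Only the two key names are taken from the text above; the value layout (quaternion x, y, z, w followed by translation x, y, z) is inferred from the key names, and all numbers are placeholders. Generate the authoritative template with `rko_lio --dump_config` and fill in your own calibration.

```yaml
# Illustrative sketch only: key names from the README, values are placeholders.
# Assumed layout per key: quaternion x, y, z, w, then translation x, y, z.
# The LiDAR frame is used as the base frame here, so its extrinsic is identity.
extrinsic_lidar2base_quat_xyzw_xyz: [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
# IMU pose relative to the base (LiDAR) frame; replace with your own calibration.
extrinsic_imu2base_quat_xyzw_xyz: [0.0, 0.0, 0.0, 1.0, 0.05, 0.0, -0.02]
```

You would then pass this file with the `-c` flag, as in the example invocation above.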
CONTRIBUTING