fusion_estimator package from the ros2go2estimator repo
Package Summary
| Field | Value |
|---|---|
| Version | 0.0.0 |
| License | Apache-2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Field | Value |
|---|---|
| Checkout URI | https://github.com/shineminxing/ros2go2estimator.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2026-04-02 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
Package Description
Maintainers
- Sun Minxing
Authors
CAPO-LeggedRobotOdometry 🦾
| Language / 语言: English | 中文 |
CAPO-LeggedRobotOdometry is a ROS 2 (Humble) proprioceptive odometry / state-estimation repository for biped / quadruped / wheel-legged robots on Ubuntu 22.04.
It provides high-accuracy odometry using only IMU + joint encoders + foot contact/force signals, without requiring cameras or LiDAR.
In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.
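As a rough illustration of the principle behind this kind of proprioceptive fusion (a generic textbook formulation, not code from this repository): while a foot is in contact and assumed pinned to the ground, the body-frame velocity can be recovered from the leg's joint rates and the IMU gyro alone, via v_body = -J(q)·q̇ - ω × p_foot.

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Generic stance-leg velocity sketch. All names here are illustrative;
// the actual estimator lives in FusionEstimator/ and fuses multiple legs
// and the IMU rather than trusting a single contact.
using Vec3 = std::array<double, 3>;

Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]};
}

// footVelInBody = J(q) * qdot, supplied by the caller's forward kinematics.
// omega is the body angular rate (IMU gyro), footPosInBody the foot position
// in the body frame. A pinned foot implies the body moves oppositely.
Vec3 bodyVelocityFromStanceLeg(const Vec3& footVelInBody,
                               const Vec3& omega,
                               const Vec3& footPosInBody) {
    Vec3 wxp = cross(omega, footPosInBody);
    return {-footVelInBody[0] - wxp[0],
            -footVelInBody[1] - wxp[1],
            -footVelInBody[2] - wxp[2]};
}
```

A stationary body swinging its joints so the foot "moves" forward at 0.1 m/s in the body frame would, under the pinned-foot assumption, be estimated as moving backward at 0.1 m/s.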
At the repository level, fusion_estimator_node.cpp is mainly the ROS 2 wrapper (topics, parameters, message conversion, publishing), while the actual fusion-estimation algorithm lives in FusionEstimator/ as a portable, pure C++ implementation. This makes it straightforward to port the estimator to ROS 1, non-ROS applications, or embedded platforms.
In addition, the Matlab/ folder provides a MATLAB + MEX example that compiles and calls the same C++ estimator core for offline validation, algorithm analysis, and cross-platform reuse.
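The wrapper/core split above follows a common pattern, sketched below. None of these type or function names come from this repository (the real interface is declared in FusionEstimator/fusion_estimator.h); the point is only the shape: the core holds state and exposes plain C++ calls, while a thin ROS 2, ROS 1, or MEX wrapper forwards sensor samples in and reads the estimate back out.

```cpp
#include <cassert>
#include <cmath>

struct Pose2D { double x, y, yaw; };  // placeholder state, not the repo's

// Placeholder for a portable estimator core: no ROS types anywhere, so the
// same class can be driven from rclcpp callbacks, a ROS 1 node, or a MEX file.
class EstimatorCore {
public:
    // Placeholder update: dead-reckon from an incremental displacement ds
    // and heading change dyaw (the real core fuses IMU, encoders, contacts).
    void feedDelta(double ds, double dyaw) {
        yaw_ += dyaw;
        x_ += ds * std::cos(yaw_);
        y_ += ds * std::sin(yaw_);
    }
    Pose2D pose() const { return {x_, y_, yaw_}; }
private:
    double x_ = 0.0, y_ = 0.0, yaw_ = 0.0;
};
```

A ROS 2 wrapper would then be nothing more than subscriptions that call `feedDelta` (or its real-world equivalent) and a timer that republishes `pose()` as nav_msgs/Odometry.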
📄 Paper
Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)
- Paper: https://arxiv.org/abs/2602.17393
If you use this repository in research, please consider citing the paper.
📦 Data Sharing (Go2-EDU ROS bags)
To help readers quickly validate the pipeline, we provide two Go2-EDU trial datasets, including real-world videos and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.
- Download (Google Drive): https://drive.google.com/drive/folders/1FfVO69rfmUu6B9crPhZCfKf9wFnV4L7n?usp=sharing
Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.
✨ Key Features
| Category | Description |
|---|---|
| Biped / Quadruped / Wheel-Legged Unified | Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking. |
| Full 3D & Planar 2D | Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D). |
| No Exteroception Required | Works without cameras or LiDAR; only IMU, joint encoders, and foot contact/force signals are required. |
| Portable Pure C++ Core | The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2. |
| MATLAB / MEX Validation | The Matlab/ folder demonstrates how to compile and invoke the same C++ core from MATLAB. |
| Runtime Tuning | Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots. |
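The "gravity-flattened" 2D odometry in the table above usually means projecting the full 3D orientation down to its yaw component. As a hedged sketch of that step (assumed behaviour, not code lifted from this package): extract yaw from the orientation quaternion and rebuild a quaternion that rotates about z only, discarding roll and pitch.

```cpp
#include <cassert>
#include <cmath>

struct Quat { double w, x, y, z; };  // illustrative type, not the repo's

// Standard ZYX yaw extraction from a unit quaternion.
double yawFromQuat(const Quat& q) {
    return std::atan2(2.0 * (q.w * q.z + q.x * q.y),
                      1.0 - 2.0 * (q.y * q.y + q.z * q.z));
}

// Keep only the rotation about the gravity axis (z), as a 2D odometry
// topic like SMX/Odom_2D would need.
Quat yawOnlyQuat(const Quat& q) {
    const double yaw = yawFromQuat(q);
    return {std::cos(yaw / 2.0), 0.0, 0.0, std::sin(yaw / 2.0)};
}
```

By construction the result has zero x and y components, so any roll or pitch in the input is dropped while heading is preserved.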
🗂️ Related Repositories
| Scope | Repository | Summary |
|---|---|---|
| Low-level / Drivers | https://github.com/ShineMinxing/Ros2Go2Base | DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities |
| Odometry | CAPO-LeggedRobotOdometry (this repo) | Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D |
| SLAM / Mapping | https://github.com/ShineMinxing/Ros2SLAM | Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO |
| Voice / LLM | https://github.com/ShineMinxing/Ros2Chat | Offline ASR + OpenAI Chat + TTS |
| Vision | https://github.com/ShineMinxing/Ros2ImageProcess | Camera pipelines, spot / face / drone detection |
| Gimbal Tracking | https://github.com/ShineMinxing/Ros2AmovG1 | Amov G1 gimbal control and tracking |
| Tools | https://github.com/ShineMinxing/Ros2Tools | Bluetooth IMU, joystick mapping, gimbal loop, data logging |
⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.
📁 Repository Layout
```
CAPO-LeggedRobotOdometry/
├── CMakeLists.txt
├── package.xml
├── config.yaml
├── fusion_estimator_node.cpp   # ROS2 node wrapper: params / topics / odom publishing
├── FusionEstimator/            # pure C++ fusion-estimation core (portable)
│   ├── Estimators/
│   ├── fusion_estimator.h
│   ├── LowlevelState.h
│   ├── SensorBase.cpp
│   ├── SensorBase.h
│   ├── Sensor_IMU.cpp
│   ├── Sensor_IMU.h
│   ├── Sensor_Legs.cpp
│   ├── Sensor_Legs.h
│   └── Readme.md
├── Matlab/                     # MATLAB + MEX example for calling the C++ core
│   ├── build_mex.m
│   ├── fusion_estimator.m
│   ├── fusion_estimator_mex.cpp
│   ├── MPXY150Z10.csv
│   └── MWXY150Z10.csv
├── Plotjuggler.xml             # PlotJuggler layout / visualization helper
└── Readme.md
```
🧩 Architecture Notes
This repository is intentionally split into two layers:
File truncated at 100 lines; see the full file in the repository.
Package Dependencies
| Dependency |
|---|
| ament_cmake |
| ament_lint_auto |
| ament_lint_common |
| rclcpp |
| sensor_msgs |
| std_msgs |
| nav_msgs |
| tf2 |
| tf2_geometry_msgs |
System Dependencies
Dependent Packages
Launch files
Messages
Services
Plugins
|
fusion_estimator package from ros2go2estimator repofusion_estimator |
ROS Distro
|
Package Summary
| Version | 0.0.0 |
| License | Apache-2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Description | |
| Checkout URI | https://github.com/shineminxing/ros2go2estimator.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2026-04-02 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Maintainers
- Sun Minxing
Authors
CAPO-LeggedRobotOdometry π¦Ύ
| Language / θ―θ¨οΌ English | δΈζ |
CAPO-LeggedRobotOdometry is a ROS 2 (Humble) proprioceptive odometry / state-estimation repository for biped / quadruped / wheel-legged robots on Ubuntu 22.04.
It provides high-accuracy odometry using only IMU + joint encoders + foot contact/force signals, without requiring cameras or LiDAR.
In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.
At the repository level, fusion_estimator_node.cpp is mainly the ROS2 wrapper (topics, parameters, message conversion, publishing), while the actual fusion-estimation algorithm lives in FusionEstimator/ as a portable, pure C++ implementation. This makes it straightforward to port the estimator to ROS1, non-ROS2 applications, or embedded platforms.
In addition, the Matlab/ folder provides a MATLAB + MEX example that compiles and calls the same C++ estimator core for offline validation, algorithm analysis, and cross-platform reuse.
π Paper
Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)
- Paper: https://arxiv.org/abs/2602.17393
If you use this repository in research, please consider citing the paper.
π¦ Data Sharing (Go2-EDU ROS bags)
To help readers quickly validate the pipeline, we provide two Go2-EDU trial datasets, including real-world videos and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.
- Download (Google Drive): https://drive.google.com/drive/folders/1FfVO69rfmUu6B9crPhZCfKf9wFnV4L7n?usp=sharing
Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.
β¨ Key Features
| Category | Description |
|---|---|
| Biped / Quadruped / Wheel-Legged Unified | Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking. |
| Full 3D & Planar 2D | Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D). |
| No Exteroception Required | Works without cameras or LiDAR; only IMU, joint encoders, and foot contact/force signals are required. |
| Portable Pure C++ Core | The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2. |
| MATLAB / MEX Validation | The Matlab/ folder demonstrates how to compile and invoke the same C++ core from MATLAB. |
| Runtime Tuning | Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots. |
ποΈ Related Repositories
| Scope | Repository | Summary |
|---|---|---|
| Low-level / Drivers | https://github.com/ShineMinxing/Ros2Go2Base | DDS bridge, Unitree SDK2 control, pointcloudβLaserScan, TF utilities |
| Odometry | CAPO-LeggedRobotOdometry (this repo) | Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
|
| SLAM / Mapping | https://github.com/ShineMinxing/Ros2SLAM | Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO |
| Voice / LLM | https://github.com/ShineMinxing/Ros2Chat | Offline ASR + OpenAI Chat + TTS |
| Vision | https://github.com/ShineMinxing/Ros2ImageProcess | Camera pipelines, spot / face / drone detection |
| Gimbal Tracking | https://github.com/ShineMinxing/Ros2AmovG1 | Amov G1 gimbal control and tracking |
| Tools | https://github.com/ShineMinxing/Ros2Tools | Bluetooth IMU, joystick mapping, gimbal loop, data logging |
β οΈ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with
Ros2SLAMandRos2Go2Base.
π Repository Layout
CAPO-LeggedRobotOdometry/
βββ CMakeLists.txt
βββ package.xml
βββ config.yaml
βββ fusion_estimator_node.cpp # ROS2 node wrapper: params / topics / odom publishing
βββ FusionEstimator/ # pure C++ fusion-estimation core (portable)
β βββ Estimators/
β βββ fusion_estimator.h
β βββ LowlevelState.h
β βββ SensorBase.cpp
β βββ SensorBase.h
β βββ Sensor_IMU.cpp
β βββ Sensor_IMU.h
β βββ Sensor_Legs.cpp
β βββ Sensor_Legs.h
β βββ Readme.md
βββ Matlab/ # MATLAB + MEX example for calling the C++ core
β βββ build_mex.m
β βββ fusion_estimator.m
β βββ fusion_estimator_mex.cpp
β βββ MPXY150Z10.csv
β βββ MWXY150Z10.csv
βββ Plotjuggler.xml # PlotJuggler layout / visualization helper
βββ Readme.md
π§© Architecture Notes
This repository is intentionally split into two layers:
File truncated at 100 lines see the full file
Package Dependencies
| Deps | Name |
|---|---|
| ament_cmake | |
| ament_lint_auto | |
| ament_lint_common | |
| rclcpp | |
| sensor_msgs | |
| std_msgs | |
| nav_msgs | |
| tf2 | |
| tf2_geometry_msgs |
System Dependencies
Dependant Packages
Launch files
Messages
Services
Plugins
Recent questions tagged fusion_estimator at Robotics Stack Exchange
|
fusion_estimator package from ros2go2estimator repofusion_estimator |
ROS Distro
|
Package Summary
| Version | 0.0.0 |
| License | Apache-2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Description | |
| Checkout URI | https://github.com/shineminxing/ros2go2estimator.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2026-04-02 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Maintainers
- Sun Minxing
Authors
CAPO-LeggedRobotOdometry π¦Ύ
| Language / θ―θ¨οΌ English | δΈζ |
CAPO-LeggedRobotOdometry is a ROS 2 (Humble) proprioceptive odometry / state-estimation repository for biped / quadruped / wheel-legged robots on Ubuntu 22.04.
It provides high-accuracy odometry using only IMU + joint encoders + foot contact/force signals, without requiring cameras or LiDAR.
In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.
At the repository level, fusion_estimator_node.cpp is mainly the ROS2 wrapper (topics, parameters, message conversion, publishing), while the actual fusion-estimation algorithm lives in FusionEstimator/ as a portable, pure C++ implementation. This makes it straightforward to port the estimator to ROS1, non-ROS2 applications, or embedded platforms.
In addition, the Matlab/ folder provides a MATLAB + MEX example that compiles and calls the same C++ estimator core for offline validation, algorithm analysis, and cross-platform reuse.
π Paper
Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)
- Paper: https://arxiv.org/abs/2602.17393
If you use this repository in research, please consider citing the paper.
π¦ Data Sharing (Go2-EDU ROS bags)
To help readers quickly validate the pipeline, we provide two Go2-EDU trial datasets, including real-world videos and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.
- Download (Google Drive): https://drive.google.com/drive/folders/1FfVO69rfmUu6B9crPhZCfKf9wFnV4L7n?usp=sharing
Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.
β¨ Key Features
| Category | Description |
|---|---|
| Biped / Quadruped / Wheel-Legged Unified | Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking. |
| Full 3D & Planar 2D | Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D). |
| No Exteroception Required | Works without cameras or LiDAR; only IMU, joint encoders, and foot contact/force signals are required. |
| Portable Pure C++ Core | The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2. |
| MATLAB / MEX Validation | The Matlab/ folder demonstrates how to compile and invoke the same C++ core from MATLAB. |
| Runtime Tuning | Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots. |
ποΈ Related Repositories
| Scope | Repository | Summary |
|---|---|---|
| Low-level / Drivers | https://github.com/ShineMinxing/Ros2Go2Base | DDS bridge, Unitree SDK2 control, pointcloudβLaserScan, TF utilities |
| Odometry | CAPO-LeggedRobotOdometry (this repo) | Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
|
| SLAM / Mapping | https://github.com/ShineMinxing/Ros2SLAM | Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO |
| Voice / LLM | https://github.com/ShineMinxing/Ros2Chat | Offline ASR + OpenAI Chat + TTS |
| Vision | https://github.com/ShineMinxing/Ros2ImageProcess | Camera pipelines, spot / face / drone detection |
| Gimbal Tracking | https://github.com/ShineMinxing/Ros2AmovG1 | Amov G1 gimbal control and tracking |
| Tools | https://github.com/ShineMinxing/Ros2Tools | Bluetooth IMU, joystick mapping, gimbal loop, data logging |
β οΈ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with
Ros2SLAMandRos2Go2Base.
π Repository Layout
CAPO-LeggedRobotOdometry/
βββ CMakeLists.txt
βββ package.xml
βββ config.yaml
βββ fusion_estimator_node.cpp # ROS2 node wrapper: params / topics / odom publishing
βββ FusionEstimator/ # pure C++ fusion-estimation core (portable)
β βββ Estimators/
β βββ fusion_estimator.h
β βββ LowlevelState.h
β βββ SensorBase.cpp
β βββ SensorBase.h
β βββ Sensor_IMU.cpp
β βββ Sensor_IMU.h
β βββ Sensor_Legs.cpp
β βββ Sensor_Legs.h
β βββ Readme.md
βββ Matlab/ # MATLAB + MEX example for calling the C++ core
β βββ build_mex.m
β βββ fusion_estimator.m
β βββ fusion_estimator_mex.cpp
β βββ MPXY150Z10.csv
β βββ MWXY150Z10.csv
βββ Plotjuggler.xml # PlotJuggler layout / visualization helper
βββ Readme.md
π§© Architecture Notes
This repository is intentionally split into two layers:
File truncated at 100 lines see the full file
Package Dependencies
| Deps | Name |
|---|---|
| ament_cmake | |
| ament_lint_auto | |
| ament_lint_common | |
| rclcpp | |
| sensor_msgs | |
| std_msgs | |
| nav_msgs | |
| tf2 | |
| tf2_geometry_msgs |
System Dependencies
Dependant Packages
Launch files
Messages
Services
Plugins
Recent questions tagged fusion_estimator at Robotics Stack Exchange
|
fusion_estimator package from ros2go2estimator repofusion_estimator |
ROS Distro
|
Package Summary
| Version | 0.0.0 |
| License | Apache-2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Description | |
| Checkout URI | https://github.com/shineminxing/ros2go2estimator.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2026-04-02 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Maintainers
- Sun Minxing
Authors
CAPO-LeggedRobotOdometry π¦Ύ
| Language / θ―θ¨οΌ English | δΈζ |
CAPO-LeggedRobotOdometry is a ROS 2 (Humble) proprioceptive odometry / state-estimation repository for biped / quadruped / wheel-legged robots on Ubuntu 22.04.
It provides high-accuracy odometry using only IMU + joint encoders + foot contact/force signals, without requiring cameras or LiDAR.
In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.
At the repository level, fusion_estimator_node.cpp is mainly the ROS2 wrapper (topics, parameters, message conversion, publishing), while the actual fusion-estimation algorithm lives in FusionEstimator/ as a portable, pure C++ implementation. This makes it straightforward to port the estimator to ROS1, non-ROS2 applications, or embedded platforms.
In addition, the Matlab/ folder provides a MATLAB + MEX example that compiles and calls the same C++ estimator core for offline validation, algorithm analysis, and cross-platform reuse.
π Paper
Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)
- Paper: https://arxiv.org/abs/2602.17393
If you use this repository in research, please consider citing the paper.
π¦ Data Sharing (Go2-EDU ROS bags)
To help readers quickly validate the pipeline, we provide two Go2-EDU trial datasets, including real-world videos and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.
- Download (Google Drive): https://drive.google.com/drive/folders/1FfVO69rfmUu6B9crPhZCfKf9wFnV4L7n?usp=sharing
Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.
β¨ Key Features
| Category | Description |
|---|---|
| Biped / Quadruped / Wheel-Legged Unified | Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking. |
| Full 3D & Planar 2D | Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D). |
| No Exteroception Required | Works without cameras or LiDAR; only IMU, joint encoders, and foot contact/force signals are required. |
| Portable Pure C++ Core | The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2. |
| MATLAB / MEX Validation | The Matlab/ folder demonstrates how to compile and invoke the same C++ core from MATLAB. |
| Runtime Tuning | Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots. |
ποΈ Related Repositories
| Scope | Repository | Summary |
|---|---|---|
| Low-level / Drivers | https://github.com/ShineMinxing/Ros2Go2Base | DDS bridge, Unitree SDK2 control, pointcloudβLaserScan, TF utilities |
| Odometry | CAPO-LeggedRobotOdometry (this repo) | Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
|
| SLAM / Mapping | https://github.com/ShineMinxing/Ros2SLAM | Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO |
| Voice / LLM | https://github.com/ShineMinxing/Ros2Chat | Offline ASR + OpenAI Chat + TTS |
| Vision | https://github.com/ShineMinxing/Ros2ImageProcess | Camera pipelines, spot / face / drone detection |
| Gimbal Tracking | https://github.com/ShineMinxing/Ros2AmovG1 | Amov G1 gimbal control and tracking |
| Tools | https://github.com/ShineMinxing/Ros2Tools | Bluetooth IMU, joystick mapping, gimbal loop, data logging |
β οΈ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with
Ros2SLAMandRos2Go2Base.
π Repository Layout
CAPO-LeggedRobotOdometry/
βββ CMakeLists.txt
βββ package.xml
βββ config.yaml
βββ fusion_estimator_node.cpp # ROS2 node wrapper: params / topics / odom publishing
βββ FusionEstimator/ # pure C++ fusion-estimation core (portable)
β βββ Estimators/
β βββ fusion_estimator.h
β βββ LowlevelState.h
β βββ SensorBase.cpp
β βββ SensorBase.h
β βββ Sensor_IMU.cpp
β βββ Sensor_IMU.h
β βββ Sensor_Legs.cpp
β βββ Sensor_Legs.h
β βββ Readme.md
βββ Matlab/ # MATLAB + MEX example for calling the C++ core
β βββ build_mex.m
β βββ fusion_estimator.m
β βββ fusion_estimator_mex.cpp
β βββ MPXY150Z10.csv
β βββ MWXY150Z10.csv
βββ Plotjuggler.xml # PlotJuggler layout / visualization helper
βββ Readme.md
π§© Architecture Notes
This repository is intentionally split into two layers:
File truncated at 100 lines see the full file
Package Dependencies
| Deps | Name |
|---|---|
| ament_cmake | |
| ament_lint_auto | |
| ament_lint_common | |
| rclcpp | |
| sensor_msgs | |
| std_msgs | |
| nav_msgs | |
| tf2 | |
| tf2_geometry_msgs |
System Dependencies
Dependant Packages
Launch files
Messages
Services
Plugins
Recent questions tagged fusion_estimator at Robotics Stack Exchange
|
fusion_estimator package from ros2go2estimator repofusion_estimator |
ROS Distro
|
Package Summary
| Version | 0.0.0 |
| License | Apache-2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Description | |
| Checkout URI | https://github.com/shineminxing/ros2go2estimator.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2026-04-02 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Maintainers
- Sun Minxing
Authors
CAPO-LeggedRobotOdometry π¦Ύ
| Language / θ―θ¨οΌ English | δΈζ |
CAPO-LeggedRobotOdometry is a ROS 2 (Humble) proprioceptive odometry / state-estimation repository for biped / quadruped / wheel-legged robots on Ubuntu 22.04.
It provides high-accuracy odometry using only IMU + joint encoders + foot contact/force signals, without requiring cameras or LiDAR.
In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.
At the repository level, fusion_estimator_node.cpp is mainly the ROS2 wrapper (topics, parameters, message conversion, publishing), while the actual fusion-estimation algorithm lives in FusionEstimator/ as a portable, pure C++ implementation. This makes it straightforward to port the estimator to ROS1, non-ROS2 applications, or embedded platforms.
In addition, the Matlab/ folder provides a MATLAB + MEX example that compiles and calls the same C++ estimator core for offline validation, algorithm analysis, and cross-platform reuse.
π Paper
Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)
- Paper: https://arxiv.org/abs/2602.17393
If you use this repository in research, please consider citing the paper.
π¦ Data Sharing (Go2-EDU ROS bags)
To help readers quickly validate the pipeline, we provide two Go2-EDU trial datasets, including real-world videos and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.
- Download (Google Drive): https://drive.google.com/drive/folders/1FfVO69rfmUu6B9crPhZCfKf9wFnV4L7n?usp=sharing
Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.
β¨ Key Features
| Category | Description |
|---|---|
| Biped / Quadruped / Wheel-Legged Unified | Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking. |
| Full 3D & Planar 2D | Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D). |
| No Exteroception Required | Works without cameras or LiDAR; only IMU, joint encoders, and foot contact/force signals are required. |
| Portable Pure C++ Core | The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2. |
| MATLAB / MEX Validation | The Matlab/ folder demonstrates how to compile and invoke the same C++ core from MATLAB. |
| Runtime Tuning | Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots. |
ποΈ Related Repositories
| Scope | Repository | Summary |
|---|---|---|
| Low-level / Drivers | https://github.com/ShineMinxing/Ros2Go2Base | DDS bridge, Unitree SDK2 control, pointcloudβLaserScan, TF utilities |
| Odometry | CAPO-LeggedRobotOdometry (this repo) | Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
|
| SLAM / Mapping | https://github.com/ShineMinxing/Ros2SLAM | Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO |
| Voice / LLM | https://github.com/ShineMinxing/Ros2Chat | Offline ASR + OpenAI Chat + TTS |
| Vision | https://github.com/ShineMinxing/Ros2ImageProcess | Camera pipelines, spot / face / drone detection |
| Gimbal Tracking | https://github.com/ShineMinxing/Ros2AmovG1 | Amov G1 gimbal control and tracking |
| Tools | https://github.com/ShineMinxing/Ros2Tools | Bluetooth IMU, joystick mapping, gimbal loop, data logging |
β οΈ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with
Ros2SLAMandRos2Go2Base.
π Repository Layout
CAPO-LeggedRobotOdometry/
βββ CMakeLists.txt
βββ package.xml
βββ config.yaml
βββ fusion_estimator_node.cpp # ROS2 node wrapper: params / topics / odom publishing
βββ FusionEstimator/ # pure C++ fusion-estimation core (portable)
β βββ Estimators/
β βββ fusion_estimator.h
β βββ LowlevelState.h
β βββ SensorBase.cpp
β βββ SensorBase.h
β βββ Sensor_IMU.cpp
β βββ Sensor_IMU.h
β βββ Sensor_Legs.cpp
β βββ Sensor_Legs.h
β βββ Readme.md
βββ Matlab/ # MATLAB + MEX example for calling the C++ core
β βββ build_mex.m
β βββ fusion_estimator.m
β βββ fusion_estimator_mex.cpp
β βββ MPXY150Z10.csv
β βββ MWXY150Z10.csv
βββ Plotjuggler.xml # PlotJuggler layout / visualization helper
βββ Readme.md
π§© Architecture Notes
This repository is intentionally split into two layers:
File truncated at 100 lines see the full file
Package Dependencies
| Deps | Name |
|---|---|
| ament_cmake | |
| ament_lint_auto | |
| ament_lint_common | |
| rclcpp | |
| sensor_msgs | |
| std_msgs | |
| nav_msgs | |
| tf2 | |
| tf2_geometry_msgs |
System Dependencies
Dependant Packages
Launch files
Messages
Services
Plugins
Recent questions tagged fusion_estimator at Robotics Stack Exchange
|
fusion_estimator package from ros2go2estimator repofusion_estimator |
ROS Distro
|
Package Summary
| Version | 0.0.0 |
| License | Apache-2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Description | |
| Checkout URI | https://github.com/shineminxing/ros2go2estimator.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2026-04-02 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Maintainers
- Sun Minxing
Authors
CAPO-LeggedRobotOdometry π¦Ύ
| Language / θ―θ¨οΌ English | δΈζ |
CAPO-LeggedRobotOdometry is a ROS 2 (Humble) proprioceptive odometry / state-estimation repository for biped / quadruped / wheel-legged robots on Ubuntu 22.04.
It provides high-accuracy odometry using only IMU + joint encoders + foot contact/force signals, without requiring cameras or LiDAR.
In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.
At the repository level, fusion_estimator_node.cpp is mainly the ROS2 wrapper (topics, parameters, message conversion, publishing), while the actual fusion-estimation algorithm lives in FusionEstimator/ as a portable, pure C++ implementation. This makes it straightforward to port the estimator to ROS1, non-ROS2 applications, or embedded platforms.
In addition, the Matlab/ folder provides a MATLAB + MEX example that compiles and calls the same C++ estimator core for offline validation, algorithm analysis, and cross-platform reuse.
π Paper
Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)
- Paper: https://arxiv.org/abs/2602.17393
If you use this repository in research, please consider citing the paper.
π¦ Data Sharing (Go2-EDU ROS bags)
To help readers quickly validate the pipeline, we provide two Go2-EDU trial datasets, including real-world videos and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.
- Download (Google Drive): https://drive.google.com/drive/folders/1FfVO69rfmUu6B9crPhZCfKf9wFnV4L7n?usp=sharing
Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.
β¨ Key Features
| Category | Description |
|---|---|
| Biped / Quadruped / Wheel-Legged Unified | Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking. |
| Full 3D & Planar 2D | Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D). |
| No Exteroception Required | Works without cameras or LiDAR; only IMU, joint encoders, and foot contact/force signals are required. |
| Portable Pure C++ Core | The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2. |
| MATLAB / MEX Validation | The Matlab/ folder demonstrates how to compile and invoke the same C++ core from MATLAB. |
| Runtime Tuning | Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots. |
ποΈ Related Repositories
| Scope | Repository | Summary |
|---|---|---|
| Low-level / Drivers | https://github.com/ShineMinxing/Ros2Go2Base | DDS bridge, Unitree SDK2 control, pointcloudβLaserScan, TF utilities |
| Odometry | CAPO-LeggedRobotOdometry (this repo) | Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
|
| SLAM / Mapping | https://github.com/ShineMinxing/Ros2SLAM | Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO |
| Voice / LLM | https://github.com/ShineMinxing/Ros2Chat | Offline ASR + OpenAI Chat + TTS |
| Vision | https://github.com/ShineMinxing/Ros2ImageProcess | Camera pipelines, spot / face / drone detection |
| Gimbal Tracking | https://github.com/ShineMinxing/Ros2AmovG1 | Amov G1 gimbal control and tracking |
| Tools | https://github.com/ShineMinxing/Ros2Tools | Bluetooth IMU, joystick mapping, gimbal loop, data logging |
⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.
📁 Repository Layout
```
CAPO-LeggedRobotOdometry/
├── CMakeLists.txt
├── package.xml
├── config.yaml
├── fusion_estimator_node.cpp   # ROS2 node wrapper: params / topics / odom publishing
├── FusionEstimator/            # pure C++ fusion-estimation core (portable)
│   ├── Estimators/
│   ├── fusion_estimator.h
│   ├── LowlevelState.h
│   ├── SensorBase.cpp
│   ├── SensorBase.h
│   ├── Sensor_IMU.cpp
│   ├── Sensor_IMU.h
│   ├── Sensor_Legs.cpp
│   ├── Sensor_Legs.h
│   └── Readme.md
├── Matlab/                     # MATLAB + MEX example for calling the C++ core
│   ├── build_mex.m
│   ├── fusion_estimator.m
│   ├── fusion_estimator_mex.cpp
│   ├── MPXY150Z10.csv
│   └── MWXY150Z10.csv
├── Plotjuggler.xml             # PlotJuggler layout / visualization helper
└── Readme.md
```
🧩 Architecture Notes
This repository is intentionally split into two layers: fusion_estimator_node.cpp is the ROS2 wrapper layer (parameters, topics, message conversion, publishing), while FusionEstimator/ carries the actual fusion-estimation algorithm as a portable, pure C++ core.
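The two-layer split described above can be sketched in miniature (hypothetical names, trivial stand-in math, not this repository's API): the core consumes plain numbers and knows nothing about any middleware, while the wrapper's only job is to unpack messages and forward them.

```python
class EstimatorCore:
    """Portable core layer: consumes plain numeric inputs, no ROS types."""

    def __init__(self):
        self.x = 0.0
        self.y = 0.0

    def update(self, vx, vy, dt):
        # Trivial dead-reckoning stand-in for the real fusion algorithm.
        self.x += vx * dt
        self.y += vy * dt
        return (self.x, self.y)


class MessageWrapper:
    """Wrapper layer: unpacks a message-like dict and feeds the core."""

    def __init__(self, core):
        self.core = core

    def on_msg(self, msg):
        # In a ROS node this would convert sensor_msgs / nav_msgs types.
        return self.core.update(msg["vx"], msg["vy"], msg["dt"])


wrapper = MessageWrapper(EstimatorCore())
pose = wrapper.on_msg({"vx": 1.0, "vy": 0.5, "dt": 0.1})
```

Because EstimatorCore has no framework dependencies, the same class could be driven from a ROS1 node, a MEX entry point, or an embedded main loop, which is the stated motivation for the split.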
Package Dependencies
| Dependency |
|---|
| ament_cmake |
| ament_lint_auto |
| ament_lint_common |
| rclcpp |
| sensor_msgs |
| std_msgs |
| nav_msgs |
| tf2 |
| tf2_geometry_msgs |
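For orientation, dependencies like these are typically declared in the package's package.xml roughly as follows. This is a sketch of the standard ament declaration format, not a verbatim copy of this package's file:

```xml
<buildtool_depend>ament_cmake</buildtool_depend>
<depend>rclcpp</depend>
<depend>sensor_msgs</depend>
<depend>std_msgs</depend>
<depend>nav_msgs</depend>
<depend>tf2</depend>
<depend>tf2_geometry_msgs</depend>
<test_depend>ament_lint_auto</test_depend>
<test_depend>ament_lint_common</test_depend>
```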