No released version of this package exists for ROS distros humble, jazzy, kilted, rolling, galactic, or iron; the information below is taken from the GitHub repository.

ros2go2estimator repository (package: fusion_estimator)

Repository Summary

  • Checkout URI: https://github.com/shineminxing/ros2go2estimator.git
  • VCS Type: git
  • VCS Version: main
  • Last Updated: 2026-04-20
  • Dev Status: UNKNOWN
  • Released: UNRELEASED
  • Contributing: Help Wanted (-), Good First Issues (-), Pull Requests to Review (-)

Packages

  • fusion_estimator 0.0.0

README

CAPO-LeggedRobotOdometry 🦾

Language / 语言: English | 中文

CAPO-LeggedRobotOdometry is a pure proprioceptive odometry library for legged robots, implemented with a portable C++ estimator core that depends only on IMU and joint motor data.

The core estimation logic is implemented in FusionEstimator/fusion_estimator.h. The file fusion_estimator_node.cpp provides a ROS 2 wrapper around this estimator, while the Matlab/ folder contains examples for MATLAB + C++ mixed compilation and offline evaluation.

For side-by-side comparison, Matlab/Comparison/invariant-ekf/ provides a MATLAB mixed-compilation workflow for invariant-ekf, making it easier to compare this repository against a representative open-source legged odometry baseline.

In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.


📄 Paper

Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)

  • Paper: https://arxiv.org/abs/2602.17393

If you use this repository in research, please consider citing the paper.


📦 Data Sharing (Go2-EDU ROS bags)

To help readers quickly validate the pipeline, we provide Go2-EDU trial datasets, including real-world videos, derived CSV files, and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.

  • Download (GitHub Releases): https://github.com/ShineMinxing/CAPO-LeggedRobotOdometry/releases/tag/DataForTest
  • Recommended assets in that release include GO2Flat, GO2Stairs, MPXY150Z10, MWXY150Z10, robot_flat_1_compress.zip, and robot_stairs_1_compress.zip

Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.


✨ Key Features

  • Biped / Quadruped / Wheel-Legged Unified: Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking.
  • IMU + Joint-Motor Only: The estimator core works with only IMU and joint-motor measurements, without requiring cameras or LiDAR.
  • MATLAB / C++ Mixed Compilation: The Matlab/ folder provides MATLAB + MEX examples for calling the same C++ core, and Matlab/Comparison/invariant-ekf/ includes a comparable mixed-compilation setup for invariant-ekf.
  • Full 3D & Planar 2D: Publishes both 6-DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D).
  • Portable Pure C++ Core: The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS 2.
  • Runtime Tuning: Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots.

Related repositories, by scope:

  • Low-level / Drivers: https://github.com/ShineMinxing/Ros2Go2Base (DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities)
  • Odometry: CAPO-LeggedRobotOdometry (this repo; pure proprioceptive fusion, publishes SMX/Odom and SMX/Odom_2D)
  • SLAM / Mapping: https://github.com/ShineMinxing/Ros2SLAM (integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO)
  • Voice / LLM: https://github.com/ShineMinxing/Ros2Chat (offline ASR + OpenAI Chat + TTS)
  • Vision: https://github.com/ShineMinxing/Ros2ImageProcess (camera pipelines; spot / face / drone detection)
  • Gimbal Tracking: https://github.com/ShineMinxing/Ros2AmovG1 (Amov G1 gimbal control and tracking)
  • Tools: https://github.com/ShineMinxing/Ros2Tools (Bluetooth IMU, joystick mapping, gimbal loop, data logging)

⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.


📂 Repository Layout

```
CAPO-LeggedRobotOdometry/
├── CMakeLists.txt
├── package.xml
├── config.yaml
├── fusion_estimator_node.cpp        # ROS 2 wrapper around the C++ estimator core
├── FusionEstimator/                 # portable pure C++ estimator core
│   ├── Estimators/
│   ├── fusion_estimator.h           # main estimator entry
│   ├── LowlevelState.h
│   ├── SensorBase.cpp
│   ├── SensorBase.h
│   ├── Sensor_IMU.cpp
│   ├── Sensor_IMU.h
│   ├── Sensor_Legs.cpp
│   ├── Sensor_Legs.h
│   └── Readme.md
├── Matlab/                          # MATLAB + MEX examples for the same C++ core
│   ├── build_mex.m
│   ├── fusion_estimator.m
│   ├── fusion_estimator_mex.cpp
│   ├── Comparison/
│   │   └── invariant-ekf/           # MATLAB mixed-compilation workflow for invariant-ekf
│   └── ...                          # optional test datasets are published via GitHub Releases
├── Plotjuggler.xml
└── Readme.md
```



🧩 Architecture Notes

This repository is intentionally split into three layers:

(README truncated at 100 lines in this view; see the full file in the repository.)

No version for distro jazzy showing github. Known supported distros are highlighted in the buttons above.
Repo symbol

ros2go2estimator repository

fusion_estimator

ROS Distro
github

Repository Summary

Description
Checkout URI https://github.com/shineminxing/ros2go2estimator.git
VCS Type git
VCS Version main
Last Updated 2026-04-20
Dev Status UNKNOWN
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

Name Version
fusion_estimator 0.0.0

README

CAPO-LeggedRobotOdometry 🦾 License

Language / θ―­θ¨€οΌš English δΈ­ζ–‡

CAPO-LeggedRobotOdometry is a pure proprioceptive odometry library for legged robots, implemented with a portable C++ estimator core that depends only on IMU and joint motor data.

The core estimation logic is implemented in FusionEstimator/fusion_estimator.h. The file fusion_estimator_node.cpp provides a ROS 2 wrapper around this estimator, while the Matlab/ folder contains examples for MATLAB + C++ mixed compilation and offline evaluation.

For side-by-side comparison, Matlab/Comparison/invariant-ekf/ provides a MATLAB mixed-compilation workflow for invariant-ekf, making it easier to compare this repository against a representative open-source legged odometry baseline.

In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.


πŸ“„ Paper

Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)

  • Paper: https://arxiv.org/abs/2602.17393

If you use this repository in research, please consider citing the paper.


πŸ“¦ Data Sharing (Go2-EDU ROS bags)

To help readers quickly validate the pipeline, we provide Go2-EDU trial datasets, including real-world videos, derived CSV files, and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.

  • Download (GitHub Releases): https://github.com/ShineMinxing/CAPO-LeggedRobotOdometry/releases/tag/DataForTest
  • Recommended assets in that release include GO2Flat, GO2Stairs, MPXY150Z10, MWXY150Z10, robot_flat_1_compress.zip, and robot_stairs_1_compress.zip

Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.


✨ Key Features

Category Description
Biped / Quadruped / Wheel-Legged Unified Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking.
IMU + Joint-Motor Only The estimator core works with only IMU and joint motor measurements, without requiring cameras or LiDAR.
MATLAB / C++ Mixed Compilation The Matlab/ folder provides MATLAB + MEX examples for calling the same C++ core, and Matlab/Comparison/invariant-ekf/ includes a comparable mixed-compilation setup for invariant-ekf.
Full 3D & Planar 2D Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D).
Portable Pure C++ Core The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2.
Runtime Tuning Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots.

Scope Repository Summary
Low-level / Drivers https://github.com/ShineMinxing/Ros2Go2Base DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities
Odometry CAPO-LeggedRobotOdometry (this repo) Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
SLAM / Mapping https://github.com/ShineMinxing/Ros2SLAM Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO
Voice / LLM https://github.com/ShineMinxing/Ros2Chat Offline ASR + OpenAI Chat + TTS
Vision https://github.com/ShineMinxing/Ros2ImageProcess Camera pipelines, spot / face / drone detection
Gimbal Tracking https://github.com/ShineMinxing/Ros2AmovG1 Amov G1 gimbal control and tracking
Tools https://github.com/ShineMinxing/Ros2Tools Bluetooth IMU, joystick mapping, gimbal loop, data logging

⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.


πŸ“‚ Repository Layout

CAPO-LeggedRobotOdometry/
β”œβ”€β”€ CMakeLists.txt
β”œβ”€β”€ package.xml
β”œβ”€β”€ config.yaml
β”œβ”€β”€ fusion_estimator_node.cpp        # ROS2 wrapper around the C++ estimator core
β”œβ”€β”€ FusionEstimator/                 # portable pure C++ estimator core
β”‚   β”œβ”€β”€ Estimators/
β”‚   β”œβ”€β”€ fusion_estimator.h           # main estimator entry
β”‚   β”œβ”€β”€ LowlevelState.h
β”‚   β”œβ”€β”€ SensorBase.cpp
β”‚   β”œβ”€β”€ SensorBase.h
β”‚   β”œβ”€β”€ Sensor_IMU.cpp
β”‚   β”œβ”€β”€ Sensor_IMU.h
β”‚   β”œβ”€β”€ Sensor_Legs.cpp
β”‚   β”œβ”€β”€ Sensor_Legs.h
β”‚   └── Readme.md
β”œβ”€β”€ Matlab/                          # MATLAB + MEX examples for the same C++ core
β”‚   β”œβ”€β”€ build_mex.m
β”‚   β”œβ”€β”€ fusion_estimator.m
β”‚   β”œβ”€β”€ fusion_estimator_mex.cpp
β”‚   β”œβ”€β”€ Comparison/
β”‚   β”‚   └── invariant-ekf/           # MATLAB mixed-compilation workflow for invariant-ekf
β”‚   └── ...                          # optional test datasets are published via GitHub Releases
β”œβ”€β”€ Plotjuggler.xml
└── Readme.md



🧩 Architecture Notes

This repository is intentionally split into three layers:

File truncated at 100 lines see the full file

No version for distro kilted showing github. Known supported distros are highlighted in the buttons above.
Repo symbol

ros2go2estimator repository

fusion_estimator

ROS Distro
github

Repository Summary

Description
Checkout URI https://github.com/shineminxing/ros2go2estimator.git
VCS Type git
VCS Version main
Last Updated 2026-04-20
Dev Status UNKNOWN
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

Name Version
fusion_estimator 0.0.0

README

CAPO-LeggedRobotOdometry 🦾 License

Language / θ―­θ¨€οΌš English δΈ­ζ–‡

CAPO-LeggedRobotOdometry is a pure proprioceptive odometry library for legged robots, implemented with a portable C++ estimator core that depends only on IMU and joint motor data.

The core estimation logic is implemented in FusionEstimator/fusion_estimator.h. The file fusion_estimator_node.cpp provides a ROS 2 wrapper around this estimator, while the Matlab/ folder contains examples for MATLAB + C++ mixed compilation and offline evaluation.

For side-by-side comparison, Matlab/Comparison/invariant-ekf/ provides a MATLAB mixed-compilation workflow for invariant-ekf, making it easier to compare this repository against a representative open-source legged odometry baseline.

In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.


πŸ“„ Paper

Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)

  • Paper: https://arxiv.org/abs/2602.17393

If you use this repository in research, please consider citing the paper.


πŸ“¦ Data Sharing (Go2-EDU ROS bags)

To help readers quickly validate the pipeline, we provide Go2-EDU trial datasets, including real-world videos, derived CSV files, and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.

  • Download (GitHub Releases): https://github.com/ShineMinxing/CAPO-LeggedRobotOdometry/releases/tag/DataForTest
  • Recommended assets in that release include GO2Flat, GO2Stairs, MPXY150Z10, MWXY150Z10, robot_flat_1_compress.zip, and robot_stairs_1_compress.zip

Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.


✨ Key Features

Category Description
Biped / Quadruped / Wheel-Legged Unified Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking.
IMU + Joint-Motor Only The estimator core works with only IMU and joint motor measurements, without requiring cameras or LiDAR.
MATLAB / C++ Mixed Compilation The Matlab/ folder provides MATLAB + MEX examples for calling the same C++ core, and Matlab/Comparison/invariant-ekf/ includes a comparable mixed-compilation setup for invariant-ekf.
Full 3D & Planar 2D Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D).
Portable Pure C++ Core The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2.
Runtime Tuning Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots.

Scope Repository Summary
Low-level / Drivers https://github.com/ShineMinxing/Ros2Go2Base DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities
Odometry CAPO-LeggedRobotOdometry (this repo) Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
SLAM / Mapping https://github.com/ShineMinxing/Ros2SLAM Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO
Voice / LLM https://github.com/ShineMinxing/Ros2Chat Offline ASR + OpenAI Chat + TTS
Vision https://github.com/ShineMinxing/Ros2ImageProcess Camera pipelines, spot / face / drone detection
Gimbal Tracking https://github.com/ShineMinxing/Ros2AmovG1 Amov G1 gimbal control and tracking
Tools https://github.com/ShineMinxing/Ros2Tools Bluetooth IMU, joystick mapping, gimbal loop, data logging

⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.


πŸ“‚ Repository Layout

CAPO-LeggedRobotOdometry/
β”œβ”€β”€ CMakeLists.txt
β”œβ”€β”€ package.xml
β”œβ”€β”€ config.yaml
β”œβ”€β”€ fusion_estimator_node.cpp        # ROS2 wrapper around the C++ estimator core
β”œβ”€β”€ FusionEstimator/                 # portable pure C++ estimator core
β”‚   β”œβ”€β”€ Estimators/
β”‚   β”œβ”€β”€ fusion_estimator.h           # main estimator entry
β”‚   β”œβ”€β”€ LowlevelState.h
β”‚   β”œβ”€β”€ SensorBase.cpp
β”‚   β”œβ”€β”€ SensorBase.h
β”‚   β”œβ”€β”€ Sensor_IMU.cpp
β”‚   β”œβ”€β”€ Sensor_IMU.h
β”‚   β”œβ”€β”€ Sensor_Legs.cpp
β”‚   β”œβ”€β”€ Sensor_Legs.h
β”‚   └── Readme.md
β”œβ”€β”€ Matlab/                          # MATLAB + MEX examples for the same C++ core
β”‚   β”œβ”€β”€ build_mex.m
β”‚   β”œβ”€β”€ fusion_estimator.m
β”‚   β”œβ”€β”€ fusion_estimator_mex.cpp
β”‚   β”œβ”€β”€ Comparison/
β”‚   β”‚   └── invariant-ekf/           # MATLAB mixed-compilation workflow for invariant-ekf
β”‚   └── ...                          # optional test datasets are published via GitHub Releases
β”œβ”€β”€ Plotjuggler.xml
└── Readme.md



🧩 Architecture Notes

This repository is intentionally split into three layers:

File truncated at 100 lines see the full file

No version for distro rolling showing github. Known supported distros are highlighted in the buttons above.
Repo symbol

ros2go2estimator repository

fusion_estimator

ROS Distro
github

Repository Summary

Description
Checkout URI https://github.com/shineminxing/ros2go2estimator.git
VCS Type git
VCS Version main
Last Updated 2026-04-20
Dev Status UNKNOWN
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

Name Version
fusion_estimator 0.0.0

README

CAPO-LeggedRobotOdometry 🦾 License

Language / θ―­θ¨€οΌš English δΈ­ζ–‡

CAPO-LeggedRobotOdometry is a pure proprioceptive odometry library for legged robots, implemented with a portable C++ estimator core that depends only on IMU and joint motor data.

The core estimation logic is implemented in FusionEstimator/fusion_estimator.h. The file fusion_estimator_node.cpp provides a ROS 2 wrapper around this estimator, while the Matlab/ folder contains examples for MATLAB + C++ mixed compilation and offline evaluation.

For side-by-side comparison, Matlab/Comparison/invariant-ekf/ provides a MATLAB mixed-compilation workflow for invariant-ekf, making it easier to compare this repository against a representative open-source legged odometry baseline.

In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.


πŸ“„ Paper

Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)

  • Paper: https://arxiv.org/abs/2602.17393

If you use this repository in research, please consider citing the paper.


πŸ“¦ Data Sharing (Go2-EDU ROS bags)

To help readers quickly validate the pipeline, we provide Go2-EDU trial datasets, including real-world videos, derived CSV files, and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.

  • Download (GitHub Releases): https://github.com/ShineMinxing/CAPO-LeggedRobotOdometry/releases/tag/DataForTest
  • Recommended assets in that release include GO2Flat, GO2Stairs, MPXY150Z10, MWXY150Z10, robot_flat_1_compress.zip, and robot_stairs_1_compress.zip

Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.


✨ Key Features

Category Description
Biped / Quadruped / Wheel-Legged Unified Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking.
IMU + Joint-Motor Only The estimator core works with only IMU and joint motor measurements, without requiring cameras or LiDAR.
MATLAB / C++ Mixed Compilation The Matlab/ folder provides MATLAB + MEX examples for calling the same C++ core, and Matlab/Comparison/invariant-ekf/ includes a comparable mixed-compilation setup for invariant-ekf.
Full 3D & Planar 2D Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D).
Portable Pure C++ Core The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2.
Runtime Tuning Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots.

Scope Repository Summary
Low-level / Drivers https://github.com/ShineMinxing/Ros2Go2Base DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities
Odometry CAPO-LeggedRobotOdometry (this repo) Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
SLAM / Mapping https://github.com/ShineMinxing/Ros2SLAM Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO
Voice / LLM https://github.com/ShineMinxing/Ros2Chat Offline ASR + OpenAI Chat + TTS
Vision https://github.com/ShineMinxing/Ros2ImageProcess Camera pipelines, spot / face / drone detection
Gimbal Tracking https://github.com/ShineMinxing/Ros2AmovG1 Amov G1 gimbal control and tracking
Tools https://github.com/ShineMinxing/Ros2Tools Bluetooth IMU, joystick mapping, gimbal loop, data logging

⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.


πŸ“‚ Repository Layout

CAPO-LeggedRobotOdometry/
β”œβ”€β”€ CMakeLists.txt
β”œβ”€β”€ package.xml
β”œβ”€β”€ config.yaml
β”œβ”€β”€ fusion_estimator_node.cpp        # ROS2 wrapper around the C++ estimator core
β”œβ”€β”€ FusionEstimator/                 # portable pure C++ estimator core
β”‚   β”œβ”€β”€ Estimators/
β”‚   β”œβ”€β”€ fusion_estimator.h           # main estimator entry
β”‚   β”œβ”€β”€ LowlevelState.h
β”‚   β”œβ”€β”€ SensorBase.cpp
β”‚   β”œβ”€β”€ SensorBase.h
β”‚   β”œβ”€β”€ Sensor_IMU.cpp
β”‚   β”œβ”€β”€ Sensor_IMU.h
β”‚   β”œβ”€β”€ Sensor_Legs.cpp
β”‚   β”œβ”€β”€ Sensor_Legs.h
β”‚   └── Readme.md
β”œβ”€β”€ Matlab/                          # MATLAB + MEX examples for the same C++ core
β”‚   β”œβ”€β”€ build_mex.m
β”‚   β”œβ”€β”€ fusion_estimator.m
β”‚   β”œβ”€β”€ fusion_estimator_mex.cpp
β”‚   β”œβ”€β”€ Comparison/
β”‚   β”‚   └── invariant-ekf/           # MATLAB mixed-compilation workflow for invariant-ekf
β”‚   └── ...                          # optional test datasets are published via GitHub Releases
β”œβ”€β”€ Plotjuggler.xml
└── Readme.md



🧩 Architecture Notes

This repository is intentionally split into three layers:

File truncated at 100 lines see the full file

Repo symbol

ros2go2estimator repository

fusion_estimator

ROS Distro
github

Repository Summary

Description
Checkout URI https://github.com/shineminxing/ros2go2estimator.git
VCS Type git
VCS Version main
Last Updated 2026-04-20
Dev Status UNKNOWN
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

Name Version
fusion_estimator 0.0.0

README

CAPO-LeggedRobotOdometry 🦾 License

Language / θ―­θ¨€οΌš English δΈ­ζ–‡

CAPO-LeggedRobotOdometry is a pure proprioceptive odometry library for legged robots, implemented with a portable C++ estimator core that depends only on IMU and joint motor data.

The core estimation logic is implemented in FusionEstimator/fusion_estimator.h. The file fusion_estimator_node.cpp provides a ROS 2 wrapper around this estimator, while the Matlab/ folder contains examples for MATLAB + C++ mixed compilation and offline evaluation.

For side-by-side comparison, Matlab/Comparison/invariant-ekf/ provides a MATLAB mixed-compilation workflow for invariant-ekf, making it easier to compare this repository against a representative open-source legged odometry baseline.

In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.


πŸ“„ Paper

Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)

  • Paper: https://arxiv.org/abs/2602.17393

If you use this repository in research, please consider citing the paper.


πŸ“¦ Data Sharing (Go2-EDU ROS bags)

To help readers quickly validate the pipeline, we provide Go2-EDU trial datasets, including real-world videos, derived CSV files, and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.

  • Download (GitHub Releases): https://github.com/ShineMinxing/CAPO-LeggedRobotOdometry/releases/tag/DataForTest
  • Recommended assets in that release include GO2Flat, GO2Stairs, MPXY150Z10, MWXY150Z10, robot_flat_1_compress.zip, and robot_stairs_1_compress.zip

Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.


✨ Key Features

Category Description
Biped / Quadruped / Wheel-Legged Unified Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking.
IMU + Joint-Motor Only The estimator core works with only IMU and joint motor measurements, without requiring cameras or LiDAR.
MATLAB / C++ Mixed Compilation The Matlab/ folder provides MATLAB + MEX examples for calling the same C++ core, and Matlab/Comparison/invariant-ekf/ includes a comparable mixed-compilation setup for invariant-ekf.
Full 3D & Planar 2D Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D).
Portable Pure C++ Core The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2.
Runtime Tuning Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots.

Scope Repository Summary
Low-level / Drivers https://github.com/ShineMinxing/Ros2Go2Base DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities
Odometry CAPO-LeggedRobotOdometry (this repo) Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
SLAM / Mapping https://github.com/ShineMinxing/Ros2SLAM Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO
Voice / LLM https://github.com/ShineMinxing/Ros2Chat Offline ASR + OpenAI Chat + TTS
Vision https://github.com/ShineMinxing/Ros2ImageProcess Camera pipelines, spot / face / drone detection
Gimbal Tracking https://github.com/ShineMinxing/Ros2AmovG1 Amov G1 gimbal control and tracking
Tools https://github.com/ShineMinxing/Ros2Tools Bluetooth IMU, joystick mapping, gimbal loop, data logging

⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.


πŸ“‚ Repository Layout

CAPO-LeggedRobotOdometry/
β”œβ”€β”€ CMakeLists.txt
β”œβ”€β”€ package.xml
β”œβ”€β”€ config.yaml
β”œβ”€β”€ fusion_estimator_node.cpp        # ROS2 wrapper around the C++ estimator core
β”œβ”€β”€ FusionEstimator/                 # portable pure C++ estimator core
β”‚   β”œβ”€β”€ Estimators/
β”‚   β”œβ”€β”€ fusion_estimator.h           # main estimator entry
β”‚   β”œβ”€β”€ LowlevelState.h
β”‚   β”œβ”€β”€ SensorBase.cpp
β”‚   β”œβ”€β”€ SensorBase.h
β”‚   β”œβ”€β”€ Sensor_IMU.cpp
β”‚   β”œβ”€β”€ Sensor_IMU.h
β”‚   β”œβ”€β”€ Sensor_Legs.cpp
β”‚   β”œβ”€β”€ Sensor_Legs.h
β”‚   └── Readme.md
β”œβ”€β”€ Matlab/                          # MATLAB + MEX examples for the same C++ core
β”‚   β”œβ”€β”€ build_mex.m
β”‚   β”œβ”€β”€ fusion_estimator.m
β”‚   β”œβ”€β”€ fusion_estimator_mex.cpp
β”‚   β”œβ”€β”€ Comparison/
β”‚   β”‚   └── invariant-ekf/           # MATLAB mixed-compilation workflow for invariant-ekf
β”‚   └── ...                          # optional test datasets are published via GitHub Releases
β”œβ”€β”€ Plotjuggler.xml
└── Readme.md



🧩 Architecture Notes

This repository is intentionally split into three layers:

File truncated at 100 lines see the full file

No version for distro galactic showing github. Known supported distros are highlighted in the buttons above.
Repo symbol

ros2go2estimator repository

fusion_estimator

ROS Distro
github

Repository Summary

Description
Checkout URI https://github.com/shineminxing/ros2go2estimator.git
VCS Type git
VCS Version main
Last Updated 2026-04-20
Dev Status UNKNOWN
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

Name Version
fusion_estimator 0.0.0

README

CAPO-LeggedRobotOdometry 🦾 License

Language / θ―­θ¨€οΌš English δΈ­ζ–‡

CAPO-LeggedRobotOdometry is a pure proprioceptive odometry library for legged robots, implemented with a portable C++ estimator core that depends only on IMU and joint motor data.

The core estimation logic is implemented in FusionEstimator/fusion_estimator.h. The file fusion_estimator_node.cpp provides a ROS 2 wrapper around this estimator, while the Matlab/ folder contains examples for MATLAB + C++ mixed compilation and offline evaluation.

For side-by-side comparison, Matlab/Comparison/invariant-ekf/ provides a MATLAB mixed-compilation workflow for invariant-ekf, making it easier to compare this repository against a representative open-source legged odometry baseline.

In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.


πŸ“„ Paper

Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)

  • Paper: https://arxiv.org/abs/2602.17393

If you use this repository in research, please consider citing the paper.


πŸ“¦ Data Sharing (Go2-EDU ROS bags)

To help readers quickly validate the pipeline, we provide Go2-EDU trial datasets, including real-world videos, derived CSV files, and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.

  • Download (GitHub Releases): https://github.com/ShineMinxing/CAPO-LeggedRobotOdometry/releases/tag/DataForTest
  • Recommended assets in that release include GO2Flat, GO2Stairs, MPXY150Z10, MWXY150Z10, robot_flat_1_compress.zip, and robot_stairs_1_compress.zip

Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.


✨ Key Features

Category Description
Biped / Quadruped / Wheel-Legged Unified Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking.
IMU + Joint-Motor Only The estimator core works with only IMU and joint motor measurements, without requiring cameras or LiDAR.
MATLAB / C++ Mixed Compilation The Matlab/ folder provides MATLAB + MEX examples for calling the same C++ core, and Matlab/Comparison/invariant-ekf/ includes a comparable mixed-compilation setup for invariant-ekf.
Full 3D & Planar 2D Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D).
Portable Pure C++ Core The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2.
Runtime Tuning Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots.

Scope Repository Summary
Low-level / Drivers https://github.com/ShineMinxing/Ros2Go2Base DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities
Odometry CAPO-LeggedRobotOdometry (this repo) Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
SLAM / Mapping https://github.com/ShineMinxing/Ros2SLAM Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO
Voice / LLM https://github.com/ShineMinxing/Ros2Chat Offline ASR + OpenAI Chat + TTS
Vision https://github.com/ShineMinxing/Ros2ImageProcess Camera pipelines, spot / face / drone detection
Gimbal Tracking https://github.com/ShineMinxing/Ros2AmovG1 Amov G1 gimbal control and tracking
Tools https://github.com/ShineMinxing/Ros2Tools Bluetooth IMU, joystick mapping, gimbal loop, data logging

⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.


πŸ“‚ Repository Layout

CAPO-LeggedRobotOdometry/
β”œβ”€β”€ CMakeLists.txt
β”œβ”€β”€ package.xml
β”œβ”€β”€ config.yaml
β”œβ”€β”€ fusion_estimator_node.cpp        # ROS2 wrapper around the C++ estimator core
β”œβ”€β”€ FusionEstimator/                 # portable pure C++ estimator core
β”‚   β”œβ”€β”€ Estimators/
β”‚   β”œβ”€β”€ fusion_estimator.h           # main estimator entry
β”‚   β”œβ”€β”€ LowlevelState.h
β”‚   β”œβ”€β”€ SensorBase.cpp
β”‚   β”œβ”€β”€ SensorBase.h
β”‚   β”œβ”€β”€ Sensor_IMU.cpp
β”‚   β”œβ”€β”€ Sensor_IMU.h
β”‚   β”œβ”€β”€ Sensor_Legs.cpp
β”‚   β”œβ”€β”€ Sensor_Legs.h
β”‚   └── Readme.md
β”œβ”€β”€ Matlab/                          # MATLAB + MEX examples for the same C++ core
β”‚   β”œβ”€β”€ build_mex.m
β”‚   β”œβ”€β”€ fusion_estimator.m
β”‚   β”œβ”€β”€ fusion_estimator_mex.cpp
β”‚   β”œβ”€β”€ Comparison/
β”‚   β”‚   └── invariant-ekf/           # MATLAB mixed-compilation workflow for invariant-ekf
β”‚   └── ...                          # optional test datasets are published via GitHub Releases
β”œβ”€β”€ Plotjuggler.xml
└── Readme.md



🧩 Architecture Notes

This repository is intentionally split into three layers: the portable C++ estimator core (FusionEstimator/), the ROS 2 node wrapper (fusion_estimator_node.cpp), and the MATLAB/MEX bindings (Matlab/).

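To give a feel for what the proprioceptive core computes, here is a heavily simplified planar sketch of the contact-anchoring idea from the paper (hypothetical helper, plain Python, not the repository's code): while a stance foot is assumed fixed in the world, the change of its body-frame position, obtained from leg forward kinematics, mirrors the body's own displacement.

```python
import math

def body_delta_from_stance_foot(p_foot_body_prev, p_foot_body_now, yaw):
    """Hypothetical 2D illustration of contact anchoring (not the
    repository's API): a stationary stance foot that appears to move
    backward in the body frame implies the body moved forward."""
    dx_b = p_foot_body_prev[0] - p_foot_body_now[0]
    dy_b = p_foot_body_prev[1] - p_foot_body_now[1]
    # Rotate the body-frame displacement into the world frame.
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * dx_b - s * dy_b, s * dx_b + c * dy_b)

# Foot slides 0.1 m backward in the body frame at zero yaw:
dx, dy = body_delta_from_stance_foot((0.3, 0.1), (0.2, 0.1), 0.0)
```

The real estimator additionally fuses IMU orientation and handles stance-set switching across multiple legs, but the anchoring principle is the same.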
