This package is unreleased in all ROS distros; the information below reflects the GitHub main branch.

ros2go2estimator repository

fusion_estimator


Repository Summary

Description: (none)
Checkout URI: https://github.com/shineminxing/ros2go2estimator.git
VCS Type: git
VCS Version: main
Last Updated: 2026-04-02
Dev Status: UNKNOWN
Released: UNRELEASED
Contributing: Help Wanted (-), Good First Issues (-), Pull Requests to Review (-)

Packages

fusion_estimator (version 0.0.0)

README

CAPO-LeggedRobotOdometry 🦾

Language / 语言: English | 中文

CAPO-LeggedRobotOdometry is a ROS 2 (Humble) proprioceptive odometry / state-estimation repository for biped / quadruped / wheel-legged robots on Ubuntu 22.04.

It provides high-accuracy odometry using only IMU + joint encoders + foot contact/force signals, without requiring cameras or LiDAR.

In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.

At the repository level, fusion_estimator_node.cpp is mainly the ROS2 wrapper (topics, parameters, message conversion, publishing), while the actual fusion-estimation algorithm lives in FusionEstimator/ as a portable, pure C++ implementation. This makes it straightforward to port the estimator to ROS1, non-ROS2 applications, or embedded platforms.

In addition, the Matlab/ folder provides a MATLAB + MEX example that compiles and calls the same C++ estimator core for offline validation, algorithm analysis, and cross-platform reuse.


πŸ“„ Paper

Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)

  • Paper: https://arxiv.org/abs/2602.17393

If you use this repository in research, please consider citing the paper.


πŸ“¦ Data Sharing (Go2-EDU ROS bags)

To help readers quickly validate the pipeline, we provide two Go2-EDU trial datasets, including real-world videos and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.

  • Download (Google Drive): https://drive.google.com/drive/folders/1FfVO69rfmUu6B9crPhZCfKf9wFnV4L7n?usp=sharing

Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.


✨ Key Features

  • Biped / Quadruped / Wheel-Legged Unified: Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking.
  • Full 3D & Planar 2D: Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D).
  • No Exteroception Required: Works without cameras or LiDAR; only IMU, joint encoders, and foot contact/force signals are required.
  • Portable Pure C++ Core: The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2.
  • MATLAB / MEX Validation: The Matlab/ folder demonstrates how to compile and invoke the same C++ core from MATLAB.
  • Runtime Tuning: Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots.

  • Low-level / Drivers: https://github.com/ShineMinxing/Ros2Go2Base (DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities)
  • Odometry: CAPO-LeggedRobotOdometry (this repo) (pure proprioceptive fusion; publishes SMX/Odom and SMX/Odom_2D)
  • SLAM / Mapping: https://github.com/ShineMinxing/Ros2SLAM (integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO)
  • Voice / LLM: https://github.com/ShineMinxing/Ros2Chat (offline ASR + OpenAI Chat + TTS)
  • Vision: https://github.com/ShineMinxing/Ros2ImageProcess (camera pipelines; spot / face / drone detection)
  • Gimbal Tracking: https://github.com/ShineMinxing/Ros2AmovG1 (Amov G1 gimbal control and tracking)
  • Tools: https://github.com/ShineMinxing/Ros2Tools (Bluetooth IMU, joystick mapping, gimbal loop, data logging)

⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.


πŸ“‚ Repository Layout

CAPO-LeggedRobotOdometry/
β”œβ”€β”€ CMakeLists.txt
β”œβ”€β”€ package.xml
β”œβ”€β”€ config.yaml
β”œβ”€β”€ fusion_estimator_node.cpp        # ROS2 node wrapper: params / topics / odom publishing
β”œβ”€β”€ FusionEstimator/                 # pure C++ fusion-estimation core (portable)
β”‚   β”œβ”€β”€ Estimators/
β”‚   β”œβ”€β”€ fusion_estimator.h
β”‚   β”œβ”€β”€ LowlevelState.h
β”‚   β”œβ”€β”€ SensorBase.cpp
β”‚   β”œβ”€β”€ SensorBase.h
β”‚   β”œβ”€β”€ Sensor_IMU.cpp
β”‚   β”œβ”€β”€ Sensor_IMU.h
β”‚   β”œβ”€β”€ Sensor_Legs.cpp
β”‚   β”œβ”€β”€ Sensor_Legs.h
β”‚   └── Readme.md
β”œβ”€β”€ Matlab/                          # MATLAB + MEX example for calling the C++ core
β”‚   β”œβ”€β”€ build_mex.m
β”‚   β”œβ”€β”€ fusion_estimator.m
β”‚   β”œβ”€β”€ fusion_estimator_mex.cpp
β”‚   β”œβ”€β”€ MPXY150Z10.csv
β”‚   └── MWXY150Z10.csv
β”œβ”€β”€ Plotjuggler.xml                  # PlotJuggler layout / visualization helper
└── Readme.md


🧩 Architecture Notes

This repository is intentionally split into two layers:

(README truncated at 100 lines; see the full file in the repository.)

Full 3D & Planar 2D Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D).
No Exteroception Required Works without cameras or LiDAR; only IMU, joint encoders, and foot contact/force signals are required.
Portable Pure C++ Core The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2.
MATLAB / MEX Validation The Matlab/ folder demonstrates how to compile and invoke the same C++ core from MATLAB.
Runtime Tuning Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots.

Scope Repository Summary
Low-level / Drivers https://github.com/ShineMinxing/Ros2Go2Base DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities
Odometry CAPO-LeggedRobotOdometry (this repo) Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
SLAM / Mapping https://github.com/ShineMinxing/Ros2SLAM Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO
Voice / LLM https://github.com/ShineMinxing/Ros2Chat Offline ASR + OpenAI Chat + TTS
Vision https://github.com/ShineMinxing/Ros2ImageProcess Camera pipelines, spot / face / drone detection
Gimbal Tracking https://github.com/ShineMinxing/Ros2AmovG1 Amov G1 gimbal control and tracking
Tools https://github.com/ShineMinxing/Ros2Tools Bluetooth IMU, joystick mapping, gimbal loop, data logging

⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.


πŸ“‚ Repository Layout

CAPO-LeggedRobotOdometry/
β”œβ”€β”€ CMakeLists.txt
β”œβ”€β”€ package.xml
β”œβ”€β”€ config.yaml
β”œβ”€β”€ fusion_estimator_node.cpp        # ROS2 node wrapper: params / topics / odom publishing
β”œβ”€β”€ FusionEstimator/                 # pure C++ fusion-estimation core (portable)
β”‚   β”œβ”€β”€ Estimators/
β”‚   β”œβ”€β”€ fusion_estimator.h
β”‚   β”œβ”€β”€ LowlevelState.h
β”‚   β”œβ”€β”€ SensorBase.cpp
β”‚   β”œβ”€β”€ SensorBase.h
β”‚   β”œβ”€β”€ Sensor_IMU.cpp
β”‚   β”œβ”€β”€ Sensor_IMU.h
β”‚   β”œβ”€β”€ Sensor_Legs.cpp
β”‚   β”œβ”€β”€ Sensor_Legs.h
β”‚   └── Readme.md
β”œβ”€β”€ Matlab/                          # MATLAB + MEX example for calling the C++ core
β”‚   β”œβ”€β”€ build_mex.m
β”‚   β”œβ”€β”€ fusion_estimator.m
β”‚   β”œβ”€β”€ fusion_estimator_mex.cpp
β”‚   β”œβ”€β”€ MPXY150Z10.csv
β”‚   └── MWXY150Z10.csv
β”œβ”€β”€ Plotjuggler.xml                  # PlotJuggler layout / visualization helper
└── Readme.md


🧩 Architecture Notes

This repository is intentionally split into two layers:

File truncated at 100 lines see the full file

No version for distro melodic showing github. Known supported distros are highlighted in the buttons above.
Repo symbol

ros2go2estimator repository

fusion_estimator

ROS Distro
github

Repository Summary

Description
Checkout URI https://github.com/shineminxing/ros2go2estimator.git
VCS Type git
VCS Version main
Last Updated 2026-04-02
Dev Status UNKNOWN
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

Name Version
fusion_estimator 0.0.0

README

CAPO-LeggedRobotOdometry 🦾 License

Language / θ―­θ¨€οΌš English δΈ­ζ–‡

CAPO-LeggedRobotOdometry is a ROS 2 (Humble) proprioceptive odometry / state-estimation repository for biped / quadruped / wheel-legged robots on Ubuntu 22.04.

It provides high-accuracy odometry using only IMU + joint encoders + foot contact/force signals, without requiring cameras or LiDAR.

In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.

At the repository level, fusion_estimator_node.cpp is mainly the ROS2 wrapper (topics, parameters, message conversion, publishing), while the actual fusion-estimation algorithm lives in FusionEstimator/ as a portable, pure C++ implementation. This makes it straightforward to port the estimator to ROS1, non-ROS2 applications, or embedded platforms.

In addition, the Matlab/ folder provides a MATLAB + MEX example that compiles and calls the same C++ estimator core for offline validation, algorithm analysis, and cross-platform reuse.


πŸ“„ Paper

Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)

  • Paper: https://arxiv.org/abs/2602.17393

If you use this repository in research, please consider citing the paper.


πŸ“¦ Data Sharing (Go2-EDU ROS bags)

To help readers quickly validate the pipeline, we provide two Go2-EDU trial datasets, including real-world videos and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.

  • Download (Google Drive): https://drive.google.com/drive/folders/1FfVO69rfmUu6B9crPhZCfKf9wFnV4L7n?usp=sharing

Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.


✨ Key Features

Category Description
Biped / Quadruped / Wheel-Legged Unified Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking.
Full 3D & Planar 2D Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D).
No Exteroception Required Works without cameras or LiDAR; only IMU, joint encoders, and foot contact/force signals are required.
Portable Pure C++ Core The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2.
MATLAB / MEX Validation The Matlab/ folder demonstrates how to compile and invoke the same C++ core from MATLAB.
Runtime Tuning Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots.

Scope Repository Summary
Low-level / Drivers https://github.com/ShineMinxing/Ros2Go2Base DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities
Odometry CAPO-LeggedRobotOdometry (this repo) Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
SLAM / Mapping https://github.com/ShineMinxing/Ros2SLAM Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO
Voice / LLM https://github.com/ShineMinxing/Ros2Chat Offline ASR + OpenAI Chat + TTS
Vision https://github.com/ShineMinxing/Ros2ImageProcess Camera pipelines, spot / face / drone detection
Gimbal Tracking https://github.com/ShineMinxing/Ros2AmovG1 Amov G1 gimbal control and tracking
Tools https://github.com/ShineMinxing/Ros2Tools Bluetooth IMU, joystick mapping, gimbal loop, data logging

⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.


πŸ“‚ Repository Layout

CAPO-LeggedRobotOdometry/
β”œβ”€β”€ CMakeLists.txt
β”œβ”€β”€ package.xml
β”œβ”€β”€ config.yaml
β”œβ”€β”€ fusion_estimator_node.cpp        # ROS2 node wrapper: params / topics / odom publishing
β”œβ”€β”€ FusionEstimator/                 # pure C++ fusion-estimation core (portable)
β”‚   β”œβ”€β”€ Estimators/
β”‚   β”œβ”€β”€ fusion_estimator.h
β”‚   β”œβ”€β”€ LowlevelState.h
β”‚   β”œβ”€β”€ SensorBase.cpp
β”‚   β”œβ”€β”€ SensorBase.h
β”‚   β”œβ”€β”€ Sensor_IMU.cpp
β”‚   β”œβ”€β”€ Sensor_IMU.h
β”‚   β”œβ”€β”€ Sensor_Legs.cpp
β”‚   β”œβ”€β”€ Sensor_Legs.h
β”‚   └── Readme.md
β”œβ”€β”€ Matlab/                          # MATLAB + MEX example for calling the C++ core
β”‚   β”œβ”€β”€ build_mex.m
β”‚   β”œβ”€β”€ fusion_estimator.m
β”‚   β”œβ”€β”€ fusion_estimator_mex.cpp
β”‚   β”œβ”€β”€ MPXY150Z10.csv
β”‚   └── MWXY150Z10.csv
β”œβ”€β”€ Plotjuggler.xml                  # PlotJuggler layout / visualization helper
└── Readme.md


🧩 Architecture Notes

This repository is intentionally split into two layers:

File truncated at 100 lines see the full file

No version for distro noetic showing github. Known supported distros are highlighted in the buttons above.
Repo symbol

ros2go2estimator repository

fusion_estimator

ROS Distro
github

Repository Summary

Description
Checkout URI https://github.com/shineminxing/ros2go2estimator.git
VCS Type git
VCS Version main
Last Updated 2026-04-02
Dev Status UNKNOWN
Released UNRELEASED
Contributing Help Wanted (-)
Good First Issues (-)
Pull Requests to Review (-)

Packages

Name Version
fusion_estimator 0.0.0

README

CAPO-LeggedRobotOdometry 🦾 License

Language / θ―­θ¨€οΌš English δΈ­ζ–‡

CAPO-LeggedRobotOdometry is a ROS 2 (Humble) proprioceptive odometry / state-estimation repository for biped / quadruped / wheel-legged robots on Ubuntu 22.04.

It provides high-accuracy odometry using only IMU + joint encoders + foot contact/force signals, without requiring cameras or LiDAR.

In 3D closed-loop trials (a 200 m horizontal and 15 m vertical loop), Astrall point-foot robot A achieves 0.1638 m horizontal error and 0.219 m vertical error; for wheel-legged robot B, the corresponding errors are 0.2264 m and 0.199 m.

At the repository level, fusion_estimator_node.cpp is mainly the ROS2 wrapper (topics, parameters, message conversion, publishing), while the actual fusion-estimation algorithm lives in FusionEstimator/ as a portable, pure C++ implementation. This makes it straightforward to port the estimator to ROS1, non-ROS2 applications, or embedded platforms.

In addition, the Matlab/ folder provides a MATLAB + MEX example that compiles and calls the same C++ estimator core for offline validation, algorithm analysis, and cross-platform reuse.


πŸ“„ Paper

Contact-Anchored Proprioceptive Odometry for Quadruped Robots (arXiv:2602.17393)

  • Paper: https://arxiv.org/abs/2602.17393

If you use this repository in research, please consider citing the paper.


πŸ“¦ Data Sharing (Go2-EDU ROS bags)

To help readers quickly validate the pipeline, we provide two Go2-EDU trial datasets, including real-world videos and the corresponding ROS bag topics/messages required by this node, enabling fast reproduction and sanity checks.

  • Download (Google Drive): https://drive.google.com/drive/folders/1FfVO69rfmUu6B9crPhZCfKf9wFnV4L7n?usp=sharing

Note: the IMU on this Go2-EDU platform exhibits noticeable yaw drift, so the odometry accuracy is generally worse than the results reported for Astrall robots A and B in the paper.


✨ Key Features

Category Description
Biped / Quadruped / Wheel-Legged Unified Online contact-set switching; stance legs are detected automatically, supporting fast transitions between standing and walking.
Full 3D & Planar 2D Publishes both 6DoF odometry (SMX/Odom) and a gravity-flattened 2D odometry (SMX/Odom_2D).
No Exteroception Required Works without cameras or LiDAR; only IMU, joint encoders, and foot contact/force signals are required.
Portable Pure C++ Core The estimator core is isolated in FusionEstimator/, making it easier to reuse outside ROS2.
MATLAB / MEX Validation The Matlab/ folder demonstrates how to compile and invoke the same C++ core from MATLAB.
Runtime Tuning Key parameters can be adjusted through config.yaml, and platform-dependent thresholds can be tuned for different robots.

Scope Repository Summary
Low-level / Drivers https://github.com/ShineMinxing/Ros2Go2Base DDS bridge, Unitree SDK2 control, pointcloud→LaserScan, TF utilities
Odometry CAPO-LeggedRobotOdometry (this repo) Pure proprioceptive fusion, publishes SMX/Odom / SMX/Odom_2D
SLAM / Mapping https://github.com/ShineMinxing/Ros2SLAM Integrations for Cartographer 3D, KISS-ICP, FAST-LIO2, Point-LIO
Voice / LLM https://github.com/ShineMinxing/Ros2Chat Offline ASR + OpenAI Chat + TTS
Vision https://github.com/ShineMinxing/Ros2ImageProcess Camera pipelines, spot / face / drone detection
Gimbal Tracking https://github.com/ShineMinxing/Ros2AmovG1 Amov G1 gimbal control and tracking
Tools https://github.com/ShineMinxing/Ros2Tools Bluetooth IMU, joystick mapping, gimbal loop, data logging

⚠️ Clone as needed. If you only need state estimation, this repository is sufficient. For mapping, it is natural to pair it with Ros2SLAM and Ros2Go2Base.


πŸ“‚ Repository Layout

CAPO-LeggedRobotOdometry/
β”œβ”€β”€ CMakeLists.txt
β”œβ”€β”€ package.xml
β”œβ”€β”€ config.yaml
β”œβ”€β”€ fusion_estimator_node.cpp        # ROS2 node wrapper: params / topics / odom publishing
β”œβ”€β”€ FusionEstimator/                 # pure C++ fusion-estimation core (portable)
β”‚   β”œβ”€β”€ Estimators/
β”‚   β”œβ”€β”€ fusion_estimator.h
β”‚   β”œβ”€β”€ LowlevelState.h
β”‚   β”œβ”€β”€ SensorBase.cpp
β”‚   β”œβ”€β”€ SensorBase.h
β”‚   β”œβ”€β”€ Sensor_IMU.cpp
β”‚   β”œβ”€β”€ Sensor_IMU.h
β”‚   β”œβ”€β”€ Sensor_Legs.cpp
β”‚   β”œβ”€β”€ Sensor_Legs.h
β”‚   └── Readme.md
β”œβ”€β”€ Matlab/                          # MATLAB + MEX example for calling the C++ core
β”‚   β”œβ”€β”€ build_mex.m
β”‚   β”œβ”€β”€ fusion_estimator.m
β”‚   β”œβ”€β”€ fusion_estimator_mex.cpp
β”‚   β”œβ”€β”€ MPXY150Z10.csv
β”‚   └── MWXY150Z10.csv
β”œβ”€β”€ Plotjuggler.xml                  # PlotJuggler layout / visualization helper
└── Readme.md


🧩 Architecture Notes

This repository is intentionally split into two layers:

File truncated at 100 lines see the full file