Package Summary
| Tags | No category tags. |
| Version | 0.48.0 |
| License | Apache License 2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Description | |
| Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2025-12-03 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
| Contributing | Help Wanted (-) Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_calibration_status_classifier
Purpose
The autoware_calibration_status_classifier package provides real-time LiDAR-camera calibration validation using deep learning inference. It detects miscalibration between LiDAR and camera sensors by analyzing projected point clouds overlaid on camera images through a neural network-based approach.
Inner-workings / Algorithms
The calibration status detection system operates through the following pipeline:
1. Data Preprocessing (CUDA-accelerated)
- Image Undistortion: Corrects camera distortion
- Point Cloud Projection: Projects 3D LiDAR points onto the undistorted 2D image plane, adding depth and intensity information
- Morphological Dilation: Enhances point visibility for neural network input
2. Neural Network Inference (TensorRT)
- Input Format: 5-channel normalized data (RGB + depth + intensity)
- Architecture: Deep neural network trained on calibrated/miscalibrated data
- Output: Binary classification with confidence scores for calibration status
3. Runtime Modes
- MANUAL: On-demand validation via service calls
- PERIODIC: Regular validation at configurable intervals
- ACTIVE: Continuous monitoring with synchronized sensor data
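The projection step of the preprocessing stage can be sketched with a plain pinhole camera model. This is an illustrative sketch only, not the package's CUDA implementation: the intrinsics `fx`, `fy`, `cx`, `cy` correspond to values carried in `sensor_msgs::msg::CameraInfo`, and the point is assumed to already be transformed into the camera frame.

```python
def project_point(x, y, z, fx, fy, cx, cy):
    """Project a 3D point (camera frame, z forward) onto the image plane.

    Returns (u, v, depth) in pixels/meters, or None if the point lies
    behind the camera and cannot be projected.
    """
    if z <= 0.0:
        return None
    u = fx * x / z + cx  # horizontal pixel coordinate
    v = fy * y / z + cy  # vertical pixel coordinate
    return u, v, z       # depth is kept as an extra channel

# A point 10 m ahead and 1 m to the right, with fx=fy=1000, cx=960, cy=540:
print(project_point(1.0, 0.0, 10.0, 1000.0, 1000.0, 960.0, 540.0))
# → (1060.0, 540.0, 10.0)
```

In the actual pipeline, each projected point contributes its depth and intensity to two extra image channels, yielding the 5-channel (RGB + depth + intensity) network input described above.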
Inputs / Outputs
Input
| Name | Type | Description |
|---|---|---|
| `~/input/velocity` | Determined by the `prerequisite.velocity_source` parameter | Vehicle velocity (multiple message types supported) |
| `input.cloud_topics` | `sensor_msgs::msg::PointCloud2` | LiDAR point cloud data |
| `input.image_topics` | `sensor_msgs::msg::Image` | Camera image data (BGR8 format) |
| Camera info topics | `sensor_msgs::msg::CameraInfo` | Camera intrinsic parameters and distortion coefficients |
Output
| Name | Type | Description |
|---|---|---|
| `/diagnostics` | `diagnostic_msgs::msg::DiagnosticArray` | ROS diagnostics with calibration status |
| `~/validate_calibration_srv` | `std_srvs::srv::Trigger` | Manual validation service (MANUAL mode) |
| Preview image topics | `sensor_msgs::msg::Image` | Visualization images with projected points |
Services
| Name | Type | Description |
|---|---|---|
| `~/input/validate_calibration_srv` | `std_srvs::srv::Trigger` | Manual calibration validation request |
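In MANUAL mode, a one-shot validation is requested through the `std_srvs::srv::Trigger` service above. A sketch of the call follows; note that `~` resolves to the node's private namespace at runtime, so the fully resolved path (discoverable with `ros2 service list`) depends on how the node is launched. The command is echoed here so the sketch runs without a live ROS 2 graph.

```shell
# Service name and type as listed in the Services table above.
SRV='~/input/validate_calibration_srv'
TYPE='std_srvs/srv/Trigger'

# In a running system, substitute the resolved service path and execute
# the printed command directly.
echo "ros2 service call ${SRV} ${TYPE} '{}'"
```

The `Trigger` response carries a `success` flag and a human-readable `message`; detailed per-pair results are published on `/diagnostics`.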
Parameters
Node Parameters
{{ json_to_markdown("sensing/autoware_calibration_status_classifier/schema/calibration_status_classifier.schema.json") }}
Network Parameters
{{ json_to_markdown("sensing/autoware_calibration_status_classifier/schema/ml_package_calibration_status_classifier.schema.json") }}
Assumptions / Known Limits
- Input images must be in BGR8 format (8-bit per channel)
- Input point clouds should contain intensity information (XYZIRC format)
Usage Example
```shell
ros2 launch autoware_calibration_status_classifier calibration_status_classifier.launch.xml
```
Future Extensions / Unimplemented Parts
- Manual runtime mode with detailed response (custom srv)
- Replace the objects-in-scene counter filter with an objects-within-camera-FOV counter (raytracing)
- Multithreading for multiple camera-LiDAR pairs
- More filters (e.g. yaw rate)
- cuda_blackboard support
- Replace custom kernels with NPP functions where applicable
References
Changelog for package autoware_calibration_status_classifier
0.48.0 (2025-11-18)
- Merge remote-tracking branch 'origin/main' into humble
- fix: tf2 uses hpp headers in rolling (and is backported) (#11620)
- feat(autoware_calibration_status_classifier): add ML-based miscalibration detection module (#11222)
- feat(autoware_calibration_status): add ML-based miscalibration detection module
- feat(autoware_calibration_status): extended configuration and diagnostics
- fix(autoware_calibration_status): model's input array format
- test(autoware_calibration_status): inference only test
- style(pre-commit): autofix
- refactor(autoware_calibration_status): rename lidar_range to max_depth
- fix(autoware_calibration_status): add missing header
- feat(autoware_calibration_status): add naive number of objects filter
- feat(autoware_calibration_status): add periodic and manual mode
- refactor(autoware_calibration_status): improve image handling and optimize calibration pipeline. Refactors the calibration status module to handle both distorted and rectified images, reorganizes data structures, and optimizes the processing pipeline. Adds new utility classes for better camera/LiDAR information management.
- style(pre-commit): autofix
- style(autoware_calibration_status): pre-commit
- test(autoware_calibration_status): make that CI skip unit tests
- style(autoware_calibration_status): cspell
- test(autoware_calibration_status): skip test before loading data
- test(autoware_calibration_status): another yet attempt to fix CI
- style(autoware_calibration_status): cspell
- fix(autoware_calibration_status): correct types Co-authored-by: Manato Hirabayashi <3022416+manato@users.noreply.github.com>
- fix(autoware_calibration_status): correct types Co-authored-by: Manato Hirabayashi <3022416+manato@users.noreply.github.com>
- style(pre-commit): autofix
- fix(autoware_calibration_status): user desired function for cuda memory allocation
- style(autoware_calibration_status): early return instead of scoped implementation
- feat(autoware_calibration_status): use of __restrict__ keyword
- docs(autoware_calibration_status): update future work
- fix(autoware_calibration_status): include missing directory
- fix(autoware_calibration_status): use preallocated class member
- style(pre-commit): autofix
- style(autoware_calibration_status): use lambda for adding diagnostic
- style(autoware_calibration_status): split function
- style(pre-commit): autofix
- refactor(autoware_calibration_status): change atomic operation logic and extras
- refactor(autoware_calibration_status): use autoware diagnostic interface
- fix(autoware_calibration_status): cspell
- feat(autoware_calibration_status_classifier): rename autoware_calibration_status to autoware_calibration_status_classifier
- style(pre-commit): autofix
- fix(autoware_calibration_status_classifier): prevent potential race condition
- fix(autoware_calibration_status_classifier): add mutex for input msgs data access
- fix(autoware_calibration_status_classifier): pre-commit --------- Co-authored-by: pre-commit-ci-lite[bot] <117423508+pre-commit-ci-lite[bot]@users.noreply.github.com> Co-authored-by: Manato Hirabayashi <3022416+manato@users.noreply.github.com>
- Contributors: Amadeusz Szymko, Ryohsuke Mitsudome, Tim Clephas
Package Dependencies
System Dependencies
| Name |
|---|
| libopencv-dev |
Dependant Packages
Launch files
- launch/calibration_status_classifier.launch.xml
- input_velocity [default: /sensing/vehicle_velocity_converter/twist_with_covariance]
- input_objects [default: /perception/object_recognition/objects]
- validate_calibration_srv [default: validate_calibration]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: calibration_status_classifier]
- model_path [default: $(var data_path)/calibration_status_classifier]
- model_param_path [default: $(find-pkg-share autoware_calibration_status_classifier)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- build_only [default: false]
- use_sim_time [default: true]
- decompressor_param_file [default: $(find-pkg-share autoware_image_transport_decompressor)/config/image_transport_decompressor.param.yaml]
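Any of the arguments above can be overridden on the command line with `name:=value`. A hedged sketch follows; the `build_only` flag is assumed (by analogy with other Autoware ML nodes, not confirmed from this package's source) to prebuild the TensorRT engine rather than run the full pipeline. The command is echoed so the sketch runs without a ROS 2 environment.

```shell
# Illustrative launch-argument overrides; values are examples, not recommendations.
LAUNCH_ARGS="build_only:=true use_sim_time:=false"

# In a sourced ROS 2 workspace, run the printed command directly.
echo "ros2 launch autoware_calibration_status_classifier calibration_status_classifier.launch.xml ${LAUNCH_ARGS}"
```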
Messages
Services
Plugins
Recent questions tagged autoware_calibration_status_classifier at Robotics Stack Exchange
Package Summary
| Tags | No category tags. |
| Version | 0.48.0 |
| License | Apache License 2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Description | |
| Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2025-12-03 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
| Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_calibration_status_classifier
Purpose
The autoware_calibration_status_classifier package provides real-time LiDAR-camera calibration validation using deep learning inference. It detects miscalibration between LiDAR and camera sensors by analyzing projected point clouds overlaid on camera images through a neural network-based approach.
Inner-workings / Algorithms
The calibration status detection system operates through the following pipeline:
1. Data Preprocessing (CUDA-accelerated)
- Image Undistortion: Corrects camera distortion
- Point Cloud Projection: Projects 3D LiDAR points onto undistorted 2D image plane - adds depth and intensity information
- Morphological Dilation: Enhances point visibility for neural network input
2. Neural Network Inference (TensorRT)
- Input Format: 5-channel normalized data (RGB + depth + intensity)
- Architecture: Deep neural network trained on calibrated/miscalibrated data
- Output: Binary classification with confidence scores for calibration status
3. Runtime Modes
- MANUAL: On-demand validation via service calls
- PERIODIC: Regular validation at configurable intervals
- ACTIVE: Continuous monitoring with synchronized sensor data
Inputs / Outputs
Input
| Name | Type | Description |
|---|---|---|
~/input/velocity |
prerequisite.velocity_source parameter |
Vehicle velocity (multiple message types supported) |
input.cloud_topics |
sensor_msgs::msg::PointCloud2 |
LiDAR point cloud data |
input.image_topics |
sensor_msgs::msg::Image |
Camera image data (BGR8 format) |
| Camera info topics | sensor_msgs::msg::CameraInfo |
Camera intrinsic parameters and distortion coefficients |
Output
| Name | Type | Description |
|---|---|---|
/diagnostics |
diagnostic_msgs::msg::DiagnosticArray |
ROS diagnostics with calibration status |
~/validate_calibration_srv |
std_srvs::srv::Trigger |
Manual validation service (MANUAL mode) |
| Preview image topics | sensor_msgs::msg::Image |
Visualization images with projected points |
Services
| Name | Type | Description |
|---|---|---|
~/input/validate_calibration_srv |
std_srvs::srv::Trigger |
Manual calibration validation request |
Parameters
Node Parameters
{{ json_to_markdown(“sensing/autoware_calibration_status_classifier/schema/calibration_status_classifier.schema.json”) }}
Network Parameters
{{ json_to_markdown(“sensing/autoware_calibration_status_classifier/schema/ml_package_calibration_status_classifier.schema.json”) }}
Assumptions / Known Limits
- Input images must be in BGR8 format (8-bit per channel)
- Input point clouds should contain intensity information (XYZIRC format)
Usage Example
ros2 launch autoware_calibration_status_classifier calibration_status_classifier.launch.xml
Future Extensions / Unimplemented Parts
- Manual runtime mode with detailed response (custom srv)
- Replace filter for objects on the scene counter to objects within the camera FOV counter (raytracing)
- Multithreading for multiple camera-LiDAR pairs
- More filters (e.g. yaw rate)
- cuda_blackboard support
- Replace custom kernels with NPP functions where applicable
References
Changelog for package autoware_calibration_status_classifier
0.48.0 (2025-11-18)
-
Merge remote-tracking branch 'origin/main' into humble
-
fix: tf2 uses hpp headers in rolling (and is backported) (#11620)
-
feat(autoware_calibration_status_classifier): add ML-based miscalibration detection module (#11222)
- feat(autoware_calibration_status): add ML-based miscalibration detection module
- feat(autoware_calibration_status): extended configuration and diagnostics
- fix(autoware_calibration_status): model's input array format
- test(autoware_calibration_status): inference only test
- style(pre-commit): autofix
- refactor(autoware_calibration_status): rename lidar_range to max_depth
- fix(autoware_calibration_status): add missing header
- feat(autoware_calibration_status): add naive number of objects filter
- feat(autoware_calibration_status): add periodic and manual mode
* refactor(autoware_calibration_status): improve image handling and optimize calibration pipeline Refactors the calibration status module to handle both distorted and rectified images, reorganizes data structures, and optimizes the processing pipeline. Adds new utility classes for better camera/LiDAR information management.
- style(pre-commit): autofix
- style(autoware_calibration_status): pre-commit
- test(autoware_calibration_status): make that CI skip unit tests
- style(autoware_calibration_status): cspell
- test(autoware_calibration_status): skip test before loading data
- test(autoware_calibration_status): another yet attempt to fix CI
- style(autoware_calibration_status): cspell
* fix(autoware_calibration_status): correct types Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
* fix(autoware_calibration_status): correct types Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
- style(pre-commit): autofix
- fix(autoware_calibration_status): user desired function for cuda memory allocation
- style(autoware_calibration_status): early return instead of scoped implementaiton
- feat(autoware_calibration_status): use of __restrict_ keyword
- docs(autoware_calibration_status): update future work
- fix(autoware_calibration_status): include missing directory
- fix(autoware_calibration_status): use preallocated class member
- style(pre-commit): autofix
- style(autoware_calibration_status): use lambda for adding diagnostic
- style(autoware_calibration_status): split function
- style(pre-commit): autofix
- refactor(autoware_calibration_status): change atomic operation logic and extras
- refactor(autoware_calibration_status): use autoware diagnostic interface
- fix(autoware_calibration_status): cspell
- feat(autoware_calibration_status_classifier): rename autoware_calibration_status to autoware_calibration_status_classifier
- style(pre-commit): autofix
- fix(autoware_calibration_status_classifier): prevent potential race condition
- fix(autoware_calibration_status_classifier): add mutex for input msgs data access
* fix(autoware_calibration_status_classifier): pre-commit ---------Co-authored-by: pre-commit-ci-lite[bot] <117423508+pre-commit-ci-lite[bot]@users.noreply.github.com> Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
-
Contributors: Amadeusz Szymko, Ryohsuke Mitsudome, Tim Clephas
Package Dependencies
System Dependencies
| Name |
|---|
| libopencv-dev |
Dependant Packages
Launch files
- launch/calibration_status_classifier.launch.xml
-
- input_velocity [default: /sensing/vehicle_velocity_converter/twist_with_covariance]
- input_objects [default: /perception/object_recognition/objects]
- validate_calibration_srv [default: validate_calibration]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: calibration_status_classifier]
- model_path [default: $(var data_path)/calibration_status_classifier]
- model_param_path [default: $(find-pkg-share autoware_calibration_status_classifier)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- build_only [default: false]
- use_sim_time [default: true]
- decompressor_param_file [default: $(find-pkg-share autoware_image_transport_decompressor)/config/image_transport_decompressor.param.yaml]
Messages
Services
Plugins
Recent questions tagged autoware_calibration_status_classifier at Robotics Stack Exchange
Package Summary
| Tags | No category tags. |
| Version | 0.48.0 |
| License | Apache License 2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Description | |
| Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2025-12-03 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
| Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_calibration_status_classifier
Purpose
The autoware_calibration_status_classifier package provides real-time LiDAR-camera calibration validation using deep learning inference. It detects miscalibration between LiDAR and camera sensors by analyzing projected point clouds overlaid on camera images through a neural network-based approach.
Inner-workings / Algorithms
The calibration status detection system operates through the following pipeline:
1. Data Preprocessing (CUDA-accelerated)
- Image Undistortion: Corrects camera distortion
- Point Cloud Projection: Projects 3D LiDAR points onto undistorted 2D image plane - adds depth and intensity information
- Morphological Dilation: Enhances point visibility for neural network input
2. Neural Network Inference (TensorRT)
- Input Format: 5-channel normalized data (RGB + depth + intensity)
- Architecture: Deep neural network trained on calibrated/miscalibrated data
- Output: Binary classification with confidence scores for calibration status
3. Runtime Modes
- MANUAL: On-demand validation via service calls
- PERIODIC: Regular validation at configurable intervals
- ACTIVE: Continuous monitoring with synchronized sensor data
Inputs / Outputs
Input
| Name | Type | Description |
|---|---|---|
~/input/velocity |
prerequisite.velocity_source parameter |
Vehicle velocity (multiple message types supported) |
input.cloud_topics |
sensor_msgs::msg::PointCloud2 |
LiDAR point cloud data |
input.image_topics |
sensor_msgs::msg::Image |
Camera image data (BGR8 format) |
| Camera info topics | sensor_msgs::msg::CameraInfo |
Camera intrinsic parameters and distortion coefficients |
Output
| Name | Type | Description |
|---|---|---|
/diagnostics |
diagnostic_msgs::msg::DiagnosticArray |
ROS diagnostics with calibration status |
~/validate_calibration_srv |
std_srvs::srv::Trigger |
Manual validation service (MANUAL mode) |
| Preview image topics | sensor_msgs::msg::Image |
Visualization images with projected points |
Services
| Name | Type | Description |
|---|---|---|
~/input/validate_calibration_srv |
std_srvs::srv::Trigger |
Manual calibration validation request |
Parameters
Node Parameters
{{ json_to_markdown(“sensing/autoware_calibration_status_classifier/schema/calibration_status_classifier.schema.json”) }}
Network Parameters
{{ json_to_markdown(“sensing/autoware_calibration_status_classifier/schema/ml_package_calibration_status_classifier.schema.json”) }}
Assumptions / Known Limits
- Input images must be in BGR8 format (8-bit per channel)
- Input point clouds should contain intensity information (XYZIRC format)
Usage Example
ros2 launch autoware_calibration_status_classifier calibration_status_classifier.launch.xml
Future Extensions / Unimplemented Parts
- Manual runtime mode with detailed response (custom srv)
- Replace filter for objects on the scene counter to objects within the camera FOV counter (raytracing)
- Multithreading for multiple camera-LiDAR pairs
- More filters (e.g. yaw rate)
- cuda_blackboard support
- Replace custom kernels with NPP functions where applicable
References
Changelog for package autoware_calibration_status_classifier
0.48.0 (2025-11-18)
-
Merge remote-tracking branch 'origin/main' into humble
-
fix: tf2 uses hpp headers in rolling (and is backported) (#11620)
-
feat(autoware_calibration_status_classifier): add ML-based miscalibration detection module (#11222)
- feat(autoware_calibration_status): add ML-based miscalibration detection module
- feat(autoware_calibration_status): extended configuration and diagnostics
- fix(autoware_calibration_status): model's input array format
- test(autoware_calibration_status): inference only test
- style(pre-commit): autofix
- refactor(autoware_calibration_status): rename lidar_range to max_depth
- fix(autoware_calibration_status): add missing header
- feat(autoware_calibration_status): add naive number of objects filter
- feat(autoware_calibration_status): add periodic and manual mode
* refactor(autoware_calibration_status): improve image handling and optimize calibration pipeline Refactors the calibration status module to handle both distorted and rectified images, reorganizes data structures, and optimizes the processing pipeline. Adds new utility classes for better camera/LiDAR information management.
- style(pre-commit): autofix
- style(autoware_calibration_status): pre-commit
- test(autoware_calibration_status): make that CI skip unit tests
- style(autoware_calibration_status): cspell
- test(autoware_calibration_status): skip test before loading data
- test(autoware_calibration_status): another yet attempt to fix CI
- style(autoware_calibration_status): cspell
* fix(autoware_calibration_status): correct types Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
* fix(autoware_calibration_status): correct types Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
- style(pre-commit): autofix
- fix(autoware_calibration_status): user desired function for cuda memory allocation
- style(autoware_calibration_status): early return instead of scoped implementaiton
- feat(autoware_calibration_status): use of __restrict_ keyword
- docs(autoware_calibration_status): update future work
- fix(autoware_calibration_status): include missing directory
- fix(autoware_calibration_status): use preallocated class member
- style(pre-commit): autofix
- style(autoware_calibration_status): use lambda for adding diagnostic
- style(autoware_calibration_status): split function
- style(pre-commit): autofix
- refactor(autoware_calibration_status): change atomic operation logic and extras
- refactor(autoware_calibration_status): use autoware diagnostic interface
- fix(autoware_calibration_status): cspell
- feat(autoware_calibration_status_classifier): rename autoware_calibration_status to autoware_calibration_status_classifier
- style(pre-commit): autofix
- fix(autoware_calibration_status_classifier): prevent potential race condition
- fix(autoware_calibration_status_classifier): add mutex for input msgs data access
* fix(autoware_calibration_status_classifier): pre-commit ---------Co-authored-by: pre-commit-ci-lite[bot] <117423508+pre-commit-ci-lite[bot]@users.noreply.github.com> Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
-
Contributors: Amadeusz Szymko, Ryohsuke Mitsudome, Tim Clephas
Package Dependencies
System Dependencies
| Name |
|---|
| libopencv-dev |
Dependant Packages
Launch files
- launch/calibration_status_classifier.launch.xml
-
- input_velocity [default: /sensing/vehicle_velocity_converter/twist_with_covariance]
- input_objects [default: /perception/object_recognition/objects]
- validate_calibration_srv [default: validate_calibration]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: calibration_status_classifier]
- model_path [default: $(var data_path)/calibration_status_classifier]
- model_param_path [default: $(find-pkg-share autoware_calibration_status_classifier)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- build_only [default: false]
- use_sim_time [default: true]
- decompressor_param_file [default: $(find-pkg-share autoware_image_transport_decompressor)/config/image_transport_decompressor.param.yaml]
Messages
Services
Plugins
Recent questions tagged autoware_calibration_status_classifier at Robotics Stack Exchange
Package Summary
| Tags | No category tags. |
| Version | 0.48.0 |
| License | Apache License 2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Description | |
| Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2025-12-03 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
| Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_calibration_status_classifier
Purpose
The autoware_calibration_status_classifier package provides real-time LiDAR-camera calibration validation using deep learning inference. It detects miscalibration between LiDAR and camera sensors by analyzing projected point clouds overlaid on camera images through a neural network-based approach.
Inner-workings / Algorithms
The calibration status detection system operates through the following pipeline:
1. Data Preprocessing (CUDA-accelerated)
- Image Undistortion: Corrects camera distortion
- Point Cloud Projection: Projects 3D LiDAR points onto undistorted 2D image plane - adds depth and intensity information
- Morphological Dilation: Enhances point visibility for neural network input
2. Neural Network Inference (TensorRT)
- Input Format: 5-channel normalized data (RGB + depth + intensity)
- Architecture: Deep neural network trained on calibrated/miscalibrated data
- Output: Binary classification with confidence scores for calibration status
3. Runtime Modes
- MANUAL: On-demand validation via service calls
- PERIODIC: Regular validation at configurable intervals
- ACTIVE: Continuous monitoring with synchronized sensor data
Inputs / Outputs
Input
| Name | Type | Description |
|---|---|---|
~/input/velocity |
prerequisite.velocity_source parameter |
Vehicle velocity (multiple message types supported) |
input.cloud_topics |
sensor_msgs::msg::PointCloud2 |
LiDAR point cloud data |
input.image_topics |
sensor_msgs::msg::Image |
Camera image data (BGR8 format) |
| Camera info topics | sensor_msgs::msg::CameraInfo |
Camera intrinsic parameters and distortion coefficients |
Output
| Name | Type | Description |
|---|---|---|
/diagnostics |
diagnostic_msgs::msg::DiagnosticArray |
ROS diagnostics with calibration status |
~/validate_calibration_srv |
std_srvs::srv::Trigger |
Manual validation service (MANUAL mode) |
| Preview image topics | sensor_msgs::msg::Image |
Visualization images with projected points |
Services
| Name | Type | Description |
|---|---|---|
~/input/validate_calibration_srv |
std_srvs::srv::Trigger |
Manual calibration validation request |
Parameters
Node Parameters
{{ json_to_markdown(“sensing/autoware_calibration_status_classifier/schema/calibration_status_classifier.schema.json”) }}
Network Parameters
{{ json_to_markdown(“sensing/autoware_calibration_status_classifier/schema/ml_package_calibration_status_classifier.schema.json”) }}
Assumptions / Known Limits
- Input images must be in BGR8 format (8-bit per channel)
- Input point clouds should contain intensity information (XYZIRC format)
Usage Example
ros2 launch autoware_calibration_status_classifier calibration_status_classifier.launch.xml
Future Extensions / Unimplemented Parts
- Manual runtime mode with detailed response (custom srv)
- Replace filter for objects on the scene counter to objects within the camera FOV counter (raytracing)
- Multithreading for multiple camera-LiDAR pairs
- More filters (e.g. yaw rate)
- cuda_blackboard support
- Replace custom kernels with NPP functions where applicable
References
Changelog for package autoware_calibration_status_classifier
0.48.0 (2025-11-18)
-
Merge remote-tracking branch 'origin/main' into humble
-
fix: tf2 uses hpp headers in rolling (and is backported) (#11620)
-
feat(autoware_calibration_status_classifier): add ML-based miscalibration detection module (#11222)
- feat(autoware_calibration_status): add ML-based miscalibration detection module
- feat(autoware_calibration_status): extended configuration and diagnostics
- fix(autoware_calibration_status): model's input array format
- test(autoware_calibration_status): inference only test
- style(pre-commit): autofix
- refactor(autoware_calibration_status): rename lidar_range to max_depth
- fix(autoware_calibration_status): add missing header
- feat(autoware_calibration_status): add naive number of objects filter
- feat(autoware_calibration_status): add periodic and manual mode
* refactor(autoware_calibration_status): improve image handling and optimize calibration pipeline Refactors the calibration status module to handle both distorted and rectified images, reorganizes data structures, and optimizes the processing pipeline. Adds new utility classes for better camera/LiDAR information management.
- style(pre-commit): autofix
- style(autoware_calibration_status): pre-commit
- test(autoware_calibration_status): make that CI skip unit tests
- style(autoware_calibration_status): cspell
- test(autoware_calibration_status): skip test before loading data
- test(autoware_calibration_status): another yet attempt to fix CI
- style(autoware_calibration_status): cspell
* fix(autoware_calibration_status): correct types Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
* fix(autoware_calibration_status): correct types Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
- style(pre-commit): autofix
- fix(autoware_calibration_status): user desired function for cuda memory allocation
- style(autoware_calibration_status): early return instead of scoped implementaiton
- feat(autoware_calibration_status): use of __restrict_ keyword
- docs(autoware_calibration_status): update future work
- fix(autoware_calibration_status): include missing directory
- fix(autoware_calibration_status): use preallocated class member
- style(pre-commit): autofix
- style(autoware_calibration_status): use lambda for adding diagnostic
- style(autoware_calibration_status): split function
- style(pre-commit): autofix
- refactor(autoware_calibration_status): change atomic operation logic and extras
- refactor(autoware_calibration_status): use autoware diagnostic interface
- fix(autoware_calibration_status): cspell
- feat(autoware_calibration_status_classifier): rename autoware_calibration_status to autoware_calibration_status_classifier
- style(pre-commit): autofix
- fix(autoware_calibration_status_classifier): prevent potential race condition
- fix(autoware_calibration_status_classifier): add mutex for input msgs data access
* fix(autoware_calibration_status_classifier): pre-commit ---------Co-authored-by: pre-commit-ci-lite[bot] <117423508+pre-commit-ci-lite[bot]@users.noreply.github.com> Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
-
Contributors: Amadeusz Szymko, Ryohsuke Mitsudome, Tim Clephas
Package Dependencies
System Dependencies
| Name |
|---|
| libopencv-dev |
Dependant Packages
Launch files
- launch/calibration_status_classifier.launch.xml
-
- input_velocity [default: /sensing/vehicle_velocity_converter/twist_with_covariance]
- input_objects [default: /perception/object_recognition/objects]
- validate_calibration_srv [default: validate_calibration]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: calibration_status_classifier]
- model_path [default: $(var data_path)/calibration_status_classifier]
- model_param_path [default: $(find-pkg-share autoware_calibration_status_classifier)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- build_only [default: false]
- use_sim_time [default: true]
- decompressor_param_file [default: $(find-pkg-share autoware_image_transport_decompressor)/config/image_transport_decompressor.param.yaml]
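The launch arguments listed above can be overridden when this launch file is included from another one. The following is a minimal, illustrative sketch only: the wrapper file itself and the chosen values are examples, while the argument names are taken from the list above.

```xml
<!-- Illustrative wrapper launch file; argument names match the list above,
     but the enclosing file and the values shown are examples only. -->
<launch>
  <include file="$(find-pkg-share autoware_calibration_status_classifier)/launch/calibration_status_classifier.launch.xml">
    <!-- Build the TensorRT engine and exit without starting validation -->
    <arg name="build_only" value="true"/>
    <!-- Point at a non-default model/data location -->
    <arg name="data_path" value="/opt/autoware_data"/>
    <arg name="use_sim_time" value="false"/>
  </include>
</launch>
```

The same overrides can be passed on the command line, e.g. `ros2 launch autoware_calibration_status_classifier calibration_status_classifier.launch.xml build_only:=true`.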
Messages
Services
Plugins
Package Summary
| Tags | No category tags. |
| Version | 0.48.0 |
| License | Apache License 2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary
| Description | |
| Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2025-12-03 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
| Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_calibration_status_classifier
Purpose
The autoware_calibration_status_classifier package provides real-time LiDAR-camera calibration validation using deep learning inference. It detects miscalibration between LiDAR and camera sensors by analyzing projected point clouds overlaid on camera images through a neural network-based approach.
Inner-workings / Algorithms
The calibration status detection system operates through the following pipeline:
1. Data Preprocessing (CUDA-accelerated)
- Image Undistortion: Corrects camera distortion
- Point Cloud Projection: Projects 3D LiDAR points onto undistorted 2D image plane - adds depth and intensity information
- Morphological Dilation: Enhances point visibility for neural network input
2. Neural Network Inference (TensorRT)
- Input Format: 5-channel normalized data (RGB + depth + intensity)
- Architecture: Deep neural network trained on calibrated/miscalibrated data
- Output: Binary classification with confidence scores for calibration status
3. Runtime Modes
- MANUAL: On-demand validation via service calls
- PERIODIC: Regular validation at configurable intervals
- ACTIVE: Continuous monitoring with synchronized sensor data
Inputs / Outputs
Input
| Name | Type | Description |
|---|---|---|
~/input/velocity |
prerequisite.velocity_source parameter |
Vehicle velocity (multiple message types supported) |
input.cloud_topics |
sensor_msgs::msg::PointCloud2 |
LiDAR point cloud data |
input.image_topics |
sensor_msgs::msg::Image |
Camera image data (BGR8 format) |
| Camera info topics | sensor_msgs::msg::CameraInfo |
Camera intrinsic parameters and distortion coefficients |
Output
| Name | Type | Description |
|---|---|---|
/diagnostics |
diagnostic_msgs::msg::DiagnosticArray |
ROS diagnostics with calibration status |
~/validate_calibration_srv |
std_srvs::srv::Trigger |
Manual validation service (MANUAL mode) |
| Preview image topics | sensor_msgs::msg::Image |
Visualization images with projected points |
Services
| Name | Type | Description |
|---|---|---|
~/input/validate_calibration_srv |
std_srvs::srv::Trigger |
Manual calibration validation request |
Parameters
Node Parameters
{{ json_to_markdown(“sensing/autoware_calibration_status_classifier/schema/calibration_status_classifier.schema.json”) }}
Network Parameters
{{ json_to_markdown(“sensing/autoware_calibration_status_classifier/schema/ml_package_calibration_status_classifier.schema.json”) }}
Assumptions / Known Limits
- Input images must be in BGR8 format (8-bit per channel)
- Input point clouds should contain intensity information (XYZIRC format)
Usage Example
ros2 launch autoware_calibration_status_classifier calibration_status_classifier.launch.xml
Future Extensions / Unimplemented Parts
- Manual runtime mode with detailed response (custom srv)
- Replace filter for objects on the scene counter to objects within the camera FOV counter (raytracing)
- Multithreading for multiple camera-LiDAR pairs
- More filters (e.g. yaw rate)
- cuda_blackboard support
- Replace custom kernels with NPP functions where applicable
References
Changelog for package autoware_calibration_status_classifier
0.48.0 (2025-11-18)
-
Merge remote-tracking branch 'origin/main' into humble
-
fix: tf2 uses hpp headers in rolling (and is backported) (#11620)
-
feat(autoware_calibration_status_classifier): add ML-based miscalibration detection module (#11222)
- feat(autoware_calibration_status): add ML-based miscalibration detection module
- feat(autoware_calibration_status): extended configuration and diagnostics
- fix(autoware_calibration_status): model's input array format
- test(autoware_calibration_status): inference only test
- style(pre-commit): autofix
- refactor(autoware_calibration_status): rename lidar_range to max_depth
- fix(autoware_calibration_status): add missing header
- feat(autoware_calibration_status): add naive number of objects filter
- feat(autoware_calibration_status): add periodic and manual mode
* refactor(autoware_calibration_status): improve image handling and optimize calibration pipeline Refactors the calibration status module to handle both distorted and rectified images, reorganizes data structures, and optimizes the processing pipeline. Adds new utility classes for better camera/LiDAR information management.
- style(pre-commit): autofix
- style(autoware_calibration_status): pre-commit
- test(autoware_calibration_status): make that CI skip unit tests
- style(autoware_calibration_status): cspell
- test(autoware_calibration_status): skip test before loading data
- test(autoware_calibration_status): another yet attempt to fix CI
- style(autoware_calibration_status): cspell
* fix(autoware_calibration_status): correct types Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
* fix(autoware_calibration_status): correct types Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
- style(pre-commit): autofix
- fix(autoware_calibration_status): user desired function for cuda memory allocation
- style(autoware_calibration_status): early return instead of scoped implementaiton
- feat(autoware_calibration_status): use of __restrict_ keyword
- docs(autoware_calibration_status): update future work
- fix(autoware_calibration_status): include missing directory
- fix(autoware_calibration_status): use preallocated class member
- style(pre-commit): autofix
- style(autoware_calibration_status): use lambda for adding diagnostic
- style(autoware_calibration_status): split function
- style(pre-commit): autofix
- refactor(autoware_calibration_status): change atomic operation logic and extras
- refactor(autoware_calibration_status): use autoware diagnostic interface
- fix(autoware_calibration_status): cspell
- feat(autoware_calibration_status_classifier): rename autoware_calibration_status to autoware_calibration_status_classifier
- style(pre-commit): autofix
- fix(autoware_calibration_status_classifier): prevent potential race condition
- fix(autoware_calibration_status_classifier): add mutex for input msgs data access
* fix(autoware_calibration_status_classifier): pre-commit ---------Co-authored-by: pre-commit-ci-lite[bot] <117423508+pre-commit-ci-lite[bot]@users.noreply.github.com> Co-authored-by: Manato Hirabayashi <<3022416+manato@users.noreply.github.com>>
-
Contributors: Amadeusz Szymko, Ryohsuke Mitsudome, Tim Clephas
Package Dependencies
System Dependencies
| Name |
|---|
| libopencv-dev |
Dependant Packages
Launch files
- launch/calibration_status_classifier.launch.xml
- input_velocity [default: /sensing/vehicle_velocity_converter/twist_with_covariance]
- input_objects [default: /perception/object_recognition/objects]
- validate_calibration_srv [default: validate_calibration]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: calibration_status_classifier]
- model_path [default: $(var data_path)/calibration_status_classifier]
- model_param_path [default: $(find-pkg-share autoware_calibration_status_classifier)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- build_only [default: false]
- use_sim_time [default: true]
- decompressor_param_file [default: $(find-pkg-share autoware_image_transport_decompressor)/config/image_transport_decompressor.param.yaml]
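The launch arguments listed above can be overridden with the usual `name:=value` syntax. A hypothetical invocation sketch (the data path, the remapped velocity topic, and the fully resolved service name are assumptions for illustration, not values documented on this page):

```shell
# Build the TensorRT engine only, then exit, reading model data from a
# non-default location (path is an example):
ros2 launch autoware_calibration_status_classifier calibration_status_classifier.launch.xml \
  data_path:=/opt/autoware_data \
  build_only:=true

# Launch normally, remapping the velocity input declared above:
ros2 launch autoware_calibration_status_classifier calibration_status_classifier.launch.xml \
  input_velocity:=/vehicle/status/velocity_status

# In MANUAL runtime mode, request a validation on demand. The service is a
# std_srvs/srv/Trigger; the resolved name below assumes the node runs in the
# /sensing namespace and keeps the validate_calibration default:
ros2 service call /sensing/calibration_status_classifier/validate_calibration \
  std_srvs/srv/Trigger "{}"
```

If the node is launched elsewhere, `ros2 service list | grep validate_calibration` shows the actual resolved service name.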