Package Summary

| Field | Value |
| --- | --- |
| Tags | No category tags. |
| Version | 0.0.0 |
| License | Apache License 2.0 |
| Build type | AMENT_CMAKE |
| Use | RECOMMENDED |
Repository Summary

| Field | Value |
| --- | --- |
| Description | |
| Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2025-09-30 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_lidar_frnet

Purpose

The `autoware_lidar_frnet` package performs 3D semantic segmentation based on LiDAR data (x, y, z, intensity).
Inner-workings / Algorithms

The implementation is based on the FRNet [1] project. It uses the TensorRT library for data processing and network inference. The models were trained using AWML [2].
Inputs / Outputs

Input

| Name | Type | Description |
| --- | --- | --- |
| `~/input/pointcloud` | `sensor_msgs::msg::PointCloud2` | Input pointcloud. |
Output

| Name | Type | Description |
| --- | --- | --- |
| `~/output/pointcloud/segmentation` | `sensor_msgs::msg::PointCloud2` | XYZ cloud with a class ID field. |
| `~/output/pointcloud/visualization` | `sensor_msgs::msg::PointCloud2` | XYZ cloud with an RGB field. |
| `~/output/pointcloud/filtered` | `sensor_msgs::msg::PointCloud2` | Cloud in the input format with points of the specified classes removed. |
| `debug/cyclic_time_ms` | `autoware_internal_debug_msgs::msg::Float64Stamped` | Cyclic time (ms). |
| `debug/pipeline_latency_ms` | `autoware_internal_debug_msgs::msg::Float64Stamped` | Pipeline latency time (ms). |
| `debug/processing_time/preprocess_ms` | `autoware_internal_debug_msgs::msg::Float64Stamped` | Preprocess time (ms). |
| `debug/processing_time/inference_ms` | `autoware_internal_debug_msgs::msg::Float64Stamped` | Inference time (ms). |
| `debug/processing_time/postprocess_ms` | `autoware_internal_debug_msgs::msg::Float64Stamped` | Postprocess time (ms). |
| `debug/processing_time/total_ms` | `autoware_internal_debug_msgs::msg::Float64Stamped` | Total processing time (ms). |
| `/diagnostics` | `diagnostic_msgs::msg::DiagnosticArray` | Node diagnostics with respect to processing time constraints. |
Parameters

FRNet node

{{ json_to_markdown("perception/autoware_lidar_frnet/schema/frnet.schema.json") }}

FRNet model

{{ json_to_markdown("perception/autoware_lidar_frnet/schema/ml_package_frnet.schema.json") }}

FRNet diagnostics

{{ json_to_markdown("perception/autoware_lidar_frnet/schema/diagnostics_frnet.schema.json") }}
The `build_only` option

The `autoware_lidar_frnet` node has a `build_only` option to build the TensorRT engine file from the ONNX file:

```shell
ros2 launch autoware_lidar_frnet lidar_frnet.launch.xml build_only:=true
```
Assumptions / Known limits

This library operates on raw cloud data (bytes). The input pointcloud message is assumed to have the XYZIRC format:

```python
[
    sensor_msgs.msg.PointField(name='x', offset=0, datatype=7, count=1),
    sensor_msgs.msg.PointField(name='y', offset=4, datatype=7, count=1),
    sensor_msgs.msg.PointField(name='z', offset=8, datatype=7, count=1),
    sensor_msgs.msg.PointField(name='intensity', offset=12, datatype=2, count=1),
    sensor_msgs.msg.PointField(name='ring', offset=13, datatype=2, count=1),
    sensor_msgs.msg.PointField(name='channel', offset=14, datatype=4, count=1)
]
```

The input may contain additional fields; the format shown is the required minimum. For debugging, you can inspect the field layout of your pointcloud topic with a simple command:

```shell
ros2 topic echo <input_topic> --field fields
```
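The required layout can also be checked programmatically. The sketch below is a plain-Python illustration, not part of this package: the `Field` tuple and `is_xyzirc` helper are hypothetical stand-ins for `sensor_msgs.msg.PointField`, while the datatype constants mirror the `sensor_msgs/msg/PointField` definition (FLOAT32 = 7, UINT8 = 2, UINT16 = 4).

```python
from typing import NamedTuple

# Datatype constants as defined in sensor_msgs/msg/PointField.
FLOAT32, UINT8, UINT16 = 7, 2, 4

class Field(NamedTuple):
    """Hypothetical stand-in for sensor_msgs.msg.PointField."""
    name: str
    offset: int
    datatype: int

# The XYZIRC minimum described above.
REQUIRED = [
    Field("x", 0, FLOAT32),
    Field("y", 4, FLOAT32),
    Field("z", 8, FLOAT32),
    Field("intensity", 12, UINT8),
    Field("ring", 13, UINT8),
    Field("channel", 14, UINT16),
]

def is_xyzirc(fields):
    """Return True if `fields` contains the required XYZIRC subset
    with the expected offsets and datatypes. Extra fields are allowed."""
    by_name = {f.name: f for f in fields}
    return all(
        req.name in by_name
        and by_name[req.name].offset == req.offset
        and by_name[req.name].datatype == req.datatype
        for req in REQUIRED
    )
```

Additional fields (e.g. a timestamp appended after `channel`) do not cause the check to fail, matching the "required minimum" semantics above.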
Trained Models

The model was trained on the nuScenes dataset and is available in the Autoware artifacts.
References/External links

[1] X. Xu, L. Kong, H. Shuai and Q. Liu, "FRNet: Frustum-Range Networks for Scalable LiDAR Segmentation," in IEEE Transactions on Image Processing, vol. 34, pp. 2173-2186, 2025, doi: 10.1109/TIP.2025.3550011.
Launch files

- launch/lidar_frnet.launch.xml
  - input/pointcloud [default: /sensing/lidar/top/pointcloud]
  - output/pointcloud/segmentation [default: segmentation]
  - output/pointcloud/visualization [default: visualization]
  - output/pointcloud/filtered [default: filtered]
  - data_path [default: $(env HOME)/autoware_data]
  - model_name [default: frnet]
  - model_path [default: $(var data_path)/lidar_frnet]
  - model_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/$(var model_name).param.yaml]
  - ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
  - diagnostics_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/diagnostics_frnet.param.yaml]
  - build_only [default: false]
  - use_pointcloud_container [default: false]
  - pointcloud_container_name [default: pointcloud_container]
Package Summary
Tags | No category tags. |
Version | 0.0.0 |
License | Apache License 2.0 |
Build type | AMENT_CMAKE |
Use | RECOMMENDED |
Repository Summary
Description | |
Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
VCS Type | git |
VCS Version | main |
Last Updated | 2025-09-30 |
Dev Status | UNKNOWN |
Released | UNRELEASED |
Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_lidar_frnet
Purpose
The autoware_lidar_frnet
package is used for 3D semantic segmentation based on LiDAR data (x, y, z, intensity).
Inner-workings / Algorithms
The implementation is based on the FRNet [1] project. It uses TensorRT library for data processing and network inference.
We trained the models using AWML [2].
Inputs / Outputs
Input
Name | Type | Description |
---|---|---|
~/input/pointcloud |
sensor_msgs::msg::PointCloud2 |
Input pointcloud. |
Output
Name | Type | Description |
---|---|---|
~/output/pointcloud/segmentation |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with class ID field. |
~/output/pointcloud/visualization |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with RGB field. |
~/output/pointcloud/filtered |
sensor_msgs::msg::PointCloud2 |
Input format cloud after removing specified point’s class. |
debug/cyclic_time_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Cyclic time (ms). |
debug/pipeline_latency_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Pipeline latency time (ms). |
debug/processing_time/preprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Preprocess (ms). |
debug/processing_time/inference_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Inference time (ms). |
debug/processing_time/postprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Postprocess time (ms). |
debug/processing_time/total_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Total processing time (ms). |
/diagnostics |
diagnostic_msgs::msg::DiagnosticArray |
Node diagnostics with respect to processing time constraints |
Parameters
FRNet node
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/frnet.schema.json”) }}
FRNet model
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/ml_package_frnet.schema.json”) }}
FRNet diagnostics
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/diagnostics_frnet.schema.json”) }}
The build_only
option
The autoware_lidar_frnet
node has build_only
option to build the TensorRT engine file from the ONNX file.
ros2 launch autoware_lidar_frnet lidar_frnet.launch.xml build_only:=true
Assumptions / Known limits
This library operates on raw cloud data (bytes). It is assumed that the input pointcloud message has XYZIRC format:
[
sensor_msgs.msg.PointField(name='x', offset=0, datatype=7, count=1),
sensor_msgs.msg.PointField(name='y', offset=4, datatype=7, count=1),
sensor_msgs.msg.PointField(name='z', offset=8, datatype=7, count=1),
sensor_msgs.msg.PointField(name='intensity', offset=12, datatype=2, count=1),
sensor_msgs.msg.PointField(name='ring', offset=13, datatype=2, count=1),
sensor_msgs.msg.PointField(name='channel', offset=14, datatype=4, count=1)
]
This input may consist of other fields as well - shown format is required minimum. For debug purposes, you can validate your pointcloud topic using simple command:
ros2 topic echo <input_topic> --field fields
Trained Models
The model was trained on the NuScenes dataset and is available in the Autoware artifacts.
References/External links
[1] X. Xu, L. Kong, H. Shuai and Q. Liu, “FRNet: Frustum-Range Networks for Scalable LiDAR Segmentation” in IEEE Transactions on Image Processing, vol. 34, pp. 2173-2186, 2025, doi: 10.1109/TIP.2025.3550011.
Package Dependencies
System Dependencies
Dependant Packages
Launch files
- launch/lidar_frnet.launch.xml
-
- input/pointcloud [default: /sensing/lidar/top/pointcloud]
- output/pointcloud/segmentation [default: segmentation]
- output/pointcloud/visualization [default: visualization]
- output/pointcloud/filtered [default: filtered]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: frnet]
- model_path [default: $(var data_path)/lidar_frnet]
- model_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- diagnostics_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/diagnostics_frnet.param.yaml]
- build_only [default: false]
- use_pointcloud_container [default: false]
- pointcloud_container_name [default: pointcloud_container]
Messages
Services
Plugins
Recent questions tagged autoware_lidar_frnet at Robotics Stack Exchange
Package Summary
Tags | No category tags. |
Version | 0.0.0 |
License | Apache License 2.0 |
Build type | AMENT_CMAKE |
Use | RECOMMENDED |
Repository Summary
Description | |
Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
VCS Type | git |
VCS Version | main |
Last Updated | 2025-09-30 |
Dev Status | UNKNOWN |
Released | UNRELEASED |
Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_lidar_frnet
Purpose
The autoware_lidar_frnet
package is used for 3D semantic segmentation based on LiDAR data (x, y, z, intensity).
Inner-workings / Algorithms
The implementation is based on the FRNet [1] project. It uses TensorRT library for data processing and network inference.
We trained the models using AWML [2].
Inputs / Outputs
Input
Name | Type | Description |
---|---|---|
~/input/pointcloud |
sensor_msgs::msg::PointCloud2 |
Input pointcloud. |
Output
Name | Type | Description |
---|---|---|
~/output/pointcloud/segmentation |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with class ID field. |
~/output/pointcloud/visualization |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with RGB field. |
~/output/pointcloud/filtered |
sensor_msgs::msg::PointCloud2 |
Input format cloud after removing specified point’s class. |
debug/cyclic_time_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Cyclic time (ms). |
debug/pipeline_latency_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Pipeline latency time (ms). |
debug/processing_time/preprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Preprocess (ms). |
debug/processing_time/inference_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Inference time (ms). |
debug/processing_time/postprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Postprocess time (ms). |
debug/processing_time/total_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Total processing time (ms). |
/diagnostics |
diagnostic_msgs::msg::DiagnosticArray |
Node diagnostics with respect to processing time constraints |
Parameters
FRNet node
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/frnet.schema.json”) }}
FRNet model
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/ml_package_frnet.schema.json”) }}
FRNet diagnostics
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/diagnostics_frnet.schema.json”) }}
The build_only
option
The autoware_lidar_frnet
node has build_only
option to build the TensorRT engine file from the ONNX file.
ros2 launch autoware_lidar_frnet lidar_frnet.launch.xml build_only:=true
Assumptions / Known limits
This library operates on raw cloud data (bytes). It is assumed that the input pointcloud message has XYZIRC format:
[
sensor_msgs.msg.PointField(name='x', offset=0, datatype=7, count=1),
sensor_msgs.msg.PointField(name='y', offset=4, datatype=7, count=1),
sensor_msgs.msg.PointField(name='z', offset=8, datatype=7, count=1),
sensor_msgs.msg.PointField(name='intensity', offset=12, datatype=2, count=1),
sensor_msgs.msg.PointField(name='ring', offset=13, datatype=2, count=1),
sensor_msgs.msg.PointField(name='channel', offset=14, datatype=4, count=1)
]
This input may consist of other fields as well - shown format is required minimum. For debug purposes, you can validate your pointcloud topic using simple command:
ros2 topic echo <input_topic> --field fields
Trained Models
The model was trained on the NuScenes dataset and is available in the Autoware artifacts.
References/External links
[1] X. Xu, L. Kong, H. Shuai and Q. Liu, “FRNet: Frustum-Range Networks for Scalable LiDAR Segmentation” in IEEE Transactions on Image Processing, vol. 34, pp. 2173-2186, 2025, doi: 10.1109/TIP.2025.3550011.
Package Dependencies
System Dependencies
Dependant Packages
Launch files
- launch/lidar_frnet.launch.xml
-
- input/pointcloud [default: /sensing/lidar/top/pointcloud]
- output/pointcloud/segmentation [default: segmentation]
- output/pointcloud/visualization [default: visualization]
- output/pointcloud/filtered [default: filtered]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: frnet]
- model_path [default: $(var data_path)/lidar_frnet]
- model_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- diagnostics_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/diagnostics_frnet.param.yaml]
- build_only [default: false]
- use_pointcloud_container [default: false]
- pointcloud_container_name [default: pointcloud_container]
Messages
Services
Plugins
Recent questions tagged autoware_lidar_frnet at Robotics Stack Exchange
Package Summary
Tags | No category tags. |
Version | 0.0.0 |
License | Apache License 2.0 |
Build type | AMENT_CMAKE |
Use | RECOMMENDED |
Repository Summary
Description | |
Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
VCS Type | git |
VCS Version | main |
Last Updated | 2025-09-30 |
Dev Status | UNKNOWN |
Released | UNRELEASED |
Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_lidar_frnet
Purpose
The autoware_lidar_frnet
package is used for 3D semantic segmentation based on LiDAR data (x, y, z, intensity).
Inner-workings / Algorithms
The implementation is based on the FRNet [1] project. It uses TensorRT library for data processing and network inference.
We trained the models using AWML [2].
Inputs / Outputs
Input
Name | Type | Description |
---|---|---|
~/input/pointcloud |
sensor_msgs::msg::PointCloud2 |
Input pointcloud. |
Output
Name | Type | Description |
---|---|---|
~/output/pointcloud/segmentation |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with class ID field. |
~/output/pointcloud/visualization |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with RGB field. |
~/output/pointcloud/filtered |
sensor_msgs::msg::PointCloud2 |
Input format cloud after removing specified point’s class. |
debug/cyclic_time_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Cyclic time (ms). |
debug/pipeline_latency_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Pipeline latency time (ms). |
debug/processing_time/preprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Preprocess (ms). |
debug/processing_time/inference_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Inference time (ms). |
debug/processing_time/postprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Postprocess time (ms). |
debug/processing_time/total_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Total processing time (ms). |
/diagnostics |
diagnostic_msgs::msg::DiagnosticArray |
Node diagnostics with respect to processing time constraints |
Parameters
FRNet node
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/frnet.schema.json”) }}
FRNet model
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/ml_package_frnet.schema.json”) }}
FRNet diagnostics
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/diagnostics_frnet.schema.json”) }}
The build_only
option
The autoware_lidar_frnet
node has build_only
option to build the TensorRT engine file from the ONNX file.
ros2 launch autoware_lidar_frnet lidar_frnet.launch.xml build_only:=true
Assumptions / Known limits
This library operates on raw cloud data (bytes). It is assumed that the input pointcloud message has XYZIRC format:
[
sensor_msgs.msg.PointField(name='x', offset=0, datatype=7, count=1),
sensor_msgs.msg.PointField(name='y', offset=4, datatype=7, count=1),
sensor_msgs.msg.PointField(name='z', offset=8, datatype=7, count=1),
sensor_msgs.msg.PointField(name='intensity', offset=12, datatype=2, count=1),
sensor_msgs.msg.PointField(name='ring', offset=13, datatype=2, count=1),
sensor_msgs.msg.PointField(name='channel', offset=14, datatype=4, count=1)
]
This input may consist of other fields as well - shown format is required minimum. For debug purposes, you can validate your pointcloud topic using simple command:
ros2 topic echo <input_topic> --field fields
Trained Models
The model was trained on the NuScenes dataset and is available in the Autoware artifacts.
References/External links
[1] X. Xu, L. Kong, H. Shuai and Q. Liu, “FRNet: Frustum-Range Networks for Scalable LiDAR Segmentation” in IEEE Transactions on Image Processing, vol. 34, pp. 2173-2186, 2025, doi: 10.1109/TIP.2025.3550011.
Package Dependencies
System Dependencies
Dependant Packages
Launch files
- launch/lidar_frnet.launch.xml
-
- input/pointcloud [default: /sensing/lidar/top/pointcloud]
- output/pointcloud/segmentation [default: segmentation]
- output/pointcloud/visualization [default: visualization]
- output/pointcloud/filtered [default: filtered]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: frnet]
- model_path [default: $(var data_path)/lidar_frnet]
- model_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- diagnostics_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/diagnostics_frnet.param.yaml]
- build_only [default: false]
- use_pointcloud_container [default: false]
- pointcloud_container_name [default: pointcloud_container]
Messages
Services
Plugins
Recent questions tagged autoware_lidar_frnet at Robotics Stack Exchange
Package Summary
Tags | No category tags. |
Version | 0.0.0 |
License | Apache License 2.0 |
Build type | AMENT_CMAKE |
Use | RECOMMENDED |
Repository Summary
Description | |
Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
VCS Type | git |
VCS Version | main |
Last Updated | 2025-09-30 |
Dev Status | UNKNOWN |
Released | UNRELEASED |
Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_lidar_frnet
Purpose
The autoware_lidar_frnet
package is used for 3D semantic segmentation based on LiDAR data (x, y, z, intensity).
Inner-workings / Algorithms
The implementation is based on the FRNet [1] project. It uses TensorRT library for data processing and network inference.
We trained the models using AWML [2].
Inputs / Outputs
Input
Name | Type | Description |
---|---|---|
~/input/pointcloud |
sensor_msgs::msg::PointCloud2 |
Input pointcloud. |
Output
Name | Type | Description |
---|---|---|
~/output/pointcloud/segmentation |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with class ID field. |
~/output/pointcloud/visualization |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with RGB field. |
~/output/pointcloud/filtered |
sensor_msgs::msg::PointCloud2 |
Input format cloud after removing specified point’s class. |
debug/cyclic_time_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Cyclic time (ms). |
debug/pipeline_latency_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Pipeline latency time (ms). |
debug/processing_time/preprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Preprocess (ms). |
debug/processing_time/inference_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Inference time (ms). |
debug/processing_time/postprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Postprocess time (ms). |
debug/processing_time/total_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Total processing time (ms). |
/diagnostics |
diagnostic_msgs::msg::DiagnosticArray |
Node diagnostics with respect to processing time constraints |
Parameters
FRNet node
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/frnet.schema.json”) }}
FRNet model
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/ml_package_frnet.schema.json”) }}
FRNet diagnostics
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/diagnostics_frnet.schema.json”) }}
The build_only
option
The autoware_lidar_frnet
node has build_only
option to build the TensorRT engine file from the ONNX file.
ros2 launch autoware_lidar_frnet lidar_frnet.launch.xml build_only:=true
Assumptions / Known limits
This library operates on raw cloud data (bytes). It is assumed that the input pointcloud message has XYZIRC format:
[
sensor_msgs.msg.PointField(name='x', offset=0, datatype=7, count=1),
sensor_msgs.msg.PointField(name='y', offset=4, datatype=7, count=1),
sensor_msgs.msg.PointField(name='z', offset=8, datatype=7, count=1),
sensor_msgs.msg.PointField(name='intensity', offset=12, datatype=2, count=1),
sensor_msgs.msg.PointField(name='ring', offset=13, datatype=2, count=1),
sensor_msgs.msg.PointField(name='channel', offset=14, datatype=4, count=1)
]
This input may consist of other fields as well - shown format is required minimum. For debug purposes, you can validate your pointcloud topic using simple command:
ros2 topic echo <input_topic> --field fields
Trained Models
The model was trained on the NuScenes dataset and is available in the Autoware artifacts.
References/External links
[1] X. Xu, L. Kong, H. Shuai and Q. Liu, “FRNet: Frustum-Range Networks for Scalable LiDAR Segmentation” in IEEE Transactions on Image Processing, vol. 34, pp. 2173-2186, 2025, doi: 10.1109/TIP.2025.3550011.
Package Dependencies
System Dependencies
Dependant Packages
Launch files
- launch/lidar_frnet.launch.xml
-
- input/pointcloud [default: /sensing/lidar/top/pointcloud]
- output/pointcloud/segmentation [default: segmentation]
- output/pointcloud/visualization [default: visualization]
- output/pointcloud/filtered [default: filtered]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: frnet]
- model_path [default: $(var data_path)/lidar_frnet]
- model_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- diagnostics_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/diagnostics_frnet.param.yaml]
- build_only [default: false]
- use_pointcloud_container [default: false]
- pointcloud_container_name [default: pointcloud_container]
Messages
Services
Plugins
Recent questions tagged autoware_lidar_frnet at Robotics Stack Exchange
Package Summary
Tags | No category tags. |
Version | 0.0.0 |
License | Apache License 2.0 |
Build type | AMENT_CMAKE |
Use | RECOMMENDED |
Repository Summary
Description | |
Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
VCS Type | git |
VCS Version | main |
Last Updated | 2025-09-30 |
Dev Status | UNKNOWN |
Released | UNRELEASED |
Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_lidar_frnet
Purpose
The autoware_lidar_frnet
package is used for 3D semantic segmentation based on LiDAR data (x, y, z, intensity).
Inner-workings / Algorithms
The implementation is based on the FRNet [1] project. It uses TensorRT library for data processing and network inference.
We trained the models using AWML [2].
Inputs / Outputs
Input
Name | Type | Description |
---|---|---|
~/input/pointcloud |
sensor_msgs::msg::PointCloud2 |
Input pointcloud. |
Output
Name | Type | Description |
---|---|---|
~/output/pointcloud/segmentation |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with class ID field. |
~/output/pointcloud/visualization |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with RGB field. |
~/output/pointcloud/filtered |
sensor_msgs::msg::PointCloud2 |
Input format cloud after removing specified point’s class. |
debug/cyclic_time_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Cyclic time (ms). |
debug/pipeline_latency_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Pipeline latency time (ms). |
debug/processing_time/preprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Preprocess (ms). |
debug/processing_time/inference_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Inference time (ms). |
debug/processing_time/postprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Postprocess time (ms). |
debug/processing_time/total_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Total processing time (ms). |
/diagnostics |
diagnostic_msgs::msg::DiagnosticArray |
Node diagnostics with respect to processing time constraints |
Parameters
FRNet node
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/frnet.schema.json”) }}
FRNet model
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/ml_package_frnet.schema.json”) }}
FRNet diagnostics
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/diagnostics_frnet.schema.json”) }}
The build_only
option
The autoware_lidar_frnet
node has build_only
option to build the TensorRT engine file from the ONNX file.
ros2 launch autoware_lidar_frnet lidar_frnet.launch.xml build_only:=true
Assumptions / Known limits
This library operates on raw cloud data (bytes). It is assumed that the input pointcloud message has XYZIRC format:
[
sensor_msgs.msg.PointField(name='x', offset=0, datatype=7, count=1),
sensor_msgs.msg.PointField(name='y', offset=4, datatype=7, count=1),
sensor_msgs.msg.PointField(name='z', offset=8, datatype=7, count=1),
sensor_msgs.msg.PointField(name='intensity', offset=12, datatype=2, count=1),
sensor_msgs.msg.PointField(name='ring', offset=13, datatype=2, count=1),
sensor_msgs.msg.PointField(name='channel', offset=14, datatype=4, count=1)
]
This input may consist of other fields as well - shown format is required minimum. For debug purposes, you can validate your pointcloud topic using simple command:
ros2 topic echo <input_topic> --field fields
Trained Models
The model was trained on the NuScenes dataset and is available in the Autoware artifacts.
References/External links
[1] X. Xu, L. Kong, H. Shuai and Q. Liu, “FRNet: Frustum-Range Networks for Scalable LiDAR Segmentation” in IEEE Transactions on Image Processing, vol. 34, pp. 2173-2186, 2025, doi: 10.1109/TIP.2025.3550011.
Package Dependencies
System Dependencies
Dependant Packages
Launch files
- launch/lidar_frnet.launch.xml
-
- input/pointcloud [default: /sensing/lidar/top/pointcloud]
- output/pointcloud/segmentation [default: segmentation]
- output/pointcloud/visualization [default: visualization]
- output/pointcloud/filtered [default: filtered]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: frnet]
- model_path [default: $(var data_path)/lidar_frnet]
- model_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- diagnostics_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/diagnostics_frnet.param.yaml]
- build_only [default: false]
- use_pointcloud_container [default: false]
- pointcloud_container_name [default: pointcloud_container]
Messages
Services
Plugins
Recent questions tagged autoware_lidar_frnet at Robotics Stack Exchange
Package Summary
Tags | No category tags. |
Version | 0.0.0 |
License | Apache License 2.0 |
Build type | AMENT_CMAKE |
Use | RECOMMENDED |
Repository Summary
Description | |
Checkout URI | https://github.com/autowarefoundation/autoware_universe.git |
VCS Type | git |
VCS Version | main |
Last Updated | 2025-09-30 |
Dev Status | UNKNOWN |
Released | UNRELEASED |
Tags | planner ros calibration self-driving-car autonomous-driving autonomous-vehicles ros2 3d-map autoware |
Contributing |
Help Wanted (-)
Good First Issues (-) Pull Requests to Review (-) |
Package Description
Additional Links
Maintainers
- Amadeusz Szymko
Authors
- Amadeusz Szymko
autoware_lidar_frnet
Purpose
The autoware_lidar_frnet
package is used for 3D semantic segmentation based on LiDAR data (x, y, z, intensity).
Inner-workings / Algorithms
The implementation is based on the FRNet [1] project. It uses TensorRT library for data processing and network inference.
We trained the models using AWML [2].
Inputs / Outputs
Input
Name | Type | Description |
---|---|---|
~/input/pointcloud |
sensor_msgs::msg::PointCloud2 |
Input pointcloud. |
Output
Name | Type | Description |
---|---|---|
~/output/pointcloud/segmentation |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with class ID field. |
~/output/pointcloud/visualization |
sensor_msgs::msg::PointCloud2 |
XYZ cloud with RGB field. |
~/output/pointcloud/filtered |
sensor_msgs::msg::PointCloud2 |
Input format cloud after removing specified point’s class. |
debug/cyclic_time_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Cyclic time (ms). |
debug/pipeline_latency_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Pipeline latency time (ms). |
debug/processing_time/preprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Preprocess (ms). |
debug/processing_time/inference_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Inference time (ms). |
debug/processing_time/postprocess_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Postprocess time (ms). |
debug/processing_time/total_ms |
autoware_internal_debug_msgs::msg::Float64Stamped |
Total processing time (ms). |
/diagnostics |
diagnostic_msgs::msg::DiagnosticArray |
Node diagnostics with respect to processing time constraints |
Parameters
FRNet node
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/frnet.schema.json”) }}
FRNet model
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/ml_package_frnet.schema.json”) }}
FRNet diagnostics
{{ json_to_markdown(“perception/autoware_lidar_frnet/schema/diagnostics_frnet.schema.json”) }}
The `build_only` option
The `autoware_lidar_frnet` node has a `build_only` option to build the TensorRT engine file from the ONNX file.

```shell
ros2 launch autoware_lidar_frnet lidar_frnet.launch.xml build_only:=true
```
Assumptions / Known limits
This library operates on raw cloud data (bytes). It is assumed that the input pointcloud message has the XYZIRC format:

```python
[
    sensor_msgs.msg.PointField(name='x', offset=0, datatype=7, count=1),
    sensor_msgs.msg.PointField(name='y', offset=4, datatype=7, count=1),
    sensor_msgs.msg.PointField(name='z', offset=8, datatype=7, count=1),
    sensor_msgs.msg.PointField(name='intensity', offset=12, datatype=2, count=1),
    sensor_msgs.msg.PointField(name='ring', offset=13, datatype=2, count=1),
    sensor_msgs.msg.PointField(name='channel', offset=14, datatype=4, count=1)
]
```

The input may contain additional fields as well; the format shown is the required minimum. For debugging purposes, you can validate your pointcloud topic with a simple command:

```shell
ros2 topic echo <input_topic> --field fields
```
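A minimal sketch of that check in Python: it compares a topic's advertised fields against the required XYZIRC minimum, using the names, offsets, and datatype codes from the list above. The `has_xyzirc_layout` helper and its plain-dict field representation are illustrative, not part of this package.

```python
# Required XYZIRC minimum as (name, offset, datatype), per the layout above.
# Datatype codes follow sensor_msgs/PointField: 2 = UINT8, 4 = UINT16, 7 = FLOAT32.
REQUIRED_FIELDS = [
    ("x", 0, 7),
    ("y", 4, 7),
    ("z", 8, 7),
    ("intensity", 12, 2),
    ("ring", 13, 2),
    ("channel", 14, 4),
]


def has_xyzirc_layout(fields) -> bool:
    """Return True if `fields` (dicts with name/offset/datatype keys) contains
    at least the required XYZIRC fields at the required offsets and datatypes."""
    by_name = {f["name"]: f for f in fields}
    return all(
        name in by_name
        and by_name[name]["offset"] == offset
        and by_name[name]["datatype"] == datatype
        for name, offset, datatype in REQUIRED_FIELDS
    )


ok = [{"name": n, "offset": o, "datatype": d} for n, o, d in REQUIRED_FIELDS]
print(has_xyzirc_layout(ok))       # True
print(has_xyzirc_layout(ok[:-1]))  # False: 'channel' field missing
```

Extra fields in the input are fine; the check only verifies that the required minimum is present.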
Trained Models
The model was trained on the NuScenes dataset and is available in the Autoware artifacts.
References/External links
[1] X. Xu, L. Kong, H. Shuai and Q. Liu, "FRNet: Frustum-Range Networks for Scalable LiDAR Segmentation," IEEE Transactions on Image Processing, vol. 34, pp. 2173-2186, 2025, doi: 10.1109/TIP.2025.3550011.
Package Dependencies
System Dependencies
Dependant Packages
Launch files
- launch/lidar_frnet.launch.xml
- input/pointcloud [default: /sensing/lidar/top/pointcloud]
- output/pointcloud/segmentation [default: segmentation]
- output/pointcloud/visualization [default: visualization]
- output/pointcloud/filtered [default: filtered]
- data_path [default: $(env HOME)/autoware_data]
- model_name [default: frnet]
- model_path [default: $(var data_path)/lidar_frnet]
- model_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/$(var model_name).param.yaml]
- ml_package_param_path [default: $(var model_path)/ml_package_$(var model_name).param.yaml]
- diagnostics_param_path [default: $(find-pkg-share autoware_lidar_frnet)/config/diagnostics_frnet.param.yaml]
- build_only [default: false]
- use_pointcloud_container [default: false]
- pointcloud_container_name [default: pointcloud_container]
Messages
Services
Plugins