Repository Summary
| Field | Value |
|---|---|
| Description | Dockerized ROS2 stack for the WATonomous Autonomous Driving Software Pipeline |
| Checkout URI | https://github.com/watonomous/wato_monorepo.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2025-12-01 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Tags | control, computer-vision, deep-learning, motion-planning, perception, autonomous-driving, sensor-fusion, sensors-data-collection, world-modeling |
Packages
| Name | Version |
|---|---|
| action_bringup | 0.0.0 |
| behaviour_planning | 0.0.0 |
| global_planning | 0.0.0 |
| local_planning | 0.0.0 |
| model_predictive_control | 0.0.0 |
| infrastructure_deps | 0.0.0 |
| can_interfacing | 0.0.0 |
| eve_description | 0.0.0 |
| interfacing_bringup | 0.0.0 |
| sensor_interfacing | 0.0.0 |
| camera_object_detection | 0.0.0 |
| depth_estimation | 0.0.0 |
| patchworkpp | 0.1.0 |
| perception_bringup | 0.0.0 |
| perception_utils | 0.0.1 |
| bytetrack_cpp_vendor | 0.0.0 |
| track_viz_2d | 0.0.0 |
| tracking_2d | 0.0.0 |
| traffic_light_detection | 0.0.0 |
| carla_config | 0.0.0 |
| carla_sample_node | 0.0.0 |
| common_msgs | 0.0.0 |
| interfacing_msgs | 0.0.0 |
| camera_object_detection_msgs | 0.0.0 |
| lane_detection_msgs | 0.0.0 |
| radar_msgs | 0.0.0 |
| tracking_2d_msgs | 0.0.0 |
| sample_msgs | 0.0.0 |
| embedded_msgs | 0.0.0 |
| path_planning_msgs | 0.0.0 |
| world_modeling_msgs | 0.0.0 |
| wato_test | 0.1.0 |
| hd_map | 0.0.0 |
| lidar_localization | 0.0.0 |
| localization | 0.0.0 |
| occupancy | 0.0.0 |
| occupancy_segmentation | 0.0.0 |
| prediction | 0.0.1 |
| state_estimation | 0.0.0 |
| world_modeling_bringup | 0.0.0 |
README
WATonomous Monorepo (for EVE)
Dockerized monorepo for the WATonomous autonomous vehicle project (dubbed EVE).
Prerequisite Installation
These steps set up the monorepo on your own PC. We use Docker for reproducibility and ease of deployment.
Why Docker? So you don't need to install any development libraries on your bare-metal PC, saving headaches :3
- Our monorepo infrastructure supports Ubuntu >= 22.04, Windows (WSL/WSL2), and macOS, though some parts of this repo require specific hardware such as NVIDIA GPUs.
- Once inside Linux, install Docker Engine from the apt repository. If you are using WSL, install Docker outside of WSL; it will automatically set up Docker within WSL for you.
- You're all set! Information on running the monorepo with our infrastructure is given here.
Available Modules
- Infrastructure: starts the Foxglove bridge and the data streamer for rosbags.
- Interfacing: launches packages that connect directly to hardware, including the car's sensors and the car itself. see docs
- Perception: launches packages for perception. see docs
- World Modeling: launches packages for world modeling. see docs
- Action: launches packages for action. see docs
- Simulation: launches packages for the CARLA simulator. see docs
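As a rough illustration of how one of these modules' bringup packages can compose the packages listed in the table above, here is a minimal ROS 2 Python launch-file sketch. The package names (perception_bringup, camera_object_detection) come from the Packages table; the launch file name and directory layout are assumptions for illustration only, not taken from the repo.

```python
# Hypothetical sketch of a bringup launch file (e.g. for the Perception module).
# The included launch file name and its 'launch/' location are assumptions.
from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch.substitutions import PathJoinSubstitution
from launch_ros.substitutions import FindPackageShare


def generate_launch_description():
    # Include one perception package's own launch file by locating its
    # installed share directory at launch time.
    camera_detection = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(
            PathJoinSubstitution([
                FindPackageShare('camera_object_detection'),
                'launch',
                'camera_object_detection.launch.py',  # assumed file name
            ])
        )
    )
    return LaunchDescription([camera_detection])
```

In practice a bringup package would include one such entry per package it owns; see the module docs linked above for how the repo actually wires these together.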
Contribute
Information on contributing to the monorepo is given in DEVELOPING.md
CONTRIBUTING