Repository Summary

| | |
|---|---|
| Description | Project containing demonstrations using AMD's Ryzen AI and other technologies with ROS 2 |
| Checkout URI | https://github.com/open-navigation/opennav_amd_demonstrations.git |
| VCS Type | git |
| VCS Version | main |
| Last Updated | 2024-08-26 |
| Dev Status | UNKNOWN |
| Released | UNRELEASED |
| Tags | No category tags. |
Packages

| Name | Version |
|---|---|
| honeybee_bringup | 0.0.1 |
| honeybee_demos | 1.0.0 |
| honeybee_description | 0.0.1 |
| honeybee_gazebo | 1.0.0 |
| honeybee_nav2 | 1.0.0 |
| honeybee_watchdogs | 1.0.0 |
README
Open Navigation - AMD Ryzen AI Demonstrations
This project provides demonstrations and analysis using AMD’s powerful Ryzen AI CPU, GPU, NPU, and related acceleration technologies for embedded solutions with Nav2, ROS 2 Humble, and the open-source robotics community’s technologies. These demonstrations show complete, tuned reference applications for indoor 2D-based, urban 3D-based, and outdoor GPS-based navigation. They use AMD’s compute technologies and show that these are very well suited to robotics tasks and workloads, with plenty of compute time remaining for AI, business logic, application layers, and other computationally demanding tasks on top of advanced mobility and 3D perception.
⚠️ Need ROS 2, Nav2 help or support? Contact Open Navigation! ⚠️
These demonstrations orbit around the Honeybee reference platform, a Clearpath Robotics Jackal outfitted with:
- AMD Ryzen AI using a Minisforum UM790 Pro
- Ouster OS0-32
- Intel RealSense D435i or Orbbec Gemini 335
- MicroStrain GX-25
| Demonstration 1: Outdoor GPS Navigation | Demonstration 2: Urban 3D Navigation |
|---|---|
| (demo GIF) | (demo GIF) |

| Demonstration 3: Long-Duration Indoor Navigation | Glamour Shot |
|---|---|
| (demo GIF) | (photo) |

Click on the demo GIFs in the repository README to see the full videos on YouTube!
This project contains a typical layout for a ROS-based mobile robot:
- `honeybee_description` contains the robot’s description, meshes, and frame transformations (URDF)
- `honeybee_gazebo` contains the robot’s simulation in modern Gazebo with the full sensor suite
- `honeybee_bringup` contains the bringup scripts to launch the robot’s base and sensors on the physical hardware and/or simulation
- `honeybee_watchdogs` contains a set of watchdogs for hardware use, such as checking on the state of lifecycle activations and recording background datasets
- `honeybee_nav2` contains the navigation configurations for the various demonstrations
- `honeybee_demos` contains the demo scripts, launch files, and so forth to perform the applications. These would notionally be replaced by business logic in a refined, deployed application.
- `scripts` contains developer scripts used by Open Navigation to perform the demonstrations, which may also be useful to the community for getting started
Bonus: `docs` contains a number of developer guides for bootstrapping new computers for robots, network setup with ROS 2, setting up field experiment networks, visualizing data remotely, making software run on startup, and so on.
- First-Time Robot Computer Bootstrapping Guide
- Setup Robot Automatic Software Launch Guide
- Multi-Computer Time Synchronization Guide
- Offline Command, Control, and Visualization Guide
- First-Time Mapping And Localization Guide
- Honeybee: Networking Setup Guide
- Honeybee: Disable CPR Services Guide
- Make Sure To Check Out Nav2’s Great Guides Too!
See the `honeybee_demos` package for detailed demonstration descriptions, videos, and datasets.
Launching Robot, Nav2, and Demos
The robot can be launched using `ros2 launch honeybee_bringup robot.launch.py` with the `use_simulation` launch configuration option to specify whether to use the physical robot (default) or the simulated robot (`use_simulation:=True`). This will bring up the full robot system and/or simulation with sensors.
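For example (commands taken from the text above; a sourced ROS 2 Humble workspace is assumed):

```shell
# Bring up the physical robot (default):
ros2 launch honeybee_bringup robot.launch.py

# Or bring up the simulated robot in Gazebo instead:
ros2 launch honeybee_bringup robot.launch.py use_simulation:=True
```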
The navigation system can be launched using `ros2 launch honeybee_nav2 nav2.launch.py` with a number of launch options, such as the localization type to use (3D, 2D, GPS, Local Navigation), simulation status, parameters, SLAM, and so forth.
The demonstrations can be launched using their respective launch files in `honeybee_demos`. They utilize Nav2 configured for the particular application, the annotated autonomy scripts developed for the demonstrations, and appropriate watchdogs for data recording and system handling.
See launch files for a full set of launch configurations and options!
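A sketch of a full session follows. The `nav2.launch.py` command comes from the text; the demo launch-file name below is a hypothetical placeholder — consult the launch files in `honeybee_demos` for the real names and arguments:

```shell
# Start navigation (see the launch file for available options
# such as localization type, simulation status, and SLAM):
ros2 launch honeybee_nav2 nav2.launch.py

# Then start one of the demos; the launch-file name here is a placeholder:
# ros2 launch honeybee_demos <demo_launch_file>.launch.py
```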
Metrics and Analysis
While running the demonstrations, the resource utilization of the autonomy program, Nav2’s autonomous navigation, 3D lidar and camera perception, and localization averaged 10.85% of the available CPU time on the 16-core, 60W Ryzen AI-powered computer. That is amazingly powerful, leaving plenty of room for application compute tasks, AI, and additional sensor processing pipelines. The entirety of the autonomy, perception, and localization systems can be handled by only 2 AMD Zen 4 cores! This greatly unburdens systems, letting them run more workloads, faster and cheaper, opening up new application possibilities.
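As a quick sanity check on that claim (assuming the quoted 10.85% is an average across all 16 cores), the measured load corresponds to under two cores' worth of work:

```python
# Convert the measured average CPU utilization into "cores' worth" of work.
total_cores = 16          # 16-core Ryzen AI-powered computer (from the text)
avg_utilization = 0.1085  # 10.85% of available CPU time (from the text)

cores_used = total_cores * avg_utilization
print(f"{cores_used:.2f} cores")  # 1.74 cores, fitting comfortably on 2 Zen 4 cores
```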
While running outdoor GPS localization with the RPP or DWB controllers and non-persisting voxel grids, the average was 8.7% due to the lower compute demands. When using comparatively more expensive and modern algorithms like MPPI, temporal voxel grids, and live SLAM, it rose to only 12.8%.
It can also build all of Nav2 in only 10 min, 15 sec, compared to 23 min, 3 sec using an Intel i7-1365U on an X1 Carbon (Gen 11).
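Converting those build times into a speedup ratio (values taken from the text):

```python
# Nav2 full-build times reported above.
ryzen_seconds = 10 * 60 + 15  # 10 min 15 s on the Ryzen AI machine -> 615 s
intel_seconds = 23 * 60 + 3   # 23 min 3 s on the Intel i7-1365U   -> 1383 s

speedup = intel_seconds / ryzen_seconds
print(f"{speedup:.2f}x faster")  # about 2.25x faster
```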
Open Navigation is incredibly impressed with these results, achieved on a computer with power utilization equivalent to an Nvidia Jetson in max power mode or an Intel NUC. This is a powerful machine for under $500!
We’re working next to tap into the Ryzen AI computer’s built-in GPU and NPU for accelerating workloads and real-time AI (3D detection, semantic segmentation, GenAI, accelerating robotics algorithms). We’d highly recommend considering Ryzen AI-powered devices for future robotics products and projects for their power, price point, and AI and hardware acceleration integrations, especially if you require more powerful x86 cores!
Build
This is straightforward to build and work with. Clone this repository into your workspace:
```
mkdir -p amd_ws/src
cd amd_ws/src
git clone git@github.com:open-navigation/opennav_amd_demonstrations.git
```
Then, we need to pull in some dependencies that we cannot obtain from `rosdep`:
```
sudo apt install python3-vcstool  # if not already installed
vcs import . < opennav_amd_demonstrations/deps.repos
cd ouster-lidar/ouster-ros && git submodule update --init
cd ../../../

# For Orbbec Gemini 335 cameras, if used instead of the RealSense D435
sudo bash src/orbbec/OrbbecSDK_ROS2/orbbec_camera/scripts/install_udev_rules.sh
sudo udevadm control --reload-rules && sudo udevadm trigger
```
Next, we need to obtain our dependencies that are available from `rosdep`:
```
sudo rosdep init  # if not already initialized
rosdep update
rosdep install -r -y --from-paths src --ignore-src
```
Now, we can build using colcon:
```
colcon build
```