
Package Summary

Tags: No category tags.
Version: 0.0.0
License: Apache 2.0
Build type: AMENT_PYTHON
Use: RECOMMENDED

Repository Summary

Description: ROS 1 & 2 repos based on Mini Pupper legged robots from MangDang
Checkout URI: https://github.com/mangdangroboticsclub/mini_pupper_ros.git
VCS Type: git
VCS Version: ros2-dev
Last Updated: 2025-07-18
Dev Status: UNKNOWN
Released: UNRELEASED
Tags: No category tags.
Contributing: Help Wanted (-), Good First Issues (-), Pull Requests to Review (-)

Package Description

Mini Pupper tracking package

Additional Links

No additional links.

Maintainers

  • kishan

Authors

No additional authors.

Mini Pupper Tracking System

This ROS 2 package, developed independently during the 2025 Global Internship Programme at HKSTP, enables real-time person tracking for the Mini Pupper robot.

It combines visual detection, multi-object tracking, and IMU-based motion control to guide the robot’s head and orientation toward detected individuals.

Features

  • YOLO11n object detection on the live camera feed
  • Real-time tracking with unique temporary IDs per person (via motpy)
  • IMU-based PID control for yaw correction and smooth pitch tracking
  • Flask web interface for monitoring camera and tracking overlays
  • RViz visualisation for 3D spatial awareness of detections and camera field of view

Demo

Tracking Behaviour

Tracking Demo

The robot uses YOLO11n to detect people and converts these detections into movement commands via PID control. Yaw adjustments are smoothed using IMU feedback to maintain heading stability.


Web Interface (Flask)

Flask Demo

The Flask web interface shows:

  • The live camera feed
  • Detected individuals with bounding boxes
  • Assigned temporary UUIDs for short-term identification

This is useful for remote observation and debugging.
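
As context for how such an interface is typically built, below is a minimal, generic Flask MJPEG streaming sketch. It is not the package's flask_server.py; the route name, port, and direct camera capture are illustrative assumptions (in the real package the frames come from the tracking pipeline with overlays already drawn).

```python
# Minimal MJPEG streaming sketch (illustrative; not the package's flask_server.py).
import cv2
from flask import Flask, Response

app = Flask(__name__)

def generate_frames(camera_index=0):
    # Frames are read directly from a camera here for self-containedness;
    # detection overlays (bounding boxes, IDs) would be drawn before encoding.
    cap = cv2.VideoCapture(camera_index)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, buf = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield (b"--frame\r\n"
               b"Content-Type: image/jpeg\r\n\r\n" + buf.tobytes() + b"\r\n")

@app.route("/video_feed")
def video_feed():
    # multipart/x-mixed-replace lets the browser display a continuously updated image.
    return Response(generate_frames(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```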


RViz Visualisation

RViz Demo

RViz displays:

  • A pyramid cone representing the camera’s field of view
  • Red points in 3D space representing detected individuals, estimated from bounding-box area and field-of-view angles (a sketch of this estimation follows the list)
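
The red-point positions are not measured directly; the sketch below shows one way such an estimate can be formed from bounding-box area and field-of-view angles. The function, reference constants, and frame convention are illustrative assumptions, not the package's actual calibration (the defaults use the nominal Pi Camera v2 FOV of 62.2° × 48.8°).

```python
import math

def estimate_person_position(cx, cy, bbox_area, img_w, img_h,
                             hfov_deg=62.2, vfov_deg=48.8,
                             ref_area=30000.0, ref_dist=1.0):
    """Rough camera-frame position of a detected person.

    Distance is inferred from apparent bounding-box area relative to a
    calibrated reference (area scales roughly with 1/distance^2); bearing and
    elevation come from the pixel offset of the box centre scaled by the FOV.
    """
    distance = ref_dist * math.sqrt(ref_area / max(bbox_area, 1.0))
    yaw = math.radians(((cx / img_w) - 0.5) * hfov_deg)    # + means right of centre
    pitch = math.radians((0.5 - (cy / img_h)) * vfov_deg)  # + means above centre
    # ROS-style camera frame: x forward, y left, z up.
    x = distance * math.cos(pitch) * math.cos(yaw)
    y = -distance * math.cos(pitch) * math.sin(yaw)
    z = distance * math.sin(pitch)
    return x, y, z
```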

Note: This package is only supported with the Stanford Controller. The CHAMP Controller is not supported.

IMPORTANT: MAKE SURE YOU HAVE PLENTY OF SPACE ON YOUR TABLE IF THE ROBOT IS NOT ON THE FLOOR. MAKE SURE YOU ARE PREPARED FOR MOVEMENT!

USE CTRL-C ON THE HOST PC TO STOP MOVEMENT

Hardware Requirements

  • Camera: A Raspberry Pi Camera Module is required to run the tracking system.
    This package was developed using the v2 module; compatibility with earlier camera versions such as v1.3 has not been verified and may vary.

Note: You will need to set the camera parameter in mini_pupper_bringup/config/mini_pupper_2.yaml to true.
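
A minimal illustrative excerpt of that change, assuming a flat camera key at the top level of the file (the real file's surrounding structure may differ):

```yaml
# mini_pupper_bringup/config/mini_pupper_2.yaml (illustrative excerpt)
camera: true   # enable the Raspberry Pi camera so the tracking nodes receive images
```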

Package Architecture

The tracking system consists of four main components:

  • Detection & Tracking (main.py + tracking_node.py): YOLO11n-based person detection with multi-object tracking using motpy
  • Movement Control (movement_node.py): PID-based robot control for yaw and pitch tracking with configurable parameters (see the sketch after this list)
  • Visualisation (camera_visualisation_node.py): RViz camera FOV and 3D position markers for visualising the locations of people
  • Web Interface (flask_server.py): Real-time video streaming with detection overlays
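
As a hedged illustration of the control approach (not movement_node.py's actual implementation), the sketch below turns the tracked person's normalised horizontal offset from the image centre into a bounded yaw-rate command through a PID loop; all gains, limits, and names are placeholders.

```python
# Illustrative PID sketch for yaw correction; the real node also handles pitch
# and fuses IMU feedback, which is omitted here.
class PID:
    def __init__(self, kp, ki, kd, output_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.output_limit = output_limit
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to a safe command range.
        return max(-self.output_limit, min(self.output_limit, out))

# Example: person detected slightly right of centre -> small turn toward them.
yaw_pid = PID(kp=1.2, ki=0.0, kd=0.1, output_limit=0.5)   # placeholder gains
error = -0.12                                             # normalised offset in [-0.5, 0.5]
yaw_rate = yaw_pid.update(error, dt=0.05)                  # rad/s, e.g. for a cmd_vel message
```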

Dependencies

Install the required Python packages and ROS 2 components in your ROS 2 workspace:

```bash
# Downgrade numpy to a compatible version
pip install "numpy<2.0"

# Python dependencies
pip install flask onnxruntime motpy

# ROS2 dependencies
sudo apt install ros-humble-imu-filter-madgwick ros-humble-tf-transformations
```


1. Export the YOLO11n ONNX Model

To use YOLO11n with the tracking module, export the pretrained model to ONNX format using Ultralytics. We recommend doing this in a virtual environment to avoid conflicts with other packages.

Step 1: Set up a virtual environment

```bash
python3 -m venv yolo-env
```

File truncated at 100 lines; see the full file.
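
The remaining export steps are cut off above. As a hedged sketch of the export itself, assuming the standard Ultralytics Python API installed inside the virtual environment (pip install ultralytics):

```python
# Export the pretrained YOLO11n weights to ONNX for onnxruntime inference.
from ultralytics import YOLO

model = YOLO("yolo11n.pt")      # downloads the pretrained checkpoint if missing
model.export(format="onnx")     # writes yolo11n.onnx next to the weights file
```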

CHANGELOG
No CHANGELOG found.

Launch files

No launch files found.

Messages

No message files found.

Services

No service files found.

Plugins

No plugins found.
