Repository Summary
Field | Value |
---|---|
Description | SARL*: Deep RL based human-aware navigation for mobile robots in crowded indoor environments, implemented in ROS. |
Checkout URI | https://github.com/leekeyu/sarl_star.git |
VCS Type | git |
VCS Version | master |
Last Updated | 2022-04-21 |
Dev Status | UNKNOWN |
Released | UNRELEASED |
Tags | reinforcement-learning, dqn, crowd, pedestrian-detection, mobile-robot-navigation, human-aware |
Contributing | Help Wanted (-), Good First Issues (-), Pull Requests to Review (-) |
Packages
Name | Version |
---|---|
laser_filters | 1.8.6 |
amcl | 1.16.2 |
base_local_planner | 1.16.2 |
carrot_planner | 1.16.2 |
clear_costmap_recovery | 1.16.2 |
costmap_2d | 1.16.2 |
dwa_local_planner | 1.16.2 |
fake_localization | 1.16.2 |
global_planner | 1.16.2 |
map_server | 1.16.2 |
move_base | 1.16.2 |
move_slow_and_clear | 1.16.2 |
nav_core | 1.16.2 |
navfn | 1.16.2 |
navigation | 1.16.2 |
rotate_recovery | 1.16.2 |
voxel_grid | 1.16.2 |
face_detector | 1.1.2 |
leg_detector | 1.1.2 |
people | 1.1.2 |
people_msgs | 1.1.2 |
people_tracking_filter | 1.1.2 |
people_velocity_tracker | 1.1.2 |
catkinize_this | 0.2.4 |
easy_markers | 0.2.4 |
joy_listener | 0.2.4 |
kalman_filter | 0.2.4 |
manifest_cleaner | 0.2.4 |
rosbaglive | 0.2.4 |
roswiki_node | 0.2.4 |
wu_ros_tools | 0.2.4 |
rplidar_ros | 1.9.0 |
sarl_star_ros | 0.0.1 |
turtlebot_actions | 2.3.7 |
turtlebot_apps | 2.3.7 |
turtlebot_calibration | 2.3.7 |
turtlebot_follower | 2.3.7 |
turtlebot_navigation | 2.3.7 |
turtlebot_rapps | 2.3.7 |
README
sarl_star
ROS implementation of the paper SARL*: Deep Reinforcement Learning based Human-Aware Navigation for Mobile Robot in Indoor Environments, presented at ROBIO 2019. This mobile robot navigation framework is implemented on a TurtleBot 2 robot platform with a lidar sensor (Hokuyo or RPLIDAR), integrating SLAM, path planning, pedestrian detection and deep reinforcement learning algorithms.
A video demonstration can be found on YouTube or Bilibili.
Introduction
We present an advanced version of the Socially Attentive Reinforcement Learning (SARL) algorithm, namely SARL*, to achieve human-aware navigation in indoor environments. Deep RL has recently achieved great success in generating human-aware navigation policies. However, real-world implementations face two limitations: the learned navigation policies only handle the goal distances encountered during training, and the simplified training environment neglects obstacles other than humans. In this work, we improve the SARL algorithm by introducing a dynamic local goal setting mechanism and a map-based safe action space to tackle these problems, as sketched below.
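To make the idea of dynamic local goal setting concrete, here is a minimal Python sketch. The actual implementation lives in C++ inside the modified dwa_local_planner (see "Code Structure" below); the function and parameter names here are illustrative only, not the repository's API.

```python
import math

def select_local_goal(global_path, robot_xy, lookahead=2.0):
    """Pick the farthest waypoint on the global path that is still within
    `lookahead` metres of the robot. Feeding this nearby local goal to the
    RL policy keeps the goal distance within the range seen in training."""
    local_goal = None
    for waypoint in global_path:  # path is ordered from start to final goal
        dist = math.hypot(waypoint[0] - robot_xy[0], waypoint[1] - robot_xy[1])
        if dist <= lookahead:
            local_goal = waypoint  # keep the last (farthest) in-range point
    # If every waypoint is out of range, fall back to the nearest one.
    if local_goal is None:
        local_goal = min(
            global_path,
            key=lambda w: math.hypot(w[0] - robot_xy[0], w[1] - robot_xy[1]),
        )
    return local_goal
```

For example, `select_local_goal([(0, 0), (1, 0), (2, 0), (5, 0)], (0.2, 0.0))` returns `(2, 0)`: the farthest waypoint within the 2.0 m lookahead radius. As the robot moves, the local goal slides along the global path toward the final goal.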
Method Overview
System Setup
We use a Hokuyo UTM-30LX or RPLIDAR-A2 laser scanner as the sensor and a TurtleBot 2 as the robot platform.
Some Experiments
Code Structure
- Python-RVO2: Crowd simulator based on the Optimal Reciprocal Collision Avoidance (ORCA) algorithm (a usage sketch follows this list).
- laser_filters: ROS package to filter out unwanted laser scans. (optional)
- navigation: Modified ROS navigation stack providing AMCL localization, costmaps and basic path planners. Note that our dynamic local goal setting algorithm is implemented in navigation/dwa_local_planner/src/dwa_planner_ros.cpp, so if you have previously installed the original ROS navigation stack, we suggest uninstalling it and building the stack in this repository (following the steps in "Build & Install" below) to make sure our modifications take effect.
- people: ROS stack to detect and track humans using sensor information.
- rplidar_ros: ROS package to use ROS with rplidar sensor.
- sarl_star_ros: Core ROS package to run the SARL* navigation algorithm.
- turtlebot_apps: ROS stack to use ROS with TurtleBot.
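As referenced above, Python-RVO2 simulates crowds with ORCA. Here is a minimal usage sketch following the upstream Python-RVO2 bindings (the constructor arguments are time step, neighbor distance, max neighbors, time horizon, obstacle time horizon, agent radius and max speed; the scenario itself is invented for illustration):

```python
import rvo2

# timeStep, neighborDist, maxNeighbors, timeHorizon, timeHorizonObst, radius, maxSpeed
sim = rvo2.PyRVOSimulator(1 / 60.0, 1.5, 5, 1.5, 2.0, 0.4, 2.0)

# Two agents walking toward each other; ORCA makes them swerve reciprocally.
a0 = sim.addAgent((0.0, 0.0))
a1 = sim.addAgent((2.0, 0.0))
sim.setAgentPrefVelocity(a0, (1.0, 0.0))
sim.setAgentPrefVelocity(a1, (-1.0, 0.0))

for step in range(100):
    sim.doStep()

print(sim.getAgentPosition(a0), sim.getAgentPosition(a1))
```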
Build & Install
The code has been tested on Ubuntu 16.04 with Python 2.7.
- Install ROS Kinetic.
- Create and build a catkin workspace and clone the code into src/:

```bash
mkdir -p ~/sarl_ws/src
cd ~/sarl_ws/
catkin_make
source devel/setup.bash
cd src
git clone https://github.com/LeeKeyu/sarl_star.git
```
- Install other dependencies:

```bash
sudo apt-get install libbullet-dev
sudo apt-get install libsdl-image1.2-dev
sudo apt-get install libsdl-dev
sudo apt-get install ros-kinetic-bfl
sudo apt-get install ros-kinetic-tf2-sensor-msgs
sudo apt-get install ros-kinetic-turtlebot ros-kinetic-turtlebot-apps ros-kinetic-turtlebot-interactions ros-kinetic-turtlebot-simulator ros-kinetic-kobuki-ftdi ros-kinetic-ar-track-alvar-msgs
pip install empy
pip install configparser
```
- Install Python-RVO2:

```bash
cd sarl_star/Python-RVO2/
pip install -r requirements.txt
python setup.py build
python setup.py install
```
- Install CrowdNav (note that the CrowdNav code in this repository is modified from the original SARL implementation):

```bash
cd sarl_star/sarl_star_ros/CrowdNav/
pip install -e .
```
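A quick sanity check that the editable install landed on the Python path. The package names follow the upstream CrowdNav layout that this fork is based on; treat them as an assumption:

```python
# Both packages should import without error after `pip install -e .`
import crowd_sim   # simulation environment (upstream CrowdNav layout; assumption)
import crowd_nav   # policies and training code
print(crowd_nav.__file__)
```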
- Build the catkin workspace:

```bash
cd ~/sarl_ws/
catkin_make
source devel/setup.bash
```
Start the Navigation
- Before starting the navigation, make sure your PC is connected to the TurtleBot 2 and the lidar sensor (either Hokuyo or RPLIDAR).
- Bring up the TurtleBot:

```bash
roslaunch turtlebot_bringup minimal.launch
```
- Build a map of your environment using the gmapping package:

If you’re using Hokuyo, run

```bash
roslaunch turtlebot_navigation hokuyo_gmapping_movebase.launch
```

If you’re using RPlidar, run

```bash
roslaunch turtlebot_navigation rplidar_gmapping_movebase.launch
```
Then push or tele-operate the robot to explore the environment and build a map. You can watch the mapping progress in real time in RViz. To save the map, open a new terminal and run:
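(With the standard map_server package this step is typically `rosrun map_server map_saver -f <map_name>`, where the map name is a placeholder; the exact command appears in the truncated part of the README.)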
(README truncated at 100 lines; see the full file in the repository.)