Repository Summary

Description: Reinforcement learning using rlkit with a UR5 arm and Robotiq gripper on ROS (Robot Operating System)
Checkout URI: https://github.com/geonhee-lee/ur-reaching-reinforcement-learning.git
VCS Type: git
VCS Version: master
Last Updated: 2020-09-18
Dev Status: UNKNOWN
Released: UNRELEASED
Contributing: Help Wanted (-)
Good First Issues: (-)
Pull Requests to Review: (-)

README

Object tracking video using reinforcement learning

How to launch the original environment

First, launch the simulator:

  roslaunch ur_robotiq_gazebo gym.launch
  

Then run the training launch file:

  roslaunch ur_training default.launch
  

Conveyor Gazebo environment

First, launch Gazebo with the gym interface and the node that publishes block positions:

 roslaunch ur_robotiq_gazebo conveyer_gym.launch --screen
 

Then run the RL algorithm and unpause Gazebo:

  roslaunch ur_training default.launch
  

To see the latest block's pose:

rostopic echo /target_blocks_pose

To see all blocks' poses:

rostopic echo /blocks_poses 
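
The block poses published on these topics drive the reaching task's reward. As a rough illustration only (the actual reward shaping lives in the environment code; the function and argument names below are hypothetical), a distance-based reward could look like:

```python
import math

def reaching_reward(ee_pos, block_pos, dist_scale=1.0):
    """Hypothetical shaped reward: the closer the end-effector
    is to the block, the higher (less negative) the reward.

    ee_pos, block_pos: (x, y, z) tuples in the robot base frame.
    """
    dist = math.dist(ee_pos, block_pos)
    return -dist_scale * dist  # maximized (zero) at contact

# A gripper 10 cm from the block scores better than one 50 cm away:
near = reaching_reward((0.4, 0.0, 0.3), (0.4, 0.1, 0.3))
far = reaching_reward((0.4, 0.0, 0.3), (0.4, 0.5, 0.3))
```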

How to launch the REINFORCE algorithm

First, launch the simulator:

roslaunch ur_robotiq_gazebo conveyer_gym.launch controller:=vel --screen gui:=false
  

Then load the parameters and launch the Python file used for resets:

roslaunch ur_reaching reinforcement.launch
  

Then start the learning algorithm:

python reinforcement_main.py 
  

Finally, unpause the Gazebo physics:

 rosservice call /gazebo/unpause_physics "{}"
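
reinforcement_main.py trains with the vanilla policy-gradient (REINFORCE) method. The heart of that update, weighting each step's log-probability gradient by the discounted return that followed it, can be sketched independently of ROS (a minimal illustration, not the repository's actual implementation):

```python
import numpy as np

def discounted_returns(rewards, gamma=0.99):
    """G_t = r_t + gamma * G_{t+1}, computed backwards over one episode."""
    returns = np.zeros(len(rewards))
    running = 0.0
    for t in reversed(range(len(rewards))):
        running = rewards[t] + gamma * running
        returns[t] = running
    return returns

# REINFORCE scales each step's grad log pi(a_t|s_t) by G_t,
# often with a baseline subtracted to reduce variance.
rets = discounted_returns([1.0, 0.0, 1.0], gamma=0.9)
# rets[2] = 1.0, rets[1] = 0.9, rets[0] = 1.0 + 0.9 * 0.9 = 1.81
```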
 

How to launch the PPO+GAE algorithm

First, launch the simulator; this also loads the parameters and the Gazebo execution function:

roslaunch ur_robotiq_gazebo conveyer_gym.launch --screen gui:=false

Then start the learning algorithm:

  python ppo_gae_main.py
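
ppo_gae_main.py combines PPO's clipped surrogate objective with Generalized Advantage Estimation. GAE itself is a short backward recurrence over TD residuals, sketched here in plain numpy (illustrative only; gamma and lambda are the usual GAE hyperparameters, not necessarily the script's variable names):

```python
import numpy as np

def gae_advantages(rewards, values, gamma=0.99, lam=0.95):
    """A_t = delta_t + gamma * lam * A_{t+1},
    where delta_t = r_t + gamma * V(s_{t+1}) - V(s_t).

    `values` has one extra entry, V(s_T), to bootstrap the final step.
    """
    T = len(rewards)
    adv = np.zeros(T)
    running = 0.0
    for t in reversed(range(T)):
        delta = rewards[t] + gamma * values[t + 1] - values[t]
        running = delta + gamma * lam * running
        adv[t] = running
    return adv

# With gamma = lam = 1 this reduces to return-minus-value:
adv = gae_advantages([1.0, 1.0], [0.5, 0.5, 0.0], gamma=1.0, lam=1.0)
```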
 

How to use RLkit

RLkit is a reinforcement learning framework based on rllab.

Run the Gazebo simulator and gazebo_execution

First, launch the simulator; this also loads the parameters and the Gazebo execution function:

roslaunch ur_robotiq_gazebo conveyer_gym.launch --screen gui:=false

Training

Then start the RLkit-based SAC learning algorithm:

  python rlkit_sac_main.py
 

Then unpause the physics of the Gazebo simulator:

 rosservice call /gazebo/unpause_physics "{}"
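
rlkit_sac_main.py trains Soft Actor-Critic, which augments the ordinary Bellman backup with an entropy bonus. The soft Q-target at the core of SAC can be written as a one-liner (illustrative numpy sketch; rlkit's real implementation uses twin Q-networks in PyTorch):

```python
import numpy as np

def soft_q_target(reward, next_q, next_log_pi, alpha=0.2, gamma=0.99, done=False):
    """y = r + gamma * (1 - done) * (Q(s', a') - alpha * log pi(a'|s')).

    The -alpha * log pi term rewards keeping the policy stochastic
    (maximum-entropy RL); alpha trades exploration against reward.
    """
    mask = 0.0 if done else 1.0
    return reward + gamma * mask * (next_q - alpha * next_log_pi)

# The entropy bonus lifts the target above the plain Bellman value:
y = soft_q_target(reward=1.0, next_q=2.0, next_log_pi=-1.0, alpha=0.2, gamma=0.99)
```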
 

After training, you will find the pickled snapshot files in the rlkit/data folder.

You can easily view the results by selecting the generated folder for your training run.
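
Each snapshot is an ordinary pickle, so it can also be inspected directly from Python (a sketch; the exact filename, such as params.pkl, and the top-level keys depend on your rlkit version, and the run-folder path below is a placeholder):

```python
import pickle
from pathlib import Path

# Placeholder path under rlkit/data; substitute your run's folder name.
snapshot_path = Path("rlkit/data/<run-folder>/params.pkl")

def load_snapshot(path):
    """Load an rlkit snapshot and list its top-level keys
    (typically entries like the trained policy and the env)."""
    with open(path, "rb") as f:
        data = pickle.load(f)
    return sorted(data.keys())
```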
