edgarwelteKIT/robo_imitate

 
 


Imitation learning with ROS 2

Demos: object picking in a physics env · object picking in IsaacSim · object picking when the pose of the object is randomized

The RoboImitate project supports imitation learning through a Diffusion Policy, which learns behavior from expert demonstrations. (Stay tuned for our upcoming YouTube presentation for more details!)

This repository lets you install the stack, evaluate a pretrained model (in Isaac Sim or on a real Lite 6 robot arm), and train your own policy.

Important

You need to have Docker installed. If you have an Nvidia GPU, you additionally need to follow this guide. If you want to use simulation, you also need to install Isaac Sim.

Install Docker

sudo apt install git make curl
curl -sSL https://get.docker.com | sh && sudo usermod -aG docker $USER

Installation

  • Download our source code:
git clone https://github.com/MarijaGolubovic/robo_imitate.git && cd robo_imitate/docker
  • Build and enter the Docker container
make build-pc run exec
  • Build the ROS 2 packages
colcon build --symlink-install && source ./install/local_setup.bash

Model evaluation

Note

You can download a pretrained model and additional files from this link. Put the downloaded model and files inside the folder imitation/outputs/train. If the folder doesn't exist, create it.

  • Run Isaac Sim or the Lite 6 robot arm

    For Isaac Sim, the easiest way is to use the isaac-sim.selector.sh script and select the correct ROS 2 installation (e.g. internal). You can run it with:

     ~/isaacsim/isaac-sim.selector.sh

    Open the file xarm_bringup/issac/object_picking.usda and run the simulation.

Inside the Docker container, run:

  • Run the ROS 2 controller
ros2 launch xarm_bringup lite6_cartesian_launch.py rviz:=false sim:=true

To visualize the robot, set rviz to true. To use the real environment, set sim to false.
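As a sketch, the two launch arguments can be factored into variables so the same command covers both setups. The script below only prints the resulting command instead of running it (the values chosen are examples, not defaults of the launch file):

```shell
# Sketch of a launch configuration (example values, not launch-file defaults):
RVIZ=true   # true  -> visualize the robot in RViz
SIM=false   # false -> drive the real environment instead of simulation
echo "ros2 launch xarm_bringup lite6_cartesian_launch.py rviz:=${RVIZ} sim:=${SIM}"
```

Running this prints the launch command for a real robot with RViz visualization enabled.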

  • Open another terminal and attach to the container
make exec
  • Run the model inside the container
 cd src/robo_imitate && ./imitation/pick_screwdriver --sim

If you are running in the real environment, remove --sim from the command.

Model training

Inside the robo_imitate directory, run the following commands:

docker build --build-arg UID=$(id -u) -t imitation .
docker run -v $(pwd)/imitation/:/docker/app/imitation:Z --gpus all -it -e DATA_PATH=imitation/data/sim_env_data.parquet -e EPOCH=1000 imitation

Tip

If you want to run model training inside the Docker container, run the command below inside the folder src/robo_imitate. Before that, you need to build the container (see the Installation section for details).

python3 ./imitation/compute_stats --path imitation/data/sim_env_data.parquet  && python3 ./imitation/train_script --path imitation/data/sim_env_data.parquet  --epoch 1000
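The two-step pipeline above (compute dataset statistics, then train) can be sketched with the dataset path and epoch count factored out, mirroring the DATA_PATH and EPOCH environment variables used in the docker run command. The sketch only prints the commands it would run; the values are the examples from above, so substitute your own recording:

```shell
# Sketch: parametrized training pipeline (example values from this README).
DATA_PATH=imitation/data/sim_env_data.parquet
EPOCH=1000
echo "python3 ./imitation/compute_stats --path ${DATA_PATH}"
echo "python3 ./imitation/train_script --path ${DATA_PATH} --epoch ${EPOCH}"
```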

Acknowledgment

  • This project is done in collaboration with @SpesRobotics.
  • Thanks to the LeRobot team for open-sourcing the LeRobot project.
  • Thanks to Cheng Chi, Zhenjia Xu, and colleagues for open-sourcing Diffusion Policy.

About

End-to-end robot control based on generative diffusion model
