Documentation: https://ir-sim.readthedocs.io/en
IR-SIM is an open-source, lightweight robot simulator based on Python, designed for robotics navigation, control, and learning. This simulator provides a simple and user-friendly framework for simulating robots, sensors, and environments, facilitating the development and testing of robotics algorithms with minimal hardware requirements.
- Simulate a wide range of robot platforms with diverse kinematics, sensors, and behaviors.
- Quickly configure and customize simulation scenarios using straightforward YAML files. No complex coding required.
- Visualize simulation outcomes in real time for immediate feedback and analysis, using a simple matplotlib-based visualizer.
- Support collision detection and behavior control for each object in the simulation.
- Python: >= 3.9
- Install this package from PyPI:
pip install ir-sim
The base installation does not include the dependencies for every feature of the simulator. To install the optional extras, use the following pip commands:
# install dependencies for keyboard control
pip install ir-sim[keyboard]
# install all optional dependencies
pip install ir-sim[all]
- Or for development, you may install from source:
git clone https://github.com/hanruihua/ir-sim.git
cd ir-sim
pip install -e .
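Note that the distribution name on PyPI (ir-sim) differs from the importable module name (irsim), so a quick import check confirms that either installation method worked:

python -c "import irsim"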
import irsim

env = irsim.make('robot_world.yaml')  # initialize the environment with the configuration file

for i in range(300):          # run the simulation for 300 steps
    env.step()                # update the environment
    env.render()              # render the environment
    if env.done(): break      # check if the simulation is done

env.end()                     # close the environment
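The loop above lets the robot's built-in dash behavior drive it toward the goal. To drive the robot from your own controller or policy instead, the environment step can also be given an explicit velocity command. The sketch below assumes env.step() accepts an action of the form [linear velocity, angular velocity] for the 'diff' kinematics; check the API documentation for the exact argument shape your version expects.

import numpy as np
import irsim

env = irsim.make('robot_world.yaml')

for i in range(300):
    action = np.array([[0.5], [0.1]])  # assumed format: [[linear], [angular]] for 'diff' kinematics
    env.step(action)                   # apply the external command instead of the built-in behavior
    env.render()
    if env.done():
        break

env.end()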
YAML Configuration: robot_world.yaml
world:
  height: 10        # the height of the world
  width: 10         # the width of the world
  step_time: 0.1    # 0.1 s per simulation step (10 Hz)
  sample_time: 0.1  # 0.1 s (10 Hz) for rendering and data extraction
  offset: [0, 0]    # the offset of the world on x and y

robot:
  kinematics: {name: 'diff'}            # omni, diff, acker
  shape: {name: 'circle', radius: 0.2}  # radius
  state: [1, 1, 0]                      # x, y, theta
  goal: [9, 9, 0]                       # x, y, theta
  behavior: {name: 'dash'}              # move directly toward the goal
  color: 'g'                            # green
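Scenarios usually also contain obstacles. As a rough sketch, an obstacle section can be added alongside world and robot in the same style; the exact keys (for example length and width for a rectangle) are assumptions here, so consult the YAML documentation and the examples in irsim/usage:

obstacle:
  - shape: {name: 'circle', radius: 1.0}                  # static circular obstacle
    state: [5, 5, 0]                                      # x, y, theta
  - shape: {name: 'rectangle', length: 1.5, width: 1.0}   # rectangle keys are an assumption
    state: [6, 3, 0]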
Advanced usage examples are listed in the irsim/usage directory.
Currently, the simulator supports the following features. Further features, such as additional sensors, behaviors, and robot models, are under development.
| Category   | Features |
| ---------- | -------- |
| Kinematics | Differential drive, omnidirectional, and Ackermann steering mobile robots |
| Sensors    | 2D LiDAR, FOV detector |
| Geometries | Circle, rectangle, polygon, linestring, binary grid map |
| Behaviors  | dash (move directly toward the goal), rvo (move toward the goal using Reciprocal Velocity Obstacle behavior) |
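For example, switching the robot in the configuration above from the default dash behavior to rvo (most useful when several robots share the world) should only require changing the behavior entry; this is a sketch based on the table above, so verify the accepted behavior names in the documentation:

behavior: {name: 'rvo'}   # reciprocal velocity obstacle avoidance instead of 'dash'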
Academic Projects:
- rl-rvo-nav (RAL & ICRA2023)
- RDA_planner (RAL & IROS2023)
- NeuPAN (T-RO 2025)
Deep Reinforcement Learning Projects:
This project is under development. I appreciate and welcome all contributions. Just open an issue or a pull request. Here are some simple ways to start contributing:
- Enhance the website documentation, such as the API and tutorials.
- Add new sensors, behaviors, robot models, and functional interfaces.
- Add new usage examples and benchmarks.
- Report bugs and issues.