Figure: Neural Inertial Odometry from Lie Events. We train Neural Displacement Priors (NDPs), which enable low-drift inertial odometry, with Lie Events derived from acceleration and angular velocity measurements.
This repository contains code that implements the event generation on Lie groups and its application to the TLIO and RoNIN inertial odometry pipelines.
Please cite the following paper if you use the code:
Royina Karegoudra Jayanth, Yinshuang Xu, Evangelos Chatzipantazis, Kostas Daniilidis, Daniel Gehrig, "Neural Inertial Odometry from Lie Events", RSS, 2025.
@misc{jayanth2025neuralinertialodometrylie,
title={Neural Inertial Odometry from Lie Events},
author={Royina Karegoudra Jayanth and Yinshuang Xu and Evangelos Chatzipantazis and Kostas Daniilidis and Daniel Gehrig},
year={2025},
eprint={2505.09780},
archivePrefix={arXiv},
primaryClass={cs.RO},
url={https://arxiv.org/abs/2505.09780},
}
Clone the repo
git clone https://github.com/RoyinaJayanth/NIO_Lie_Events.git
or
git clone git@github.com:RoyinaJayanth/NIO_Lie_Events.git --recursive
All dependencies can be installed using conda via
conda env create -f environment.yaml
Then the virtual environment is accessible with:
conda activate nio_lie_ev
The following commands should be run from within this environment.
We provide the required math utilities (General/utils_tlio) and the event generation functions (General/Event_generation.py), which can be called directly to generate Lie events from IMU data.
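As an illustration of the underlying idea only (a minimal 1-D sketch, not the repository's API; the function name and signature are hypothetical), the snippet below emits an event with polarity ±1 whenever a signal has moved by more than a contrast threshold since the last event. The Lie events used in this work apply the same level-crossing principle to IMU trajectories lifted to groups such as $SO(3)$ and $SE(3)$.

```python
import numpy as np

def generate_events_1d(t, x, contrast_threshold=0.01):
    """Illustrative level-crossing event generation on a scalar signal.

    Emits (timestamp, polarity) each time the signal has moved by more than
    `contrast_threshold` since the last event; Lie events measure this
    'distance moved' on a Lie group instead of the real line.
    """
    events = []
    x_ref = x[0]  # signal level at the last emitted event
    for ti, xi in zip(t, x):
        delta = xi - x_ref
        while abs(delta) >= contrast_threshold:
            polarity = 1.0 if delta > 0 else -1.0
            x_ref += polarity * contrast_threshold  # advance the reference level
            events.append((ti, polarity))
            delta = xi - x_ref
    return np.asarray(events)

# Example: events from a sinusoid sampled at 200 Hz
t = np.arange(0.0, 1.0, 1.0 / 200.0)
x = np.sin(2.0 * np.pi * t)
print(generate_events_1d(t, x, contrast_threshold=0.05)[:5])
```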
Original work: website
We apply our framework to this filter-based inertial odometry architecture.
- TLIO Dataset: Download Here or with the following command (with the conda env activated) at the root of the repo:
gdown 14YKW7PsozjHo_EdxivKvumsQB7JMw1eg
mkdir -p local_data/ # or ln -s /path/to/data_drive/ local_data/
unzip golden-new-format-cc-by-nc-with-imus-v1.5.zip -d local_data/
rm golden-new-format-cc-by-nc-with-imus-v1.5.zip
Direct download link: https://drive.google.com/file/d/14YKW7PsozjHo_EdxivKvumsQB7JMw1eg/view?usp=share_link
The dataset tree structure looks like this. For the examples below, assume the data has been extracted under the root directory local_data/tlio_golden:
local_data/tlio_golden
├── 1008221029329889
│ ├── calibration.json
│ ├── imu0_resampled_description.json
│ ├── imu0_resampled.npy
│ └── imu_samples_0.csv
├── 1014753008676428
│ ├── calibration.json
│ ├── imu0_resampled_description.json
│ ├── imu0_resampled.npy
│ └── imu_samples_0.csv
...
├── test_list.txt
├── train_list.txt
└── val_list.txt
imu0_resampled.npy contains the calibrated IMU data and the processed VIO ground-truth data.
imu0_resampled_description.json describes what the different columns in the data are.
The test sequences contain imu_samples_0.csv, which is the raw IMU data for running the filter.
calibration.json contains the offline calibration.
(A minimal sketch of loading these files is given after the download list below.)
Attitude filter data is not included with the release.
- Aria Dataset: Download Here
- TLIO + events ($SE(3)$ + polarity): Download Here
- TLIO + events ($SE(3)$): Download Here
- TLIO + events ($SO(3)$ and $R(3)$): Download Here
- TLIO + events ($R(3)$): Download Here
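For reference, a minimal sketch of inspecting one TLIO sequence described above (the sequence ID is taken from the tree; the column layout is read from the description file rather than assumed here):

```python
import json
import numpy as np

seq_dir = "local_data/tlio_golden/1008221029329889"

# Calibrated IMU data and processed VIO ground truth; the meaning of each
# column is documented in the accompanying description file.
data = np.load(f"{seq_dir}/imu0_resampled.npy")
with open(f"{seq_dir}/imu0_resampled_description.json") as f:
    description = json.load(f)
print(data.shape)   # (num_samples, num_columns)
print(description)  # names/units of the columns

# Offline calibration for this sequence.
with open(f"{seq_dir}/calibration.json") as f:
    calibration = json.load(f)
print(calibration)
```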
(Optional) Download the dataset and the pre-trained models.
To train and test the network, run TLIO-master/src/main_net.py with the --mode argument. Please refer to the source code for the full list of command-line arguments.
python3 TLIO-master/src/main_net.py --mode train \
--root_dir local_data/tlio_golden \
--out_dir models/tlio_ev_se3p \
--batch_size 1024 \
--epochs 50 \
--arch resnet \
--input_dim 12 \
--do_bias_shift \
--perturb_gravity \
--yaw_augmentation \
--event_based_input \
--contrast_threshold 0.01 \
--add_vel_perturb \
--add_vel_perturb_range 0.5 \
--se3_events \
--polarity_input \
--polarity_noise_range 0.5
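For context on the augmentation flags, --yaw_augmentation rotates each gravity-aligned training window and its displacement target by a random heading angle (as in TLIO-style training). The sketch below illustrates this kind of augmentation; it is illustrative only, and the function and argument names are hypothetical, not the repository's implementation.

```python
import numpy as np

def random_yaw_augment(acc, gyr, target_disp, rng=None):
    """Rotate a gravity-aligned IMU window and its displacement target by a
    random yaw angle about the vertical (z) axis.

    acc, gyr: (N, 3) accelerometer / gyroscope samples; target_disp: (3,).
    """
    rng = np.random.default_rng() if rng is None else rng
    yaw = rng.uniform(0.0, 2.0 * np.pi)
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])  # rotation about the z (gravity) axis
    return acc @ R.T, gyr @ R.T, R @ target_disp
```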
For testing, run the following:
python3 TLIO-master/src/main_net.py --mode test \
--root_dir local_data/tlio_golden \
--out_dir models/tlio_ev_se3p/nn_test \
--model_path models/tlio_ev_se3p/checkpoint_best.pt \
--arch resnet \
--test_list test_list.txt \
--input_dim 12 \
--event_based_input \
--contrast_threshold 0.01 \
--se3_events \
--polarity_input
To run the EKF, run TLIO-master/src/main_filter.py. Please refer to the source code for the full list of command-line arguments.
python3 TLIO-master/src/main_filter.py --root_dir local_data/tlio_golden \
--out_dir models/tlio_ev_se3p/ekf_test \
--model_path models/tlio_ev_se3p/checkpoint_best.pt \
--model_param_path models/tlio_ev_se3p/parameters.json \
--event_based_input \
--polarity_input \
--contrast_threshold 0.01 \
--se3_events
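In this pipeline the filter propagates an IMU-driven state and fuses the network's displacement estimates as measurement updates. For illustration only, a generic, simplified EKF measurement-update step is sketched below (vector-space state; this is not the actual filter implemented in main_filter.py):

```python
import numpy as np

def ekf_measurement_update(x, P, z, h_x, H, R):
    """Generic EKF measurement update (simplified).

    x: state mean, P: state covariance, z: measurement (e.g. a 3-D
    displacement estimate from the network), h_x: predicted measurement
    h(x), H: measurement Jacobian, R: measurement noise covariance
    (e.g. derived from the network's predicted uncertainty).
    """
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_upd = x + K @ (z - h_x)                 # corrected state mean
    P_upd = (np.eye(P.shape[0]) - K @ H) @ P  # corrected covariance
    return x_upd, P_upd
```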
To generate the NN metrics, run src/analysis/NN_output_metrics.py:
python3 TLIO/src/analysis/NN_output_metrics.py --files models/tlio_ev_se3p/nn_test \
--output_file_name tlio_nn_results
and for EKF metrics, run src/analysis/EKF_output_metrics.py:
python3 TLIO/src/analysis/EKF_output_metrics.py --files models/tlio_ev_se3p/ekf_test \
--ground_truth_path local_data/tlio_golden \
--output_file_name tlio_ekf_results
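As a rough reference for what such scripts summarize, the absolute translation error (ATE) is commonly reported as the RMSE between estimated and ground-truth positions. A simplified, illustrative definition (not necessarily the exact metric computed by the analysis scripts, which may additionally align the trajectories):

```python
import numpy as np

def ate_rmse(p_est, p_gt):
    """Absolute translation error as position RMSE.

    p_est, p_gt: (N, 3) estimated and ground-truth positions sampled at the
    same timestamps.
    """
    err = p_est - p_gt
    return float(np.sqrt(np.mean(np.sum(err ** 2, axis=1))))
```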
Original work: website
We show the benefits of our framework applied to this end-to-end neural network architecture.
- RoNIN Dataset: Download Here or here. Note: only 50% of the dataset has been made publicly available; in this work we train on this publicly available 50% of the data.
- RIDI Dataset: Download Here
- OxIOD Dataset: Download Here
- RoNIN + Events ($SE(3)$ + polarity): Download Here
Download the dataset and the pre-trained models.
To train/test the RoNIN ResNet model, run source/ronin_resnet.py with the --mode argument. Please refer to the source code for the full list of command-line arguments.
python3 ronin_resnet.py --mode train \
--train_list lists/list_train.txt \
--val_list lists/list_val.txt \
--step_size 10 \
--root_dir ronin_data/all_data \
--cache_path ev_data/ronin_ev_se3p \
--out_dir ev_output/ronin_ev_se3p \
--arch resnet18 \
--contrast_threshold 0.1 \
--add_vel_perturb_range 0.5 \
--polarity_noise_range 0.5 \
--batch_size 128 \
--epochs 120
and for testing, run:
python3 ronin_resnet.py --mode test \
--test_list lists/list_test_unseen.txt \
--root_dir ronin_data/all_data \
--out_dir ev_output/ronin_ev_se3p/test_ronin_unseen \
--arch resnet18 \
--model_path ev_output/ronin_ev_se3p/checkpoints/checkpoint_last.pt \
--contrast_threshold 0.1