
QQ-SLAM: Query Quantized Neural SLAM

Project page | Paper

AAAI 2025

Installation

Please follow the instructions below to clone the repository and install the dependencies.

git clone https://github.com/MachinePerceptionLab/QQ-SLAM.git
cd QQ-SLAM

Install the environment

# Create conda environment
conda create -n qqslam python=3.7
conda activate qqslam

# Install PyTorch first (please check your CUDA version)
pip install torch==1.10.1+cu113 torchvision==0.11.2+cu113 torchaudio==0.10.1 -f https://download.pytorch.org/whl/cu113/torch_stable.html

# Install the remaining dependencies via pip (note that pytorch3d and tinycudann take ~10 min to build)
pip install -r requirements.txt

# Build extension (marching cubes from neuralRGBD)
cd external/NumpyMarchingCubes
python setup.py install
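
As an optional sanity check before moving on, you can confirm that PyTorch was installed with CUDA support (this check is an extra step, not part of the original setup):

# Optional sanity check: PyTorch version and CUDA availability
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"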

For tinycudann, if your GPU nodes cannot access the network, you can also build it from source as below:

# Build tinycudann 
git clone --recursive https://github.com/nvlabs/tiny-cuda-nn

# Try this version if you cannot use the latest version of tinycudann
#git reset --hard 91ee479d275d322a65726435040fc20b56b9c991

cd tiny-cuda-nn/bindings/torch
python setup.py install
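
After building, an optional check (again an extra step, not part of the original instructions) is to confirm that the PyTorch bindings import correctly:

# Optional: verify the tinycudann bindings are importable
python -c "import tinycudann as tcnn; print(tcnn.__name__)"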

Dataset

Replica

Download the sequences of the Replica dataset generated by the authors of iMAP into the ./data/Replica folder.

bash scripts/download_replica.sh # Released by authors of NICE-SLAM

ScanNet

Please follow the procedure on the ScanNet website, and extract color and depth frames from the .sens files using the extraction code provided by ScanNet, as sketched below.
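
A rough sketch of the extraction step, using the SensReader script from the official ScanNet repository (the scene ID and output path below are placeholders, and the flag names should be double-checked against the version of reader.py you download):

# Example: export color/depth frames, poses, and intrinsics from a .sens file
python reader.py --filename scene0000_00.sens --output_path ./data/ScanNet/scene0000_00 \
    --export_depth_images --export_color_images --export_poses --export_intrinsics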

Synthetic RGB-D dataset

Download the sequences of the synthetic RGB-D dataset generated by the authors of neuralRGBD into the ./data/neural_rgbd_data folder. We exclude the scenes with NaN poses generated by BundleFusion.

bash scripts/download_rgbd.sh 

TUM RGB-D

Download 3 sequences of the TUM RGB-D dataset into the ./data/TUM folder.

bash scripts/download_tum.sh 

Run

You can run QQ-SLAM using the command below:

python qqslam.py --config './configs/{Dataset}/{scene}.yaml'
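
For example, for a Replica sequence (the exact YAML file names under ./configs are assumed here; use whichever config files ship with the repo):

# Example: run on Replica room0 (config name assumed)
python qqslam.py --config './configs/Replica/room0.yaml'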

Evaluation

We employ a slightly different evaluation strategy to measure the quality of the reconstruction; you can find the code here. Note that if you want to follow the evaluation protocol of NICE-SLAM, please refer to our supplementary material for the detailed parameter settings.

Acknowledgement

We adapt code from several awesome repositories, including NICE-SLAM, NeuralRGBD, tiny-cuda-nn, and Co-SLAM.

Citation

If you find our code or paper useful, please cite:

@inproceedings{jiang2025query,
  title={Query Quantized Neural SLAM},
  author={Jiang, Sijia and Hua, Jing and Han, Zhizhong},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={39},
  number={4},
  pages={4057--4065},
  year={2025}
}
