📌 PINGS: Gaussian Splatting Meets Distance Fields within a Point-Based Implicit Neural Map [RSS' 25]

TL;DR: PINGS is a LiDAR-visual SLAM system that unifies distance fields and radiance fields within a neural point map.

(Teaser figure)

Demo videos:

  • SLAM example 1 (pings_demo_01.mp4)

  • SLAM example 2 (pings_demo_02.mp4)

  • Render from map (pings_demo_03.mp4)

🚧 Repo under construction 🚧

Abstract

Robots require high-fidelity reconstructions of their environment for effective operation. Such scene representations should be both geometrically accurate and photorealistic to support downstream tasks. While this can be achieved by building distance fields from range sensors and radiance fields from cameras, consistently mapping both fields incrementally, at scale, and with high quality remains challenging. In this paper, we propose a novel map representation that unifies a continuous signed distance field and a Gaussian splatting radiance field within an elastic and compact point-based implicit neural map. By enforcing geometric consistency between these fields, we achieve mutual improvements by exploiting both modalities. We devise a LiDAR-visual SLAM system called PINGS using the proposed map representation and evaluate it on several challenging large-scale datasets. Experimental results demonstrate that PINGS can incrementally build globally consistent distance and radiance fields encoded with a compact set of neural points. Compared to the state-of-the-art methods, PINGS achieves superior photometric and geometric rendering at novel views by leveraging the constraints from the distance field. Furthermore, by utilizing dense photometric cues and multi-view consistency from the radiance field, PINGS produces more accurate distance fields, leading to improved odometry estimation and mesh reconstruction.

Installation

Platform requirement

  • Ubuntu OS (tested on 20.04)

  • GPU with > 8 GB of memory recommended

0. Clone the repository

git clone git@github.com:PRBonn/PINGS.git --recursive
cd PINGS

1. Set up conda environment

conda create --name pings python=3.10
conda activate pings

2. Install the key requirement PyTorch

conda install pytorch==2.5.1 torchvision==0.20.1 torchaudio==2.5.1 pytorch-cuda=11.8 -c pytorch -c nvidia

The command depends on your CUDA version (check it with nvcc --version). See the official PyTorch installation instructions for the command matching your setup.
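As a small illustration of the version check above, the snippet below parses the release string printed by nvcc --version; it is not part of PINGS, just a helper sketch for picking the matching pytorch-cuda version.

```python
# Illustrative helper (not part of the PINGS codebase): extract the CUDA
# release from `nvcc --version` output so you can pick the matching
# pytorch-cuda version in the conda install command.
import re

def cuda_release(nvcc_output: str) -> str:
    """Return the CUDA release (e.g. '11.8') found in `nvcc --version` output."""
    m = re.search(r"release (\d+\.\d+)", nvcc_output)
    if m is None:
        raise ValueError("could not find a CUDA release string")
    return m.group(1)

# Typical last line of `nvcc --version` output:
sample = "Cuda compilation tools, release 11.8, V11.8.89"
print(cuda_release(sample))  # -> 11.8
```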

3. Install other dependencies including the submodules

pip3 install -r requirements.txt

Docker

[TODO]

Data Preparation

A dataset with both RGB and depth observations (from LiDAR or a depth camera), together with extrinsic and intrinsic calibration parameters, is required. Note that the input images are assumed to be already undistorted.
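As an illustration, such a dataset might be organized as below; the exact layout and filenames depend on the dataloader you use, so treat this only as an assumed example:

```
my_dataset/
├── rgb/            # undistorted RGB images (000000.png, ...)
├── lidar/          # LiDAR scans or depth frames (000000.bin, ...)
├── calib.txt       # camera intrinsics and camera-LiDAR extrinsics
└── poses.txt       # optional reference poses for evaluation
```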

If only depth measurements are available, you can still run PINGS with the --gs-off flag, in which case PINGS degenerates to PIN-SLAM.

To extract individual observations from a ROS bag, you may use the ROS bag converter tool.

For your own dataset, you may need to implement a new dataloader class and put it in the dataset/dataloaders folder. Check the existing dataloaders there for an example.
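A minimal sketch of what such a dataloader might look like is shown below; the class and method names are hypothetical and do not reflect the actual PINGS interface, so check the existing dataloaders in dataset/dataloaders for the real API.

```python
# Hypothetical dataloader sketch -- names and return format are illustrative,
# not the actual PINGS interface.
from pathlib import Path

class MyDataset:
    """Pairs undistorted RGB images with depth/LiDAR scans by frame index."""

    def __init__(self, data_dir: str):
        root = Path(data_dir)
        # Assumed layout: rgb/*.png and lidar/*.bin, sorted names align frames.
        self.rgb_files = sorted(root.glob("rgb/*.png"))
        self.depth_files = sorted(root.glob("lidar/*.bin"))
        assert len(self.rgb_files) == len(self.depth_files), "frame count mismatch"

    def __len__(self) -> int:
        return len(self.rgb_files)

    def __getitem__(self, idx: int) -> dict:
        # Return the paired observation for frame `idx`; a real loader would
        # decode the image and point cloud and attach calibration here.
        return {"rgb": self.rgb_files[idx], "depth": self.depth_files[idx]}
```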

Run PINGS

To see how to run PINGS and which datasets are already supported, use:

python3 pings.py -h 

To see how to inspect a map built by PINGS, use:

python3 inspect_pings.py -h 

Citation


If you use PINGS for any academic work, please cite our original paper.

@inproceedings{pan2025rss,
  author    = {Y. Pan and X. Zhong and L. Jin and L. Wiesmann and M. Popovi\'c and J. Behley and C. Stachniss},
  title     = {{PINGS: Gaussian Splatting Meets Distance Fields within a Point-Based Implicit Neural Map}},
  booktitle = {Robotics: Science and Systems (RSS)},
  year      = {2025},
  codeurl   = {https://github.com/PRBonn/PINGS},
  url       = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/pan2025rss.pdf}
}

Acknowledgement


PINGS is built on top of our previous work PIN-SLAM, and we thank the authors of the open-source works it builds upon.
