Simultaneous localization and mapping (SLAM) is pivotal in robotics, with photorealistic scene reconstruction emerging as a key challenge. To address this, we introduce Computational Alignment for Real-Time Gaussian Splatting SLAM (CaRtGS), a novel method that enhances both the efficiency and the quality of photorealistic scene reconstruction in real-time environments. Leveraging 3D Gaussian Splatting (3DGS), CaRtGS achieves superior rendering quality and processing speed, both of which are crucial for photorealistic scene reconstruction. Our approach tackles computational misalignment in Gaussian Splatting SLAM (GS-SLAM) through an adaptive strategy that optimizes training, addresses long-tail optimization, and refines densification. Experiments on the Replica and TUM-RGBD datasets demonstrate the effectiveness of CaRtGS in achieving high-fidelity rendering with fewer Gaussian primitives. This work propels SLAM towards real-time, photorealistic dense rendering, significantly advancing photorealistic scene representation. For the benefit of the research community, we release the code on our project website: https://dapengfeng.github.io/cartgs.
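The exact losses and schedules are defined in the paper; as a rough illustration of how an opacity regularizer can be attached to a splatting photometric loss to encourage "high-fidelity rendering with fewer Gaussian primitives", here is a minimal PyTorch-style sketch. The function name, the `lambda_reg` weight, and the L1 form of the penalty are assumptions for illustration, not the paper's definition.

```python
import torch

def splatting_loss(rendered, target, opacities, lambda_reg=0.01):
    """Photometric L1 loss plus an L1 penalty on Gaussian opacities.

    The opacity term (hypothetical lambda_reg weight) pushes redundant
    Gaussians toward low opacity so they can later be pruned, which is one
    way to reduce the number of primitives while keeping rendering quality.
    """
    photometric = torch.abs(rendered - target).mean()  # image reconstruction term
    opacity_reg = opacities.abs().mean()               # sparsity-inducing regularizer
    return photometric + lambda_reg * opacity_reg
```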
Overview of CaRtGS. We adopt ORB-SLAM3 as the front-end tracker, responsible for localization and geometric mapping. In the photorealistic rendering back-end, we apply the proposed adaptive computational alignment strategy to enhance the 3DGS optimization process, comprising fast splat backward, adaptive optimization, and opacity regularization. A hedged sketch of how such a back-end could consume keyframes from the tracker is given below.
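The overview splits work between a tracking front-end and a rendering back-end. As a rough illustration of how a back-end might allocate extra optimization effort to keyframes whose rendering loss is still high (one reading of the long-tail issue), here is a hedged Python sketch. The class name, the `render`/`optimizer_step` interface of the Gaussian map, and all thresholds are hypothetical and not taken from the CaRtGS code.

```python
import queue
import torch

class RenderingBackend:
    """Hypothetical back-end loop: consumes keyframes (pose + RGB) from the
    front-end tracker and spends extra optimization iterations on keyframes
    whose rendering loss remains high (long-tail cases)."""

    def __init__(self, gaussian_model, base_iters=10, extra_iters=20, loss_thresh=0.05):
        self.model = gaussian_model          # 3DGS map with an assumed interface
        self.keyframes = []                  # dicts of pose, image, last loss
        self.inbox = queue.Queue()           # filled by the tracker thread
        self.base_iters = base_iters
        self.extra_iters = extra_iters
        self.loss_thresh = loss_thresh

    def step(self):
        # 1) Ingest any new keyframes produced by the front-end tracker.
        while not self.inbox.empty():
            pose, rgb = self.inbox.get()
            self.keyframes.append({"pose": pose, "rgb": rgb, "loss": float("inf")})

        # 2) Adaptive allocation: keyframes with high residual loss receive
        #    more optimization iterations than well-converged ones.
        for kf in self.keyframes:
            iters = self.base_iters + (self.extra_iters
                                       if kf["loss"] > self.loss_thresh else 0)
            for _ in range(iters):
                rendered = self.model.render(kf["pose"])       # assumed API
                loss = torch.abs(rendered - kf["rgb"]).mean()  # photometric L1
                loss.backward()
                self.model.optimizer_step()                    # assumed API
            kf["loss"] = float(loss.detach())                  # record residual
```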
@misc{feng2024CaRtGS,
  title={CaRtGS: Computational Alignment for Real-Time Gaussian Splatting SLAM},
  author={Dapeng Feng and Zhiqiang Chen and Yizhen Yin and Shipeng Zhong and Yuhua Qi and Hongbo Chen},
  year={2024},
  eprint={2410.00486},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2410.00486},
}