WildGS-SLAM: Monocular Gaussian Splatting SLAM in Dynamic Environments
April 4, 2025
Authors: Jianhao Zheng, Zihan Zhu, Valentin Bieri, Marc Pollefeys, Songyou Peng, Iro Armeni
cs.AI
Abstract
We present WildGS-SLAM, a robust and efficient monocular RGB SLAM system
designed to handle dynamic environments by leveraging uncertainty-aware
geometric mapping. Unlike traditional SLAM systems, which assume static scenes,
our approach integrates depth and uncertainty information to enhance tracking,
mapping, and rendering performance in the presence of moving objects. We
introduce an uncertainty map, predicted by a shallow multi-layer perceptron and
DINOv2 features, to guide dynamic object removal during both tracking and
mapping. This uncertainty map enhances dense bundle adjustment and Gaussian map
optimization, improving reconstruction accuracy. Our system is evaluated on
multiple datasets and demonstrates artifact-free view synthesis. Results
showcase WildGS-SLAM's superior performance in dynamic environments compared to
state-of-the-art methods.
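
The abstract describes the uncertainty-aware optimization only at a high level. The following is a minimal, hypothetical PyTorch sketch of the general idea (not the authors' implementation): a shallow MLP maps per-pixel DINOv2 features to an uncertainty value, which then down-weights per-pixel residuals in a rendering loss used for map optimization. The feature dimension (384, as in DINOv2 ViT-S), layer sizes, Softplus activation, and the exact form of the weighted loss are all assumptions, not details taken from the paper.

```python
# Hypothetical sketch: shallow-MLP uncertainty prediction from DINOv2 features
# and an uncertainty-weighted photometric loss. All design details are assumed.
import torch
import torch.nn as nn


class UncertaintyMLP(nn.Module):
    """Shallow MLP predicting a per-pixel uncertainty from DINOv2 features."""

    def __init__(self, feat_dim: int = 384, hidden_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, 1),
            nn.Softplus(),  # keep the predicted uncertainty strictly positive
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: (H*W, feat_dim) per-pixel DINOv2 features for one frame
        return self.net(features).squeeze(-1)  # (H*W,)


def uncertainty_weighted_loss(rendered: torch.Tensor,
                              observed: torch.Tensor,
                              uncertainty: torch.Tensor) -> torch.Tensor:
    """Divide per-pixel residuals by the predicted uncertainty so that pixels
    on moving objects (high uncertainty) contribute little to optimization.
    The log term keeps the network from inflating uncertainty everywhere."""
    residual = (rendered - observed).abs().mean(dim=-1)  # (H*W,)
    return (residual / uncertainty + torch.log(uncertainty)).mean()


if __name__ == "__main__":
    # Random tensors stand in for one frame's features, render, and RGB input.
    h, w, feat_dim = 48, 64, 384
    dino_feats = torch.randn(h * w, feat_dim)  # placeholder DINOv2 features
    rendered = torch.rand(h * w, 3)            # placeholder splatting render
    observed = torch.rand(h * w, 3)            # placeholder input frame
    model = UncertaintyMLP(feat_dim)
    sigma = model(dino_feats)
    loss = uncertainty_weighted_loss(rendered, observed, sigma)
    loss.backward()
```

The log-uncertainty regularizer in the sketch follows the standard heteroscedastic-uncertainty formulation; without it, the network could trivially shrink the loss by predicting large uncertainty for every pixel.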