Simultaneous localization and mapping (SLAM) technology now has photorealistic mapping capabilities thanks to the real-time, high-fidelity rendering of 3D Gaussian splatting (3DGS). However, because they represent scenes statically, current 3DGS-based SLAM methods suffer from pose drift and fail to reconstruct accurate maps in dynamic environments. To address this problem, we present D4DGS-SLAM, the first SLAM method based on a 4DGS map representation for dynamic environments. By incorporating the temporal dimension into the scene representation, D4DGS-SLAM enables high-quality reconstruction of dynamic scenes. Using the dynamics-aware InfoModule, we obtain the dynamics, visibility, and reliability of scene points, and accordingly filter stable static points for tracking. When optimizing Gaussian points, we apply different isotropic regularization terms to Gaussians with different dynamic characteristics. Experimental results on real-world dynamic scene datasets demonstrate that our method outperforms state-of-the-art approaches in both camera pose tracking and map quality.
D4DGS-SLAM takes an RGB-D image sequence as input. We first extract anchors from each incoming RGB frame that are well distributed globally and reflect the image features. These anchors, along with the RGB images, are fed into the LEAP module to obtain the dynamics and reliability of the anchors. This allows us to distinguish dynamic points from stable static points. The static anchors are used for tracking to estimate the camera pose. The estimated poses and the dynamics information are then passed to the mapping module. We use 4DGS for mapping and select different scale penalty factors based on the dynamics and reliability of the covered points to control the distribution of the Gaussians in space-time. The techniques used and our SLAM system are introduced below.
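The static-anchor selection step above can be sketched as a simple filter. This is a minimal illustration, not the released implementation: the function name, the threshold values, and the assumption that dynamics/reliability are scalar scores in [0, 1] are all ours. Anchors are kept for pose tracking only if they are both sufficiently static and sufficiently reliable.

```python
import numpy as np

def filter_static_anchors(anchors, dynamics, reliability,
                          dyn_thresh=0.5, rel_thresh=0.5):
    """Hypothetical filter: keep anchors that are static AND reliable.

    anchors:     (N, 2) pixel coordinates of anchor points.
    dynamics:    (N,) dynamics scores in [0, 1] (higher = more dynamic).
    reliability: (N,) reliability scores in [0, 1] (higher = more reliable).
    Returns the filtered anchors and the boolean keep-mask.
    """
    mask = (dynamics < dyn_thresh) & (reliability > rel_thresh)
    return anchors[mask], mask
```

Only the surviving anchors would feed the pose estimator, so moving objects and poorly tracked points do not corrupt the camera trajectory.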
 
@article{sun2025d4dgsslam,
  title={Embracing Dynamics: Dynamics-aware 4D Gaussian Splatting SLAM},
  author={Sun, Zhicong and Lo, Jacqueline and Hu, Jinxing},
  journal={arXiv preprint arXiv:2504.04844},
  year={2025}
}
We acknowledge the funding from the Research Grants Council of Hong Kong, the Hong Kong Polytechnic University, and the Shenzhen Institutes of Advanced Technology of the Chinese Academy of Sciences. This work also builds on many open-source codebases, and we thank the authors of the related work: 3D Gaussian Splatting, Differential Gaussian Rasterization, 4D Gaussian Splatting, Gaussian Splatting SLAM, LEAP-VO, and SplaTAM.