Stereo-GS: Online 3D Gaussian Splatting Mapping Using Stereo Depth Estimation

Research output: Contribution to journal › Article › peer-review

Abstract

We present Stereo-GS, a real-time system for online 3D Gaussian Splatting (3DGS) that reconstructs photorealistic 3D scenes from streaming stereo pairs. Unlike prior offline 3DGS methods that require dense multi-view input or precomputed depth, Stereo-GS estimates metrically accurate depth maps directly from rectified stereo geometry, enabling progressive, globally consistent reconstruction. The frontend combines a stereo implementation of DROID-SLAM for robust tracking and keyframe selection with FoundationStereo, a generalizable stereo network that needs no scene-specific fine-tuning. A two-stage filtering pipeline improves depth reliability by removing outliers using a variance-based refinement filter followed by a multi-view consistency check. In the backend, we selectively initialize new Gaussians in under-represented regions flagged by low PSNR during rendering and continuously optimize them via differentiable rendering. To maintain global coherence with minimal overhead, we apply a lightweight rigid alignment after periodic bundle adjustment. On EuRoC and TartanAir, Stereo-GS attains state-of-the-art performance, improving average PSNR by 0.22 dB and 2.45 dB over the best baseline, respectively. Together with superior visual quality, these results show that Stereo-GS delivers high-fidelity, geometrically accurate 3D reconstructions suitable for real-time robotics, navigation, and immersive AR/VR applications.
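The abstract's two-stage depth filtering (a variance-based refinement filter followed by a multi-view consistency check) could be sketched roughly as below. This is a simplified illustration, not the published implementation: the function names, thresholds, and the choice of per-pixel variance and relative reprojection error as the test statistics are all assumptions.

```python
import numpy as np

def variance_filter(depth_samples, var_thresh=0.01):
    """Stage 1 (illustrative): reject pixels whose repeated depth
    estimates for one view are unstable.

    depth_samples: (N, H, W) array of N depth estimates per pixel.
    Returns the per-pixel mean depth, with high-variance pixels set to NaN.
    """
    mean = depth_samples.mean(axis=0)
    var = depth_samples.var(axis=0)
    mean[var > var_thresh] = np.nan  # unreliable: variance too high
    return mean

def multiview_consistency(depth_a, depth_b_reprojected, rel_thresh=0.05):
    """Stage 2 (illustrative): keep only pixels where a second view,
    reprojected into the first, agrees on depth within a relative tolerance.
    """
    rel_err = np.abs(depth_a - depth_b_reprojected) / np.maximum(depth_a, 1e-6)
    out = depth_a.copy()
    out[rel_err > rel_thresh] = np.nan  # inconsistent across views
    return out
```

In this sketch, pixels failing either test are marked NaN so that no Gaussian is initialized from them; the real system may instead use confidence weights or different statistics.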

Original language: English
Article number: 4436
Journal: Electronics (Switzerland)
Volume: 14
Issue number: 22
DOIs
State: Published - Nov 2025

Keywords

  • 3D Gaussian Splatting
  • neural rendering
  • online mapping
  • SLAM
  • stereo depth estimation
