Poster

ODGS: 3D Scene Reconstruction from Omnidirectional Images with 3D Gaussian Splattings

Suyoung Lee · Jaeyoung Chung · Jaeyoo Huh · Kyoung Mu Lee

East Exhibit Hall A-C #1205
Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

With the advantage of capturing a whole scene in a single image, omnidirectional (360-degree) images are increasingly used in 3D reconstruction applications. Whereas traditional methods can reconstruct only sparse structural 3D information from multiple omnidirectional images, works based on neural radiance fields demonstrate high-quality 3D reconstruction but suffer from long training and rendering times. 3D Gaussian splatting has recently attracted attention for its fast optimization and real-time rendering; however, directly applying its rasterizer to omnidirectional images produces severe distortion because the two projections have different optical properties. We present ODGS, which includes a new rasterization scheme suited to omnidirectional image projection. For each Gaussian, the transformation matrix from camera space to omnidirectional space is computed and multiplied with the covariance matrix, yielding the proper distribution on the omnidirectional image plane. Our method achieves better reconstruction quality than NeRF-based methods with more than 100 times faster optimization and rendering. Experimental results on egocentric and roaming datasets demonstrate that ODGS restores fine details well even when reconstructing large 3D scenes.
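The covariance transformation described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' released implementation: it assumes an equirectangular image of size width x height, a camera-space convention where longitude is atan2(x, z) and latitude is atan2(y, sqrt(x^2 + z^2)), and it linearizes the projection with its Jacobian in the spirit of EWA splatting. The exact coordinate conventions and rasterizer details in ODGS may differ.

```python
import numpy as np

def equirect_jacobian(p, width, height):
    """Jacobian of an (assumed) equirectangular projection at camera-space point p.

    Maps a 3D point (x, y, z) to longitude/latitude, scales to pixel
    coordinates (u, v), and returns the 2x3 Jacobian used to push a
    Gaussian's 3D covariance onto the omnidirectional image plane.
    """
    x, y, z = p
    r2 = x * x + y * y + z * z          # squared distance from the camera
    rho2 = x * x + z * z                # squared distance in the x-z plane
    rho = np.sqrt(rho2)

    # Longitude theta = atan2(x, z); latitude phi = atan2(y, rho).
    dtheta = np.array([z / rho2, 0.0, -x / rho2])
    dphi = np.array([-x * y / (r2 * rho), rho / r2, -z * y / (r2 * rho)])

    # Scale angle derivatives to pixels:
    # u = (theta / (2*pi) + 0.5) * width, v = (phi / pi + 0.5) * height.
    J = np.vstack([(width / (2 * np.pi)) * dtheta,
                   (height / np.pi) * dphi])
    return J  # shape (2, 3)

def project_covariance(cov3d, p, width, height):
    """Project a Gaussian's 3x3 camera-space covariance to a 2x2 image-space covariance."""
    J = equirect_jacobian(p, width, height)
    return J @ cov3d @ J.T
```

In a full splatting pipeline, the resulting 2x2 covariance would determine each Gaussian's elliptical footprint during rasterization; the position-dependent Jacobian is what accounts for the stretching near the poles that a perspective rasterizer would get wrong.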
