

Poster

Aligning Gradient and Hessian for Neural Signed Distance Function

Ruian Wang · Zixiong Wang · Yunxiao Zhang · Shuangmin Chen · Shiqing Xin · Changhe Tu · Wenping Wang

Great Hall & Hall B1+B2 (level 1) #220

Abstract:

The Signed Distance Function (SDF), as an implicit surface representation, provides an essential tool for reconstructing a watertight surface from an unorganized point cloud. The SDF has a fundamental relationship with the principles of surface vector calculus: given a smooth surface, there exists a thin-shell space in which the SDF is differentiable everywhere and its gradient is an eigenvector of its Hessian matrix, with a corresponding eigenvalue of zero. In this paper, we introduce a method to learn the SDF directly from point clouds in the absence of normals. Our motivation is grounded in a fundamental observation: aligning the gradient and the Hessian of the SDF provides a more effective mechanism for governing gradient directions, which in turn ensures that changes in the gradient more accurately reflect the true underlying variations in shape. Extensive experimental results demonstrate the method's ability to accurately recover the underlying shape while effectively suppressing ghost geometry.
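
The eigenvector property stated above follows from the eikonal identity that a signed distance function satisfies wherever it is differentiable: since \|\nabla f\| = 1 in the thin shell, differentiating gives

\[
\nabla\left(\|\nabla f\|^{2}\right) = 2\,H_f\,\nabla f = 0
\quad\Longrightarrow\quad
H_f\,\nabla f = 0 \cdot \nabla f,
\]

so the gradient \nabla f is an eigenvector of the Hessian H_f with eigenvalue zero.

The sketch below illustrates, under stated assumptions, how such a gradient-Hessian alignment term could be computed with automatic differentiation; it is not the authors' exact loss. The network, the point sampling, and the normalization are placeholders chosen for illustration.

```python
import torch

def gradient_hessian_alignment(f, x):
    """Illustrative alignment penalty (assumed form, not the paper's exact loss).

    f: callable mapping (N, 3) points to (N,) predicted SDF values.
    x: (N, 3) query points sampled in the thin shell around the surface.
    """
    x = x.clone().requires_grad_(True)
    y = f(x)
    # First-order gradient of the predicted SDF at each point.
    g = torch.autograd.grad(y.sum(), x, create_graph=True)[0]
    # Hessian-vector product H g via a second backward pass:
    # the gradient of <g, v> with v = g.detach() equals H @ v.
    Hg = torch.autograd.grad((g * g.detach()).sum(), x, create_graph=True)[0]
    # For a true SDF, H g = 0 in the thin shell; penalize its magnitude,
    # normalized by the gradient norm for scale invariance.
    return (Hg.norm(dim=-1) / (g.norm(dim=-1) + 1e-8)).mean()

# Hypothetical usage with a small MLP standing in for the neural SDF.
net = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.Softplus(beta=100), torch.nn.Linear(64, 1)
)
pts = torch.rand(1024, 3) * 2 - 1
loss = gradient_hessian_alignment(lambda p: net(p).squeeze(-1), pts)
```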
