
Acoustic Neural 3D Reconstruction Under Pose Drift

Master's Thesis, Tech. Report CMU-RI-TR-25-14, May 2025

Abstract

This thesis considers the problem of optimizing neural implicit surfaces for 3D reconstruction from acoustic images collected with drifting sensor poses. Imaging sonar cannot disambiguate elevation angles because of its image formation model. Meanwhile, the state estimate of an autonomous underwater vehicle (AUV) typically drifts along the x, y, and yaw axes, which lack absolute references (unlike depth, roll, and pitch, which are constrained by hydrostatic pressure and the direction of gravity). However, the accuracy of current state-of-the-art 3D acoustic modeling algorithms depends heavily on accurate pose estimation: even small errors in sensor pose can lead to severe reconstruction artifacts or structural misalignment. This thesis proposes an algorithm that jointly optimizes the neural scene representation and the drifted sonar poses. It does so by parameterizing the 6DoF poses as learnable parameters and backpropagating gradients through the neural renderer and the implicit representation. The proposed algorithm is validated on both real and simulated datasets, and it produces high-fidelity 3D reconstructions even under significant pose drift. This thesis also discusses future research directions that may be important for 3D reconstruction using imaging sonar.
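
At a high level, the joint optimization can be pictured as treating each frame's pose correction as a learnable variable alongside the network weights, so that rendering error backpropagates into both. Below is a minimal sketch of that idea in PyTorch; the small MLP, axis-angle pose parameterization, placeholder measurements, and simple intensity loss are illustrative assumptions, not the sonar renderer or loss used in the thesis.

# A minimal sketch of joint pose-and-scene optimization (illustrative only).
import torch
import torch.nn as nn


def axis_angle_to_matrix(r: torch.Tensor) -> torch.Tensor:
    """Rodrigues' formula: axis-angle 3-vector -> 3x3 rotation matrix."""
    theta = r.norm().clamp(min=1e-8)
    k = r / theta
    K = torch.zeros(3, 3, device=r.device)
    K[0, 1], K[0, 2] = -k[2], k[1]
    K[1, 0], K[1, 2] = k[2], -k[0]
    K[2, 0], K[2, 1] = -k[1], k[0]
    I = torch.eye(3, device=r.device)
    return I + torch.sin(theta) * K + (1 - torch.cos(theta)) * (K @ K)


class SceneMLP(nn.Module):
    """Toy implicit scene representation: maps 3D points to a scalar field."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


# Learnable 6DoF pose corrections (axis-angle + translation), one per sonar frame.
num_frames = 10
pose_params = nn.Parameter(torch.zeros(num_frames, 6))  # initialized at identity

scene = SceneMLP()
optimizer = torch.optim.Adam(
    [{"params": scene.parameters(), "lr": 1e-3},
     {"params": [pose_params], "lr": 1e-4}]
)

# Placeholder "measurements": sampled points in each sensor frame and target intensities.
points_per_frame = torch.randn(num_frames, 128, 3)
target_intensity = torch.rand(num_frames, 128, 1)

for step in range(200):
    optimizer.zero_grad()
    loss = 0.0
    for i in range(num_frames):
        R = axis_angle_to_matrix(pose_params[i, :3])
        t = pose_params[i, 3:]
        # Transform points from the (drifted) sensor frame into the world frame.
        world_pts = points_per_frame[i] @ R.T + t
        # Stand-in for a differentiable renderer: query the implicit field and
        # compare against observed intensities.
        pred = scene(world_pts)
        loss = loss + ((pred - target_intensity[i]) ** 2).mean()
    loss.backward()  # gradients reach both the scene MLP and the pose parameters
    optimizer.step()

In practice the rendering loss would come from a differentiable acoustic image formation model rather than the pointwise comparison shown here; the point of the sketch is only that a single backward pass updates the scene and the poses together.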

BibTeX

@mastersthesis{Lin-2025-146464,
author = {Tianxiang Lin},
title = {Acoustic Neural 3D Reconstruction Under Pose Drift},
year = {2025},
month = {May},
school = {Carnegie Mellon University},
address = {Pittsburgh, PA},
number = {CMU-RI-TR-25-14},
keywords = {Marine Robotics, Mapping, Field Robotics},
}