Autonomous Underwater Vehicles (AUVs) have achieved significant progress in the past decades. However, it remains very difficult to bring underwater vehicles from well-controlled pool environments to applications in open environments, due to challenges in both state estimation and planning.

We target two questions that are fundamentally critical for AUVs performing complex tasks:
- (State Estimation) How can we develop an uncertainty quantification approach for underwater navigation and manipulation that is accurate yet computationally efficient?
- (Planning) How can we generate useful strategies in a computationally efficient way to fulfill long-horizon missions with performance guarantees?
State Estimation
Uncertainty-aware 3D Gaussian Splatting (3DGS) based Pose Estimation for Underwater Vehicles
Our project develops an uncertainty-aware 3D Gaussian Splatting (3DGS) framework for pose estimation, with potential extension to broader perception tasks, for underwater robots, addressing the challenge of reliable localization in GPS-denied, visually degraded environments. By modeling distributions over key 3DGS parameters, the framework captures pose-wise epistemic uncertainty, revealing where the robot’s localization is most and least confident. Using rendered images from sampled parameter sets, the system estimates both the mean pose and its uncertainty, enabling the AUV to identify high-uncertainty regions for revisitation or exploration. Simulation experiments in two different underwater scenes demonstrate that the quantified uncertainty strongly correlates with training data density and scene coverage, enabling future integration into active perception and planning for efficient data acquisition.
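The sampling-based uncertainty estimate described above can be sketched in a few lines: each sampled 3DGS parameter set yields its own pose estimate (in practice via a render-and-register step; here replaced by hard-coded toy values), and the spread across samples gives the epistemic covariance. The function name, the 6-DOF pose layout, and the numbers below are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def pose_uncertainty_from_samples(pose_estimates):
    """Given per-sample pose estimates (K x 6: xyz + roll/pitch/yaw),
    return the mean pose, the 6x6 epistemic covariance, and a
    per-dimension standard deviation as an easy-to-threshold summary."""
    P = np.asarray(pose_estimates, dtype=float)
    mean_pose = P.mean(axis=0)
    cov = np.cov(P, rowvar=False)        # spread across sampled 3DGS models
    per_dim_std = np.sqrt(np.diag(cov))
    return mean_pose, cov, per_dim_std

# Toy demo: three sampled parameter sets produce slightly different poses.
samples = [
    [1.00, 2.00, -3.00, 0.01, 0.00, 0.10],
    [1.05, 1.95, -3.02, 0.02, 0.01, 0.12],
    [0.98, 2.03, -2.99, 0.00, 0.00, 0.09],
]
mean_pose, cov, std = pose_uncertainty_from_samples(samples)
```

Large entries in `std` flag pose dimensions the AUV is unsure about, which is the signal a planner could use to trigger revisitation of a region.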

Publications:
- Yu Zhou and Mengxue Hou, “Uncertainty-aware 3D Gaussian Splatting (3DGS) Underwater Robotic Perception”, in OCEANS 2025 Great Lakes, accepted.
- Yu Zhou, Ruochu Yang and Mengxue Hou, “Flow Field Estimation in Underwater Vehicle Navigation using Sporadic Image Observations”, in 22nd International Conference on Ubiquitous Robots, College Station, TX, USA, Jun. 2025, pp. 579-584, DOI: 10.1109/UR65550.2025.11078124
Planning
Uncertainty-aware Planning for Underwater Vehicle Active Perception
Our project aims to develop an active perception approach that jointly optimizes motion planning and sensing decisions under uncertainty. We formulate the problem as a POMDP in belief space, balancing energy consumption, localization accuracy, and collision avoidance. The approach incorporates GPS surfacing and USBL acoustic positioning with distinct energy costs. To enable tractable computation, we discretize the belief and action spaces, applying A* search with admissible heuristics. Our method adaptively selects sensing actions based on environmental context and uncertainty levels, rather than fixed strategies. Simulation results demonstrate that our algorithm achieves safer and more efficient trajectories compared to baseline approaches, successfully navigating complex environments while maintaining safety constraints and intelligently adjusting sensing strategies.
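The discretized belief-space search above can be sketched as follows: the belief is reduced to a grid cell plus a small integer uncertainty level, motion grows uncertainty while sensing actions (a USBL fix or surfacing for GPS) shrink it at distinct energy costs, and A* with an admissible distance heuristic finds the cheapest plan. The grid, the uncertainty cap standing in for the safety constraint, the cost values, and all names are illustrative assumptions, not the paper's actual formulation.

```python
import heapq

# Hypothetical discretization: a belief state is (x, y, u), where u is an
# uncertainty level in {0..U_MAX}; moving grows u, sensing shrinks it.
U_MAX = 3
MOVE_COST, USBL_COST, GPS_COST = 1.0, 2.0, 5.0

def neighbors(state, grid):
    x, y, u = state
    # Motion: allowed only while uncertainty is below the cap
    # (a stand-in for the collision-safety constraint).
    if u < U_MAX:
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < len(grid) and 0 <= ny < len(grid[0]) and grid[nx][ny] == 0:
                yield (nx, ny, u + 1), MOVE_COST
    # Sensing: trade energy for localization accuracy.
    if u > 0:
        yield (x, y, max(0, u - 2)), USBL_COST  # acoustic positioning fix
        yield (x, y, 0), GPS_COST               # surface for a GPS fix

def plan(start, goal_xy, grid):
    # Remaining Manhattan distance times the cheapest per-step cost never
    # overestimates the true cost, so the heuristic is admissible.
    h = lambda s: (abs(s[0] - goal_xy[0]) + abs(s[1] - goal_xy[1])) * MOVE_COST
    frontier = [(h(start), 0.0, start, [start])]
    best_g = {}
    while frontier:
        f, g, s, path = heapq.heappop(frontier)
        if (s[0], s[1]) == goal_xy:
            return path, g
        if s in best_g and best_g[s] <= g:
            continue
        best_g[s] = g
        for ns, c in neighbors(s, grid):
            heapq.heappush(frontier, (g + c + h(ns), g + c, ns, path + [ns]))
    return None, float("inf")

grid = [[0] * 4 for _ in range(4)]          # 4x4 obstacle-free world
path, cost = plan((0, 0, 0), (3, 3), grid)  # start certain, reach (3, 3)
```

Even in this toy instance the planner exhibits the adaptive behavior described above: six moves are needed but the uncertainty cap allows at most three in a row, so the optimal plan interleaves cheap USBL fixes rather than paying for a GPS surfacing.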

Publications:
- Zongyao Liu, Ruochu Yang and Mengxue Hou, “An Active Perception Strategy for Underwater Vehicle Navigation”, in OCEANS 2025 Great Lakes, accepted.
Evaluation: Underwater Robot Development

We are also developing an AUV platform to support our research on state estimation and planning. The Miniature Underwater Robot (MUR) serves as a flexible testbed for control algorithm validation, sensor integration, and water-environment monitoring. Its compact form factor and modular payload bay allow deployment in lakes, rivers, and coastal areas, supporting missions ranging from navigation and path-planning experiments to scientific tasks such as water quality sampling. By building MUR from the ground up, we retain full control over hardware and software, enabling rapid iteration and seamless integration of advanced autonomy and uncertainty-aware planning algorithms. Recently, we conducted a water sampling mission in Crampton Lake at the University of Notre Dame Environmental Research Center (UNDERC).