Tracking the movement of a camera or capture rig while simultaneously mapping the environment is a crucial requirement for digital reconstruction and for overlaying assets in Augmented and Virtual Reality. This problem is formally known as dense Simultaneous Localization and Mapping (SLAM). Current SLAM methods, however, are often limited by the underlying representation they use for the scene.
What Will Be Covered
Nikhil introduces SplaTAM, an approach that leverages explicit volumetric representations, i.e., 3D Gaussians, to enable high-fidelity reconstruction from a single unposed RGB-D camera, surpassing the capabilities of existing methods.
SplaTAM employs a simple online tracking and mapping system tailored to the Gaussian representation, together with a rendered silhouette mask that elegantly captures the presence of scene density. This combination yields several benefits over prior volumetric representations, including fast rendering and optimization, the ability to determine whether areas have been previously mapped, and structured map expansion by adding more Gaussians.
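To make the silhouette-guided map expansion concrete, the following is a minimal Python sketch of the idea, not the actual SplaTAM implementation: pixels where the rendered silhouette (accumulated Gaussian opacity) is low are treated as unmapped, and their depth measurements are back-projected into centers of new Gaussians. The function names, camera intrinsics, and threshold here are illustrative assumptions.

```python
import numpy as np

# Hypothetical pinhole intrinsics for illustration only; real values come from the RGB-D sensor.
FX, FY, CX, CY = 600.0, 600.0, 320.0, 240.0

def find_unmapped_pixels(silhouette, threshold=0.5):
    """Pixels with low accumulated Gaussian opacity have not been mapped yet."""
    return silhouette < threshold

def backproject_new_gaussians(depth, color, unmapped_mask, pose_c2w):
    """Lift unmapped depth pixels into world space as centers of new Gaussians."""
    v, u = np.nonzero(unmapped_mask)
    z = depth[v, u]
    keep = z > 0                      # ignore pixels with no depth reading
    u, v, z = u[keep], v[keep], z[keep]
    # Pinhole back-projection into camera coordinates.
    pts_cam = np.stack([(u - CX) * z / FX, (v - CY) * z / FY, z], axis=-1)
    # Transform to world coordinates using the current 4x4 camera-to-world pose.
    pts_world = pts_cam @ pose_c2w[:3, :3].T + pose_c2w[:3, 3]
    return {"means": pts_world, "colors": color[v, u]}

# Toy example: half of a 4x4 frame has low silhouette values, so new
# Gaussians are seeded only in that unmapped region.
silhouette = np.array([[0.9, 0.9, 0.1, 0.1]] * 4)
depth = np.full((4, 4), 2.0)
color = np.full((4, 4, 3), 0.5)
pose_c2w = np.eye(4)
new = backproject_new_gaussians(depth, color, find_unmapped_pixels(silhouette), pose_c2w)
print(new["means"].shape)  # (8, 3): one new Gaussian center per unmapped pixel
```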
Nikhil will demonstrate SplaTAM's improvements of up to 2X over existing methods in camera pose estimation, map construction, and novel-view synthesis, paving the way for more immersive, high-fidelity SLAM applications.
Nikhil will present the concepts behind 3D Gaussian Splatting and how it can be used to reconstruct a scene in an online fashion. The webinar will share results from research conducted in collaboration with researchers at CMU and MIT and published at a peer-reviewed conference (CVPR 2024).
For more information, visit https://spla-tam.github.io/
Who Should Attend
- AR and VR software developers and enterprise end users
- Engineers and end-users in image processing, 3D reconstruction, and camera tracking
- Product design and development, civil, and construction engineers
- Robotics engineers
- Anyone interested in autonomous operations