# Snap-it, Tap-it, Splat-it: Tactile-Informed 3D Gaussian Splatting for Reconstructing Challenging Surfaces

Touch and vision go hand in hand, mutually enhancing our ability to understand the world. From a research perspective, the problem of combining touch and vision is underexplored and presents interesting challenges. To this end, we propose Tactile-Informed 3DGS, a novel approach that incorporates touch data (local depth maps) with multi-view vision data to achieve surface reconstruction and novel view synthesis. Our method optimises 3D Gaussian primitives to accurately model the object's geometry at points of contact. By creating a framework that decreases the transmittance at touch locations, we achieve a refined surface reconstruction, ensuring a uniformly smooth depth map. Touch is particularly useful for non-Lambertian objects (e.g. shiny or reflective surfaces), since contemporary methods tend to fail to faithfully reconstruct specular highlights. By combining vision and tactile sensing, we achieve more accurate geometry reconstructions with fewer images than prior methods. We evaluate our approach on objects with glossy and reflective surfaces and demonstrate its effectiveness, offering significant improvements in reconstruction quality.
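The abstract describes supervising the Gaussians with local tactile depth maps and driving transmittance towards zero at contact points. Below is a minimal PyTorch sketch of what such a loss could look like, assuming the rasteriser already provides a rendered depth map and an accumulated transmittance map for the tactile-sensor viewpoint; the function and tensor names (`tactile_loss`, `rendered_depth`, `transmittance`, `touch_depth`, `touch_mask`, `lambda_t`) are illustrative and not the authors' API.

```python
# Hypothetical sketch of tactile depth supervision for 3D Gaussian Splatting.
# Inputs are assumed outputs of a differentiable 3DGS rasteriser rendered from
# the tactile sensor's pose; names and the lambda_t weight are illustrative.
import torch

def tactile_loss(rendered_depth: torch.Tensor,   # (H, W) depth rendered from the sensor pose
                 transmittance: torch.Tensor,    # (H, W) accumulated transmittance along each ray
                 touch_depth: torch.Tensor,      # (H, W) local depth map from the tactile sensor
                 touch_mask: torch.Tensor,       # (H, W) bool, True where contact was made
                 lambda_t: float = 0.1) -> torch.Tensor:
    """L1 depth term at contact pixels plus a penalty that pushes transmittance
    towards zero there, so the Gaussians become opaque at the touched surface."""
    depth_term = (rendered_depth - touch_depth).abs()[touch_mask].mean()
    opacity_term = transmittance[touch_mask].mean()  # low transmittance = solid surface
    return depth_term + lambda_t * opacity_term

if __name__ == "__main__":
    # Synthetic example: a 64x64 render with a square contact patch.
    H, W = 64, 64
    rendered_depth = torch.rand(H, W, requires_grad=True)
    transmittance = torch.rand(H, W, requires_grad=True)
    touch_depth = torch.rand(H, W)
    touch_mask = torch.zeros(H, W, dtype=torch.bool)
    touch_mask[20:40, 20:40] = True
    loss = tactile_loss(rendered_depth, transmittance, touch_depth, touch_mask)
    loss.backward()
    print(float(loss))
```

In practice this term would be added to the usual photometric 3DGS objective, so the touch data only constrains geometry where the sensor actually made contact.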
