Recent advances in neural rendering, particularly 2D Gaussian Splatting (2DGS), have shown promising results for jointly reconstructing fine appearance and geometry by leveraging 2D Gaussian surfels. However, current methods face significant challenges when rendering at arbitrary viewpoints, such as anti-aliasing for down-sampled rendering and texture detail preservation for high-resolution rendering. We propose a novel method that aligns 2D surfels with texture maps and augments them with per-ray depth sorting and Fisher-based pruning for rendering consistency and efficiency. With the correct ordering, per-surfel texture maps significantly improve the ability to capture fine details. In addition, to render high-fidelity details at varying viewpoints, we design a frustum-based sampling method that mitigates aliasing artifacts. Experimental results on standard benchmarks and our custom texture-rich dataset demonstrate that our method surpasses existing techniques, particularly in detail preservation and anti-aliasing.