Recent advances in view synthesis and real-time rendering have achieved photorealistic quality at impressive rendering speeds. While Radiance Field-based methods achieve state-of-the-art quality in challenging scenarios such as in-the-wild captures and large-scale scenes, they often suffer from excessively high compute requirements linked to volumetric rendering. Gaussian Splatting-based methods, on the other hand, rely on rasterization and naturally achieve real-time rendering, but they suffer from brittle optimization heuristics that underperform on more challenging scenes. In this work, we present RadSplat, a lightweight method for robust real-time rendering of complex scenes. Our main contributions are threefold. First, we use radiance fields as a prior and supervision signal for optimizing point-based scene representations, leading to improved quality and more robust optimization. Second, we develop a novel pruning technique that reduces the overall point count while maintaining high quality, yielding smaller, more compact scene representations with faster inference speeds. Finally, we propose a novel test-time filtering approach that further accelerates rendering and allows us to scale to larger, house-sized scenes. We find that our method enables state-of-the-art synthesis of complex captures at 900+ FPS.
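To make the pruning idea concrete, the following is a minimal sketch of contribution-based point pruning, assuming a per-Gaussian importance score such as the maximum blending weight a Gaussian receives across training rays. The `prune_gaussians` helper, the score definition, and the threshold `tau` are illustrative assumptions, not the exact criterion used by RadSplat.

```python
import numpy as np

def prune_gaussians(means, opacities, blend_weights, tau=0.01):
    """Contribution-based pruning sketch (illustrative; not RadSplat's exact criterion).

    means:         (N, 3) Gaussian centers
    opacities:     (N,)   per-Gaussian opacities
    blend_weights: (R, N) alpha-blending weight each Gaussian contributes to R
                   training rays (assumed to be exported by the rasterizer)
    tau:           importance threshold below which a Gaussian is discarded
    """
    # A Gaussian's importance is taken as its maximum contribution to any training ray;
    # points that never contribute more than tau to a rendered pixel are pruned.
    importance = blend_weights.max(axis=0)
    keep = importance > tau
    return means[keep], opacities[keep], keep

# Toy usage: 1,000 random Gaussians observed along 50 training rays.
rng = np.random.default_rng(0)
means = rng.normal(size=(1000, 3))
opacities = rng.uniform(size=1000)
blend_weights = rng.uniform(0.0, 0.05, size=(50, 1000))
pruned_means, pruned_opacities, keep = prune_gaussians(means, opacities, blend_weights)
print(f"kept {keep.sum()} of {len(means)} Gaussians")
```

A threshold-on-importance scheme like this shrinks the representation and speeds up rasterization, since fewer primitives are sorted and blended per frame; the score and threshold would need to be chosen to match the quality targets reported above.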