A Perfectly Accurate, Synthetic dataset for Multi-View Stereopsis (PASMVS) is presented, consisting of 400 scenes and 18,000 model renderings together with ground truth depth maps, camera intrinsic and extrinsic parameters, and binary segmentation masks. Every scene is rendered from 45 camera views arranged in a circular pattern, using Blender's path-tracing rendering engine. Each scene is a unique combination drawn from two camera focal lengths, four 3D models of varying geometric complexity, five high-definition, high dynamic range (HDR) environment textures that replicate photorealistic lighting conditions, and ten materials. The material properties are predominantly specular, with a selection of more diffuse materials included for reference. This mix of highly specular and diffuse materials increases the reconstruction ambiguity and complexity, both for traditional MVS reconstruction algorithms and pipelines and for more recent state-of-the-art architectures based on neural networks. PASMVS adds to the wide spectrum of image datasets available for computer vision research, offering the precise ground truth required for novel research applications.
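As a rough illustration of how the 400 scenes arise from these factors, the sketch below enumerates one possible set of parameter combinations in Python. The model and material names are taken from the sample captions further down; the focal length values, environment texture names and the two remaining materials are placeholders, not the identifiers used in the actual scene folders.

```python
from itertools import product

# Illustrative parameter lists; the identifiers used by the actual dataset may differ.
focal_lengths = [35, 50]                                    # mm (placeholder values)
models = ["bunny", "teapot", "armadillo", "dragon"]         # four 3D models
environments = [f"hdri_{i:02d}" for i in range(1, 6)]       # five HDR environment textures (placeholder names)
materials = ["bricks", "brushedmetal", "concrete", "ceramic", "copper",
             "grungemetal", "piano", "marble", "material_09", "material_10"]  # ten materials

scenes = list(product(models, materials, environments, focal_lengths))
print(len(scenes))        # 4 * 10 * 5 * 2 = 400 scenes
print(len(scenes) * 45)   # 45 views per scene = 18,000 renderings
```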
The original PASMVS journal article can be downloaded directly from the open access Data in Brief journal. The low-resolution (576p) PASMVS dataset can be obtained from the Mendeley data repository. The high-resolution version of the dataset (1536p) can be downloaded from Google Drive; download all 10 part files (total size: 131 GB) and unzip them into a single directory, for example as sketched below.
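A minimal extraction sketch, assuming the ten parts are ordinary ZIP archives downloaded into a single folder; the directory paths and file name pattern below are placeholders:

```python
import zipfile
from pathlib import Path

# Placeholder paths; adjust to wherever the ten part files were downloaded.
download_dir = Path("~/Downloads/PASMVS_parts").expanduser()
target_dir = Path("~/datasets/PASMVS_1536p").expanduser()
target_dir.mkdir(parents=True, exist_ok=True)

# Extract every part archive into the same directory tree.
for part in sorted(download_dir.glob("*.zip")):
    with zipfile.ZipFile(part) as archive:
        archive.extractall(target_dir)
```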
PASMVS samples illustrating the selection of models and the variation in illumination, material properties and camera focal lengths. (a) bunny, bricks, 35 mm, (b) bunny, brushedmetal, 50 mm, (c) teapot, concrete, 35 mm, (d) teapot, ceramic, 50 mm, (e) armadillo, copper, 25 mm, (f) armadillo, grungemetal, 50 mm, (g) dragon, piano, 25 mm, (h) dragon, marble, 50 mm.
A total of 400 scene folders are provided, each containing sub-folders for the rendered image files (45 per scene), the camera properties, the binary segmentation masks and the rendered depth maps (PFM file format; see the BlendedMVS GitHub page, and the reader sketched below). The scenes are split 85%-15% between training and testing, with the corresponding scene folder names stored in text files in the root directory.
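A minimal PFM reader sketch is shown below, following the standard PFM conventions also used by BlendedMVS: a "Pf"/"PF" header, a dimensions line, a scale line whose sign encodes endianness, and raw float32 rows stored bottom-to-top. The example path in the comment is a placeholder, not the dataset's actual folder naming.

```python
import re
import numpy as np

def read_pfm(path):
    """Read a PFM depth map into a NumPy array (rows returned top-to-bottom)."""
    with open(path, "rb") as f:
        header = f.readline().decode("ascii").rstrip()
        if header not in ("PF", "Pf"):
            raise ValueError(f"{path} is not a PFM file")
        channels = 3 if header == "PF" else 1
        match = re.match(r"^(\d+)\s+(\d+)\s*$", f.readline().decode("ascii"))
        width, height = int(match.group(1)), int(match.group(2))
        scale = float(f.readline().decode("ascii").rstrip())
        endian = "<" if scale < 0 else ">"  # negative scale => little-endian data
        data = np.fromfile(f, dtype=endian + "f", count=width * height * channels)
    shape = (height, width, 3) if channels == 3 else (height, width)
    return np.flipud(data.reshape(shape))  # PFM stores rows bottom-to-top

# Example usage (placeholder path):
# depth = read_pfm("scene0001/depth/0000.pfm")
# print(depth.shape, float(depth.min()), float(depth.max()))
```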
The PASMVS repository is maintained by André Broekman in conjunction with the Department of Civil Engineering, University of Pretoria, South Africa. Feel free to get in touch via LinkedIn, ResearchGate or Twitter.