In modelling and rendering CGI environments, material shaders and their associated texture maps have always been the foundation of making a sterile virtual scene appear realistic. Data capture methods go some way towards bringing a facsimile of the real into an entirely virtual space: a mesh generated by Structure from Motion photogrammetry or structured light scanning, for example, will typically carry a diffuse texture map. Building on these principles, Quixel Megascans is a service and suite that gives artists and modellers access to a library of scanned-in material maps with a range of acquired parameters; Quixel even built their own material scanner to generate these maps.
This is big, because it hasn't previously been so accessible to take captured material data and integrate it into a 3D workflow with your everyday BRDF shader in 3DS Max. It means 3D reconstructions of cultural heritage sites, for example, no longer have to be accurate 3D data punctuated with artistic stand-ins for foliage, generic mudbrick, masonry and so on, but can instead be physically based representations of those materials, from colour through to translucency and specular reflection (Quixel list the captured maps as PBR-calibrated Albedo, Gloss, Normal, Translucency, Cavity, Displacement, AO, Bump and Alpha). This is exciting because, alongside the trend away from biased and towards unbiased rendering algorithms and the continual advance of computational power, these richer environments aren't just increasingly 'realistic looking' but actually become more accurate representations of natural and built environments.
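As a rough illustration of how that captured map set plugs into a material workflow (this is a sketch, not Quixel's actual API; the class, slot names and file paths are my own assumptions), you can think of each scanned map as filling a named slot on a material that a renderer's BRDF shader then samples:

```python
from dataclasses import dataclass, field

# The map types Quixel list for a scanned material, lower-cased as slot names.
MEGASCANS_SLOTS = [
    "albedo", "gloss", "normal", "translucency",
    "cavity", "displacement", "ao", "bump", "alpha",
]

@dataclass
class ScannedMaterial:
    """Hypothetical container pairing each captured map with a texture path."""
    name: str
    maps: dict = field(default_factory=dict)

    def assign(self, slot: str, path: str) -> None:
        # Reject anything outside the scanned map set, so a shader
        # consuming this material knows exactly which inputs can exist.
        if slot not in MEGASCANS_SLOTS:
            raise ValueError(f"unknown map slot: {slot}")
        self.maps[slot] = path

    def missing_slots(self) -> list:
        # Slots not yet filled would fall back to shader defaults
        # (e.g. a flat normal, uniform gloss) rather than scanned data.
        return [s for s in MEGASCANS_SLOTS if s not in self.maps]

# Example: a partially assigned heritage material (paths are illustrative).
mudbrick = ScannedMaterial("mudbrick_weathered")
mudbrick.assign("albedo", "mudbrick_albedo.png")
mudbrick.assign("normal", "mudbrick_normal.png")
```

The point of the sketch is the completeness check: where an artist-authored material might only ever define albedo and a hand-painted bump, a scanned material can populate every slot with measured data, and anything left in `missing_slots()` is where the render falls back from measurement to approximation.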