Wednesday 26 February 2020, 1.30PM
Speaker(s): Claudio Guarnera (Computer Science, University of York)
In most 3D applications, the appearance of a homogeneous, opaque surface is represented by the Bidirectional Reflectance Distribution Function (BRDF). Because no general BRDF model can realistically reproduce the full range of existing materials, a great number of BRDF models have been developed for material appearance representation. Digital 3D content creation commonly involves many different commercial and in-house rendering tools, yet little has been done to enable a consistent exchange of material models that preserves appearance across renderers: the appearance of a rendered object depends heavily on the underlying BRDF implementation, which leads to visual deviations even between identically named reflectance models. Consequently, there is no way to guarantee visual consistency between the different applications used in a typical workflow pipeline. This talk will discuss a genetic algorithm-based approach to deriving a perceptually accurate mapping between the parameter spaces of parametric BRDFs, exploiting features to which the human visual system is very sensitive, such as color and gradient differences.
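To illustrate the idea of a genetic algorithm searching one parametric model's parameter space to match another model's appearance, here is a minimal, self-contained sketch. The BRDF models (a Phong-style lobe matched by a Blinn-style half-angle lobe with a free scale) and the plain mean-squared-error fitness are simplifications chosen for illustration; the talk's actual approach uses a perceptual metric based on color and gradient differences, which is not reproduced here.

```python
import math
import random

# Hypothetical 1D "BRDF lobes": stand-ins for two differently
# parameterised reflectance models between which we want a mapping.
def phong_lobe(cos_a, exponent):
    """Specular lobe of a Phong-style model, as a function of cos(angle)."""
    return max(cos_a, 0.0) ** exponent

def blinn_lobe(cos_h, exponent, scale):
    """Blinn-Phong-style lobe (half-angle) with a free intensity scale."""
    return scale * max(cos_h, 0.0) ** exponent

# Angles at which the two lobes are compared. A perceptual metric
# (color and gradient differences on rendered images) would replace
# this plain mean-squared error in the real approach.
ANGLES = [i / 63 * math.pi / 2 for i in range(64)]

def fitness(params, target_exponent=100.0):
    """Lower is better: MSE between reference lobe and candidate lobe."""
    exponent, scale = params
    err = 0.0
    for a in ANGLES:
        ref = phong_lobe(math.cos(a), target_exponent)
        fit = blinn_lobe(math.cos(a / 2), exponent, scale)
        err += (ref - fit) ** 2
    return err / len(ANGLES)

def evolve(pop_size=40, generations=60, seed=0):
    """Simple elitist GA over the (exponent, scale) parameter space."""
    rng = random.Random(seed)
    pop = [(rng.uniform(1, 800), rng.uniform(0.1, 2.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 4]          # selection: keep best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]   # crossover: blend
            if rng.random() < 0.3:                        # mutation
                child[0] *= rng.uniform(0.8, 1.25)
                child[1] *= rng.uniform(0.9, 1.1)
            children.append(tuple(child))
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
print(f"best (exponent, scale): ({best[0]:.1f}, {best[1]:.3f})")
print(f"residual error: {fitness(best):.6f}")
```

The GA should converge toward an exponent roughly four times the Phong exponent, reflecting the well-known approximate relationship between Phong and Blinn-Phong lobes; the point of the sketch is only that selection, crossover, and mutation can drive one model's parameters to visually match another's.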