Alban Gauthier¹, Robin Faury², Jérémy Levallois², Théo Thonat², Jean-Marc Thiery², Tamy Boubekeur²
¹Institut Polytechnique de Paris, ²Adobe Research
ACM Transactions on Graphics (Proc. SIGGRAPH Asia 2022)
We present MIPNet, a novel approach to SVBRDF mipmapping which preserves material appearance under varying view distances and lighting conditions. As in classical mipmapping, our method explicitly encodes the multiscale appearance of materials in an SVBRDF mipmap pyramid. To do so, we use a tensor-based representation for encoding anisotropy that is amenable to gradient-based optimization and compatible with existing real-time rendering engines. Instead of relying on a simple texture patch average for each channel independently, we propose a cascaded architecture of multilayer perceptrons which approximates the material appearance using only the fixed set of material channels. Trained with a rendering loss through a differentiable rendering pipeline, our neural model learns simple mipmapping filters and is able to transfer signal from normal to anisotropic roughness. As a result, we obtain a drop-in replacement for standard material mipmapping, offering a significant improvement in appearance preservation while still boiling down to a single per-pixel mipmap texture fetch. We report extensive experiments on two distinct BRDF models.
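As a toy illustration of why averaging each material channel independently does not preserve appearance, and of the rendering-loss idea described above, consider a hypothetical nonlinear shading function of a single roughness channel (a stand-in for a full SVBRDF rendering, not the paper's actual model):

```python
import numpy as np

def shade(roughness):
    # Hypothetical toy specular response: brighter for low roughness.
    # Any nonlinear shading function exhibits the same effect.
    return 1.0 / (roughness + 0.1)

hi_res = np.array([0.1, 0.9])     # two texels of a high-res roughness map

naive = shade(hi_res.mean())      # render the naively averaged parameter
reference = shade(hi_res).mean()  # average of the full-res renders

print(naive, reference)           # the two disagree (about 1.67 vs 3.0)

# An appearance-preserving downsampled roughness instead satisfies
# shade(r_star) == reference; here it can be inverted in closed form,
# whereas MIPNet obtains it by optimizing a rendering loss.
r_star = 1.0 / reference - 0.1
```

Because shading is nonlinear in the material parameters, the texel whose render matches the averaged renders (r_star) differs from the averaged texel, which is why a learned, rendering-loss-driven filter can outperform per-channel box filtering.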
@article{GFLTTB:2022:MIPNet,
  title     = "MIPNet: Neural Normal-to-Anisotropic-Roughness MIP Mapping",
  author    = "Alban Gauthier and Robin Faury and J{\'e}r{\'e}my Levallois and Th{\'e}o Thonat and Jean-Marc Thiery and Tamy Boubekeur",
  year      = "2022",
  journal   = "ACM Transactions on Graphics (Proc. SIGGRAPH Asia 2022)",
  volume    = "41",
  number    = "6",
  articleno = "246",
  pages     = "1--12",
  doi       = "10.1145/3550454.3555487"
}