Welp, went on a long vacation to California & Nevada with my family, and when I came back a shitload of interesting papers were released at SIGGRAPH 2013, Carmack has left id Software (sort of) to go work at Oculus, Ballmer is going to leave Microsoft (to the relief of .. everyone), NIN is back (YES!) and .. Ben Affleck is the next Batman.
I'm pretty sure I accidentally stepped through a wormhole there.
I can't leave you guys alone for a second, can I?
Bonus points for those who can tell me whose company's logo was based on this mountain:
Anyway, some random thoughts:
This paper had an interesting idea that was new to me .. take pre-defined materials (such as iron, aluminum, dust) and combine them with masks. This works really well with physically based shading, where your materials may very well be measured real-world materials (or at least based on them).
So I couldn't help but wonder .. if materials are built up like this .. why not just store the masks together with a material ID per mask? The masks are much easier to compress (especially if they don't need to be very precise and can be low resolution), and the final textures could be generated at run-time .. maybe even in the pixel shader, if the base materials are not texture based (although quality might suffer too much).
Obviously this will only work if the number of material layers is within reason.
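To make the idea concrete, here's a minimal sketch of that per-texel composition in Python. The material table, parameter set (albedo + roughness) and values are all made up for illustration; a real implementation would do this per-texel on the GPU with the base materials in a texture array or constant buffer.

```python
# Minimal sketch of mask-based material composition (all names and values
# are placeholders). Each texel carries a list of (material ID, mask weight)
# pairs; final surface parameters are a weighted blend of base materials.

# Base material table: (albedo RGB, roughness) -- illustrative values only.
BASE_MATERIALS = {
    "iron":     ((0.56, 0.57, 0.58), 0.30),
    "aluminum": ((0.91, 0.92, 0.92), 0.15),
    "dust":     ((0.45, 0.40, 0.35), 0.90),
}

def blend_materials(layers):
    """layers: list of (material_id, mask_weight); weights should sum to 1."""
    albedo = [0.0, 0.0, 0.0]
    roughness = 0.0
    for mat_id, weight in layers:
        base_albedo, base_rough = BASE_MATERIALS[mat_id]
        for i in range(3):
            albedo[i] += weight * base_albedo[i]
        roughness += weight * base_rough
    return tuple(albedo), roughness

# Example: a texel that is mostly iron with some dust on top.
albedo, rough = blend_materials([("iron", 0.7), ("dust", 0.3)])
```

Note that linearly blending roughness like this is itself an approximation; it's fine for a sketch, but a production version might blend in a different parameterization.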
The base materials could have pre-generated mipmaps, avoiding the need to generate mipmaps for your final texture after combining the masks. Which is nice, since you don't want to do the whole "turn normals into variance into roughness for a lower-res mip" and "fix up your transparency by estimating whether your texture is still roughly as transparent as your higher-res mip" for your mipmap chain at loading time.
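For reference, the "normals into variance into roughness" step is the kind of thing you'd otherwise be doing at load time. A Toksvig-style version is sketched below: average the normals of a mip footprint, and use the length of the (no longer unit) average to shrink the specular gloss for that mip.

```python
import math

def average_normal(normals):
    """Average unit normals over a mip footprint. The result is NOT unit
    length -- its shortened length encodes how much the normals disagree."""
    return [sum(v[i] for v in normals) / len(normals) for i in range(3)]

def toksvig_gloss(avg_normal, gloss):
    """Toksvig-style correction: reduce the specular power (gloss) based on
    the length of the averaged normal. Shorter average => rougher surface."""
    na = math.sqrt(sum(c * c for c in avg_normal))  # |N_avg| in (0, 1]
    return na * gloss / (na + gloss * (1.0 - na))

# Two normals tilted away from each other average to a shorter vector,
# which pushes the effective gloss down for the lower-res mip; identical
# normals leave the gloss untouched.
```

If the base materials ship with this already baked into their mip chains, the composited result inherits it for free.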
So what if you could store the masks in your G-buffer instead of low-quality specular / diffuse / normal / roughness? It would still be hard to add fine details to your materials, though, especially in the normal map. The surface normal would still need to be stored in the G-buffer, so perhaps normals could be stored directly and all other material properties would be handled through masks? Roughness could probably still be stored per material .. Material IDs would still need to be stored as well (one for each mask), so it might end up not being a win storage-space-wise.
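A quick back-of-the-envelope comparison shows why it might not be a win. All the bit counts below are assumptions picked for illustration, not a real G-buffer layout:

```python
# Rough per-pixel G-buffer cost comparison (bit budgets are assumptions).

def bits_full():
    # diffuse RGB (24) + specular RGB (24)
    # + normal (2 x 16-bit octahedral = 32) + roughness (8)
    return 24 + 24 + 32 + 8

def bits_masked(num_layers, mask_bits=8, id_bits=8, normal_bits=32):
    # normal stored directly, everything else via per-layer mask + ID
    return normal_bits + num_layers * (mask_bits + id_bits)

# With 4 layers at 8-bit masks and IDs, the masked layout already costs
# 32 + 4 * 16 = 96 bits vs 88 bits for the full layout, so the break-even
# point sits at roughly 3 layers under these assumptions.
```

So unless the masks and IDs can be stored at well below 8 bits each, the layer count has to stay very low before this saves anything.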
Of course, if all the lighting were rendered in (virtual) texture space, then you wouldn't need a G-buffer at all and you could just combine your masks into a (virtual) texture cache whenever you need it.