The role Materials play in CGI

When content designers work with their favorite model editors or scene editors, in 3D, towards providing either a 3D game or another type of 3D application, they will often not map their 3D models directly to texture images. Instead, they will often connect each model to one Material, and the Material will then base its behavior on zero or more texture images. A friend of mine has asked what this describes.

Effectively, these Materials replace what a programmed shader would do: they define the surface properties of the simulated, 3D model. They tend to have a greater role in CPU rendering / ray tracing than they do with raster-based, DirectX- or OpenGL-based graphics, but high-level editors may also be able to apply Materials to the hardware-rendered graphics, if they can provide some type of predefined shader that implements what the Material is supposed to implement.

A Material will often state such parameters as Gloss, Specular, Metallicity, etc.. When a camera reflection vector is computed, this reflection vector will point in some 3D direction relative to the defined light sources. Hence, a dot-product can be computed between it and the direction of the light source. Gloss represents the power to which this dot-product is raised, resulting in specular highlights that become narrower as Gloss increases. Often, Gloss must also be compensated for the fact that the integral of the dot-product raised to a higher power is smaller than that of a lower power, and that therefore, the average brightness of a glossy surface would otherwise seem to decrease…
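As a concrete illustration, here is a minimal C++ sketch of the specular term just described. The helper types and function names are my own, not taken from any particular engine, and the compensation shown is the well-known (gloss + 2) / 2π normalization for a Phong lobe:

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    float dot(const Vec3 &a, const Vec3 &b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // R: camera reflection vector, L: direction toward the light source,
    // both assumed to be unit-length. 'gloss' is the exponent described above.
    float specular(const Vec3 &R, const Vec3 &L, float gloss) {
        float d = std::max(0.0f, dot(R, L));
        // Raising the dot-product to a higher power narrows the highlight,
        // but also shrinks its integral over the hemisphere, dimming the
        // surface on average. The Phong normalization factor compensates:
        float norm = (gloss + 2.0f) / (2.0f * 3.14159265f);
        return norm * std::pow(d, gloss);
    }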

But, if a content designer enrolls a programmed shader, especially a Fragment Shader, then this shader replaces everything that a Material would otherwise have provided. It is often less practical, though not impossible, to implement a programmed shader in software-rendered contexts, and mainly for this reason, the use of Materials still prevails there.

Also, the notion often occurs to people, however unproven, that Materials will only provide basic shading options, such as ‘DOT3 Bump-Mapping’, so that programmed shaders need to be used if more sophisticated shading options are required, such as Tangent-Mapping. Yet, as I just wrote, every blend mode a Material offers is defined by some sort of predefined shader – i.e., by a pre-programmed algorithm.
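To give an idea of how simple such a ‘basic’ blend mode really is as an algorithm, here is a hedged C++ sketch of the DOT3 operation itself, with invented names: a per-pixel dot-product between a normal fetched from a normal map, whose components are conventionally stored remapped into the range [0, 1], and the light direction:

    #include <algorithm>

    struct Vec3 { float x, y, z; };

    float dot(const Vec3 &a, const Vec3 &b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Unpack a normal-map texel from [0, 1] back into [-1, +1].
    Vec3 unpack(const Vec3 &texel) {
        return { texel.x * 2.0f - 1.0f,
                 texel.y * 2.0f - 1.0f,
                 texel.z * 2.0f - 1.0f };
    }

    // DOT3 bump-mapping: dot the unpacked normal with the light
    // direction, clamped to zero for surfaces facing away.
    float dot3Bump(const Vec3 &normalTexel, const Vec3 &lightDir) {
        return std::max(0.0f, dot(unpack(normalTexel), lightDir));
    }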

OGRE is an open-source rendering engine, which requires that content designers assign Materials to their models, even though hardware rendering is being used, and these Materials then cause shaders to be loaded. Hence, if an OGRE content designer wants to code his own shader, he must first also define his own Material, which will then load his custom shader.
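As a minimal sketch of what this might look like, here is an OGRE material script with hypothetical program and material names; the overall structure, a fragment program declaration referenced from a pass, is how OGRE scripts are organized:

    fragment_program MyCustomFP glsl
    {
        source MyCustom.frag
    }

    material MyCustomMaterial
    {
        technique
        {
            pass
            {
                fragment_program_ref MyCustomFP
                {
                }
            }
        }
    }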

Also, DirectX 7 first introduced the concept of a Fixed-Function Pipeline, in which fragment-level shading could be defined using simplified semantics, which made it simpler than it has become with real, modern, programmed shaders, for content designers to click together a Material of sorts – but a Material which could be executed directly by the GPU. The FFP momentarily gave hardware rendering an edge over software rendering, because FFP fragment stages were already able to perform multiple stages of computing, to arrive at the final screen color of one pixel.
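As a hedged C++ sketch of what ‘clicking together’ such stages amounted to in code, here is how two fixed-function texture stages could be configured under Direct3D 9, the descendant of that DX7-era pipeline; ‘device’ is assumed to be an already-initialized device pointer:

    #include <d3d9.h>

    void configureStages(IDirect3DDevice9 *device) {
        // Stage 0: modulate the interpolated vertex color with the base texture.
        device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_MODULATE);
        device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
        device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);
        // Stage 1: add a second texture (e.g., a light-map) to stage 0's result.
        device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_ADD);
        device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
        device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);
    }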

The Fixed-Function Pipeline is by now considered to be obsolete, and using DirectX 11 or higher, a programmed shader must take its place. Yet, our favorite rendering engine could provide libraries of such shaders.

Also, a GUI approach to designing Materials became popular some time ago, in which the stages are represented as Nodes, and in which each Node can have multiple inputs, which can be connected graphically to the outputs of other Nodes. Some Nodes are defined by texture images.

The high-level model editor needs to compile these graphical representations of Material Nodes into some sort of explicit code instructions, such as GPU shader instructions, in order to implement them. One way to make this practical is for each type of Node offered by a model editor or scene editor to be defined by one specific, predefined subroutine, either destined to be executed on the GPU as part of a final Fragment Shader, or otherwise destined to be executed on the more flexible CPU, as part of a software-rendered scene.
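A minimal C++ sketch of that idea, with invented names that are not taken from any real editor, could look like this: each Node type names one predefined subroutine, and ‘compiling’ the graph just means emitting one call per Node, in dependency order:

    #include <string>
    #include <vector>

    // Hypothetical sketch: names are illustrative, not from any editor.
    struct Node {
        std::string type;          // e.g., "texture_sample", "multiply"
        std::vector<int> inputs;   // indices of upstream Nodes
    };

    // Emit one GLSL-style call per Node, assuming the vector is already
    // in dependency order; Node i's result is stored in variable n<i>.
    std::string compileGraph(const std::vector<Node> &graph) {
        std::string code;
        for (size_t i = 0; i < graph.size(); ++i) {
            code += "vec4 n" + std::to_string(i) + " = " + graph[i].type + "(";
            for (size_t j = 0; j < graph[i].inputs.size(); ++j) {
                if (j > 0) code += ", ";
                code += "n" + std::to_string(graph[i].inputs[j]);
            }
            code += ");\n";        // each type names a predefined subroutine
        }
        return code;
    }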

Dirk
