Widening Our 3D Graphics Capabilities under FOSS

Just so that I can say that my 3D Graphics / Model Editing capabilities are not strictly limited to “Blender”, I have just installed the following Model Editors, which are not available through my package manager, on the Linux computer I name ‘Klystron’:

I felt that it might help others for me to note the URLs above, since correct and useful URLs can be hard to find.

In addition, I installed the following Ray-Tracing Software-Rendering Engines, which do not come with their own Model Editors:

Finally, the following was always available through my package manager:

  • Blender
  • K-3D
  • MeshLab
  • Wings3D

 

  • PovRay

 

In order to get ‘Ayam’ to run properly (i.e., to be able to load its plugins, and therefore to load ‘Aqsis’ shaders), I needed to create a number of symlinks like so:
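Purely as an illustration, with hypothetical paths (the real targets depend on where Ayam and Aqsis actually end up installed on a given system), such links can be created either with ‘ln -s’ or from a short Python script:

    #!/usr/bin/env python3
    # Illustrative only: these paths are hypothetical, not the ones actually used.
    # The idea is simply to point the locations Ayam searches for plugins and
    # shaders at the directories where the Aqsis files really live.
    import os

    links = {
        "/usr/local/ayam/plugins/libaqsis.so": "/usr/local/aqsis/lib/libaqsis.so",
        "/usr/local/ayam/shaders": "/usr/local/aqsis/shaders",
    }

    for link_name, target in links.items():
        if not os.path.lexists(link_name):
            os.symlink(target, link_name)   # creates link_name pointing at target
            print("linked", link_name, "->", target)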

( Last Updated on 10/31/2017, 7h55 )


The role Materials play in CGI

When content-designers work with their favorite model editors or scene editors, in 3D, towards providing either a 3D game or another type of 3D application, they will often not map their 3D models directly to texture images. Instead, they will often connect each model to one Material, and the Material will then base its behavior on zero or more texture images. A friend of mine has asked what this describes.
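In other words, the chain runs: model → Material → (zero or more) textures. A minimal sketch of that relationship, with field names that are merely invented for illustration and not taken from any particular editor, might look like this in Python:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Texture:
        """An image the Material may sample, e.g., a diffuse map or a normal map."""
        filename: str

    @dataclass
    class Material:
        """The surface description a model points to, instead of pointing at images directly."""
        textures: List[Texture] = field(default_factory=list)  # zero or more texture images
        gloss: float = 32.0        # scalar parameters of the kind described below
        specular: float = 1.0
        metallicity: float = 0.0

    @dataclass
    class Model:
        mesh_file: str
        material: Material         # each model (or sub-mesh) connects to one Material

    chrome = Material(gloss=96.0, specular=1.0, metallicity=1.0)
    teapot = Model(mesh_file="teapot.mesh", material=chrome)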

Effectively, these Materials replace what a programmed shader would do, to define the surface properties of the simulated 3D model. They tend to have a greater role in CPU rendering / ray tracing than they do with raster-based, DirectX- or OpenGL-based graphics, but high-level editors may also be able to apply Materials to the hardware-rendered graphics, IF they can provide some type of predefined shader that implements what the Material is supposed to implement.

A Material will often state such parameters as Gloss, Specular, Metallicity, etc. When a camera-reflection-vector is computed, this reflection vector will land in some 3D direction relative to the defined light sources. Hence, a dot-product can be computed between it and the direction of the light source. Gloss represents the power to which this dot-product needs to be raised, so that higher Gloss values result in narrower specular highlights. Often, Gloss must be compensated for the fact that the integral of such a power-function decreases as the power increases, so that the average brightness of a glossy surface would otherwise seem to decrease…
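As a rough sketch of that computation, using the familiar Phong-style formula: the clamped dot-product is raised to the Gloss power, and a compensation factor may then be applied so that a higher Gloss does not also dim the surface. The factor (gloss + 2) / (2·π) used below is one commonly quoted normalization, assumed here for illustration rather than taken from any particular engine:

    import math

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def specular_term(reflection, light_dir, gloss, compensate=True):
        """Phong-style specular term: the camera-reflection-vector dotted with the
        light direction, clamped to zero, then raised to the Gloss power."""
        d = max(0.0, dot(reflection, light_dir))
        term = d ** gloss
        if compensate:
            # Without compensation, a higher Gloss narrows the highlight and also
            # lowers the surface's average brightness; this (assumed, commonly
            # quoted) factor scales the narrower lobe back up.
            term *= (gloss + 2.0) / (2.0 * math.pi)
        return term

    r = (0.0, 0.0, 1.0)            # unit camera-reflection-vector
    l = (0.0, 0.19612, 0.98058)    # unit light direction, a few degrees away from r
    print(specular_term(r, l, 8.0), specular_term(r, l, 64.0))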

But, if a content-designer enrolls a programmed shader, especially a Fragment Shader, then this shader replaces everything that a Material would otherwise have provided. It is often less practical, though not impossible, to implement a programmed shader in software-rendered contexts, which is mainly why the use of Materials still prevails there.

Also, the notion often occurs to people, however unproven, that Materials will only provide basic shading options, such as ‘DOT3 Bump-Mapping’, so that programmed shaders need to be used if more sophisticated shading options are required, such as Tangent-Mapping. Yet, as I just wrote, every blend-mode a Material offers is defined by some sort of predefined shader, i.e., by a pre-programmed algorithm.
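To make the ‘basic’ option concrete: DOT3 Bump-Mapping amounts to expanding a normal-map texel from the [0, 255] range back into a [-1, +1] vector and taking its clamped dot-product with the light direction (expressed in the same space as the stored normals). That is exactly the sort of small, fixed recipe a predefined shader can supply; the following sketch assumes the usual RGB encoding of normals:

    def dot3_bump(texel_rgb, light_dir):
        """DOT3 bump-mapping: decode a per-pixel normal from an RGB texel, then
        return its clamped dot-product with the light direction."""
        n = tuple(c / 127.5 - 1.0 for c in texel_rgb)
        return max(0.0, sum(a * b for a, b in zip(n, light_dir)))

    # A texel of (128, 128, 255) decodes to roughly (0, 0, 1), an unperturbed normal:
    print(dot3_bump((128, 128, 255), (0.0, 0.0, 1.0)))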

OGRE is an open-source rendering system which requires that content-designers assign Materials to their models even though hardware rendering is being used; these Materials then cause shaders to be loaded. Hence, if an OGRE content-designer wants to code his own shader, he must first also define his own Material, which will then load his custom shader.
