How To Install Yafaray Under Linux

One of the computing subtopics I dabble in is the acquisition of 3D-graphics software. Accordingly, I already have “Blender 2.78a”, which has its own built-in software-rendering engine, and I have several other rendering engines installed on my Linux-based computers.

Further, the rendering engines by themselves can be useless unless they integrate well with a GUI (such as Blender). And so one undertaking which I’ll typically reach with a given computer is to install “Yafaray”, which used to be ‘Yafray’, which stood for ‘Yet Another Free Ray-Tracer’. If it’s installed properly, Blender can render its scenes using Yafaray, from within Blender.

Yafray used to be a much simpler piece of software to install than it has become. But I’m sure the effort I put into it this evening will be well-worth it eventually. What I’m used to doing is to download a source-tree and, if it’s CMake-based, to run ‘cmake-gui‘ on it, to custom-pick my build options, and to go. But with Yafaray, this approach led to near chaos: it compiled all the source-code properly into libraries, but then installed those libraries to nonsensical locations within my system folders. One reason was that a part of the project was to create Python 3 bindings; another was the need for the Blender-integration, where modern Blender versions are based on Python 3. In any case, I was sure to install all the build dependencies via my package-manager, but doing so was not enough to obtain working outcomes.
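For readers unfamiliar with that workflow, the generic out-of-source CMake routine I describe looks roughly like the following. This is a hedged sketch only: the repository URL, the options shown, and the install prefix are assumptions for illustration, not the exact configuration that finally worked for me.

```shell
# Illustrative only -- repository URL and options are assumptions.
git clone https://github.com/YafaRay/Core.git yafaray-core
mkdir -p yafaray-core/build
cd yafaray-core/build

# Either pick build options interactively...
cmake-gui ..

# ...or pin the install location explicitly, since leaving it at its
# default is exactly the sort of thing that scattered libraries into
# nonsensical locations for me:
cmake -DCMAKE_BUILD_TYPE=Release \
      -DCMAKE_INSTALL_PREFIX=/usr/local \
      ..
make -j"$(nproc)"
sudo make install
```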


Continue reading How To Install Yafaray Under Linux

Widening Our 3D Graphics Capabilities under FOSS

Just so that I can say that my 3D Graphics / Model Editing capabilities are not strictly limited to “Blender”, I have just installed the following Model Editors – which are not available through my package-manager – on the Linux computer I name ‘Klystron’:

I felt that it might help others for me to note the URLs above, since correct and useful URLs can be hard to find.

In addition, I installed the following Ray-Tracing Software-Rendering Engines, which do not come with their own Model Editors:

Finally, the following were always available through my package manager:

  • Blender
  • K-3D
  • MeshLab
  • Wings3D
  • PovRay
In order to get ‘Ayam’ to run properly – i.e., to be able to load its plugins and therefore load ‘Aqsis’ shaders – I needed to create a number of symlinks like so:
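The general pattern was to symlink the files from where Aqsis actually installed them, into the directory where Ayam’s plugin loader expected them. The commands below are a hypothetical illustration of that pattern only – the paths and the file name are invented placeholders, not the actual symlinks from my machine.

```shell
# Hypothetical illustration: these /tmp paths and this file name are
# placeholders, not Ayam's or Aqsis' real install locations.
mkdir -p /tmp/ayam-demo/aqsis-install /tmp/ayam-demo/ayam-plugins
touch /tmp/ayam-demo/aqsis-install/libslxargs.so

# Link the library into the directory the plugin loader searches:
ln -sf /tmp/ayam-demo/aqsis-install/libslxargs.so \
       /tmp/ayam-demo/ayam-plugins/libslxargs.so

ls -l /tmp/ayam-demo/ayam-plugins
```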

( Last Updated on 10/31/2017, 7h55 )

Continue reading Widening Our 3D Graphics Capabilities under FOSS

The role Materials play in CGI

When content-designers work with their favorite model editors or scene editors in 3D, towards providing either a 3D game or another type of 3D application, they will often not map their 3D models directly to texture images. Instead, they will often connect each model to one Material, and the Material will then base its behavior on zero or more texture images. And a friend of mine has asked what this describes.

Effectively, these Materials replace what a programmed shader would do, to define the surface properties of the simulated 3D model. They tend to play a greater role in CPU rendering / ray tracing than they do in raster-based, DirectX- or OpenGL-based graphics, but high-level editors may also be able to apply Materials to hardware-rendered graphics, if they can provide some type of predefined shader that implements what the Material is supposed to implement.

A Material will often state such parameters as Gloss, Specular, Metallicity, etc. When a camera-reflection-vector is computed, this reflection vector will land in some 3D direction relative to the defined light sources. Hence, a dot-product can be computed between it and the direction of a light source. Gloss represents the power to which this dot-product is raised, so that higher values result in narrower specular highlights. Often, Gloss must be compensated for the fact that the integral of a power-function decreases as the power increases, and that therefore, the average brightness of a glossy surface would otherwise seem to decrease as Gloss rises…
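This narrowing-and-dimming effect can be sketched numerically. The following is my own minimal illustration, not code from any particular renderer; it assumes a simple Phong-style power term, and the (gloss + 2) / 2 factor is just one common choice of energy-compensation factor.

```python
import math

def specular(cos_angle, gloss, normalize=False):
    """Phong-style specular term: (R . L) raised to the Gloss power.

    With normalize=True, the result is multiplied by (gloss + 2) / 2,
    one common compensation factor, so that glossier surfaces keep
    roughly the same average brightness while the highlight narrows.
    """
    term = max(0.0, cos_angle) ** gloss
    if normalize:
        term *= (gloss + 2.0) / 2.0
    return term

def mean_brightness(gloss, steps=1000, normalize=False):
    """Numerically integrate the specular term over the hemisphere.

    Integrates cos(theta)^gloss * sin(theta) for theta in [0, pi/2],
    i.e. with solid-angle weighting, by the midpoint rule.
    """
    total = 0.0
    dtheta = (math.pi / 2) / steps
    for i in range(steps):
        theta = (i + 0.5) * dtheta
        total += specular(math.cos(theta), gloss, normalize) * math.sin(theta)
    return total * dtheta
```

Without compensation, `mean_brightness(10)` comes out well below `mean_brightness(2)` (analytically, the integral is 1 / (gloss + 1)), while with `normalize=True` the averages for different Gloss values stay close to one another.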

But, if a content-designer enrolls a programmed shader, especially a Fragment Shader, then this shader replaces everything that a Material would otherwise have provided. It is often less-practical, though not impossible, to implement a programmed shader in software-rendered contexts, and mainly for this reason, the use of Materials still prevails there.

Also, the notion often occurs to people, however unproven, that Materials will only provide basic shading options, such as ‘DOT3 Bump-Mapping’, so that programmed shaders need to be used if more-sophisticated shading options are required, such as Tangent-Mapping. Yet, as I just wrote, every blend-mode a Material offers is defined by some sort of predefined shader – i.e., by a pre-programmed algorithm.

OGRE is an open-source rendering system which requires that content-designers assign Materials to their models, even though hardware-rendering is being used; these Materials then cause shaders to be loaded. Hence, if an OGRE content-designer wants to code his own shader, he must first also define his own Material, which will then load his custom shader.
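As a sketch of what that looks like in OGRE’s material-script format: the material references the custom programs by name, and referencing them is what causes OGRE to load them. All the names below are hypothetical placeholders; the vertex and fragment programs themselves would be declared separately, in a .program file.

```
// Hypothetical OGRE material script -- all names are illustrative.
material My/CustomMaterial
{
    technique
    {
        pass
        {
            // Referencing the programs causes OGRE to load them:
            vertex_program_ref myCustomShader_VS
            {
            }
            fragment_program_ref myCustomShader_FS
            {
            }
            texture_unit
            {
                texture my_diffuse.png
            }
        }
    }
}
```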

Continue reading The role Materials play in CGI