Alpha-Blending

The concept that a single object or entity can be translucent seems rather intuitive. A less intuitive concept is that the degree to which it is so can be stated once per pixel, through an alpha-channel.

Just as every pixel can possess one channel for each of the three additive primary colors (Red, Green and Blue), it can possess a fourth channel, named Alpha, which states on a scale from 0.0 to 1.0 how opaque the pixel is.
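To make this concrete – and this is only a sketch of mine, not a structure from any particular API – a 4-channel pixel could be represented as follows:

```cpp
// Illustrative sketch of a 4-channel pixel; the struct and field names
// are my own, not taken from any specific graphics API.
struct RGBA {
    float r;  // Red
    float g;  // Green
    float b;  // Blue
    float a;  // Alpha: 0.0 = fully transparent, 1.0 = fully opaque
};
```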

This does not just apply to texture images, whose pixels are named texels, but also to Fragment Shader output, as well as to the pixels actually associated with the drawing surface. The latter provide what is known as destination alpha, since the drawing surface is also the destination, or target, of the rendering.

Hence, there exist images whose pixels have a 4-channel format, as opposed to others with a mere 3-channel format.

Now, there is no clear way for a display to display alpha. In certain cases, software hints at the alpha in an image being viewed, as a checkerboard pattern; but what we see is nevertheless color information, not transparency. So a logical question is what the function of this alpha-channel, which is being rendered to, actually is.

There are many ways in which content from numerous sources can be blended, but most of the high-quality ones require that much communication take place between rendering stages. What is desired is a strategy in which the output from rendering passes is combined without requiring much communication between the passes, and alpha-blending is the de facto strategy for that.
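Per pixel, what such blending amounts to is the conventional "source over" equation. The following helper, which reuses the RGBA struct sketched above, is my own illustration, not code from any specific engine:

```cpp
// Sketch: blend a translucent source pixel over whatever is already in the
// destination, using the conventional "source over" alpha equation:
//   out = src * srcAlpha + dst * (1 - srcAlpha)
RGBA blendOver(const RGBA& src, const RGBA& dst) {
    RGBA out;
    out.r = src.r * src.a + dst.r * (1.0f - src.a);
    out.g = src.g * src.a + dst.g * (1.0f - src.a);
    out.b = src.b * src.a + dst.b * (1.0f - src.a);
    // Destination alpha also accumulates, recording total coverage so far:
    out.a = src.a + dst.a * (1.0f - src.a);
    return out;
}
```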

By default, closer entities, according to the position of their origins in view space, are rendered first. Doing so puts closer depth values into the Z-buffer as soon as possible, so that the Z-buffer can prevent the rendering of more-distant, occluded fragments as efficiently as possible. 3D rendering starts when the CPU gives the command to ‘draw’ one entity, which has an arbitrary position in 3D. This may be contrary to what 2D graphics might teach us to predict.
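As a minimal sketch of that default ordering – the Entity type and its fields are hypothetical, standing in for whatever a given engine actually uses:

```cpp
#include <vector>
#include <algorithm>

// Hypothetical scene entity; 'viewZ' is the distance of its origin
// from the virtual camera, in view space.
struct Entity {
    float viewZ;
    bool  opaque;  // false for alpha-entities
};

// Sort entities front-to-back, so that near depth values enter the
// Z-buffer early and occluded fragments can be rejected cheaply.
void sortFrontToBack(std::vector<Entity>& entities) {
    std::sort(entities.begin(), entities.end(),
              [](const Entity& a, const Entity& b) {
                  return a.viewZ < b.viewZ;  // nearest first
              });
}
```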

Alas, alpha-entities – i.e., entities that possess alpha textures – do not write to the Z-buffer, because if they did, they would prevent more-distant entities from being rendered, and then there would be no point in the closer ones being translucent.
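In OpenGL terms, for example, this means the translucent pass typically keeps the depth test enabled but disables depth writes, while enabling blending. The following is a sketch assuming a current OpenGL context; OGRE and DirectX expose equivalent render states:

```cpp
#include <GL/gl.h>

// Sketch, assuming a current OpenGL context: typical render state for
// drawing alpha-entities after the opaque geometry has been rendered.
void beginTranslucentPass() {
    glEnable(GL_DEPTH_TEST);   // still test against opaque geometry ...
    glDepthMask(GL_FALSE);     // ... but do not write to the Z-buffer
    glEnable(GL_BLEND);
    // Conventional "source over" blending, matching the equation above:
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
}
```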

The default way in which alpha-blending works is that the alpha-channel of the drawing surface records the extent to which entities have been left visible by previous entities, which have been rendered closer to the virtual camera.
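The following hypothetical helper of mine sketches that bookkeeping, for the case where entities are composited front-to-back and destination alpha records the coverage claimed so far:

```cpp
// Sketch of front-to-back compositing: destination alpha records how much
// coverage the closer entities have already claimed, so a new, more-distant
// source only contributes to the visibility that has been left over.
RGBA blendUnder(const RGBA& dst, const RGBA& src) {
    float remaining = 1.0f - dst.a;   // visibility left by closer entities
    RGBA out;
    out.r = dst.r + src.r * src.a * remaining;
    out.g = dst.g + src.g * src.a * remaining;
    out.b = dst.b + src.b * src.a * remaining;
    out.a = dst.a + src.a * remaining;
    return out;
}
```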


Why R2VB Should Not Simply be Deprecated

The designers of certain graphics cards / GPUs have decided that Render-To-Vertex-Buffer (R2VB) is deprecated. In order to appreciate why I believe this to be a mistake, the reader first needs to know what R2VB is – or was.

The rendering pipelines of DirectX 9 and DirectX 11 are somewhat different, yet also very similar. DirectX 9 was extremely versatile, with a wide range of applications written that use it, while the fancier Dx 11 pipeline is more powerful, but has less of an established base of algorithms.

Dx 9 is approximated in OpenGL 2, while Dx 10 and Dx 11 are approximated in OpenGL 3(+).


I have succeeded at enabling DirectX 11 support for OGRE 1.10.

As documented in This Posting and This Posting, it had proven trying for me to compile OGRE 1.10 with full support for DirectX 11.0 on a Windows 7 machine, even though I have Visual Studio Express 2015 installed.

The trick that finally worked for me was to install the Legacy DirectX SDK, even though what it has to offer has supposedly already been merged into the Windows SDK, which I already had installed.

This installer, dated June of 2010, is potentially dangerous to run, even though it is offered by Microsoft, because its default setting is to reinstall the DirectX Runtime as well. If it were allowed to do so, it would revert some files that are already up-to-date as of 2016, back to a state which was considered correct in 2010. This could damage the O/S or even render some software inoperable, a realization which had made me afraid, for years, of installing the DirectX SDK.

Yet, with the version of the installer I obtained, there is a configuration dialog, from which we can deselect ‘to install the Runtime’. Doing so also deselects ‘to install the DirectX Utilities’. Running the installer then produces a still-conspicuous delay, during which its progress bar shows “Installing Runtime”; possibly, it just makes sure that the Runtime is installed. Then, it threatens ‘Reinstalling the Visual C++ Runtime’ in a similar fashion, and finally exits with an error code – but apparently, with all the previously-installed DirectX-based software unaffected.

This paved the way for me to be able to compile DirectX 11 support for OGRE 1.10. I did this just to prove to myself that I could, and do not plan to make this a part of my local SDK, because the way OGRE 1.10 implements it, Dx11 offers no advantages over using OpenGL3+.

Dirk

(Edit 09/18/2016: ) If the reader would like to duplicate this success, then he may also want to read the Topic which I created on the OGRE Forum.

It is a pity that the OGRE developers seem to take so little time with my petty – apparently naive – questions there.


OGRE 1.10 and Geometry Shader Support

On my Debian / Jessie laptop ‘Klystron’, I gave up some time ago on compiling OGRE 1.10, as that just resulted in a mess. Instead, on that platform I was only able to build OGRE 1.9 from source code, which is fully stable as long as we stick to the OpenGL (2) Render System.

On the Windows 7 desktop ‘Mithral’, my first attempt to build OGRE 1.10 also failed: the compile succeeded, but the Sample Browser then crashed silently, indicating that something was not built right.

If I have OGRE installed on several machines, it makes the most sense for them to run the same OGRE version, so next I compiled OGRE 1.9 on ‘Mithral’ successfully, using Visual Studio 2015.

Having observed that I had not set up the dependencies correctly in the ‘Mithral’ build attempt of OGRE 1.10, I re-attempted it, paying closer attention to all the details specified within CMake 3.6.2.

I found that I could not only build OGRE 1.10 on this machine, but also run it. And I found that, contrary to how it was with OGRE 1.9, the OpenGL3+ Render System of OGRE 1.10 is capable of producing working Geometry Shaders. At least this gives me some access to Geometry Shaders, and casts doubt back onto the MESA Drivers of ‘Klystron’, which would cause the X-Server to crash if I ran the Geometry Shader example of OGRE 1.10.
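Independently of OGRE, one way to probe the driver itself – a sketch of mine, assuming a current OpenGL 3.2+ context and a loader such as GLEW, with a trivial pass-through shader of my own – is to check whether a minimal geometry shader compiles at all:

```cpp
#include <GL/glew.h>

// Sketch, assuming a current OpenGL 3.2+ context with GLEW initialized:
// a minimal smoke-test of whether the driver accepts a geometry shader.
bool geometryShadersWork() {
    const char* src =
        "#version 150\n"
        "layout(triangles) in;\n"
        "layout(triangle_strip, max_vertices = 3) out;\n"
        "void main() {\n"
        "    for (int i = 0; i < 3; ++i) {\n"
        "        gl_Position = gl_in[i].gl_Position;\n"
        "        EmitVertex();\n"
        "    }\n"
        "    EndPrimitive();\n"
        "}\n";
    GLuint gs = glCreateShader(GL_GEOMETRY_SHADER);
    glShaderSource(gs, 1, &src, nullptr);
    glCompileShader(gs);
    GLint ok = GL_FALSE;
    glGetShaderiv(gs, GL_COMPILE_STATUS, &ok);
    glDeleteShader(gs);
    return ok == GL_TRUE;
}
```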

What I still cannot do is build the DirectX 11 Render System that is supposed to work under OGRE 1.10, due to errors I have not yet pinned down. But what that was supposed to bring me is now being provided successfully by the OpenGL3+ Render System. Both of these are stated by the OGRE Team to still be experimental render systems for version 1.10.

I suspect that OGRE 1.10 is being neglected by their team, in favor of OGRE 2.1, which is their effort to port OGRE to Android.

Dirk