I have succeeded at enabling DirectX 11 support for OGRE 1.10.

As I described in This Posting and This Posting, it had proven trying for me to compile OGRE 1.10 with full support for DirectX 11.0 on a Windows 7 machine, even though I have Visual Studio Express 2015 installed.

The trick that finally worked for me was to install the Legacy DirectX SDK, even though what it has to offer has supposedly already been merged into the Windows SDK, which I already had installed.

This installer, dated June of 2010, is potentially dangerous to run, even though it is being offered by Microsoft, because its default setting is to reinstall the DirectX Runtime as well. If it were allowed to do so, it would revert some files that are up-to-date as of 2016, back to a state which was considered correct in 2010. This could damage the O/S, or even render some software inoperable. This realization had put fear into me for years, about installing the DirectX SDK.

Yet, with the version of the installer I obtained, there is a configuration dialog, from which we can deselect 'Install the Runtime'. Doing so also deselects 'Install the DirectX Utilities'. Running the installer then still produces a conspicuous delay, during which its progress bar shows "Installing Runtime"; possibly, it just makes sure that the Runtime is already installed. Then, it threatens to 'Reinstall the Visual C++ Runtime' in a similar fashion, and finally exits with an error code, but apparently with all the previously-installed DirectX-based software unaffected.

This paved the way for me to be able to compile DirectX 11 support for OGRE 1.10. I did this just to prove to myself that I could, and I do not plan to make it a part of my local SDK, because the way OGRE 1.10 implements it, Dx11 offers no advantages over using OpenGL3+.

Dirk

(Edit 09/18/2016 : ) If the reader would like to duplicate this success, then he may also want to read the Topic which I created on the OGRE Forum.

It is a pity that the OGRE developers seem to take so little time with my petty, apparently naive, questions there.

 

OGRE 1.10 and Geometry Shader Support

On my Debian / Jessie laptop 'Klystron', I gave up some time ago on compiling OGRE 1.10, since my attempts just resulted in a mess. Instead, on that platform, I was only able to build OGRE 1.9 from source code, which is fully stable as long as we stick to the OpenGL (2) Render System.

On the Windows 7 desktop 'Mithral', my first attempt to build OGRE 1.10 also failed: the compile itself succeeded, but the Sample Browser then crashed silently, indicating that something was not built right.

If I have OGRE installed on several machines, it makes the most sense for them all to run the same OGRE version, so next I compiled OGRE 1.9 on 'Mithral', successfully, using Visual Studio 2015.

Having observed that I had not set up the dependencies correctly in the failed 'Mithral' build attempt of OGRE 1.10, I now re-attempted it, paying closer attention to all the details specified within CMake 3.6.2.

I found that I could not only build OGRE 1.10 on this machine, but also run it. And I found that, contrary to how it was with OGRE 1.9, the OpenGL3+ Render System of OGRE 1.10 is capable of producing working Geometry Shaders. At least this gives me some access to Geometry Shaders, and casts doubt back onto the MESA drivers of 'Klystron', which would cause the X-Server to crash if I ran the Geometry Shader example of OGRE 1.10.
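
For what it is worth, whether a Render System claims this feature can be queried at run time, through OGRE's capability flags. The following is a minimal sketch, assuming an already-initialized Ogre::Root, and not taken from my actual test code:

    #include <OgreRoot.h>
    #include <OgreRenderSystem.h>
    #include <OgreRenderSystemCapabilities.h>

    // Ask the active Render System whether it reports Geometry Shader
    // support, via the RSC_GEOMETRY_PROGRAM capability flag.
    bool hasGeometryShaders()
    {
        const Ogre::RenderSystemCapabilities* caps =
            Ogre::Root::getSingleton().getRenderSystem()->getCapabilities();
        return caps->hasCapability(Ogre::RSC_GEOMETRY_PROGRAM);
    }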

What I still cannot do is build the DirectX 11 Render System that is supposed to work under OGRE 1.10, due to errors I have not yet pinned down. But what that was supposed to bring me is now being provided successfully by the OpenGL3+ Render System. Both of these are still stated by the OGRE Team to be experimental render systems, for version 1.10.
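
Since both are loadable plugins, switching between them is just a matter of which Render System gets activated before initialization. Below is a minimal sketch; the name strings are assumptions which can differ between OGRE versions, and should be checked against what getAvailableRenderers() actually reports:

    #include <OgreRoot.h>
    #include <OgreRenderSystem.h>

    // Select a Render System by name, before initializing OGRE.
    // Swap in "Direct3D11 Rendering Subsystem" to try the Dx11 path.
    void chooseRenderSystem(Ogre::Root* root)
    {
        Ogre::RenderSystem* rs =
            root->getRenderSystemByName("OpenGL 3+ Rendering Subsystem");
        if (rs)
        {
            root->setRenderSystem(rs);
            root->initialise(false);  // no auto-created window
        }
    }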

I suspect that OGRE 1.10 is being neglected by the OGRE team, in favor of OGRE 2.1, which is their effort to port OGRE to Android.

Dirk

 

I have been test-driving Visual Studio Express 2015.

One of the projects which I attempted today, on the Windows 7 desktop computer I name 'Mithral', was to compile OGRE 1.10. This is an unstable build of OGRE, and it would be helpful for me to know whether this instability comes more from the software, or from the weaker graphics card in my Linux laptop 'Klystron', which I have already had to switch to OGRE 1.9.

My initial attempt to compile OGRE 1.10 failed in a foreboding way: The rendering window would open, stay black for a second, and then give way to the nondescript Windows error box, telling me that the program had crashed. There were no traces of error messages in the log to explain why. This is called 'a silent crash'. Hypothetically, it could point to a borked compiler setup.

So what I did next was to download an OGRE 1.9 SDK, which had been entirely pre-compiled by the OGRE devs. But I soon realized this had been a waste of time, because it in no way proves that my compiler can actually compile. And yes, that SDK was unstable on my stronger graphics card.

I have come to learn something: Even having a Microsoft compiler does not guarantee that I will be able to compile a DirectX rendering engine. The main reason for this is the fact that DirectX 9 is almost deprecated. The up-to-date Microsoft SDK no longer includes the libraries and header files which legacy DirectX applications linked against, including OGRE. This means that the OGRE SDK can offer DirectX 11 support, but not Dx 9, and its DirectX 11 support is unstable out-of-the-box. This is ultimately a fault of the software.
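
To illustrate the kind of dependency at issue, here is a hypothetical fragment of legacy-style Direct3D 9 code, not taken from OGRE's sources, which builds against the old standalone DirectX SDK, but not against the modern Windows SDK alone:

    // Hypothetical legacy-style code, for illustration only.
    // Both this header and this import library shipped with the old,
    // standalone DirectX SDK, and are absent from the modern Windows SDK:
    #include <d3dx9.h>
    #pragma comment(lib, "d3dx9.lib")

    // Build a perspective projection matrix using a D3DX utility call;
    // modern code would use DirectXMath (XMMatrixPerspectiveFovLH) instead.
    void buildProjection(D3DXMATRIX& proj, float aspect)
    {
        D3DXMatrixPerspectiveFovLH(&proj, D3DX_PI / 4.0f, aspect,
                                   0.1f, 1000.0f);
    }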

What I did next was to compile OGRE 1.9. While setting up the CMake parameters to do so, I realized that, when I had been setting up CMake for 1.10, I had made mistakes that could have led to severe code-linking issues. Specifically, under Windows, it is tedious how we need to link to each core dependency one-by-one. Under Linux or MinGW, these can all get picked up in an automated batch. But with MSVC, it is not so easy.

Compiling OGRE 1.9 with the OpenGL2 and the OpenGL3+ rendering engines was a success, and so finally proved that my new compiler can produce moving, 3D images. Unfortunately though, 1.9 was code that still used the deprecated way of linking to the Windows SDK for Dx 9 and 11, so that I could not build the DirectX 11 engine after all.

I found that, just with OGRE 1.9.0 and OpenGL2, I was able to get a larger set of animations to run from the Sample Browser than I was on my laptop. This proves that much of the trouble I was having with 'Klystron', or before that with 'Maverick', was in fact due to hardware issues.

The Iso-Surface Demo works along a different principle than I had anticipated. It is one of those applications which use a Fragment Shader that renders to a Vertex Buffer, which has been set up to look like a Pixel Buffer. The pixel format of the output has been cleverly engineered to also correspond to a vertex attribute structure, thus achieving what was once known as 'a poor man's Geometry Shader'.

The Iso-Surface Demo is supposed to work even with the OpenGL2 rendering engine. Only, on my laptop, there is no support for Render To Vertex Buffer, aka 'R2VB'.
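
As a point of reference, OGRE exposes R2VB through a capability flag and a factory method. The following is a minimal sketch, assuming an initialized Ogre::Root, and not the demo's actual code, of how one can test for hardware R2VB support and create such a buffer:

    #include <OgreRoot.h>
    #include <OgreRenderSystem.h>
    #include <OgreRenderSystemCapabilities.h>
    #include <OgreHardwareBufferManager.h>
    #include <OgreRenderToVertexBuffer.h>

    // Return an R2VB object if the active Render System supports it,
    // or a null pointer on hardware like 'Klystron', which does not.
    Ogre::RenderToVertexBufferSharedPtr tryCreateR2VB()
    {
        const Ogre::RenderSystemCapabilities* caps =
            Ogre::Root::getSingleton().getRenderSystem()->getCapabilities();
        if (!caps->hasCapability(Ogre::RSC_HWRENDER_TO_VERTEX_BUFFER))
            return Ogre::RenderToVertexBufferSharedPtr();
        return Ogre::HardwareBufferManager::getSingleton()
                   .createRenderToVertexBuffer();
    }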

With OGRE 1.9, the OpenGL3+ rendering engine remains unstable as heck, and effectively unusable.

There is an issue with how VS 2015 ultimately works. Since 'Mithral' possesses 8 logical cores (4 physical cores, hyper-threaded), VS will ultimately try to build up to 8 targets at once. This pushes performance to the max, but at the expense of stability. Today I was pushing this compiler for hours and hours, and I later learned that it truly maxes out all 8 cores.

I found a setting to correct this for the future. Given 8 cores, I would like a maximum of 6 compile targets to be worked on concurrently. This is just so that the system will have spare CPU capacity left to work on other tasks, should things go wonky. Because by the end of the day, things did go wonky, for whatever reason.

Dirk

(Edit 09/16/2016 : ) Another disadvantage, if we have an 8-core CPU, and if our compiler therefore wants to build 8 targets concurrently, is the fact that each source file being compiled, for each target, can consume an unpredictable amount of RAM. If the amount of RAM on the system is not taken into consideration, an 'OOM' ('Out Of Memory') condition can arise, because of the arbitrary 8 jobs running at once.
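
As a rough, hypothetical illustration: if each of 8 concurrent compile jobs were to peak at 1.5GB of working memory, which is plausible for heavily-templated C++, the total would be 8 × 1.5GB = 12GB, already exceeding the 8GB installed in 'Mithral', before counting the O/S and any other running software.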

And I think that last night, such an OOM condition did arise, because I was installing tons of software… I have 8GB of RAM on 'Mithral'. I performed numerous defragmentations as well, and, because many programs do have memory leaks, everything last night may have added up to an OOM condition by the end of the day.

I also installed Boost 1.61 and Boost 1.59, where before I only had Boost 1.47. And Boost 1.61 may in fact be necessary for OGRE 1.10 to compile; that would be another reason why my first attempt to compile it had failed.

 

I have now chosen against keeping the desktop cube animations.

In This Posting, I wrote that I had enabled a 'desktop cube', a compositing effect, on the laptop I name 'Klystron'. In fact, this KDE effect consists of two separate effects, the 'Desktop Cube' and the 'Cube Animation', which look very similar to each other.

Since then, I have discovered that enabling this much compositing (1) prevents me from turning compositing off quickly with <Alt>+<Shift>+F12, and (2) prevents the screensavers from activating, reducing my screen-locking capability to a simple screen locker.

It did not result in any crashes or errors; but presumably, to prevent such errors, KDE just did not launch the screensaver.

So even though I felt that it was fun for a while to have these effects enabled, they added rather little to the functionality. I am in favor of using compositing, to whatever extent doing so causes the GPU to take work off the hands of the CPU. But beyond that point, once increased use of these effects interferes in any way with the reliability of the S/W, I will choose against further increases.

And so now I have reversed these desktop effect settings, and gotten my screensaver back.

However, I also double-checked what I had run into in This Posting, to see whether the GL 3+ Render System belonging to OGRE 1.10 now works. And alas, the behavior of the OGRE GL 3+ Render System is unchanged. I did not push it to a full crash, but read the debug output again, before it came to that.

Dirk

(Edit 4/24/2016 : ) This behavior, of not having a screensaver, was logical. With the Desktop Cube effect, the usual KDE output was being rendered to 4 of the faces of a cube, which were defined as target texture images in OpenGL (Render To Texture, 'RTT'). This cube was then rendered, as a 3D entity, to the actual screen being displayed.
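
KWin does this in its own OpenGL code, but purely to illustrate the RTT idea, here is a minimal sketch using OGRE's C++ API instead, with the texture name and the supplied camera being hypothetical:

    #include <OgreTextureManager.h>
    #include <OgreHardwarePixelBuffer.h>
    #include <OgreRenderTexture.h>
    #include <OgreResourceGroupManager.h>
    #include <OgreViewport.h>
    #include <OgreCamera.h>

    // Create a texture that can act as a render target, and attach a
    // viewport for 'camera', so that the scene is drawn into the texture
    // rather than directly to the screen.
    Ogre::TexturePtr createCubeFaceTarget(Ogre::Camera* camera)
    {
        Ogre::TexturePtr rtt = Ogre::TextureManager::getSingleton().createManual(
            "CubeFaceRTT",  // hypothetical texture name
            Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
            Ogre::TEX_TYPE_2D, 1024, 1024, 0,
            Ogre::PF_R8G8B8, Ogre::TU_RENDERTARGET);

        Ogre::RenderTexture* target = rtt->getBuffer()->getRenderTarget();
        target->addViewport(camera);
        target->getViewport(0)->setClearEveryFrame(true);
        // The returned texture can then be applied to one face of the
        // cube, like any ordinary texture.
        return rtt;
    }

In the Desktop Cube's case, four such targets would correspond to the four active cube faces.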

This 3D scene poses the question of where the screensaver should be inserted. In theory, it would have been possible for the KDE screensaver to be rendered to each of these 4 active cube faces, but not to the actual screen, which showed the virtual camera's view of the scene. A simple lock screen was rendered there instead.

This was similar to how the KDE wallpaper was also potentially different from the wallpaper of the Desktop Cube effect. With this effect, the standard KDE wallpaper is seen on each of the active cube faces, while a different wallpaper can be chosen for the whole scene.

It is a good thing that the developers added, as default behavior, a regular lock screen for the display, for when the real screensaver was running, and potentially rendering its animation to 4 of the faces of the cube. Because in the latter place, that screensaver would also have failed to lock the display.

And one of the behaviors which I have read about states that some computers can have problems going into and out of Suspend Mode, if there is active 3D rendering (on the GPU) taking place in the background. A Desktop Cube would be an example where a 3D rendering pipeline has been inserted in such a way that it is not removed or deactivated, when the effect seems passive, or when we try to put the laptop into Suspend Mode.

This has sometimes resulted in crashes, because the graphics driver was not up to the Debian version of Suspend Mode, or even because the Index Buffer of the GPU came to contain garbled data, after the VRAM of the GPU lost power in Debian's Suspend Mode.