OGRE 1.11.5 Working on ‘Phosphene’

One of the open-source software projects which has often fascinated me is called OGRE, which actually stands for Object-Oriented Graphics Rendering Engine. It’s a very powerful set of libraries that allows good coders to design 3D graphics applications which take full advantage of hardware-accelerated – i.e., GPU-based – rendering, turning the virtual 3D scenes designed by such users into simulated 2D camera views within the same scene. This is one of the most common modes in which 3D graphics is operated.
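
To give a concrete idea of what coding against this engine looks like, below is a minimal sketch in C++, following the pattern which OGRE 1.11’s own ‘Bites’ helper classes encourage, much as in the official bootstrap tutorial. The class name ‘MyApp’ is merely my placeholder:

    // A minimal OGRE 1.11.x application sketch, using the OgreBites helpers.
    #include <Ogre.h>
    #include <OgreApplicationContext.h>
    #include <OgreRTShaderSystem.h>

    class MyApp : public OgreBites::ApplicationContext
    {
    public:
        MyApp() : OgreBites::ApplicationContext("MyApp") {}

        void setup() override
        {
            OgreBites::ApplicationContext::setup();

            Ogre::Root *root = getRoot();
            Ogre::SceneManager *scnMgr = root->createSceneManager();

            // Register the scene with the run-time shader system,
            // which stands in for the old fixed-function pipeline.
            Ogre::RTShader::ShaderGenerator *shadergen =
                Ogre::RTShader::ShaderGenerator::getSingletonPtr();
            shadergen->addSceneManager(scnMgr);

            // The camera defines the simulated 2D view into the 3D scene.
            Ogre::SceneNode *camNode =
                scnMgr->getRootSceneNode()->createChildSceneNode();
            Ogre::Camera *cam = scnMgr->createCamera("MainCam");
            cam->setNearClipDistance(5);
            camNode->attachObject(cam);
            camNode->setPosition(0, 0, 140);

            // Render the camera's view into the window, as a viewport.
            getRenderWindow()->addViewport(cam);
        }
    };

    int main()
    {
        MyApp app;
        app.initApp();
        app.getRoot()->startRendering();
        app.closeApp();
        return 0;
    }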

One of the things that OGRE is not, is a full-fledged game engine unto itself. This is due to:

  • Lack of a sound implementation (additionally linking applications to the SDL library may solve that),
  • Lack of scripting support without some sort of add-on. I think I compiled it with Python support, which would supply scripting, if my coding were good enough.

Modern builds of OGRE break with the past in that they no longer use ‘OIS’ as their input system. Instead, at least their Sample Browser uses the SDL library for the same purpose.
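
For illustration, what follows is a minimal sketch, in C++, of the style of SDL2 event polling that takes over the role OIS used to fill. This is my own example of the technique, not OGRE’s internal code:

    // A minimal SDL2 input-polling loop.
    #include <SDL2/SDL.h>

    // Returns false once the user has asked to quit.
    bool pollInput()
    {
        SDL_Event ev;
        while (SDL_PollEvent(&ev))
        {
            switch (ev.type)
            {
            case SDL_QUIT:
                return false;               // The window was closed.
            case SDL_KEYDOWN:
                if (ev.key.keysym.sym == SDLK_ESCAPE)
                    return false;           // Escape also exits.
                break;
            default:
                break;                      // Ignore other events here.
            }
        }
        return true;                        // Keep running.
    }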

One of the feats which I have now accomplished on the computer named ‘Phosphene’, which is a Debian / Stretch, Debian 9 system, was to compile version 1.11.5 of this engine, because I’m curious about Game Design, and have been for a long time. And one of the reasons I feel that this software is stable on Phosphene is the information which I already provided in this past posting. That posting announced observations which I made when this same hardware was named ‘Plato’, but was already running Debian Stretch.

What my observation essentially suggests is that running 3D, OpenGL applications can in fact break the compositor, because such applications suspend it, but that there is a work-around.

(Updated 2/20/2019, 19h00 … )


Major Update, On Computer ‘Plato’

(As of 20h30 : )

One fact which may confuse some of my readers is that I have more than one computer. The server of my Web-site and blog is a Debian / Jessie, Debian 8 system called ‘Phoenix’.

But a much more interesting computer is a Debian / Stretch, Debian 9 system called ‘Plato’. That is an early 8-core machine with 12GB of RAM, which does not act as much of a server, but which has an nVidia GeForce GTX 460 graphics card, with proprietary drivers.

This evening, ‘Plato’ received a set of updates that both upgraded its Linux version from Debian 9.4 to Debian 9.5, and upgraded its graphics drivers. Aside from some minor misbehavior before the required reboot, the updates seemed to take place smoothly. However, I must now test my graphics capabilities, and all things related to its GPU.

‘Plato’ also received an update to its CUDA drivers.

(Update 21h10 : )

On the computer I name ‘Plato’, I can still play the game ‘Quern – Undying Thoughts’ (through my Steam account), and the LuxCore renderer still works, which verifies my OpenCL capabilities.

I never did a thorough test of the CUDA capabilities, but because my nVidia control panel still tells me that I have 336 CUDA cores, I’m assuming that their basic functionality is still intact.
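
For readers who would rather double-check such numbers from code than from the GUI, below is a minimal sketch using the CUDA runtime API, assuming the CUDA development headers are installed. Note that the driver reports streaming multiprocessors (SMs), and the cores-per-SM factor depends on the architecture – on a GTX 460 (Fermi, compute capability 2.1), each SM holds 48 cores, so that 7 × 48 = 336:

    // A minimal CUDA device query; build with:  nvcc query.cu -o query
    #include <cuda_runtime.h>
    #include <cstdio>

    int main()
    {
        cudaDeviceProp prop;
        if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess)
            return 1;                     // No usable CUDA device.

        std::printf("Device: %s\n", prop.name);
        std::printf("Multiprocessors (SMs): %d\n", prop.multiProcessorCount);
        std::printf("Compute capability: %d.%d\n", prop.major, prop.minor);
        // Total CUDA cores = SMs x cores-per-SM, where the latter is
        // architecture-dependent (48 per SM at compute capability 2.1).
        return 0;
    }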

Dirk


Getting Steam to run with proprietary nVidia.

According to this earlier posting, I had switched my Debian / Stretch, Debian 9 system named ‘Plato’ from the open-source ‘Nouveau’ drivers, which are delivered via ‘Mesa’ packages, to the proprietary nVidia drivers, because the latter offer more power, in several ways.

But then one question which we’d want an answer to is how to get “Steam” to run. Straight from the Linux package manager, the available games are slim pickings, while through a Steam membership we can buy – i.e., actually pay money for – Linux versions of at least some powerful games.

But, when I tried to launch Steam naively – which used to launch – all I got was a message-box saying that Steam could not find the 32-bit version of ‘libGL.so’, after which Steam died. This temporary result ‘makes sense’, because I had only installed the default, 64-bit libraries that go with the proprietary packages. Steam is a 32-bit application by default, and I have a multi-arch setup as a prerequisite.
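
As a quick diagnostic of the sort of failure Steam ran into, one can compile a tiny 32-bit test program that simply tries to dlopen() the GL library. This is only my own sketch of the technique, not anything Steam does internally, and it assumes a multi-arch toolchain that accepts ‘-m32’:

    // Minimal sketch: can a 32-bit process find and load libGL?
    // Build 32-bit with:  g++ -m32 testgl.cpp -ldl -o testgl
    #include <dlfcn.h>
    #include <cstdio>

    int main()
    {
        // The runtime library is usually installed as 'libGL.so.1'.
        void *handle = dlopen("libGL.so.1", RTLD_LAZY);
        if (!handle)
        {
            std::fprintf(stderr, "Could not load libGL: %s\n", dlerror());
            return 1;
        }
        std::printf("libGL loaded successfully.\n");
        dlclose(handle);
        return 0;
    }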

And so my next project became to create a 32-bit interface to the rendering system, alongside the existing, 64-bit one.

The steps that I took assume that I had previously chosen to install the ‘GLVND’ version of the GLX binaries, and unless the reader has done the same, the following recipe will be incorrect. Only, the ‘GLVND’ packages which I initially installed are not listed in the posting linked to above; they belonged to the suggested packages which, as I wrote, I had written down on paper and then added to the command-line that transformed my graphics system.

When I installed the additional 32-bit libraries, I did get a disturbing error message, but my box still runs.


Finding Out How Many GPU Cores We Have Under Linux, Revisited!

In this earlier posting, I tried to describe, in a roundabout way, what the shader cores of a GPU – the Graphics Processing Unit – actually do.

And in this earlier posting, I tried to encourage even Linux users to find out approximately how many GPU cores they have, given a correct install of the open standard OpenCL – for actual GPU computing – using the command-line tool ‘clinfo’. But that use of ‘clinfo’ left much to be desired, including the fact that OpenCL sometimes only reports, for each core group, a maximum number of cores that is a power of 2, even when the actual number of cores is not a power of two.
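
To illustrate why, below is a minimal sketch of the kind of query a tool like ‘clinfo’ performs, using the OpenCL C API from C++. What OpenCL hands back directly are compute units – i.e., core groups – and power-of-two work-group limits, neither of which states the exact number of shader cores:

    // A minimal OpenCL device query; link with:  -lOpenCL
    #include <CL/cl.h>
    #include <cstdio>

    int main()
    {
        cl_platform_id platform;
        cl_device_id device;
        cl_uint units;
        size_t groupSize;

        if (clGetPlatformIDs(1, &platform, nullptr) != CL_SUCCESS)
            return 1;                     // No OpenCL platform found.
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1,
                           &device, nullptr) != CL_SUCCESS)
            return 1;                     // No GPU device on this platform.

        // Compute units are core groups, not individual shader cores.
        clGetDeviceInfo(device, CL_DEVICE_MAX_COMPUTE_UNITS,
                        sizeof(units), &units, nullptr);
        // Work-group limits are typically powers of two.
        clGetDeviceInfo(device, CL_DEVICE_MAX_WORK_GROUP_SIZE,
                        sizeof(groupSize), &groupSize, nullptr);

        std::printf("Compute units: %u\n", units);
        std::printf("Max work-group size: %zu\n", groupSize);
        return 0;
    }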

Well, if we have the full set of nVidia drivers installed – including nVidia CUDA, which is a competitor to OpenCL – as well as the nVidia Settings GUI, it turns out that there is a much more accurate answer:

(Screenshot: the nVidia Settings GUI, reporting the GPU’s exact number of CUDA cores.)

But, this method has the drawback that it’s only available to us when we have both nVidia hardware and the proprietary drivers installed. This could lead some people to the false impression that maybe, only nVidia graphics cards have real GPUs?

Dirk