## Major Update, On Computer ‘Plato’

(As of 20h30 : )

One fact which may confuse some of my readers is that I have more than one computer. The server of my Web-site and blog is a Debian / Jessie, Debian 8 system called ‘Phoenix’.

But a much more interesting computer is a Debian / Stretch, Debian 9 system called ‘Plato’. That is an early, 8-core machine with 12GB of RAM, which does not act as much of a server, but which has an NVidia GeForce GTX460 graphics card, with proprietary drivers.

This evening, ‘Plato’ received a set of updates that both upgraded its Linux version from Debian 9.4 to Debian 9.5, and upgraded its graphics drivers. Aside from some minor misbehavior before the required reboot, the updates seemed to take place smoothly. However, I must now test my graphics capabilities, and all things related to the GPU.
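As a trivial sanity check after the reboot, the point-release can be confirmed from the command-line:

```shell
# Print the Debian point-release; after this update, it should read 9.5.
cat /etc/debian_version
```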

‘Plato’ also received an update to its CUDA drivers.

(Update 21h10 : )

On the computer I name ‘Plato’, I can still play the game ‘Quern – Undying Thoughts’ (through my Steam account), and LuxCoreRender still works, which verifies my OpenCL capabilities.

I never did a thorough test of the CUDA capabilities, but because my NVidia control panel still tells me I have 336 CUDA cores, I’m assuming its basic functionality is still intact.

Dirk

## Getting Steam to Run with the Proprietary nVidia Drivers

According to this earlier posting, I had switched my Debian / Stretch, Debian 9 -based computer named ‘Plato’ from the open-source ‘Nouveau’ drivers, which are delivered via ‘Mesa’ packages, to the ‘proprietary nVidia drivers’, because the latter offer more power, in several ways.

But then one question which we’d want an answer to is how to get “Steam” to run. Just from the Linux package manager, the available games are slim pickings, while through a Steam membership, we can buy Linux-versions of at least some powerful games, meaning, pay for them with money.

But, when I tried to launch Steam naively, as it used to launch before, I only got a message-box which said that Steam could not find the 32-bit version of ‘libGL.so’, after which Steam died. This temporary result ‘makes sense’, because I had only installed the default, 64-bit libraries that go with the proprietary packages. Steam is a 32-bit application by default, and I have a multi-arch setup as a prerequisite.

And so my next project became, to create a 32-bit interface to the rendering system, alongside the existing 64-bit one.

The steps that I took assume that I had previously chosen to install the ‘GLVND’ version of the GLX binaries, and unless the reader has done the same, the following recipe will be incorrect. Only, the ‘GLVND’ packages which I initially installed are not listed in the posting linked to above; they belonged to the suggested packages, which I had written down on paper, and then added to the command-line that transformed my graphics system.
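A minimal sketch of what that recipe amounts to, on a Debian Stretch system with the GLVND variant of the drivers. The exact package name (‘libgl1-nvidia-glvnd-glx’) is my assumption here, and readers should verify it against their own package manager before installing:

```shell
# Sketch only: assumes Debian Stretch with the proprietary, GLVND-variant
# nVidia drivers. The first step is only needed if i386 multi-arch
# has not already been enabled.
sudo dpkg --add-architecture i386
sudo apt-get update

# The 32-bit GLX library matching the 64-bit GLVND install; the package
# name is an assumption -- verify it with 'apt-cache search nvidia glvnd'.
sudo apt-get install libgl1-nvidia-glvnd-glx:i386
```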

When I installed the additional, 32-bit libraries, I did get a disturbing error message, but my box still runs.

## Finding Out How Many GPU Cores We Have Under Linux, Revisited!

In this earlier posting, I tried to describe in a roundabout way, what the shader cores of a GPU – the Graphics Processing Unit – actually do.

And in this earlier posting, I tried to encourage even Linux-users to find out approximately how many GPU cores they have, given a correct install of the open standard OpenCL – for actual GPU computing – using the command-line tool ‘clinfo’. But that use of ‘clinfo’ left much to be desired, including the fact that sometimes, OpenCL will only report a number of cores per core-group that is a power of two, even if the actual number of cores is not a power of two.
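For reference, the sort of query I mean can be sketched like so; it assumes that ‘clinfo’ and a working OpenCL driver are installed, and the exact field names can vary between versions:

```shell
# List each OpenCL device and its reported number of compute units.
# On nVidia hardware, 'Max compute units' counts core-groups
# (multiprocessors), not individual shader cores -- hence the
# inaccuracy described above.
clinfo | grep -i -E 'device name|max compute units'
```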

Well, if we have the full set of nVidia drivers installed, including nVidia CUDA (which is a competitor to OpenCL), as well as having the nVidia Settings GUI installed, it turns out that there is a much more accurate answer:
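The nVidia Settings panel displays the count directly, and the same attribute can also be read from the command-line. A sketch, assuming the proprietary drivers and the ‘nvidia-settings’ package are installed:

```shell
# Query the driver for the exact number of CUDA cores on the first GPU.
# '-t' (terse) prints just the bare value; on my GTX460, this reports 336.
nvidia-settings -q '[gpu:0]/CUDACores' -t
```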

But, this method has the drawback that it’s only available to us when we have both nVidia hardware and the proprietary drivers installed. This could lead some people to the false impression, that maybe only nVidia graphics cards have real GPUs?

Dirk

## Setting Up VESAFB Under GRUB2

In This Earlier Posting, I had written that I switched to the proprietary nVidia graphics-drivers on the computer I name ‘Plato’, but that for the purposes of managing several console-sessions using

• <Ctrl>+<Alt>+F1,
• <Ctrl>+<Alt>+F7,

my customary solution, of setting up ‘uvesafb’, no longer works. What happens is that everything runs fine, until the command is given to switch back to the X-server session, at which point the system crashes. Thus, as I had left it at first, console-sessions were available, but at some horribly-low default resolution (without ‘uvesafb’). This had to be remedied, and the way I chose to solve it was actually to use the older ‘vesafb’, which is not a 3rd-party frame-buffer ‘device’, but rather a set of kernel-settings, which can be specified in the file ‘/etc/default/grub’.

Because my computers use ‘GRUB2’, the most-elegant way to solve this problem would be to put in the following two lines, that is, to uncomment and adapt them, like so:


GRUB_GFXMODE=1920x1080
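Put together, the relevant fragment of ‘/etc/default/grub’ looks like the sketch below. The second setting, ‘GRUB_GFXPAYLOAD_LINUX’, is the conventional companion to ‘GRUB_GFXMODE’, telling the kernel to keep the mode that GRUB set; readers should adapt the resolution to their own monitor:

```shell
# Sketch of /etc/default/grub (adapt the resolution to your monitor):
GRUB_GFXMODE=1920x1080

# Conventional companion setting: keep the GRUB-selected video mode
# after the kernel takes over the console.
GRUB_GFXPAYLOAD_LINUX=keep
```

The changes only take effect after running ‘sudo update-grub’ and rebooting.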