I’ve finally installed the proprietary nVidia graphics drivers.

In this earlier posting, I had written about how risky a project it would be to switch from the open-source ‘Nouveau’ graphics drivers, which are provided under Debian / Linux by a set of packages whose names contain the word ‘Mesa’, to the proprietary ‘nVidia’ drivers. So risky, in fact, that for a long time I balked at doing it.

Well, just this evening, I made the switch. Under Debian / Stretch – aka Debian 9 – this switch is relatively straightforward to accomplish. What we do is switch to a text-session, using <Ctrl>+<Alt>+F1, and then kill the X-server. From there, we essentially just need to give the following command (as root):

apt-get install nvidia-driver nvidia-settings nvidia-xconfig

Giving this command essentially allows the Debian package manager to perform all the post-install steps, such as blacklisting the Nouveau drivers. One should expect this command to do quite a lot of work as a side-effect, as it pulls in quite a few dependencies.
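For what it’s worth, the whole sequence looks roughly like the sketch below. The display-manager name is an assumption on my part – it could be ‘lightdm’, ‘sddm’, or something else, depending on which desktop is installed – so the reader should substitute whichever one is actually running:

# From the text console reached with <Ctrl>+<Alt>+F1, as root:
systemctl stop lightdm      # stop the display manager, which kills the X-server
apt-get install nvidia-driver nvidia-settings nvidia-xconfig
systemctl reboot            # reboot afterwards, so the blacklisted Nouveau module is no longer loaded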

(Edit 04/30/2018 :

In addition, the user must have up-to-date Linux kernel headers installed, because installing the graphics driver also requires building DKMS kernel modules. But it’s always been my assumption that I’d have kernel headers installed anyway. )
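On Debian, a hedged way to make sure that headers matching the running kernel are present – the package name below assumes a stock Debian kernel – would be:

apt-get install linux-headers-$(uname -r)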

When I first gave the ‘nvidia-driver’ command above, apt-get suggested additional packages to me, which I wrote down on a sheet of paper. I then answered ‘No’ to the question of whether to proceed (without them), so that I could add all the suggested packages to a new command-line.

(Update 05/05/2018 :

The additional, suggested packages which I mentioned above offer the ‘GLVND’ version of GLX. With nVidia, there are actually two ways to deliver GLX: an nVidia-centered way, and a generic way. ‘GLVND’ provides the generic way. It’s also potentially more useful if, later on, we want to install the 32-bit versions as well.

However, if we fail to add any other packages to the command-line, the graphics driver will load, but we won’t have any OpenGL capabilities at all. Some version of GLX must also be installed, and my package manager just happened to suggest the ‘GLVND’ packages.
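A hedged way to check, after the fact, which flavor of GLX the package manager actually pulled in – on Debian, the generic packages should have ‘glvnd’ somewhere in their names – would be:

dpkg -l | grep -i 'nvidia.*glx'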

Without OpenGL at all, the reader will be very disappointed, especially since even his desktop-compositing will not be running – at first.

The all-nVidia packages, which are not the ‘GLVND’ packages, accept certain primitives from user-space applications which ‘GLVND’ does not implement, because those instructions are not generically a part of OpenGL. Yet certain applications do exist which require the non-‘GLVND’ version of GLX to be installed, and I leave it up to the reader to find out which packages provide that – if the reader needs them – and to write their names on a sheet of paper, prior to switching drivers.

It should be noted that, once we’ve decided to install either the ‘GLVND’ or the non-‘GLVND’ version of GLX, changing our minds and switching to the other version is yet another nightmare, which I have not even contemplated so far. I’m content with the ‘GLVND’ version of GLX. )

(Edited 04/30/2018 :

There is one aspect of installing up-to-date nVidia drivers which I should mention. The GeForce GTX460 graphics card does not support 3rd-party frame-buffers. A 3rd-party frame-buffer driver would normally allow <Ctrl>+<Alt>+F1 to show us not only a text-session, but one with decent resolution. Well, with older, legacy graphics chips, what I’d normally do is use the ‘uvesafb’ frame-buffer driver, just to obtain that. With modern nVidia hardware and drivers, this frame-buffer driver is incompatible. It even causes crashes, because with it, essentially, two drivers are trying to control the same hardware.

Just this evening, I tried to get ‘uvesafb’ working on this machine one more time, to no avail, even though it does work on the computer I name ‘Phoenix’. )
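For reference, a sketch of how I’d disable ‘uvesafb’ again under Debian, assuming it had been enabled through the initramfs in the usual way (the paths are the standard Debian ones):

echo 'blacklist uvesafb' > /etc/modprobe.d/blacklist-uvesafb.conf
# ...then remove any 'uvesafb' line from /etc/initramfs-tools/modules, and:
update-initramfs -u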

So the way it looks for me now, text-sessions are available, but only in very low resolution. They exist only for emergencies now.

But this is the net result I obtained, after I had disabled the ‘uvesafb’ kernel module again:

 


dirk@Plato:~$ infobash -v
Host/Kernel/OS  "Plato" running Linux 4.9.0-6-amd64 x86_64 [ Kanotix steelfire-nightly Steelfire64 171013a LXDE ]
CPU Info        8x Intel Core i7 950 @ clocked at Min:1600.000Mhz Max:2667.000Mhz
Videocard       NVIDIA GF104 [GeForce GTX 460]  X.Org 1.19.2  [ 1920x1080 ]
Processes 262 | Uptime 1:16 | Memory 3003.9/12009.6MB | HDD Size 2000GB (6%used) | GLX Renderer GeForce GTX 460/PCIe/SSE2 | GLX Version 4.5.0 NVIDIA 375.82 | Client Shell | Infobash v2.67.2
dirk@Plato:~$

dirk@Plato:~$ clinfo | grep units
  Max compute units                               7
dirk@Plato:~$ clinfo | grep multiple
  Preferred work group size multiple              32
dirk@Plato:~$ clinfo | grep Warp
  Warp size (NV)                                  32
dirk@Plato:~$


 

So what this means in practice is that I now have OpenGL 4.5 on the computer named ‘Plato’, as well as a fully-functional install of ‘OpenCL’ and ‘CUDA’, contrary to what I had according to this earlier posting.

Therefore, GPU-computing will not just exist in theory for me now, but also in practice.

And this output reveals that the graphics card on that machine ‘only’ possesses 224 cores after all (presumably 7 compute units × the warp size of 32), not the 7×48 = 336 which I had expected earlier, according to a Windows-based tool that is no longer installed.

(Updated 04/29/2018 … )

Continue reading I’ve finally installed the proprietary nVidia graphics drivers.

I’m impressed with the Mesa drivers.

Before we install Linux on our computers, we usually try to make sure that we have either an NVIDIA or an AMD / Radeon GPU – the graphics chip-set – so that we can use the proprietary NVIDIA drivers designed by that company to run under Linux, or the proprietary ‘fglrx’ drivers provided by AMD, or the ‘Mesa’ drivers, which are open-source and designed by Linux specialists. Because each proprietary driver only covers one of the available families of chip-sets, this means that after we have installed Linux, our choice boils down to a choice between either the proprietary drivers or the Mesa drivers.

I think that the main advantage of the proprietary drivers remains that they will offer our computers the highest version of OpenGL possible from the hardware – which could go up to 4.5! But obviously, there are also advantages to using Mesa, one of which is the fact that installing those drivers doesn’t install a ‘blob’ – an opaque piece of binary code which nobody can analyze. Another is the fact that the Mesa drivers provide ‘VDPAU’, which the ‘fglrx’ drivers fail to implement. This last detail has to do with the hardware-accelerated playback of 2D video-streams that have been compressed with one out of a very short list of Codecs.
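A hedged way to verify that VDPAU acceleration is actually available on a given install would be the ‘vdpauinfo’ tool, which Debian packages under that same name:

apt-get install vdpauinfo
vdpauinfo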

But I would add, to the possible reasons for choosing Mesa, the fact that its stated OpenGL version-number does not set a real limit on what the graphics chip-set can do. Officially, Mesa offers OpenGL 3.0, and on the surface this could make it look as though its implementation of OpenGL is somewhat lacking, as a trade-off against its other benefits.

One way in which ‘OpenGL’ seems to differ from its real-life competitor, ‘DirectX’, is in the system by which certain DirectX drivers and hardware offer a numeric feature-level, where, if that feature-level has been achieved, the game-designer can count on a specific set of features being implemented. What seems to happen with OpenGL instead is that version 3.0 must first be satisfied. And if it is, the 3D application next checks individually whether the OpenGL system available offers specific OpenGL extensions, by name. If the application is very well-written, it will test for the existence of every extension it needs, before giving the command to use that extension. But in certain cases, a failure to test this can lead to the graphics card crashing, because the graphics card itself may not have the extension requested.
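This is not what a 3D application does internally – applications query the extension list through calls such as glGetStringi() – but the same list can be inspected by name from a shell, using ‘glxinfo’ from the ‘mesa-utils’ package. The extension named below is only an example; a non-zero count means it is advertised:

glxinfo | grep -c GL_ARB_geometry_shader4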

As an example of what I mean, my KDE / Plasma compositor settings allow me to choose ‘OpenGL 3.1’ as an available back-end, and when I select it, it works, in spite of my Mesa drivers ‘only’ achieving 3.0. I think that if the drivers had been stated to be 3.1, this could actually have meant that they lose backward-compatibility with 3.0, while in fact they preserve that backward-compatibility as much as possible.
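And for the record, the version which the installed drivers actually advertise can be read like so, which is one way to confirm the ‘3.0’ figure I quoted above:

glxinfo | grep 'OpenGL version string'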

(Screenshots: screenshot_20171127_185831, screenshot_20171127_185939)

Continue reading I’m impressed with the Mesa drivers.

Why R2VB Should Not Simply be Deprecated

The designers of certain graphics cards / GPUs have decided that Render-To-Vertex-Buffer is deprecated. In order to appreciate why I believe this to be a mistake, the reader first needs to know what R2VB is – or was.

The rendering pipelines of DirectX 9 and DirectX 11 are somewhat different, yet also very similar. DirectX 9 was extremely versatile, with a wide range of applications written to use it, while the fancier Dx 11 pipeline is more powerful, but has less of an established base of algorithms.

Dx 9 is approximated in OpenGL 2, while Dx 10 and Dx 11 are approximated in OpenGL 3(+).

Continue reading Why R2VB Should Not Simply be Deprecated

OGRE 1.10 Compiled On Laptop ‘Klystron’

One of the projects which I had been working on, while my Hewlett-Packard laptop was running Windows 8.1 and was named ‘Maverick’, was to compile “OGRE 1.10” on it to the best of my ability. And one mistake which I kept adhering to was to insist on using the ‘MinGW’ compiler suite. The OGRE developers had already tried to convince me to use the MS compiler, since that was a Windows computer, but I did not comply. This was particularly pedantic of me, since by now a free version of Visual Studio is available that can compile OGRE.

So now that the same hardware has Linux installed on it, I recommenced compiling OGRE, with native compilers and tools. But the results were not exactly spectacular.

One reason for the lackluster results is the fact that ‘Klystron’ currently has ‘Mesa’ drivers loaded for its Radeon graphics card, instead of the proprietary, binary ‘fglrx’ driver. Mesa will give it OpenGL 3.3 at most, while ‘fglrx’ would have given it OpenGL 4.5. And the latest OGRE samples include samples with Geometry Shaders and other OpenGL 3 features, and even some Tessellation Shaders, which would be OpenGL 4 features.

Apparently, when one pushes any Mesa Drivers to their limits, they will bug out and can even cause the X-server to freeze. Thus, when I switched from testing the OGRE OpenGL 2 rendering engine to its OpenGL 3+ rendering engine, I ran into an X-server freeze.

This did not force me to hard-boot, because often, during an X-server lockup, I can <Ctrl>+<Alt>+F1 to a console window, from there do a user and a root login, and then issue an ‘init 6’ command, which will usually do a controlled reboot, in which all file systems are unmounted correctly before the restart.
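In other words, the recovery sequence is roughly the following sketch, once the text console has been reached with <Ctrl>+<Alt>+F1 and I have logged in as my regular user:

su -      # become root
init 6    # controlled reboot, unmounting file systems cleanly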

There is one detail of what the Mesa Drivers do, which I like a whole lot. They allow shader code written in the language Cg to run, even though Cg is a legacy toolkit developed by nVidia, for use on nVidia graphics cards and not on Radeon.

The fact that the Mesa Drivers allow me to do that, unlike the limitations which were imposed on me under Windows 8.1, means that with OGRE 1.10, the Terrain System finally works 100%. OGRE 1.10 uses GPU-generated terrain, whereas most graphics engines rely entirely on the CPU to create terrain. The earlier inability to get terrain to work with this system was more crippling than anything else.

But as long as I am not using the ‘fglrx’ drivers, all attempts to get OpenGL 3 features to work with OGRE utterly fail, including any hope of isosurfaces, which rely on Geometry Shaders, and any hope of GS-based particles. My particles will be limited to Point Sprites, then.

What one does in a situation such as this is not just to throw out OGRE 1.10, but rather to disable modules. And so I disabled the GL3+ rendering engine, as well as one ‘HLMS Sample’, and am now able to get many of the samples to run, including, importantly, the Terrain Samples.
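For the curious, disabling a module happens at configure time. A minimal sketch, assuming the usual OGRE 1.10 CMake option names (they can be verified with ‘cmake -LH’ from the build directory):

cmake -DOGRE_BUILD_RENDERSYSTEM_GL3PLUS=OFF ..
make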


Also, there remains an advantage to using the Mesa Drivers, which was already pointed out to me on the Kanotix site. The Mesa Drivers allow hardware-acceleration of high-bandwidth, 2D video streams via ‘VDPAU’, while if I were to use ‘fglrx’, the decoding of MP4 Videos would be limited to CPU decoding, which is in itself lame if we ever want to watch serious video streams. And since that laptop has a screen resolution of 1600×900, wanting to watch videos on it eventually remains a very realistic prospect.
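As a hedged example of putting that acceleration to use: if the ‘mpv’ player is installed, it can be told explicitly to decode through VDPAU (the file name here is only a placeholder):

mpv --hwdec=vdpau some-video.mp4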

Dirk

 

(Edit : ) I suppose that one question which I should be asking myself, about why, perhaps, the OGRE 1.10 GL3+ Rendering Engine does not work, would be whether this could be due to some incompatibility with the OpenGL 3.1 Desktop Compositing which I am already running on the same machine. There have been past cases where OpenGL 2 from an application did not agree with OpenGL 2 Desktop Compositing, but those cases have generally been solved by the developers of the desktop managers.

On ‘Klystron’, I have rich desktop effects running that use GL 3.1. So it does not seem obvious that the Mesa Drivers, as such, have problems implementing GL 3.

Also, there is a follow-up thought as to why, maybe, Cg was not working before. Whether or not our graphics cards support Cg, building OGRE’s Cg Program Manager is a necessary part of building OGRE. Under Windows 8.1, I was always unsure of how to provide the OGRE Dependencies when building OGRE. But among those dependencies, I always linked in a file named ‘Cg.dll’, the origin of which was unknown to me.

It is exactly the sort of goofy mistake I would make: perhaps I took this DLL file from the install folders of Cg on ‘Maverick’, but for some reason took a 64-bit DLL into my 32-bit OGRE build, or took a DLL from somewhere which may not have been compatible for some other reason.

At least when we install dependencies under Linux from the package manager, such issues as the linkage of code and the location of folders are also taken care of by the package manager. So I am sure that the Cg Program Manager belonging to OGRE recognized the nVidia Cg packages when compiling this time. It is just a bit odd that those were native Cg libraries with header files, while my graphics drivers remain the Mesa Drivers.