I’ve finally installed the proprietary nVidia graphics drivers.

In this earlier posting, I had written that it would be a risky project to switch from the open-source ‘Nouveau’ graphics drivers – which, under Debian / Linux, are provided by a set of packages whose names contain the word ‘Mesa’ – to the proprietary ‘nVidia’ drivers. So risky, in fact, that for a long time I faltered at doing it.

Well, just this evening, I made the switch. Under Debian / Stretch – aka Debian 9 – this switch is relatively straightforward to accomplish. What we do is switch to a text-session, using <Ctrl>+<Alt>+F1, and then kill the X-server. From there, we essentially just need to give the following command (as root):

apt-get install nvidia-driver nvidia-settings nvidia-xconfig

Giving this command essentially allows the Debian package manager to perform all the post-install steps, such as blacklisting the Nouveau drivers. One should expect this command to do a lot of work as a side effect, since it pulls in quite a few dependencies.
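Put together, a minimal sketch of the whole sequence might look like the following. I’m assuming here that the display manager is ‘lightdm’; on other setups it could just as well be ‘sddm’ or ‘gdm3’:

# At the <Ctrl>+<Alt>+F1 text console, logged in as root:
systemctl stop lightdm

apt-get update
apt-get install nvidia-driver nvidia-settings nvidia-xconfig

# Reboot afterwards, so that the blacklisted Nouveau module is no longer loaded:
systemctl reboot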

(Edit 04/30/2018 :

In addition, the user must have up-to-date Linux kernel headers installed, because installing the graphics driver also requires building DKMS kernel modules. But it’s always my assumption that I’d have the kernel headers installed anyway. )
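For readers who don’t already have them, a command along these lines should fetch headers matching the running kernel; the ‘linux-headers-amd64’ metapackage name is my assumption for a 64-bit Debian system:

apt-get install linux-headers-$(uname -r)

# Or, more generally, the metapackage which tracks the current kernel:
apt-get install linux-headers-amd64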

When I gave this command the first time, apt-get suggested additional packages to me, which I wrote down on a sheet of paper. I then answered ‘No’ to the question of whether or not to proceed (without those), so that I could add all the suggested packages onto a new command-line.

(Update 05/05/2018 :

The additional, suggested packages which I mentioned above offer the ‘GLVND’ version of GLX. With nVidia, there are actually two ways to deliver GLX: an nVidia-centered way, and a generic way. ‘GLVND’ provides the generic way. It’s also potentially more useful if, later on, we might want to install the 32-bit versions as well.

However, if we fail to add any other packages to the command-line, then the graphics driver will load, but we won’t have any OpenGL capabilities at all. Some version of GLX must also be installed, and my package manager just happened to suggest the ‘GLVND’ packages.

Without OpenGL at all, the reader will be very disappointed, especially since even his desktop-compositing will not be running – at first.
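Once the machine has been rebooted, one quick way to verify that GLX and OpenGL really are being served by the proprietary driver is the ‘glxinfo’ utility, which on Debian I believe belongs to the ‘mesa-utils’ package:

# Run from within a working X-session:
glxinfo | grep -i "opengl renderer"
glxinfo | grep -i "opengl version"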

The all-nVidia packages – the ones which are not the ‘GLVND’ packages – accept certain primitives from user-space applications which ‘GLVND’ does not implement, because those instructions are not generically part of OpenGL. Yet certain applications do exist which require the non-‘GLVND’ version of GLX to be installed, and I leave it up to the reader to find out which packages provide that – if the reader needs them – and to write their names on a sheet of paper, prior to switching drivers.
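As a starting point, a command along these lines should at least list the candidate GLX packages which the non-free nVidia section of the repository offers, although the reader would still need to determine which variant a given application wants:

apt-cache search nvidia | grep -i glx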

It should be noted that, once we’ve decided to install either the ‘GLVND’ or the non-‘GLVND’ version of GLX, trying to change our minds and to switch to the other version is yet another nightmare, which I have not even contemplated so far. I’m content with the ‘GLVND’ version of GLX. )

(Edited 04/30/2018 :

There is one aspect of installing up-to-date nVidia drivers which I should mention. The GeForce GTX460 graphics card does not support 3rd-party frame-buffers. Those 3rd-party frame-buffer drivers would normally allow <Ctrl>+<Alt>+F1 to show us not only a text-session, but one with decent resolution. With older, legacy graphics chips, what I’d normally do is use the ‘uvesafb’ frame-buffer driver, just to obtain that. With modern nVidia hardware and drivers, this frame-buffer driver is incompatible. It even causes crashes, because with it, essentially, two drivers are trying to control the same hardware.

Just this evening, I tried to get ‘uvesafb’ working one more time, to no avail, even though it does work on the computer I name ‘Phoenix’. )
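For what it’s worth, a rough sketch of how I’d disable the module again follows; the exact file names are assumptions on my part, and depend on how ‘uvesafb’ was enabled in the first place:

# Blacklist the module, so that it no longer loads at boot:
echo "blacklist uvesafb" > /etc/modprobe.d/blacklist-uvesafb.conf

# Remove any 'uvesafb' line from /etc/initramfs-tools/modules, if one was added, then:
update-initramfs -u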

So the way it looks now for me, the text-sessions are available, but only in very low resolution. They only exist for emergencies now.

But this is the net result I obtained, after I had disabled the ‘uvesafb’ kernel module again:

 


dirk@Plato:~$ infobash -v
Host/Kernel/OS  "Plato" running Linux 4.9.0-6-amd64 x86_64 [ Kanotix steelfire-nightly Steelfire64 171013a LXDE ]
CPU Info        8x Intel Core i7 950 @ clocked at Min:1600.000Mhz Max:2667.000Mhz
Videocard       NVIDIA GF104 [GeForce GTX 460]  X.Org 1.19.2  [ 1920x1080 ]
Processes 262 | Uptime 1:16 | Memory 3003.9/12009.6MB | HDD Size 2000GB (6%used) | GLX Renderer GeForce GTX 460/PCIe/SSE2 | GLX Version 4.5.0 NVIDIA 375.82 | Client Shell | Infobash v2.67.2
dirk@Plato:~$

dirk@Plato:~$ clinfo | grep units
  Max compute units                               7
dirk@Plato:~$ clinfo | grep multiple
  Preferred work group size multiple              32
dirk@Plato:~$ clinfo | grep Warp
  Warp size (NV)                                  32
dirk@Plato:~$


 

So what this means in practice is that I now have OpenGL 4.5 on the computer named ‘Plato’, as well as a fully-functional install of ‘OpenCL’ and ‘CUDA’, contrary to what I had according to this earlier posting.

Therefore, GPU-computing now exists for me not just in theory, but also in practice.
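As a quick sanity check that the card really is available for compute work, the ‘nvidia-smi’ utility can be asked to list it, together with the driver version. On Debian, I believe this utility lives in its own ‘nvidia-smi’ package, but that detail is from memory:

nvidia-smi -L

# Or the full status table, with temperature, fan speed and memory usage:
nvidia-smi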

And the clinfo output above suggests that the graphics card on that machine ‘only’ possesses 224 cores after all – 7 compute units × 32 – not the 7×48 which I had expected earlier, according to a Windows-based tool that is no longer installed.

(Updated 04/29/2018 … )

(As of 04/27/2018 : )

I suppose that one observation the reader may have is that I opted to install the package ‘nvidia-xconfig’, which provides a utility to create an ‘xorg.conf’ file, even though, according to modern Linux usage, the mere existence of an ‘xorg.conf’ file is deprecated. Modern X-servers don’t need this configuration file.

But I eventually opted to install this package, just so that I could give the command (as root):

nvidia-xconfig --cool-bits=4

The number given here specifies one bit. Additional bits could be set, but I only felt that the bit which corresponds to the number 4 could be set safely. After a reboot, this step will allow the user to adjust his GPU fan speed, which cannot be accomplished without carrying it out.

After that, even the nVidia Settings GUI will possess an additional slider, with which the user can take control of his GPU fan.
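The same adjustment can also be scripted from within a running X-session, using ‘nvidia-settings’ on the command-line. The attribute names below are what I’d expect for a driver of this generation, but the reader should treat them as assumptions and consult the driver’s own documentation:

# Enable manual fan control, then set the target speed as a percentage:
nvidia-settings -a "[gpu:0]/GPUFanControlState=1"
nvidia-settings -a "[fan:0]/GPUTargetFanSpeed=60"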

(Edit 04/29/2018 : )

There exists a lingering question in my mind, of whether I actually possess 224 or 336 cores on that GPU. And according to the technical specifications on the nVidia Web-site, in fact, I have 336.

But with the OpenCL drivers that produced the data above, one drawback which might exist is only being able to allocate warps in powers of two – hence, 32 at a time rather than 48 at a time. In that case, the next relevant question becomes whether the additional 7×16 cores are in fact available for GPU-computing. And I suspect that the answer is ‘No.’ At least not using OpenCL.

Yet, if a different API is being used, such as the CUDA SDK, it may be that those additional 7×16 cores can be used. And with OpenGL shaders, again, the driver may find ways to allocate the full 7×48 cores, even though an OpenCL GPU-program might not.
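If the CUDA SDK’s sample programs are installed, its ‘deviceQuery’ utility is one way to see what CUDA itself reports about multiprocessors and cores per multiprocessor; the path below is only an assumption about where the samples might have been unpacked:

cd ~/NVIDIA_CUDA_Samples/1_Utilities/deviceQuery
make
./deviceQuery | grep -i "multiprocessors"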

On the subject of OpenGL: Even though the driver now indicates its ability to support v4.5, I’ve learned that my actual hardware will not support anything greater than v4.1. But I am not crying. :-)

Another subject which comes to my mind – since computing hardware ‘likes’ powers of two – is why there are 7 vector processors on the GTX460 and not 8. The story which I heard about that was that the GPU was designed to offer 8×48 cores. But a certain number of chips came out partially defective, back in the days when this graphics card was state of the art. Hence, if the factory discovered that all 8 core groups were working – given that all integrated circuits undergo some sort of factory testing – then nVidia would have sold us a GTX470 graphics card. But if 1 out of the possible 8 core groups was defective, then a laser was used to deactivate that core group, and the same design was used to sell customers a GTX460 graphics card. The GTX460 would fetch slightly less money on the market back then than the GTX470 did, but the GTX460 was nevertheless still worth something.

The way it is today, the GTX460 is not worth anything anymore, financially, because, in order to have a considerable price-tag, the graphics card needs to be a GTX900-series or greater…

Dirk

 
