A bit of my personal history, experimenting in 3D game design.

I was wide-eyed and curious. Well before the year 2000, I only owned Windows-based computers, paid for most of my software, and also purchased a license of 3D Game Studio, some version of which is still being sold today. The version that I purchased back then used the ‘A4’ game engine; every 3DGS version has a game engine designated by the letter ‘A’ followed by a number.

That version of 3DGS was based on DirectX 7, Microsoft’s own graphics API. DirectX 7 still had, as one of its capabilities, the ability to switch back into software-rendering mode, even though it was perhaps one of the earliest APIs to offer hardware-rendering – provided, that is, that the host machine had a graphics card capable of hardware-rendering.

Using that engine, I created a simple game, which had no real title, but which I simply referred to as my ‘Defeat The Guard Game’. And in so doing, I learned a lot.

The API referred to as OpenGL offers what the DirectX versions offer. But because Microsoft has the primary say in how the graphics hardware is to be designed, OpenGL versions are frequently just catching up to what the latest DirectX versions have to offer. There is a loose correspondence between their version numbers.

Shortly after the year 2000, I upgraded to a 3D Game Studio version with their ‘A6’ game engine. This engine was based on DirectX 9.0c, which also became standard with Windows XP. It no longer offered any possibility of software rendering, but it gave the customers of this software product their first opportunity to program shaders. And because I played with the ‘A6’ game engine for a long time, in addition to owning a PC that ran Windows XP for a long time, the capabilities of DirectX 9.0c became etched in my mind. However, as fate would have it, I never actually created anything significant with this game-engine version – only snippets of code designed to test various capabilities.

A fact about how software-rendering is managed in practice, today.

People of my generation – and I’m over 50 years old as I’m writing this – first learned about CGI – computer-generated imagery – in the form of ‘ray-tracing’. What my contemporaries are slow to find out is that, in the meantime, an important additional form of CGI has come into existence, which is referred to as ‘raster-based rendering’.

Ray-tracing has, as its advantage over raster-based rendering, better optical accuracy, which leads to photo-realism. Ray-tracing therefore still gets used a lot, especially in Hollywood-originated CGI for movies. But ray-tracing still has a big drawback: it is slow to compute. Typically, ray-tracing cannot be done in real time; it needs to be performed on a farm of computers, and an hour of CPU-time may be needed to render a sequence which plays for only 10 seconds.

But in order for consumers to be able to play 3D games, the CGI needs to happen in real time, which is why the other type of rendering was invented in the 1990s. That form of rendering is carried out by the graphics hardware, in real time.

What this dichotomy has led to is model- and scene-editors such as “Blender”, which allow complex editing of 3D content, often with the purpose that the content eventually be rendered by arbitrary, external methods, including software-based ray-tracing. But such editing applications still possess an editing-preview window / rectangle, in which their power-users can see the technical details of what they’re editing, in real time. Those editing-preview windows are then hardware-rendered using raster-based methods, even though the final result may be rendered by software-based ray-tracing.

Getting Steam to run with proprietary nVidia.

As I described in this earlier posting, I had switched my Debian / Stretch (Debian 9) -based computer named ‘Plato’ from the open-source ‘Nouveau’ drivers, which are delivered via ‘Mesa’ packages, to the proprietary nVidia drivers, because the latter offer more power in several ways.

But then one question which we’d want an answer to is how to get “Steam” to run. From the Linux package manager alone, the available games are slim pickings, while through a Steam membership we can buy – that is, pay money for – Linux versions of at least some powerful games.

But when I tried to launch Steam naively – which used to launch – I only got a message-box which said that Steam could not find the 32-bit version of ‘libGL.so’, and then Steam died. This temporary result ‘makes sense’, because I had only installed the default, 64-bit libraries that go with the proprietary packages. Steam is a 32-bit application by default, and I have a multi-arch setup as a prerequisite.
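
A quick way to confirm that diagnosis, before installing anything – a sketch, assuming a standard Debian multi-arch setup – is to ask the dynamic linker which variants of ‘libGL.so’ it knows about:

```bash
# List all libGL entries known to the dynamic linker.
# 32-bit entries are tagged (libc6); 64-bit ones (libc6,x86-64).
ldconfig -p | grep 'libGL\.so'
```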

And so my next project became to create a 32-bit interface to the rendering system, alongside the existing 64-bit one.

The steps that I took assume that I had previously chosen to install the ‘GLVND’ version of the GLX binaries; unless the reader has done the same, the following recipe will be incorrect. However, the ‘GLVND’ packages which I initially installed are not listed in the posting linked to above. They belonged to the suggested packages which, as I wrote, I had written down on paper and then added to the command-line that transformed my graphics system.
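
In outline – a sketch only, assuming Debian 9 with the GLVND variant of the proprietary drivers, and assuming the relevant 32-bit package is named ‘libgl1-nvidia-glvnd-glx’ – the recipe looks something like this:

```bash
# Enable the i386 architecture, if multi-arch isn't already active:
sudo dpkg --add-architecture i386
sudo apt-get update

# Install the 32-bit counterpart of the GLVND-based GL library
# (package name assumed; check with 'apt-cache search nvidia glvnd'):
sudo apt-get install libgl1-nvidia-glvnd-glx:i386
```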

When I installed the additional, 32-bit libraries, I did get a disturbing error message, but my box still runs.

xscreensaver Bug With Latest Proprietary nVidia Graphics Drivers

As described in this posting, I have just applied a major software-update to the computer I name ‘Plato’, in which I replaced its open-source graphics drivers with the proprietary nVidia drivers suitable for its graphics card and for its Linux build.

That would be driver version ‘375.82-1~deb9u1’ from the package manager, for a ‘Debian 9.4’ system.

I have just noticed a major bug, which other people should know about before they also decide to go with the proprietary drivers: those drivers tend to cause a malfunction with OpenGL-based ‘xscreensaver’ screen-savers, version ‘5.36-1’.

The bug seems to be that, if I use the graphical configuration tool to preview several screen-savers, then when I switch from one screen-saver to another, the GL screen-saver previously being previewed fails to terminate. This in turn causes the configuration window to freeze, so that the next-chosen screen-saver cannot be previewed; a small blank rectangle takes its place in the configuration window. When this happens, I actually need to ‘kill -9’ the screen-saver process – the one belonging to the screen-saver in question, not ‘/usr/bin/xscreensaver’ – which is taking up 100% of one CPU core as nice-time, before I can continue previewing screen-savers.
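
As a sketch of that workaround – where the hack name ‘glmatrix’ below is only a placeholder for whichever screen-saver has actually hung:

```bash
# The hung GL hack should show up near the top, at ~100% of one core:
ps -eo pid,pcpu,ni,comm --sort=-pcpu | head

# Force-kill the hack itself, not /usr/bin/xscreensaver
# ('glmatrix' stands in for the actual hack's name):
pkill -9 glmatrix
```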

The problem with this, as I see it, is that it could also happen after the screen-saver has locked the screen, and I have entered my password to unlock it. The mere fact that I was always able to unlock one GL-based screen-saver in the past was good in itself, but may only have been luck! The strange way in which my bug seems to differ from other users’ bug-reports is that, when my OpenGL-based screen-saver was rendering to the root window – i.e., to the whole screen – it did exit properly when unlocked by me.

So as it currently stands, I have set the screen-saver on the computer ‘Plato’ to just a blank screen… :-(

At the same time, OpenGL applications seem to run just fine, like this example, which I just tested:

[Screenshot: an OpenGL application rendering normally (screenshot_20180430_142338)]

However, since the description of the screen-saver packages in the package manager specifies “GL (Mesa)” screen-savers, it may be better just to remove the ‘xscreensaver-gl’ and ‘xscreensaver-gl-extra’ packages.
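
That removal would just be the usual package-manager command, for instance:

```bash
# Remove the OpenGL-based screen-saver collections:
sudo apt-get remove xscreensaver-gl xscreensaver-gl-extra
```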

I found out that this bug also affects ‘rss-glx 0.9.1-6.1’.

(Updated 04/30/2018, 19h25 … )
