How SDL Accelerates Video Output under Linux

What we might know about the Linux X-server is that the core X-protocol renders certain features efficiently to the display: text with fonts, simple GUI elements, and small bitmaps such as icons. But when it's needed to send moving pictures to the display, we need extensions, which serious Linux users take for granted. One such extension is the Shared Memory extension (MIT-SHM).

Its premise is that the X-server shares a region of RAM with the client application; the client application draws pixels into it, and the X-server then transfers them to graphics memory.
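
The mechanism behind this can be sketched with ordinary shared memory. The following is a minimal illustration, not actual X11 / MIT-SHM code: a "client" mapping and a "server" mapping of the same segment, where a pixel drawn through one is immediately visible through the other, with no copy in between. (The function name is my own.)

```python
from multiprocessing import shared_memory

def demo_shared_draw():
    # "Client" creates a shared pixel buffer (64x64 pixels, 4 bytes each).
    shm = shared_memory.SharedMemory(create=True, size=64 * 64 * 4)
    # "Server" attaches to the same segment by name -- no pixel copy occurs.
    view = shared_memory.SharedMemory(name=shm.name)
    try:
        shm.buf[0] = 0xFF       # the client draws a pixel ...
        seen = view.buf[0]      # ... the server's mapping sees it at once
    finally:
        view.close()
        shm.close()
        shm.unlink()
    return seen

print(demo_shared_draw())
```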

For moving pictures, this offers one way in which they can also be accelerated: the shared memory region stays mapped even as the client application redraws it many times, so the pixels don't have to be re-sent over the X-protocol socket for every frame.

But this extension does not make significant use of the GPU, only of the CPU.

And so there exists something called SDL, which stands for 'Simple DirectMedia Layer'. One valid question we may ask ourselves about this library is how it achieves a speed improvement, if it's only installed on Linux systems as a set of user-space libraries, not drivers.

(Updated 10/06/2017 : )

And so the explanation on my Linux machines would be that SDL has been compiled to offer three possible back-ends:

  1. X11
  2. caca
  3. DirectFB

This description of the packages can be a bit confusing, because 99% of the time we're using rendering system (1). The reader may not know what 'caca' is, in computing. It's an "ASCII-Art Library", which displays images as colorful text that resembles the image in an artistic sort of way. It implies text mode.
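
To make that concrete, what an ASCII-art back-end like libcaca does can be approximated in a few lines. The character ramp and function below are my own toy version, not libcaca's actual API:

```python
# Map each pixel's brightness (0-255) to a character from dark to light.
RAMP = " .:-=+*#%@"

def to_ascii(pixels, width):
    rows = []
    for y in range(0, len(pixels), width):
        row = pixels[y:y + width]
        rows.append("".join(RAMP[p * (len(RAMP) - 1) // 255] for p in row))
    return "\n".join(rows)

# A 4x2 gradient "image", rendered as text:
print(to_ascii([0, 85, 170, 255, 255, 170, 85, 0], 4))
```

A real ASCII-art library also maps colors to terminal attributes and picks glyphs by shape, but the brightness-to-character idea is the core of it.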

And while 'DirectFB' is also just another user-space library, the way it accesses the frame-buffer assumes that it will get no interference from the X-server. I.e., its frame-buffer lies outside any X-server output, which is also where some esoteric uses might want it to be.

For example, if I hit <Ctrl>+<Alt>+F7, I get my first X-server session. But if I hit <Ctrl>+<Alt>+F1, I obtain my first console session, which has no graphics. That console session explicitly exists by way of a frame-buffer, which DirectFB can access. Its resolution can be completely different from whatever my X-Display resolution happens to be.

What the documentation fails to point out about the main, X11-based rendering mode of SDL is that it actually comes in two flavors:

  1. With Shared Memory,
  2. Without Shared Memory.

Apparently, the X-server protocol allows for pixels to be written directly to the main X-Display, or, as we more often prefer, to just one window. The main advantage SDL brings here is an API the programmer can understand, to define exactly what he wants displayed, regardless of which back-end SDL is using, or regardless of which operating system SDL is being run on.
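
That portability claim is just the classic back-end abstraction pattern: one drawing API in front of interchangeable implementations. A toy sketch of the idea (the class and method names here are illustrative, not SDL's actual API):

```python
# One drawing API, several interchangeable back-ends -- the idea
# behind SDL's design, in miniature.
class X11Backend:
    def present(self, pixels):
        return f"X11: blit {len(pixels)} px to a window"

class CacaBackend:
    def present(self, pixels):
        return f"caca: render {len(pixels)} px as ASCII art"

def render(backend, pixels):
    # Application code calls the same function regardless of back-end.
    return backend.present(pixels)

print(render(X11Backend(), [0] * 16))
print(render(CacaBackend(), [0] * 16))
```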

Now, there also exist versions of SDL that use OpenGL, in which case the GPU would assist in rendering output. But the main version in my package manager is not stated to support that mode – under the entry for 'X11'.  :-(

(Edit 10/06/2017 : )

It's not unique to Linux that user-space processes are not allowed to access graphics hardware directly. Windows imposes the same restriction, only differently.

Further, on this computer, which I name 'Phoenix', the kernel is only able to recognize graphics memory locations because explicit graphics drivers have been loaded – drivers that I needed to install somewhat painfully, at the beginning of its setup.

When I'm using my X-server-based graphics – which is almost always – these drivers are provided as an add-on package to my X-server packages.

In order to get this graphics chip-set running in frame-buffer mode, I actually needed to install another (kernel-space) driver, named 'uvesafb'. Without this driver, my console view was limited to an ugly, low resolution, while with uvesafb, my console views are at 1280×1024 pixels.
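
For reference, uvesafb's resolution is chosen through its documented `mode_option` module parameter; a hedged example of how 1280×1024 might be requested (the file location shown is illustrative – where the option goes varies by distribution):

```
# /etc/modprobe.d/uvesafb.conf  (illustrative location)
options uvesafb mode_option=1280x1024-32 scroll=ywrap
```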

In other words, when I press <Ctrl>+<Alt>+F1, the graphics chip-set gets switched from providing a composited desktop to (legacy) frame-buffer mode, in addition to being told to display a different session.

uvesafb is also what causes the (virtual) device-file '/dev/fb0' to appear in my directory tree.
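
Alongside that device file, the kernel also exposes the registered frame-buffer's geometry through sysfs, which a user-space program can read without opening '/dev/fb0' itself. A small sketch (the helper function is my own; the sysfs path is the standard fbdev one):

```python
import os

def fb_resolution(dev="fb0"):
    """Return (width, height) of a registered fbdev frame-buffer,
    read from sysfs, or None if no such device exists."""
    path = f"/sys/class/graphics/{dev}/virtual_size"
    if not os.path.exists(path):
        return None
    with open(path) as f:
        w, h = f.read().strip().split(",")
    return int(w), int(h)

print(fb_resolution() or "no frame-buffer registered")
```

On this machine, with uvesafb loaded, that would report the 1280×1024 mode; on a system with no fbdev driver, it returns None.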


I suppose that some readers might ask, 'Since uvesafb will work at significantly higher resolutions, why didn't he set its parameters for an even higher resolution?'

The answer would be my fear that, if I connected some other monitor to the PC than the one presently connected, the other monitor might not support a very high resolution.

And this console mode is the mode Linux users mainly get dropped into when something doesn't work. We need to keep it as reliable as possible, since it's the mode we'd be working in, to repair whatever problem was taking place.


