How an old problem with multiple (Debian / Linux) sessions seems to require an old fix.

One of the facts with which I recently updated my postings is that I can run multiple sessions on one PC, and switch back and forth between them, using the key combinations <Ctrl>+<Alt>+F7, <Ctrl>+<Alt>+F8, etc. According to that update, each session could theoretically use either ‘Plasma 5’ or ‘LXDE’ as its desktop manager, even though I’d choose an actual LXDE session for only one of my defined usernames.

When I recently tested this ability, I found that a Plasma 5 session which had locked its screen using the built-in Plasma 5 locker (and which I had switched away from) would start to consume 1 CPU core at 100%, as was visible from within the other session. This appears to be a bug which has been known for a long time, on computers that run the proprietary NVIDIA graphics drivers, as my computer named ‘Phosphene’ does. This computer is still based on Debian 9 / Stretch. Apparently, according to common lore, what one is supposed to do about this is to create a file named such as ‘’ (in my case), in the directory ‘/etc/profile.d’, which is supposed to set two environment variables globally like so:


# An attempt to prevent klocker from consuming 1 CPU core 100%
# when multiple desktop sessions are active...

export KWIN_TRIPLE_BUFFER=1
export __GL_YIELD="USLEEP"


Unfortunately, this backfired on me when I tried to implement it, in that regardless of how I went about it, ‘kwin’ would just crash in the session that was supposed to be using ‘Plasma 5’. An additional mystery I ran into was that my attempts to set ‘__GL_YIELD’ would simply get reset somewhere, unless I had also set ‘KWIN_TRIPLE_BUFFER’. Only if I set both would setting either show as successful, using an ‘echo $…’ command. (:1)  Therefore, what I really needed to do was to turn off the screen locking which Plasma 5 itself provides (for both my usernames), and to install and use ‘xscreensaver’ instead. However, doing that has two obvious caveats:

  • Under Debian 10 / Buster and later, ‘xscreensaver’ is no longer fully supported, unless one also reconfigures the new, Wayland display manager to act as an X-server proxy, and
  • Even when I apply this fix, meaning that I’ve completely disabled ‘klocker’ in my settings, the moment I tell Plasma 5 to launch a new, parallel session (without which <Ctrl>+<Alt>+F8 just leads to a blank screen), Plasma 5 will lock the current session using ‘klocker’ again, causing 100% CPU usage to become visible, again, from the second session.
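Turning off Plasma 5’s own screen locking, as mentioned above, can also be attempted non-interactively. This is only a hedged sketch: the ‘kwriteconfig5’ group and key names below (‘Daemon’, ‘Autolock’, ‘LockOnResume’) are my assumptions about the layout of ‘~/.config/kscreenlockerrc’, and should be checked against the actual file before relying on them:

```shell
# A hedged sketch, not a verified recipe: disable Plasma 5's automatic
# screen locking for the current user, by editing ~/.config/kscreenlockerrc.
# The group and key names are assumptions; check your own kscreenlockerrc.
# Guarded, so that this is a no-op on systems without kwriteconfig5.
if command -v kwriteconfig5 >/dev/null 2>&1; then
    kwriteconfig5 --file kscreenlockerrc --group Daemon --key Autolock false
    kwriteconfig5 --file kscreenlockerrc --group Daemon --key LockOnResume false
fi
```

The equivalent can of course be done from System Settings, under the screen-locking page, which is what I actually did.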

What I find is that, once I’ve used my Plasma 5 session to create a parallel session, I need to switch back to the first session once, using <Ctrl>+<Alt>+F7, and unlock it. After that, both my Plasma 5 sessions will only lock themselves using ‘xscreensaver’. And aside from that short, crucial interval, I haven’t seen 100% CPU-core usage again.
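Incidentally, spotting the runaway process from the other session doesn’t require a full monitor such as ‘gkrellm’; sorting processes by CPU usage from a terminal will do. A minimal sketch:

```shell
# List the five most CPU-hungry processes; a busy-waiting screen locker
# would appear at or near the top, when this is run from the other,
# still-responsive session.
ps -eo pid,pcpu,comm --sort=-pcpu | head -n 6
```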


I should add that, for certain purposes, I sometimes choose to install only the CPU-rendered ‘xscreensaver’ packages, and deliberately not the hardware-accelerated ones. In this case, the hardware-accelerated screensavers were omitted simply because they could start running the busy-wait loop again, only this time when invoked by ‘xscreensaver’.

(Update 3/24/2021, 13h55… )

(As of 3/23/2021, 18h55… )


I can be slightly clearer in my description of what happened with these two environment variables. The variable ‘KWIN_TRIPLE_BUFFER’ will only succeed if triple-buffering has been enabled in the ‘xorg.conf’ file. If it is not set there, then ‘kwin’ will ask for it anyway, resulting in a desktop which is either corrupted, or has no compositing, or suffers from both deficits. And on my computer, ‘xorg.conf’ was generated by the proprietary driver’s own utility. I must assume that, if the utility did not script the feature into ‘xorg.conf’, it must have detected that my hardware was not up to it. I’m not going to dispute what the utility put into my ‘xorg.conf’. The feature is not listed there.

Yet, ‘__GL_YIELD’ can only be set if the first variable is set first. Otherwise, entering the following command:

echo ${__GL_YIELD}

will show an empty result, even though I tried to set this variable under ‘/etc/profile.d’.
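The behaviour can be contrasted with what happens in a plain shell, minus whatever was filtering the variable inside the session. A minimal sketch of what the ‘/etc/profile.d’ script was meant to accomplish, where, unlike under ‘kwin’, both variables survive unconditionally:

```shell
#!/bin/sh
# Simulate the exports that a script under /etc/profile.d would perform,
# then confirm that a plain shell retains both values. (Inside a Plasma 5
# session, __GL_YIELD was being reset unless KWIN_TRIPLE_BUFFER was also set.)
export KWIN_TRIPLE_BUFFER=1
export __GL_YIELD="USLEEP"

echo "KWIN_TRIPLE_BUFFER=${KWIN_TRIPLE_BUFFER}"
echo "__GL_YIELD=${__GL_YIELD}"
```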

I actually take the fact that this second variable eventually stopped being filtered out as a sign that ‘kwin’ had already given up on using my GPU, by the time it tried to implement ‘KWIN_TRIPLE_BUFFER’.



(Update 3/24/2021, 7h10: )

More recently, I edited my ‘xorg.conf’ file to enable triple-buffering after all. To my amazement, the computer and its X-server still booted, without generating any error messages in the log file ‘/var/log/Xorg.0.log’, where the fact that triple-buffering had been enabled was simply acknowledged.
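For reference, with the proprietary NVIDIA driver, triple-buffering is requested through an Option line in ‘xorg.conf’. The fragment below is illustrative rather than copied from my actual file; the Identifier, in particular, is an assumption:

```
Section "Screen"
    Identifier "Screen0"
    # NVIDIA proprietary driver: request triple buffering.
    Option "TripleBuffer" "True"
EndSection
```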

I take this to mean that the option to set the variable ‘KWIN_TRIPLE_BUFFER’ might work now (without crashing ‘kwin’).

However, since I’ve already found a working solution to my problem, I’ll just stick with using ‘xscreensaver’.

Also, ‘xscreensaver’ displays a button that reads ‘New Login’. Presumably, I could also use that to start a new parallel session…


(Update 3/24/2021, 13h55: )

Because I have the ‘gkrellm’ widget displaying my CPU usage on both (Plasma 5) desktops, I would know if the currently selected session’s ‘klocker’ process had been consuming 100% of 1 CPU core, as soon as I unlocked the session displaying that desktop again. It was not. It was only doing so when locking a session that had become deselected.

It would seem to follow that, if ‘klocker’ can be made to use triple-buffering, this behaviour could simply be replaced with continuous usage of several GPU cores (to display nothing, finally).




