The laptop ‘Klystron’ suspends to RAM half-decently.

One subject I have written a lot about is that, as soon as I close the lid of the laptop I name ‘Klystron’, it seems to lose its WiFi signal, and that this can get in the way of comfortable use, because closing the lid also helps shield the keyboard from dust, etc.

This Linux laptop boots decently fast, and yet rebooting it frequently is still a hassle. So I needed a different, practical way of solving my problem. My solution for now is to tell the laptop to Suspend To RAM as soon as I close the lid. That way, the WiFi signal is shut down properly, and when the laptop resumes its session, the scripts that govern this behavior also re-initialize the WiFi chipset and its status on my LAN. This causes less confusion with running Samba servers etc. on my other computers.

There is a bit of terminology here, which I do not think everyone understands in the same way, but which I think people today are simply using differently from how it was used in my past.

It used to be that, under Linux, we had ‘Suspend To RAM’ and ‘Suspend To Disk’. In the Windows world, these terms corresponded to ‘Standby’ and ‘Hibernate’ respectively. In today's terminology, they stand for ‘Sleep’ and ‘Hibernate’, borrowing those terms from mobile devices.

In any case, there are two types of Suspend at work.

In past days of Linux, we could not simply cause a laptop to Hibernate. We needed to install special packages and modify the GRand Unified Bootloader (GRUB), before we could even Suspend To Disk. Suspending To RAM used to be less reliable. One development in modern Linux which I welcome, among many, is the fact that Sleep and Hibernate should, in most cases, work out-of-the-box.

I just tried Sleep mode tonight, and it works 90%.

One oddity: When this laptop resumes, a CPU Error message is displayed on the screen numerous times. But after a few seconds of CPU errors, the session is apparently restored without corruption. Given that I have 300+ processes running, I cannot be 100% sure that the restore is perfectly free of corruption. But I am reasonably sure, with one exception:

The second oddity is of greater relevance. After waking up, the clock of the laptop seems to be displaced 2 days and a certain number of hours into the future. This bug has been observed on some other devices as well, and as a workaround I needed to add a script to the configuration files, which simply sets the system clock back by that many days and hours after waking. Thankfully, I believe that doing so was as much of a workaround as was needed.
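As a sketch of what the core of such a workaround can look like, here is the arithmetic in Python. The hour count is a placeholder, not the value I actually measured, and a real hook would of course write the corrected time back to the system clock rather than just compute it:

```python
from datetime import datetime, timedelta

# Hypothetical offset observed after resume: 2 days plus some hours.
# The 4-hour figure below is a placeholder for illustration only.
RESUME_OFFSET = timedelta(days=2, hours=4)

def corrected_time(observed):
    """Given the (wrong) clock reading right after waking, return the real time."""
    return observed - RESUME_OFFSET

# A post-resume hook would read the system clock and set it back by the offset:
wrong = datetime(2015, 3, 5, 18, 30)   # what the clock shows after waking
right = corrected_time(wrong)          # 2 days and 4 hours earlier
```

On my setup, the actual resetting is done by a script in the system's resume-hook directory, which this fragment only stands in for.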

One side-effect of not having done so before becoming aware of the problem, was that the ‘KNotify’ alarms for the next two days all sounded at once, so that it will take another two days before personal organizer (PIM) notifications can sound for me again.

The fact that numerous CPU errors are displayed bothers me not. What it means is that the way the CPU goes to sleep and then wakes up involves power-cycling, in ways that do not guarantee the integrity of data throughout. But it would seem that good programming of the kernel does provide data integrity, with the exception of the system clock issue.

But the fact that the hardware is a bit testy when using the Linux version of Sleep also suggests that maybe this is the kind of laptop that powers down its VRAM. It is a good thing, then, that I disabled the advanced compositing effects that involve vertex arrays.


 

There is a side-note I wanted to make, on the desktop cube animation.

In general, when raster-rendering a complex scene with models, each model is defined by a vertex array, an index array, one or more texture images, etc. The vertex array stores the model geometry statically, relative to the coordinate origin of the model. Then, a model-view-projection matrix is applied (or just a rotation matrix, for the normal vectors) to position it with respect to the screen. Moving the models is then a question of the CPU updating only the model-view matrix.
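A minimal sketch of that arrangement, in Python and reduced to 2D for brevity (the square stands in for a model, and a plain rotation stands in for the full model-view-projection matrix):

```python
import math

# Hypothetical model geometry, stored once, relative to the model's own origin.
MODEL_VERTICES = [(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)]

def rotation_matrix(angle):
    """2x2 rotation matrix; stands in for the model-view-projection matrix."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s], [s, c]]

def transform(matrix, vertex):
    """What the GPU does per vertex: multiply it by the current matrix."""
    x, y = vertex
    return (matrix[0][0] * x + matrix[0][1] * y,
            matrix[1][0] * x + matrix[1][1] * y)

# Per frame, the CPU only uploads a new matrix; MODEL_VERTICES never changes.
frame_matrix = rotation_matrix(math.pi / 2)
positioned = [transform(frame_matrix, v) for v in MODEL_VERTICES]
```

The point of the design is in the last two lines: animating the model costs the CPU one small matrix per frame, while the (potentially large) vertex array stays untouched in VRAM.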

Well, when a desktop cube animation is the only model in the scene, as part of compositing, I think the way this is managed differs slightly. What I think happens here is that, instead of the cube having vertex coordinates of +/- 1 all the time, the model-view matrix is kept as an identity matrix.

Instead, the actual vertex data is rewritten to the vertex array, to reposition the vertices with respect to the view.
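The contrast with the usual arrangement can be sketched the same way (again in 2D, with hypothetical names): the matrix stays an identity matrix, and the CPU overwrites the vertex buffer itself each frame.

```python
import math

IDENTITY = [[1.0, 0.0], [0.0, 1.0]]  # the model-view matrix never changes

def rewrite_cube_vertices(vertex_array, angle):
    """CPU-side, per frame: recompute the corner positions and write them
    straight back into the same vertex array the GPU reads from."""
    base = [(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)]
    c, s = math.cos(angle), math.sin(angle)
    vertex_array[:] = [(c * x - s * y, s * x + c * y) for (x, y) in base]

vertex_array = []
rewrite_cube_vertices(vertex_array, 0.0)   # initial frame: corners at +/- 1
rewrite_cube_vertices(vertex_array, 0.1)   # next frame: same buffer, new data
```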

Why is this significant? Well, if it were true that Suspending the session To RAM also cut power to the VRAM, it would be useful to know which types of data stored there will seem corrupted after a resume, and which will not.

Technically, texture images can also get garbled. But if all it takes is one frame cycle for texture images to get refreshed, the net result is that the displayed desktop will look normal again, by the time the user unlocks it.

Similarly, if the vertex array of the only model is being rewritten by the CPU, doing so will also rewrite the header information in the vertex array that tells the GPU how many vertices there are, as well as rewriting the normal vectors, when they are part of any normal vertex animation, etc. So anything resulting from the vertex array should still not look corrupted.

But one element which generally does not get rewritten is the index array. The index array states in its header information whether the array is a point list, a line list, a triangle list, a line strip, a triangle strip… It then states how many primitives exist for the GPU to draw. And then it states sets of elements, each of which is a vertex number.

The only theoretical reason why the CPU would rewrite that, would be if the topology of the model were to change, which is as good as never in practice. And so, if the VRAM gets garbled, what was stored in the index array would get lost, and not refreshed.
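A sketch of why that matters, again in Python with hypothetical data: the vertex array may be refreshed every frame, but the index array is written once, and the GPU dereferences it at draw time.

```python
# Indexed triangle list for one square face (two triangles sharing vertices).
# The index array encodes topology, which never changes, so it is written once.
index_array = [(0, 1, 2), (0, 2, 3)]

# The vertex array, by contrast, may be rewritten by the CPU every frame.
vertex_array = [(-1.0, -1.0), (1.0, -1.0), (1.0, 1.0), (-1.0, 1.0)]

def assemble_triangles(vertices, indices):
    """What the GPU does at draw time: look each vertex number up."""
    return [tuple(vertices[i] for i in tri) for tri in indices]

triangles = assemble_triangles(vertex_array, index_array)
# If the VRAM holding index_array were garbled after a resume, these lookups
# would dereference nonsense vertex numbers, yielding random triangles.
```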

And this can lead to the sight of numerous nonsensical triangles on the screen, which many of us have learned to associate with a GPU crash.

 

Dirk

 


I just watched the movie, “The Imitation Game”.

As the title suggests, I did watch this movie today, by way of Netflix. And even though the desktop ‘Mithral’ has a 1920×1080 monitor, I chose to watch it on the laptop I name ‘Klystron’ instead, which ‘only’ has a 1600×900 monitor, and which is running Linux, as many of my earlier postings explained. In addition, I have been keeping a suspicious eye on ‘Klystron’ in recent weeks, concerning the stability of its WiFi.

That laptop lasted through the movie without any glitches. And its CPU was active at 4x 30%, as opposed to 4x 25% when I transfer a 500MB video clip from another computer, just by way of a Samba share.

About the movie, I would say this was a good one. Further, I was able to merge the picture it paints of Turing, of his machine, and of Enigma, with knowledge I already had of these subjects, fairly well. The movie strikes me as well-founded in fact. And yet, it also made for some good drama.

Actually, I find that this movie was one of the better ones I have seen in quite some time. I will forgive the fact that it is merely a docu-drama.

Dirk

 


My Linux Laptop ‘Klystron’ And 802.11n Again

The Linux laptop I name ‘Klystron’ has been running in a single session for 1 day and 7 hours so far, with its lid in the open position, and has remained connected to my WiFi in 802.11n mode.

Further, the last time there was any real issue with this mode was several days ago, and several reboots ago. On the rare occasions when the connection simply quit while in use, there were error messages in my ‘syslog’ that vaguely pointed towards an 802.11n problem, according to my Googling of those error messages.

But more recently, the behavior was introduced that simply closing the laptop lid would cause it to lose contact with my WiFi, without the actual connection being reported as ‘down’ by ‘Network Manager’, and without resulting in any error messages. This situation would typically reverse itself within seconds of my unlocking an active session, and would do so without producing any Notifications. The laptop would simply never know that, overnight, no data was received or transmitted over WiFi.

Similar but not identical results were obtained, while connected in 802.11g mode.

Given that nobody has ever asked me whether my WiFi signal might already be weak where this laptop is situated, I would say it remains unproven that this setup has any 802.11n issues per se. And because I know how frustrating it can be, I would also not encourage coders to start looking very carefully for errors which might not even exist in the software, or in the firmware.

You see, I still have this peculiar notion that there can be something impeding the efficiency of the actual WiFi antenna, which could account for most of the instability I have reported. And I also have the peculiar notion that the performance of an antenna is based on wave dynamics, and not on the dynamics of Quantum-Mechanical particle representations of radio signals.

Dirk

 
