About a minor (Home-Screen) nuisance I’ve experienced on Android devices.

I have owned several Android-based devices, some of which were purchased from Samsung, namely:

  • A Galaxy Tab S, First Generation,
  • (An earlier Smart-Phone),
  • A Galaxy S6 Smart-Phone,
  • A Galaxy S9 Smart-Phone.

A feature which all of these devices share is the Touchwiz Home-Screen (program). This is what the device displays by default, when it is not displaying a specific app, the app drawer, or (most recently) ‘Bixby’. An unfortunate behaviour of the devices is that Touchwiz will sometimes crash. In my experience, when it does, this is no big deal, because it restarts automatically, and after a few minutes, even my Notification-Bar Entries will reappear. If certain apps fail to make their notifications reappear by themselves, then launching those apps from their application groups will make their notifications reappear.

I tend to rate each Android device according to how rarely its Home-Screen will crash in this way. By that measure, my Google Pixel C Tablet fared best, because its home-screen has never crashed on me. My S9 Phone fared almost as well, in that Touchwiz seldom crashed. But now I think I’ve identified a situation which will frequently cause Touchwiz to crash on the S9 Phone.

Firstly, as I’m writing this, the firmware on that phone is at its latest version, that being the October 1, 2019 patch of Android 9.

I discovered that I can trigger this situation while experimenting with the Super-Slow-Mo camera recording mode, in which the camera can record up to 0.4 seconds of video at 960FPS, at a resolution of 1280×720. When the camera does this, it generates a 20MB video, after the footage has been compressed via a standard H.264 CODEC into an .MP4 container-file. I have the default set to record all camera footage to the external Micro SD Card. Having recorded a super-slow-mo video just once triggered this behaviour.

There is a simple way to interpret what has caused this, one that does not seem to lay any blame on Samsung: When the camera is recording video that fast, it’s generating data faster than the external SD Card can store it. Therefore, the data takes up RAM until some later point in time, when the O/S has finished writing it out to the SD Card. That moment was only reached several seconds later.

Here’s where the news gets a bit worse. I can download a 3rd-party app that’s designed to test what speed of external SD Card I have. The reason I need to do this is that I never seem to remember exactly what type of SD Card I purchased, for use with any one specific device.

According to this app, my external SD Card can be written to sequentially at ~12MBytes/Sec. That makes it a Class 10 card. Yet, 20MB of data are to be stored in 0.4 seconds. In fact, simply running the benchmarking app caused a second Touchwiz crash, which was just as inconsequential as the first one that I was trying to investigate. What this seems to suggest is that virtually no SD Card I can buy is really fast enough to be written to at the speed with which the camera app can generate its data. The camera app will need to cache its footage in RAM before that footage has been written to the SD Card.
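To put rough numbers on that, here is a small back-of-the-envelope sketch in Python. The 20MB file size, the 0.4-second duration and the ~12MBytes/Sec write speed are the figures quoted above; everything else is just arithmetic:

```python
# Back-of-the-envelope check, using the figures quoted above.
file_size_mb = 20.0        # size of the compressed .MP4
capture_seconds = 0.4      # duration of the Super-Slow-Mo clip
card_write_mb_s = 12.0     # sequential write speed reported by the benchmark app

# Throughput that would be needed if the footage had to reach the card
# as fast as the camera generates it:
required_mb_s = file_size_mb / capture_seconds
print(f"Required: {required_mb_s:.0f} MB/s, card delivers: {card_write_mb_s:.0f} MB/s")
# -> Required: 50 MB/s, card delivers: 12 MB/s

print(f"Shortfall factor: {required_mb_s / card_write_mb_s:.1f}x")
# -> Shortfall factor: 4.2x
```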

Further, the footage is certainly being stored in RAM in uncompressed form (384 raw frames), while what’s finally written to the SD Card is compressed. (:1)
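If one assumes – and this is my assumption, not anything Samsung documents – that the sensor hands off those raw frames in a common YUV 4:2:0 layout (12 bits per pixel), then the uncompressed buffer would be on the order of half a gigabyte:

```python
# Rough estimate of the uncompressed frame buffer held in RAM.
# The YUV 4:2:0 (12 bits/pixel) layout is my assumption; the frame count
# and the resolution are the figures given above.
frames = 384                          # 960 FPS x 0.4 seconds
width, height = 1280, 720
bytes_per_pixel = 1.5                 # 12 bits per pixel for YUV 4:2:0

buffer_bytes = frames * width * height * bytes_per_pixel
print(f"{buffer_bytes / 2**20:.0f} MiB of raw frames")   # roughly 506 MiB
```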

And yet, either of these two apps can cause the Touchwiz crash. Hmm… I think that for the moment, I’ll just hold my horses, and record a maximum of 0.2 seconds of Super-Slow-Mo. Thankfully, this is a parameter which I can choose, using the little icon in the upper-right-hand corner of the view, before shooting.

(Updated 11/17/2019, 12h10 … )


I just installed Sage (Math) under Debian / Stretch.

One of the mundane limitations which I’ve faced in past years, when installing supposedly open-source Computer Algebra Systems and the like under Linux, was that the only game in town – almost – was either ‘Maxima’ or ‘wxMaxima’, the latter of which is a fancy GUI, as well as a document exporter, for the former.

Well, one fact which the rest of the computing world has known about for some time, but which I am only newly discovering for myself, is that software exists called ‘SageMath’. Under Debian / Stretch, this is ‘straightforward’ to install, just by installing the meta-package from the standard repositories, named ‘sagemath’. If the reader also wants to install this, then I recommend additionally installing ‘sagemath-doc-en’, as well as ‘sagetex’ and ‘sagetex-doc’. Doing this will literally pull in hundreds of actual packages, so it should only be done on a strong machine, with a fast Internet connection! But once this has been done, the result will be enjoyable:

(Screenshot: screenshot_20180915_201139)

I have just clicked around a little bit in the SageMath Notebook viewer, which is browser-based, and which I’m sure only provides a skeletal front-end to the actual software. But there is a feature which I already like: When the user wishes to Print his or her Worksheet, doing so from the browser just opens a secondary browser-window, from which we may ‘Save Page As…’, and when we do, we discover that the HTML which gets saved has its own, internal copy of ‘MathJax’. What this seems to suggest at first glance is that the equations will display correctly typeset, without depending on an external CDN. Yay!
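As a first test-drive of the installation, here is the sort of thing one can type into a Sage session or a notebook cell. The specific integral is just my own choice of illustration, not anything the documentation prescribes:

```python
# Run inside the Sage interpreter (the `sage` command installed by the
# Debian package), or in a notebook cell; this is Sage's Python-based syntax.
x = var('x')                          # declare a symbolic variable
f = integrate(sin(x) * exp(-x), x)    # symbolic integration
print(f)                              # something equivalent to -1/2*(cos(x) + sin(x))*e^(-x)
print(diff(f, x).simplify_full())     # differentiating back recovers e^(-x)*sin(x)
```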

I look forward to getting more use out of this in the near future.

(Update 09/15/2018, 21h30 : )


I’ve just benchmarked my GPU’s ability to run OpenCL v1.2.

Recently, I’ve come to have some doubt about whether the GPU-computing ability of my graphics hardware, specifically, might be defective somehow. But, given that ability, there exist benchmarks which people can run.

One such benchmark is called “LuxMark”, and I just ran it on the computer I name ‘Plato’.

The way LuxMark works is that it ray-traces a scene in software, thereby explicitly not using the standard ‘raster-based rendering’ which graphics hardware is most famous for. But as a twist, this engine compiles the C-code which performs that task using OpenCL, instead of using a general C compiler for the CPU. Therefore, this software runs as C, but on the GPU.

This is similar to what a demo-program, which nVidia used to ship with their graphics cards, once did: it showed a highly realistic sports-car, because ray-tracing produces greater realism than raster-based graphics would.

Here is the result:

(Screenshot: screenshot_20180504_140224_c)

I suppose that people who are intrigued by CGI – as I am – might eventually be interested in acquiring the LuxCoreRender engine, which would allow its users to render scenes which they choose. LuxMark just uses LuxCoreRender in order to benchmark the GPU with one specific, preset scene.

But what this tells me is that there is essentially still nothing wrong, at the hardware level, with my GPU, or with its ability to compute using OpenCL v1.2. And some version of OpenCL was also what the BOINC Project was using, whose GPU Work Units I had been completing for several recent days.
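As an aside, for anyone who wants a quick, independent sanity check of the same ability without installing LuxMark, a few lines of Python using the third-party ‘PyOpenCL’ bindings will do. This is only a sketch of my own, not anything LuxMark contains; it merely confirms that an OpenCL device is visible and can build and run a trivial kernel:

```python
# Minimal sanity check via the third-party PyOpenCL bindings.
# This is not what LuxMark does internally; it only confirms that an
# OpenCL device is present and can build and run a trivial kernel.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()             # picks an OpenCL platform/device
queue = cl.CommandQueue(ctx)

for dev in ctx.devices:
    print(dev.name, '-', dev.version)      # the version string reports e.g. "OpenCL 1.2 ..."

a = np.arange(16, dtype=np.float32)
a_buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                  hostbuf=a)

prog = cl.Program(ctx, """
__kernel void square(__global float *a) {
    int gid = get_global_id(0);
    a[gid] = a[gid] * a[gid];
}
""").build()

prog.square(queue, a.shape, None, a_buf)   # one work-item per array element
cl.enqueue_copy(queue, a, a_buf)           # copy the results back to the host
print(a)                                   # 0, 1, 4, 9, ... if the kernel ran
```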

One thing which I’d want to know next is whether a score of “2280” is good or bad. The site suggests that visitors exist whose GPUs are much stronger. But then, I’d need to have an account with LuxCoreRender to find out… :-D  The answer to that question is logical: My graphics card is ‘only’ a series-400. Because users exist with series-900 or series-1000 graphics cards, obviously, theirs will result in much faster benchmarks.

Dirk