A disadvantage of running Linux on a multi-core CPU that’s threaded.

One of the facts about modern computing is that the hardware may include a multi-core CPU, whose number of virtual cores differs from its number of full cores. Such CPUs were once marketed as “Hyper-Threaded”, but the technique is now generally just called “Threading”, or simultaneous multithreading (SMT).

If the CPU has 8 virtual cores, but is threaded as only 4 full cores, then running 4 fully-busy processes will already saturate the full cores. But because processes are sometimes multi-threaded, each of those 4 processes could consist of 2 fully-busy threads, and benefit further from the fact that each full core presents 2 virtual cores – although in practice, SMT yields considerably less than a full doubling of speed.

It’s really a strength of Windows to exploit this fully, while Linux tends to ignore it. When Linux runs on such a CPU, it only ‘sees’ the maximum number of virtual cores, as the logical number of cores that the hardware has, without taking into account that they could be paired in some way, to result in a lower number of full cores.
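
For what it’s worth, the kernel does export its view of the topology through sysfs, so one can at least inspect how the virtual cores pair up. Here is a minimal sketch in Python, assuming a standard Linux kernel and its usual sysfs paths:

```python
# Count logical vs. full cores on Linux, by reading the topology
# files which the kernel exports under sysfs (standard paths assumed).
import glob

logical_cores = 0
full_cores = set()

for path in glob.glob(
        '/sys/devices/system/cpu/cpu[0-9]*/topology/thread_siblings_list'):
    logical_cores += 1
    with open(path) as f:
        # All siblings of one full core list themselves identically,
        # e.g. "0,4" or "0-1", so the set collapses each pair to one entry.
        full_cores.add(f.read().strip())

print('Logical (virtual) cores:', logical_cores)
print('Full (physical) cores:  ', len(full_cores))
```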

And to a certain extent, the Linux kernel is justified in doing so, because unlike the situation under Windows, it’s actually just about as cheap for a Linux computer to run a high number of separate processes, as it is to run fewer processes with the same total number of threads. Two threads share a code segment as well as a data segment (the heap), but have two separate stack segments, as well as different register values. This makes them something like ‘lightweight processes’. Yet, they only really run faster under Windows (or maybe under OS X).
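
To make that sharing concrete, here is a minimal sketch using Python’s threading module (Python threads map onto native threads under Linux):

```python
# Two threads appending to one list: they share the heap, while each
# runs on its own stack (a minimal sketch using Python's threading module).
import threading

shared = []                # one object on the shared heap

def worker(tag):
    shared.append(tag)     # visible to the other thread as well

threads = [threading.Thread(target=worker, args=(i,)) for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(shared)              # both appends arrive, e.g. [0, 1]
```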

Under Linux it’s fully feasible just to create many processes instead, so the bulk of the programming work does not make as much use of multi-threading. Of course, even under Linux, code is sometimes written to be multi-threaded, for reasons I won’t go into here.
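
And for comparison, here is an equally minimal sketch of fanning CPU-bound work out over separate processes, using Python’s multiprocessing module:

```python
# Fanning CPU-bound work out over separate processes, which Linux
# creates cheaply via fork() (a minimal sketch using multiprocessing).
from multiprocessing import Pool

def busy(n):
    total = 0
    for i in range(n):     # deliberately CPU-bound work
        total += i * i
    return total

if __name__ == '__main__':
    with Pool(processes=4) as pool:   # e.g. one worker per full core
        results = pool.map(busy, [5_000_000] * 4)
    print(results)
```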

But then under Linux, there was also never much effort put into having the kernel recognize two of its logical cores as belonging to the same full core.

(Updated 2/19/2019, 17h30 … )


About the LuxCore Renderer, and OpenCL Rendering.

One of the subjects which I’ve written about before, is that, on one of my computers, the 3D Graphics Application ‘Blender’ is set up, optionally, to use ‘LuxCore’ to render a scene, and that the main feature of LuxCore which I’m interested in, is its ability to render the scene using OpenCL, and therefore using the GPU cores, rather than just the CPU cores.

A curious question which some of my readers might ask would be: ‘How does the user know that LuxCore is in fact using the GPU, when he or she has simply selected OpenCL as the rendering hardware?’

In my case, the answer lies in the fact that my GPU temperature increases dramatically. Whenever I perform any type of GPU computing, the GPU temperature does the same. I’ve had the GPU reach 70°C, while, when idling, its temperature will not exceed 40°C. It can be observed, though, that when OpenCL rendering is selected, all 8 CPU cores are also in use, and that may just be a way in which LuxCore works. Further, it may be a way in which OpenCL works, to make the CPU cores available in addition to the GPU cores.
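
For readers who want a more direct reading than watching for heat, here is a minimal sketch that polls the temperature while a render runs; it assumes an NVIDIA card with the nvidia-smi utility installed, and other vendors expose temperatures through other tools:

```python
# Poll the GPU temperature once per second while a render runs
# (assumes an NVIDIA card, with the nvidia-smi utility on the PATH).
import subprocess
import time

while True:
    temp = subprocess.check_output(
        ['nvidia-smi', '--query-gpu=temperature.gpu',
         '--format=csv,noheader,nounits'],
        text=True).strip()
    print('GPU temperature:', temp, '°C')
    time.sleep(1)
```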

Dirk

 

Now, I have LuxCoreRender working on one of my computers.

In This Earlier Posting, I wrote that a rendering engine exists, which is actually a (‘software-based’) ray-tracer, making it better in picture quality than what raster-based rendering can achieve, the latter of which is more commonly associated with hardware acceleration. But this rendering engine has been written in a variant of C, which makes it suitable, both in syntax and in semantics, to run under OpenCL, which, in turn, is a framework for running massively-parallel computations on our GPU – on our Graphics Processing Unit. Assuming we have a sufficiently strong graphics card.

That ray-tracer is known as “LuxCoreRender”.
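
As an aside, since what OpenCL ‘sees’ varies from machine to machine, one can list the devices a given system actually exposes. A minimal sketch, assuming the pyopencl bindings and a working OpenCL driver are installed:

```python
# List the OpenCL platforms and devices visible on this system
# (assumes the pyopencl bindings and a working OpenCL driver).
import pyopencl as cl

for platform in cl.get_platforms():
    print('Platform:', platform.name)
    for device in platform.get_devices():
        print('  Device:', device.name,
              '(' + cl.device_type.to_string(device.type) + ')')
```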

Well, previously I had spent a long time trying to get this to run on my computer, which I name ‘Plato’, using the wrong methodologies, thus consuming a whole day of my time and tiring me out.

What I finally did was to register with the user forums of LuxCoreRender, and ask them for advice. The advice they gave me solved the problem within about 30 minutes.

Their advice was:

  1. Download the up-to-date Blender 2.79b from its Web-site, instead of the package-installed 2.78a, and then just unpack the binary into a local, user-space folder on my Linux computer,
  2. Install the add-on, by telling Blender 2.79b to install the binary BlendLuxCore directly from its .ZIP file.

It now works like a charm, and I am able to define 3D Scenes, which I want LuxCoreRender to render! :-)  And running the code on the GPU – using OpenCL – also works fully.

[Image: luxcore_test1 – a test scene rendered with LuxCoreRender]

I suppose that one advantage which ray-tracing affords, over raster-based graphics, is that the Material which we give models is not limited to U,V-mapped texture images and their hardware-based shaders, but can rather be complex Materials, which Blender allows us to edit using its Node Editor, and whose input textures can be mathematically-based noise patterns, as shown above.
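
To give a rough idea of what a mathematically-based noise pattern is, here is a generic sketch of 2D value noise in Python; this is merely an illustration using NumPy, and not LuxCore’s actual noise implementation:

```python
# Generic 2D value noise: random values on a coarse lattice, smoothly
# interpolated (NumPy assumed; not LuxCore's actual noise code).
import numpy as np

def value_noise(width, height, cells=8, seed=0):
    rng = np.random.default_rng(seed)
    grid = rng.random((cells + 1, cells + 1))   # values at lattice points
    xs = np.linspace(0, cells, width, endpoint=False)
    ys = np.linspace(0, cells, height, endpoint=False)
    x0, y0 = xs.astype(int), ys.astype(int)
    # Smoothstep weights for the fractional positions
    wx = (xs - x0) ** 2 * (3 - 2 * (xs - x0))
    wy = (ys - y0) ** 2 * (3 - 2 * (ys - y0))
    # Bilinear blend of the four surrounding lattice values
    a = grid[np.ix_(y0, x0)]
    b = grid[np.ix_(y0, x0 + 1)]
    c = grid[np.ix_(y0 + 1, x0)]
    d = grid[np.ix_(y0 + 1, x0 + 1)]
    top = a + (b - a) * wx[None, :]
    bottom = c + (d - c) * wx[None, :]
    return top + (bottom - top) * wy[:, None]

texture = value_noise(256, 256)                  # values in [0, 1)
print(texture.shape, float(texture.min()), float(texture.max()))
```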

Dirk

 

Problems getting LuxCoreRender to work on one of my computers.

In This Earlier Posting, I had written that I had used a benchmarking tool named “LuxMark”, to test whether my GPU even works, on the computer which I name ‘Plato’. In the process, I made the discovery that there exists a rendering engine named ‘LuxCoreRender’, which is a ray-tracer, but which will do its ray-tracing with code that runs under OpenCL.

I had assumed that the benchmarking tool was free, but that users would need to be paying customers before they could use ‘LuxCoreRender’ to render their own scenes. To my great approval, I’ve discovered that LuxRender and LuxCoreRender are also Free, Open-Source Software. :-D  But to my disappointment, I’ve learned that there is no feasible way in which I could use it on ‘Plato’.

The reason is fairly straightforward. The devs renamed their renderer, from LuxRender to LuxCoreRender, at which point they also carried out a rebuild of their code, so that version 1.6 was discontinued, and ‘a cleaner build’ was started, as version 2.x. In order to make practical use of a rendering engine, I need a GUI with which to create scenes for that engine to render. Well, LuxCoreRender has, as its minimum requirement, Blender v2.79b, while the Blender version I have installed on ‘Plato’ is only v2.78a. The requirement that the devs state is strict, because Blender versions before 2.79 contained a bug which would cause crashes. Not only that, but in this case the crashing user-space application would be one for which considerable processes are running on the GPU, which can cause severe memory-leaks, as I wrote Here.
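
As a small aside, one can verify from within Blender itself whether the installed version meets that requirement. A minimal sketch, meant to be run in Blender’s built-in Python console:

```python
# Run inside Blender's Python console: check whether the running
# Blender meets BlendLuxCore's minimum version (bpy ships with Blender).
import bpy

if bpy.app.version < (2, 79):
    print('Blender', bpy.app.version_string,
          'is too old; v2.79b or later is required.')
else:
    print('Blender', bpy.app.version_string, 'is new enough.')
```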

Now, there does exist a stand-alone version of LuxCoreRender, v2.x, which in fact runs on ‘Plato’, but which remains rather useless to me, because it can only load and then render scene-descriptions which have been stored to a file format that is totally based on Lux, and not on any other standard.
