Now, I have LuxCoreRender working on one of my computers.

In This Earlier Posting, I wrote that a rendering-engine exists which is actually a (‘software-based’) ray-tracer, making it better in picture-quality than what raster-based rendering can achieve, the latter of which is more commonly associated with hardware acceleration. I also wrote that this rendering-engine has been written in a variant of C, which makes it suitable, both in syntax and in semantics, to run under OpenCL, which in turn is a framework for running massively-parallel computations on our GPU – on our Graphics Processing Unit – assuming we have a sufficiently strong graphics card.

That ray-tracer is known as “LuxCoreRender”.

Well, previously I had spent a long time trying to get this to run on my computer, which I name ‘Plato’, using the wrong methodologies, which consumed a whole day of my time and tired me out.

What I finally did was to register with the user forums of LuxCoreRender and ask them for advice. The advice they gave me solved the problem within about 30 minutes.

Their advice was:

  1. Download the up-to-date Blender 2.79b from the Web-site – not the package-installed 2.78a – and just unpack the binary into a local, user-space folder on my Linux computer,
  2. Download the binary BlendLuxCore add-on and tell Blender 2.79b to install it directly from its .ZIP-file, as sketched in the snippet below.
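For anyone who prefers to script that second step, the same add-on installation can in principle be done from Blender 2.79b’s own Python console. This is only a minimal sketch: the path to the .ZIP-file is a placeholder, not the actual file-name of a BlendLuxCore release, and the module name is what I believe the add-on registers itself as.

```python
# Run inside Blender 2.79b's Python console, or via:  blender --python install_addon.py
# The .zip path below is a placeholder - substitute the BlendLuxCore release you downloaded.
import bpy

addon_zip = "/home/user/Downloads/BlendLuxCore.zip"  # hypothetical location

# Install the add-on from the .zip, then enable it by its module name.
bpy.ops.wm.addon_install(overwrite=True, filepath=addon_zip)
bpy.ops.wm.addon_enable(module="BlendLuxCore")

# Persist the enabled add-on in the user preferences.
bpy.ops.wm.save_userpref()
```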

It now works like a charm, and I am able to define 3D scenes that I want LuxCoreRender to render! :-)  And running the code on the GPU – using OpenCL – also works fully.

(Image: luxcore_test1)

I suppose that one advantage which ray-tracing affords over raster-based graphics, is that the Material which we give models is not limited to U,V-mapped texture images and their hardware-based shaders. It can instead be a complex Material, which Blender allows us to edit using its Node-Editor, and whose input-textures can be mathematically-based noise-patterns, as shown above.
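To give an idea of what a mathematically-based noise-pattern as an input-texture means in practice, here is a small Python sketch using Blender’s standard (Cycles) node names. BlendLuxCore defines its own node-tree and node types, so the exact identifiers there would differ, but the principle of wiring a procedural noise texture into a Material is the same:

```python
import bpy

# Create a node-based Material whose colour comes from a procedural noise texture,
# rather than from a U,V-mapped image.  (Cycles node names; LuxCore's own nodes differ.)
mat = bpy.data.materials.new("ProceduralNoiseMaterial")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

noise = nodes.new("ShaderNodeTexNoise")        # mathematically-generated noise pattern
noise.inputs["Scale"].default_value = 8.0

diffuse = nodes.new("ShaderNodeBsdfDiffuse")
output = nodes.new("ShaderNodeOutputMaterial")

# Feed the noise pattern into the shader, instead of an image texture.
links.new(noise.outputs["Color"], diffuse.inputs["Color"])
links.new(diffuse.outputs["BSDF"], output.inputs["Surface"])
```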

Dirk

 

Problems getting LuxCoreRender to work on one of my computers.

In This Earlier Posting, I had written that I used a benchmarking tool named “LuxMark”, to test whether my GPU even works on the computer which I name ‘Plato’. In the process, I made the discovery that there exists a rendering-engine named ‘LuxCoreRender’, which is a ray-tracer, but which does its ray-tracing with code that runs under OpenCL.

I had assumed that the benchmarking tool was free, but that users would need to be paying customers before they could use ‘LuxCoreRender’ to render their own scenes. To my great approval, I’ve discovered that LuxRender and LuxCoreRender are also Free, Open-Source Software. :-D  But to my disappointment, I’ve learned that there is no feasible way in which I could use it on ‘Plato’.

The reason is fairly straightforward. The devs have renamed their renderer from LuxRender to LuxCoreRender, at which point they also carried out a rebuild of their code, so that version 1.6 was discontinued and ‘a cleaner build’ was started, as version 2.x. In order to make practical use of a rendering engine, I need a GUI with which to create scenes for that engine to render. Well, LuxCoreRender has, as a minimum requirement, Blender v2.79b, while the Blender version I have installed on ‘Plato’ is only v2.78a. The requirement that the devs state is strict, because Blender versions before 2.79 contained a bug which would cause crashes. Not only that, but in this case the crashing, user-space application would be one on whose behalf considerable processes are running on the GPU, which can cause severe memory-leaks, as I wrote Here.

Now, there does exist a stand-alone version of LuxCoreRender, v2.x, which in fact runs on ‘Plato’, but which remains rather useless to me, because it can only load and then render scene-descriptions that have been stored to files in Lux’s own format, and not in any other standard.
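For completeness, I understand the stand-alone renderer can also be driven through a Python binding named ‘pyluxcore’. The following is only a rough sketch of that usage: the file-name is a placeholder for a Lux-native render configuration, and the exact calls would need to be checked against the LuxCore API samples.

```python
# Rough sketch only: drives the stand-alone LuxCore renderer through its pyluxcore binding.
# 'scene.cfg' stands for a render-configuration file in Lux's own format -
# exactly the kind of file the stand-alone version expects.
import pyluxcore

pyluxcore.Init()

props = pyluxcore.Properties("scene.cfg")      # load the Lux-native render configuration
config = pyluxcore.RenderConfig(props)
session = pyluxcore.RenderSession(config)

session.Start()
# ... wait or poll here, until the render has converged ...
session.Stop()
```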


How To Install Yafaray Under Linux

One of the computing subtopics I dabble in is the acquisition of 3D-graphics software. Therefore, I already have “Blender 2.78a”, which has its own built-in software-rendering engine, and I have several other rendering engines installed on my Linux-based computers.

Further, the rendering engines by themselves can be useless, unless they integrate well with a GUI (such as Blender). And so one undertaking which I’ll typically attempt with a given computer, is to install “Yafaray”, which used to be ‘Yafray’, which stood for ‘Yet Another Free Ray-Tracer’. If it’s installed properly, Blender can render its scenes using Yafaray, but from within Blender.

Yafray used to be a much simpler piece of software to install than it has become. But I’m sure the effort I put into it this evening will be well worth it eventually. What I’m used to doing is to download a source-tree and, if it’s CMake-based, to run ‘cmake-gui’ on it, to custom-pick my build options, and to go. But as it happened with Yafaray, this approach led to near chaos. What this did was to compile all the source-code properly into libraries, but then to install those libraries to nonsensical locations within my system folders. One reason was the fact that part of the project was to create Python 3 bindings, and another was the need for the Blender-integration, where modern Blender versions are based on Python 3. In any case, I was sure to install all the build dependencies via my package-manager, but doing so was not enough to obtain working outcomes.

(Image: funbutterflysphere3-0001)


More about Framebuffer Objects

In the past, when I was writing about hardware-accelerated graphics – i.e., graphics rendered by the GPU – such as in this article, I chose a phrasing according to which the Fragment Shader eventually computes the color-values of pixels ‘to be sent to the screen’. I felt that this over-simplification could make my topics a bit easier to understand at the time.

A detail which I had deliberately left out was that the rendering target may not be the screen in any given context. What happens is that memory-allocation, even the allocation of graphics-memory, is still carried out by the CPU, not the GPU. And ‘a shader’ is just another way to say ‘a GPU program’. In the case of a “Fragment Shader”, what this GPU program does can better be visualized as shading, whereas in the case of a “Vertex Shader”, it just consists of computations that affect coordinates, and may therefore be referred to just as easily as ‘a Vertex Program’. Separately, there exists a graphics-card extension that allows the language to be the ARB-language, which may also be referred to as defining a Vertex Program. ( :4 )

The CPU sets up the context within which the shader is supposed to run, and one of the elements of this context is to set up a buffer, to which the given Fragment Shader is to render its pixels. The CPU sets this up, just as it sets up the 2D texture images from which the shader fetches texels.

The rendering target of a given shader-instance may be ‘what the user finally sees on his display’, or it may not. Under OpenGL, the rendering target could just be a Framebuffer Object (an ‘FBO’), which has also been set up by the CPU as an available texture-image, from which another shader-instance samples texels. The result of that would be Render To Texture (‘RTT’).
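As a concrete illustration of this CPU-side setup, here is a minimal Python sketch using PyOpenGL, which mirrors the C API. It assumes that an OpenGL context already exists (created by some windowing toolkit), allocates a texture, wraps it in a Framebuffer Object, and thereby makes it available as a Render-To-Texture target; the resolution is just an example value.

```python
# Minimal sketch: the CPU allocates a texture and a Framebuffer Object, so that
# a Fragment Shader's output lands in the texture rather than on the screen.
# Assumes an OpenGL context already exists (created by GLUT, Qt, SDL, etc.).
from OpenGL.GL import *

WIDTH, HEIGHT = 1024, 1024   # example render-target resolution

# 1. Allocate an empty 2D texture to act as the color rendering target.
tex = glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, tex)
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, WIDTH, HEIGHT, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, None)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)

# 2. Create the FBO and attach the texture as its color buffer.
fbo = glGenFramebuffers(1)
glBindFramebuffer(GL_FRAMEBUFFER, fbo)
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, tex, 0)
assert glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE

# 3. While this FBO is bound, draw calls render into 'tex'; afterwards, another
#    shader-instance may bind 'tex' and sample it as an ordinary texture (RTT).
glBindFramebuffer(GL_FRAMEBUFFER, 0)   # back to the default (screen) framebuffer
```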
