Why I don’t just compile binaries for 32-bit Linux.

As I’ve mentioned before, from time to time I post source code on this blog, and additionally try to compile it into binaries that will run on the computers of people who do not know how to compile it themselves. In this context, I apologize to Mac users: I do not have the tools at my disposal to also offer ‘OS/X’ or ‘MacOS’ binaries. But I will sometimes offer 32-bit and 64-bit Windows executables (= binaries).

What some people might wonder is, ‘Why does Dirk not also compile 32-bit Linux executables, in addition to 64-bit Linux executables?’

And there is a reason, beyond sheer laziness.

The way it works with 64-bit Windows systems is that each of them ships a 32-bit Windows subsystem (‘WOW64’), which allows it to run 32-bit applications for backwards compatibility. This 32-bit subsystem is also one reason why it’s generally possible to compile C++ programs for 32-bit Windows targets on any 64-bit Windows computer (that has the tools in place for 64-bit Windows targets in the first place).

Unfortunately, the Linux world is not as rosy.

Some Linux systems – actually, most 64-bit Linux systems, I think – are what’s called “Multi-Arch”, and what this means is that a set of 32-bit libraries can exist alongside the full set of 64-bit libraries. The 32-bit libraries are usually installed only as dependencies of specific 32-bit executables.
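On a Debian-style system, this is how Multi-Arch gets switched on and how individual 32-bit libraries arrive; the following is a sketch of system-administration commands, with ‘libc6’ chosen only as an example package:

```shell
# Enable the i386 (32-bit) architecture alongside the native amd64 one:
sudo dpkg --add-architecture i386
sudo apt update

# 32-bit libraries are then installed per-package, with an ":i386" suffix.
# This pulls in only the 32-bit C library, nothing more:
sudo apt install libc6:i386
```

The point is that this happens package by package: enabling Multi-Arch does not magically mirror the hundreds of 64-bit libraries a system already has.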

The way the world of compiling software works is that, after source code has been compiled into Object Files, those Object Files (which are already binary in content) must be linked against Libraries, either static or shared, before an executable is built.

Hence, the compiler flag ‘-m32’ will tell a Linux compiler to force compilation of object code for ‘the 32-bit, Intel, i386 architecture’, as it’s sometimes referred to, even if the CPU isn’t ultimately an Intel one. But i386-architecture Object Files must then also be linked against 32-bit Libraries that are actually present.

Here’s what some people may not know about the Linux world and its Multi-Arch (userland) members: The number of 32-bit libraries they will ultimately have installed will usually not even be one tenth the number of native 64-bit Libraries that the computer actually runs on (assuming a Multi-Arch, 64-bit PC). Hence, if a program simply consists of the “Hello World!” example, nothing will go wrong.

But if the software project needs to be linked against 40+ libraries, then chances are that the host computer has, maybe, the 32-bit versions of 4 of those on hand…
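To get a feel for how quickly library counts grow, one can ask ‘ldd’ to list the shared libraries an installed binary was linked against; ‘/bin/ls’ is just a convenient example of a small, everyday program:

```shell
# List, then count, the shared-library dependencies of a small program:
ldd /bin/ls
ldd /bin/ls | wc -l
# A Qt-based GUI application typically reports dozens of entries here,
# and a Multi-Arch build would need a 32-bit copy of every one of them.
```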

Further, I use certain automated tools, such as ‘linuxdeployqt’, which re-links executables that have already been linked against the 64-bit libraries on my own computer, so that instead they are linked, as autonomously as possible, against libraries bundled into a generated ‘AppImage’. I cannot rely on this tool being Multi-Arch as well.

And so, in certain ways, when a Linux computer is serving as a build platform, it can be harder, not easier, than it is with Windows just to target some other platform. More typically, a Linux computer is set up to target the same platform it is itself installed on.

Sorry again.

Now, an exception exists, where Debian Maintainers have cross-compiled many of their packages to run on novel architectures, such as the ARM CPUs that power most Android devices. This is a very tedious and complex process, by which those maintainers first have to cross-compile the libraries, resulting in library packages, and then link each executable against its compatible set of compiled libraries, resulting in end-user packages. (:1)

 

If my readers truly only have 32-bit Linux computers and want to run my executables, and if I have provided a 32-bit Windows executable, then usually that executable will run just fine ‘under Wine’. One could try that.

 

(Updated 8/03/2021, 22h10… )


Certain things which the almighty CMake utility cannot do.

CMake happens to be a friend of mine. On my Linux computers, if I need to custom-compile some software, and if that software does not come with the older ‘./configure’ scripts…, then chances are that the source tree has a ‘CMakeLists.txt’ file in its root directory. On an ‘amd64’- or an ‘i386’-based computer, this is usually all that I need, to create Makefiles and then to compile the project.
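For readers who haven’t met one, a minimal ‘CMakeLists.txt’ looks roughly like this; the project and file names are invented for illustration:

```cmake
cmake_minimum_required(VERSION 3.10)
project(hello CXX)               # declares a C++ project; triggers compiler checks
add_executable(hello hello.cpp)  # one target, built from one source file
```

One then typically runs ‘cmake ..’ from an empty build directory to generate the Makefiles, followed by ‘make’.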

But I have recently run into a situation where this utility became useless to me. It was an ‘aarch64-linux-gnu’, aka ‘arm64’-based Guest System, running within ‘TightVNC’, running within an Android Host System, via the ‘UserLAnd’ Android app, which uses ‘proot’ to sandbox the Linux Guest System. I tried to use CMake as usual (not actually trying to cross-compile anything), and was startled by what the utility told me next: that my C++ compiler was broken, because CMake could not compile the test program that CMake, in turn, generally tests compilers with.

What I found out was that it was not the compiler’s fault; rather, the apparent magic that allows CMake to find libraries was not working on this platform. Hence, the compiler was failing its test because CMake could not discern even a single library’s location, nor any of the other variables that would ultimately have been relevant to the project. The compiler’s test program was not even being linked against ‘libstdc++.so’.
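If the probe really was failing only at the link stage, there is one documented workaround I could have tried: since CMake 3.6, the variable below tells the compiler check to build a static library instead of a linked executable, which skips linking entirely. I am not claiming it would have rescued this particular ‘proot’ setup:

```cmake
# Make CMake's compiler sanity-check compile only, without linking:
set(CMAKE_TRY_COMPILE_TARGET_TYPE STATIC_LIBRARY)
```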

I basically had to give up on using CMake on that platform, as well as on custom-compiling many software projects that had been written by other programmers.

What I have learned, however, is the apparent fact that, when true experts write the ‘CMakeLists.txt’ file to do so, they can even get it to cross-compile their projects to a platform such as this one. But those would be their projects, and not my own project.
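The usual mechanism those experts rely on is a so-called toolchain file, passed to CMake via ‘-DCMAKE_TOOLCHAIN_FILE=…’. The following is only a sketch; the compiler names and the root path are assumptions about a Debian-style cross-toolchain:

```cmake
# toolchain-arm64.cmake — cross-compile for aarch64 from some other host.
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR aarch64)
set(CMAKE_C_COMPILER   aarch64-linux-gnu-gcc)
set(CMAKE_CXX_COMPILER aarch64-linux-gnu-g++)
# Search for headers and libraries only in the target's world, never the host's:
set(CMAKE_FIND_ROOT_PATH /usr/aarch64-linux-gnu)
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
```

The ‘CMAKE_FIND_ROOT_PATH’ settings are precisely what substitutes for the “apparent magic” of library discovery that failed me above.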

Dirk