
196 points | generichuman | 1 comment
sylware No.43552021
This article misses a critical point, which is "the right way" to select a glibc ABI version: see the binutils ld documentation, in the second part of the page on VERSION support. This must also cover glibc internal symbols.

This makes it possible to craft ELF binaries on a modern distro that will run on "older" distros, which is critical for games and game engines. There is significant one-time upfront work involved in selecting an "old" glibc ABI.

The quick-and-dirty alternative is to keep a toolchain configured to link against an "old" glibc on the side.

The article also misses the critical -static-libstdc++ option for C++ applications (the C++ ABI is hell on earth), but it does cover -static-libgcc and the dynamic loading of system interface shared libs.

replies(4): >>43552297, >>43552425, >>43552438, >>43552524
api No.43552297
Is there a reason glibc can't just do a better job at keeping some legacy symbols around? It's not like it's big stuff. They're things like legacy string functions. We're talking a few kilobytes of code in most cases.

The Linux kernel goes to a lot of effort to not break user space, at least for non-exotic core features and syscalls. It seems like a lot of user-space in Linux-land does not make the same effort.

It's particularly bad when it's the C library doing this, since that's at the center of the dependency graph for almost everything.

replies(3): >>43552371, >>43552471, >>43552568
AshamedCaptain No.43552568
The problem is the opposite: they are trying to run executables built with a newer glibc on a system that has an older glibc. glibc has kept all the old function definitions since practically forever.

Frankly, I do not understand who would think glibc symbols themselves would be the challenge in this case. Even if you statically link glibc, there's zero guarantee the syscalls will be present in the older Linux kernel (cue .ABI-tag failures), or even that the damn ELF format hasn't changed (e.g. GNU-style hashes). The simple solution is to build on the older Linux (and glibc).

In my long experience with ancient binaries, glibc has almost never been the problem, and its ability to _run_ ancient binaries is nothing short of excellent; even Linux itself is more of a problem than glibc is (for starters, paths everywhere in /proc and /sys change every half-decade or so).

replies(1): >>43552679
forrestthewoods No.43552679
> executables built using a newer glibc

It’s an abomination that Linux toolchains build against the system’s own libraries by default. Catastrophically terrible and stupid decision.

It should be trivial for any program to compile while specifying any arbitrary previous version of glibc as the target.

Linux got this so incredibly wildly wrong. It’s a shame.

replies(1): >>43552738
AshamedCaptain No.43552738
Ehm... you _can_ use the older toolchain even on the newer "Linux", in the same way you have to use an older Visual Studio to target Windows 8 from Windows 10.
replies(2): >>43552929, >>43553479
forrestthewoods No.43553479
I am saying that compiler toolchains on Linux should never, ever, under any circumstances rely on anything from the system when compiling. Compiling against the system's global version of glibc is stupid, bad, wrong, and Linus should be ashamed for letting it happen.

It should be trivial for Windows to cross-compile for Linux for any distro and for any ancient version of glibc.

It is not trivial.

Here is a post describing the mountain range of bullshit Zig had to move to enable trivial cross-compilation and backwards targeting: https://andrewkelley.me/post/zig-cc-powerful-drop-in-replace...

Linux is far and away the worst offender out of Linux, Mac, and Windows. By leaps and bounds.