
196 points generichuman | 1 comment | source
James_K ◴[] No.43552753[source]
Here's a thought: just distribute source code. ABI issues should be mostly fixed. Most computers can compile source code fast enough for the user not to notice and cache the results so that it's never a problem again. If you want optimised code, you can do a source to source optimisation then zip and minify the file. You could compile such a library to approximately native speeds without much user-end lag using modern JIT methods, and maybe even run LTO in a background thread so that the exectuables outdo dynamically linked ones.
replies(3): >>43552904 #>>43553064 #>>43553077 #
m463 ◴[] No.43553064[source]
the first launch of firefox would take a few hours.

let alone the first boot of the linux kernel... :)

replies(2): >>43554222 #>>43554824 #
adrian_b ◴[] No.43554824[source]
This is somewhat exaggerated.

The compilation of Firefox could take a few hours on a dual-core Skylake laptop CPU from 10 years ago.

Nowadays, on any decent desktop CPU with many cores, the compilation of Firefox should take significantly less than an hour, though it remains one of the handful of open-source applications with a really long compilation time.

The Linux kernel is normally compiled much faster than Firefox, except when one enables compilation of all existing kernel modules, for all the hardware Linux could support, even though almost none of that hardware is present, or ever will be, on the target system.
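The kernel build system itself addresses this: the `localmodconfig` target trims the module configuration down to just the modules currently loaded (as reported by `lsmod`), so the build skips the thousands of drivers for absent hardware. A hedged sketch, which only makes sense when run inside a Linux kernel source tree (the guard below skips the build otherwise):

```shell
# Sketch: build only the modules this machine actually uses.
# Assumes a kernel source tree in the current directory and lsmod available.
if [ -f Kbuild ] && [ -f Makefile ]; then
    mode=build
    yes "" | make localmodconfig   # accept defaults for any new options
    make -j"$(nproc)"              # parallel build across all cores
else
    mode=skip
    echo "not in a kernel source tree; skipping build sketch"
fi
```

A `localmodconfig` build on a many-core desktop typically finishes in minutes, versus the much longer all-modules build the comment describes.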