
79 points mpweiher | 10 comments
1. stevefan1999 ◴[] No.43116899[source]
C is undeniably a legendary programming language, but it's time to move beyond the constraints of the C abstract machine, which was heavily shaped by the PDP-11 due to Unix's origins on that architecture. C feels outdated for modern computing needs.

It lacks lambdas, closures, and coroutines: powerful, proven constructs that are essential in modern programming languages. These limitations make it harder to fully embrace contemporary programming practices.
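
For what it's worth, the closest plain C gets to a closure is a function pointer paired with an explicitly passed context struct, roughly like this (a minimal sketch; the names are made up for illustration):

    #include <stdio.h>

    /* "Closure" emulated by hand: the captured state lives in a struct
       that has to be threaded through every call explicitly. */
    typedef struct {
        int base;
    } adder_ctx;

    static int add(void *env, int x) {
        adder_ctx *ctx = env;   /* manual capture */
        return ctx->base + x;
    }

    /* Generic callbacks must take the environment as an extra argument. */
    static int apply(int (*fn)(void *, int), void *env, int arg) {
        return fn(env, arg);
    }

    int main(void) {
        adder_ctx ctx = { .base = 10 };
        printf("%d\n", apply(add, &ctx, 32));   /* prints 42 */
        return 0;
    }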

The dominance of C and its descendants has kept our systems tied to its design, holding back progress. Intel tried to introduce hardware-assisted garbage collection, which failed miserably because C doesn't need it, so we still have to do garbage collection entirely in software.

While I’m not suggesting we abandon C entirely (I still use it, like when writing C FFI for some projects), we need to explore new possibilities and move beyond C to better align with modern architectures and programming needs.

replies(4): >>43117623 #>>43117644 #>>43120821 #>>43123545 #
2. codr7 ◴[] No.43117623[source]
I'm pretty sure that the moment a significantly better alternative offers the same flexibility and control, plenty of people will jump.

Same for C++.

Assuming everyone else is an idiot leads nowhere worth going.

replies(1): >>43117656 #
3. flohofwoe ◴[] No.43117644[source]
Newly created programming languages specifically tailored to GPUs (e.g. all the shading-language dialects like MSL, GLSL, WGSL, HLSL, ...) are not limited by backward-compatibility obligations to C, and the execution model of GPUs is very different from that of traditional CPUs. Yet all those languages turned out to be not much different from C extended with a handful of new types and attributes.

Intel's iAPX 432 failed because it couldn't beat the much simpler, faster, and cheaper 'stop-gap' x86 design, not because of some C or PDP-11 conspiracy (and the Motorola 68k was much closer in spirit to the PDP ISA and a 'better fit' for C, yet it also lost to the rather crude x86 design).

4. ff317 ◴[] No.43117656[source]
https://ziglang.org/ is a solid future C-replacement, IMHO. There are pretty much no downsides and all upsides from a C hacker's perspective. It just hasn't reached 1.0 yet!
replies(2): >>43117769 #>>43168829 #
5. flohofwoe ◴[] No.43117769{3}[source]
Zig is a nice language, but from a 10000 ft view it's not fundamentally different from C (thankfully) - at least from the CPU's point of view. Any hardware that's a good match for C is also a good match for Zig.
6. tengwar2 ◴[] No.43120821[source]
:%s/essential/moderately desirable/g
7. an-unknown ◴[] No.43123545[source]
> It lacks lambdas, closures, and coroutines: powerful, proven constructs that are essential in modern programming languages. These limitations make it harder to fully embrace contemporary programming practices.

And what features exactly would you propose for a future CPU to have to support such language constructs? It's not like a CPU is necessarily built to "support C": a lot of code these days is written in Java/JavaScript/Python/..., but as it turns out, roughly any sane CPU can be used as a target for a C compiler. Many extensions of current CPUs are not necessarily used by an average C compiler. Think of various audio/video/AI/vector/... extensions. Yet all of them can be used from C code, as well as from any software designed to make use of them. If there is a useful CPU extension that benefits, say, the JVM or V8, you can be sure those VMs will use it, regardless of whether or not it is useful for C.
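
For example, a vector extension is typically exposed to C through compiler intrinsics rather than through anything in the language itself; a minimal sketch using x86 SSE (assuming an x86-64 compiler that ships <immintrin.h>):

    #include <immintrin.h>
    #include <stdio.h>

    int main(void) {
        /* Four single-precision lanes added by one SSE instruction. */
        __m128 a = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
        __m128 b = _mm_set_ps(40.0f, 30.0f, 20.0f, 10.0f);
        __m128 sum = _mm_add_ps(a, b);

        float out[4];
        _mm_storeu_ps(out, sum);
        printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
        return 0;   /* prints 11.0 22.0 33.0 44.0 */
    }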

> Intel tried to introduce hardware-assisted garbage collection, which failed miserably because C doesn't need it, so we still have to do garbage collection entirely in software.

Meanwhile, IBM did in fact successfully add hardware-assisted GC for the JVM on their Z-series mainframes. IBM can do that, since they literally sell CPUs purely for Java workloads. On a "normal" general-purpose CPU, such an "only useful for Java" GC extension would be completely useless if you plan to run, say, only JavaScript or PHP code on it. The problem with such extensions is that every language needs slightly different GC semantics, so it is an active research topic how to generalize this into a "GC assist" instruction that is useful for many different language VMs. Such extensions are currently being prototyped for RISC-V, in case you missed it. IIRC, for GC in particular, research was heading toward a generalized graph-traversal extension, since that is the one thing most language VMs can use somehow.
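
To give a rough idea of what such a "GC assist" would have to accelerate: the mark phase of a tracing collector is essentially a pointer-chasing graph traversal, something like this software-only sketch (the object layout and names are hypothetical, purely for illustration):

    #include <stdbool.h>
    #include <stddef.h>

    /* Hypothetical heap object: each object knows its outgoing pointers. */
    typedef struct obj {
        bool marked;
        size_t nchildren;
        struct obj **children;
    } obj;

    /* Mark phase: iterative traversal over an explicit worklist. This
       pointer-chasing loop is what a hardware graph-traversal extension
       would try to speed up. */
    void gc_mark(obj **roots, size_t nroots, obj **worklist, size_t cap) {
        size_t top = 0;
        for (size_t i = 0; i < nroots; i++) {
            if (roots[i] && !roots[i]->marked && top < cap) {
                roots[i]->marked = true;
                worklist[top++] = roots[i];
            }
        }
        while (top > 0) {
            obj *o = worklist[--top];
            for (size_t i = 0; i < o->nchildren; i++) {
                obj *c = o->children[i];
                if (c && !c->marked && top < cap) {
                    c->marked = true;
                    worklist[top++] = c;
                }
            }
        }
    }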

C is in no way "holding back" CPU designs, but being able to run C code efficiently is certainly a requirement for any CPU architecture that hopes to become relevant, since a lot of software today is (still) written in C (and C++), including the OS and browser you used to write your comment.

Just to be clear: this topic here is about tiny microcontrollers. The only relevant languages for such microcontrollers are C/C++/assembly. Nobody cares whether they can do hardware-assisted GC or closures/coroutines/... or anything of the sort.

8. baranul ◴[] No.43168829{3}[source]
Not everyone thinks of Zig as a "no downsides and all upsides" C-replacement. First, a lot of people will take issue with it still being in beta, and it is unknown how many more years it will take to reach 1.0. There are a bunch of C-replacements, or at least viable alternative languages, out there, both old and new, with more "C-killers" likely to pop up in the not-so-distant future.

There are also a lot of people who, after reviewing the Zig language, don't like it. Muratori (Handmade Hero) won't touch it, and a recent article that's been covered on here and other sites explains why its author stopped using it (linked below).

https://strongly-typed-thoughts.net/blog/zig-2025 (Zig; what I think after months of using it)

replies(1): >>43179944 #
9. hitekker ◴[] No.43179944{4}[source]
IIRC, the blog you linked was written by someone who loves Rust and other languages which have, to say the least, a different philosophy from C and Zig.

I'm not familiar with Muratori's opinion on Zig; do you have a link?

replies(1): >>43197853 #
10. baranul ◴[] No.43197853{5}[source]
Muratori, who is a well-known C programmer and instructor, has made his dislike of Zig quite clear[1]. Others, like Tsoding (many videos) and Kihlander (doesn't like the syntax, among other things), have given clear reasons for their dislike of Zig or why it was not their preference[2][3]. Various recognized programmers are not going to go along with "no downsides and all upsides", which, for any language still in beta, would be a huge stretch in believability by itself.

It's not a Rust thing: many C/C++ programmers are not advocates of Rust either, and when venturing out to something else, they may prefer other languages. Tsoding has even dunked on Rust as being unreadable[4].

[1]: https://www.youtube.com/watch?v=uVVhwALd0o4 (Language Perf... from 29:50)

[2]: https://www.youtube.com/watch?v=r49hMsruwps (Tsoding when Zig :-))

[3]: https://kihlander.net/post/a-zig-diary/

[4]: https://www.youtube.com/watch?v=omVpuhch9MQ (Rust Is UNREADABLE)