
200 points jorangreef | 2 comments
lokl ◴[] No.24293396[source]
Zig is appealing to me, but I wonder whether time spent mastering Zig would be better spent mastering C.
replies(4): >>24293635 #>>24293667 #>>24294737 #>>24296869 #
flohofwoe ◴[] No.24293635[source]
Why not both? Zig and C are both very simple languages, and there's not much to "master" TBH (at least not many language-specific things, so what you learn mostly transfers to other programming languages as well).
replies(1): >>24293690 #
cmrdporcupine ◴[] No.24293690[source]
To 'master' C is actually realizing C itself is not as simple as it looks from its syntax. It's an old language and the implementations are by no means straightforward. I'm by no means a C master but I have worked with people who are, and they know nuances of the language and the way it compiles down to various platforms in ways that shame me.

But in general I have gone for generalist, not specialist, in my career.

replies(1): >>24293949 #
flohofwoe ◴[] No.24293949[source]
Well yes, but in the end all languages have this sort of fractal nature.

But there is diminishing value in how deep you want to go down the rabbit hole. Of course there's always more to learn, but with C you fairly quickly leave the language and move into layers of compiler and hardware trivia (good to know nonetheless, but often not really relevant to being productive in C), whereas in other, higher-level languages you're still working your way through the standard library ;)

replies(1): >>24294596 #
cmrdporcupine ◴[] No.24294596[source]
C exposes a lot of things about the underlying system, and also hides a lot of things, which can get confusing. What's an "int"? Or a "long"? You need to know what the bit width is on your platform, because it's not explicit in the name, and the language is willing to do a bunch of implicit stuff behind the scenes with only a warning or two. Should you really be using 'char'? Is yours a legit use of it, or did you mean uint8_t? Other high-level languages generally tend to have more sensible default patterns for these things; C gives you all kinds of ammo to shoot yourself with.
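
To make this concrete, here's a toy program (a hypothetical example of mine, not something from upthread) showing how the widths and the implicit conversions bite. The exact output depends on your platform and ABI:

    /* Hypothetical illustration: platform-dependent widths and
       implicit conversions. Output varies by platform/ABI. */
    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        /* "int" and "long" widths depend on the ABI: long is
           64-bit on most 64-bit Unixes (LP64) but 32-bit on
           64-bit Windows (LLP64) -- both are conforming. */
        printf("int:  %zu bits\n", sizeof(int) * CHAR_BIT);
        printf("long: %zu bits\n", sizeof(long) * CHAR_BIT);

        /* Whether plain "char" is signed is implementation-
           defined, so this test can go either way. */
        char c = (char)0x80;
        if (c < 0)
            puts("char is signed here");
        else
            puts("char is unsigned here");

        /* Classic implicit-conversion trap: the unsigned operand
           "wins", so -1 is converted to a huge unsigned value and
           the comparison goes the "wrong" way. */
        unsigned int u = 1;
        int i = -1;
        if (i < u)
            puts("-1 < 1u, as you'd expect");
        else
            puts("-1 > 1u after the usual arithmetic conversions");
        return 0;
    }

Most compilers will give you at most a -Wsign-compare warning for that last one, and only if you ask for it.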

It's not as big of a problem these days, with things becoming less heterogeneous: almost everything is little-endian now, much of it 64-bit but at least 32-bit, and we can kind of rely on POSIX being there most of the time. Most new code uses stdint.h, is explicit about word lengths by using int32_t, etc., and follows good conventions there.
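
For instance, the stdint.h style plus an explicit byte-order check looks something like this (a minimal sketch, assuming C99 or later):

    /* Fixed-width types from stdint.h: no guessing about what
       "int" or "char" mean on this platform. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint32_t word = 0x01020304u;
        /* Inspecting object representation through an unsigned
           char pointer is well-defined. */
        const uint8_t *bytes = (const uint8_t *)&word;

        if (bytes[0] == 0x04)
            puts("little endian");
        else
            puts("big endian");
        return 0;
    }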

But venture off the beaten path, into odd microcontrollers or retro machines, or port older code, or whatever ... and there's glass hidden in the grass all over.

C also exposes a model of the machine that looks low level, but behind the scenes a modern processor does all sorts of branch prediction and pipelining and so on that can blow up your assumptions.

What looks like optimized clever C code can actually end up running really slow on a modern machine, and vice versa.
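
A classic way to see this (a toy benchmark of my own, not anything from upthread): the same branchy loop can run several times faster once its input is sorted, because the branch predictor can learn the pattern. Compile with little or no optimization, since a modern compiler may rewrite the branch into a conditional move or vectorize the loop and hide the effect:

    /* Toy demo: identical code, very different speed depending on
       whether the branch is predictable. Timings are rough. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N (1 << 20)

    static int cmp(const void *a, const void *b) {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    static long long sum_big(const int *v, int n) {
        long long sum = 0;
        for (int i = 0; i < n; i++)
            if (v[i] >= 128)   /* ~50/50, unpredictable on random data */
                sum += v[i];
        return sum;
    }

    int main(void) {
        static int v[N];
        for (int i = 0; i < N; i++)
            v[i] = rand() % 256;

        clock_t t0 = clock();
        long long s1 = sum_big(v, N);
        clock_t t1 = clock();

        qsort(v, N, sizeof v[0], cmp);  /* same data, now predictable */

        clock_t t2 = clock();
        long long s2 = sum_big(v, N);
        clock_t t3 = clock();

        printf("unsorted: %lld (%ld ticks)\n", s1, (long)(t1 - t0));
        printf("sorted:   %lld (%ld ticks)\n", s2, (long)(t3 - t2));
        return 0;
    }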

replies(1): >>24294917 #
dnautics ◴[] No.24294917[source]
Is what you're talking about (the obfuscated int widths) due to C being a victim of its own success? That is, hardware manufacturers implementing Cs that elided the meanings of these types to match their own architectures, against the long-term best interests of C, to "make porting code easier" in the short term?
replies(1): >>24296682 #
cmrdporcupine ◴[] No.24296682[source]
Many, many decisions made over a 40-year history add up to potential confusion.