
200 points jorangreef | 15 comments
1. lokl ◴[] No.24293396[source]
Zig is appealing to me, but I wonder whether time spent mastering Zig would be better spent mastering C.
replies(4): >>24293635 #>>24293667 #>>24294737 #>>24296869 #
2. flohofwoe ◴[] No.24293635[source]
Why not both? Zig and C are both very simple languages, and there's not much to "master" TBH (at least not many language-specific things, so what you learn mostly transfers to other programming languages as well).
replies(1): >>24293690 #
3. cmrdporcupine ◴[] No.24293667[source]
Realistically a $$ career doing embedded or systems-level work will require excellent C and C++, and Zig (or Rust) would just be icing on top if you could find an employer willing to pay you to work in it.

The good thing is that mastering one of these languages gives you conceptual tools which help with becoming at least competent in the others, if not mastering them as well.

4. cmrdporcupine ◴[] No.24293690[source]
To 'master' C is to realize that C itself is not as simple as its syntax suggests. It's an old language and the implementations are by no means straightforward. I'm by no means a C master, but I have worked with people who are, and they know nuances of the language and of the way it compiles down to various platforms in ways that shame me.

But in general I have gone for generalist, not specialist, in my career.

replies(1): >>24293949 #
5. flohofwoe ◴[] No.24293949{3}[source]
Well yes, but in the end all languages have this sort of fractal nature.

But there is diminishing value in how deep you want to go into the rabbit hole. Of course there's always more to learn, but with C you're fairly quickly leaving the language and moving into the layers of compiler and hardware trivia (good to know nonetheless, but often not really relevant for being productive in C), whereas in other higher-level languages you're still working your way through the standard library ;)

replies(1): >>24294596 #
6. cmrdporcupine ◴[] No.24294596{4}[source]
C exposes a lot of things, and also hides a lot of things about the underlying system, which can get confusing. What's an "int"? Or a "long"? You need to know what the bit width is on your platform, because it's not explicit in the name, and the language is willing to do a bunch of implicit stuff behind the scenes with only a warning or two. Should you really be using 'char'? Is yours a legit use of it, or did you mean uint8_t? Other high-level languages generally tend to have more sensible default patterns for these things; C ... it gives you all kinds of ammo to shoot yourself with.

It's not as big of a problem these days with things becoming less heterogeneous; almost everything is little endian now, much of it 64-bit but at least 32-bit, and we can kind of rely on POSIX being there most of the time. Most new code uses stdint.h and is explicit about word lengths by using int32_t, etc. and follows good conventions there.

But venture off the beaten path into odd microcontrollers or into retro machines or port older code or whatever ... and there's glass hidden in the grass all over.

C also exposes a model of the machine that looks low level but behind the scenes a modern processor does all sorts of branch prediction and pipelining and so on that can blow up your assumptions.

What looks like optimized clever C code can actually end up running really slow on a modern machine, and vice versa.

replies(1): >>24294917 #
7. jorangreef ◴[] No.24294737[source]
The first rule of C is that no one masters C, but you could try anyway and still have time to master Zig in a matter of weeks, which is a rounding error. Given that both offer a C-compatible ABI, which would serve your projects better?
replies(2): >>24294781 #>>24294866 #
8. cmrdporcupine ◴[] No.24294781[source]
<rant-time>

I can't help but feel like in our industry C is successful (vs its 80s competition of Pascal/Modula-2, or Ada etc.) partially because of some of the same reasons that Git is successful now. Yes, it is powerful and flexible; but also in some ways unnecessarily arcane and 'dangerous' and _this gives the user a feeling of cleverness_ that is seductive to software engineers.

Put another way: Most of us enjoy the mental stimulation of programming, and we enjoy the mental challenges (in general). C makes us feel clever. Witness the "obfuscated C programming contest" etc.

Same thing that has led to nonsense 'brain teaser' whiteboard-algorithm tests at job interviews. IMHO it's in many cases for the benefit of the interviewer's ego, not the company or the interviewee ("gotcha! no job for you!").

</rant-time>

replies(2): >>24294845 #>>24296848 #
9. jorangreef ◴[] No.24294845{3}[source]
"Put another way: Most of us enjoy the mental stimulation of programming, and we enjoy the mental challenges (in general). C makes us feel clever. Witness the "obfuscated C programming contest" etc."

Yep, only C makes me feel stupid (but I enjoy that experience too!).

replies(1): >>24294884 #
10. dnautics ◴[] No.24294866[source]
If one does both, learning Zig will almost certainly make you a better C programmer, as Zig often forces you into patterns that are best practices in C.
11. cmrdporcupine ◴[] No.24294884{4}[source]
Oh don't get me wrong, I'm a philosophy major drop-out, not a CS student. :-) I have never gotten off on clever-C, and it makes me feel stupid, which yeah, isn't awful either (humbling).

Luckily my day-job has nothing to do with mental gymnastics even though I'm a software engineer at Google and work in plenty of low-level stuff. Most sensible software development bears little resemblance to the stuff on whiteboards in coding interviews etc.

After 20 years of this I know the right thing is to reach for a library, and if that doesn't exist, then reach for Knuth or some other reference rather than try to write it myself from scratch.

12. dnautics ◴[] No.24294917{5}[source]
Isn't what you're talking about (the obscured int widths) due to C being a victim of its own success? Hardware manufacturers implemented Cs that bent the meanings of these types to match their own architectures, against the long-term best interests of C, to "make porting code easier" in the short term.
replies(1): >>24296682 #
13. cmrdporcupine ◴[] No.24296682{6}[source]
Many, many decisions made over a 40-year history add up to potential confusion.
14. PaulDavisThe1st ◴[] No.24296848{3}[source]
Given that you can "write Fortran in any language", I find this analysis unlikely.

I much prefer writing Python or Lisp to C++, but I can't do my job in Python or Lisp, so I write C++.

15. vmchale ◴[] No.24296869[source]
C has many advantages over Zig, mostly because it's standardized and extant.

Don't think Zig is worth it when it doesn't have linear or affine types.