
Things Zig comptime won't do

(matklad.github.io)
458 points | JadedBlueEyes
ephaeton No.43745670
zig's comptime has some (objectively: debatable; subjectively: definite) shortcomings that the zig community then overcomes by using zig build to generate code as strings that are later on @imported and compiled.

Practically, "zig build"-time-eval. As such there's another 'comptime' stage with more freedom, unlimited run-time (no @setEvalBranchQuota), can do IO (DB schema, network lookups, etc.) but you lose the freedom to generate zig types as values in the current compilation; instead of that you of course have the freedom to reduce->project from target compiled semantic back to input syntax down to string to enter your future compilation context again.

Back in the day, when I had to glue perl and tcl together via C, passing strings of perl generated through tcl is what this whole thing reminds me of. Sure, it works. I'm not happy about it. There's _another_ "macro" stage that you can't even see in your code (it's just an @import).

The zig community bewilders me at times with their love for lashing themselves. The discussions about which new sort of self-harm they'd love to enforce on everybody are borderline disturbing.

bsder No.43746029
> The zig community bewilders me at times with their love for lashing themselves. The discussions about which new sort of self-harm they'd love to enforce on everybody are borderline disturbing.

Personally, I find the idea that a compiler might be able to reach outside itself completely terrifying (Access the network or a database? Are you nuts?).

That should be 100% the job of a build system.

Now, you can certainly argue about whether generating a text file is the best way to reify the result back into the compiler. However, whatever the compiler gets and generates should be completely deterministic.

SleepyMyroslav No.43749876
>Personally, I find the idea that a compiler might be able to reach outside itself completely terrifying (Access the network or a database? Are you nuts?).

In gamedev, code is a small part of the end product. "Data-driven" is the term if you want to look it up. Doing an optimization pass that partially evaluates data and code together as part of the build is normal. Code typically has a 'development version' that supports data modifications and a 'shipping version' that can assume the data is known.
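
To make the 'development version' vs. 'shipping version' split concrete, here is a tiny illustrative sketch in Zig; the file name, size limit, and the debug/release switch are assumptions for the example, not how any particular engine does it:

    const std = @import("std");
    const builtin = @import("builtin");

    // Shipping builds bake the tuning data into the binary so the build can
    // specialize code against data it already knows; development builds read it
    // from disk so designers can edit it without recompiling.
    const baked_tuning = @embedFile("tuning.json"); // hypothetical data file

    pub fn loadTuning(allocator: std.mem.Allocator) ![]u8 {
        if (builtin.mode == .Debug) {
            // Development version: data is modifiable, reload it on every run.
            return std.fs.cwd().readFileAlloc(allocator, "tuning.json", 1 << 20);
        }
        // Shipping version: data is known at build time; dupe it so the caller
        // owns the result either way.
        return allocator.dupe(u8, baked_tuning);
    }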

The more traditional case of PGO+LTO is just another example of how code can be specialized for existing data. I don't know of a toolchain that survives a change of PGO profiling data between builds without drastic changes in the resulting binary.

bsder No.43756356
Is the PGO data not a static file which is then fed into the compiler? That still gives you a deterministic compiler, no?