Things Zig comptime won't do

(matklad.github.io)
458 points by JadedBlueEyes | 96 comments
1. pron ◴[] No.43745438[source]
Yes!

To me, the uniqueness of Zig's comptime is a combination of two things:

1. comptime replaces many other features that would be specialised constructs in other languages, with or without rich compile-time (or runtime) metaprogramming, and

2. comptime is referentially transparent [1], which makes it strictly "weaker" than AST macros but simpler to understand; what's surprising is just how capable a comptime mechanism with access to introspection can be, even without the referentially opaque power of macros.

These two give Zig a unique combination of simplicity and power. We're used to seeing things like that in Scheme and other Lisps, but the approach in Zig is very different. The outcome isn't as general as in Lisp, but it's powerful enough while keeping code easier to understand.

You can like it or not, but it is very interesting and very novel (the novelty isn't in the feature itself, but in the place it has in the language). Languages with a novel design and approach that you can learn in a couple of days are quite rare.

[1]: In short, this means that you get no access to names or expressions, only the values they yield.

replies(7): >>43745704 #>>43745928 #>>43746682 #>>43747113 #>>43747250 #>>43749014 #>>43749546 #
2. User23 ◴[] No.43745704[source]
Has anyone grafted Zig style macros into Common Lisp?
replies(4): >>43745832 #>>43745860 #>>43746089 #>>43753782 #
3. toxik ◴[] No.43745832[source]
Isn’t this kind of thing sort of the default thing in Lisp? Code is data so you can transform it.
replies(2): >>43746555 #>>43747714 #
4. Zambyte ◴[] No.43745860[source]
There isn't really as clear of a distinction between "runtime" and "compile time" in Lisp. The comptime keyword is essentially just the opposite of quote in Lisp. Instead of using comptime to say what should be evaluated early, you use quote to say what should be evaluated later. Adding comptime to Lisp would be weird (though obviously not impossible, because it's Lisp), because that is essentially the default for expressions.
replies(1): >>43746099 #
5. paldepind2 ◴[] No.43745928[source]
I was a bit confused by the remark that comptime is referentially transparent. I'm familiar with the term as it's used in functional programming to mean that an expression can be replaced by its value (stemming from it having no side-effects). However, from a quick search I found an old related comment by you [1] that clarified this for me.

If I understand correctly you're using the term in a different (perhaps more correct/original?) sense where it roughly means that two expressions with the same meaning/denotation can be substituted for each other without changing the meaning/denotation of the surrounding program. This property is broken by macros. A macro in Rust, for instance, can distinguish between `1 + 1` and `2`. The comptime system in Zig in contrast does not break this property as it only allows one to inspect values and not un-evaluated ASTs.

[1]: https://news.ycombinator.com/item?id=36154447

replies(2): >>43746591 #>>43746707 #
6. Conscat ◴[] No.43746089[source]
The Scopes language might be similar to what you're asking about. Its notion of "spices", which complement the "sugars" feature, is a similar kind of constant evaluation. It's not a Common Lisp dialect, though it is sexp-based.
7. Conscat ◴[] No.43746099{3}[source]
The truth of this varies between Lisp based languages.
8. fn-mote ◴[] No.43746555{3}[source]
There are no limitations on the transformations in lisp. That can make macros very hard to understand. And hard for later program transformers to deal with.

The innovation in Zig is the restrictions that limit the power of macros.

9. deredede ◴[] No.43746591[source]
Those are equivalent, I think. If you can replace an expression by its value, any two expressions with the same value are indistinguishable (and conversely a value is an expression which is its own value).
10. cannabis_sam ◴[] No.43746682[source]
Regarding 2: How are comptime values restricted to total computations? Is it just the fact that the compiler actually finished, or are there restrictions on comptime evaluations?
replies(2): >>43746763 #>>43750962 #
11. pron ◴[] No.43746707[source]
Yes, I am using the term more correctly (or at least more generally), although the way it's used in functional programming is a special case. A referentially transparent term is one whose sub-terms can be replaced by their references without changing the reference of the term as a whole. A functional programming language is simply one where all references are values or "objects" in the programming language itself.

The expression `i++` in C is not a value in C (although it is a "value" in some semantic descriptions of C), yet a C expression containing `i++` cannot distinguish between `i++` and any other C operation that increments i by 1, and so is referentially transparent; that covers pretty much all C expressions except those involving C macros.

Macros are not referentially transparent because they can distinguish between, say, a variable whose name is `foo` and is equal to 3 and a variable whose name is `bar` and is equal to 3. In other words, their outcome may differ not just by what is being referenced (3) but also by how it's referenced (`foo` or `bar`), hence they're referentially opaque.

12. pron ◴[] No.43746763[source]
They don't need to be restricted to total computation to be referentially transparent. Non-termination is also a reference.
13. keybored ◴[] No.43747113[source]
I’ve never managed to understand your years-long[1] manic praise of this feature, given that you’re a language implementer.

It’s very cool to be able to just say “Y is just X”. You know in a museum. Or at a distance. Not necessarily as something you have to work with daily. Because I would rather take something ranging from Java’s interface to Haskell’s typeclasses since once implemented, they’ll just work. With comptime types, according to what I’ve read, you’ll have to bring your T to the comptime and find out right then and there if it will work. Without enough foresight it might not.

That’s not something I want. I just want generics or parametric polymorphism or whatever it is to work once it compiles. If there’s a <T> I want to slot in T without any surprises. And whether Y is just X is a very distant priority at that point. Another distant priority is if generics and whatever else is all just X undernea... I mean just let me use the language declaratively.

I felt like I was on the idealistic end of the spectrum when I saw you criticizing other languages that are not installed on 3 billion devices as too academic.[2] Now I’m not so sure?

[1] https://news.ycombinator.com/item?id=24292760

[2] But does Scala technically count since it’s on the JVM though?

replies(4): >>43747748 #>>43748238 #>>43751080 #>>43752087 #
14. WalterBright ◴[] No.43747250[source]
It's not novel. D pioneered compile time function execution (CTFE) back around 2007. The idea has since been adopted in many other languages, like C++.

One thing it is used for is generating string literals, which then can be fed to the compiler. This takes the place of macros.

CTFE is one of D's most popular and loved features.

replies(5): >>43747836 #>>43747875 #>>43749766 #>>43750357 #>>43751134 #
15. TinkersW ◴[] No.43747714{3}[source]
Lisp is so powerful, but without static types you can't even do basic stuff like overloading, and you have to invent a way to even check the type (for custom types) so you can branch on it.
replies(3): >>43747885 #>>43748318 #>>43749199 #
16. ww520 ◴[] No.43747748[source]
I'm sorry, but I don't understand your complaint about comptime. All the stuff you said you wanted to work (generics, parametric polymorphism, slotting in <T>, etc.) just works with comptime. People praise comptime because it's a simple mechanism that replaces what other languages need several separate features to do. Comptime is very simple and natural to use. It fits into your day-to-day programming without much fuss.
replies(1): >>43749782 #
17. az09mugen ◴[] No.43747836[source]
A little bit out of context, I just want to thank you and all the contributors for the D programming language.
replies(1): >>43749238 #
18. msteffen ◴[] No.43747875[source]
If I understand TFA correctly, the author claims that D’s approach is actually different: https://matklad.github.io/2025/04/19/things-zig-comptime-won...

“In contrast, there’s absolutely no facility for dynamic source code generation in Zig. You just can’t do that, the feature isn’t! [sic]

Zig has a completely different feature, partial evaluation/specialization, which, none the less, is enough to cover most of use-cases for dynamic code generation.”

replies(4): >>43748490 #>>43749693 #>>43750330 #>>43755195 #
19. dokyun ◴[] No.43747885{4}[source]
> Lisp is so powerful, but <tired old shit from someone who's never used Lisp>.

You use defmethod for overloading. Types check themselves.

replies(1): >>43751628 #
20. hitekker ◴[] No.43748238[source]
Do you have a source for "criticizing other languages not installed on 3 billion devices as too academic" ?

Without more context, this comment sounds like rehashing old (personal?) drama.

replies(1): >>43749742 #
21. wild_egg ◴[] No.43748318{4}[source]
> but without static types

So add static types.

https://github.com/coalton-lang/coalton

22. WalterBright ◴[] No.43748490{3}[source]
The partial evaluation/specialization is accomplished in D using a template. The example from the link:

    fn f(comptime x: u32, y: u32) u32 {
        if (x == 0) return y + 1;
        if (x == 1) return y * 2;
        return y;
    }
and in D:

    uint f(uint x)(uint y) {
        if (x == 0) return y + 1;
        if (x == 1) return y * 2;
        return y;
    }
The two parameter lists make it a function template, the first set of parameters are the template parameters, which are compile time. The second set are the runtime parameters. The compile time parameters can also be types, and aliased symbols.
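For completeness, here is a hedged C++ sketch of the same function (mine, not from the thread): a non-type template parameter plays the role of the compile-time parameter, and `if constexpr` discards the untaken branches in each instantiation.

```cpp
// x is the compile-time parameter, y the runtime one; f<0>, f<1>,
// and f<2> are separately generated functions, with the dead
// branches folded away at compile time.
template <unsigned x>
constexpr unsigned f(unsigned y) {
    if constexpr (x == 0) return y + 1;
    else if constexpr (x == 1) return y * 2;
    else return y;
}
```

Each distinct `x` stamps out its own specialization, just as the D and Zig versions do.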
replies(2): >>43752564 #>>43753736 #
23. ◴[] No.43749014[source]
24. pjmlp ◴[] No.43749199{4}[source]
No need for overloading when you have CLOS and multi-method dispatch.
25. WalterBright ◴[] No.43749238{3}[source]
That means a lot to us. Thanks!
26. baranul ◴[] No.43749546[source]
Comptime is often pushed as being something extraordinarily special, when it's not. Many other languages have something similar: Jai, Vlang, Dlang, etc.

What could be argued is whether Zig's version of it is comparatively better, but that is a very difficult argument to make. Not only are the different languages used differently, but something like an overall comparison of features looks to be needed in order to make any kind of convincing case beyond hyping a particular feature.

replies(1): >>43754886 #
27. baazaa ◴[] No.43749693{3}[source]
That's a comically archaic way of using the verb "to be", not a grammatical error. You see it in phrases like "to be or not to be", or "I think, therefore I am". "The feature isn't" just means it doesn't exist.
replies(1): >>43755258 #
28. keybored ◴[] No.43749742{3}[source]
pron has been posting about programming languages for years and years, here, in public, for all to see. I guess reading them makes it personal? (We don’t know each other)

The usual persona is the hard-nosed pragmatist[1] who thinks language choice doesn’t matter and that PL preference is mostly about “programmer enjoyment”.

[1] https://news.ycombinator.com/item?id=16889706

Edit: The original claim might have been skewed. Due to occupation the PL discussions often end up being about Java related things, and the JVM language which is criticized has often been Scala specifically. Here he recommends Kotlin over Scala (not Java): https://news.ycombinator.com/item?id=9948798

29. Someone ◴[] No.43749766[source]
> D pioneered compile time function execution (CTFE) back around 2007

Pioneered? Forth had that in the 1970s, lisp somewhere in the 1960s (I’m not sure whether the first versions of either had it, so I won’t say 1970 respectively 1960), and there may be other or even older examples.

replies(1): >>43754216 #
30. keybored ◴[] No.43749782{3}[source]
comptime can’t outright replace many language features, because it chooses different tradeoffs to get to where it wants. You get “one thing to rule them all” at the expense of less declarative use.

Which I already said in my original comment. But here’s a source that I didn’t find last time: https://strongly-typed-thoughts.net/blog/zig-2025#comptime-i...

Academics have thought about evaluating things at compile time (or any time) for decades. No, you can’t just slot in eval at a weird place that no one ever thought of (they did) and immediately solve a suite of problems that other languages use multiple discrete features for (there’s a reason they do that).

replies(1): >>43750438 #
31. sixthDot ◴[] No.43750330{3}[source]
Sure, CTFE can be used to generate strings, then later "mixed-in" as source code, but also can be used to execute normal functions and then the result can be stored in a compile-time constant (in D that's the `enum` storage class), for example generating an array using a function literal called at compile-time:

   // assumes: import std.range, std.algorithm, std.array;
   enum arr = { return iota(5).map!(i => i * 10).array; }();
   static assert(arr == [0,10,20,30,40]);
32. throwawaymaths ◴[] No.43750357[source]
You're missing the point. If anything D is littered with features and feature bloat (CTFE included). Zig (as the author of the blog mentions) is more than somewhat defined by what it can't do.
replies(1): >>43754253 #
33. throwawaymaths ◴[] No.43750438{4}[source]
> comptime can’t outright replace many language features because it chooses different tradeoffs to get to where it wants.

You're missing the point. I don't have any theory to qualify this, but:

I've worked in a language with lisp-ey macros, and I absolutely hate hate hate when people build too-clever DSLs that hide a lot of weird shit like creating variable names or pluralizing database tables for me, swapping camel-case and snake case, creating a ton of logic under the hood that's hard to chase.

Zig's comptime for the most part shies you away from those sorts of things. So yes, it doesn't have full feature parity in the language-theory sense, but it really blocks you from, or discourages you away from, shit you don't need to do (please, for the love of god, don't). Hard to justify theoretically; it's real though.

It's just something you notice after working with it for while.

replies(1): >>43752052 #
34. mppm ◴[] No.43750962[source]
Yes, comptime evaluation is restricted to a configurable number of back-branches. 1000 by default, I think.
35. pron ◴[] No.43751080[source]
My "manic praise" extends to the novelty of the feature as Zig's design is revolutionary. It is exciting because it's very rare to see completely novel designs in programming languages, especially in a language that is both easy to learn and intended for low-level programming.

I wait 10-15 years before judging if a feature is "good"; determining that a feature is bad is usually quicker.

> With comptime types, according to what I’ve read, you’ll have to bring your T to the comptime and find out right then and there if it will work. Without enough foresight it might not.

But the point is that all that is done at compile time, which is also the time when all more specialised features are checked.

> That’s not something I want. I just want generics or parametric polymorphism or whatever it is to work once it compiles.

Again, everything is checked at compile-time. Once it compiles it will work just like generics.

> I mean just let me use the language declaratively.

That's fine and expected. I believe that most language preferences are aesthetic, and there have been few objective reasons to prefer some designs over others, and usually it's a matter of personal preference or "extra-linguistic" concerns, such as availability of developers and libraries, maturity, etc..

> Now I’m not so sure?

Personally, I wouldn't dream of using Zig or Rust for important software because they're so unproven. But I do find novel designs fascinating. Some even match my own aesthetic preferences.

replies(1): >>43752003 #
36. pron ◴[] No.43751134[source]
It is novel to the point of being revolutionary. As I wrote in my comment, "the novelty isn't in the feature itself, but in the place it has in the language". It's one thing to come up with a feature. It's a whole other thing to position it within the language. Various compile-time evaluations are not even remotely positioned in D, Nim, or C++ as they are in Zig. The point of Zig's comptime is not that it allows you to do certain computations at compile-time, but that it replaces more specialised features such as templates/generics, interfaces, macros, and conditional compilation. That creates a completely novel simplicity/power balance.

If the presence of features is how we judge design, then the product with the most features would be considered the best design. Of course, often the opposite is the case. The absence of features is just as crucial for design as their presence. It's like saying that a device with a touchscreen and a physical keyboard has essentially the same properties as a device with only a touchscreen.

If a language has a mechanism that can do exactly what Zig's comptime does but it also has generics or templates, macros, and/or conditional compilation, then it doesn't have anything resembling Zig's comptime.

replies(1): >>43753970 #
37. User23 ◴[] No.43751628{5}[source]
And a modern compiler will jmp past the type checks if the inferencer OKs it!
38. keybored ◴[] No.43752003{3}[source]
> But the point is that all that is done at compile time, which is also the time when all more specialised features are checked.

> ...

> Again, everything is checked at compile-time. Once it compiles it will work just like generics.

No. When I use a library with a comptime type in Zig, my build is not guaranteed to compile, because my experience can depend on whether the library writer tested with the types (or compile-time inputs) that I am using.[1] That’s not a problem in Java or Haskell: if the library works for Mary it will work for John no matter what the type-inputs are.

> That's fine and expected. I believe that most language preferences are aesthetic, and there have been few objective reasons to prefer some designs over others, and usually it's a matter of personal preference or "extra-linguistic" concerns, such as availability of developers and libraries, maturity, etc..

Please don’t retreat to aesthetics. What I brought up is a concrete and objective user experience tradeoff.

[1] based on https://strongly-typed-thoughts.net/blog/zig-2025#comptime-i...

replies(1): >>43753831 #
39. keybored ◴[] No.43752052{5}[source]
No, you are clearly missing the point because I laid out concrete critiques about how Zig doesn’t replace certain concrete language features with One Thing to Rule Them All. All in reply to someone complimenting Zig on that same subject.

That you want to make a completely different point about macros gone wild is not my problem.

40. keybored ◴[] No.43752087[source]
> Because I would rather take something ranging from Java’s interface to Haskell’s typeclasses since once implemented, they’ll just work. With comptime types, according to what I’ve read, you’ll have to bring your T to the comptime and find out right then and there if it will work. Without enough foresight it might not.

This was perhaps a bad comparison and I should have compared e.g. Java generics to Zig’s comptime T.

41. msteffen ◴[] No.43752564{4}[source]
Here is, I think, an interesting example of the kind of thing TFA is talking about. In case you’re not already familiar, there’s an issue that game devs sometimes struggle with, where, in C/C++, an array of structs (AoS) has a nice syntactic representation in the language and is easy to work with/avoid leaks, but a struct of arrays (SoA) has a more compact layout in memory and better performance.

Zig has a library that allows you to have an AoS that is laid out in memory like a SoA: https://zig.news/kristoff/struct-of-arrays-soa-in-zig-easy-i... . If you read the implementation (https://github.com/ziglang/zig/blob/master/lib/std/multi_arr...) the SoA is an elaborately specialized type, parameterized on a struct type that it introspects at compile time.

It’s neat because one might reach for macros for this sort of the thing (and I’d expect the implementation to be quite complex, if it’s even possible) but the details of Zig’s comptime—you can inspect the fields of the type parameter struct, and the SoA can be highly flexible about its own fields—mean that you don’t need a macro system, and the Zig implementation is actually simpler than a macro approach probably would be.
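To make the layout point concrete, here is a hand-written C++ sketch of the two layouts (my illustration, not from the thread; `Particle` and `ParticlesSoA` are made-up names). The C++ version has to be spelled out per struct, which is exactly the boilerplate Zig's comptime field introspection removes:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// AoS: one contiguous array of whole structs; each element carries
// padding after `alive` so the floats stay aligned.
struct Particle {
    float x, y;
    std::uint8_t alive;
};
using ParticlesAoS = std::vector<Particle>;

// SoA: one array per field, densely packed. Hand-written here for a
// single struct; Zig's std.MultiArrayList derives the equivalent
// layout for any struct by introspecting its fields at comptime.
struct ParticlesSoA {
    std::vector<float> x, y;
    std::vector<std::uint8_t> alive;

    void push(const Particle& p) {
        x.push_back(p.x);
        y.push_back(p.y);
        alive.push_back(p.alive);
    }
    std::size_t size() const { return x.size(); }
};
```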

replies(1): >>43753986 #
42. naasking ◴[] No.43753736{4}[source]
Using a different type vs. a different syntax can be an important usability consideration, particularly since D also has templates and other features, where Zig provides only the comptime type for all of them. Homogeneity can also be a nice usability win, though there are downsides as well.
replies(1): >>43754154 #
43. pron ◴[] No.43753782[source]
That wouldn't be very meaningful. The semantics of Zig's comptime is more like that of subroutines in a dynamic language - say, JavaScript functions - than that of macros. The point is that it's executed, and yields errors, at a different phase, i.e. compile time.
44. pron ◴[] No.43753831{4}[source]
> No. My compile-time when using a library with a comptime type in Zig is not guaranteed to work because my user experience could depend on if the library writer tested with the types (or compile-time input) that I am using.[1] That’s not a problem in Java or Haskell: if the library works for Mary it will work for John no matter what the type-inputs are.

What you're saying isn't very meaningful. Even generics may impose restrictions on their type parameters (e.g. typeclasses in Haskell or type bounds in Java) and don't necessarily work for all types. In both cases you know at compile-time whether your types fit the bounds or not.

It is true that the restrictions in Haskell/Java are more declarative, but the distinction is more a matter of personal aesthetic preference, which is exactly what's expressed in that blog post (although comptime is about as different from C++ templates as it is from Haskell/Java generics). Like anything, and especially truly novel approaches, it's not for everyone's tastes, but neither are Java, Haskell, or Rust, for that matter. That doesn't make Zig's approach any less novel or interesting, even if you don't like it. I find Rust's design unpalatable, but that doesn't mean it's not interesting or impressive, and Zig's approach -- again, like it or not -- is even more novel.

replies(1): >>43754663 #
45. WalterBright ◴[] No.43753970{3}[source]
> Various compile-time evaluations are not even remotely positioned in D, Nim, or C++ as they are in Zig.

See my other reply. I don't understand your comment.

https://news.ycombinator.com/item?id=43748490

replies(1): >>43755065 #
46. WalterBright ◴[] No.43753986{5}[source]
D doesn't have a macro system, either, so I don't understand what you mean.
replies(1): >>43756684 #
47. WalterBright ◴[] No.43754154{5}[source]
Zig's use of comptime in a function argument makes it a template :-/

I bet if you use such a function with different comptime arguments, compile it, and dump the assembler you'll see that function appearing multiple times, each with somewhat different code generated for it.

replies(1): >>43754242 #
48. WalterBright ◴[] No.43754216{3}[source]
True, but consider that Forth and Lisp started out as interpreted languages, meaning the whole thing can be done at compile time. I haven't seen this feature before in a language that was designed to be compiled to machine code, such as C, Pascal, Fortran, etc.

BTW, D's ImportC C compiler does CTFE, too!! CTFE is a natural fit for C, and works like a champ. Standard C should embrace it.

replies(1): >>43754779 #
49. naasking ◴[] No.43754242{6}[source]
> Zig's use of comptime in a function argument makes it a template :-/

That you can draw an isomorphism between two things does not mean they are ergonomically identical.

replies(1): >>43755013 #
50. WalterBright ◴[] No.43754253{3}[source]
I fully agree that the difference is a matter of taste.

All living languages accrete features over time. D started out as a much more modest language. It originally eschewed templates and operator overloading, for example.

Some features were abandoned, too, like complex numbers and the "bit" data type.

51. keybored ◴[] No.43754663{5}[source]
> What you're saying isn't very meaningful. Even generics may impose restrictions on their type parameters (e.g. typeclasses in Zig or type bounds in Java) and don't necessarily work for all types. In both cases you know at compile-time whether your types fit the bounds or not.

Java type-bounds is what I mean with declarative. The library author wrote them, I know them, I have to follow them. It’s all spelled out. According to the link that’s not the case with the Zig comptime machinery. It’s effectively duck-typed from the point of view of the client (declaration).

I also had another source in mind which explicitly described how Zig comptime is “duck typed” but I can’t seem to find it. Really annoying.

> It is true that the restrictions in Haskell/Java are more declarative, but the distinction is more a matter of personal aesthetic preference, which is exactly what's expressed in that blog post (although comptime is about as different from C++ templates as it is from Haskell/Java generics).

It’s about as aesthetic as having spelled out reasons (usability) for preferring static typing over dynamic typing or vice versa. It’s really not. At all.

> , but that doesn't mean it's not interesting or impressive, and Zig's approach -- again, like it or not -- is even more novel.

I prefer meaningful leaps forward in programming language usability over supposed most-streamlined and clever approaches (comptime all the way down). I guess I’m just a pragmatist in that very narrow area.

replies(1): >>43754837 #
52. Someone ◴[] No.43754779{4}[source]
Nitpick: Lisp didn’t start out as an interpreted language. It started as an idea from a theoretical computer scientist, and wasn’t supposed to be implemented. https://en.wikipedia.org/wiki/Lisp_(programming_language)#Hi...:

"Steve Russell said, look, why don't I program this eval ... and I said to him, ho, ho, you're confusing theory with practice, this eval is intended for reading, not for computing. But he went ahead and did it. That is, he compiled the eval in my paper into IBM 704 machine code, fixing bugs, and then advertised this as a Lisp interpreter, which it certainly was. So at that point Lisp had essentially the form that it has today”

53. pron ◴[] No.43754837{6}[source]
> According to the link that’s not the case with the Zig comptime machinery. It’s effectively duck-typed from the point of view of the client (declaration).

It is "duck-typed", but it is checked at compile time. Unlike duck typing in JS, you know whether or not your type is a valid argument, just as you would with Java type bounds: the compiler lets you know. Everything is also spelled out, just in a different way.

> It’s about as aesthetic as having spelled out reasons (usability) for preferring static typing over dynamic typing or vice versa. It’s really not. At all.

But everything is checked statically, so all the arguments of failing fast apply here, too.

> I prefer meaningful leaps forward in programming language usability over supposed most-streamlined and clever approaches (comptime all the way down). I guess I’m just a pragmatist in that very narrow area.

We haven't had "meaningful leaps forward in programming language usability" in a very long time (and there are fundamental reasons for that, and indeed the situation was predicted decades ago). But if we were to have a meaningful leap forward, first we'd need some leap forward and then we could try learning how meaningful it is (which usually takes a very long time). I don't know that Zig's comptime is a meaningful leap forward or not, but as one of the most novel innovations in programming languages in a very long time, at least it's something that's worth a look.

replies(1): >>43765147 #
54. cassepipe ◴[] No.43754886[source]
You didn't read the article, because that's exactly the argument being made (whether or not you think these points have merit):

> My understanding is that Jai, for example, doesn’t do this, and runs comptime code on the host.

> Many powerful compile-time meta programming systems work by allowing you to inject arbitrary strings into compilation, sort of like #include whose argument is a shell-script that generates the text to include dynamically. For example, D mixins work that way:

> And Rust macros, while technically producing a token-tree rather than a string, are more or less the same

replies(1): >>43759831 #
55. pcwalton ◴[] No.43755013{7}[source]
When we're responding to quite valid points about other languages having essentially the same features as Zig with subjective claims about ergonomics, the idea that Zig comptime is "revolutionary" is looking awfully flimsy. I agree with Walter: Zig isn't doing anything novel. Picking some features while leaving others out is something that every language does; if doing that is enough to make a language "revolutionary", then every language is revolutionary. The reality is a lot simpler and more boring: for Zig enthusiasts, the set of features that Zig has appeals to them. Just like enthusiasts of every programming language.
replies(4): >>43755143 #>>43755308 #>>43757580 #>>43761282 #
56. pron ◴[] No.43755065{4}[source]
The revolution in Zig isn't in what the comptime mechanism is able to do, but how it allows the language to not have other features, which is what gives that language its power to simplicity ratio.

Let me put it like this: Zig's comptime is a general compilation time computation mechanism that has introspection capabilities and replaces generics/templates, interfaces/typeclasses, macros, and conditional compilation.

Think of a device whose main design feature is that it has a touchscreen but no keyboard. The novelty isn't the touchscreen; it's in the touchscreen eliminating the keyboard. The touchscreen itself doesn't have to be novel; the novelty is how it's used to eliminate the keyboard. If your device has a touchscreen and a keyboard, then it does not have the same design feature.

Zig's novel comptime is a mechanism that eliminates other specialised features, and if these features are still present, then your language doesn't have Zig's comptime. It has a touchscreen and a keyboard, whereas Zig's novelty is a touchscreen without a keyboard.

replies(1): >>43755336 #
57. pron ◴[] No.43755143{8}[source]
I'm sorry, but not being able to see that a design that uses a touchscreen to eliminate the keyboard is novel despite the touchscreen itself having been used elsewhere alongside a keyboard, shows a misunderstanding of what design is.

Show me a language that used a general-purpose compile-time mechanism to avoid specialised features such as generics/templates, interfaces/typeclasses, macros, and conditional compilation before Zig, and then I'll say that language was revolutionary.

I also find it hard to believe that you can't see how replacing all these features with a single one (that isn't AST macros) is novel. I'm not saying you have to think it's a good idea -- that's a matter of personal taste (at least until we can collect more objective data) -- but it's clearly novel.

I don't know all the languages in the world and it's possible there was a language that did that before Zig, but none of the languages mentioned here did. Of course, it's possible that no other language did that because it's stupid, but that doesn't mean it's not novel (especially as the outcome does not appear stupid on the face of it).

replies(1): >>43755588 #
58. CRConrad ◴[] No.43755195{3}[source]
> the feature isn’t! [sic]

To be, or not to be... The feature is not.

(IOW, English may not be the author's native language. I'm fairly sure it means "The feature doesn't exist".)

59. CRConrad ◴[] No.43755258{4}[source]
Damn, beat me by half a day.
60. matklad ◴[] No.43755308{8}[source]
>for Zig enthusiasts, the set of features that Zig has appeals to them. Just like enthusiasts of every programming language.

I find it rather amusing that it's a Java and a Rust enthusiast who are extolling the Zig approach here! I am not particularly well read with respect to programming languages, but I don't recall many languages which define a generic pair as

    fn Pair(A: type, B: type) type {
        return struct { fst: A, snd: B };
    }
The only one that comes to mind is 1ML, and I'd argue that it is also revolutionary.
replies(2): >>43755480 #>>43755521 #
61. WalterBright ◴[] No.43755336{5}[source]
The example of a comptime parameter to a function is a template, whether you call it that or not :-/ A function template is a function with compile time parameters.

The irony here is back in the 2000's, many programmers were put off by C++ templates, and found them to be confusing. Myself included. But when I (belatedly) realized that function templates were functions with compile time parameters, I had an epiphany:

Don't call them templates! Call them functions with compile time parameters. The people who were confused by templates understood that immediately. Then later, after realizing that they had been using templates all along, they became comfortable with templates.

BTW, I wholeheartedly agree that it is better to have a small set of features that can do the same thing as a larger set of features. But I'm not seeing how comptime is accomplishing that.

replies(1): >>43755439 #
62. pron ◴[] No.43755439{6}[source]
> But I'm not seeing how comptime is accomplishing that.

Because Zig does the work of C++'s templates, macros, conditional compilation, constexprs, and concepts with one relatively simple feature.

replies(1): >>43755658 #
63. pcwalton ◴[] No.43755480{9}[source]
Well, if you strip away the curly braces and return statement, that's just a regular type definition. Modeling generic types as functions from types to types is just System F, which goes back to 1975. Turing-complete type-level programming is common in tons of languages, from TypeScript to Scala to Haskell.

I think the innovation here is imperative type-level programming--languages that support type-level programming are typically functional languages, or functional languages at the type level. Certainly interesting, but not revolutionary IMO.

replies(1): >>43755656 #
64. WalterBright ◴[] No.43755521{9}[source]
I might be misunderstanding something, but this is how it works in D:

    struct Pair(A, B) { A fst; B snd; }

    Pair!(int, float) p; // declaration of p as instance of Pair
It's just a struct with the addition of type parameters.
65. pcwalton ◴[] No.43755588{9}[source]
But Zig's comptime only approximates the features you mentioned; it doesn't fully implement them. Which is what the original article is saying. To use your analogy, using a touchscreen to eliminate a keyboard isn't very impressive if your touchscreen keyboard is missing keys.

If you say that incomplete implementations count, then I could argue that the C preprocessor subsumes generics/templates, interfaces/typeclasses†, macros, and conditional compilation.

†Exercise for the reader: build a generics system in the C preprocessor that #error's out if the wrong type is passed using the trick in [1].

[1]: https://stackoverflow.com/a/45450646

replies(1): >>43756165 #
66. matklad ◴[] No.43755656{10}[source]
The thing is, this is not type-level programming, this is term-level programming. That there's no separate language of types is the feature. Functional/imperative is orthogonal. You can imagine functional Zig which writes

    Pair :: type -> type -> type
    let Pair a b = product a b 
This is one half of the innovation, dependent-types lite.

The second half is how every other major feature is expressed _directly_ via comptime/partial evaluation; not even syntax sugar is necessary. Generics, macros, and conditional compilation are the three big ones.

replies(1): >>43755774 #
67. WalterBright ◴[] No.43755658{7}[source]
From the article:

    fn print(comptime T: type, value: T) void {
That's a template. In D it looks like:

    void print(T)(T value) {
which is also a template.
replies(1): >>43755843 #
68. pcwalton ◴[] No.43755774{11}[source]
> This is one half of the innovation, dependent-types lite.

But that's not dependent types. Dependent types are types that depend on values. If all the arguments to a function are either types or values, then you don't have dependent types: you have kind polymorphism, as implemented for example in GHC extensions [1].

> The second half is how every other major feature is expressed _directly_ via comptime/partial evaluation, not even syntax sugar is necessary. Generic, macros, and conditional compilation are the three big ones.

I'd argue that not having syntactic sugar is pretty minor, but reasonable people can differ I suppose.

[1]: https://ghc.gitlab.haskell.org/ghc/doc/users_guide/exts/poly...

replies(1): >>43756112 #
69. pcwalton ◴[] No.43755843{8}[source]
I think another way to put it is that the fact that Zig reuses the keyword "comptime" to denote type-level parameters and to denote compile-time evaluation doesn't mean that there's only one feature. There are still two features (templates and CTFE), just two features that happen to use the same keyword.
replies(2): >>43756052 #>>43756175 #
70. pron ◴[] No.43756052{9}[source]
Maybe you can insist that these are two features (although I disagree), but calling one of them templates really misses the mark. That's because, at least in C++, templates have their own template-level language (of "metafunctions"), whereas that's not the case in Zig. E.g. that C++'s `std::enable_if` is just the regular `if` in Zig makes all the difference (and also shows why there may not really be two features here, only one).
replies(3): >>43756271 #>>43756316 #>>43760879 #
71. matklad ◴[] No.43756112{12}[source]
> Dependent types are types that depend on values.

Like this?

    fn f(comptime x: bool) if (x) u32 else bool {
        return if (x) 0 else false;
    }
replies(2): >>43756294 #>>43756555 #
72. pron ◴[] No.43756165{10}[source]
> But Zig's comptime only approximates the features you mentioned; it doesn't fully implement them

That's like saying that a touchscreen device without a keyboard only approximates a keyboard but doesn't fully implement one. The important thing is that the feature performs the duty of those other features.

> If you say that incomplete implementations count, then I could argue that the C preprocessor subsumes generics/templates, interfaces/typeclasses†, macros, and conditional compilation.

There are two problems with this, even if we assumed that the power of C's preprocessor is completely equivalent to Zig's comptime:

First, C's preprocessor is a distinct meta-language; one major point of Zig's comptime is that the metalanguage is the same language as the object language.

Second, it's unsurprising that macros -- whether they're more sophisticated or less -- can do the role of all those other features. As I wrote in my original comment (https://news.ycombinator.com/item?id=43745438) one of the exciting things about Zig is that a feature that isn't macros (and is strictly weaker than macros, as it's referentially transparent) can replace them for the most part, while enjoying a greater ease of understanding.

I remember that one of my first impressions of Zig was that it evoked the magic of Lisp (at least that was my gut feeling), but in a completely different way, one that doesn't involve AST manipulation and doesn't suffer from many of the problems that make Lisp macros problematic (i.e. creating DSLs with their own rules). I'm not saying it may not have other problems, but that is very novel.

I hadn't seen any such fresh designs in well over a decade. Now, it could be that I simply don't know enough languages, but you also haven't named other languages that work on this design principle, so I think my excitement was warranted. I'll let you know if I think that's not only a fresh and exciting design but also a good one in ten years.

BTW, I have no problem with you finding Zig's comptime unappealing to your tastes or even believing it suffers from fundamental issues that may prove problematic in practice (although personally I think that, when considering both pros and cons of this design versus the alternatives, there's some promise here). I just don't understand how you can say that the design isn't novel while not naming one other language with a similar core design: a mechanism for partial evaluation of the object language (with access to additional reflective operations) that replace those other features I mentioned (by performing their duty, if not exactly their mode of operation).

For example, I've looked at Terra, but it makes a distinction between the meta language and the object (or "runtime") language.

replies(2): >>43756323 #>>43758823 #
73. edflsafoiewq ◴[] No.43756175{9}[source]
They are the same thing though. Conceptually there's a partial evaluation pass whose goal is to eliminate all the comptimes by lowering them to regular runtime values. The apparent different "features" just arise from its operation on the different kinds of program constructs. To eliminate an expression, it evaluates the expression and replaces it with its value. To eliminate a loop, it unrolls it. To eliminate a call to a function with comptime arguments, it generates a specialized function for those arguments and replaces the call with a call to the specialized function.
74. sekao ◴[] No.43756271{10}[source]
Agreed. Zig's approach re-uses the existing machinery of the language far more than C++ templates do. Another example of this is that Zig has almost no restrictions on what kinds of values can be `comptime` parameters. In C++, "non-type template parameters" are restricted to a small subset of types (integers, enums, and a few others). Rust's "const generics" are even more restrictive: only integers for now.

In Zig I can pass an entire struct instance full of config values as a single comptime parameter and thread it anywhere in my program. The big difference here is that when you treat compile-time programming as a "special" thing that is supported completely differently in the language, you need to add these features in a painfully piecemeal way. Whereas if it's just re-using the machinery already in place in your language, these restrictions don't exist and your users don't need to look up what values can be comptime values...they're just another kind of thing I pass to functions, so "of course" I can pass a struct instance.

replies(1): >>43758219 #
75. edflsafoiewq ◴[] No.43756294{13}[source]
No, dependent types depend on runtime values.
replies(1): >>43756340 #
76. edflsafoiewq ◴[] No.43756316{10}[source]
std::enable_if exists to disable certain overloads during overload resolution. Zig has no overloading, so it has no equivalent.
replies(1): >>43756510 #
77. matklad ◴[] No.43756323{11}[source]
>I'm not saying it may not have other problems, but that is very novel.

Just to explicitly acknowledge this: it inherits the C++ problem that you don't get type errors inside a function until you call the function, and when that happens, it's not always immediately obvious whether the problem is in the caller or in the callee.

78. matklad ◴[] No.43756340{14}[source]
Yeah, that one Zig can not do, hence "-lite".
replies(1): >>43756634 #
79. matklad ◴[] No.43756510{11}[source]
I'd flip it over and say that C++ has overloading & SFINAE to enable polymorphism which it otherwise can't express.
replies(1): >>43756683 #
80. pcwalton ◴[] No.43756555{13}[source]
That's still just a function of type ∀K∀L.K → L with a bound on K. From a type theory perspective, a comptime argument, when the function is used in such a way as to return a type, is not a value, even though it looks like one. Rather, true or false in this context is a type. (Yes, really. This is a good example of why Zig reusing the keyword "comptime" obscures the semantics.) If comptime true or comptime false were actually values, then you could put runtime values in there too.
81. pcwalton ◴[] No.43756634{15}[source]
The point is that comptime isn't dependent types at all. If your types can't depend on runtime values, they aren't dependent types. It's something more like kind polymorphism in GHC (except more dynamically typed), something which GHC explicitly calls out as not dependent types. (Also it's 12 years old [1]).

[1]: https://www.seas.upenn.edu/~sweirich/papers/fckinds.pdf

82. edflsafoiewq ◴[] No.43756683{12}[source]
Such as? The basic property of overloading is it's open. Any closed set of overloads can be converted to a single function which does the same dispatch logic with ifs and type traits (it may not be very readable).
83. msteffen ◴[] No.43756684{6}[source]
IIUC, it does have code generation—the ability to generate strings at compile-time and feed them back into the compiler.

The argument that the author of TFA is making is that Zig's comptime is a very limited feature (which, they argue, is good: it restricts users from introducing architecture dependencies/cross-compilation bugs, is more amenable to optimization, etc.), and yet it allows users to do most of the things that more general alternatives (such as code generation or a macro system) are often used for.

In other words, while Zig of course didn't invent compile-time functions (see Lisp macros), it is notable and useful from a PL perspective if Zig users are doing things that seem to require macros or code generation without actually having those features. D users, in contrast, do have code generation.

Or, alternatively, while many languages support metaprogramming of some kind, Zig’s metaprogramming language is at a unique maxima of safety (which macros and code generation lack) and utility (which e.g. Java/Go runtime reflection, which couldn’t do the AoS/SoA thing, lack)

Edit: Ok, I think Zig comptime expressions are just like D templates, like you said. The syntax is nicer than C++ templates. Zig’s “No host leakage” (to guarantee cross-compile-ability) looks like the one possibly substantively different thing.

replies(1): >>43758236 #
84. naasking ◴[] No.43757580{8}[source]
> Picking some features while leaving others out is something that every language does; if doing that is enough to make a language "revolutionary", then every language is revolutionary.

Picking a set of well-motivated and orthogonal features that combine well in flexible ways is definitely enough to be revolutionary if that combination permits expressive programming in ways that used to be unwieldy, error-prone or redundant, e.g. "redundant" in the sense that you have multiple ways of expressing the same thing in overlapping but possibly incompatible ways. It doesn't follow that every language must be revolutionary just because they pick features too; there are conditions to qualify.

For systems programming, I think Zig is revolutionary. I don't think any other language matches Zig's cross-compilation, cross-platform and metaprogramming story in such a simple package. And I don't even use Zig, I'm just a programming language theory enthusiast.

> I agree with Walter: Zig isn't doing anything novel.

"Novel" is relative. Anyone familiar with MetaOCaml wouldn't have seen Zig as particularly novel in a theoretical sense, as comptime is effectively a restricted multistage language. It's definitely revolutionary for an industry language though. I think D has too much baggage to qualify, even if many Zig expressions have translations into D.

85. WalterBright ◴[] No.43758219{11}[source]
> Zig has almost no restrictions on what kinds of values can be `comptime` parameters.

Neither does D. The main restriction is that CTFE needs to be pure, i.e. you cannot call the operating system in CTFE (this is a deliberate restriction, mainly to avoid clever malware).

CTFE isn't "special" in D, either. CTFE is triggered for any instance of a "constant expression" in the grammar, and doesn't require a keyword.

86. WalterBright ◴[] No.43758236{7}[source]
> Zig’s “No host leakage” (to guarantee cross-compile-ability) looks like the one possibly substantively different thing.

That is a good idea, but it could be problematic if one relies on size_t, which changes in size from 32 to 64 bits. D's CTFE adds checks for undefined behavior, such as shifting by more bits than are in the type being shifted. These checks are not done at runtime for performance reasons.

D's CTFE also does not allow calling the operating system, and only works on functions that are "pure".

87. pcwalton ◴[] No.43758823{11}[source]
> The important thing is that the feature performs the duty of those other features.

Zig's comptime doesn't do everything that Rust (or Java, or C#, or Swift, etc.) generics do, and I know you know this given your background in type theory. Zig doesn't allow for the inference and type-directed method resolution that Rust or the above languages do, because the "generics" that you create using Zig comptime aren't typechecked until they're instantiated. You can improve the error messages using "comptime if" or whatever Zig calls it (at the cost of a lot of ergonomics), but the compiler still can't reliably typecheck the bodies of generic functions before the compiler does the comptime evaluation.

Now I imagine you think that this feature doesn't matter, or at least doesn't matter enough to be worth the complexity it adds to the compiler. (I disagree, of course, because I find reliable IDE autocomplete and inline error messages to be enormously useful when writing generic Rust functions.) But that's the entire point: Zig comptime is not performing the duty of generics; it's approximating generics in a way that offers a tradeoff.

When I first looked at Zig comptime, it didn't evoke the "magic of Lisp" at all in me (and I do share an appreciation of simplicity in programming languages, though I feel like Scheme offers more of that than Lisp). Rather, my reaction was "oh, this is basically just what D does", having played with D a decent amount in years prior. Nothing I've seen in the intervening years has changed that impression. Zig's metaprogramming features are a spin on metaprogramming facilities that D thoroughly explored over a decade before Zig came on the scene.

Edit: Here's an experiment. Start with D and start removing features: GC, the class system, exceptions, etc. etc. Do you get to something that's more or less Zig modulo syntax? From what I can tell, you do. That's what I mean by "not revolutionary".

replies(1): >>43760743 #
88. baranul ◴[] No.43759831{3}[source]
My comment is a reply to another reader, not to the article directly. The push back was on the nature of their comment.

> the uniqueness of Zig's comptime...

> You can like it or not, but it is very interesting and very novel...

While such features in Zig can be interesting, they are not particularly novel (as other highly knowledgeable readers have pointed out). Zig's comptime is often marketed or hyped as being special, while overlooking that other languages often do similar things, but have their own perspectives and reasoning on how metaprogramming and those types of features fit into their language. Not to mention, metaprogramming has its downsides too. It's not all roses.

The article does seek to make comparisons with other languages, but arguably out of context, as to what those languages are trying to achieve with their feature sets. Comptime should not be looked at in a bubble, but as part of the language as a whole.

A language creator with an interesting take on metaprogramming in general is Ginger Bill (of Odin), who often has enthusiasts pressuring him into making more extensive use of it in his language. He pushes back because of various problems it can cause, and has argued that he often comes up with optimal solutions without it. There are different sides to the story with regard to usage and goals, relative to the various languages being considered.

89. pron ◴[] No.43760743{12}[source]
> Zig doesn't allow for the inference and type-directed method resolution that Rust or the above languages do

Well, but Zig also doesn't allow for overloads and always opts for explicitness regardless of comptime, so I would say that's consonant with the rest of the design.

> Now I imagine you think that this feature doesn't matter, or at least doesn't matter enough to be worth the complexity it adds to the compiler.

I don't care too much about the complexity of the compiler (except in how compilation times are affected), but I do care about the complexity of the language. And yes, there are obviously tradeoffs here, but they're not the same tradeoffs as C++ templates and I think it's refreshing. I can't yet tell how "good" the tradeoff is.

> Here's an experiment. Start with D and start removing features: GC, the class system, exceptions, etc. etc. Do you get to something that's more or less Zig modulo syntax?

I don't know D well enough to tell. I'd probably start by looking at how D would do this [1]: https://ziglang.org/documentation/master/#Case-Study-print-i...

For instance, the notion of a comptime variable (for which I couldn't find an analogue in D) is essential to the point that the "metalanguage" and the object language are pretty much the same language.

Interestingly, in Zig, the "metalanguage" is closer to being a superset of the object language whereas in other languages with compile-time phases, the metalanguage, if not distinct, is closer to being a subset. I think Terra is an interesting point of comparison, because there, while distinct, the metalanguage is also very rich.

[1] which, to me, gives the "magical Lisp feeling" except without macros.

replies(1): >>43764684 #
90. TinkersW ◴[] No.43760879{10}[source]
std::enable_if is not the correct comparison, I think you mean "if constexpr"

enable_if is mostly deprecated, and was used for overloading, not branching; you can use concepts now instead

91. pron ◴[] No.43761282{8}[source]
> Picking some features while leaving others out is something that every language does; if doing that is enough to make a language "revolutionary", then every language is revolutionary.

You can say that about the design of any product. Yet, once in a while, we get revolutionary designs (even if every feature in isolation is not completely novel) when the choice of what to include and what to leave out is radically different from other products in the same category in a way that creates a unique experience.

92. WalterBright ◴[] No.43764684{13}[source]
> the notion of a comptime variable (for which I couldn't find an analogue in D)

A comptime variable in D would look like:

    enum v = foo(3);
Since an enum initialization is a ConstExpression, its initializer must be evaluated at compile time.

A comptime function parameter in D looks like:

    int mars(int x)(int y) { ... }
where the first parameter list consists of compile time parameters, and the second the runtime parameters.

D does not have a switch-over-types statement, but the equivalent can be done with a sequence of static-if statements:

    static if (is(T == int)) { ... }
    else static if (is(T == float)) { ... }
Static If is always evaluated at compile time. The IsExpression does pattern matching on types.
replies(1): >>43764796 #
93. pron ◴[] No.43764796{14}[source]
A comptime variable in Zig isn't a constant whose value is computed at compile time (that would just be a Zig constant) but rather a variable that's potentially mutated by comptime code: https://ziglang.org/documentation/master/#Compile-Time-Varia...

This is one of the things that allow the "comptime language" to just be Zig, as in this example: https://ziglang.org/documentation/master/#Case-Study-print-i...

replies(1): >>43765233 #
94. keybored ◴[] No.43765147{7}[source]
> It is "duck-typed", but it is checked at compile time. Unlike ducktyping in JS, you know whether or not your type is a valid argument just as you would for Java type bounds -- the compiler lets you know. Everything is also all spelled out, just in a different way.

At this point I will have to defer to Zig users.

But the wider point stands whether I am correct about Zig usability or not (mostly leaning on the aforelinked URLs). Plenty of things can be compile-time and yet have widely different usability. Something that relies on unconstrained build-time code generation can be much harder to use than macros, which in turn can be harder to use than something like “constant expressions”, and so on.

95. pcwalton ◴[] No.43765233{15}[source]
You can mutate variables at compile time in D. See the compile time Newton's method example: https://tour.dlang.org/tour/en/gems/compile-time-function-ev...
replies(1): >>43765429 #
96. pron ◴[] No.43765429{16}[source]
I don't think that's the same thing (rather, it's more like ordinary Zig variables in code that's evaluated at compile-time), as there's no arbitrary mixing of compile-time and runtime computation. Again, compare with https://ziglang.org/documentation/master/#Case-Study-print-i...

Anyway, I found this article that concludes that D's compile time evaluation is equivalent in power to Zig's, although it also doesn't cover how comptime variables can be used in Zig: https://renato.athaydes.com/posts/comptime-programming

However, as I've said many times, knowing about the theoretical power of partial evaluation, what excites me in Zig isn't what comptime can do (although I am impressed with the syntactic elegance of the mechanism) but how it is used to avoid adding other features.

A phone with a touchscreen is evolutionary; a phone without a keypad is revolutionary. The revolution is in the unique experience of using "just comptime" for many things.

It is, of course, a tradeoff, and whether or not that tradeoff is "good" remains to be seen, but I think this design is one of the most novel designs in programming languages in many, many years.