
Open-source Zig book

(www.zigbook.net)
692 points | rudedogg | 5 comments
jasonjmcghee No.45948044
So despite this...

> The Zigbook intentionally contains no AI-generated content—it is hand-written, carefully curated, and continuously updated to reflect the latest language features and best practices.

I just don't buy it. I'm 99% sure this is written by an LLM.

Can the author... convince me otherwise?

> This journey begins with simplicity—the kind you encounter on the first day. By the end, you will discover a different kind of simplicity: the kind you earn by climbing through complexity and emerging with complete understanding on the other side.

> Welcome to the Zigbook. Your transformation starts now.

...

> You will know where every byte lives in memory, when the compiler executes your code, and what machine instructions your abstractions compile to. No hidden allocations. No mystery overhead. No surprises.

...

> This is not about memorizing syntax. This is about earning mastery.

replies(13): >>45948094 >>45948100 >>45948115 >>45948220 >>45948287 >>45948327 >>45948344 >>45948548 >>45948590 >>45949076 >>45949124 >>45950417 >>45951487
PaulRobinson No.45948094
You can't just say that a linguistic style "proves" or even "suggests" AI. Remember, AI is just spitting out things it's seen before elsewhere. There are plenty of other texts I've seen with this sort of writing style, written long before AI was around.

Can I also ask: so what if it is or it isn't?

While AI slop is infuriating and the bubble hype is maddening, I'm not sure that declaring content "must" be AI every time somebody dislikes its style, and then debating whether it is or isn't, is any less maddening. It feels like all content published now gets debated like this, and I'm definitely not enjoying it.

replies(1): >>45948343
1. maxbond No.45948343
You can be skeptical of anything, but I think it's silly to say that these "Not just A, but B" constructions don't strongly suggest that it's generated text.

As to why it matters, doesn't it matter when people lie? Aren't you worried about the veracity of the text if it's not only generated but was presented otherwise? That wouldn't erode your trust that the author reviewed the text and corrected any hallucinations even by an iota?

replies(1): >>45948873
2. geysersam No.45948873
> but I think it's silly to say that these "Not just A, but B" constructions don't strongly suggest ai generated text

Why? Didn't people use such constructions frequently before AI? Some authors probably overused them at the same frequency AI does.

replies(1): >>45949021
3. maxbond No.45949021
I don't think there was very much abuse of "not just A, but B" before ChatGPT. I think that's more a product of RLHF than of the initial training. Very few people wrote in the incredibly overwrought and flowery style of AI, and the English-speaking Internet, where most of the (English-language) training data was sourced from, is largely casual, everyday language. I imagine other language communities on the Internet are similar, but I wouldn't know.

Don't we all remember 5 years ago? Did you regularly encounter people who wrote like every follow-up question was absolutely brilliant and every document was life-changing?

I think about why's (poignant) Guide to Ruby [1], a book explicitly about how learning to program is a beautiful experience. And its language is still pedestrian compared to the language in this book. Because most people find writing like that saccharine, and so don't write that way. Even when they're writing poetically.

Regardless, some people born in England can speak French with a French accent. If someone speaks French to you with a French accent, where are you going to guess they were born?

[1] https://poignant.guide/book/chapter-1.html

replies(1): >>45949406
4. PaulRobinson No.45949406
It's been alleged that a major source of training data for many LLMs was libgen and SciHub - hardly casual.
replies(1): >>45949477
5. maxbond No.45949477
Even if that were comparable in size to the conversational Internet, how many novels and academic papers have you read that used multiple "not just A, but B" constructions in a single chapter or paper (and that were not written by or about AI)?