
1062 points by mixto | 1 comment | source
rco8786 ◴[] No.42941843[source]
I have nothing but fond memories of reading Beej's guides.

This sort of work is also becoming less necessary with AI, for better or worse. This appears to be a crazy good guide, but I bet asking e.g. Claude to teach you git (either specific concepts, or having it generate the whole guide outline and go wide on it) would be at least as good.

replies(5): >>42941937 #>>42941996 #>>42942409 #>>42942699 #>>42949624 #
yoyohello13 ◴[] No.42941937[source]
Seems more efficient to have one reference book rather than generating an entire new 20-chapter book for every person.

I also think that if you are at the “don’t know what you don’t know” point of learning a topic, it’s very hard to direct an AI to generate comprehensive learning material.

replies(3): >>42941989 #>>42942007 #>>42942218 #
jonahx ◴[] No.42942218[source]
> Seems more efficient to have one reference book rather than generating entire new 20 chapter books for every person.

The main advantage of LLMs is that you can ask specific questions about the things that confuse you, which makes iterating toward a correct mental model much faster. It's like having your own personal tutor at your beck and call. Good guidebooks try to do this statically, anticipating questions and confusions at the right points, and doing that well is a great skill. But it's still not the same as full interactivity.

replies(3): >>42942306 #>>42942455 #>>42942571 #
nealabq ◴[] No.42942571[source]
This is a bit of a stretch, but it's a little like distillation: you're extracting patterns from the vast knowledge of the LLM and inserting them into your brain. You have an incomplete or uncertain mental model, and you ask a tutor to fill in the blanks.

Although maybe I'm stretching the analogy too far.
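
For anyone who hasn't met the term: in ML, distillation usually means training a small "student" model to match a big "teacher" model's full output distribution, not just its top answer. A rough sketch in plain Python (the logits and temperature here are made up, purely to illustrate the loss):

    import math

    def softmax(logits, temperature=1.0):
        # Softened distribution; a higher temperature exposes more of
        # the teacher's "dark knowledge" about the non-top answers.
        scaled = [z / temperature for z in logits]
        m = max(scaled)  # subtract max for numerical stability
        exps = [math.exp(z - m) for z in scaled]
        total = sum(exps)
        return [e / total for e in exps]

    def kl_divergence(p, q):
        # KL(p || q): information lost when q (the student) is used
        # to approximate p (the teacher).
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    # Hypothetical logits over three candidate answers to one question.
    teacher_logits = [4.0, 1.5, 0.2]  # big model: confident but nuanced
    student_logits = [2.0, 1.8, 1.0]  # small model: still fuzzy

    T = 2.0  # temperature softens both distributions
    teacher_p = softmax(teacher_logits, T)
    student_p = softmax(student_logits, T)

    # The distillation loss the student minimizes during training.
    print(f"KL(teacher || student) = {kl_divergence(teacher_p, student_p):.4f}")

The temperature trick is the part that maps onto tutoring: softening both distributions forces the student to learn the teacher's relative uncertainties, which is roughly what a tutor's partial, hedged answers do for your mental model.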