
432 points | tosh | 1 comment | source
perbu ◴[] No.39998892[source]
I have this 300 line Go application which manages git tags for me. I asked it to implement a -dry-run function. It failed twice. First time it just mangled the file. Second time it just made code that didn't do anything.

I asked it to rename a global variable. It broke the application because it failed to understand scoping rules.

Perhaps it is bad luck, or perhaps my Go code is weird, but I don't understand how y'all wanna trust this.
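
For what it's worth, the feature being asked for is simple: a -dry-run flag in a Go CLI is usually just a boolean gate in front of the mutating calls. A minimal sketch of that pattern (the names and commands here are hypothetical, not my actual code):

```go
package main

import (
	"flag"
	"fmt"
	"os/exec"
	"strings"
)

// dryRun gates every mutating git command behind the -dry-run flag.
var dryRun = flag.Bool("dry-run", false, "print git commands instead of running them")

// runGit either executes a git command or, under -dry-run, only prints it.
func runGit(args ...string) error {
	if *dryRun {
		fmt.Println("would run: git " + strings.Join(args, " "))
		return nil
	}
	return exec.Command("git", args...).Run()
}

func main() {
	flag.Parse()
	// Hypothetical example: deleting a tag respects -dry-run.
	if err := runGit("tag", "-d", "v1.0.0"); err != nil {
		fmt.Println("error:", err)
	}
}
```

That's the whole change: route mutating calls through one wrapper and check a flag. Not a lot of room to mangle a file.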

replies(5): >>39999154 #>>39999426 #>>40000870 #>>40001228 #>>40001735 #
Culonavirus ◴[] No.39999426[source]
It must be your app/lang/prompt/grandma/dog/... lol. LLMs are the future, and they will replace Allllllll the coders in the woooorld (TM), and did you know "it" can create websites??? Wooo, let's go, baby!

Nah, these things are all stupid as hell. Any back and forth between a human and an LLM on problem-solving coding tasks is an absolute disaster.

People here, and certainly in the mainstream population, see some knowledge and just naturally expect intelligence to go with it. But it doesn't. Wikipedia has knowledge. Books have knowledge. LLMs are just the latest iteration of how humans store knowledge. That's about it; everything else is a hyped-up bubble. There's nothing in physics that stops us from creating an artificial, generally intelligent being, but it's NEVER going to be done with auto-regressive next-token prediction.

replies(2): >>40000052 #>>40005061 #
Seb-C ◴[] No.40000052[source]
LLMs don't store information, though.

Language is a tool to convey information. LLMs are only about the language, not the information.

replies(1): >>40001402 #