"AI" is using an algorithm and statistics in the same way---it's just more accurate at making intelligible sentences than the example above. I wouldn't call either thinking, would you?
"AI" is using an algorithm and statistics in the same way---it's just more accurate at making intelligible sentences than the example above. I wouldn't call either thinking, would you?
Now we can churn out text at unprecedented rates, and the original problem, that no one is reading, is left untouched.
The author wonders what happens when the weird lossy gap between these processes gets worse.
There’s lots of evidence that writing helps formulate good thinking. Interestingly, chain-of-thought (CoT) reasoning mirrors this, even if the underlying mechanisms differ. So while I wouldn’t call this thinking, I also don’t think reducing LLM output to mere algorithmic output exactly captures what’s happening either.
EDIT: previous != precious.
I think you're missing my point a bit.
Any text that can be churned out at unprecedented rates likely isn't worth reading (or writing, or looking at, or listening to), and anyone consuming this stuff already isn't doing much thinking.
You can lead a horse etc etc