
LLM Inevitabilism

(tomrenner.com)
1612 points by SwoopsFromAbove | 1 comment
JimmaDaRustla No.44571157
The author seems to imply that the "framing" of an argument is done in bad faith in order to win it, but he only provides one-line quotes with no contextual argument around them.

This tactic by the author is itself a straw-man argument: he frames the position of tech leaders, and our acceptance of it, as the reason AI exists, instead of being honest about the fact that they were simply right in their predictions. AI was inevitable.

The IT industry is full of pride and arrogance. We deny the power of AI and LLMs. I think that's fair, and I welcome the pushback. But the real word the IT crowd needs to learn is "denialism": if you still don't see how LLMs are changing our entire industry, you haven't been paying attention.

Edit: Lots of denialists are using false-dichotomy arguments to claim that my opinion is invalid because I'm not producing examples and proof. I guess I'll just leave this: https://tools.simonwillison.net/

philipwhiuk No.44571325
> AI was inevitable.

This is post hoc ergo propter hoc: AI exists, thus it must have been inevitable.

You have no proof it was inevitable.

(Also, "AI" means something wildly different from what it meant a few years ago. I remember when AI meant AGI; the salesmen have persuaded you the emperor has clothes because they solved a single compelling test.)

dinfinity No.44572086
> This is post hoc ergo propter hoc: AI exists, thus it must have been inevitable.

The problem with that statement is that it doesn't say anything about why AI will take over pretty much everything.

The actual answer to that is that AI is not limited by a biological substrate and can thus:

1. Harness (close to) the speed of light for internal signals. Biology is limited to about 200 m/s, roughly six orders of magnitude slower; AI has no such limitation.

2. Scale very easily. Human brains are limited in how big they can get by silly things such as the width of the birth canal and sitting on top of a (bipedal) body that uses organic mass to generate power inefficiently. Scaling a human brain beyond its current size and the ~20 watts it draws is an incredibly hard engineering challenge. For AI, scaling is trivial by comparison (see the rough sketch after this list).
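
A quick back-of-the-envelope check of those two points, sketched in Python. The numbers are rough ballpark assumptions on my part (200 m/s for fast myelinated axons, ~20 W for a brain, ~700 W for a single datacenter GPU), not figures from the article:

    import math

    SPEED_OF_LIGHT_M_S = 299_792_458   # ceiling for electronic/optical signalling
    NEURON_SIGNAL_M_S = 200            # assumed rough upper bound for myelinated axons

    ratio = SPEED_OF_LIGHT_M_S / NEURON_SIGNAL_M_S
    print(f"signal speed ratio: {ratio:.1e} (~{math.log10(ratio):.1f} orders of magnitude)")

    BRAIN_WATTS = 20    # commonly cited estimate for a human brain
    GPU_WATTS = 700     # assumed ballpark for one datacenter GPU

    # The brain can't be scaled up; hardware can, limited mainly by the power budget.
    print(f"one GPU draws ~{GPU_WATTS / BRAIN_WATTS:.0f}x the power of a brain")

Running it gives a speed ratio of about 1.5e6, i.e. roughly six orders of magnitude, which is where the figure in point 1 comes from.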

I'm not saying it's going to be LLMs, but long term we can say that the intelligent entities that will surpass us will not have the same biological and physical limitations that we do. That means they very, very probably have to be 'artificial', and thus that AI taking over everything is 'inevitable'.