
63 points tejonutella | 5 comments
1. seydor No.43304274
Crap article with zero insight that wasn't worth my time.

There are better ways to argue the same position, but it's probably indefensible.

Neural networks are fundamentally approximators, whether they are approximating long-range relations between concepts (as LLMs do) or denoising noise into images. They approximate thought and intelligence, because we have encoded both in our writings. Our writings are a complete fingerprint of the thought process, because we can pick up any thought process from a book without ever seeing or otherwise having contact with the author. Therefore there is an increasingly high-fidelity "real thought" in there.
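The "approximators" claim is easy to see in miniature. A hedged sketch (a hypothetical toy, not anything from the article; the target function, network size, and hyperparameters are arbitrary choices): a one-hidden-layer tanh network fit by plain full-batch gradient descent to approximate sin(x).

```python
import numpy as np

# Toy illustration of "neural networks are approximators": fit a
# one-hidden-layer tanh network to sin(x) with full-batch gradient
# descent. All sizes, the seed, and the learning rate are arbitrary.
rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

W1 = rng.normal(0.0, 1.0, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.1, (32, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)            # hidden activations
    return h, h @ W2 + b2               # (hidden, prediction)

def mse(pred):
    return float(np.mean((pred - y) ** 2))

loss_before = mse(forward(x)[1])

lr = 0.05
for _ in range(5000):
    h, pred = forward(x)
    err = (pred - y) / len(x)           # averaged MSE gradient (up to a factor of 2)
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = (err @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
    gW1 = x.T @ dh; gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

loss_after = mse(forward(x)[1])
print(f"mse before: {loss_before:.4f}  after: {loss_after:.4f}")
```

Nothing here says the network "understands" sine; it only drives an error measure down, which is the sense of "approximate" at issue.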

> that in and of itself is a monumental achievement but there is no real thought involved.

Pretty much anything that requires our current level of thought is therefore reachable with ANNs now that we know how to train them in depth and width.

The real question is whether this is enough. We want ASI, but our texts only contain AGI, and everything that comes from biology (including intelligence) scales logarithmically. There is zero evidence that language models will ever learn to create abstract entities better than any collection of humans does. AI companies are advertising armies of PhD students, but we already have millions of PhD students, yet our most pressing problems have seen little progress for decades. That's what should worry us, not the fact that we will all lose our jobs.

replies(2): >>43304490 #>>43309287 #
2. ninetyninenine No.43304490
>yet our most pressing problems have not made a lot of progress for decades.

You mean energy and global warming? We will pretty much hit the worst-case scenario for both. And we've been yapping about these problems for decades, so nobody cares, even though they are likely 10000x more relevant.

AI is new and interesting.

replies(1): >>43304503 #
3. seydor No.43304503
No, cancer and Alzheimer's.
replies(1): >>43304511 #
4. ninetyninenine No.43304511
Oh, same deal. We've been yapping about those for years. We've probably reached our limits in terms of the technology to deal with these things.

Another thing with AI is the insane progress and the clear knowledge that we are well below the limit.

5. Jensson No.43309287
> Our writings are a complete fingerprint of the thought process, because we are able to pick up any thought process from a book, without ever seeing or otherwise having contact with the author

That doesn't follow: even a simple encryption program can make that possible without an intermediate actor being able to crack it.

Humans all run very similar software/hardware, so we can read what each other writes, but so far the computer isn't close to having the same behind-the-scenes thoughts. Meaning all the things we humans leave out when we write are actually important, so the text isn't everything.
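The encryption analogy can be made concrete with a toy one-time pad (a hypothetical illustration; the message and variable names are made up): the bytes that travel carry the full message, yet only a receiver who already shares something out-of-band, the key here, standing in for shared human cognitive machinery, can recover it.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # One-time-pad style XOR; requires len(key) >= len(data).
    return bytes(d ^ k for d, k in zip(data, key))

message = b"the thought process"
key = secrets.token_bytes(len(message))   # shared out-of-band, never written down

ciphertext = xor_bytes(message, key)      # the only thing that travels as "text"
recovered = xor_bytes(ciphertext, key)    # readable only with the shared key
```

So a text being readable by some receiver doesn't prove the text alone is a complete fingerprint of the sender's process; the receiver may be supplying the missing half.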