
I Am An AI Hater

(anthonymoser.github.io)
443 points by BallsInIt | 2 comments
danielbln ◴[] No.45044286[source]
Yet it is here to stay, and even if it never gets any better at the useful things it does, it is already useful. The externalities are real; some can be removed, some mitigated. If you're a hater and a human, then you don't have to mitigate anything, of course.

Me, I hate the externalities, but I love the thing. I want to use my own AI, hyper-optimized, efficient, and private. It would mitigate a lot. Maybe someday.

replies(6): >>45044341 #>>45044555 #>>45044591 #>>45044671 #>>45053794 #>>45115936 #
Mallowram[dead post] ◴[] No.45044341[source]
[dead]
danielbln ◴[] No.45044417[source]
AI is information.
replies(3): >>45044507 #>>45044539 #>>45044626 #
1. Mallowram ◴[] No.45044626{3}[source]
Even Shannon knew the limits of information late in his career. AI is not information, it's signaling. And it embeds dominance, bias, control, and manipulation without deciphering or segregating them: the dark matter of language we can't extract.

"Shannon warned in 1956 that information theory “has perhaps been ballooned to an importance beyond its actual accomplishments” and that information theory is “not necessarily relevant to such fields as psychology, economics, and other social sciences.” Shannon concluded: “The subject of information theory has certainly been sold, if not oversold.” [Claude E. Shannon, “The Bandwagon,” IRE Transactions on Information Theory, Vol. 2, No. 1 (March 1956), p. 3.]"

replies(1): >>45053860 #
2. lif ◴[] No.45053860[source]
Signaling? That is correct. Also, am willing to place a _very_ long bet that clay tablets will be around longer.