
LLM Inevitabilism

(tomrenner.com)
1613 points SwoopsFromAbove | 29 comments
keiferski ◴[] No.44568304[source]
One of the negative consequences of the “modern secular age” is that many very intelligent, thoughtful people feel justified in brushing away millennia of philosophical and religious thought because they deem it outdated or no longer relevant. (The book A Secular Age is a great read on this, btw, I think I’ve recommended it here on HN at least half a dozen times.)

And so a result of this is that they fail to notice the same recurring psychological patterns that underlie thoughts about how the world is, and how it will be in the future - and then adjust their positions because of this awareness.

For example - this AI inevitabilism stuff is not dissimilar to many ideas originally from the Reformation, like predestination. The notion that history is just on some inevitable pre-planned path is not a new idea, except now the actor has changed from God to technology. On a psychological level it’s the same thing: an offloading of freedom and responsibility to a powerful, vaguely defined force that may or may not exist outside the collective minds of human society.

replies(15): >>44568532 #>>44568602 #>>44568862 #>>44568899 #>>44569025 #>>44569218 #>>44569429 #>>44571000 #>>44571224 #>>44571418 #>>44572498 #>>44573222 #>>44573302 #>>44578191 #>>44578192 #
1. evantbyrne ◴[] No.44571000[source]
I'm pretty bearish on the idea that AGI is going to take off anytime soon, but I read a significant amount of theology growing up and I would not describe the popular essays from e.g., LessWrong as religious in nature. I also would not describe them as appearing poorly read. The whole "look they just have a new god!" is a common trope in religious apologetics that is usually just meant to distract from the author's own poorly constructed beliefs. Perhaps such a comparison is apt for some people in the inevitable AGI camp, but their worst arguments are not where we should be focusing.
replies(7): >>44571085 #>>44571353 #>>44571601 #>>44572817 #>>44572976 #>>44574689 #>>44576484 #
2. andai ◴[] No.44571085[source]
Maybe not a god, but we're intentionally designing artificial minds greater than ours, and we intend to give them control of the entire planet. While also expecting them to somehow remain subservient to us (or is that part just lip service)?
replies(1): >>44571492 #
3. miningape ◴[] No.44571353[source]
While it's a fair criticism, just because someone doesn't believe in a god doesn't mean the religious hardware in their brain has been turned off. It's still there and operational - I don't think it's a surprise that this hardware's attention would then be automatically tuned to a different topic.

I think you can also see this in the intensification of political discussion, which has a similar intensity to religious discussions 100-200+ years ago (e.g. the Protestant Reformation), indicating that this "religious hardware" has shifted domains to the realm of politics. I believe this shift can also be seen in the intense actions and rhetoric of the mid-20th century.

You can also look at all of these new age "religions" (spiritualism, horoscopes, etc.) as that religious hardware searching for something to operate on in the absence of traditional religion.

replies(3): >>44571788 #>>44572896 #>>44579463 #
4. yladiz ◴[] No.44571492[source]
I’m sorry, but are you arguing that an LLM is anywhere near a human mind? Or are you arguing about some other AI?
replies(1): >>44573002 #
5. gspencley ◴[] No.44571601[source]
Philosophy and religion are not mutually inclusive, though one can certainly describe a religious belief as being a philosophical belief.

Even a scientifically inclined atheist has philosophical ideas grounding their world view. The idea that the universe exists as an objective absolute with immutable laws of nature is a metaphysical idea. The idea that nature can be observed and that reason is a valid tool for acquiring knowledge about nature is an epistemological idea. Ethics is another field of philosophy and it would be a mistake to assume a universal system of ethics that has been constant throughout all cultures across all of human history.

So while I certainly agree that there is a very common hand-wave of "look the atheists have just replaced God with a new 'god' by a different name", you don't have to focus on religion, theology and faith based belief systems to identify different categories of philosophical ideas and how they have shaped different cultures, their beliefs and behaviours throughout history.

A student of philosophy would identify the concept of "my truth" as an idea put forward by Immanuel Kant, for example, even though the person saying it doesn't know that that's the root of the idea that reality is subjective. Similarly, the empirically grounded scientist would be recognized as following in the footsteps of Aristotle, and the pious Bible thumper as parroting ideas published by Plato.

The point is that philosophy is not the same thing as religion and philosophy directly shapes how people think, what they believe and therefore how they act and behave. And it's kind of uncanny how an understanding of philosophy can place historical events in context and what kinds of predictive capabilities it has when it comes to human behaviour in the aggregate.

replies(1): >>44576666 #
6. svieira ◴[] No.44571788[source]
Which then leads you to the question "who installed the hardware"?
replies(1): >>44571949 #
7. cootsnuck ◴[] No.44571949{3}[source]
No, that led you to that question.

It leads me to the question, "Is it really 'religious hardware' or the same ol' 'make meaning out of patterns' hardware we've had for millennia that has allowed us to make shared language, make social constructs, mutually believe legal fictions that hold together massive societies, etc.?"

replies(3): >>44572539 #>>44572745 #>>44574460 #
8. jffhn ◴[] No.44572539{4}[source]
Or: the hardware that generates beliefs about how things should be - whether based on religious or ideological dogma - as opposed to science, which is not prescriptive and can only describe how things are.
9. yubblegum ◴[] No.44572745{4}[source]
Your entire outlook is based on an assumption: the assumption that the 'emergence of meaning' is a second-order epiphenomenon of an organic structure. The first-order epiphenomenon in your view is of course consciousness itself.

None of these assumptions can be proven, yet like the ancients looking at the sky and seeing a moving sun but missing a larger bit of the big picture, you now have a 'theory of mind' that satisfies your rational impulses given a poor diet of facts and knowledge. But hey, once you manage to 'get into orbit' you get access to more facts, and then that old 'installed hardware' theory of yours starts breaking down.

The rational position regarding these matters is to admit "we do not have sufficient information and knowledge to make conclusive determinations based on reason alone." Who knows - one day humanity may make it to orbit and realize the 'simple and self-apparent idea' that "everything revolves around the Earth" is false.

replies(1): >>44574357 #
10. authorfly ◴[] No.44572817[source]
Would you say LessWrong posts are dogmatic?
11. buu700 ◴[] No.44572896[source]
I agree that modern hyper-online moralist progressivism and QAnonism are just fresh coats of paint on religion, but that isn't similar to AI.

AI isn't a worldview; it's an extremely powerful tool which some people happen to be stronger at using than others, like computers or fighter jets. For people who empirically observe that they've been successful at extracting massive amounts of value from the tool, it's easy to predict a future in which aggregate economic output in their field by those who are similarly successful will dwarf that of those who aren't. For others, it's understandable that their mismatched experience would lead to skepticism of the former group, if not outright comfort in the idea that such productivity claims are dishonest or delusional. And then of course there are certainly those who are actually lying or deluded about fitting in the former group.

Every major technology or other popular thing has some subset of its fandom which goes too far in promotion of the thing to a degree that borders on evangelical (operating systems, text editors, video game consoles, TV shows, diets, companies, etc.), but that really has nothing to do with the thing itself.

Speaking for myself, anecdotally, I've recently been able to deliver a product end-to-end on a timeline and level of quality/completeness/maturity that would have been totally impossible just a few years ago. The fact that something has been brought into existence in substantially less time and at orders of magnitude lower cost than would have been required a few years ago is an undeniable observation of the reality in front of me, not theological dogma.

It is, however, a much more cognitively intense way to build a product — with AI performing all the menial labor parts of development, you're boxed into focusing on the complex parts in a far more concentrated time period than would otherwise be required. In other words, you no longer get the "break" of manually coding out all the things you've decided need to be done and making every single granular decision involved. You're working at a higher level of abstraction and your written output for prompting is far more information-dense than code. The skills required are also a superset of those required for manual development; you could be the strongest pre-LLM programmer in the world, but if you're lacking in areas like human language/communication, project/product management, the ability to build an intuition for "AI psychology", or thinking outside the box in how you use your tools, adapting to AI is going to be a struggle.

It's like an industry full of mechanics building artisan vehicles by hand suddenly finding themselves foisted with budgets to design and implement assembly lines; they still need to know how to build cars, but the nature of the job has now fundamentally changed, so it's unsurprising that many or even most who'd signed up for the original job would fail to excel in the new job and rationalize that by deciding the old ways are the best. It's not fair, and it's not anyone's fault, but it's important for us all to be honest and clear-eyed about what's really happening here. Society as a whole will ultimately enjoy some degree of greater abundance of resources, but in the process a lot of people are going to lose income and find hard-won skills devalued. The next generation's version of coal miners being told to "learn to code" will be coders being told to "learn to pilot AI".

replies(1): >>44574891 #
12. tsunamifury ◴[] No.44572976[source]
I just want to comment here that this is the classic arrogant, under-read "I reject half of humanity's thoughts" foolishness that the OP is referring to.

I mean the lack of self awareness you have here is amazing.

replies(1): >>44574267 #
13. tsunamifury ◴[] No.44573002{3}[source]
If you understand the cultural concepts of Adam Curtis’s All Watched Over by Machines of Loving Grace, then yes we do keep trying to make gods out of inanimate things.

And it’s the atheists who continuously do it, claiming they don’t believe in God - just markets or AI, etc.

It’s an irony of ironies.

14. evantbyrne ◴[] No.44574267[source]
To the contrary. I sped through my compsci capstone coursework my first year of college and spent most of the rest of my time in philosophy, psychology, and sociology classrooms. The "hey, if you squint, this thing looks like religion for the non-religious" perspective is just one I've heard countless times. It is perfectly valid to have a fact-based discussion on whether there is a biological desire for religiosity, but drawing a long line from that to broadly critique someone's well-articulated ideas is pretty sloppy.
replies(1): >>44575999 #
15. dmbche ◴[] No.44574357{5}[source]
I've enjoyed reading the books of Peter Watts (Blindsight, sci-fi, free on his backlist) on seemingly this subject
replies(1): >>44575521 #
16. ryandv ◴[] No.44574460{4}[source]
> It leads me to the question, "Is it really 'religious hardware' or the same ol' 'make meaning out of patterns' hardware

They are the same thing. Call it "religion" or "meaning making," both activities can be subsumed by the more encompassing concept and less-loaded term of "psycho-technology," [0] or non-physical tools for the mind.

Language is such a psycho-technology, as are social constructs such as law; legal fictions are given memorable names and personified into "religious" figures, such as Libra from astrology or Themis/Lady Justice from Greek mythology.

Ancient shamans and priests were proto-wetware engineers, designing software for your brain and providing tools for making meaning out of the world. In modern day we now have psychologists, "social commentators" (for lack of a better term and interpreted as broadly as possible), and, yes, software engineers, amongst other disciplines, playing a similar role.

[0] https://www.meaningcrisis.co/episode-1-introduction/

17. keiferski ◴[] No.44574689[source]
I didn’t say that “it’s just a new god,” I said:

The notion that history is just on some inevitable pre-planned path is not a new idea, except now the actor has changed from God to technology.

This is a more nuanced sentence.

replies(1): >>44575097 #
18. tsimionescu ◴[] No.44574891{3}[source]
> It's not fair, and it's not anyone's fault, but it's important for us all to be honest and clear-eyed about what's really happening here.

Or we can just refuse this future and act as a society to prevent it from happening. We absolutely have that power, if we choose to organize and use it.

replies(1): >>44575066 #
19. buu700 ◴[] No.44575066{4}[source]
Sure, but how so? If I'm understanding your argument correctly, it sounds like you may be implying that we should escalate the war on general-purpose computing and outlaw generative AI.

If we were to consider that, then to what end? If you accept my framing of the long-term implications of LLMs on the industry, then what you're suggesting is effectively that we should deprive society of greater prosperity for the benefit of a small minority. Personally, I'd rather improve democratization of entrepreneurship (among other things) than artificially prop up software engineering salaries.

And let's say the US did all that. What then? We neuter our economy and expect our adversaries to just follow suit? More likely it hobbles our ability to compete and ultimately ushers in an era of global hegemony under the CCP.

20. evantbyrne ◴[] No.44575097[source]
Before that quoted sentence you drew a line from the reformation to people believing that AI is inevitable, then went on to imply these people may even believe such a thing will happen without the involvement of people. These are generalizations which don't fit a lot of the literature and make their best ideas look a bit sillier than they are. It is situations like these that make me think that analogies are better suited as a debate tactic than a method of study.
21. yubblegum ◴[] No.44575521{6}[source]
https://en.wikipedia.org/wiki/Blindsight_(Watts_novel) (will check it out. thanks!)
22. tsunamifury ◴[] No.44575999{3}[source]
Quoting your college classes is the first sign of inexperience, but I’ll share some modern concepts.

In All Watched Over by Machines of Loving Grace, Adam Curtis makes a pretty long and complete argument that humanity has a rich history of turning over its decision-making to inanimate objects in a desire to discover ideologies we can’t form ourselves amid the growing complexity of our interconnectivity.

He tells a history of these attempts constantly failing because the core ideology of “cybernetics” underlies them all and fails to be adaptive enough to match our combined DNA/body/mind cognitive system, especially when scaled to large groups.

He makes the second point that humanity and many thinkers also constantly resort to the false notion of “naturalism” as the ideal state of humanity, when in reality there is no natural state of anything, except maybe complexity and chaos.

Giving yourself up to something - especially something that doesn’t work - is very much “believing in a false god.”

replies(1): >>44576458 #
23. evantbyrne ◴[] No.44576458{4}[source]
You seem to be lost. While referencing a TV show may or may not be a rebuttal to a very specific kind of worldview, it is out of place as a response to my post, which you've failed to actually reference at all.

I'm addressing this point at you personally because we can all see your comments: being nasty to atheists on the internet will never be a substitute for hard evidence for your ideology.

replies(1): >>44576652 #
24. madrox ◴[] No.44576484[source]
I've read LessWrong very differently from you. The entire thrust of that society is that humanity is going to create the AI god.
25. tsunamifury ◴[] No.44576652{5}[source]
You seem to be profoundly confused. Adam Curtis is a leading thinker and documentarian of our time, widely recognized in continental philosophy. The fact that you tried to dismiss him as a TV show shows you seem to be completely naïve about the topic you’re speaking about.

Second, I’m not being nasty to atheists; I’m speaking specifically about not having false gods, which if anything is a somewhat atheistic perspective.

Honestly, what are you trying to say?

replies(1): >>44576735 #
26. staunton ◴[] No.44576666[source]
This sounds very educated but I don't really see what it has to do with the comment you're responding to (or with AI).
27. evantbyrne ◴[] No.44576735{6}[source]
Like I said, we can all read your comments. Needs no further elaboration. If I receive a second recommendation for Curtis then I might be inclined to check it out. Take it easy.
replies(1): >>44577984 #
28. tsunamifury ◴[] No.44577984{7}[source]
This is such a baffling and bizarre response.
29. uncircle ◴[] No.44579463[source]
> While it's a fair criticism, just because someone doesn't believe in a god doesn't mean the religious hardware in their brain has been turned off.

Max Stirner said that after the Enlightenment and the growth of liberalism, which is still very much in vogue to this day, all we’ve done is replace the idea of God with the idea of Man.

The object might be different, but it is still the unshakable belief in an idealised and subjective truth, with its own rituals and ministers, i.e. a religion.

I guess the Silicon Valley hyper-technological optimism of the past years is yet another shift from Man to religious belief in the Machine.