
721 points bradgessler | 3 comments
abathologist ◴[] No.44010933[source]
I think we are going to be seeing a vast partitioning in society in the next months and years.

The process of forming expressions just is the process of conceptual and rational articulation (as per Brandom). Those who misunderstand this -- believing that concepts are ready made, then encoded and decoded from permutations of tokens, or, worse, who have no room to think of reasoning or conceptualization at all -- they will be automated away.

I don't mean that their jobs will be automated: I mean that they will cede sapience and resign to becoming robotic. A robot is just a "person whose work or activities are entirely mechanical" (https://www.etymonline.com/search?q=robot).

I'm afraid far too many are captive to the ideology of productionism (which is just a corollary of consumerism). Creative activity is not about content production. The aim of our creation is communication and mutual-transformation. Generation of digital artifacts may be useful for these purposes, but most uses seem to assume content production is the point, and that is a dark, sad, dead end.

replies(7): >>44011338 #>>44011643 #>>44012297 #>>44012674 #>>44012689 #>>44017606 #>>44025036 #
1. Aerbil313 ◴[] No.44017606[source]

  ...The industrial-technological system may survive or it may break down. If it survives, it may eventually achieve a low level of physical and psychological suffering, but only after passing through a long and very painful period of adjustment and only at the cost of permanently reducing human beings and many other living organisms to engineered products and mere cogs in the social machine. Furthermore, if the system survives, the consequences will be inevitable: There is no way of reforming or modifying the system so as to prevent it from depriving people of dignity and autonomy.
- Industrial Society and Its Future, Ted Kaczynski (1995)
replies(1): >>44026230 #
2. abathologist ◴[] No.44026230[source]
As a matter of fact, I don't think it's true that

> if the system survives, the consequences will be inevitable: There is no way of reforming or modifying the system so as to prevent it from depriving people of dignity and autonomy.

But I also think that this claim is (1) practically impossible to prove and (2) a claim we morally ought to attempt to disprove.

replies(1): >>44046575 #
3. Aerbil313 ◴[] No.44046575[source]
I highly recommend reading the full manifesto. The reason human civilization will keep automating until human autonomy is reduced to nil (the state of suffering he is describing) is game theory: if you don't automate, $ENEMY will, so you automate. If you still choose not to automate, $ENEMY wins and controls you, and then automates ever more with your resources. In fact, those three sentences are a good summary of the past 1000 years of history, possibly more. See the recent AI 2027 report (https://ai-2027.com/) for a more detailed treatment. And Kaczynski was writing about this in the 70s.

As for (2): where did you get the idea that people have a moral responsibility to unconditionally defend the progress of technology?

Do you think a new technology's good or bad impact depends entirely on the virtue of the people using it? Or do new tools dictate their own usage once they make contact with constants like the immediate-reward-seeking human nervous system?