
AI as Normal Technology

(knightcolumbia.org)
237 points by randomwalker | 3 comments
roxolotl ◴[] No.43715735[source]
This is a rare piece on AI which takes a coherent middle of the road viewpoint. Saying both that AI is “normal” and that it will be transformative is a radical statement in today’s discussions about AI.

Looking back on other normal but transformative technologies (steam power, electricity, nuclear physics, the transistor, etc.), you do actually see similarly stratified opinions. Most of them were met with an initial burst of enthusiasm and pessimism and followed a hype cycle.

The reason this piece is compelling is that taking a nuanced, middle-of-the-road viewpoint during the initial hype phase is difficult. Maybe AI really is some “next step,” but it is significantly more likely that that belief is propped up by science fiction, and it’s important to keep expectations in line with history.

replies(6): >>43716287 #>>43716474 #>>43716567 #>>43716597 #>>43717099 #>>43718044 #
pdfernhout ◴[] No.43716474[source]
From the article:

====

History suggests normal AI may introduce many kinds of systemic risks

While the risks discussed above have the potential to be catastrophic or existential, there is a long list of AI risks that are below this level but which are nonetheless large-scale and systemic, transcending the immediate effects of any particular AI system. These include the systemic entrenchment of bias and discrimination, massive job losses in specific occupations, worsening labor conditions, increasing inequality, concentration of power, erosion of social trust, pollution of the information ecosystem, decline of the free press, democratic backsliding, mass surveillance, and enabling authoritarianism.

If AI is normal technology, these risks become far more important than the catastrophic ones discussed above. That is because these risks arise from people and organizations using AI to advance their own interests, with AI merely serving as an amplifier of existing instabilities in our society.

There is plenty of precedent for these kinds of socio-political disruption in the history of transformative technologies. Notably, the Industrial Revolution led to rapid mass urbanization that was characterized by harsh working conditions, exploitation, and inequality, catalyzing both industrial capitalism and the rise of socialism and Marxism in response.

The shift in focus that we recommend roughly maps onto Kasirzadeh’s distinction between decisive and accumulative x-risk. Decisive x-risk involves “overt AI takeover pathway, characterized by scenarios like uncontrollable superintelligence,” whereas accumulative x-risk refers to “a gradual accumulation of critical AI-induced threats such as severe vulnerabilities and systemic erosion of econopolitical structures.” ... But there are important differences: Kasirzadeh’s account of accumulative risk still relies on threat actors such as cyberattackers to a large extent, whereas our concern is simply about the current path of capitalism. And we think that such risks are unlikely to be existential, but are still extremely serious.

====

That tangentially relates to my sig: "The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity." Because as our technological capabilities continue to change, it becomes ever more essential to revisit our political and economic assumptions.

As I outline here: https://pdfernhout.net/recognizing-irony-is-a-key-to-transce... "There is a fundamental mismatch between 21st century reality and 20th century security [and economic] thinking. Those "security" agencies [and economic corporations] are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology ultimately is just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity could well ironically doom us all whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all. ... The big problem is that all these new war machines [and economic machines] and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So, the scarcity-based political [and economic] mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream."

A couple of Slashdot comments by me from Tuesday, linking to stuff I have posted on risks from AI and other advanced tech -- and ways to address those risks -- going back to 1999:

https://slashdot.org/comments.pl?sid=23665937&cid=65308877

https://slashdot.org/comments.pl?sid=23665937&cid=65308923

So, AI just cranks an existing trend of technology-as-an-amplifier up to "11". And as I've written before: if, as seems possible, our path out of any singularity has a lot to do with our moral path going into it, then we really need to step up our moral game right now to make a society that works better for everyone in healthy, joyful ways.

replies(1): >>43717125 #
kjkjadksj ◴[] No.43717125[source]
The idea of abundance vs. scarcity makes sense at the outset. But I have to wonder where all this alleged abundance is hiding. Sometimes the assumptions feel a bit like “drill baby drill” to me, without figures and projections behind them. One would think that if there were much untapped resource capacity today, it would get used up. We can look at how agricultural yields improved over the 19th century and see how that led to higher populations but also less land under the plow and fewer hands working that land, versus keeping the same amount of land under the plow and, I don't know, dumping the excess yield someplace where it isn't participating in the market?
replies(1): >>43719017 #
1. noddingham ◴[] No.43719017[source]
I think to the parent's point it is as you say: there is already untapped capacity that isn't being used due to (geo)political forces maintaining the scarcity side of the argument. Using your agriculture example, a simple Google search will yield plenty of examples going back more than a decade of food sitting/rotting in warehouses/ports due to red tape and bureaucracy. So, we already can/do produce enough food to feed _everyone_ (abundance) but cannot get out of our own way to do so due to a number of human factors like greed or politics (scarcity).
replies(1): >>43730122 #
2. kjkjadksj ◴[] No.43730122[source]
And that sort of analysis is exactly what is suspect to me about this. Have people considered why an onion might be in a warehouse, or why it might go unsold after a time? The answer is no, and that reveals a lack of understanding of the nuance of how the global economy actually works. Everything has some loss factor, and reducing it all to nil might not be realistic at the scale at which we feed ourselves. It's like making pancakes: some mix stays in the bag and you can't get it out, some batter stays in your bowl, some stays on your spoon, you make pancakes with some, some scrap is left in the pan, some crumbs end up on your plate. There is all this waste in making pancakes, and yet chasing down every scrap would be impossible. And at massive scale that scrap probably adds up.
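To make the loss-factor point concrete, here is a minimal sketch with made-up per-stage loss rates (none of these figures come from the thread or from any real dataset); it just shows how modest, unavoidable losses at each step of a food supply chain compound:

    # Minimal sketch: how small, hypothetical per-stage losses compound across a
    # supply chain. All numbers are illustrative assumptions, not real data.

    stages = {               # hypothetical loss rate at each stage
        "harvest":   0.03,
        "storage":   0.02,
        "transport": 0.02,
        "warehouse": 0.01,
        "retail":    0.04,
        "household": 0.05,
    }

    produced = 1_000_000     # hypothetical starting quantity (e.g., tonnes)
    remaining = produced

    for stage, loss in stages.items():
        remaining *= (1 - loss)                        # apply this stage's loss
        print(f"after {stage:10s}: {remaining:,.0f}")

    print(f"cumulative loss: {100 * (1 - remaining / produced):.1f}%")  # ~15.9%

Even with each stage losing only a few percent, the compounded loss works out to roughly 16 percent of what was produced, which is the "scrap" that adds up at scale.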

Besides, we have been crushing global hunger over the decades, so something is working on that front. The crisis in most of the Western world today, at least, is not that people cannot afford food but that wages are depressed relative to the cost of housing (really land).

replies(1): >>43752259 #
3. pdfernhout ◴[] No.43752259[source]
I'm getting more at things like a perspective shift, like represented with ideas at these links:

https://en.wikipedia.org/wiki/The_Ultimate_Resource

https://www.remineralize.org/

https://en.wikipedia.org/wiki/Voyage_from_Yesteryear

https://duckduckgo.com/?q=cost+of+militarism

https://en.wikipedia.org/wiki/War_Is_a_Racket

https://www.alfiekohn.org/article/case-competition/

https://www.pop.org/overpopulation-myth/

https://www.fifthestate.org/archive/298-june-19-1979/the-ori...

https://archive.org/details/AdvancedAutomationForSpaceMissio...

https://archive.org/details/TheUndergroundHistoryOfAmericanE...

https://www.kurtz-fernhout.com/oscomak/AchievingAStarTrekSoc...

https://pdfernhout.net/basic-income-from-a-millionaires-pers...

https://pdfernhout.net/recognizing-irony-is-a-key-to-transce...

https://web.archive.org/web/20080930065642/http://www.whywor... "I [Bob Black] don't suggest that most work is salvageable in this way. But then most work isn't worth trying to save. Only a small and diminishing fraction of work serves any useful purpose independent of the defense and reproduction of the work-system and its political and legal appendages. Twenty years ago, Paul and Percival Goodman estimated that just five percent of the work then being done -- presumably the figure, if accurate, is lower now -- would satisfy our minimal needs for food, clothing and shelter. Theirs was only an educated guess but the main point is quite clear: directly or indirectly, most work serves the unproductive purposes of commerce or social control. Right off the bat we can liberate tens of millions of salesmen, soldiers, managers, cops, stockbrokers, clergymen, bankers, lawyers, teachers, landlords, security guards, ad-men and everyone who works for them. There is a snowball effect since every time you idle some bigshot you liberate his flunkies and underlings also. Thus the economy implodes."

And so on...