AI 2027

(ai-2027.com)
949 points by Tenoke
KaiserPro No.43575908
> AI has started to take jobs, but has also created new ones.

Yeah nah, there's a key thing missing here: the number of jobs created needs to be greater than the number destroyed, and the new jobs need to be better paying and arrive in time.

History says that when this happens, an entire generation is yeeted onto the streets (see powered looms, the Jacquard machine, steam-powered machine tools). All of the cheap labour needed to power the new towns and cities was created by the automation of agriculture and artisan jobs.

Dark satanic mills were fed the descendants of once reasonably prosperous craftspeople.

AI as presented here will kneecap the wages of a good proportion of the decent-paying jobs we have now. This will cause huge economic disparities, and probably revolution. There is a reason why the royalty of Europe all disappeared when they did...

So no, the stock market will not be growing because of AI, it will be in spite of it.

Plus, China knows that unless it can occupy most of its population with some sort of work, it is finished. AI and decent robot automation are as much an existential threat to the CCP as they are to whatever remains of the "West".

replies(6): >>43576145 #>>43576483 #>>43576494 #>>43576705 #>>43577174 #>>43579468 #
kypro No.43576483
> and probably revolution

I theorise that revolution would be near-impossible in a post-AGI world. If you consider where power comes from, it's relatively obvious that people will likely suffer and die en masse if we ever create AGI.

Historically the general public have held the vast majority of power in society. 100+ years ago this was physical power – the state had to keep you happy or the public would come for it with pitchforks. But in an age of modern weaponry, the public today would pose little physical threat to the state.

Instead, in today's democracies power comes from the public's collective labour and purchasing power. A government can't risk upsetting people too much because a government's power today is not a product of its standing army but of its economic strength. A government needs workers to create businesses and produce goods, so the goals of government generally align with the goals of the public.

But in a post-AGI world, neither businesses nor the state need workers or consumers. In that world, if you want something you wouldn't pay anyone for it or hire workers to produce it; you would just ask your fleet of AGIs to get you the resource.

In this world people become more like pests. They offer no economic value yet demand that AGI owners (whether publicly or privately owned) share resources with them. If people revolted, any AGI owner would be far better off deploying a bioweapon to humanely kill the protestors rather than sharing resources with them.

Of course, this assumes the AGI doesn't have its own goals and see the whole of humanity as a nuisance to be stepped over, in the same way humans will happily step over animals that interfere with our goals.

Imo humanity has 10-20 years left max if we continue on this path. There can be no good outcome of AGI because it would never make sense for the AGI, or those who control it, to stay aligned with the goals of humanity.

replies(6): >>43577284 #>>43577301 #>>43578075 #>>43578703 #>>43581702 #>>43583712 #
robinhoode No.43577301
> In this world people become more like pests. They offer no economic value yet demand that AGI owners (whether publicly or privately owned) share resources with them. If people revolted, any AGI owner would be far better off deploying a bioweapon to humanely kill the protestors rather than sharing resources with them.

This is a very doomer take. The threats are real, and I'm certain some people feel this way, but eliminating large swaths of humanity is something dictatorships have tried in the past.

Waking up every morning means believing there are others who will cooperate with you.

Most of humanity has empathy for others. I would prefer to have hope that we will make it through, rather than drown in fear.

replies(2): >>43577435 #>>43582032 #
758597464 No.43577435
> This is a very doomer take. The threats are real, and I'm certain some people feel this way, but eliminating large swaths of humanity is something dictatorships have tried in the past.

Tried, and succeeded in, in times when people held more power than they do today. Not sure what point you're trying to make here.

> Most of humanity has empathy for others. I would prefer to have hope that we will make it through, rather than drown in fear.

I agree that most of humanity has empathy for others — but it's been shown that the prevalence of psychopaths increases as you climb the leadership ladder.

Fear or hope are the responses of the passive. There are other routes to take.

replies(1): >>43579797 #
bamboozled No.43579797
Basically why open-sourcing everything is increasingly important and imo already making "AI" safer.

If the many have access to the latest AI, there is less chance the masses are blindsided by some rogue tech.