
129 points by NotInOurNames | 1 comment | source
theropost No.44067710 [source]
Honestly, I’ve been thinking about this whole AGI timeline talk—like, people saying we’re going to hit some major point by 2027 where AI just changes everything. And to me, it feels less like a purely tech-driven prediction and more like something being pushed. Like there’s an agenda behind it, probably coming from certain elites or people in power, especially in the West, who see the current system and think it needs a serious reset.

What’s really happening, in my view, is a forced economic shift. We’re heading into a kind of engineered recession—huge layoffs, lots of instability—where millions of service and admin-type jobs are going to disappear. Not because the tech is ready in a full AGI sense, but because those roles are the easiest to replace with automation and AI agents. They’re not core to the economy, and a lot of them are wrapped in red tape anyway.

So in the next couple years, I think we’ll see AI being used to clear out that mental bureaucracy—forms, paperwork, pointless approvals, inefficient systems. AI isn’t replacing deep creativity or physical labor yet, but it is filling in the cracks and acting like a smart band-aid. It’ll seem useful and “intelligent,” but it’s really just a transition tool.

And once that’s done, the next step is workforce reallocation—pushing people into real-world industries where hands-on labor still matters. Building, manufacturing, infrastructure, things that can’t be automated yet. It’s like the short-term goal is to use AI to wipe out all the mindless middle-layers of the system, and the longer-term vision is full automation—including robotics and real-world systems—maybe 10 or 20 years out.

But right now? This all looks like a top-down move to shift the population out of the “mind” industries and into something else. It’s not just AI progressing—it’s a strategic reset, wrapped in the language of innovation.

replies(2): >>44068534 #>>44068715 #
1. boshalfoshal No.44068534 [source]
My take is less tinfoil-hatty than this.

I simply think that the majority of people in AI today are sci-fi nerds who want to live out these fantasies and be part of something much larger than themselves.

There's also the obvious incentive for AI companies: automating everything is extremely lucrative (i.e., they stand to gain a lot of money and power, both from the hype and in the event that AGI turns out to be real).