
358 points andrewstetsenko | 1 comment | source
hintymad ◴[] No.44362187[source]
Copying from another post. I’m very puzzled as to why people don’t talk more about the essential complexity of specifying systems anymore:

In No Silver Bullet, Fred Brooks argues that the hard part of software engineering lies in essential complexity - understanding, specifying, and modeling the problem space - while accidental complexity like tool limitations is secondary. His point was that no tool or methodology would "magically" eliminate the difficulty of software development because the core challenge is conceptual, not syntactic. Fast forward to today: there's a lot of talk about AI agents replacing engineers by writing entire codebases from natural language prompts. But that seems to assume the specification problem is somehow solved or simplified. In reality, turning vague ideas into detailed, robust systems still feels like the core job of engineers.

If someone provides detailed specs and iteratively works with an AI to build software, aren’t they just using AI to eliminate accidental complexity—like how we moved from assembly to high-level languages? That doesn’t replace engineers; it boosts our productivity. If anything, it should increase opportunities by lowering the cost of iteration and scaling our impact.

So how do we reconcile this? If an agent writes a product from a prompt, that only works because someone else has already fully specified the system—implicitly or explicitly. And if we’re just using AI to replicate existing products, then we’re not solving technical problems anymore; we’re just competing on distribution or cost. That’s not an engineering disruption—it’s a business one.

What am I missing here?

replies(22): >>44362234 #>>44362259 #>>44362323 #>>44362411 #>>44362713 #>>44362779 #>>44362791 #>>44362811 #>>44363426 #>>44363487 #>>44363510 #>>44363707 #>>44363719 #>>44364280 #>>44364282 #>>44364296 #>>44364302 #>>44364456 #>>44365037 #>>44365998 #>>44368818 #>>44371963 #
1. ozim ◴[] No.44364296[source]
There is a lot of truth in No Silver Bullet, and I had the same idea in mind.

The downside is that there is a lot of non-essential busy work, and that busy work is what gave many people their jobs; now loads of those people will lose them.

People who actually work on the essential complexity of systems are few and far between. People who say things like "proper professionals will always have work" are utter assholes, mostly assuming that they are those proper professionals.

In reality, AI will be like an F1 racing team no longer needing pit crews and keeping only the drivers. How many drivers are there? About 20: 10 teams with 2 drivers each. Meanwhile, each team has 300-1000 people doing all the other work.

At the corporate level, say 1 person working on essential complexity requires 10-20 people doing all kinds of non-essential work. That work will be taken over by AI, or, to be more realistic, instead of 10-20 people that person will need a headcount of 5.

That is still 15 people out of a job. Could those people take over some essential complexity at a different company, or in a different area of the same one? Some could, but it also depends on whether they would even want to. So those people will be pushed around, or end up jobless and bitter.

That is not a great future coming.