
183 points WolfOliver | 2 comments
manoDev No.45066299
I'm tired of the anthropomorphizing marketing around AI driving this kind of discussion. In a few years, all this talk will sound as dumb as claiming "MS Word's spell checker will replace writers" or "Photoshop will replace designers".

We'll reap the productivity benefits of this new tool and create more work for ourselves; output will stabilize at a new level and salaries will stagnate again, as always happens.

ACCount37 No.45066524
I'm tired of all the "yet another tool" reductionism. It reeks of cope.

It took under a decade to get AI to this stage, where it can build small scripts and tiny services entirely on its own. I see no fundamental limitations that would prevent further improvement, and no reason why it would stop at human-level performance either.

1. tashoecraft No.45066599
There's a saying that humans are terrible at predicting exponential growth. I believe we need another one: those who expect exponential growth have a tough time not expecting it.

It didn't take under a decade for AI to get to this stage; it took multiple decades of work, with algorithms finally able to exploit GPU hardware at massive scale.

There's already a feeling that growth has slowed; I'm not seeing the rise in performance on coding tasks that I saw over the past few years. I see no fundamental improvements that would suggest exponential growth or human-level performance.

2. ACCount37 No.45067875
I'm not sure there will be exponential growth, but I also don't believe it's strictly necessary. Some automation-relevant performance metrics, like the "task-completion time horizon", appear to be increasing exponentially, but do they have to?

All you really need is for performance to keep increasing steadily at a good rate.

If the exponential growth tops out, and once it does AI gains only a linear two days per year of "task-completion time horizon"? It would still be able to complete a small scrum sprint autonomously by 2035, edging further into "seasoned professional developer" territory with each passing year, little by little.
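
For a rough sense of the arithmetic, here is a minimal sketch of that linear scenario; the starting horizon and the sprint length are assumptions chosen for illustration, not figures from the thread.

    # Back-of-the-envelope check of the linear scenario above.
    # Assumptions (not figures from the thread): the horizon starts at
    # roughly half a working day in 2025, and a "small sprint" is ten
    # working days.
    START_YEAR = 2025
    START_HORIZON_DAYS = 0.5     # assumed starting horizon, in working days
    GAIN_DAYS_PER_YEAR = 2.0     # the linear growth rate posited above
    SPRINT_WORKING_DAYS = 10     # a small two-week sprint

    for year in (2030, 2035, 2040):
        horizon = START_HORIZON_DAYS + GAIN_DAYS_PER_YEAR * (year - START_YEAR)
        verdict = "covers" if horizon >= SPRINT_WORKING_DAYS else "falls short of"
        print(f"{year}: ~{horizon:.1f} working days ({verdict} a small sprint)")

The exact crossover year shifts with the assumed starting horizon and with when the exponential phase is taken to end; the point is only that even a modest linear gain reaches sprint-sized tasks within roughly a decade.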