
14 points johnwheeler | 3 comments

On Hacker News and Twitter, the consensus view is that no one is afraid. People concede that junior engineers and grad students might be the most affected, but they still seem to regard their own situations as sustainable. My question is: is this just wishful thinking and human nature, trying to combat the inevitable? I ask because I seriously don't see a future where there are lots of programmers anymore; I see mass unemployment for programmers. People are in denial, and all the claims that AI can't write code without making mistakes stop being valid the moment a model that writes flawless code is released, potentially overnight. Claude 4.5 is a good example. I just don't see any valid argument that the technology won't reach the point where it makes the job irrelevant, or rather, not irrelevant, but completely changes its economics.
1. daringrain32781 No.46340880
I don’t think so, here’s why:

I have a few coworkers who are deep into the current AI trends. I also have the pleasure of reviewing their code. The garbage that gets pushed is insane. I feel I can't comment on a lot of the issues I see, because there's so much slop and unconsidered garbage that addressing it would mean rewriting half of their PR. Maybe that says more about their coding ability, for accepting that stuff in the first place. I see comments that are clearly AI-written, pushed as if no human ever reviewed them. I guard public-facing infrastructure and apps as much as I can, for fear of this having preventable impacts on customers.

I think this just goes to show that AI assistants can be powerful, but only in the hands of an already decent developer.

I've kind of lost respect for these developers deep in the AI ecosystem who clearly have no idea what's being spat out and are just looking to fit 8 hours of productivity into the span of 2 or 3.

replies(2): >>46340936 #>>46343020 #
2. bhag2066 No.46340936
Are you talking about human generated code or machine generated code?

99% of the work I've ever received from humans has been error-riddled garbage. <1% of the work I've received from a machine has been error-riddled garbage. Granted, I don't work in code; I work in a field that's more difficult for a machine.

3. QuiEgo No.46343020
> there’s just so much slop and garbage that hasn’t been thought through that it would be re-writing half of their PR

If I were their tech lead, I would make them do exactly that, over and over, until they either got the point or got a bad performance review and were PIPed out.

The quality bar is the bar, and it's there for a reason. In my case, I work on security- and safety-critical stuff, so management (usually) has my back.

I'm glad when people can use AI to help themselves. If it speeds them up, great. I don't care if it writes 100% of their code; they are still responsible for making sure it meets the quality bar. If they constantly submit code that doesn't, they have a performance problem regardless of the tools they used.