
258 points by signa11 | 1 comment
kirubakaran ◴[] No.42732804[source]
> A major project will discover that it has merged a lot of AI-generated code

My friend works at a well-known tech company in San Francisco. He was reviewing his junior team member's pull request. When asked what a chunk of code did, the team member matter-of-factly replied, "I don't know, chatgpt wrote that."

replies(16): >>42733064 #>>42733126 #>>42733357 #>>42733510 #>>42733737 #>>42733790 #>>42734461 #>>42734543 #>>42735030 #>>42735130 #>>42735456 #>>42735525 #>>42735773 #>>42736703 #>>42736792 #>>42737483 #
deadbabe ◴[] No.42733126[source]
I hope that junior engineer was reprimanded or even put on a PIP instead of just having the reviewer say lgtm and approve the request.
replies(2): >>42733168 #>>42733515 #
WaxProlix ◴[] No.42733168[source]
Probably depends a lot on the team culture. Depending on what part of the product lifecycle you're in (proving a concept, rushing to market, scaling for the next million TPS, moving into new verticals, ...) and where the team currently is, it can make a lot of sense to have AI generate more of the codebase. Write some decent tests, commit, move on.

I wish my reports would use more AI tools for the parts of our codebase that don't need a high bar of scrutiny; boilerplate at enterprise scale is a major source of friction and - tbh - burnout.
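
A minimal sketch of the "write some decent tests, commit, move on" step, assuming a hypothetical AI-generated helper parse_invoice_line in a billing module; the module, function, and expected values are illustrative, not from the thread:

    # Characterization tests that pin down what the AI-generated helper is
    # supposed to do, even though no one on the team wrote it by hand.
    # `billing.parse_invoice_line` is a hypothetical function used for illustration.
    import pytest

    from billing import parse_invoice_line


    def test_parses_quantity_and_unit_price():
        line = parse_invoice_line("3 x widget @ 4.50")
        assert line.quantity == 3
        assert line.unit_price == pytest.approx(4.50)
        assert line.total == pytest.approx(13.50)


    def test_rejects_malformed_input():
        with pytest.raises(ValueError):
            parse_invoice_line("widget @ four fifty")

The point is that the expected behavior gets stated explicitly before the "commit, move on" part, regardless of who or what wrote the implementation.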

replies(3): >>42733431 #>>42733542 #>>42736479 #
GeoAtreides ◴[] No.42736479[source]
> Write some decent tests, commit, move on.

Move on to what?! Where does a junior programmer who doesn't understand what the code does move on to?