728 points freetonik | 30 comments
1. hodgehog11 ◴[] No.44976891[source]
How does this not lead to a situation where no honest person can use any AI in their submissions? Surely pull requests that acknowledge AI tooling will be given significantly less attention, on the grounds that no one wants to read work that they know is written by AI.
replies(8): >>44976947 #>>44976989 #>>44977022 #>>44977044 #>>44977416 #>>44977491 #>>44977540 #>>44977886 #
2. MerrimanInd ◴[] No.44976947[source]
It just might. But if people develop a bias against AI-generated code because AI can generate massive amounts of vaguely correct-looking yet ultimately bad code, that seems like an AI problem, not a people problem. Get better, AI coding tools.
3. andunie ◴[] No.44976989[source]
Isn't that a good thing?
replies(2): >>44977047 #>>44977103 #
4. Workaccount2 ◴[] No.44977022[source]
Make a knowledgeable reply and mention you used ChatGPT: comment immediately buried.

Make a knowledgeable reply and give no reference to the AI you used: comment is celebrated.

We are already barreling full speed down the "hide your AI use" path.

replies(3): >>44977616 #>>44979921 #>>44984197 #
5. skogweb ◴[] No.44977044[source]
I don't think this is the case. Mitchell writes that he himself uses LLMs, so it's not black and white. A PR author who has a deep understanding of their changes and used an LLM for convenience will be able to convey this without losing credibility, imo.
6. hodgehog11 ◴[] No.44977047[source]
It might encourage people to be dishonest, or to not contribute at all. Maybe that's fine for now, but what if the next generation comes to rely on these tools?
7. jama211 ◴[] No.44977103[source]
What, building systems where we’re specifically incentivised not to disclose AI use?
replies(1): >>44977947 #
8. KritVutGu ◴[] No.44977416[source]
Good point. That's exactly the point: don't use AI to write your patch. At all.

Why are you surprised? Do companies want to hire "honest" people whose CVs were written by some LLM?

replies(2): >>44978035 #>>44978057 #
9. whimsicalism ◴[] No.44977491[source]
I'm happy to read work written by AI, and it is often better than a non-assisted PR
10. alfalfasprout ◴[] No.44977540[source]
No one is saying to not use AI. The intent here is to be honest about AI usage in your PRs.
11. showcaseearth ◴[] No.44977616[source]
I doubt a PR is going to be buried if it's useful, well designed, good code, etc, just because of this disclosure. Articulate how you used AI and I think you've met the author's intent.

If the PR has issues and requires more than superficial rework to be acceptable, the maintainers don't want to spend time debugging code spit out by an AI tool. They're more willing to spend a cycle or two if the benefit is you learning (either generally as a dev or becoming more familiar with the project). If you can make clear that you created or understand the code end to end, then they're more likely to be willing to take these extra steps.

Seems pretty straightforward to me and thoughtful by the maintainers here.

replies(1): >>44978224 #
12. eschaton ◴[] No.44977886[source]
You ask this as if it’s a problem.
13. eschaton ◴[] No.44977947{3}[source]
Submitting a PR also means you’re not submitting code copied from elsewhere without calling that out and ensuring license compatibility; we don’t refer to that as incentivizing lying about the origin of submitted code.

Fraud and misrepresentation are always options for contributors, at some point one needs to trust that they’re adhering to the rules that they agreed to adhere to.

replies(1): >>45006974 #
14. Octoth0rpe ◴[] No.44978035[source]
> Do companies want to hire "honest" people whose CVs were written by some LLM?

Yes, some companies do want to hire such people, the justification given is something along the lines of "we need devs who are using the latest tools/up to date on the latest trends! They will help bring in those techniques and make all of our current devs more productive!". This isn't a bad set of motivations or assumptions IMO.

Setting aside what companies _want_, they almost certainly are already hiring devs with llm-edited CVs, whether they want it or not. Such CVs/resumes are more likely to make it through HR filters.

15. hodgehog11 ◴[] No.44978057[source]
I don't know if future generations will agree with this sentiment, in which case we lock ourselves out of future talent (i.e. those who use AI to assist, not to completely generate). The same arguments were made about Photoshop once upon a time.

> Do companies want to hire "honest" people whose CVs were written by some LLM?

Unfortunately yes, they very much seem to. Since many are using LLMs to assess CVs, those who use LLMs to help write their CVs have a measured advantage.

16. rane ◴[] No.44978224{3}[source]
> I doubt a PR is going to be buried if it's useful, well designed, good code, etc, just because of this disclosure

If it is indeed the substance that matters, why would this rule be necessary? AI-generated anything carries a heavy slop stigma right now, even when the content is solid.

This would make for an interesting experiment: submit a PR that was absolute gold but with the disclaimer that it was generated with the help of ChatGPT. I would almost guarantee it would be received with skepticism and dismissal.

replies(2): >>44980328 #>>44980480 #
17. wmf ◴[] No.44979921[source]
HN works that way but Mitchell said he isn't opposed to AI. You have to know the vibe of your environment.
18. s-lambert ◴[] No.44980328{4}[source]
The rule is necessary because the maintainers want to build goodwill with contributors: if a contributor makes a bad PR but could still learn from it, the maintainers will put effort into it. It's a case of "if you made a good effort, we'll give you a good effort", and using AI tools gives you a very low floor for what "effort" is.

If you make a PR where you just used AI and it seems to work, but you didn't go further, the maintainers can say, "Well, I had a look, it looks bad, you didn't put effort in, I'm not going to coach you through this." But if you make a PR where you say, "I used AI to learn about X, then tried to implement X myself with AI writing some of it," the maintainers can say, "This PR doesn't look good quality, but it looks like you tried; we can give some good feedback but still reject it."

In a world without AI, if they were getting a lot of PRs from people who obviously didn't spend any time on their PRs then maybe they would have a "tell us how long this change took you" disclosure as well.

19. davidcbc ◴[] No.44980480{4}[source]
The author explains why

> While we aren't obligated to in any way, I try to assist inexperienced contributors and coach them to the finish line, because getting a PR accepted is an achievement to be proud of. But if it's just an AI on the other side, I don't need to put in this effort, and it's rude to trick me into doing so.

If it's bad code from a person he'll help them get it fixed. If it's bad code from an AI why bother?

20. vultour ◴[] No.44984197[source]
The last three GitHub issues I ran across when looking something up had people literally copy-pasting the entire ChatGPT response as their comment. It feels like I'm living in some crazy dystopia when several _different_ people post a 30+ line message that's 95% the same. I'm sorry, but I refuse to interact with people who do this; if I wanted to talk to a computer, I'd do it myself.
21. jama211 ◴[] No.45006974{4}[source]
If you removed all PRs from the world that included copy-pasted code from Stack Overflow that wasn’t mentioned, you’d be removing a LOT of PRs. Among most devs, it’s not even considered a problem to copy and paste code from Stack Overflow, as long as you have reviewed it and modified it where necessary for your purposes. AI should be treated the same way; if it isn’t, people will just hide it and do it anyway.
replies(1): >>45008516 #
22. eschaton ◴[] No.45008516{5}[source]
Just because many developers are irresponsible in their approach to incorporating others’ works doesn’t change how intellectual property actually works, and any project that actually cares (whether open or proprietary) will ensure that people understand their responsibilities.

What you’re saying is essentially the code equivalent of “I found this image via Google search so of course it’s OK to put into a presentation, it’s on the web so that means I can use it.” This may not be looked at too hard for an investor presentation, but if you’re doing a high profile event like Apple’s WWDC you’ll learn quickly that all assets require clearance and “I found it on the web” won’t cut it—you’ll be made to use a different image or, if you actually present with the unlicensed image, you could be disciplined or outright fired for causing the company liability.

It’s amazing how many people in this industry think it’s OK to just wing this shit and even commit outright fraud just because it’s convenient.

replies(1): >>45010365 #
23. jama211 ◴[] No.45010365{6}[source]
Your argument breaks down when you realise that simple operations will result in the exact same code anyway. If I want to pull the first word from a string, whether I write it myself or copy and paste it from Stack Overflow, I’m likely to end up with literally the same line of code. It’s not the same as an image from Google Images, because that image has a far higher chance of being unique.

You can talk about how we should act and be all high and mighty all you like, but it’s just burying your head in the sand about the reality of how code is written.

Also, technically, I never said this made it perfectly ok. It’s just that it’s the reality we live in and if we got rid of everyone doing it we’d have to fire 99% of programmers.

replies(1): >>45015714 #
24. eschaton ◴[] No.45015714{7}[source]
I’ve been in this industry 30+ years working for both very small companies and some of the largest. I thus have a pretty good understanding of the reality of how code is written, and you’re the one burying your head in the sand: Companies care about the provenance of what you write, and if you lie about that—whether explicitly or through omission—and it’s discovered, you’re going to be in a world of hurt since you may be exposing the company to liability and also violating your employer’s trust.
replies(2): >>45031306 #>>45031320 #
25. jama211 ◴[] No.45031306{8}[source]
You’re seriously trying the equivalent of solving teenage pregnancy by advocating for abstinence.

Look around. Do you see the majority of programmers getting fired for copying a line from stackoverflow or using AI?

You must either work in an ultra high security area or are so removed from the groundwork of most programming jobs that you don’t know how people do anything anymore. I’m not surprised you mentioned 30+ years, because that likely puts you squarely out of the trenches where the development is actually done.

Outside of, like, the military or airplane software, companies really don’t care about provenance most of the time; their complete lack of processes for looking into any of that is absolute PROOF of it. It’s don’t-ask-don’t-tell out there.

You can be delusional all you like, it doesn’t change the reality of how most development is done.

Again, I didn’t say it’s a good thing, it’s just that it is reality.

replies(1): >>45031377 #
26. jama211 ◴[] No.45031320{8}[source]
As a side note, the company I work for actively encourages ai use in development, and this is really quite common now.
27. eschaton ◴[] No.45031377{9}[source]
My last 20 years in the industry were at Apple.
replies(1): >>45037023 #
28. jama211 ◴[] No.45037023{10}[source]
Ahh, so you see and accept my points then.
replies(1): >>45044421 #
29. eschaton ◴[] No.45044421{11}[source]
No, I think you’re only exposed to a limited portion of the industry where people play fast and loose with IP provenance and as a result develop very bad habits that need to be broken when they enter an environment that takes it seriously.
replies(1): >>45050048 #
30. jama211 ◴[] No.45050048{12}[source]
I think you’ll find it’s a limited portion of the industry that takes it seriously, when the vast majority don’t. You’ve worked in one place for like 20 years, and it’s one of the strictest places for this stuff; can’t you see that might bias you somewhat?