benlivengood ◴[] No.44383064[source]
Open source and libre/free software are particularly vulnerable to a future where AI-generated code is ruled to be either infringing or public domain.

In the former case, disentangling AI edits from human edits could tie a project up in legal proceedings for years, and projects don't have any funding to fight a copyright suit. Specifically, code that is AI-generated and subsequently modified or incorporated into the rest of the codebase would raise the question of whether the subsequent human edits are non-fair-use derivative works.

In the latter case, the license restrictions no longer apply to portions of the codebase, raising similar issues for derived code. A project that is only 98% OSS/FS licensed suddenly has much less leverage in takedowns against companies abusing the license terms, since it would have to prove that infringers are definitely using the human-written, licensed code.

Proprietary software is only mildly harmed in either case; speculative copyright owners would have to disassemble the vendor's binaries and try to make the case that AI-generated code infringed, without being able to see the codebase itself. And plenty of proprietary software has public domain code in it already.

AJ007 ◴[] No.44383229[source]
I understand why experienced developers don't want random AI contributions from no-knowledge "developers" landing in a project. In any situation, if a human had to review AI code line by line, that would tie up humans for years, even ignoring the legal questions.

#1 There will be no verifiable way to prove something was AI-generated beyond the early models.

#2 Software projects that somehow are 100% human-developed will not be competitive with AI-assisted or AI-written projects. The only room for debate on that is an apocalypse-level scenario where humans fail to keep producing semiconductors or electricity.

#3 If a project successfully excludes AI contributions (it's not clear how, other than restricting contributions to a tight group of anti-AI fanatics), it's just going to be cloned, and the clones will leave it in the dust. If the license permits forking, it could be forked too, but cloning and purging any potential legal issues might be preferred.

There is still a path for open source projects. It will be different. There's going to be much, much more software in the future, and it's not all going to be junk (although 99% of it might be).

amake ◴[] No.44383278[source]
> #2 Software projects that somehow are 100% human developed will not be competitive with AI assisted or written projects

Still waiting to see evidence of AI-driven projects eating the lunch of "traditional" projects.

ben_w ◴[] No.44386542[source]
How can you tell which project is which?

I mean, sure, there's plenty of devs who refuse to use AI, but how many projects rather than individuals are in each category?

And is Microsoft "traditional"? I name them specifically because their CEO claims 20-30% of their new code is AI-generated: https://techcrunch.com/2025/04/29/microsoft-ceo-says-up-to-3...

throwawayoldie ◴[] No.44397298[source]
He's the CEO of a large corporation. You have to allow for the significant possibility that he's either lying or doesn't know what he's talking about.
ben_w ◴[] No.44397698[source]
Sure. But what's the alternative way of finding out?

Most (perhaps all) places I've worked have had NDAs, which means statements like this are one of the few ways most of us find out what companies like MS are doing internally.

The only other ways I can think of are a hacker leaking the company's entire commit history, or a court case revealing it. (I don't expect a whistleblower to do so unless they're also triggering one of those other two categories.)

int_19h ◴[] No.44429825[source]
My former colleagues at Microsoft have a few things to say about Satya's statement, but none of them are SFW.
ben_w ◴[] No.44474315[source]
I can very much believe it, but that doesn't really fix the problem of public vs. private information.