

489 points todsacerdoti | 15 comments
benlivengood ◴[] No.44383064[source]
Open source and libre/free software are particularly vulnerable to a future where AI-generated code is ruled to be either infringing or public domain.

In the former case, disentangling AI edits from human edits could tie a project up in legal proceedings for years, and projects don't have the funding to fight a copyright suit. Specifically, code that is AI-generated and subsequently modified or incorporated into the rest of the codebase would raise the question of whether the subsequent human edits were non-fair-use derivative works.

In the latter case, the license restrictions no longer apply to portions of the codebase, raising similar issues for derived code. A project that is only 98% OSS/FS licensed suddenly has much less leverage in takedowns against companies abusing the license terms, since it has to prove that infringers are definitely using the human-written, licensed code.

Proprietary software is only mildly harmed in either case; it would require speculative copyright owners to disassemble their binaries and try to make the case that AI-generated code infringed without being able to see the codebase itself. And plenty of proprietary software has public domain code in it already.

replies(8): >>44383156 #>>44383218 #>>44383229 #>>44384184 #>>44385081 #>>44385229 #>>44386155 #>>44387156 #
AJ007 ◴[] No.44383229[source]
I understand why experienced developers don't want random AI contributions from no-knowledge "developers" in a project. In any situation, if a human reviewed AI code line by line, that would tie humans up for years, even ignoring the legal questions.

#1 There will be no verifiable way to prove something was AI generated beyond early models.

#2 Software projects that somehow remain 100% human-developed will not be competitive with AI-assisted or AI-written projects. The only room for debate on that is an apocalypse-level scenario where humans fail to continue producing semiconductors or electricity.

#3 If a project successfully excludes AI contributions (it's not clear how, other than limiting contributors to a tight group of anti-AI fanatics), it's just going to be cloned, and the clones will leave it in the dust. If the license permits forking it could be forked too, but cloning and purging any potential legal issues might be preferred.

There is still a path for open source projects. It will be different. There's going to be much, much more software in the future, and it's not all going to be junk (although 99% might be).

replies(16): >>44383277 #>>44383278 #>>44383309 #>>44383367 #>>44383381 #>>44383421 #>>44383553 #>>44383615 #>>44383810 #>>44384306 #>>44384448 #>>44384472 #>>44385173 #>>44386408 #>>44387925 #>>44389059 #
amake ◴[] No.44383278[source]
> #2 Software projects that somehow are 100% human developed will not be competitive with AI assisted or written projects

Still waiting to see evidence of AI-driven projects eating the lunch of "traditional" projects.

replies(4): >>44383368 #>>44383382 #>>44383858 #>>44386542 #
viraptor ◴[] No.44383368[source]
It's happening slowly all around. It's not obvious because people producing high-quality stuff have no incentive at all to mark their changes as AI-generated. But people are also generating local tools faster than they could get existing tools adjusted to do what they want. I'm running 3 things right now, just for myself, that I generated from scratch instead of trying to send feature requests for existing apps I could buy.

It's only going to get more pervasive from now on.

replies(2): >>44383499 #>>44384560 #
alganet ◴[] No.44383499[source]
Can you show these 3 things to us?
replies(4): >>44383630 #>>44383710 #>>44383844 #>>44384062 #
WD-42 ◴[] No.44383630[source]
For some reason these fully functional ai generated projects that the authors vibe out while playing guitar and clipping their toenails are never open source.
replies(6): >>44383999 #>>44384026 #>>44384847 #>>44385049 #>>44386161 #>>44387603 #
fc417fc802 ◴[] No.44384026[source]
> the authors vibe out while playing guitar and clipping their toenails

I don't think anyone is claiming that. If you submit changes to a FOSS project and an LLM assisted you in writing them how would anyone know? Assuming at least that you are an otherwise competent developer and that you carefully review all code before you commit it.

The (admittedly still controversial) claim being made is that developers with LLM assistance are more productive than those without. Further, that there is little incentive for such developers to advertise this assistance. Less trouble for all involved to represent it as 100% your own unassisted work.

replies(2): >>44384302 #>>44387261 #
1. EGreg ◴[] No.44384302[source]
Why would you need to carefully review code? That is so 2024. You’re bottlenecking the process and are at a disadvantage when the AI could be working 24/7. We have AI agents that have been trained to review thousands of PRs that are produced by other, generative agents, and together they have already churned out much more software than human teams can write in a year.

AI “assistance” is a short intermediate phase, like the “centaurs” that Garry Kasparov was so fond of (a human + computer team beat both a human alone and a computer alone… until computer-only became better).

https://en.wikipedia.org/wiki/Advanced_chess

replies(1): >>44384547 #
2. amake ◴[] No.44384547[source]
> We have AI agents that have been trained to review thousands of PRs that are produced by other, generative agents, and together they have already churned out much more software than human teams can write in a year.

Was your comment tongue-in-cheek? If not, where is this huge mass of AI-generated software?

replies(1): >>44384719 #
3. rvnx ◴[] No.44384719[source]
All around you; it just doesn't make sense for developers to reveal that a lot of their work is now about chunking and refining the specifications written by the product owner.

Admitting as much is like admitting you are overpaid for your job, and that a 20 USD AI agent can do 75% of the work better and faster than you can.

Is it easy to admit that skills you spent 10+ years learning are progressively being replaced by a machine? (Like thousands of jobs in the past.)

More and more, being a developer is going to be a monkey job where your only task is to make sure there is enough coal in the steam engine.

Compilers destroyed the jobs of developers writing assembler; they had to adapt. They too insisted that hand-written assembler was better.

It's the same here, except you write code in natural language. It may not be optimal in every situation, but it often gets the job done.

replies(3): >>44384964 #>>44385243 #>>44385508 #
4. bonzini ◴[] No.44384964{3}[source]
Good luck debugging
5. amake ◴[] No.44385243{3}[source]
> All around you, just that it doesn’t make sense for developers to reveal that

OK, but I asked for evidence and people just keep not providing any.

"God is all around you; he just works in mysterious ways"

OK, good luck with that.

replies(2): >>44386914 #>>44388626 #
6. alganet ◴[] No.44385508{3}[source]
I have a complete proof that P=NP but it doesn't make sense to reveal to the world that now I'm god. It would crush their little hearts.
replies(1): >>44386572 #
7. ben_w ◴[] No.44386572{4}[source]
P = NP is less "crush their little hearts" and more "may cause widespread heart attacks across every industry due to cryptography failing, depending on whether the polynomial exponent is small enough".
replies(1): >>44386766 #
8. Dylan16807 ◴[] No.44386766{5}[source]
A very very big if.

Also a sufficiently good exponential solver would do the same thing.

9. rvnx ◴[] No.44386914{4}[source]
Billions of people believe in god(s). In fact, 75 to 85% of the world population, btw.
replies(3): >>44386983 #>>44387302 #>>44388915 #
10. amake ◴[] No.44386983{5}[source]
And?
replies(1): >>44389395 #
11. latexr ◴[] No.44387302{5}[source]
And not that long ago, the majority of the population believed the Earth was flat and that cigarettes were good for your health. Radioactive toys were being sold to children.

Wide belief does not equal truth.

12. alganet ◴[] No.44388915{5}[source]
Billions of people _say_ they believe in god. It's very different.

--

When you look at church attendance, the figure drops to roughly 50% of the population instead of 85%:

https://en.wikipedia.org/wiki/Church_attendance#Demographics

If you investigate other aspects of religious belief, like how many Christians read the Bible, the numbers drop drastically, to less than 15%:

https://www.statista.com/statistics/299433/bible-readership-...

This demonstrates that we cannot rely on self-reporting to understand religious belief. In practice, most people are closer to atheists than believers.

replies(1): >>44389320 #
13. fc417fc802 ◴[] No.44389320{6}[source]
That's rather silly. Neither of those things is a requirement for belief.
replies(1): >>44389506 #
14. fc417fc802 ◴[] No.44389395{6}[source]
Obviously it's the basis for a religion. We're to have faith in the ability of LLMs. To ask for evidence of that is to question the divine. You can ask a model itself for the relevant tenets pertaining to any given situation.
15. alganet ◴[] No.44389506{7}[source]
You can believe all you want, but practice is what actually matters.

It's the same thing with AI.