Universities have this issue too, despite many offering students and staff Grammarly (Gen AI) while also trying to ban Gen AI.
I'm sure that if a contributor working on a feature used Cursor to generate the initial code but then went over it to ensure it works as expected, that would be allowed. This is more for the folks who just want to jam in a quick vibe-coded PR so they can add "contributed to the QEMU project" to their resumes.
Use AI if you want to, but if the person on the other side can tell, and you can't defend the submission as your own, that's a problem.
The actual policy is "don't use AI code generators"; don't try to weasel that into "use it if you want to, but if the person on the other side can tell". That's effectively "it's only cheating if you get caught".
By way of analogy, Open Source projects also typically have policies (whether written or unwritten) that you only submit code you are legally allowed to submit. In theory, you could take a pile of proprietary reverse-engineered code that you have no license to, or a pile of code from another project that you aren't respecting the license of, and submit it anyway, and slap a `Signed-off-by` on it. Nothing will physically stop you, and people might not be able to tell. That doesn't make it OK.
The rules regarding the origin of code contributions are rather strict: you can't contribute other people's code unless you can verify that the license is appropriate. An LLM may output a copy of someone else's code, sometimes verbatim, without telling you its origin, so you can't contribute code written by an LLM.
Getting AI to remind you of a library's API is a fair bit different from having it generate 1000 lines of code you have hardly read before submitting.