Hacker News

sarchertech · yesterday at 6:36 PM

That’s the difference. In practice a human has to commit fraud to do this.

But a human simply using an LLM to generate code can do it accidentally. The difference is that regurgitation of training text is a documented failure mode of LLMs.

And there’s no way for the human using it to be aware it’s happening.


Replies

testing22321 · yesterday at 8:14 PM

You cannot accidentally sign your name saying “this code is GPL compliant.”

If you can’t be sure, don’t sign.