Hacker News

alpaca128 · today at 1:34 AM

So type errors are not hallucinations in your book, but "a reference to something which doesn't exist at all" is?

In the context of AI, most people I know use "hallucination" to mean wrong output in general, not just fabrications in the literal sense of the word or things you cannot catch in an automated way.


Replies

johnfn · today at 1:40 AM

My statement is that if your only hallucinations are type errors, that can be solved by simply wrapping the LLM in a harness that says "Please continue working until `yarn run tsc` is clean". Yes, the LLM still hallucinates, but it doesn't affect me, because by the time I see the code, the hallucinations are gone.
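The harness described above is just a generate/check/retry loop: run the model, run the type-checker, and feed any errors back until the check passes. Here is a minimal sketch in TypeScript, where `generate` and `typeCheck` are hypothetical stand-ins (a real harness would call an LLM API and spawn `yarn run tsc` as a subprocess):

```typescript
// Sketch of a "retry until the type-checker is clean" harness.
// `generate` and `typeCheck` are stubs standing in for an LLM call
// and a real `yarn run tsc` invocation, respectively.

type CheckResult = { ok: boolean; errors: string[] };

function runHarness(
  generate: (feedback: string[]) => string,
  typeCheck: (code: string) => CheckResult,
  maxAttempts = 5,
): string {
  let feedback: string[] = [];
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const code = generate(feedback);
    const result = typeCheck(code);
    // Only type-clean code escapes the loop, so hallucinated
    // identifiers never reach human review.
    if (result.ok) return code;
    // Feed the compiler errors back into the next prompt.
    feedback = result.errors;
  }
  throw new Error("type errors remain after max attempts");
}

// Toy demo: the "model" corrects itself once it sees an error.
const demo = runHarness(
  (fb) => (fb.length === 0 ? "const x: number = 'oops';" : "const x: number = 1;"),
  (code) =>
    code.includes("'oops'")
      ? { ok: false, errors: ["TS2322: string is not assignable to number"] }
      : { ok: true, errors: [] },
);
console.log(demo); // "const x: number = 1;"
```

The `maxAttempts` cap is worth keeping in practice, since a model can also loop forever on an error it cannot fix.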

This is something I do every day; to be quite honest, it's a fairly mundane use of AI and I don't understand why it's controversial. For context, I've probably generated somewhere on the order of 100k LOC of AI-generated code, and I can't remember the last time I saw a hallucination.
