Hacker News

AlecSchueler, last Friday at 8:17 AM

So we should believe the hallucinations because they sound like something that could be true? Does the LLM in the middle somehow make it more trustworthy than if GP had just shared their own pattern-matching conjecture?


Replies

Random_BSD_Geek, last Friday at 9:27 AM

No. I think LLMs are garbage. Separately, and unrelated, I think Facebook is behind these bills. The LLM may be garbage and still sometimes produce a correct result.
