Hacker News

JAlexoid, last Friday at 9:22 PM

I mean... that's exactly how our own memory works. So in a sense, factually incorrect information from an LLM is about as reliable as someone telling you things from memory.


Replies

dgacmu, last Friday at 10:02 PM

But not really? If you ask me a question about Thai grammar or how to build a jet turbine, I'm going to tell you that I don't have a clue. I have more of a meta-cognitive map of my own manifold of knowledge than an LLM does.