Hacker News

tsunamifury · last Friday at 4:26 PM · 3 replies

Hallucinations are a feature of reality that LLMs have inherited.

It’s amazing that experts like yourself, who have a good grasp of the manifold/MoE configuration, don’t get that.

LLMs, much like humans, weight high-dimensional features across the entire model, project them onto a manifold, and then string together the best-weighted answer via attention.

Just as your doctor occasionally gives you wrong advice too quickly, the model sometimes gets confused, either by lighting up too much of the manifold or by having insufficient expertise.
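
To make that concrete, here is a minimal sketch (Python/NumPy, toy sizes, not any particular model) of the routing-and-weighting behaviour being described: a softmax gate decides which experts "light up", and the output is a weighted blend of the chosen experts.

```python
# Minimal sketch of mixture-of-experts routing for one token.
# All sizes and weights are illustrative assumptions, not a real model.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2

x = rng.normal(size=d_model)                              # one token's hidden state
gate_w = rng.normal(size=(n_experts, d_model))            # router weights
experts = rng.normal(size=(n_experts, d_model, d_model))  # toy expert layers

scores = softmax(gate_w @ x)          # how strongly each expert "lights up"
chosen = np.argsort(scores)[-top_k:]  # keep only the top-k experts
weights = softmax(scores[chosen])     # renormalise over the chosen experts

# The output is a confidence-weighted blend of the chosen experts' opinions;
# if the gate spreads weight over ill-suited experts, the blend can be fluent
# but wrong -- the failure mode people call a hallucination.
y = sum(w * (experts[i] @ x) for w, i in zip(weights, chosen))
print(chosen, weights.round(3), y.shape)
```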


Replies

jakewins · last Friday at 4:41 PM

I asked Gemini the other day to research and summarise the pinout configuration for CANbus outputs on a list of hardware products, and to provide references for each. It came back with a table summarising pinouts for each of the eight products, and a URL reference for each.

Of the eight, three were wrong, and the references contained no information about pinouts whatsoever.

That kind of hallucination is, to me, entirely different from anything a human researcher would do. They would say “for these three I couldn’t find pinouts”, or perhaps misread a document and mix up the pinouts from one model with another’s; they wouldn’t make up pinouts and cite a document that contains no such information.

Of course humans also imagine things, misremember, etc., but what the LLMs are doing is something entirely different, is it not?
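
For what it’s worth, the cross-check a human researcher would do is easy to script: fetch each reference the model cited and flag any that never mention pinouts at all. The product names and URLs below are placeholders, not the actual references.

```python
# Rough sketch of the manual cross-check described above: fetch each cited
# reference and flag any that never mention pinouts. Placeholder data only.
import requests

cited = {
    "Product A": "https://example.com/datasheet-a",
    "Product B": "https://example.com/datasheet-b",
}

for product, url in cited.items():
    try:
        page = requests.get(url, timeout=10).text.lower()
    except requests.RequestException as exc:
        print(f"{product}: could not fetch {url} ({exc})")
        continue
    if any(term in page for term in ("pinout", "pin out", "pin assignment")):
        print(f"{product}: reference at least mentions pinouts")
    else:
        print(f"{product}: reference never mentions pinouts - treat that row as suspect")
```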

acdha · last Friday at 6:18 PM

> Hallucinations are a feature of reality that LLMs have inherited.

Huh? Are you arguing that we still live in a pre-scientific era where there’s no way to measure truth?

As a simple example, I asked Google about houseplant biology recently. The answer was very confidently wrong, telling me that spider plants have a particular metabolic pathway because it had confused them with jade plants, which are often mentioned alongside them. Humans wouldn’t make this mistake, because they’d either know the answer or say that they don’t. LLMs do it constantly because they lack understanding and metacognitive abilities.

freejazz · last Friday at 6:12 PM

> Hallucinations are a feature of reality that LLMs have inherited.

Really? When I search for cases on LexisNexis, it does not return made-up cases which do not actually exist.
