For the record, brains are also not free of hallucinations.
How much do you hallucinate at work? How many of your work hallucinations do you confidently present as reality in communication or code?
LLMs are being sold as a viable replacement for paid employees.
If they were not, they wouldn’t be funded the way they are.
That’s not a very useful observation though, is it?
The purpose of mechanisation is to standardise and, over the long term, reduce errors to zero.
Otoh, “The final truth is there is no truth.”
Hallucinations are not inherently bad. They add a kind of creativity, which is useful for e.g. image generation, coding, or storytelling.
They are only a problem when reporting on facts.
I still don’t really get this argument/excuse for why it’s acceptable that LLMs hallucinate. These tools are meant to support us, but we end up with two parties who are, as you say, prone to “hallucination”, and it becomes a case of the blind leading the blind. Ideally in these scenarios there’s at least one party with a definitive or deterministic view, so that the other party (i.e. us) has some trust in the information they’re receiving and in any decisions they make off the back of it.