In other words, LLMs are probabilistic, not deterministic.
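A minimal sketch of where the randomness actually lives, using a toy next-token distribution (not a real LLM): the forward pass that produces the probabilities is deterministic; non-determinism enters at the sampling step, and greedy (argmax) decoding removes it.

```python
import random

# Toy next-token probabilities for a fixed prompt (hypothetical values).
probs = {"cat": 0.6, "dog": 0.3, "fish": 0.1}

def sample_token(probs, rng):
    # Stochastic decoding: draw a token in proportion to its probability mass.
    tokens, weights = zip(*probs.items())
    return rng.choices(tokens, weights=weights, k=1)[0]

def greedy_token(probs):
    # Greedy/argmax decoding: same input, same output, every time.
    return max(probs, key=probs.get)

rng = random.Random()
samples = {sample_token(probs, rng) for _ in range(100)}
print(samples)              # typically several distinct tokens
print(greedy_token(probs))  # always "cat"
```

Same distribution, two decoding policies: the "probabilistic" behavior people observe is a property of how we sample, not of the underlying computation.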
Determinism is a red herring here. The problem is that LLMs are inductive systems, not deductive systems. This makes them powerfully general, and yet inherently unreliable.
Dare I say, so are humans?