
WithinReason, today at 8:48 AM

From your first source:

> As a consequence, the model is no longer deterministic at the sequence-level, but only at the batch-level

Therefore the model is deterministic when the batch size is 1.
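For illustration, a minimal sketch of what batch-size-1 determinism looks like with greedy decoding (assuming a Hugging Face causal LM; the model name is just an example, not anything from the sources):

    # Sketch: with a batch of exactly one sequence, greedy decoding, and a fixed
    # hardware/software setup, repeated runs produce the same token sequence.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

    inputs = tok("The batch size is", return_tensors="pt")  # batch size 1

    with torch.no_grad():
        out_a = model.generate(**inputs, do_sample=False, max_new_tokens=20)
        out_b = model.generate(**inputs, do_sample=False, max_new_tokens=20)

    assert torch.equal(out_a, out_b)  # identical outputs run-to-run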

Your second source lists a large number of ways to make LLMs deterministic. The title of your third source is "Defeating Nondeterminism in LLM Inference", which also implies that they can be made deterministic.
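For reference, the usual determinism knobs in a PyTorch-based inference stack look roughly like this (an illustrative sketch, not a list taken from those sources):

    # Common settings for run-to-run reproducibility in PyTorch inference.
    import os
    import random
    import numpy as np
    import torch

    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"  # deterministic cuBLAS GEMMs
    random.seed(0)
    np.random.seed(0)
    torch.manual_seed(0)                      # seeds CPU and CUDA RNGs
    torch.use_deterministic_algorithms(True)  # error if a nondeterministic kernel is used
    torch.backends.cudnn.benchmark = False    # no autotuning that may pick different kernels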

Every single one of your sources proves you wrong, so no more sources need to be cited.