> In which world are these compilers producing non-deterministic output if you run them again and again?
The one where deterministic iteration order (think use of unordered maps/sets, parallel execution, etc.) isn't considered a priority, or is even seen as something to avoid, since determinism there can mean slower compile times. Where heuristics can produce a "tie score" with no effort made to pick the winner deterministically. Where there is a need to support features like __DATE__ and __TIME__. And so on. It is a little out of the way so you've probably never heard of it, but its inhabitants call it "Earth".
And those fictional compilers are? Which fictional compilers, when you run them on source code, produce a program that may or may not run, may or may not produce the desired result, etc.?
Strange how literally no one is talking about this fictional non-determinism where the output of the compiler is completely different from run to run, yet for LLMs (where "LLMs are strictly deterministic" would be the analogous claim) non-determinism is the norm and everyone expects it.
Edit: also note how you went from "any compiler can be made nondeterministic if you ask for it" (i.e. compilers are deterministic, and you have to work to make them non-deterministic) to "most widely recognized and used compilers are non-deterministic by default."
There are specific constructs (undefined behavior) and specific details (like floating-point precision between implementations) that may be specified as non-deterministic. To pretend that this is somehow equal to LLMs is not even being unrealistic. It's to exist on a plane orthogonal to our reality.