And which fictional compilers would those be? What fictional compilers, when run on source code, produce a program that may or may not run, may or may not produce the desired result, and so on?
Strange how literally no one is talking about this fictional non-determinism where the output of the compiler is completely different from run to run, yet for the supposedly "strictly deterministic" LLMs, non-determinism is the norm and everyone expects it.
Edit: also note how you went from "any compiler can be made nondeterministic if you ask for it" (i.e., compilers are deterministic, and you have to work to make them non-deterministic) to "most widely recognized and used compilers are non-deterministic by default."
There are specific cases (undefined behavior) and specific features (like floating-point precision differing between implementations) that may be specified as non-deterministic. To pretend that this is somehow equal to LLMs is not even being unrealistic; it's existing on a plane orthogonal to our reality.
> Like what fictional compilers when you run them on source code produce a program that may or may not run, may or may not produce desired result etc.?
Oh, you are talking about program determinism. I don't know about fictional compilers, but gcc and clang will produce programs that behave differently from compile to compile where __DATE__ or __TIME__ is used. There are languages like Church that are explicitly designed for non-deterministic programs, so a Church compiler would obviously need to adhere to that. And, of course, ick has the mysterious -mystery flag!
But we were talking about compiler determinism: for a stable input, the compiler produces a stable output. A non-deterministic compiler does not necessarily equate to a non-deterministic program, and vice versa. As should be obvious to anyone who has used a computer before, the structure of the binary produced by a compiler and the execution of that binary are separate concerns.
If you wanted to talk about something completely different why not start a new thread?