Someone asserts, almost religiously, that LLMs do and/or can "think." When asked how to falsify their assertion, perhaps by explaining what exactly "thinking" is in the human brain that it is, or someday will be, possible to emulate...
One mostly sees people aggressively claiming they can’t, ever. On the other side, people seem to simply allow that they might, or might eventually.
Or they just point to the Turing test, which was the de facto standard test for something so nebulous. And behold: LLMs can pass the Turing test. So they think. Can you come up with something better (than the Turing test)?
When asked how to falsify their assertion, perhaps by explaining what exactly "thinking" is in the human brain that it is, or someday will be, possible to emulate...
... someone else points out that the same models that can't "think" are somehow turning in gold-level performance at international math and programming competitions, making Fields Medalists sit up and take notice, winning art competitions, composing music indistinguishable from human output, and making entire subreddits fail the Turing test.
Err, no, that’s not what’s happening. Nobody, at least in this thread (and most others like it I’ve seen), is confidently claiming LLMs can think.
There are people confidently claiming they can't, and then other people expressing skepticism at their confidence and/or trying to get them to nail down what they mean.