> It's a piece of software that predicts the most likely token, it is not and can never be conscious.
A brain is a collection of cells that exchange electrical signals and sodium ions. It is not and can never be conscious.
Except an LLM actually is a piece of software. And the brain is not what you said.
I think this is a useful way to look at things. We often point out that LLMs are not conscious because of X, but we tend to forget that we don't really know what consciousness is. Nor do we really know what intelligence is, beyond the Justice Potter Stewart definition: "I know it when I see it." It's helpful to occasionally remind ourselves how much uncertainty is involved here.