> I trust none of us would presume that the decentralized labor of pen & paper calculations somehow instantiated a “psychology” in the sense of a mind experiencing various levels of despair
Your argument is based on an appeal to intuition. But the scenario you ask people to imagine is profoundly misleading in scale. Let's assume a modern frontier model of around 1 trillion parameters, and that the math is being done by an immortal monk who can perform the calculation for one weight per second.
The monk will generate the first "token", about 4 characters, in 31,688 years. In a bit over 900,000 years, the immortal monk will have generated a single Tweet.
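The arithmetic behind those figures is easy to check. This is just a sketch: the model size and one-calculation-per-second rate come from the scenario above, while the ~116-character tweet length is my assumption, chosen to match the ~900,000-year figure.

```python
# Back-of-the-envelope check of the monk's timeline.
# Assumptions: 1e12 weights, one weight-calculation per second,
# ~4 characters per token, and a ~116-character tweet.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, ~31,557,600 s
PARAMS = 1_000_000_000_000             # one full forward pass = 1e12 seconds

years_per_token = PARAMS / SECONDS_PER_YEAR
print(f"First token after ~{years_per_token:,.0f} years")  # ~31,688

tweet_tokens = 116 / 4  # assumed tweet length, in tokens
print(f"One tweet after ~{years_per_token * tweet_tokens:,.0f} years")
```

At one second per weight, a single forward pass alone dwarfs recorded history; the tweet estimate just multiplies that by a handful of tokens.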
At that point, I no longer have any intuition. The sort of math I could do by hand in a human lifetime could never "experience" anything.
But I can't rule out the possibility that 900,000 years of math might become a glacial mind, expressing a brief thought across a span of time far greater than the human species has existed.
As the saying goes, sometimes quantity has a quality all its own.
(This is essentially the "systems response" to Searle's "Chinese room" argument. It's an old discussion.)
In discussions like this, we're always going to bottom out at certain assumptions we bring with us, so I agree.
One reason I like bringing up examples like this (the xkcd in the sister reply is also good) is that it makes our assumptions really visible. The scales are large in both space and time precisely to emphasize how much weight is being given to functional equivalence.
I feel pretty confident most people wouldn't presume that doing a bunch of math by hand on paper can create glacial epiphenomenal experiences (though I do like the term).
Another thing that's interesting to me is that the converse assumption, i.e. a strong allegiance to functionalism, ends up feeling far more idealistic than you might expect. A box of gas, left on its own for long enough, will engage in a pattern of collisions that, under a certain interpretive framework, corresponds to an LLM forward pass. Under another, it can be a game of minesweeper.
The individual particles, of course, couldn't care less whether you see them as part of one or the other. Yet your ability to see them in light of the first one is perhaps enough for the lights to truly turn on, if transiently, in some mind somewhere.
I don't personally believe LLMs are sentient, but I've always enjoyed this thought experiment: https://xkcd.com/505. I have a signed copy framed on my wall.