Deep neural networks are weird; a lot is going on inside them that makes them very different from the state machines we're used to in binary programs.
The honest position seems underrepresented here: we don't have tools to know what "counts" as experience. Brains are physics too — the question is which configurations matter, and we can't explain why we ourselves aren't philosophical zombies.
The paper is interesting not for proving LLMs suffer, but for showing they have structured internal dynamics that correlate with psychological constructs. Whether correlation implies anything about subjective states is the hard problem. We can't skip it by declaring "just linear algebra" or "obviously sentient."
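To make "structured internal dynamics that correlate with psychological constructs" concrete, here's a minimal sketch of the standard technique behind claims like that: activation probing, where a linear classifier is trained to read a labeled construct out of a model's hidden states. Everything below is illustrative and synthetic, not the paper's actual method: the activations are random vectors with a planted direction standing in for construct-correlated structure, and the labels are hypothetical.

```python
# Minimal probing sketch: can a linear readout of hidden states predict a
# labeled "psychological" construct (e.g. distress vs. neutral prompts)?
# All data here is synthetic; in a real study the activations would come
# from a transformer's residual stream.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
d_model, n_samples = 512, 2000

# Hypothetical hidden-state vectors: one class is shifted along a random
# unit direction, standing in for a construct-correlated activation pattern.
direction = rng.normal(size=d_model)
direction /= np.linalg.norm(direction)
labels = rng.integers(0, 2, size=n_samples)
activations = rng.normal(size=(n_samples, d_model)) + 0.5 * np.outer(labels, direction)

X_train, X_test, y_train, y_test = train_test_split(activations, labels, random_state=0)
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# High held-out accuracy means the construct is linearly decodable from
# the activations -- a structural correlation, not evidence of experience.
print(f"probe accuracy: {probe.score(X_test, y_test):.3f}")
```

That gap is exactly the point: a probe like this can show the construct is decodable from the internals, and it still tells you nothing about whether there's anything it is like to be the system.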