Hacker News

tracerbulletx · today at 1:35 AM

We don't even know what the prerequisites for consciousness are, so we have no way of knowing. LLMs have emergent behavior reminiscent of language-forming brains, but they're also missing a lot of properties that are probably necessary: continuity over time, more integrated memory, and a better sense of space and time. Brains use the rhythm and timing of neuronal firings, the length of axons affects computation, and they do many different things with signals and patterns. But without knowing what consciousness is, I don't know which of those things are required.


Replies

KaiserPro · today at 10:51 AM

> LLMs have emergent behavior reminiscent of language-forming brains

Indeed, but then we need to prove that they are not "Chinese room" conscious. Which is hard, because it might be that the thing running the Chinese room is conscious, but can only communicate in a way it doesn't understand.

boxed · today at 7:30 AM

> We don't even know what the prerequisites for consciousness are, so we have no way of knowing.

IMO we don't even have a definition of the word that we agree on.

throwuxiytayq · today at 7:12 AM

Clive Wearing's memory lasts less than 30 seconds, so he has no memory of being awake before the present moment. He is permanently in a state of feeling like he has just woken up, observing his surroundings for the first time.

Clive Wearing's mind has no time continuity and essentially zero memory integration. Is he not conscious? There are interviews with the guy.

Where on the scale [no mind <-> Clive Wearing <-> healthy human brain] would you put an LLM with a 10M-token context window?