
mrob · yesterday at 10:57 PM · 2 replies

How can consciousness be possible without internal state? LLM inference is equivalent to repeatedly reading a giant look-up table (a pure function mapping a list of tokens to a set of token probabilities). Is the look-up table conscious merely by existing or does the act of reading it make it conscious? Does the format it's stored in make a difference?
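The "pure function" framing can be made concrete with a toy sketch. This is purely illustrative: the hash-based scoring below stands in for a real model's forward pass, and the tiny vocabulary is invented for the example. The point it demonstrates is statelessness — nothing is read or written between calls, so the same token sequence always yields the same distribution, exactly as if you were re-reading one row of a (very large) look-up table:

```python
import hashlib

def next_token_probs(tokens):
    """Toy stand-in for LLM inference: a pure function from a token
    sequence to a distribution over a small vocabulary. No internal
    state is consulted or mutated between calls."""
    vocab = ["the", "cat", "sat", "<eos>"]
    # Derive deterministic pseudo-logits from a hash of the context
    # (a real model would run a forward pass here instead).
    digest = hashlib.sha256(" ".join(tokens).encode()).digest()
    weights = [digest[i] for i in range(len(vocab))]
    total = sum(weights)
    return {tok: w / total for tok, w in zip(vocab, weights)}

# Purity: repeated "reads" of the same context agree exactly.
a = next_token_probs(["the", "cat"])
b = next_token_probs(["the", "cat"])
assert a == b
```

On this view, "memory" in a chat session lives entirely in the growing token list that is fed back in, not inside the function being evaluated.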


Replies

Kim_Bruning · today at 12:43 AM

For all practical purposes, calling it a LUT is too reductive to be useful here, I think. But we can try: leaving LLMs aside for a second, with this LUT reasoning you're applying, would you even be able to prove the existence of a computer?

Chance-Device · yesterday at 11:29 PM

What state is lacking? There is an output that requires computation to produce. The model is the state. The computation must be performed for each input to produce a given output. What are you even objecting to?