Hacker News

Retric · last Friday at 10:50 PM

Thinking is different from forming long-term memories.

An LLM could be thinking in one of two ways: either between adding each individual token, or collectively across multiple tokens. At the individual-token level the physical mechanism doesn't seem to fit the definition, being essentially a reflexive action. Across multiple tokens it's a little more questionable, especially as multiple approaches are used.
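The distinction can be made concrete with a minimal sketch of autoregressive generation (a toy stand-in, not any real model's code): the per-token step is a single fixed computation, while any multi-token "thinking" would have to emerge from repeating that step. The bigram table and function names here are purely illustrative assumptions.

```python
# Toy bigram table standing in for a trained model's forward pass.
TRANSITIONS = {
    "<s>": "the",
    "the": "model",
    "model": "predicts",
    "predicts": "tokens",
    "tokens": "</s>",
}

def next_token(context):
    # One "reflexive" step: a deterministic lookup, analogous to a
    # single forward pass producing the next token from the context.
    return TRANSITIONS[context[-1]]

def generate(max_tokens=10):
    tokens = ["<s>"]
    for _ in range(max_tokens):
        tok = next_token(tokens)
        if tok == "</s>":
            break
        tokens.append(tok)
    return tokens[1:]  # drop the start marker

print(generate())
```

Each call to `next_token` is an isolated computation; whatever spans "multiple tokens" exists only in the loop that feeds each output back in as context.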


Replies

t23414321 · last Friday at 10:59 PM

An LLM is calculated from language, from things said by humans, whether true or not. It's not the anthropomorphic process that using the word "thinking" would suggest (a word chosen to sell well).

> across multiple tokens

But how many? How many of them happen in a single person's life? How many in a single calculation? Does it matter, if the calculation doesn't reflect them but stays the same all along? (Would a conversation with a radio make any sense?)
