Hacker News

anonymous908213 · yesterday at 6:52 PM · 6 replies

Are you seriously suggesting that an LLM is more intelligent than a small child with simple thought processes, almost no language capability, little long-term planning, and minimal ability to form long-term memory? Even with all of those qualifiers, you'd still be wrong. The LLM is predicting which tokens come next, based on a bunch of math operations performed over a huge dataset. That, and only that. It may have more utility than a small child with [qualifiers], but it is not intelligence. There is no intent to deceive.


Replies

ctoth · yesterday at 7:02 PM

A small child's cognition is also "just" electrochemical signals propagating through neural tissue according to physical laws!

The "just" is doing all the lifting. You can reductively describe any information processing system in a way that makes it sound like it couldn't possibly produce the outputs it demonstrably produces. "The sun is just hydrogen atoms bumping into each other" is technically accurate and completely useless as an explanation of solar physics.

mikepurvis · yesterday at 9:43 PM

Short-term memory is the context window, and it's a relatively short hop from the current state of affairs to an MCP server that gives the model access to a big queryable scratch space where it can note down anything it thinks might be important later. That's similar to how current-gen chatbots take multiple iterations to produce an answer: they're clearly not just emitting tokens right out of the gate, but using an internal notepad to iteratively work out an answer for you.

Or maybe there's even a medium-term scratchpad that's managed automatically: it's fed all context as it occurs, and a parallel process mulls over that content in the background, periodically presenting chunks of it to the foreground thought process when they seem relevant.

All I'm saying is that there are good reasons not to consider current LLMs to be AGI, but "doesn't have long-term memory" is not a significant barrier.
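
To make the scratch-space idea concrete, here's a minimal sketch. Everything in it is hypothetical: the Scratchpad class, the note/query names, and the keyword-overlap scoring are illustrative stand-ins rather than a real MCP server, which would expose these operations as protocol tools backed by proper storage and retrieval (embeddings, full-text search, and so on).

```python
# Hypothetical sketch of a queryable scratch space, as described above.
# Nothing here is a real MCP API; it only illustrates the two operations
# such a server would expose as tools: note(...) and query(...).

from dataclasses import dataclass, field
import time


@dataclass
class Note:
    text: str
    created: float = field(default_factory=time.time)  # when the note was taken


class Scratchpad:
    """A long-lived store the model can write to and query across turns."""

    def __init__(self) -> None:
        self._notes: list[Note] = []

    def note(self, text: str) -> None:
        """Record anything that might be important later."""
        self._notes.append(Note(text))

    def query(self, terms: str, limit: int = 3) -> list[str]:
        """Return the notes sharing the most words with the query.

        A real server would use embeddings or full-text search; plain
        keyword overlap keeps this sketch self-contained.
        """
        wanted = set(terms.lower().split())
        scored = sorted(
            self._notes,
            key=lambda n: len(wanted & set(n.text.lower().split())),
            reverse=True,
        )
        return [n.text for n in scored[:limit]]


if __name__ == "__main__":
    pad = Scratchpad()
    pad.note("User prefers metric units in every answer.")
    pad.note("The project deadline is Friday.")
    # A later turn pulls relevant notes back into the model's context.
    print(pad.query("which units does the user prefer"))
```

The same store would also cover the "medium-term" variant: a background process periodically runs query over recent context and feeds the top-scoring notes back to the foreground model.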

pfisch · yesterday at 7:43 PM

Yes. I also don't think it's realistic to pretend you understand how frontier LLMs operate just because you understand the basic principles of the simple, not-very-good LLMs that came before.

It's even more ridiculous than me pretending I understand how a rocket ship works because I know there's fuel in a tank, it gets lit on fire somehow, and the ship is aimed with some fins...

jvidalv · yesterday at 6:56 PM

What is the definition of intelligence?

coldtea · yesterday at 7:18 PM

>The LLM is predicting what tokens come next, based on a bunch of math operations performed over a huge dataset.

Whereas the child does what exactly, in your opinion?

You know the child can just as well be said to "just do chemical and electrical exchanges," right?

nurettin · yesterday at 8:53 PM

Intelligence is about acquiring and utilizing knowledge. Reasoning is about making sense of things. Words are concatenations of letters that form meaning. Inference is tightly coupled with meaning, which is coupled with reasoning, and thus with intelligence. People are paying for these monthly subscriptions to outsource reasoning, because it works. Half-assedly and with unnerving failure modes, but it works.

What you probably mean is that it is not a mind, in the sense that it is not conscious. It won't cringe or be embarrassed the way you do; it costs an LLM nothing to be awkward, and it doesn't feel weird or get bored of you. Its curiosity is mere autocomplete. But a child will feel all that, learn all that, and be a social animal.