Hacker News

tsunamifury (yesterday at 3:59 PM)

It’s amazing how much you get wrong here: LLM attention layers already are stacked goal functions.

What they lack are multi-turn, long-walk goal functions, which agents are solving to some degree.
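(To make the terms concrete, here is a minimal numpy sketch of what "stacked attention layers" means mechanically. The "goal function" reading is this comment's framing rather than standard terminology, and the weights below are random and untrained, purely illustrative.)

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(x, Wq, Wk, Wv):
    """One scaled dot-product attention layer.

    Under the comment's framing, the softmax over Q @ K.T acts as a
    per-token soft selection: a learned "goal function" deciding which
    context positions matter for the next representation.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

# Stack several attention layers: each layer re-weights the context
# conditioned on the previous layer's output.
rng = np.random.default_rng(0)
seq_len, d = 4, 8
x = rng.normal(size=(seq_len, d))
for _ in range(3):
    Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
    x = x + attention(x, Wq, Wk, Wv)  # residual connection, as in transformers

print(x.shape)  # (4, 8): same shape, repeatedly re-weighted representation
```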


Replies

strken (yesterday at 11:37 PM)

I don't argue that thinking and attention are missing. I argue that they are trying to do the job of human executive function but aren't as good at it.