Hacker News

TheLNL · today at 5:37 AM

Humans have a mechanism for making live changes to their neural networks and cleaning up messes while sleeping. I see no reason LLMs couldn't do the same, other than that it is resource-intensive (a cost that will keep falling).


Replies

shinycode · today at 7:59 AM

The analogy holds technically, but there's a missing piece: the brain doesn't just update weights; it does so guided by experience that matters to a situated, embodied agent with drives and stakes. Sleep consolidation isn't random cleanup; it's selective, based on salience and emotion. An LLM updating its weights more efficiently is progress, but it's still optimizing a loss function. Whether that ever approximates what the brain does during sleep depends entirely on whether you think the what (weight updates) is sufficient, or whether the why (relevance to lived experience) is what makes it meaningful. So yes, the resource argument will weaken over time. But the architectural gap may be deeper than just compute.
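The salience point can be sketched in code. This is a minimal, hypothetical illustration (the events and salience weights are invented, not from any real system): a "consolidation" pass that replays memories in proportion to how much they mattered, contrasted with a uniform pass that treats every sample the same, the way a plain loss-minimization sweep over a corpus would.

```python
import random

# Hypothetical memory buffer: events and salience scores are invented
# purely to illustrate selective vs. uniform consolidation.
experiences = [
    {"event": "routine commute", "salience": 0.10},
    {"event": "near-miss accident", "salience": 0.90},
    {"event": "new coworker's name", "salience": 0.60},
    {"event": "lunch menu", "salience": 0.05},
]

def uniform_replay(memories, k):
    # "Loss-function" style: every stored sample is equally likely.
    return random.choices(memories, k=k)

def salience_weighted_replay(memories, k):
    # "Sleep-consolidation" style: emotionally/behaviorally salient
    # experiences are rehearsed far more often than mundane ones.
    weights = [m["salience"] for m in memories]
    return random.choices(memories, weights=weights, k=k)

random.seed(0)
replayed = salience_weighted_replay(experiences, k=1000)
counts = {m["event"]: 0 for m in experiences}
for m in replayed:
    counts[m["event"]] += 1
```

With these weights, the near-miss gets rehearsed roughly an order of magnitude more than the lunch menu, which is the structural difference being pointed at: not *how* the weights get updated, but *which* experiences drive the updates.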