Hacker News

esafak · yesterday at 9:28 PM

Models do not need to hold the whole code base in memory, and neither do you. You both search for what you need. Models can already memorize more than you!


Replies

Jensson · yesterday at 9:35 PM

> Models do not need to hold the whole code base in memory, and neither do you

Humans rewire their minds to optimize for the codebase; that is why new programmers take a while to get up to speed. LLMs don't do that, and until they do, they need the entire thing in context.

And the reason we can't do that today is that there isn't enough data in a single codebase to train an LLM to be smart about it, so first we need to solve the problem that LLMs need billions of examples to do a good job. That isn't on the horizon, so we are probably safe for a while.

Nextgrid · yesterday at 9:36 PM

I’ll believe it when coding agents can actually write concise, reusable code instead of reimplementing ten slightly different versions of the same basic thing on every run. This is not a rant; I would love for agents to stop doing that, and I know how to make them: a proper AGENTS.md that serves as a table of contents for where stuff is. But my point is that as a human I don’t need this, and yet for now they still do.
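A minimal sketch of the kind of table-of-contents AGENTS.md described above; the file paths and module names here are hypothetical, purely for illustration:

```markdown
# AGENTS.md (hypothetical example)

## Where things live
- `src/utils/dates.ts` — all date/time helpers; do not write new date parsing elsewhere
- `src/api/client.ts` — the single HTTP client wrapper; reuse it instead of calling fetch directly
- `src/components/forms/` — shared form inputs; check here before creating a new one

## Conventions
- Before implementing any helper, search the modules listed above for an existing version.
```

The idea is that the agent reads this file at the start of every run, so it discovers existing utilities instead of reimplementing them.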
