Hacker News

Nextgrid · yesterday at 8:32 PM

Until they can magically increase context length to a size that can conveniently fit the whole codebase, we're safe.

It seems like the billions spent so far have mostly gone toward talk of LLMs replacing every office worker, rather than any action to that effect. LLMs still have major (and dangerous) limitations that make this unlikely.


Replies

esafak · yesterday at 9:28 PM

Models do not need to hold the whole codebase in memory, and neither do you. You both search for what you need. Models can already memorize more than you!
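To make the point concrete, here is a minimal retrieval sketch in Python (the search_codebase helper, its snippet budget, and the file-extension filter are illustrative assumptions, not anything esafak describes): the agent greps the repository and feeds only the matching snippets into the prompt, rather than the whole codebase.

    import os
    import re

    def search_codebase(root, pattern, max_snippets=20, context_lines=2):
        """Return small code snippets matching `pattern`, instead of whole files.

        The idea: give the model only the hits (plus a little surrounding
        context), so the prompt stays far below any context-length limit.
        """
        regex = re.compile(pattern)
        snippets = []
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                # Illustrative filter: only look at a few source-file types.
                if not name.endswith((".py", ".js", ".ts", ".go", ".rs")):
                    continue
                path = os.path.join(dirpath, name)
                try:
                    with open(path, encoding="utf-8", errors="ignore") as f:
                        lines = f.readlines()
                except OSError:
                    continue
                for i, line in enumerate(lines):
                    if regex.search(line):
                        lo = max(0, i - context_lines)
                        hi = min(len(lines), i + context_lines + 1)
                        snippets.append(f"{path}:{i + 1}\n" + "".join(lines[lo:hi]))
                        if len(snippets) >= max_snippets:
                            return snippets
        return snippets

    # Example: only these snippets (not the whole repo) would go into the prompt.
    # The regex is a placeholder; a real agent would derive it from the task.
    if __name__ == "__main__":
        for snippet in search_codebase(".", r"def\s+handle_request"):
            print(snippet)
            print("---")

Tool-using coding agents follow roughly this pattern: search or navigate first, then read only the files that matter, which is why whole-codebase context windows are not a hard prerequisite.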
