Hacker News

tayo42 · yesterday at 2:19 AM · 2 replies

>Especially for LLMs, they are not (till now) learning on the fly.

Was this just awkward phrasing or did something change and they learn after training?


Replies

Dusseldorf · yesterday at 3:07 AM

There have been several projects lately attempting to create running context/memory, and Claude Code also has some concept of continuous conversational memory, but all of these are bolted on at inference time; there's still no concept of conversations feeding back into base model training/weights on the fly.
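To illustrate the distinction being made: a minimal, hypothetical sketch of "memory bolted on at inference time". The class and function names below are made up for illustration; the point is that remembered snippets live in an external store and are only prepended to the prompt, while the model's weights never change.

```python
# Hypothetical sketch: conversation "memory" kept outside the model
# and injected into the prompt at inference time. Nothing here feeds
# back into training or model weights.

class InferenceTimeMemory:
    def __init__(self):
        self.notes = []  # persistent store, separate from the model

    def remember(self, text):
        self.notes.append(text)

    def retrieve(self, query, k=2):
        # Toy relevance score: number of words shared with the query.
        scored = sorted(
            self.notes,
            key=lambda n: len(set(n.split()) & set(query.split())),
            reverse=True,
        )
        return scored[:k]


def build_prompt(memory, user_message):
    # Retrieved memory becomes extra prompt context; the frozen model
    # would then consume this prompt as ordinary input.
    context = "\n".join(memory.retrieve(user_message))
    return f"[context]\n{context}\n[user]\n{user_message}"


mem = InferenceTimeMemory()
mem.remember("user prefers Python examples")
mem.remember("project uses PostgreSQL 16")
prompt = build_prompt(mem, "show me a Python query example")
```

On-the-fly learning, by contrast, would mean the conversation itself updates the model's weights, which none of the systems mentioned above do.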

Skyy93 · yesterday at 11:30 AM

If you are implying that I am a bot myself, I have to disappoint you. I am just not a native speaker, so some phrases may be awkward because I translate German to English.