>Especially for LLMs, they are not (till now) learning on the fly.
Was this just awkward phrasing, or did something change so that they learn after training?
If you are implying I am a bot myself, I have to disappoint you. I am just not a native speaker, so some phrases may sound awkward because I translate from German to English.
There have been several projects lately attempting to create running context/memory, and Claude Code also has some concept of continuous conversational memory, but all of these are bolted on at inference time; there is still no concept of conversations feeding back into base model training/weights on the fly.