
jldugger, yesterday at 9:17 PM

IDK how it applies to LLMs, but the original meaning was a change in a data distribution over time. Like if you had some model-based app trained on American English, but slowly more and more Spanish-speaking American users adopt your app: the training-set distribution drifts away from the actual usage distribution.

In that situation, your model accuracy will look good on holdout sets but underperform in users' hands, because the holdout set was drawn from the old distribution.
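One sketch of how you might catch this in practice: compare the category mix (e.g. request language) in your training data against live traffic with a divergence score, and alert when it grows. This is an illustrative example, not anything from the comment; the language labels and thresholds are made up.

```python
from collections import Counter
import math

def kl_divergence(p: Counter, q: Counter, eps: float = 1e-9) -> float:
    """KL(P || Q) between two empirical category distributions.

    p, q are raw counts; eps floors q's probabilities so a category
    unseen in training doesn't produce log(0).
    """
    cats = set(p) | set(q)
    p_tot, q_tot = sum(p.values()), sum(q.values())
    total = 0.0
    for c in cats:
        pc = p.get(c, 0) / p_tot
        qc = max(q.get(c, 0) / q_tot, eps)
        if pc > 0:
            total += pc * math.log(pc / qc)
    return total

# Hypothetical language mix: training data vs. live traffic months later.
train = Counter({"en-US": 9000, "es-US": 1000})
live  = Counter({"en-US": 6000, "es-US": 4000})

drift = kl_divergence(live, train)
print(f"KL(live || train) = {drift:.3f}")  # grows as the usage mix shifts
```

The score is ~0 when live traffic matches training and climbs as the mix shifts, so a simple threshold on it can trigger retraining, even while holdout accuracy still looks fine.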