For long context, yes, this is at least plausible: the latest models already reach context lengths of 1M tokens, and perhaps more.