> The 196k context length that M2.7 was natively trained up to represents neither a hard technical ceiling (this is metadata that can easily be adjusted) nor a meaningful degradation threshold
FWIW, I find that in OpenCode it starts becoming erratic after around 80k tokens (sometimes less).
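To illustrate the "metadata" point from the quote: in HuggingFace-style checkpoints the advertised context window is typically just a field in `config.json` (e.g. `max_position_embeddings`), which can be edited without retraining. A minimal sketch, assuming a hypothetical local M2.7 checkpoint laid out that way (path and new limit are made up; raising the number does not by itself make long-context behavior any less erratic):

```python
import json

# Hypothetical path to a local HuggingFace-style checkpoint directory
CONFIG_PATH = "M2.7/config.json"

with open(CONFIG_PATH) as f:
    config = json.load(f)

# Bump the declared context window (assumed new value for illustration).
# This only changes what the model *claims* to support; quality past the
# training distribution is not guaranteed.
config["max_position_embeddings"] = 262144

with open(CONFIG_PATH, "w") as f:
    json.dump(config, f, indent=2)
```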