Hacker News

minimaxir · yesterday at 8:29 PM

Good find, and that's too small a print for comfort.


Replies

ValentineC · yesterday at 9:31 PM

It's also in the linked article:

> GPT‑5.4 in Codex includes experimental support for the 1M context window. Developers can try this by configuring model_context_window and model_auto_compact_token_limit. Requests that exceed the standard 272K context window count against usage limits at 2x the normal rate.
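For context, Codex reads its settings from a TOML config file, so opting in would look roughly like the fragment below. This is a sketch only: the key names come from the quoted passage, but the file path and the specific values are illustrative assumptions, not confirmed by the article.

```toml
# Hypothetical ~/.codex/config.toml fragment (path and values are
# illustrative assumptions, not taken from the article):
model = "gpt-5.4"

# Opt in to the experimental 1M-token context window.
model_context_window = 1000000

# Trigger history compaction before the hard cap is reached.
model_auto_compact_token_limit = 900000
```

Per the quote, anything past the standard 272K window would then count against usage limits at 2x the normal rate.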