Hacker News

energy123 · last Saturday at 7:21 AM

A public warning about OpenAI's Plus chat subscription as of today.

They advertise a 196k-token context length[1], but you can't submit more than ~50k tokens in one prompt. If you do, the prompt goes through, but they silently chop off the tail of your prompt (something like `tokens[:50000]`) before calling the model.

This is the same "bug" that existed 4 months ago with GPT-5.0 which they "fixed" only after some high-profile Twitter influencers made noise about it. I haven't been a subscriber for a while, but I re-subscribed recently and discovered that the "bug" is back.

Anyone with a Plus sub can replicate this by generating >50k tokens of noise, then asking "what is 1+1?" at the end. It won't answer.
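The replication test above can be sketched without any tokenizer. Here whitespace-split words stand in for tokens (a rough approximation; real BPE tokens differ), and the 50k cutoff is the figure reported in this comment, not a documented limit:

```python
# Dependency-free sketch of the suspected behaviour (assumptions: word-level
# "tokens" approximate real tokens; CUTOFF is the reported ~50k figure).

CUTOFF = 50_000

noise_tokens = ["noise"] * 60_000          # filler, well past the reported limit
question_tokens = "what is 1+1 ?".split()  # the actual request, at the end
prompt_tokens = noise_tokens + question_tokens

# Suspected server-side step: silently drop everything past the cutoff
# (something like tokens[:50000]) before calling the model.
seen_by_model = prompt_tokens[:CUTOFF]

print("question survives truncation:", "1+1" in seen_by_model)  # False
```

Because the question sits at the end of the prompt, it falls past the cutoff and the model never sees it, which is why it can't answer.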

[1] https://help.openai.com/en/articles/11909943-gpt-52-in-chatg...


Replies

hu3 · last Saturday at 1:11 PM

Well, this explains the weird behaviour of GPT-5 often ignoring a large part of my prompt when I attached many code/CSV files, despite keeping the total token count under control. That is with GitHub Copilot inside VS Code.

The fix was just to switch to Claude 3.5, and now 4.5, in VS Code.

wrcwill · last Saturday at 12:47 PM

Ugh, this is so amateurish. I swear this has been happening on and off since the release of o3.

scrollop · last Saturday at 7:23 AM

And the xhigh version is only available via the API, not ChatGPT.

ismailmaj · last Saturday at 2:08 PM

"Oh sorry guys, we made the mistake again that saves us X% in compute cost, we will fix it soon!"