I've never seen a summarization error from any modern LLM. What are you even talking about?
LLMs hallucinate when they don't have enough context, not when they're just condensing a message that's already in their context.