Why can't the LLM/agent edit the context and dump that file if it decides it was dumb to have the whole thing in the context?
The base model is the content. If it reads too much, it becomes the content.
What you want is a harness that continually inserts file portions until a sufficiently bright light bulb goes off.
When they say agentic AI, IT'S BASICALLY:
<command><content-chunk-1/></command>
It's the ugliest string-mashing, nondeterministic garbage; the bearded masters would facepalm.
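For anyone who hasn't seen one of these harnesses from the inside, here's roughly what that loop looks like. This is a minimal sketch, assuming a stand-in `llm.complete(prompt) -> str` call (not any real SDK); the chunk size, the pseudo-XML tags, and the READY/MORE signal are all made up for illustration:

```python
CHUNK_SIZE = 2000  # characters per chunk; purely illustrative


def build_prompt(task: str, chunks_seen: list[str]) -> str:
    # The ugly string mashing: glue the task plus every chunk seen so far
    # into one prompt, wrapped in pseudo-XML "commands".
    body = "\n".join(
        f"<command><content-chunk-{i + 1}>{c}</content-chunk-{i + 1}></command>"
        for i, c in enumerate(chunks_seen)
    )
    return (
        f"Task: {task}\n{body}\n"
        "Reply READY if you have enough context to act, otherwise reply MORE."
    )


def feed_until_lightbulb(llm, task: str, file_text: str) -> list[str]:
    # Keep inserting file portions until the model signals the light bulb
    # has gone off (or the file runs out).
    chunks = [
        file_text[i:i + CHUNK_SIZE]
        for i in range(0, len(file_text), CHUNK_SIZE)
    ]
    seen: list[str] = []
    for chunk in chunks:
        seen.append(chunk)
        reply = llm.complete(build_prompt(task, seen))  # assumed interface
        if "READY" in reply.upper():  # the "sufficiently bright light bulb"
            break
    return seen
```

The "agent" part is just this loop plus a regex or string match over the reply, which is why it's nondeterministic: the stop condition depends on whatever text the model happens to emit.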