People are overestimating the value of having AI create something from loose instructions, and underestimating the value of using AI as a tool for a human to learn and explore a problem space. The bias shows in the terminology (“agents”).
We finally made computers able to speak “our” language, but we still treat them as mere automation. There’s a lot of untapped potential in the other direction, in encoding and compressing knowledge, IMO.
The problem space is rich. The thing doesn’t actually know what a problem is.
The thing is incredibly good at searching through large spaces of information.
> AI create something
To have AI recreate something that was already in its training set.
> in encoding and compressing knowledge IMO.
I'd rather have the knowledge encoded in a way that doesn't generate hallucinations.