I relate to the idea of having a different level of thinking now with AI. How would you judge whether someone is overestimating themselves?
As in: every little thing that used to be too much effort before, I can now easily get the info and the data with a prompt. A data analysis that might otherwise have taken hours to figure out, I can just have AI write scripts for, which lets me see data about everything that was previously out of reach. Now you will probably ask, of course, "how do I know the data is accurate?" I can still cross-reference things, and it is still far faster, because even if I had spent hours before trying to access that data, there would have been no similar guarantee that it was accurate.
I am thinking so much more now about things I couldn't possibly have had time to think about before, because they were so far out of reach, or even unimaginable to do in my lifetime. Now I'm thinking about automating everything, having perfect visualizations, having data about everything, being able to study and learn anything quickly, and so on.
It sounds like you're optimizing for a system of self-deception. If you never check how the data is collated, only whether the collation appears consistent, you will eventually be left with data that merely has the appearance of consistency, regardless of whether it is correct.