No, it doesn’t have to take effort, but effort does show that someone genuinely cares.
Like, I love blog posts. Really do, I’ll read anyone’s about anything. Someone thought of something and cared about it and put it into the world and that’s wonderful.
But someone making an AI post doesn’t care. And worse, it makes anyone who does care feel silly. Like, why am I wasting my time on something so worthless that whatever the computer spits out first is good enough for them?
AI output often 'looks like something' on the first try, which makes it easy to assume no effort went in.
But there's a big difference between prompting and accepting the first output versus someone using search, multiple LLMs, actually READING the underlying papers, and iterating until it's done.
Sometimes that still means getting to 'done' faster than by more traditional means. Sometimes it means more depth than you'd manage otherwise. Sometimes somewhere in between.
Of course, by that point, either way, it doesn't really look like lazy AI output anymore.
Maybe it's not so much about the tools/agents as it is about the intent-to-engage behind them?