> because the reader must now sift the synthetic context for whatever the document was originally about
> time wasted using AI on tasks that did not need it, on artifacts no one will read, on processes that exist only because the tool made it cheap to construct them. On decks that spell out things that previously didn’t even need to be said or were assumed.
I work at MSFT and, at least in my org, this is happening at warp speed. With every document I read, my first thought is: what is the kernel of the idea the writer was trying to convey? Because 95% of the content is just verbiage. You can always tell it's verbiage: the em-dashes, the rhythmic text, the green check mark emoji, etc. We are hoping that volume of output will make up for the quality, or lack thereof. More markdown files, more AGENTS.md files, but is that making us better developers? It certainly gives the illusion that we are faster, but I don't know how management thinks this will lead to tangible impact on the top line or bottom line.
In my experience, some of the best writing (in design docs and PM specs) at MSFT has been human-written. You can see the clarity of purpose from the writer; there is no need to read it again, and it is equivalent to having a 1-on-1 with the writer themselves. But AI-written slop, the less said the better.
This piece hits home. I wonder what the experience is like at other Big Tech companies.