Hacker News

jcalvinowens | today at 6:29 PM | 3 replies

Based on a lot of real-world experience, I'm convinced LLM-generated documentation is worse than nothing. It's a complete waste of everybody's time.

The number of people I see having email conversations where person A uses an LLM to turn two sentences into ten paragraphs, and person B uses an LLM to summarize the ten paragraphs back into two sentences, is becoming genuinely alarming to me.


Replies

ryandrake | today at 6:54 PM

> The number of people I see having email conversations where person A uses an LLM to turn two sentences into ten paragraphs, and person B uses an LLM to summarize the ten paragraphs back into two sentences, is becoming genuinely alarming to me.

I remember in the early days of LLMs this was the joke meme. But now, seeing it happen in real life is more than just alarming. It's ridiculous. It's the opposite of compressing a payload over the wire: we take our output, expand it, transmit it over the wire, and then compress it again on input. Why do we do this?
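
Loosely, in code (a toy Python sketch of the analogy; expand_with_llm and summarize_with_llm are hypothetical stand-ins, not real APIs):

    import zlib

    message = b"Meeting moved to 3pm. Please review the Q3 draft first."

    # Normal transmission: shrink the payload before it hits the wire,
    # then restore it losslessly on the other side.
    sent = zlib.compress(message)
    received = zlib.decompress(sent)
    assert received == message  # nothing lost, fewer bytes moved

    # The email pattern inverts this: inflate two sentences into ten
    # paragraphs before sending, then summarize them back down on receipt.
    # Unlike zlib, the round trip is lossy and burns compute both ways.
    # sent = expand_with_llm(message)      # 2 sentences -> 10 paragraphs
    # received = summarize_with_llm(sent)  # 10 paragraphs -> ~2 sentences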

claudiulodro | today at 8:05 PM

> Based on a lot of real-world experience, I'm convinced LLM-generated documentation is worse than nothing. It's a complete waste of everybody's time.

I had a similar realization. My team was discussing whether we should hook our open-source codebases into an AI to generate documentation for other developers, and someone asked, "why can't they just generate the documentation themselves with AI?" It's a good point: what value would our AI-generated documentation provide that theirs wouldn't?

The_Fox | today at 10:54 PM

Yesterday my manager sent LLM-generated code that did a thing. Of course I didn't read it; I only read Claude's summary of it. Then I died a little inside.

It was especially unfortunate because, to do its thing, the code required a third party's personal user credentials, including MFA. That's a complete non-starter in server-side code, but apparently the manager's LLM wasn't aware enough to know that.