Yep, signals are signals, but I think it's quite complicated now. (In any case, this is still the embryonic era of LLMs).
An interesting point to consider: an author who goes out of their way to hide any LLM influence may actually be degrading the signal. Because in that case, you won't see the LLM's etchings, and you'll misattribute skill to the author, believing no LLM was involved. Complicated times.
> An interesting point to consider: an author who goes out of their way to hide any LLM influence may actually be degrading the signal. Because in that case, you won't see the LLM's etchings, and you'll misattribute skill to the author, believing no LLM was involved. Complicated times.
To someone who treats LLM use as an of-course-I-did-that, other people complaining about LLM tells might seem like complaints about not post-processing the output enough. But they are more likely complaining about the LLM being used in the first place.
I don't particularly care about LLM use per se, but when I see LLM text it makes me think I'm about to read something devoid of content - just word vomit, the equivalent of yesteryear's listicle. This instinct usually serves me well. A good text is a good text; if an LLM wrote all of it, that would be fine with me, but that's usually not how it goes.