I think the most interesting predictions are the ones that sound bold and even a little bit insane at the time. I think a lot more highly of the people who were willing to say "ASI will kill us all" 20+ years ago, because they were taking a risk (and were routinely ridiculed for it).
Even today, "ASI will kill us all" can be a pretty divisive declaration - hardly safe and boring.
From the couple of threads I clicked, it seemed like this LLM-driven analysis was picking up on that, too: the top comments were usually bold, and some of the worst-rated comments were "safe and boring" declarations that nothing interesting ever really happens.