Hacker News

Toxicity on Social Media – The Noisy Room

72 points by skm today at 7:31 AM | 45 comments

Comments

robot-wrangler today at 10:55 AM

This is an amazing analysis and presentation, and it has a call to action at the end. Some of this guy's other stuff: https://tobias.cc/reading

The only point I'd add is that it's not handling time evolution in wicked problems quite right. I agree that the noisy room is distorting the world in exactly the ways described. But what if we've been in there so long, and the world has become so distorted, that reality itself slides toward the once-extreme positions? This is easiest to see with the climate-change controversy, since that is the way that sort of thing happens, regardless of whether you think it's happened yet. Cascade, phase change, and collapse don't just call a truce.

So you have to anticipate that, acknowledging the pessimist is actually right, and that systems are a real bitch. Then you point out that if we're already doomed, we have nothing to lose by trying. Systems are complex after all, that's the whole problem.. so if we miscalculated on the doom, then bothering to try actually saves us. Checkmate pessimists.

card_zero today at 11:09 AM

OK, so to start with you're saying that there's a small noisy pro- side, and a small noisy anti- side, and a moderate majority. But then suddenly:

> The Majority Goes Silent - When the majority of people looks at the feed and assumes they're outnumbered, people will often self-censor.

That's not the same thing, is it? Here the majority is, say, anti-, but they are being frightened by a noisy pro- minority. They're moderates in the sense that anti- is the conventional position to take. But they have opinions. (They could also be in the minority, and this fear of speaking up would still be a bad thing.)

Otherwise, if they're truly moderate, but are frightened into silence supposedly, what would they be saying if they dared? "Everybody listen to me, I have no strong opinion on this matter"?

robot-wrangler today at 11:31 AM

Hackers might be interested to know that there's an "open questions" section at the end of TFA. Some of it probably wants simulation, some wants theorems.

The camel-ai framework and its publications seem related and useful, like maybe this one: https://github.com/camel-ai/agent-trust

Several model checkers also have primitives for working with common knowledge.

> Learning a fact changes what you know. Seeing it displayed publicly — where everyone else can see it too — where you know others can also see it, changes what everyone knows, and subsequently how they act.

An important piece of technical vocabulary; it really seems we need it to talk about a lot of problems lately.
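A minimal sketch of the distinction the quote is drawing, using a toy model I'm inventing here (the `Agent` class and its two knowledge sets are illustrative, not from the article or any epistemic-logic library): private learning updates what one agent knows, while a public announcement also tells every agent that every other agent saw it, which is what coordinated action depends on.

```python
# Toy illustration: private learning vs. public (common) knowledge.
# These names and the two-level knowledge model are hypothetical simplifications;
# full common knowledge is an infinite tower ("I know that you know that I know...").

class Agent:
    def __init__(self, name):
        self.name = name
        self.knows = set()           # facts this agent knows
        self.knows_all_know = set()  # facts this agent knows *everyone* knows

def tell_privately(agent, fact):
    # The agent learns the fact, but gains no information about others.
    agent.knows.add(fact)

def announce_publicly(agents, fact):
    # Everyone sees the fact, and everyone sees that everyone saw it.
    for a in agents:
        a.knows.add(fact)
        a.knows_all_know.add(fact)

def will_coordinate(agents, fact):
    # Coordinated action needs more than individual knowledge:
    # each agent must also know that the others know.
    return all(fact in a.knows and fact in a.knows_all_know for a in agents)

alice, bob = Agent("alice"), Agent("bob")
tell_privately(alice, "p")
tell_privately(bob, "p")
print(will_coordinate([alice, bob], "p"))  # False: both know p, neither knows the other does

announce_publicly([alice, bob], "p")
print(will_coordinate([alice, bob], "p"))  # True: p is now (approximately) common knowledge
```

This is the gap the banner idea in TFA targets: a poll result shown publicly on the feed does something a privately known statistic cannot.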

tolerance today at 11:33 AM

This proposal assumes that the social media of the next five years will be anything like the social media of the last ten.

It's an interesting initiative though, one that I also think could have unintended consequences, seeding greater distrust in the media (which isn't necessarily a bad thing). But I imagine that the people who already sense this distrust and distaste toward the impression of polarization the media gives are becoming less and less likely to subject themselves to the nude opinions of anonymous strangers online.

JimmyBuckets today at 10:39 AM

This seems like a great idea, even without the linked surveys. Two questions I have:

- how does this handle the fact that a lot of accounts on social media platforms are bots that may be controlled by a small number of people?

- how do we actually get this implemented?

seltzerboys today at 11:03 AM

This article is awesome, but it doesn't acknowledge that the problem has been maliciously manufactured by social media companies. They have no incentive to curb the distortion of extremism, so any grassroots attempt to do so will likely not be effective. Then there's the bot problem, but that is probably easier to address if we actually committed to doing so.

vintermann today at 11:26 AM

The "random sample" part of the solution is good. The "trusted polls" part of the solution is not good, because who decides if a poll is trusted? There are certainly a lot of polls I don't trust, because I suspect them of

1. Cheating or being lazy with the sampling

2. Being a weasel with the phrasing to get the desired result

3. Being a push poll.

Still, a "trusted" poll is slightly better than a freeform "community note", especially if it sticks solely to how prevalent an opinion is.

Slashdot used random sampling in moderation 30-ish years ago. It worked OK, except that scores were used for very little (crucially, they didn't even sort by them), and they used a more gameable, non-randomized system to moderate the random system. And of course it was probably vulnerable to Sybil attacks.
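The random-sample mechanic described above can be sketched in a few lines. This is a hypothetical illustration, not Slashdot's actual implementation; the function names and parameters are my own. The core idea: randomly deputize a small subset of users, then aggregate their votes, so that a small noisy minority can't self-select into moderation — unless, as noted, they run a Sybil attack and inflate their share of the sample by controlling many accounts.

```python
import random

def pick_moderators(users, k, rng=random):
    # Randomly deputize k users. The randomness is the defense: no one
    # can volunteer their way into the moderator pool. A Sybil attacker
    # with many accounts raises their odds in proportion to accounts held.
    return rng.sample(users, k)

def comment_score(votes):
    # Aggregate +1 / -1 moderation votes into a simple comment score.
    return sum(votes)

# Example run with a fixed seed for reproducibility.
rng = random.Random(42)
users = [f"user{i}" for i in range(100)]
mods = pick_moderators(users, 5, rng)
print(mods)                          # 5 distinct randomly chosen users
print(comment_score([1, 1, 1, -1]))  # 2
```

The parent's caveat also shows up clearly here: computing a score is the easy part; what the platform *does* with it (sorting, hiding, banners) is where the design actually bites.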

(By the way, I guessed 4% for the number of toxic users)

dependsontheq today at 11:19 AM

I have been working on a monitoring and prebunking system for digital manipulation and disinformation. We are focusing not on the content or narrative but on the psychological patterns and manipulation techniques that are used.

It's the most disturbing thing I have ever worked on; there is much more out there than most people realize, and a lot of it uses deceptive dark patterns.

If somebody is interested in talking more about this or is working on similar things, you're always welcome!

neogodless today at 10:55 AM

There's money in politics and money in social media.

And the money decides how to run the circus. Not for the benefit of all.

So it is a really hard problem.

hermitcrab today at 10:30 AM

New social trends and technologies frequently cause some level of moral panic. Moral panics of the past have been caused by all sorts of things that now seem rather quaint: novels, bicycles, comics, television, videos, heavy metal, Dungeons & Dragons, etc. But social media feels very different. It really does seem to be causing major societal disruption.

po1nt today at 11:01 AM

I have been on social media since sharing Zynga game invites made up the majority of posts. I've seen countless magic bullets attempting to fix the polarization issues: algorithm adjustments, fact checkers, community notes.

I feel like the real problem is the people. Many of us just want to be told what to think to blend in with society, some of us demonstrate Dunning-Kruger publicly, and a few of us really want to drive the polarization for clout and attention.

Every day I see people promote increasingly stupid ideas on both sides, further pushing my belief that the only solution is to severely limit what government can do, therefore making all this discussion pointless.

ketzu today at 11:08 AM

> toxic tweets receive ~86% more retweets

The part that annoys me about the toxicity, or repetitive and annoying topics on Reddit, HN, etc., is not that I am unaware that the content is produced by a small fraction. (I underestimated the count! I guessed 2%.)

It's that people espouse it: They upvote and retweet it.

> Both sides develop wildly inaccurate beliefs about who the other side actually is.

That was a guess I had for a while. People have a strawman version of their out-groups in mind, and if an unknown person says something that indicates they might be part of the out-group, they quickly map that person onto the strawman.

> What percentage of the other side supports political violence?

It would be interesting to see the in-group statistic as well: "What percentage of your own side supports political violence?" In my experience, people also justify very shitty behavior as long as it's from their in-group. (This plays heavily into the first point of espousing all kinds of shit.)

---

It would be interesting to see if the community check actually changes anything. But the actual data seems to be only possible for very generic topics - those we already have the data on. Something that would not be available for daily-fresh topics.

For my personal sanity I simply left reddit and stopped opening comments on certain HN posts - of course that does not help with the societal problems. Unfortunately.

boxed today at 11:31 AM

Huh. I guessed 13% of Democratic voters as LGBTQ, and the correct answer is 6%. But if you look at Wikipedia, the global numbers for gay, lesbian, and bi should be above 6%. That's weird. I would expect Democrats to be slightly above the general numbers...

camillomiller today at 11:09 AM

Fantastic presentation. Unfortunately the conclusion is painfully naive and forgives the platforms too much.

>We Could Do This Now - Platforms already have a lot of these capabilities. They already survey users. They even know how to run sophisticated polls. There are a few technical details to work out (spec here), but this is not a hard problem to solve.

Why do you think something like this is not already implemented? Platforms literally profit from this division, so why would they be incentivised to do anything? What's needed is not a goodwill gesture from the overly powerful platforms; it's fast, hard, and deep regulation.

wrxd today at 10:28 AM

The claim that this isn't a hard problem to solve seems very optimistic to me.

The tiny minority dominates the feeds because that's how the incentives for algorithm-driven social media are structured. Do we really expect Meta, X, or TikTok to do anything that could reduce engagement?

Good luck having any of the mainstream social media apps add the banner they propose.

rapnie today at 9:46 AM

Great article format with all the dynamic widgets in it. Will have to give this a good read. It is a very interesting topic given how much of (global) public opinion is formed through "social" media.

api today at 10:33 AM

“The nuts are always the loudest” has been an observation forever.

This shows how those dynamics play out in the social media system.

brepppt today at 9:58 AM

"What percentage of the other side supports political violence"

Both Democrats and Republicans estimated 30%, but in reality only 10% of both sides supported political violence.

That number is crazy in so many ways, and the post is overly nonchalant about it. The "distortion" isn't what's worrying here.
