- This is one of those studies that presents evidence confirming what many people already know. The majority of the bad content comes from a small number of very toxic and very active users (and bots). This creates the illusion that a large number of people overall are toxic, and only those who are already deep in these spaces recognize the truth.
It is also why moderation is so effective. You only have to ban a small number of bad actors to create a rather nice online space.
And of course, this is why for-profit platforms are loath to moderate properly. A social network that bans the trolls would be like a casino banning the whales or a bar banning the alcoholics.
- Bad content being shoved in our face is a symptom of the real problem, which is bad mechanics. Solutions that reform the mechanics (e.g. require a chronological feed instead of boosting by likes) are going to be more effective, less divisive, neutral by design, and politically/legally easier to implement.
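A minimal sketch of the mechanics difference described above, in Python. The `Post` record, field names, and numbers are made up purely for illustration and are not any platform's actual ranking code; the point is only that the ranking rule, not the volume of bad posters, decides what gets surfaced.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: float  # seconds since epoch
    likes: int

def chronological_feed(posts):
    """The 'reformed' mechanic: newest first, reactions have no influence on ranking."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts):
    """The status-quo mechanic: whatever draws the most reactions gets boosted."""
    return sorted(posts, key=lambda p: p.likes, reverse=True)

posts = [
    Post("calm_user", "A measured take nobody reacts to", timestamp=100.0, likes=2),
    Post("troll", "Outrage bait that racks up reactions", timestamp=50.0, likes=500),
]

# Chronological: the troll's older post sits below newer content.
# Engagement-ranked: the troll's post is pushed to the top of every feed.
print([p.author for p in chronological_feed(posts)])  # ['calm_user', 'troll']
print([p.author for p in engagement_feed(posts)])     # ['troll', 'calm_user']
```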
- This is intentional: make people think there's nothing online except harmful content, and propose a regulatory solution, which creates a barrier to entry. It's "meta" trying to stop any insurgent network.
- Turns out the kids are alright after all!
- Abstract: "Americans can become more cynical about the state of society when they see harmful behavior online. Three studies of the American public (n = 1,090) revealed that they consistently and substantially overestimated how many social media users contribute to harmful behavior online. On average, they believed that 43% of all Reddit users have posted severely toxic comments and that 47% of all Facebook users have shared false news online. In reality, platform-level data shows that most of these forms of harmful content are produced by small but highly active groups of users (3–7%). This misperception was robust to different thresholds of harmful content classification. An experiment revealed that overestimating the proportion of social media users who post harmful content makes people feel more negative emotion, perceive the United States to be in greater moral decline, and cultivate distorted perceptions of what others want to see on social media. However, these effects can be mitigated through a targeted educational intervention that corrects this misperception. Together, our findings highlight a mechanism that helps explain how people's perceptions and interactions with social media may undermine social cohesion."
- Isn't it still an accurate perception of moral decline? Even if it's only 3% sharing misinfo and toxic posts, it's still 47% that are sharing them, commenting on them, and interacting positively with them. This gives what is, in my opinion, the correct perception that there is moral decline.
- If more people were obligated to undergo KYC to get posting rights, fewer people would be able to plausibly claim to be other than they are.
If more channels were subject to moderation, and moderators incurred penalties for their failures, channels would be significantly more circumspect about what they permitted to be said.
Free speech reductionists: Not interested.
by makeitdouble
1 subcomment
- This study seems to be playing with what toxicity means.
Does the 43% cited at the top of the piece match the criteria they use when digging deeper in the study?
Their specific definition of toxicity is in the supplementary material, and honestly I don't think it matches the spectrum of what people perceive as toxic in general:
> The study looked at how many of these Reddit accounts posted toxic comments. These were mostly comments containing insults, identity-based attacks, profanity, threats, or sexual harassment.
Those are basically very direct, ad hominem comments.
And the example cited:
> DONT CUT AWAY FROM THE GAME YOU FUCKING FUCK FUCKS!
Also, why judge Reddit on toxicity but not on fake news or any other social trait people care about? I'm not sure what the valuable takeaway from this study is; that only 3% of Reddit users will straight-up insult you?
by tsunamifury
0 subcomments
- Any basic nodal theory will help you understand it's not about how many people post, it's about their reach and correlation with viewership across the overall graph.
"A few bad apples spoil the whole bunch" is illustrated to an extreme in any nodal graph or community.
So it's more about how much toxic content is pushed, not how much is produced. At the extreme, a node can be connected to 100% of other nodes and be the only toxic node, yet still make the entire system toxic (see the sketch below).
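A toy sketch of that reach-vs-production gap, assuming a star-shaped follower graph and equal posting volume per user. None of this comes from the study; the network size and names are invented for illustration.

```python
# Toy sketch: a star graph where one node (the "toxic hub") is followed by
# everyone else and is the only producer of toxic content.
NUM_USERS = 100      # hypothetical network size
TOXIC_HUB = 0        # the single toxic node

# Who each user follows: everyone follows the hub; the hub follows no one.
follows = {user: {TOXIC_HUB} for user in range(1, NUM_USERS)}
follows[TOXIC_HUB] = set()

# Production: every user writes the same number of posts, so the hub
# produces only 1% of all content in this 100-user network.
toxic_share_of_production = 1 / NUM_USERS

# Exposure: a feed is "toxic" if it contains any post from the toxic hub.
exposed = sum(1 for user, followees in follows.items() if TOXIC_HUB in followees)
exposure_rate = exposed / (NUM_USERS - 1)   # the hub itself is excluded from the audience

print(f"share of content produced by toxic users: {toxic_share_of_production:.0%}")   # 1%
print(f"share of users whose feed contains toxic content: {exposure_rate:.0%}")       # 100%
```

One node producing 1% of the content still reaches 100% of feeds, which is exactly the gap between "how much is produced" and "how much is pushed."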
by bongodongobob
0 subcomments
- Isn't this just saying they are bad at estimating? It's not like any of these people did any rigorous studies to come to their conclusion.
by JuniperMesos
0 subcomments
- > When US-Americans go on social media, how many of their fellow citizens do they expect to post harmful content?
Just because an American citizen sees something posted on social media in English, it doesn't mean that it was a fellow American citizen who posted it. There are many other major and minor Anglophone countries, and English is probably the most widely spoken second language in the history of humanity. Not to mention that even if someone does live in America and speak English and post online, they are not necessarily a US citizen.