Over the past week, the algorithms that shape my social media feeds have been serving up tons of content about the Major League Baseball playoffs. This is because the algorithms know that I am a fan of the Mets, who, you should know, have been on a surreal playoff run for the last two weeks.
A lot of that content is the usual: sportswriter opinion pieces or interviews with players talking about how their teams are “a great group of guys just trying to go out there and win one game at a time,” or team accounts rallying their fan bases with slick highlight videos or “drip reports” on the players’ fashion choices.
But there’s been a lot of uglier stuff too: Padres and Dodgers fan pages threatening each other after some on-field tension between the two teams and their opposing fanbases last week. Or a Mets fan page declaring “war” on Phillies fans who had been filmed chanting “f*ck the Mets” on their way out of their home stadium after a win. Or a clip of a Philly fan’s podcast in which he mocked Mets fans for failing to make Phillies fans feel "fear" at the Mets' ballpark.
As a person who writes often about political polarization for a living, my first thought upon seeing all this stuff was: aha, further evidence that polarization is fueling a deep anger and violence in American life, which is now bleeding into sports, making players more aggressive and fans more violent.
But in fact, there isn’t much evidence for this. Baseball games and crowds are actually safer now than in the past.
I had fallen for distorted social media reflections of the real world. It's what some experts call the "funhouse mirror" aspect of the internet.
One of those experts is Claire Robertson, a postgraduate research fellow in political psychology at NYU and the University of Toronto, who studies how the online world warps our understanding of the offline world.
Since Robertson recently published a new paper on precisely this subject, I called her up to ask why it’s so easy for social media to trick us into believing that things are worse than they actually are.
Part of the problem, she says, is that “the things that get the most attention on social media tend to be the most extreme ones.” And that’s because of a nasty feedback loop between two things: first, an incentive structure for social media where profits depend on attention and engagement; and second, our natural inclination as human beings to pay the most attention to the most sensational, provocative, or alarming content.
“We’ve evolved to pay attention to things that are threatening,” says Robertson. “So it makes more sense for us to pay attention to a snake in the grass than to a squirrel.”
And as it happens, a huge number of those snakes are released into social media by a very small number of people. "A lot of people use social media," says Robertson, "but far fewer actually post – and the most ideologically extreme people are the most likely to post."
People with moderate opinions, which is actually most people, tend to fare poorly on social media, says Robertson. One study of Reddit showed that just 3% of accounts, the ones spewing hate, generated 33% of all content. Another revealed that 80% of fake news on Facebook came from just 0.1% of all accounts.
“But the interesting thing,” she says, “is, what’s happening to the 99.9% of people that aren’t sharing fake news? What's happening to the good actors? How does the structure of the internet, quite frankly, screw them over?”
In fact, we screw ourselves over, and we can’t help it. Blame our brains. For the sake of efficiency, our gray matter is wired to take some shortcuts when we seek to form views about groups of people in the world. And social media is where a lot of us go to form those opinions.
When we get there, we are bombarded, endlessly, with the most extreme versions of people and groups – “Socialist Democrats” or “Fascist Republicans” or “Pro-Hamas Arabs” or “Genocidal Jews” or “immigrant criminals” or “racist cops.” As a result, we start to see all members of these groups as hopelessly extreme, bad, and threatening in the real world too.
Small wonder that Democrats’ and Republicans’ opinions of each other in the abstract have, over the past two decades, gotten so much worse. We don’t see each other as ideological opponents with different views but, increasingly, as existential threats to each other and our society.
Of course, it only makes matters worse when people in the actual real world are committed to spreading known lies – say, that elections are stolen or that legal immigrants who are going hungry are actually illegal immigrants who are eating cats.
But what’s the fix for all of this? Regulators in many countries are turning to tighter rules on content moderation. But Robertson says that’s not effective. For one thing, it raises “knotty” philosophical questions about what should be moderated and by whom. But beyond that, it’s not practical.
“It's a hydra,” she says. “If you moderate content on Twitter, people who want to see extreme content are going to go to 4chan. If you moderate the content on 4chan, they're going to go somewhere else.”
Rather than trying to kill the supply of toxic crap on social media directly, Robertson wants to reduce the demand for it, by getting the rest of us to think more critically about what we see online. Part of that means stopping to compare what we see online with what we know about the actual human beings in our lives – family, friends, neighbors, colleagues, classmates.
Do all “Republicans” really believe the loony theory that Hurricane Milton is a man-made weather event? Or is that just the opinion of one particularly fringe Republican? Do all people calling for an end to the suffering in Gaza really “support Hamas,” or is that the view of a small fringe with outsized exposure on social media?
“When you see something that’s really extreme and you start to think everybody must think that, really think: ‘Does my mom believe that? Do my friends believe that? Do my classmates believe that?’ It will help you realize that what you are seeing online is not actually a true reflection of reality.”