Facebook and other social media companies didn’t cause America’s massive political divide, but they have widened it and pushed it towards violence, according to a report from New York University released Monday.
Why it matters: Congress, the Biden administration and governments around the world are moving past apportioning blame and toward choosing penalties and remedies for curbing online platforms’ influence and fighting misinformation.
Driving the news: Paul Barrett, deputy director of NYU’s Stern Center for Business and Human Rights, and his co-authors reviewed more than 50 social science studies and interviewed dozens of academics, policy experts, activists, and current and former industry people.
They found that while social media platforms are not the root cause of political polarization, they have intensified it. “Social media is the mechanism for spreading the kind of mis- and disinformation that fuels the fire of political polarization,” Barrett told Axios. He said social platforms erode trust and democratic norms in ways that have undermined the U.S. pandemic response and acceptance of the 2020 election results.
Nick Clegg, Facebook’s vice president of global affairs, argued this year that it is not in Facebook’s interest to “push users” toward extremist content. Clegg also highlighted studies about polarization to say the results are mixed, including one that found a break from Facebook did not lessen someone’s negative feelings about the opposite political party. “What evidence there is simply does not support the idea that social media, or the filter bubbles it supposedly creates, are the unambiguous driver of polarization that many assert,” Clegg wrote.
Yes, but: Barrett’s team said the study Clegg cited shows that staying off Facebook does reduce polarization on policy issues, though not partisan affiliation, and that other research indicates Facebook has a heightening effect on polarization.
“It’s important to overcome the message that Facebook has been trying to project that we really can’t tell whether social media use has anything to do with political divisiveness and partisan hatred,” Barrett said. “That just doesn’t match up with facts.”
What’s next: The report offers several recommendations for both the government and platforms. The government, it says, should:
- Mandate more disclosure of companies’ ranking, recommendation and removal algorithms;
- Give the Federal Trade Commission new powers to create industry standards;
- Invest in alternative social media options, such as a PBS for the internet.
The report also recommends that platforms:
- Adjust algorithms transparently to discourage polarization;
- Increase the size of their content moderation teams;
- Hide “like” and share counts to stop rewarding polarizing content.