Walking the fine line of curbing the spread of violence and misinformation

Facebook announced this week it would be banning all pages, groups, and Instagram accounts linked to the conspiracy theory movement "QAnon." After months of criticism over delayed content moderation and removal, the company has now labeled QAnon a "militarized social movement," a category prohibited under its current rules, according to a Facebook spokesperson. It's just the latest attempt by the social media giant to walk the fine line of curbing the spread of violence and misinformation across the platform, while being careful not to slow down the constant flow of interactions that drives its billion-dollar business.

While it's a step in the right direction, Facebook has allowed too many of these fringe movement pages and groups to multiply, with many expanding exponentially during the pandemic. Indeed, like COVID-19 itself, the ill effects of social media are a constantly morphing global plague. And most agree that Facebook, estimated to be used by 28% of the global population, still hasn't had a true reckoning with the ways in which it is becoming a tool to undermine democratic systems, both by opponents and by governments themselves.

Targeted in India: In what is often referred to as the world's "biggest democracy," Tsewang Rigzin, an Indian journalist and editor at the popular Jammu-based newspaper State Times, was arrested early last month. The charges had nothing to do with anything he'd written or published in the paper; he was arrested simply for being an "administrator" of a Facebook group.

• Rigzin had established a Facebook group “Ladakh in the Media,” which followed coverage of the Ladakh region of Kashmir, the long-disputed territory that has sparked tensions between India and Pakistan.

• As Rigzin later recounted to the Indian news website The Wire, another user had posted a comment that the BJP government of Prime Minister Narendra Modi deemed to amount to "disobeying a public servant."

• Though Rigzin was released on bail later that same day, his arrest speaks to a broader pattern of Facebook being used around the world, by governments or individuals, to undermine elections in some cases and even democracy itself in others.

Facebook CEO Mark Zuckerberg in Washington in July — Photo: Graeme Jennings/CNP/ZUMA

Rewind, context: Over an 18-month period that included the Brexit vote in the UK, the 2016 U.S. presidential election, and the 2017 elections in France, it became clear that Facebook's algorithm and services were being manipulated for political ends, often (but not only) by Russia.

• In the United States, when Facebook, Twitter and Google were questioned on evidence of Russian interference in the 2016 elections, it emerged that the posts in question had reached more than 126 million users on Facebook, alongside over 131,000 messages on Twitter and at least 1,000 videos on YouTube.

• In France, the run-up to the 2017 presidential election saw a barrage of fake news in the form of rumors, stories and even doctored videos against then-candidate Emmanuel Macron circulating on Facebook. According to Le Monde, some posts were shared more than 250,000 times. One manipulated video, seen more than 15 million times on Facebook, was revealed to have been filmed in Russia, not France.

• Russian interference has also been linked to Italy's 2018 parliamentary elections, benefiting Matteo Salvini as well as his Lega Nord party and the Five Star Movement (M5S). According to a report in Wired Italia, one of the main sources of M5S fake news was the Russian disinformation outlet Sputnik. An analysis by El Pais of more than a million posts from over 98,000 Italian social media profiles revealed that the vast majority of xenophobic fake news came from Sputnik.

However, the problem isn't purely political interference from Russia or other foreign entities. Facebook's inability to curb the spread of misinformation on its own platform reaches further. As with the QAnon movement and other far-right and fringe groups that use the platform to spread their message and organize, Facebook has long come under fire for its inadequate content moderation, policies and removal practices.

• A recent study conducted by the Oxford Internet Institute found that less than 1% of coronavirus-related misinformation videos had been flagged as false or removed under Facebook's content moderation policies.

• Researchers found coronavirus-related misinformation videos originating on YouTube spreading exponentially on Facebook, with some videos shared 20 million times on Facebook between October 2019 and January 2020. The reach of these posts across social media was higher than that of the five largest English-language news sources in the world.

So Zuck? Facebook founder Mark Zuckerberg has attributed the popularity of partisan posts to the inherent virality of provocative content, thanks to its ability to drive conversations and interaction. While conceding that Facebook, and big tech in general, needed a privacy update, he argued in a recent Axios interview that he didn't necessarily believe it was his problem to solve: "I have a little more confidence in democracy than that. And I hope my confidence isn't misplaced." We hope so too.
