From India (Its No. 2 Market), Doubts About Facebook Speech Monitoring
India knows about the power of Facebook. And a single 'board' will likely be insufficient to incorporate more diverse opinions and contexts into the company's content moderation practices.
BANGALORE — In January 2019, Facebook made public its blueprint for an independent ‘content oversight board' — a high-level committee tasked with the oversight of the social media giant's content moderation decisions.
The board, framed as an appeals court for decisions made by the company's operational side, will have the unenviable task of adjudicating and informing how Facebook regulates the online habits of 2.3 billion users.
This decision is significant for a number of reasons — not only because it will affect a user base that is roughly a third of the world's population, but also because it could bring about a paradigm shift in the online governance of speech, a notoriously secretive and unaccountable process.
It has even been referred to as Facebook's ‘constitutional moment'. It is crucial to examine what this decision portends for the future of online speech in India, and globally.
Facebook presently takes decisions regarding third-party content through a patchwork of policy formulations, including both its public-facing ‘community standards', which vaguely define the contours of permissible speech, and more confidential and intricate rules that deal with content in greater specificity, including within national and local contexts.
At a more practical level, the daily task of applying the rules, identifying disputed posts and deciding whether to retain or remove them, is delegated to thousands of human workers or, in some cases, to automated systems. The human moderators receive minimal training and low pay, and because the rules are vaguely formulated, they must take decisions on content based on subjective considerations.
Both the formulation of Facebook's content rules and their enforcement are opaque, error-prone and lack meaningful accountability. The company's ‘global' rules and policies on content indicate a distinct bent towards American free-speech traditions and an ignorance of social, political and cultural realities elsewhere, including in India, its second-largest market.
A recent study by Equality Labs documents Facebook's failure to consistently or effectively moderate hate speech targeted against minorities in India, indicating the company's unwillingness or inability to grapple with harmful content which is outside of its own cultural and legal context. Leaked documents, for example, suggest that Facebook asks its moderators to vet content from India on grounds of ‘hurting religious sentiment' — a vague standard which can pave the way for censorship.
The arbitrariness and opacity are further compounded by the vagaries of the Indian legal system, where few practical legal avenues are available for individuals either to request that social media companies remove content or to challenge their actions in taking down, blocking, or censoring it. In a situation where the online habits of millions of Indians are in effect governed largely by the private rules and practices of Facebook, its decision to potentially overhaul its governance practices assumes even greater significance.
Facebook's proposal will delegate limited power over content moderation decisions to a proposed Oversight Board, a 40-member panel (initially chosen by Facebook), which will scrutinize the company's adherence to its internal rules and adjudicate whether they were correctly applied. The announcement comes in the wake of increasing criticism of this opacity, and discomfort at the amount of power the company exercises over conversations often involving the world's most politically sensitive issues.
The social media giant has also faced criticism for its handling of content moderation decisions, ranging from the takedown of violent live-streamed videos such as the Christchurch shooting to the removal of content deemed to be ‘coordinated inauthentic behavior' (or ‘fake news' in general parlance) during the Indian elections. In this atmosphere of distrust, the last two years have also seen an exponential increase in political efforts to exercise greater control over the governance of online content — from Singapore's law for curbing ‘fake news' to imminent efforts in India to automatically filter ‘unlawful' speech.
Under attack from all fronts, the proposal can be seen as an attempt to allay fears that Facebook is acting in a motivated manner, detrimental to the interests of its users, as well as an acknowledgement that freedom of expression on such a large public forum should not be governed by a monolithic, for-profit, US-based corporation.
Will Facebook's oversight board solve its crisis? The acknowledgement by Facebook that it exercises too much undemocratic control over the online expression of billions of users is in itself unprecedented, let alone its decision to outsource some of this power to an ‘independent' authority. Facebook, and similar social media platforms, have long shunned responsibility for third-party content, a status that has entitled them to significant legal protection as well as freedom from public scrutiny of their role in governing online speech.
A departure from this position is an acceptance of what has been known for some time now — that social media companies are the primary actors in structuring and moderating online speech, and consequently responsible for privately shaping public and private discourse at an unprecedented scale. We must be wary of how this power is exercised and what it means for the free expression of societies and individuals to be subject to the whims of unaccountable private corporations.
As a recent corporate accountability index which studies online freedom indicates, most major online platforms continue to be non-transparent and unaccountable towards their users and shun responsibility towards fostering both free and equitable online communities.
Delegating this enormous power to an independent board has been described as a ‘constitutional' moment for Facebook, in which it attempts to create a political structure distinct from its commercial and operational motives. Indeed, Facebook's Oversight Board has been compared to a Supreme Court within a constitutional system, one that separates the power of oversight from the executive.
Yet, from the limited information released about the content oversight board so far, there remain some important and uncomfortable questions regarding the true impact of this board.
First, while the proposals repeatedly insist upon the oversight board's independence from Facebook, the body will necessarily be nested within the company's corporate structure, and will be beholden to it. In the event of a conflict between the board's independence and the company's primary obligation to its shareholders, the latter would necessarily prevail as a matter of law, which casts uncertainty on the claims that the decisions of the body will remain independent of Facebook's commercial motives.
Second, the proposal does not go far enough to remedy the problems Facebook has identified. For one, the structure of the Board does not take into account the incomprehensible scale of Facebook's speech regulation — the 40-member panel (one member per 57 million users) can hardly keep track of tens of thousands of decisions made daily, let alone distinguish their varied contexts. Moreover, the company has indicated that, while the Board may bind itself to precedent, it will not directly influence company policy such as the ‘community standards'. This is a serious restriction on the scope of the Board.
Finally, a single Board will likely be insufficient to incorporate more diverse opinions and contexts into its content moderation practices. While Facebook has indicated that board members will represent the ‘entire Facebook community' and not specific constituencies, it is unclear how meaningful representation will work out in practice, and on what grounds the membership of the board is expected to be built.
Focusing on concerns of the ‘entire community' could once again betray an ignorance of community- and context-specific concerns, and could weaken its commitment to diversity, particularly when the large bulk of its user base and revenue is likely to come from countries like India and Bangladesh.
Establishing and enforcing global standards for speech is an enormously complicated and difficult issue, and we should not expect Facebook to bear the entire burden of maintaining a free and equitable online community. Facebook's efforts towards greater independent stewardship of speech regulation should be lauded to the extent that they reckon with Silicon Valley's enormous and undemocratic political power, and other large firms would do well to follow this example and abandon their hubris on matters of speech regulation.
Meaningful reform, however, will have to stem from democratic political communities. Their institutions must be up to the task of framing the appropriate rules and conditions to temper the power of private platforms and introduce transparency and accountability for the future of online speech.
Divij Joshi is a research fellow at the Vidhi Centre for Legal Policy, Bengaluru.