
For Facebook Moderators, The Soul-Crushing Job Must Go On

Underpaid and overexposed to what in some cases can be truly disturbing content, moderators are the invisible, human grease that keep the social media machine running. It's grueling but essential work that happens behind the scenes.

Facebook moderators, "the clandestine guardians of what a contemporary network puts out."
Maurizio Di Fazio

The message was for Mark Zuckerberg. "Without our work, Facebook would be unusable. Its empire collapses," the founder of the social media titan was told in a letter sent last year and signed by more than 200 people.

"Your algorithms cannot spot satire. They cannot sift journalism from disinformation. They cannot respond quickly enough to self-harm or child abuse," the missive went on to say. "We can."

The we, in this case, are social media content moderators. Employed not only by Facebook, but also by Twitter, TikTok, YouTube and all the other major digital platforms, they are the clandestine guardians of what a contemporary network puts out. It's a crucial profession, but one that also goes largely unseen.


"I believe that the most difficult aspect is the condition of total invisibility in which they are forced to work — for safety reasons, but also to minimize the importance of human work," says Jacopo Franchi, author of the book Obsolete. "Today, it is impossible to establish with certainty whether a moderation decision depends on the intervention of a man or a machine. Moderators are sacrificed in the chase of the illusion of complete editorial automation."

Speed is of the essence, silence is golden

Because technology fails to grasp what we mean by some of our words — and who knows if it ever will — platforms still need someone to sweep the dirt under the carpet, out of sight of the billions of subscribers and advertisers. Someone, in other words, needs to take that stuff down before it infects too many monitors and smartphones.

Digital moderators are men and women without specific skills or specializations, of any ethnicity and background. They're an absolutely interchangeable workforce. To be hired, you just need to be immediately available, have a stable connection and some nerve.

They sift through and possibly delete the millions of anonymous daily posts, videos and stories reported by users. Such content includes child pornography, hate messages, fake accounts, hoaxes, revenge porn, cyberbullying, torture, rape, murder, suicide, local wars and live massacres. These rivers of mud escape the fallible dam of algorithms and can end up making unspeakable horrors go viral. These are the people who resolve machine selection errors, even if, to the end user, everything must appear to be a uniform and indistinct projection of artificial intelligence.

It's essential and misunderstood work. It's also, in many ways, barbaric. "I was paid 10 cents per piece of content," writes Tarleton Gillespie in his Custodians of the Internet. "For this amount I had to catalog the video, published by ISIS, of a boy who had been set on fire."


The custodians work at a frenzied pace, deleting up to 1,500 pieces of content per shift. They handle each item one at a time, following the guidelines provided by the companies: the ever-changing Community Standards, which the moderators refer to as the Bible.

If a post is in a language they don't know, they use an online translator. The important thing is to be fast: They have a few seconds to determine what needs to be removed from our feeds. Valera Zaicev, a former moderator and one of the leading activists in the fight for moderators' rights, said that Facebook even keeps track of their bathroom breaks. Nobody knows anything about their mandate, as they are bound to silence by ironclad confidentiality agreements.

"Content moderators are an example, perhaps the most extreme, of the new forms of precarious work generated and directed by algorithms," says Franchi. "Nobody can say how many there are: We are talking about 100,000 to 150,000 moderators, but it has never been clarified how many of these are hired full time by companies, how many are hired with temporary contracts by subcontracted agencies and how many instead are paid piecemeal on the "gig working" platforms."

Always answering to the algorithm

At Facebook, the most protected moderators in the United States have a stable contract paying about $15 per hour. But there are also roughly 1,600 moderators employed by the contractor Genpact in Hyderabad, India, where they are paid $6 per day, according to Reuters.

The latter are part of a neo-industrial reserve army at the platforms' disposal, thanks to outsourcing companies like TaskUs — people in unspecified corners of the globe, paid peanuts for one gig after another.

Facebook EU HQ in Dublin, Ireland — Photo: Niall Carson/PA Wire/ZUMA

They face immense physical and mental fatigue, commanded by an algorithm: a mathematical-metaphysical entity that never stops and makes for an authoritarian leader.

"It is an algorithm that selects them on LinkedIn or Indeed through deliberately generic job offers," says Iacopo Franchi. "It is an algorithm that organizes social content that can be reported by users. It is an algorithm that plans review queues and it is often an algorithm that determines their score on the basis of their "mistakes' and decides on their possible dismissal."

Yes, if they are wrong in more than 5% of cases, they risk getting the boot.

For those who manage to keep their jobs, it's essential to disconnect completely in their free time. "There are thousands of moderators in the European Union and all of them are working in critical conditions for their mental health," says Cori Crider, director of Foxglove, a pressure group that assists them in lawsuits.

In 2020, Facebook paid $52 million to thousands of moderators who had developed psychological problems due to their work.

Few last more than a few months on the job before being fired for poor performance or leaving of their own volition, no longer able to observe the evil of the world while being unable to do anything other than hide it.


The aftermath can be heavy. The accumulation of bloody visions traces a deep furrow. Who else has ever plunged so deeply into the abysses of human nature?

"Exposure to complex and potentially traumatic contents, as well as information overload, is certainly a relevant aspect of their daily professional experience, but we must also not forget the high repetitiveness of their tasks," says Massimiliano Barattucci, work psychologist and professor of organizational psychology.

"Unlike another new job, that of delivery couriers, content moderators are exposed to all sources of technology-fueled stress," he adds. "And this helps to understand their high turnover and burnout rates, and their general job dissatisfaction."

Alienation and emotional addiction to horror could be just around the corner. "A progressive cynicism can arise, a habit that allows you to maintain detachment from the shocking content they see in their work," says Barattucci. "They may develop disorders such as insomnia, nightmares, intrusive thoughts or memories, anxiety reactions, and in several cases, PTSD."

One day, at the Facebook center in Phoenix, Arizona, everyone's attention was caught by a man threatening to jump from the roof of a nearby building, a former moderator tells The Verge. Eventually, they discovered he was a moderator, a colleague of theirs: He had walked away during one of his two allowed breaks. He wanted to log off the horror.



The Weight Of Trump's Indictment Will Test The Strength Of American Democracy

The U.S. legal system cannot simply run its course in a vacuum. Presidential politics, and democracy itself, are at stake in the coming weeks and months.


File photo of former U.S. President Donald Trump in Clyde, Ohio, in 2020.

Emma Shortis*

-Analysis-

Events often seem inevitable in hindsight. The indictment of former U.S. President Donald Trump on criminal charges has been a possibility since the start of his presidency – arguably, since close to the beginning of his career in New York real estate.

But until now, the potential consequences of such a cataclysmic development in American politics have been purely theoretical.

Today, after much build-up in the media, The New York Times reported that a Manhattan grand jury has voted to indict Trump, and that the Manhattan district attorney will now likely attempt to negotiate Trump's surrender.

The indictment stems from a criminal investigation by the district attorney’s office into “hush money” payments made to the adult film star Stormy Daniels (through Trump’s attorney Michael Cohen), and whether they contravened electoral laws.

Trump also faces a swathe of other criminal investigations and civil suits, some of which may also result in state or federal charges. As he pursues another run for the presidency, Trump could simultaneously be dealing with multiple criminal cases and all the court appearances and frenzied media attention that will come with that.

These investigations and possible charges won’t prevent Trump from running or even serving as president again (though, as with everything in the U.S. legal system, it’s complicated).
