
Inside Facebook’s Top Secret Moderation Center

Photo of Facebook's Barcelona center, provided by Facebook
Morgane Tual

BARCELONA — It's a large, bright open space in which 80 people work, sitting behind brand-new desks. The grey of the carpet is still pristine, the walls too white, impersonal. Except for a large sticker whose world-famous shape hints at exactly what goes on in this room: a large blue thumb, Facebook's iconic "Like".

In this gleaming Barcelona tower, 800 people spread over six open spaces on several floors work for the social network. Or, more precisely, for the Competence Call Center (CCC), a subcontractor to which Facebook delegates, as it does to others, the moderation of content published by its users.

Their mission is to purge the platform of posts forbidden under Facebook's rules, including pornography, hate content and terrorist propaganda. Through the software they use, moderators receive content flagged by users as well as by an artificial intelligence system. For each piece of content, they must decide whether to delete the post or leave it online. If in doubt, they refer the case to a manager. Their bible? Facebook's "community standards," the social network's very specific rules explaining, for example, that you can show buttocks (as long as it's a wide-angle shot) but not a nipple, except in a work of art or a breast cancer campaign.

We don't get to see this software, nor do we see the moderators at work. When we arrive in the open space, a cheerful noise fills the room: the sound of employees chatting, still at their stations but with their hands away from their keyboards. On their 80 screens, the same still image: the colorful home page of the community standards.

It all goes according to script: in the presence of a journalist, everything must stop. For Facebook, it is absolutely out of the question that we might see even a single screen switched on. "It's to respect user privacy, since we see the names of the people who published the content," says Facebook, which organized the visit.

During this brief visit (we spend only a few minutes in the open space), four people accompany us, and pictures are not allowed; instead, Facebook provides us with its own images. We stay close to the walls, a few meters from the moderators, whom we don't get to speak to at this point. That will happen later, during a brief 15-minute window in which we meet five hand-picked employees, under the supervision of their boss and a Facebook representative.

We can't afford to be picky. For three years now we've been asking the world's largest social network (with more than two billion active users) to give us access to one of its moderation centers, a right that the company has almost never granted to any journalist. In November 2018, the company finally agreed to open its doors, albeit very slightly, offering 12 European journalists access to a center in Barcelona.

It seems to be a huge step forward for Facebook. Until recently, the greatest secrecy reigned around its moderation practices, be it the rules themselves, which are confusing and are only roughly summarized on the site, or the work of its moderation teams, of which we knew nothing. For the US company, this is an extremely sensitive issue, which has led to scandals and accusations of censorship and laxity.

The 2015 terror attacks around Europe marked a turning point. The use of Facebook by the so-called Islamic State (ISIS) to conduct its propaganda and recruit highlighted major shortcomings. Governments then began to raise their voices, to threaten, and even, in the case of Germany, to legislate. As of January 2018, platforms in Germany face a 50-million-euro fine if illegal content remains online for more than 24 hours.

Faced with an emergency, Facebook rolled up its sleeves, and three years later those efforts have paid off: terrorist propaganda has become rare on the social network. But other serious problems have emerged, such as disinformation. The UN, for example, has accused Facebook of letting hate speech against the Rohingya Muslim minority proliferate in Myanmar, inciting violence.

We're trying, very sincerely, to protect people.

In response to the complaints, the company decided to provide more information about its moderation process. Last year, 13 years after its creation, Facebook finally revealed one figure: its moderation teams were composed of 4,500 people worldwide. Today, the scale has changed. Facebook claims 30,000 people are working in this sector, half of them as foot soldiers of moderation, responsible for examining 2 million pieces of content per day.

"There's secrecy, but we're really starting to open up, because we're trying, very sincerely, to protect people," says David Geraghty, who leads Facebook's moderation teams. At the CCC in Barcelona, he mentions as examples of this new openness the online publication of the company's very detailed internal moderation rules, as well as a new "transparency report" on the removed content, available since May.

Why did it take so long? "We're more comfortable with our numbers, which are more accurate today," says Geraghty by way of explanation. But transparency has its limits. How many such centers exist? They won't tell us. Where are they located? Again, nothing. And when you ask Ulf Herbrechter, one of CCC's managers, how many French people work at the Barcelona center, he hesitates, then asks David Geraghty for permission to respond, which he doesn't get.

"We keep certain things secret, such as the location of centers, for security reasons. The YouTube shooting in California in April targeted content managers. That's also why we don't give their names."

The center we are visiting opened its doors in May, in one of Barcelona's business districts. Here, employees from France, Spain, Italy, Portugal, the Netherlands and Scandinavia examine the content from their respective countries. "For the French market, we only recruit French people, not just French speakers," Herbrechter points out. "It is crucial that they know French culture, that they understand the political context, for example."

CCC, a company originally dedicated to call centers, now has 22 offices in Europe, two of which are entirely devoted to moderating Facebook content. The moderators are paid "about 25,000 euros per year," says Herbrechter. At least, that's the figure for the French employees. "Wages vary according to nationality," he adds, calling it a matter of supply and demand.

What does a moderator's typical day look like? Before entering the open space, they must leave their coat, bag and smartphone in a cloakroom, again "to respect user privacy," according to Facebook, and to prevent documents from leaving the center. Once they sit at their workstation, the great flood of content begins. How many pieces of content do they moderate per day? "It depends," says Herbrechter. Do they have quantified targets? "There is a rumor that they only have a few seconds to make a decision. That's not what is happening here. A decision on a photo can be immediate, but if you have to read a long post, it takes time."

So does this mean Facebook's contract with CCC contains no targets for the volume of content processed? "Our contract is quality, not numbers," says Geraghty for Facebook. "Our priority is to make the right decision."


Photo of a break room in Facebook's Barcelona office, provided to Le Monde by Facebook

The moderators sing from the same hymn sheet. Gathered in the presence of their boss in a "break room" with colorful furniture and overflowing fruit baskets, all five of them say they're proud of their work. "We're helpful because we protect people. We clean up," one of them says. We don't get to know their names. All signed a confidentiality agreement when they were hired, prohibiting them from speaking to journalists.

How do they cope with being confronted with this sometimes difficult content every day? "We're warned from the beginning. In general, we have the ability to distance ourselves so that it doesn't affect us," says one of them. "There is often a misconception about what we do," adds a colleague. "You might think you see rape videos all day long, but you don't," she says. Hate content, for example, is more frequent in the French market. "But we know that some countries see more violent things," notes another. Arabic-speaking moderators, for example, were confronted with decapitation videos far more often.

But this work is not as stressful as you might think.

Five psychologists work full-time at the center to support employees. All over the building, stickers encourage moderators to go and see them if they feel "stressed, upset". "People can come after seeing content that has shocked them," explains Natalia, one of the psychologists at the center. "When this happens, it's often because it reminds them of something they've experienced. But this work is not as stressful as you might think; they're not that exposed to things that are very painful to see."

Not everyone agrees, however. In September, a former Facebook moderator in California filed a complaint against the company, claiming to suffer from post-traumatic stress disorder after being exposed to "highly toxic content," according to her lawyers.

What conclusion can be drawn from such a visit? Are these modern offices in Barcelona representative of all of Facebook's moderation centers, about which we know nothing and which are often run by other subcontractors? Geraghty swears the answer is yes. "We have one practically like this one in the Philippines. If you go there, you will see that they don't work in dark basements, as it has been described."

This is a reference to the documentary The Cleaners, which shows the work of Filipino moderators for the major social network platforms and is unflattering to the digital giants. The rare testimonies that some media, including Le Monde, have been able to obtain from former moderators reflect a harsher reality than the one presented in Barcelona: quantified targets, time counted in seconds, psychological damage for those confronted with the most brutal content. But these experiences often date back a few years. Today, the message Facebook hopes to send is clear: Times have changed.


Forest Networks? Revisiting The Science Of Trees And Fungi "Reaching Out"

A compelling story about how forest fungal networks communicate has garnered much public interest. Is any of it true?

Arborist and conservationist Thomas Brail films the roots of a cut tree at a clearcutting near his hometown of Mazamet in the Tarn, France.

Melanie Jones, Jason Hoeksema, & Justine Karst

Over the past few years, a fascinating narrative about forests and fungi has captured the public imagination. It holds that the roots of neighboring trees can be connected by fungal filaments, forming massive underground networks that can span entire forests — a so-called wood-wide web. Through this web, the story goes, trees share carbon, water, and other nutrients, and even send chemical warnings of dangers such as insect attacks. The narrative — recounted in books, podcasts, TV series, documentaries, and news articles — has prompted some experts to rethink not only forest management but the relationships between self-interest and altruism in human society.

But is any of it true?

The three of us have studied forest fungi for our whole careers, and even we were surprised by some of the more extraordinary claims surfacing in the media about the wood-wide web. Thinking we had missed something, we thoroughly reviewed 26 field studies, including several of our own, that looked at the role fungal networks play in resource transfer in forests. What we found shows how easily confirmation bias, unchecked claims, and credulous news reporting can, over time, distort research findings beyond recognition. It should serve as a cautionary tale for scientists and journalists alike.

First, let’s be clear: Fungi do grow inside and on tree roots, forming a symbiosis called a mycorrhiza, or fungus-root. Mycorrhizae are essential for the normal growth of trees. Among other things, the fungi can take up from the soil, and transfer to the tree, nutrients that roots could not otherwise access. In return, fungi receive from the roots sugars they need to grow.

As fungal filaments spread out through forest soil, they will often, at least temporarily, physically connect the roots of two neighboring trees. The resulting system of interconnected tree roots is called a common mycorrhizal network, or CMN.
