
Google, Facebook And Beyond: Why Algorithms Discriminate

Yes, technology can be biased, racist and sexist. That's because the creators of algorithms are overwhelmingly white men, and the biased results they produce are becoming a huge problem in an increasingly digitized world.

Google Images search for CEO
Sara Weber

MUNICH – Algorithms decide whom Facebook suggests you befriend and when future self-driving cars will apply the brakes. An algorithm is a sequence of instructions that solves a problem when applied correctly. It is supposed to be systematic and logical, and to produce the same result even on the umpteenth run.

But every algorithm is programmed by humans who are part of various social strata and whose work is influenced, consciously or not, by their preferences and prejudices. And the more the world is driven by technology, the more this is a problem.

"We assume that technology is neutral, but there are no neutral algorithms," says Corinna Bath, a computer specialist who researches technology development and gender issues. She says technology is always influenced by social standing, though very few people realize that.

Algorithms can be downright dangerous if they have been programmed to make racist or sexist decisions. Mind you, this probably doesn't happen on purpose. Most programmers probably aren't aware of the fact that their prejudices are influencing the code they develop.

Blatant sexism in image returns

If you Google the term "CEO," the search engine suggests a disproportionately high number of images of white men. In reality, about 27% of all CEOs in the U.S. are women, but according to a study conducted by the University of Washington, only 11% of the suggested images show women. One of the first female images, by the way, is of the doll CEO Barbie.


"Technological developments are progressing constantly, and if we don't soon start examining algorithms for their ethical suitability, this could cause some serious problems," Bath says. She has observed that antiquated gender roles are perpetuated in technological developments, which means algorithms can actually be more sexist than today's society. What's more, the people who use these algorithms don't question the results they produce but accept them as an accurate depiction of reality.

If you Google a phrase starting with "women should …," Google's search predictions autocomplete the sentence with phrases such as "should not study," "should not wear trousers" and "should not have rights." According to Google, an algorithm generates the suggestions automatically, without human intervention. The algorithm, the search giant says, is based on many objective factors, among them the frequency with which a term is searched, which is why the predicted phrases may seem "strange or surprising."
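The mechanism Google describes, ranking completions by how often they are searched, can be sketched in a few lines. The query log below is invented for illustration; the point is only that a frequency-ranked system surfaces whatever people search most, bias included:

```python
from collections import Counter

# Hypothetical query log, standing in for real search-frequency data.
# The skew here is deliberate and illustrative.
query_log = [
    "women should have rights",
    "women should not study",
    "women should not study",
    "women should not wear trousers",
    "women should not wear trousers",
    "women should not wear trousers",
]

def autocomplete(prefix, log, k=3):
    """Rank completions of `prefix` purely by how often they were searched."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

# The most frequent queries win, regardless of what they say.
print(autocomplete("women should", query_log))
```

Nothing in the ranking logic itself is sexist; the skew comes entirely from the data the system is fed.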

Another description for this search function's behavior would be "discriminatory." The UN Women organization criticized the supposed "autocomplete truth" in 2013 in one of its campaigns, saying that women should have equal rights everywhere, even on Google.

But it's getting worse. A study conducted by Carnegie Mellon University has shown that women get fewer advertisements on Google for well-paid jobs in managerial positions than men. Searching Google for names that sound African-American results in advertisements that imply a criminal record, according to a 2013 study by Harvard Professor Latanya Sweeney.

Even photos of hands are racist

When you Google the term "hand," the search engine primarily returns images of fair-skinned hands. Such symbolic images, also called "stock images," are used to illustrate texts that relate to medical topics, cosmetics or jewelry and in other advertisements. Brown or black hands are very rarely used for these purposes. An initiative called World White Web is attempting to draw attention to the "whiteness" of the Internet by using these same examples. It offers stock images of hands in various skin colors, which can be used free of charge.

Bath says that technology development itself is responsible for what's happening. Programmers are overwhelmingly white men and therefore members of a very homogeneous group. They develop products under the assumption that users will have similar interests and competencies. In addition, programmers often think of a stereotype while working rather than of real people. Telephones designed for senior citizens, for example, will often have large dialing pads because eyesight degenerates with age. But not all old people have poor eyesight: other age-related conditions, such as dementia or hearing difficulties, also influence how elderly people use phones and which functions are relevant to them. Many programmers never consider these factors.

Every new technological development can lead to renewed discrimination, for example when images are automatically tagged with keywords, as is the case on Google and Flickr. A picture of a dachshund will be tagged "dog," while a picture of the Eiffel Tower gets "Paris." When the algorithm was built, it was fed examples meant to teach it to recognize certain categories on its own — or so the programmers promised. But this doesn't always work: Flickr assigned the keyword "ape" to photos of Black people, and Google the term "gorilla." Images of white people, however, were not confused with those of animals. After staunch protests, Google apologized and removed the keyword entirely.
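The tagging failure described here is typical of classifiers trained on skewed examples. What follows is a deliberately minimal sketch, not Google's or Flickr's actual system: a one-nearest-neighbour tagger whose single invented feature (say, average image brightness) was only ever seen in a narrow range for one class:

```python
# Tiny training set: every "person" example happens to be very bright,
# because the people who assembled the data only photographed themselves.
# All feature values and labels are invented for illustration.
training_data = [
    (0.90, "person"),
    (0.85, "person"),
    (0.30, "animal"),
    (0.25, "animal"),
]

def tag(feature):
    """Return the label of the closest training example (1-nearest-neighbour)."""
    nearest = min(training_data, key=lambda ex: abs(ex[0] - feature))
    return nearest[1]

print(tag(0.88))  # a bright image lands near the "person" examples
print(tag(0.40))  # a darker image of a person lands near "animal":
                  # not malice, just a training set that never saw it
```

A production image classifier is vastly more complex, but the failure mode is the same: categories learned from an unrepresentative sample generalize badly to everyone the sample left out.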

But algorithms are also used by companies to determine who among hundreds of applicants will be invited to a job interview. Andreas Dewes, a physicist and data analyst, researches discrimination and ethics in a data-driven society, asking, for example, whether algorithms prefer men over women when interview candidates are chosen.

His simulation, based on more than 10,000 case studies, demonstrates that the more information an algorithm has about a person, the more it is able to discriminate against that person. The discrimination is not always sexist: an algorithm can also discriminate by skin color or social class, information that is readily available online. Talk about Big Data. "The algorithm can be just as discriminatory as a human if, for whatever reason, it has full access to certain sets of information," Dewes says. "A lot can go wrong when using algorithms."
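A toy version of the effect Dewes describes can be simulated directly. Everything below is invented — the groups, the "postcode" proxy, the thresholds — and is not his actual study; it only illustrates how extra information about a person lets a model discriminate in ways a data-blind model cannot express:

```python
import random

random.seed(1)

def make_applicant():
    """Invented applicant: a group label, a skill score, and a postcode
    that happens to correlate perfectly with the group (a proxy feature)."""
    group = random.choice("AB")
    skill = random.random()  # same skill distribution for both groups
    postcode = "north" if group == "A" else "south"
    return group, skill, postcode

applicants = [make_applicant() for _ in range(10_000)]

def invite_blind(skill, postcode):
    # Sees only skill: necessarily treats both groups alike.
    return skill > 0.6

def invite_with_proxy(skill, postcode):
    # Also sees the postcode and, fitted to biased historical decisions,
    # has learned a stricter bar for "south" addresses.
    return skill > (0.6 if postcode == "north" else 0.8)

def invite_rate(model, group):
    pool = [a for a in applicants if a[0] == group]
    return sum(model(skill, pc) for _, skill, pc in pool) / len(pool)

for model in (invite_blind, invite_with_proxy):
    print(model.__name__,
          round(invite_rate(model, "A"), 2),
          round(invite_rate(model, "B"), 2))
```

The blind model invites both groups at roughly the same rate; the model with the proxy reproduces the historical bias even though it never sees the group label itself, which is exactly why "more data" can mean "more discrimination."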

The probability of algorithms finding information that can be used to discriminate increases as more and more data is available about growing numbers of people. This becomes particularly critical when algorithms make decisions that have a profound influence on our lives. "What, for example, am I to do if, based on my online behavioral patterns, I am classified as a terrorist?" Bath asks. "Algorithms can produce incorrect information."

There's only one solution: developers of algorithms must become aware of the problem, ideally during their studies. But users, too, should remember that every algorithm they use was programmed by a human, someone who is fallible and not free from prejudice.
