Google, Facebook And Beyond: Why Algorithms Discriminate

Yes, technology can be biased, racist and sexist. That's because the creators of algorithms are overwhelmingly white men, and the online results are becoming a huge problem in an increasingly digitized world.

Google Images search for CEO
Sara Weber

MUNICH — Algorithms decide who Facebook suggests you should befriend and when future self-driving cars will apply the brakes. They are a sequence of orders that can solve a problem, if applied correctly. They are supposed to be systematic, logical and produce the same result even after the umpteenth application.

But every algorithm is programmed by humans who are part of various social strata and whose work is influenced, consciously or not, by their preferences and prejudices. And the more the world is driven by technology, the more this is a problem.

"We assume that technology is neutral, but there are no neutral algorithms," says Corinna Bath, a computer specialist who researches technology development and gender issues. She says technology is always influenced by social standing, though very few people realize that.

Algorithms can be downright dangerous if they have been programmed to make racist or sexist decisions. Mind you, this probably doesn't happen on purpose. Most programmers probably aren't aware of the fact that their prejudices are influencing the code they develop.

Blatant sexism in image returns

If you were to Google the term "CEO," the search engine suggests a disproportionately high number of images of white men. Yet about 27% of all CEOs in the U.S. are women; according to a study conducted by the University of Washington, only 11% of the images suggested are of women. And one of the first female images, by the way, is of the doll CEO Barbie.

"Technological developments are progressing on a constant basis, and if we don't start to examine algorithms on their ethical suitability soon, this could cause some serious problems," Bath says. She has observed that antiquated gender roles are perpetuated in technological developments, which means algorithms are actually more sexist than today's society. And besides, the people who use these algorithms don't question the results they produce but accept them to be an accurate depiction of reality.

If you Google the phrase starting with "women should …," Google search predictions autocomplete the sentence with phrases such as "should not study," "should not wear trousers" and "should not have rights." According to Google, an algorithm automatically generates the suggested responses without any human interference. The algorithm, the search giant says, is based on many objective factors, among them the frequency with which a term is searched. Which is why the predictive phrases may seem "strange or surprising."

Another description for this search function's behavior would be "discriminatory." The UN Women organization criticized the supposed "autocomplete truth" in 2013 in one of its campaigns, saying that women should have equal rights everywhere, even on Google.

But it's getting worse. A study conducted by Carnegie Mellon University has shown that women get fewer advertisements on Google for well-paid jobs in managerial positions than men. Searching Google for names that sound African-American results in advertisements that imply a criminal record, according to a 2013 study by Harvard Professor Latanya Sweeney.

Even photos of hands are racist

When you Google the term "hand," the search engine primarily returns images of fair-skinned hands. Such symbolic images, also called "stock images," are used to illustrate texts that relate to medical topics, cosmetics or jewelry and in other advertisements. Brown or black hands are very rarely used for these purposes. An initiative called World White Web is attempting to draw attention to the "whiteness" of the Internet by using these same examples. It offers stock images of hands in various skin colors, which can be used free of charge.

Google Images search for "hands"

Bath says that technology development itself is responsible for what's happening. Programmers are overwhelmingly white men and therefore members of a very homogenous group. They develop products under the assumption that users will have similar interests and competencies. In addition, programmers will often think of a stereotype while working rather than of real people. So, for example, telephones designed for senior citizens will often have large dialing pads because eyesight degenerates with age. But not all old people have poor eyesight: Other age-related problems, such as dementia or hearing difficulties, also influence the way elderly people use phones and which functions are relevant to them — but many programmers don't think of these influencing factors.

Every new technological development can lead to renewed discrimination, for example, when images are automatically given a keyword, such as is the case on Google and Flickr. A picture of a dachshund, for example, will be given the keyword "dog," while a picture of the Eiffel Tower is given "Paris." When the algorithm was produced, it was fed information and examples to enable it to recognize certain categories on its own — or so the programmers promised. But this doesn't always work. This is demonstrated in photos of black people to which Flickr associated the keyword "ape" and Google the term "gorilla." Images of white people, however, weren't confused with those of animals. Google apologized and removed the keyword completely after staunch protests.

But algorithms are also used by companies to determine which of hundreds of applicants will be invited to a job interview. Andreas Dewes, a physicist and data analyst, researches discrimination and ethics in a data-driven society, asking, for example, whether men are preferred over women when job interview candidates are chosen.

His simulation, based on more than 10,000 case studies, demonstrates that the more information an algorithm has about a person, the better it is able to discriminate against that person. The discrimination isn't always sexist: It can also be based on skin color or social class, information that is readily available online. Talk about Big Data. "The algorithm can be just as discriminating as a human if it, for whatever reason, has full access to certain sets of information," Dewes says. "A lot can go wrong when using algorithms."

The probability of algorithms finding information that can be used to discriminate increases as more and more data is available about growing numbers of people. This becomes particularly critical when algorithms make decisions that have a profound influence on our lives. "What, for example, am I to do if, based on my online behavioral patterns, I am classified as a terrorist?" Bath asks. "Algorithms can produce incorrect information."

There's only one solution. Developers of algorithms must become aware of the problem, ideally during their studies. But users should also be aware that every algorithm they use has been programmed by a human, someone who is fallible and is not free from prejudice.

Support Worldcrunch
We are grateful for reader support to continue our unique mission of delivering in English the best international journalism, regardless of language or geography. Click here to contribute whatever you can. Merci!

Why Chinese Cities Waste Millions On Vanity Building Projects

The so-called "White Elephants," or massive building projects that go unused, keep going up across China as local officials mix vanity with a misdirected attempt to attract business and tourists. A perfect example is the 58-meter statue of Guan Yu, a beloved military figure from the third century, at the center of a $230 million park that nobody seems interested in visiting.

Statue of Guan Yu in Jingzhou Park, China

Chen Zhe

BEIJING — The Chinese Ministry of Housing and Urban-Rural Development recently ordered the relocation of a giant statue in Jingzhou, in the central province of Hubei. The 58-meter, 1,200-ton statue depicts Guan Yu, a widely worshipped military figure from the Eastern Han Dynasty in the Third century A.D.

The government said it ordered the removal because the towering presence "ruins the character and culture of Jingzhou as a historic city," and is "vain and wasteful." The relocation project wound up costing the taxpayers approximately ¥300 million ($46 million).

Huge monuments as "intellectual property" for a city

In recent years, local authorities in China have raced to create what is euphemistically dubbed IP (intellectual property) in the form of a signature building for their city. The negative consequences of such projects have long been visible: first came luxurious government offices, then skyscrapers for businesses and residences, and now cultural landmarks. Even when these "white elephant" projects don't necessarily violate any regulations, those on the scale of the Guan Yu statue are a real problem for society.

It doesn't take much to differentiate between a project constructed to score political points and one destined for the people's benefit: Construction projects that neglect the physical conditions of their location give themselves away right away. The over-the-top government buildings that for years mushroomed in many corners of China, even in the poorest regional cities, are the most obvious examples.

Homebuyers looking at models of apartment buildings in Shanghai, China — Photo: Imaginechina/ZUMA

Guan Yu transformed into White Elephant

A project truly catering to people's benefit would address their most urgent needs and would be systematically conceived of and designed to play a practical role. Unfortunately, due to a dearth of true creativity, too many cities' expression of their rich cultural heritage is reduced to just building peculiar cultural landmarks. The statue of Guan Yu in Jingzhou is a perfect example.

Long ago, Jingzhou was a strategic hub linking the North and the South of China. But its development has lagged behind coastal cities since the launch of economic reform a generation ago.

This is why the city's policymakers came up with the idea of using the place's most popular and glorified personality, Guan Yu (who some refer to as Guan Gong). He is portrayed in the 14th-century Chinese classic "The Romance of the Three Kingdoms" as a righteous and loyal warrior. With the aim of luring tourists, the city leaders decided to use him to create the city's core attraction, their own IP.

Opened in June 2016, the park hosting the statue covers 228 acres. In total it cost ¥1.5 billion ($232 million) to build; the statue alone cost ¥173 million ($27 million). Yet in the more than four years since the park opened its doors, it has brought in a mere ¥13 million ($2 million) in revenue. It was definitely not a cost-effective investment, and the park obviously functions neither as a city icon nor as the cultural tourism brand the city authorities had hoped for.

China's blind pursuit of skyscrapers

Some may point out the many landmarks hyped on social media precisely because they are peculiar, big or even ugly. But this kind of attention will not last and is definitely not a responsible or sustainable concept. There is surely no lack of local politicians who will contend for attention by coming up with huge, strange constructions. For those who can't find a representative figure, why not a 40-meter-tall potato in Dingxi, Gansu Province, a 50-meter peony in Luoyang, Henan Province, or a 60-meter green onion in Zhangqiu, Shandong Province?

It is to stop this blind pursuit of skyscrapers and useless buildings that, earlier this month, the Ministry of Housing and Urban-Rural Development issued a new regulation meant to keep local authorities from straying from people's real needs, wasting absurd sums and over-consuming energy.

I hope those responsible for the creation of a city's attractiveness will not simply go for visual impact, but instead create something that inspires people's intelligence, sustains admiration and keeps them coming back for more.