Future

Welcome To The Metadata Society — And Beware

There's a potentially sinister side to the crush of data we unwittingly feed into systems like Google, which can use the information not only to make money, but ultimately to control us.

Leaving digital thumbprints all over
Adrian Lobe

MUNICH — Every day, Google processes 3.5 billion search queries. Users google everything: resumes, diseases, sexual preferences, criminal plans. And in doing so, they reveal a lot about themselves; more so, probably, than they would like.

From the aggregated data, conclusions can be drawn in real time about the emotional balance of society. What's the general mood like? How's the buying mood? Which product is in demand in which region at this second? Where is credit often sought? Search queries are an economic indicator. Little wonder, then, that central banks have been relying on Google data to feed their macroeconomic models and thus predict consumer behavior.
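How a search-volume series can feed such a model is easy to sketch. The snippet below is only a toy illustration of the "nowcasting" idea, not any central bank's actual model; the series, coefficients and figures are invented for the example.

```python
# Toy "nowcasting" sketch (invented data, not any central bank's real model):
# regress a consumption indicator on its own lag plus a search-volume index.
import numpy as np

rng = np.random.default_rng(0)
months = 48
search_index = rng.uniform(40, 100, months)                 # e.g. queries for "car dealer"
consumption = 20 + 0.5 * search_index + rng.normal(0, 5, months)

# Design matrix: intercept, last month's consumption, this month's search index.
X = np.column_stack([
    np.ones(months - 1),
    consumption[:-1],      # autoregressive term
    search_index[1:],      # contemporaneous search volume
])
y = consumption[1:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# "Predict the present": estimate the latest month from last month's consumption
# and the live search index, before official statistics are published.
nowcast = coef @ np.array([1.0, consumption[-2], search_index[-1]])
print(f"nowcast: {nowcast:.1f}   actual: {consumption[-1]:.1f}")
```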

The search engine is not only a seismograph that records the twitches and movements of the digital society, but also a tool that generates preferences. And if you change your route based on a Google Maps traffic jam forecast, for example, you change not only your own behavior but also that of other road users by changing the parameters of the simulation with your own data.

The behavior of millions of users is conditioned in a continuous feedback loop.

Using the accelerometers built into smartphones, Google can tell if someone is cycling, driving or walking. And the data flows back into the system: if you click on the algorithmically generated search prediction Google proposes when you type "Merkel," for instance, the probability increases that the autocomplete mechanism will display that suggestion to other users as well. The mathematical models produce a new reality. The behavior of millions of users is conditioned in a continuous feedback loop. Continuous, and controlled.
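The dynamic can be made concrete with a toy simulation. This is not Google's actual ranking logic, only a minimal sketch of the loop described above: each click on a displayed suggestion raises its score, which raises the chance it is shown again, which attracts further clicks.

```python
# Toy autocomplete feedback loop (not Google's real algorithm): suggestions are
# shown with probability proportional to their scores, and every click on a
# shown suggestion feeds back into its score.
import random

random.seed(1)
scores = {"merkel news": 1, "merkel biography": 1, "merkel resignation": 1}

def show_suggestion(scores):
    """Pick one suggestion to display, weighted by its current score."""
    return random.choices(list(scores), weights=list(scores.values()), k=1)[0]

for _ in range(100_000):          # 100,000 simulated users typing "merkel"
    shown = show_suggestion(scores)
    if random.random() < 0.3:     # some users click whatever they are shown
        scores[shown] += 1        # the click boosts the suggestion's rank

print(scores)  # early random advantages get amplified and locked in
```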

The Italian philosopher and media theorist Matteo Pasquinelli, who teaches at the Karlsruhe University of Arts and Design, has put forward the hypothesis that this explosion of data exploitation makes a new form of control possible: a "metadata society." Metadata, he argues, enables new forms of bio-political control over masses and their behavior, whether through online activity on social media channels or passenger flows in public transport.

"Data," Pasquinelli writes, "are not numbers but diagrams of surfaces, new landscapes of knowledge that inaugurated a vertiginous perspective over the world and society as a whole: the eye of the algorithm, or algorithmic vision."

The accumulation of figures and numbers in the information society has reached a point where the numbers themselves become a space and create a new topology. The metadata society can be understood as an extension of the cybernetic control society, writes Pasquinelli: "Today it is no longer a matter of determining the position of an individual (the data), but of recognizing the general trend of the mass (the metadata)."

Deadly deductions

For Pasquinelli, the problem is not so much that individuals are kept under tight surveillance (as they were in East Germany under the Stasi), but that they are measured, and that society as a whole thereby becomes calculable, predictable and controllable. As an example, he cites SKYNET, the NSA's mass surveillance program that identified terror suspects from mobile phone data in the border region between Afghanistan and Pakistan. The program analyzed the daily routines of 55 million mobile phone users and pieced them together like a giant jigsaw puzzle: Who travels with whom? Who shares contacts? Who stays overnight at a friend's house? A classification algorithm analyzed the metadata and calculated a terror score for each user.

"We kill people based on metadata," former NSA and CIA chief Michael Hayden boasted.

The cold-blooded contempt for humanity expressed in this sentence makes one shiver. The military target is no longer a human being, but the sum of his or her metadata. The "algorithmic eye" doesn't see a terrorist, just a suspicious connection in the haze of data clouds. The brutal consequence: whoever produces suspicious links or patterns is liquidated.

Thousands of people were killed in drone attacks ordered on the basis of SKYNET's findings. It is unclear how many innocent civilians died in the process. The methodology is controversial because the machine-learning algorithm learned only from already identified terrorists and blindly reproduced those results. Whoever had the same trajectories — that is, the same metadata — as a terrorist was suddenly considered one himself. The question is how sharply the algorithmic vision is focused.
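Only fragments of SKYNET's design are known from the leaked slides, so the following is a deliberately simplified stand-in rather than a reconstruction. It illustrates the failure mode described above: a score that measures nothing but similarity to already-flagged metadata patterns will also flag anyone, say a journalist, whose travel and phone habits happen to resemble them.

```python
# Simplified stand-in for a metadata-similarity score (NOT the real SKYNET
# pipeline): each person is reduced to a vector of travel and contact metadata,
# and the "risk score" is simply closeness to already-flagged patterns.
import numpy as np

# Invented features: [border crossings per month, distinct SIM cards used,
#                     overnight stays away from home, contacts shared with a flagged person]
flagged_patterns = np.array([
    [8, 3, 10, 5],
    [6, 4, 12, 4],
], dtype=float)

def risk_score(person, flagged):
    """Higher the closer a person's metadata sits to any already-flagged pattern."""
    distances = np.linalg.norm(flagged - person, axis=1)
    return 1.0 / (1.0 + distances.min())

suspect    = np.array([7, 3, 11, 5], dtype=float)
journalist = np.array([7, 2, 11, 4], dtype=float)  # travels and phones the same way

print(f"suspect:    {risk_score(suspect, flagged_patterns):.2f}")
print(f"journalist: {risk_score(journalist, flagged_patterns):.2f}")  # almost as high
```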

Artist's impression of a Skynet 5 satellite — Source: U.S. Navy

"What would it lead to if Google Trend's algorithm was applied to social issues, political rallies, strikes or the turmoil in the periphery of Europe's big cities?" asks Pasquinelli.

The data gurus have an obsession with predicting human interactions the way meteorologists predict the weather. Adepts of the "Social Physics" school of thought, founded by data scientist Alex Pentland, look at the world as if through a high-performance microscope: society consists of atoms, with individuals orbiting their nuclei like electrons in fixed orbits. Facebook founder Mark Zuckerberg, for his part, once said he believed there was "a fundamental mathematical law underlying human social relationships." Love? Job? Crime? Everything is determined, everything predictable! As if society were a linear system of equations in which the variables can simply be eliminated.

Control and predictability

In Isaac Asimov's science fiction series Foundation, the mathematician Hari Seldon develops the fictitious science of psychohistory, a grand theory that combines elements of psychology, mathematics and statistics. Psychohistory models society on the template of physical chemistry: it assumes that the individual behaves like a gas molecule. And as with a gas molecule, the sometimes chaotic movements of a single individual cannot be calculated, but the general course and "aggregate state" of society can be computed with the help of statistical laws.
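The statistical intuition behind the gas-molecule analogy fits in a few lines: a single random trajectory is unpredictable, while the average over a large population obeys a tight law. A minimal sketch with invented parameters:

```python
# Minimal sketch of the gas-molecule intuition: one individual's path is erratic,
# but the population average is tightly pinned down by the law of large numbers.
import numpy as np

rng = np.random.default_rng(42)
people, steps = 100_000, 100
positions = np.zeros(people)
for _ in range(steps):
    # each person takes a step with a tiny common drift plus large personal noise
    positions += 0.01 + rng.normal(0.0, 1.0, people)

print("one individual:   ", round(positions[0], 2))     # anywhere within roughly +/- 30
print("population mean:  ", round(positions.mean(), 3)) # close to 100 * 0.01 = 1.0
print("error of the mean:", round(positions.std() / np.sqrt(people), 3))  # tiny
```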

In one of the novels, Emperor Cleon I says to his mathematician: "You don't need to predict the future. Just choose a future — a good future, a useful future — and make the kind of prediction that will alter human emotions and reactions in such a way that the future you predicted will come to fruition." Even if Seldon rejects this plan as "impossible" and "impractical," it excellently describes the technique of social engineering, in which reality (and sociality) are constructed and individuals are reduced to their physical characteristics.

This manifests a new technique of power: the crowd is no longer controlled, it is predicted. And that is the dialectical point: what can be predicted can also be controlled. If you know where society is heading, groups can be steered in the desired direction through manipulation techniques such as nudging, which exploit their psychological weaknesses.

Love? Job? Crime? Everything is determined, everything predictable!

Recently, an internal Google video was leaked in which the behavioral concept of a "Selfish Ledger" was presented — a kind of central register on which all user data is stored: surfing behavior, weight, health condition. Based on this data, the ledger suggests individualized options for action: eat healthier, protect the environment, support local businesses. Analogous to DNA sequencing, it could carry out a "behavioral sequencing" and identify behavior patterns. And just as DNA can be modified, behavior can be modified too. The end result of this evolution would be a perfectly programmed human being, controlled by AI systems.

What is threatening about this algorithmic regulation is not only the subtlety of a control exercised somewhere in the opaque machine rooms of private corporations, but the prospect that a techno-authoritarian political mode could be installed, in which the masses become a politico-physical quantity. Only what carries a mass of data carries weight in political discourse.

The visionaries of technology think about politics in cybernetic terms: the aim is to avoid "disturbances" and keep the system in balance. The Chinese search engine giant Baidu has developed an algorithm that uses search inputs to predict, up to three hours in advance, where a crowd of people (a "critical mass") will form.
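Baidu has not published how the prediction works, so the sketch below is only a guess at the general shape of such a system, with invented thresholds: count location-linked queries in time windows, extrapolate the recent trend, and raise an alert when the projection crosses a "critical mass" line.

```python
# Guessed shape of a crowd-forecast heuristic (Baidu's actual method is not
# public): bucket location-linked queries into time windows, fit a crude trend,
# and project it a few hours ahead.
from collections import Counter

WINDOW_MIN = 30
HORIZON_WINDOWS = 6      # 6 windows x 30 min = 3 hours ahead
ALERT_THRESHOLD = 1000   # invented number of projected queries per window

def forecast(timestamps_min):
    """timestamps_min: minutes at which queries mentioning one place were made."""
    counts = Counter(t // WINDOW_MIN for t in timestamps_min)
    recent = sorted(counts)[-4:]                    # last four windows
    ys = [counts[w] for w in recent]
    slope = (ys[-1] - ys[0]) / max(len(ys) - 1, 1)  # crude linear trend
    return ys[-1] + slope * HORIZON_WINDOWS

# Synthetic example: queries about one square accelerating over four hours.
timestamps = [minute for minute in range(240) for _ in range(5 + minute // 10)]
projected = forecast(timestamps)
print(projected, "-> alert" if projected > ALERT_THRESHOLD else "-> no alert")
```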

Here, program code becomes a policy of preemptive prevention. The promise of politics is that it remains open to the future and flexible. But when the behavior of individuals, groups and society becomes predictable, political decision-making becomes superfluous. Where everything is determined, nothing can be changed anymore.

Geopolitics

How Thailand's Lèse-Majesté Law Is Used To Stifle All Protest

Once meant to protect the royal family, the century-old law has become a tool for the military-led government in Bangkok to stamp out all dissent. A new report outlines the abuses.

Pro-Democracy protest at The Criminal Court in Bangkok, Thailand

"We need to reform the institution of the monarchy in Thailand. It is the root of the problem." Those words, from Thai student activist Juthatip Sirikan, are a clear expression of the growing youth-led movement that is challenging the legitimacy of the government and demanding deep political changes in the Southeast Asian nation. Yet those very same words could also send Sirikan to jail.

Thailand's lèse-majesté law, Article 112 of the Criminal Code, imposes jail terms of three to 15 years for defaming, insulting, or threatening the monarchy. The law has been on the books since 1908, though it was long applied sparingly, only in cases of direct verbal or written attacks against members of the royal family.


But after the May 2014 military coup d'état, Thailand experienced a first wave of lèse-majesté cases: at least 127 individuals were arrested, prosecuted or detained under a much broader interpretation of the law.

The recent report "Second Wave: The Return of Lèse-Majesté in Thailand" documents how the Thai government has "used and abused Article 112 of the Criminal Code to target pro-democracy activists and protesters in relation to their online political expression and participation in peaceful pro-democracy demonstrations."

Criticism of any 'royal project'

The investigation shows that 124 individuals, including at least eight minors, were charged with lèse-majesté between November 2020 and August 2021, and that 19 of them served jail time. The report links the new wave of charges to the pro-democracy protests that have spread across Thailand over the past year.

Juthatip Sirikan explains that the law is now applied so broadly that people cannot even question government budgets and expenditures that have any connection to the royal family. With an estimated 5,000 ongoing "royal" projects, this stifles criticism of the most basic government decision-making. "Article 112 of lèse-majesté could be the key (factor) in Thailand's political problems," the young activist argues.

In 2020, the opposition Move Forward party questioned royal spending paid for by government departments, including nearly 3 billion baht (about $90 million) from the Defense Ministry and the Thai police for royal security, 7 billion baht budgeted for royal development projects, and 38 planes and helicopters for the monarchy. Earlier, on June 16, 2018, it was revealed that Thailand's Crown Property Bureau had transferred its entire portfolio to the new king, Maha Vajiralongkorn.

Protesters in Bangkok call for the release of political prisoners; graffiti of "112" crossed out on a sidewalk. Photo: Peerapon Boonyakiat/SOPA Images via ZUMA Wire

Freedom of speech at stake

"Article 112 shuts down all freedom of speech in this country", says Sirikan. "Even the political parties fear to touch the subject, so it blocks most things. This country cannot move anywhere if we still have this law."

The student activist herself was charged with lèse-majesté in September 2020, after simply citing a list of public documents that refer to royal family expenditure. Sirikan comes from a family that has faced the consequences of decades of political repression. Her grandfather, Tiang Sirikhan, was a journalist and politician who openly protested against Thailand's involvement in World War II. He was accused of being a Communist and abducted in 1952. According to Sirikhan's family, he was killed by the state.

The new report was produced by the International Federation for Human Rights (FIDH), Thai Lawyers for Human Rights (TLHR), and Internet Law Reform Dialogue (iLaw). It accuses Thai authorities of an increasingly broad interpretation of Article 112, to the point of "absurdity," including charges against people for criticizing the government's COVID-19 vaccine management, wearing crop tops, insulting the previous monarch, or quoting a United Nations statement about Article 112.

Activist in front of democracy monument in Thailand.

Shift to social media

While in the past the Article was only used against people who spoke about the royals, it is now being used as an alibi for more general political repression — which has also spurred more open campaigning to abolish it. Sirikan recounts recent cases of police charging people for spreading paint near a picture of the king during a protest, or even just for having a picture of the king as their phone wallpaper.

The more than century-old law is now largely playing out online, where much of today's protest in Thailand takes place. Sirikan says people are willing to go further on social media to expose information the mainstream media rarely report on, such as how the king intervenes in politics and how the monarchy accumulates its wealth.

Not surprisingly, social media is heavily monitored, and the military is involved in intelligence operations and cyberattacks against human rights defenders and critics of any kind. In October 2020, Twitter took down 926 accounts linked to the army and the government that promoted them and attacked the political opposition, and this June Google removed two maps containing the pictures, names, and addresses of more than 400 people accused of insulting the Thai monarchy. "They are trying to control the internet as well," Sirikan says. "They are trying to censor every content that they find a threat."
