The Case For Banning Facial Recognition Systems Altogether

With automated electronic surveillance systems, suspicion does not precede data collection but is generated by the analysis of the data itself.

Manager of China's BYD tries facial recognition on January 10, 2018.
Urvashi Aneja and Angelina Chamuah *


NEW DELHI — The Delhi police reportedly used automated facial recognition software (AFRS) to screen the crowd during Prime Minister Modi's election rally in Delhi last December. This was also the first time Delhi police used facial images collected across protests in Delhi to identify protesters at the rally.

New categories of deviance such as "habitual protesters" and "rowdy elements" have emerged as the faces of protesters are matched against existing databases and retained for future law enforcement. Police departments in a growing number of states also claim to be using facial recognition and predictive analytics to capture criminals.

The railway system, meanwhile, is developing plans to use AFRS at stations to identify criminals, linking it to existing databases such as the Criminal Tracking Network. The Telangana State Election Commission is considering using AFRS to identify voters during the municipal polls in Telangana. And the Home Ministry recently announced its intention to install the world's largest AFRS to track and nab criminals.

AFRS adds to a growing list of surveillance systems already in place in India, such as NATGRID and the Central Monitoring System, even as there continues to be little publicly available information about these programs. A recent study by Comparitech places India just behind China and Russia in terms of surveillance and failure to provide privacy safeguards.

It can have a chilling effect on society.

Automated facial recognition systems are a direct threat to the right to privacy. Unlike CCTV cameras, they allow for the automatic tracking and identification of individuals across place and time. Footage from surveillance cameras can be easily cross-matched, and combined with different databases, to yield a 360-degree view of individuals. As facial recognition systems combine constant bulk monitoring with individual identification, anonymity is further rendered impossible — there is no protection, or safety, even in numbers.

But much more is at stake than individual privacy.

AFRS can have a chilling effect on society, making individuals refrain from engaging in certain types of activity for fear of the perceived consequences of the activity being observed. As Daragh Murray points out, this chilling effect results in the curtailment of a far greater set of rights, such as the freedom of expression, association, and assembly. Taken together, this can undermine the very foundations of a participatory democracy.

With AFRS and other automated electronic surveillance systems, suspicion does not precede data collection, but is generated by the analysis of the data itself. To avoid suspicion, people will refrain from certain types of activity or expression; and the worry, or threat, of not knowing what data is being collected or how it is being combined and analyzed can result in the self-censorship of a wide range of activities. Such surveillance, as Christian Fuchs points out, first creates a form of psychological and structural violence, which can then turn into physical violence.

Further, because surveillance operates as "a mechanism of social sorting" — of classifying individuals based on a set of pre-determined characteristics and their likelihood of posing a risk to society — the chilling effect is likely to be felt most severely by communities that already face discrimination. Such social sorting is also likely to inflame identity politics in India, enforcing and deepening social divisions.

This is also why critiques of AFRS that point to their low accuracy rates or failure to identify certain skin tones miss the point entirely. A more accurate system would pose an even greater threat to privacy and participatory democracy, and would be an even more powerful tool of social sorting.

Much of the criticism around the deployment of AI-based technologies has highlighted issues of discrimination and exclusion; and how this can result in the violation of human rights. But the case of AFRS shows how AI systems can not only result in the violation or loss of rights, but are also productive of certain types of behavior — creating a disciplinary society.

Further, because chilling effects in some sense rest on the occurrence of non-events — namely not engaging in particular types of activities — frameworks based on identifying discrete violations of rights are likely to be inadequate. The case of AFRS thus highlights how conversations around AI governance need to move beyond the identification of immediately visible harm, at an individual level, to ask what kind of transformations are taking place at a structural level — how values of privacy, liberty, democracy and freedom are being recast.

In India, as elsewhere, surveillance technologies have entered the public domain through a narrative of both safety and protection (plus consumer convenience and personalization). The rhetoric of safety, for example, is behind the recent allocation of Rs. 250 crore from the Nirbhaya fund for the installation of facial recognition cameras at 983 railway stations across the country.

Automated facial recognition software earlier procured to trace missing children in the country is now being used to sort and profile citizens, dissenters and peaceful protesters. This shows the folly of searching for the "good" use-cases of AI. Like other surveillance techniques, AFRS is also being routinized and normalized through the promise of consumer personalization and convenience — whether the embrace of facial recognition to unlock an iPhone or people voluntarily signing up for AFRS at airports.

Mark Andrejevic has argued, for example, that the "key to the creation of digital enclosures today is the emphasis that has been given to the technologies of liberation, in particular, mobile phones and social networking sites." This "domestication of the discourse of interactivity" has been crucial for expanding the means of surveillance. As a result, as Lyon notes, references to an Orwellian dystopia are "rendered inadequate because of the increasing importance of nonviolent and consumerist methods of surveillance."

AFRS requires the collection of biometric facial data from all, not only the targets of surveillance.

With various government ministries seeking to employ AFRS, many have called for regulating its use, restricting it to specified and necessary judicial processes. But regulating its use is not enough. Even if AFRS were permitted in only a few select instances, or only after due process had been followed, the chilling effect on democracy would remain.

At a more practical level, the effectiveness of AFRS requires the collection of biometric facial data from all individuals, not only the targets of surveillance or those suspected of criminal activity. Selective use also contributes to normalization and routinization (and over time, even more effective AFRS). Let's not forget that many surveillance technologies are first tested in the criminal justice system before they are deployed for the broader public.

Even with adequate legal safeguards and perfectly accurate facial recognition systems, the harms to society far outweigh any possible benefits. We need to ban the use of AFRS altogether — to establish this as a necessary red line to preserve the health and future of democracy. Even if effecting such political change seems a distant prospect in the current political climate, it is urgent to start building at least a normative consensus within civil society.

This conversation has already started in other corners of the world. San Francisco has already banned the use of AFRS by the police and all municipal agencies, and the EU is considering banning the technology in public spaces for five years. Neither goes far enough. An even better example could be Portland, Oregon, which is considering banning the use of AFRS by both government agencies and private businesses.

While India continues to lack any frameworks for the governance and regulation of AI-based technologies, the case of AFRS highlights how this is an urgent priority.

AFRS will soon be complemented by systems for emotion and gait recognition; technologies that detect heartbeats and microbiomes are also under development. We need to act now: as these technologies become more embedded in not only governance systems but also consumer habits, there will be fewer opportunities for course correction.

*Urvashi Aneja is co-founder and director of Tandem Research, an interdisciplinary research collective in Goa, India. Angelina Chamuah is a research fellow with the group.


What It Means When The Jews Of Germany No Longer Feel Safe

A neo-Nazi has been buried in the former grave of a Jewish musicologist Max Friedlaender – not an oversight, but a deliberate provocation. This is just one more example of antisemitism on the rise in Germany, and society's inability to respond.

At a protest against antisemitism in Berlin

Eva Marie Kogel


BERLIN — If you want to check the state of your society, there's a simple test: as the U.S. High Commissioner for Germany, John Jay McCloy, said in 1949, the touchstone for a democracy is the well-being of Jews. This litmus test is still relevant today. And it seems Germany would not pass.

Incidents are piling up. Most recently, groups of neo-Nazis from across the country traveled to a church near Berlin for the funeral of a well-known far-right figure. He was buried in the former grave of Jewish musicologist Max Friedlaender, a gravesite chosen deliberately by the right-wing extremists.

The incident at the cemetery

They intentionally chose a Jewish grave as an act of provocation, trying to gain maximum publicity for this act of desecration. And the cemetery authorities at the graveyard in Stahnsdorf fell for it. The church issued an immediate apology, calling it a "terrible mistake" and saying they "must immediately see whether and what we can undo."

There are so many incidents that get little to no media attention.

It's unfathomable that this burial was allowed to take place at all, but now the cemetery authorities need to make a decision quickly about how to put things right. Otherwise, the grave may well become a pilgrimage site for Holocaust deniers and antisemites.

The incident has garnered attention in the international press and it will live long in the memory. Like the case of singer-songwriter Gil Ofarim, who recently claimed he was subjected to antisemitic abuse at a hotel in Leipzig. Details of the crime are still being investigated. But there are so many other incidents that get little to no media attention.


The grave of Jewish musicologist Max Friedlaender

Jens Kalaene/dpa/ZUMA

Crimes against Jews are rising

Across all parts of society, antisemitism is on the rise. Until a few years ago, Jewish life was seen as an accepted part of German society. Since the attack on the synagogue in Halle in 2019, the picture has changed: it was a bitter reminder that right-wing terror against Jewish people has a long, unbroken history in Germany.

Stories have abounded about the coronavirus crisis being a Jewish conspiracy; meanwhile, Muslim antisemitism is becoming louder and more forceful. The anti-Israel boycott movement BDS rears its head in every debate on antisemitism, just as left-wing and post-colonial thought have become part of every such discussion.

Jewish life needs to be allowed to step out of the shadows.

Since 2015, the number of antisemitic crimes recorded has risen by about a third, to 2,350. But victims only report around 20% of cases. Some choose not to because they've had bad experiences with the police, others because they're afraid of the perpetrators, and still others because they just want to put it behind them. Victims clearly hold out little hope of useful reaction from the state – so crimes go unreported.

And the reality of Jewish life in Germany is a dark one. Sociologists say that Jewish children are living out their "identity under siege." What impact does it have on them when they can only go to nursery under police protection? Or when they hear Holocaust jokes at school?

Germany needs to take its antisemitism seriously

This shows that the country of commemorative services and "stumbling blocks" placed in sidewalks as a memorial to victims of the Nazis has lost its moral compass. To make it point true north again, antisemitism needs to be documented from the perspective of those affected, making it visible to the non-Jewish population. And Jewish life needs to be allowed to step out of the shadows.

That is the first thing. The second is that we need to talk about specifically German forms of antisemitism. For example, the fact that in no other EU country are Jewish people so often confronted about the Israeli government's policies (according to a survey, 41% of German Jews have experienced this, while the EU average is 28%). Projecting the old antisemitism onto the state of Israel offers people a more comfortable target for their arguments.

Our society needs to have more conversations about antisemitism. The test of German democracy, as McCloy called it, starts with taking these concerns seriously and talking about them. We need to have these conversations because it affects all of us. It's about saving our democracy. Before it's too late.
