–Analysis–
MUNICH — Anyone who wants to know how law and legal reality diverge must read the Vienna Convention on Road Traffic, which was approved by the United Nations in November 1968 and is still valid today. “A driver shall not leave his vehicle or his animals without having taken all suitable precautions to avoid any accident and, in the case of a motor vehicle, to prevent its unauthorized use,” says article 23.
It also defines driver as “any person who drives a motor vehicle or other vehicle (including a cycle), or who guides cattle, singly or in herds, or flocks, or draught, pack or saddle animals on a road.” The provision looks a bit outdated because negotiators at that time could never have guessed that there would one day be e-bikes, e-scooters or SUVs.
A changed society does not render its laws invalid or illegitimate. But automation presents enormous challenges to the legal system. When autonomous vehicles roll onto roads in the future, they must respect road traffic regulations: traffic lights, right-of-way rules, prescribed distances. Machines, and not just people, then become addressees of the law.
The Organization for Economic Co-operation and Development (OECD) recently published a report on Governing Transport in the Algorithmic Age, which discusses the possibilities of algorithmic regulation in the transport sector. The report grew out of a workshop organized in December by the International Transport Forum (ITF) and attended by transport experts from around the world, including representatives of Uber, Toyota and Renault.
It’s about adapting laws to the grammar of the code.
Algorithms already act as regulators today: they set dynamic prices, control traffic flows via recommender systems and route planners, decide which vehicle goes where and where people move, and, in ethical dilemmas, whether a robotic vehicle facing an imminent collision drives into the pedestrian or into the motorcyclist. In short, machines orchestrate mobility. Or, as Seleta Reynolds, general manager of the Los Angeles Department of Transportation, put it: “Code is the new concrete, the infrastructure is algorithmic and the city must deliver it instantaneously.”
The 84-page report is not limited to analyzing a data-driven traffic world; it encourages a structural reform in the drafting of new legislation: law would no longer be formulated in legal language, i.e. in words, but in program code, that is, in machine-executable formulas. The idea: transport authorities could publish regulations, such as a requirement that drones not fly within 1,000 meters of a runway, directly as machine-readable code. The law and its provisions, the report says, are not compatible with algorithmic decision systems, so they would have to be translated into computer syntax and transcribed into algorithmic models.
The proposal is not simply about scanning legal texts and putting them into machine-readable format, but rather about modifying the structure or syntax of laws themselves, adapting them to the grammar of the code. “Machine-readable law,” the report says, “would open the possibility to directly insert desired public policy outcomes as part of the input domain for algorithmic decision systems.” For example, a robotic vehicle could be programmed to drive no faster than 50 km/h in city traffic. In practice, the government would then issue not a regulation but a programming rule: if the vehicle enters an inner-city area, then the maximum speed is 50 km/h; if it is exceeded, a fine follows. “Compliance by design” is what the authors call it.
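To make the idea concrete, the if-then rule described above can be sketched in a few lines of Python. This is purely illustrative, not taken from the report: the zone names, speed values and message strings are invented for the example.

```python
# Illustrative sketch of a "compliance by design" rule: the legal norm
# "no faster than 50 km/h in inner-city areas" rewritten as program code.
# Zone names and limits are invented for this example.

SPEED_LIMITS_KMH = {
    "inner_city": 50,
    "rural_road": 100,
}

def max_speed(zone: str) -> int:
    """Return the speed limit for a zone, defaulting to the strictest value."""
    return SPEED_LIMITS_KMH.get(zone, 50)

def check_compliance(zone: str, speed_kmh: float) -> str:
    """The machine itself evaluates the rule instead of a human reading a law."""
    limit = max_speed(zone)
    if speed_kmh > limit:
        return f"violation: {speed_kmh} km/h exceeds {limit} km/h limit"
    return "compliant"

print(check_compliance("inner_city", 62.0))  # a violation is flagged
print(check_compliance("inner_city", 48.0))  # compliant
```

The sketch also shows what is lost in translation: the rule admits exactly two outcomes, with no room for the interpretation a human judge could apply.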
“Algorithms already act as regulators today” — Photo: Tim Gouw
The idea of algorithmic regulation was developed by Californian Internet guru Tim O’Reilly in a 2013 essay. “We can imagine a future in which the speed limit is automatically adjusted based on traffic, weather conditions or other subjective conditions,” he wrote. Simple web metrics, O’Reilly added, could lead to a “massive simplification” of government websites and a reduction in technology costs. What he meant by that was a streamlining of the state through flexible, dynamic systems. It was the utopia of the community as a cybernetic system that automatically adapts to its environment.
The OECD report does not go as far as O’Reilly, but it does mention the use of regulatory algorithms by governments, such as codes that automatically or semi-automatically implement legislative functions. In Estonia, the Ministry of Justice is planning to institutionalize a “robot judge” who will automatically handle small claims and relieve the judiciary. The automation of the legal system is in full swing.
The new report outlines the possible side effects of algorithmic regulation. For example, an autonomous vehicle could find itself having to comply with two conflicting (programming) regulations: on the one hand, to pull over to make way for an ambulance; on the other, not to stop in front of a hydrant. In practice, this leads to norm collisions and an inability to act.
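Such a norm collision can be sketched as code. The rule names and situation flags below are invented for illustration; the point is only that a rigid rule engine, given two encoded norms that prescribe different actions, has no way to choose between them.

```python
# Illustrative sketch of a "norm collision": two encoded rules prescribe
# incompatible actions, and a binary rule engine simply deadlocks.
# Rule names and situation flags are invented for this example.
from typing import Optional

def rule_make_way_for_ambulance(situation: dict) -> Optional[str]:
    if situation.get("ambulance_approaching"):
        return "pull over to the roadside"
    return None

def rule_no_stopping_at_hydrant(situation: dict) -> Optional[str]:
    if situation.get("hydrant_at_roadside"):
        return "do not stop at the roadside"
    return None

def decide(situation: dict) -> str:
    """Collect each rule's prescribed action; fail if they disagree."""
    actions = {rule(situation) for rule in
               (rule_make_way_for_ambulance, rule_no_stopping_at_hydrant)}
    actions.discard(None)
    if len(actions) > 1:
        # No interpretation, no weighing of norms: the system cannot act.
        return "norm collision: no action possible"
    return actions.pop() if actions else "proceed"

print(decide({"ambulance_approaching": True, "hydrant_at_roadside": True}))
```

A human driver resolves this situation by weighing one norm against the other; the sketch treats any disagreement between rules as a dead end, which is exactly the rigidity the report warns about.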
Binary decision-making systems do not tolerate interpretation, and their rigidity can fuel a techno-authoritarian society in which legal subjects blindly follow algorithmic chains of command. It is clear that such forms of control can connect with right-wing currents. For the idea of rebuilding the constitutional state on an opaque formula rests not only on the libertarian visions of a small programming elite, but also on the mindset of those who despise democracy.
Algorithms are black-box systems.
This would be the central issue from a democratic point of view: algorithms are black-box systems subject to no democratic verifiability. The installation of an “executable law” would legitimize these arcane formulas and place them at the center of the political system. Parliament as the central political body would be undermined, power finally transferred to data centers. Law would become just another piece of information in the Internet of Things architecture, a data feed for machines. Algorithmic regulation, whether on the road or elsewhere, would ultimately break with a centuries-old legal tradition that has a certain aesthetic: polished formulations, aphorisms, enduring propositions.
Although programming languages are called “languages,” they are not languages in the narrower sense of the term, but merely formal notations that, apart from programmers, only machines understand. Program code is just an instrument that executes a norm but does not set one. Does anyone seriously believe that the epochal constitutional precept “human dignity shall be inviolable” could be transformed into ones and zeros? Would this not relativize values, a trivialization, even a negation of a constitutional guarantee? Lines of code can be deleted at any time. The code is subject not to the law but to the Delete key, and is thus exposed to the arbitrariness of programmers.
A law, like an algorithm, describes an instruction for a multitude of cases. The qualitative difference is that laws are democratically legitimized, not just rules set by programmers. Norms are subject to interpretation and debate. And unlike programming rules, they can also be violated.