
Robot Brains Need Human Rules

Artificial intelligence is too useful and advanced to ignore. But it also comes with huge risks, and should be limited accordingly.

Service robot "L2B2"
Kathrin Werner

-OpEd-

MUNICH — If you want a good scare, take a look at the artificial intelligence on display at this year's South By Southwest festival in Texas. Scientists there report that people can control prostheses with their brains. Artificial intelligence will soon control artificial body parts, while robotic brains will hire workers, predict crime, control drones and manage health data.

Futurist Ray Kurzweil prophesies that by 2029, AI will be as intelligent as we are — and that no one will be able to tell whether we are talking to machines or actual people. Billionaire, technology fan and Tesla CEO Elon Musk, for his part, thinks artificial intelligence is more dangerous than nuclear weapons.

To dismiss these concerns as simply fear of the future — the way people once feared the advent of railways — is to ignore the real shifts that are taking place, and place entirely too much faith in technology. Unlike any other invention in history, artificial intelligence has a new dimension: Nobody understands it. That's because, after its initial programming, it continues to develop — on its own — and makes decisions that its inventors cannot explain. Rules for robot brains are thus urgently needed.

Technology is developing faster than legislation

But there's a problem: because of all the rhetoric about a coming robocalypse, the danger of overregulation is even greater than the danger posed by AI itself. AI, after all, presents huge opportunities for progress — when it comes to curing diseases, for example — because it can work through large amounts of data much better than the human brain can. Such applications are already a reality, or close to it. But an AI that takes over the world has so far remained science fiction.

Any regulation of artificial intelligence faces a fundamental problem: Technology is developing faster than legislation. AI is also a collective term that covers several technologies. There isn't just one artificial intelligence. This misunderstanding is what drives demands to establish an AI authority to control AI. That's a bad idea. After all, there is no computer authority that sets rules for computers. AI is a tool, and so regulation must start where this tool can cause damage.

Artificial intelligence applied to daily work in China (Photo: Yu Fangping/ZUMA)

Autonomous cars, for example, are not allowed to decide for themselves to exceed speed limits just because drivers around them are driving too fast. Similarly, there must be limits in the area of financial markets and medicine. AI must not be allowed to break any laws that apply to humans. For example, it shouldn't be able to record and analyze conversations in the living room without permission. Responsibility must remain with us humans.

Likewise, there should be no place for the excuse "That wasn't me, that was my artificial intelligence." What's more, artificial intelligence should always clearly identify itself as non-human. Another idea worth considering is to have AI devices supervise other AI devices — a robot, for instance, that brakes when an autonomous car drives too fast.

Artificial intelligence will come, whether we like it or not. If we try to slow its progress down with regulations, China will continue to push it forward. So far, the level of knowledge of almost all politicians in such matters is abysmal. They generally only know that keywords like blockchain and AI are important.

Politicians have to face up to this responsibility and take the fears of job losses and killer weapons just as seriously as the opportunities AI will bring. Before they write laws, they must understand the technology, even though part of it will always remain unexplainable.


War In Ukraine, Day 285: Three Dead In Ukraine's First-Ever Attack On Russian Air Bases

Reports of Ukraine's possible use of kamikaze drones deep inside Russian territory.


Engels-2 airbase in Russia

Alex Hurst, Anna Akage, and Emma Albright

Updated 11:45 p.m.

Separate explosions Monday morning at two different Russian air bases, which killed at least three people and injured eight, have demonstrated that Ukraine has the capacity to use drones to attack targets deep inside Russia.


Russian state media reports that a fuel tanker exploded early Monday at an airfield near the city of Ryazan, southeast of Moscow, killing three and injuring six people. Another two people are reported to have been injured in a second morning explosion, at the Engels-2 airbase in the Saratov region, farther to the southeast.

Later Monday, both Russian and Ukrainian government sources confirmed that the attack was carried out by Ukraine, a major escalation in Kyiv's war effort.
