
Myth: As robots and computers become smarter, we will be released from our responsibility

Photographer: Joey Roberts

Myth busters

Self-driving cars, drones deployed in war zones, robots that carry out surgical operations: all examples of computers playing a major role. But what happens when something goes wrong? What if a self-driving car doesn't stop at a zebra crossing? Who is responsible?
According to Merel Noorman, philosopher of technology and postdoc at the Faculty of Arts and Social Sciences, the idea that smart computers take over tasks from people suggests that people are being made redundant. “But that idea is incorrect. Certain human tasks or decisions may have been taken over by computers, but other tasks have taken their place. When an organisation or business becomes computerised, as happened decades ago at the IB Groep (now DUO, the Education Executive Agency), it is no longer a civil servant who determines whether or not a student receives student financing, but a computer. Those who are responsible now are the managers who bought the system, the developers, and the manufacturers.”

This is a myth about robots and smart computer systems. But what are they, exactly? “In artificial intelligence, we make smart computers that can search, recognise faces, detect patterns in data, read texts, or control robots,” says Noorman. “Autonomy is another important characteristic: the robot carries out tasks for quite some time without human intervention, and those tasks are often fairly complicated. One of the showpieces among the drones now being developed in the United States is the X-47B, an unmanned combat air vehicle. It is regarded as more autonomous than other drones because it is capable of independently taking off from and landing on an aircraft carrier, which is difficult because such ships wobble and have a short runway.”
But why do we want (more of) these drones and other robots? “They are often deployed for missions that are too dirty, dangerous, or impossible for human beings, such as the missions following the nuclear disaster in Fukushima. Robots are also faster and more efficient. But there is also criticism, particularly of armed drones shooting at targets. It is true that human beings can become tired and make mistakes, but there is something immoral about handing these tasks over: decisions about life and death are left to an automated system. And what would stop a government from deploying ever more drones? There seem to be no limits.”

Back to responsibility. Where can I submit a claim when my self-driving car ignores a red light? “Negotiations on this are currently under way. There are many experiments with self-driving cars in the United States, and the liability rules that apply now will need to be changed. If you drive a car that makes decisions for you, it would be difficult to put all the blame on you when things go wrong. Volvo, Google, and Mercedes-Benz have stated that they plan to bear the costs in the event of an accident.”
Noorman feels that the discussion about liability is necessary. “If I build a robot that picks up dirt from the pavement, then I have certain expectations of it. I don't want a robot that bumps into people, do I? If it does, it will need to be reprogrammed. So in that sense, humans (programmers, those who buy the systems, manufacturers) will always bear responsibility for the conditions under which a robot or computer system operates.”
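
To make that last point concrete: the sketch below is a minimal, hypothetical illustration (not taken from the article or from any real robot) of how a programmer's expectations end up written into a machine's behaviour. The sensor and motor interfaces, the function names, and the safety distance are all invented for the example.

    # A hypothetical control loop for a pavement-cleaning robot. All names and
    # interfaces here are invented for illustration; real robot software is far
    # more complex.

    SAFE_DISTANCE_M = 0.5  # a human decision: how close to people the robot may come

    def control_step(sensors, motors):
        """One cycle of the robot's decision loop, repeated many times per second."""
        # The programmer's expectation, encoded as an explicit rule:
        # stop rather than drive on when a person is too close.
        if sensors.nearest_person_distance_m() < SAFE_DISTANCE_M:
            motors.stop()
            return
        # Otherwise, do the job the robot was bought for.
        if sensors.dirt_detected():
            motors.drive_towards(sensors.dirt_location())
        else:
            motors.wander()

The rule and the threshold in this sketch are human choices. If the robot still bumps into people, the question of responsibility traces back to whoever wrote, bought, or deployed that code, not to the machine itself.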

Myth busters is a series in which academics shoot down popular myths about complex topics.
