
Are the Three Laws of Robotics Soon to Come True?

When I was a little girl, my greatest aspiration was to live long enough to see interplanetary manned flights. Of course, I was looking forward to other great things as well:


  • Videophones – Check
  • Pocket-size TV sets with cartoons in them – Check
  • Worldwide information network – Check
  • Instant mail – Check
  • Magic cameras that display snaps the moment they are taken – Check
  • Robots helping around the house – Check
  • Sentient robots helping around the house – No.

That’s a shame!

While humanlike robots are a staple of sci-fi movies and TV shows, we’re still unlikely to find them in a nearby electronics store any time soon. Yet virtual AI assistants are already here: Siri and Cortana, for instance. Of course, they can’t be called sentient, or even semi-sentient, but considering the rate at which robotics is developing, we’re bound to face some ethical problems soon.

Right now, if you ask Siri how to make an atomic bomb, it will give you a list of links to nuclear-science or history websites. This is a built-in precaution that the engineers have foreseen, plus the limitations set by the Google search engine. But let’s imagine that Siri had its own flexible, self-educating intellect. How would it react? Would it be pacifistic or misanthropic? Would it take after its master, or have its own principles and priorities? And speaking of which, what might those be?
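To make the idea of a “built-in precaution” a little more concrete, here is a toy sketch in Python of how such a hard-coded rule could work. It is purely illustrative: the blocklist, the function name, and the canned responses are all invented here, and this is not how Siri or Cortana are actually implemented.

    # A toy illustration of a hard-coded safety rule in a digital assistant.
    # Purely hypothetical: the blocklist and function names are invented.
    BLOCKED_TOPICS = {"atomic bomb", "nerve agent"}

    def answer(query: str) -> str:
        """Answer directly unless the query touches a blocked topic;
        in that case, fall back to a neutral list of web-search links."""
        lowered = query.lower()
        if any(topic in lowered for topic in BLOCKED_TOPICS):
            # The precaution: never explain, only point to general web results.
            return f"Here is what I found on the web for: {query!r}"
        return f"Answering directly: {query!r}"

    print(answer("How to make an atomic bomb"))    # deflects to web links
    print(answer("How tall is the Eiffel Tower"))  # answered directly

A rule like this only imitates caution: it matches words, not intent, which is exactly why the question of what a genuinely self-educating assistant would do remains open.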

Interestingly enough, the creators of Siri don’t seem concerned about the future of an AI-powered society, but Microsoft is.

On September 28, 2016, Microsoft and IBM, together with Amazon, DeepMind/Google and Facebook, announced that they were creating a non-profit organization to “advance public understanding of artificial intelligence technologies (AI) and formulate best practices on the challenges and opportunities within the field. Academics, non-profits, and specialists in policy and ethics will be invited to join”.

This brings us back to the sci-fi stories I was so fascinated with a long time ago, particularly Asimov’s Three Laws of Robotics. I wonder why those at IBM or DeepMind can’t take the book off a shelf, or download it to their e-readers, and read these laws:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence, so long as such protection does not conflict with the First or Second Laws.

When I first read the news about the Partnership on Artificial Intelligence, I was so excited that I tried to apply the three laws to the existing digital assistants.

  1. Siri and Cortana cannot injure a human being right now; however, they can deliver potentially harmful information.
  2. Siri and Cortana obey all users’ orders, because they are incapable of evaluating the moral aspect of an order or its outcome; they only have some built-in limitations, I believe.
  3. Siri and Cortana are protected by Apple and Microsoft engineers: an ordinary user can switch them off but cannot delete them. It takes advanced knowledge of your system to remove Cortana, and Siri cannot be removed at all, only disabled.

In a word, the modern digital assistants do not fully satisfy any of the aforementioned rules. If we can’t control assistants that merely imitate intelligent behavior, what will we do with their descendants? And why has Apple, which possesses the most sophisticated digital assistant, taken a back seat? Or is it involved in some undercover, super-secret ‘wow’ project?


Why the Laws of Robotics Don’t Work [Video]

Three or four laws to make robots and AI safe – should be simple, right? Rob Miles explains why these seemingly simple laws are so complicated. Video published by Computerphile on November 6, 2015.
