Opening Insights: When a Robot Kills, Who is Responsible?
Just what do you think you’re doing, Dave? Dave, I really think I’m entitled to an answer to that question.
HAL 9000, 2001: A SPACE ODYSSEY
Dear autonomous war machine developers,
Have you ever watched a science fiction movie or read a science fiction novel? If you have, then you will admit that in nearly every scenario, the killer robot technology doesn’t work out very well for the humans.
It’s not necessary to treat these works of fiction as a form of prophecy in order to grasp the underlying principles they present. Never mind that many science fiction wonders of the past have become realities today: satellites, space travel, smartphones, 3D printing and artificial intelligence, to name a few.
Please, for the sake of all mankind – you know, the people who will eventually be murdered by your creations – consider not building them.
Signed,
Concerned Homo sapiens
Whether or not we should build killer robots is now a moot point. The unfortunate reality is that if nations willing to use force to subjugate others are building robots that can kill without human intervention, then the rest of the world must have matching countermeasures. Microsoft President Brad Smith shares his thoughts on actions that could help stave off the robot apocalypse.
Informational Insights: Unstoppable?
The following article was published by The Telegraph, “an award-winning, multimedia news brand that has been synonymous with quality, authority and credibility for more than 160 years.” It was written by Robin Pagnamenta, Head of Technology for The Telegraph.
The rise of killer robots is now unstoppable and a new digital Geneva Convention is essential to protect the world from the growing threat they pose, according to the President of the world’s biggest technology company.
In an interview with The Telegraph, Brad Smith, president of Microsoft, said the use of ‘lethal autonomous weapon systems’ poses a host of new ethical questions which need to be considered by governments as a matter of urgency.
He said the rapidly advancing technology, in which flying, swimming or walking drones can be equipped with lethal weapons systems – missiles, bombs or guns – which could be programmed to operate entirely or partially autonomously, “ultimately will spread… to many countries”.
The US, China, Israel, South Korea, Russia and the UK are all developing weapon systems with a significant degree of autonomy in the critical functions of selecting and attacking targets.
The technology is a growing focus for many militaries because replacing troops with machines can make the decision to go to war easier.
But it remains unclear who is responsible for deaths or injuries caused by a machine: the developer, manufacturer, commander or the device itself.
Smith said killer robots must “not be allowed to decide on their own to engage in combat and who to kill” and argued that a new international convention needed to be drawn up to govern the use of the technology.
“The safety of civilians is at risk today. We need more urgent action, and we need it in the form of a digital Geneva Convention, rules that will protect civilians and soldiers.”
Speaking at the launch of his new book, Tools and Weapons, at the Microsoft store in London’s Oxford Circus, Smith said there was also a need for stricter international rules over the use of facial recognition technology and other emerging forms of artificial intelligence.
“There needs to be a new law in this space; we need regulation in the world of facial recognition in order to protect against potential abuse.”
He expressed optimism about the UK’s exit from the European Union and said the economy had so far remained resilient.
But he said Britain’s Brexit saga was like the fourth season of a TV drama that has gone on too long and needs to end soon.
“Regardless of how one feels about Brexit, it’s not necessarily an issue that benefits from year after year of debate.
“It’s a little bit like now entering the fourth season of a TV series that hit its most suspenseful point long ago. At some point, you just want the season to conclude. It will be good to have an ending and hopefully a happier one.”
The development of lethal autonomous weapons is facing a growing public backlash.
Thousands of Google employees have signed a pledge not to develop AI for use in weapons.
The international Campaign to Stop Killer Robots has doubled in size over the past year to include 113 NGOs in 57 countries.
https://www.telegraph.co.uk/technology/2019/09/21/microsoft-chief-brad-smith-says-rise-killer-robots-unstoppable/
This article originally appeared on THE TELEGRAPH: Microsoft chief Brad Smith says rise of killer robots is ‘unstoppable’
Possibilities for Consideration: Our Own Worst Enemy
I wish that I could say I was optimistic about the human race. I love us all, but we are so stupid and shortsighted that I wonder if we can lift our eyes to the world about us long enough not to commit suicide.
ISAAC ASIMOV
Many hold the opinion that science fiction luminaries should be regarded as prophets of a sort. If that’s the case, we know that technology will eventually become our ruler. It’s a scary thought. Rather than debating whether a thing can be done, or whether it should be done, we need to consider what we are going to do about it.
Take a moment and examine…
- As you reviewed the material above, what stood out to you?
- What is the potential impact, economically and/or socially?
- What action is needed to stop or support this idea?
- You may want to consider whether you:
  - want to be aware of,
  - should become supportive of, or
  - would want to be active in this topic?
Add Your Insight
I have been impressed with the urgency of doing. Knowing is not enough; we must apply.
Being willing is not enough; we must do.
LEONARDO DA VINCI