
Should Canada join the call for a ban on 'killer' robots?


Debate rages over whether autonomous weapons would save lives or threaten humanity

Canada is facing pressure to join calls for a global ban on lethal autonomous weapons.

Peace groups are pushing the Liberal government to play a lead role in crafting a global ban on "killer robots."

Canadian government and civilian officials were at a United Nations disarmament conference in Geneva this week for the third round of talks on whether to outlaw lethal autonomous weapons systems.

With the swift acceleration of artificial intelligence and automated technology, there are growing concerns around the development of so-called killer robots.

Unlike drones and other technologies that are controlled remotely by humans, lethal autonomous weapons systems (LAWS) could operate independently in military missions. Proponents say that, with proper safeguards, such systems could actually save lives, but critics say they will lead to an arms race and could even threaten humanity.

No place on the battlefield

Yeshua Moser-Puangsuwan of Non-Violence International said while automated machines may work well on factory floors, they have no place on the battlefield.

"War is an unstructured space," he said. "Can a machine differentiate between a soldier charging and a refugee fleeing? It's hard enough for a human to do so, and requires relational thinking that is not yet occurring and isn't on the horizon."

Moser-Puangsuwan's organization is part of a global campaign, Stop Killer Robots. In Geneva, they have been calling on Canada and other governments to implement a pre-emptive ban on the use, production and trade of autonomous weapons.

Five countries called for a ban on the development of LAWS this week, bringing the total to 14.

'Deep discomfort'

Kathleen Lawand, a Canadian who now heads the arms unit of the International Committee of the Red Cross (ICRC) in Geneva, said her organization has major concerns about the ethical and legal implications of LAWS.

"The ICRC is of the view that, based on current and foreseeable technology, ensuring that autonomous weapon systems can be used in compliance with international humanitarian law will pose a formidable challenge if these weapons were to be used for more complex tasks or deployed in more dynamic environments than has been the case until now," she said.

From an ethical standpoint, she said, there is a "sense of deep discomfort" with a machine capable of deploying deadly force and making life-and-death decisions on the battlefield without human input.

The ICRC wants governments, including Canada, to set clear limits on autonomy in weapon systems and to ensure some measure of human control over their deployment.

Last July, thousands of artificial intelligence and robotics researchers signed an open letter calling on the UN to impose a pre-emptive ban. Prominent signatories included Stephen Hawking, Bill Gates and Elon Musk, who leads Tesla and SpaceX.

Limit human casualties?

But Ronald Arkin, a roboticist at the Georgia Institute of Technology, said an all-out ban is premature. Instead, he suggests a careful, phased introduction of limited versions of the technology. Done right, he says, it could lead to even more advanced precision-guided munitions that would actually limit human casualties.

"I believe that more intelligent weaponized robotic platforms can potentially outperform human warfighters with respect to adherence to international law in limited, narrow circumstances, resulting in a saving of lives of the innocent," he said. "But only if done correctly and if properly regulated."

Some think killer robots like the one in The Terminator are a long way off, but there are growing fears about the future of lethal autonomous weapons systems. (Herwig Prammer/Reuters)

Arkin says the biggest problem is defining LAWS in terms of operational systems and capabilities. Cruise missiles, anti-tank mines and fire-and-forget munitions fall within his definition and are already in use.

"If it's the Terminator, sure I agree. But no one is proposing building the Terminator that I'm aware of. Many say that lethal autonomous weapons don't exist yet, but I believe they are currently in use by the definitions of a roboticist," he said. "So we need more concrete examples on what specific technology should be banned rather than using abstract terms such as autonomy and 'meaningful human control.'"

Retired Maj.-Gen. Lewis MacKenzie agrees that automated weapons could yield a military advantage, but he's not convinced the benefits would outweigh the risks.

"It's not a matter of fire and forget and have faith the system is going to do what you want it to do," he said. "The system could come back at you, or be captured and the technology copied."

Paul Hannon, executive director of Mines Action Canada, urged the Liberal government to call parliamentary hearings to develop a national policy on LAWS. He also wants it included as a major agenda item in the defence policy review, and said Canada could take a leading role in a global ban.

Cautious approach

But for now, the Canadian government is taking a cautious approach to what it calls a "complex and multi-faceted" issue.

Global Affairs spokeswoman Tania Assaly said Canada remains concerned about the implications of increasing levels of autonomy in weapons systems, but that it's too soon to impose a ban.

"A ban on technology, with its important civilian implications, may not serve at this point, to fully address the operational, moral/ethical or policy or legal risks that LAWS may pose," she said.