
Vatican gives warning about killer robots lacking ‘humanity and public conscience’

Envoy takes part in UN debate on autonomous weapons that choose their own targets

Photo: A Bayraktar TB2 drone at Gecitkale Airport in Famagusta, northern Cyprus, on December 16, 2019. AFP / Birol Bebek

Futuristic weapons that choose their targets without human input are dangerous because they lack “humanity and public conscience”, the Vatican has said.

It weighed in during a UN debate on military technology, in which the International Committee of the Red Cross called for new rules to control autonomous weapons.

Rules like these are viewed sceptically by the US and opposed by Russia, which said it was absurd to apply moral principles to machines.

While today’s armed drones are piloted remotely by humans, activists fear they are a precursor to entirely unmanned weapons that fire at targets by themselves.

Experts say that fully autonomous weapons do not yet exist, but activists want a treaty to prevent them from becoming a reality.

Addressing UN delegates, Vatican envoy John Putzer raised concerns that robots could fire on the wrong targets because of faulty machine-learning technology.

He said that such a machine could “deviate into targeting non-combatants” in pursuit of a ruthless efficiency.

“Let us consider the actions which require the agency of human reason – for instance, of distinction, proportionality, precaution, necessity and expected military advantage,” he said.

“The respect and application of these principles require the timely interpretation and understanding of particular contexts and situations that are hardly programmable.

“The end does not justify the means used to achieve it. How would autonomous weapons be able to respond to the principles of humanity and the dictates of public conscience?”

Sometimes described as “killer robots”, the weapons are also known as Lethal Autonomous Weapons Systems (LAWS).

The Vatican is one of dozens of states, including Austria, Brazil and Mexico, that have called for an outright ban on autonomous weapons.

The EU unveiled sweeping new plans to regulate AI earlier this year, based on a sliding scale of potential risks.

Chairing the meeting, Belgium said it was time to “focus on what unites us to make sure illegal weapons stay out of the future battlefields”.

A statement by the Red Cross said that using LAWS to attack humans should be banned, but that they could be used against objects or buildings under strict regulation.

Using them against humans would mean “substituting human decisions about life and death with sensor, software and machine processes”, it said.

“From a humanitarian perspective, they risk harming those affected by armed conflict, both civilians and combatants hors de combat, and they increase the risk of conflict escalation.”

The US position is more equivocal, with Washington calling for discussions on the benefits and risks of such weapons.

A US delegate at the talks in Geneva called for clarification on how the weapons are already controlled by humanitarian law.

Russia expressed its opposition to new laws and said there was a “lack of justification” for banning the weapons.

“Requiring machines to comply with principles and social consciousness would be absolutely absurd,” Moscow’s delegate said.

“The actions of LAWS are the responsibility of responsible officers that identify a task to be carried out and issue orders for the use of such weapons systems.

“We believe that existing international legal laws and regulations are adequate.”
