Highlights

  • Sharp rise in development, deployment of lethal autonomous weapons
  • Such arms do not need human guidance to identify targets and attack
  • Critics raise risks of large-scale violence, tech malfunction, no accountability

Doom Tech | Forget USA's 'Ninja' blade missile, these autonomous weapons don't need human operators

With artificial intelligence advancing rapidly, its use in warfare is increasing. Turkey's KARGU and Russia's Lancet-3 are among the autonomous weapons already deployed.

Suicide drones that can identify and attack targets on their own. Robot dogs with guns. Fighter jets that can fly and fight autonomously.

If you thought military drones controlled by soldiers sitting in dark rooms were the future of warfare, think again.

Lethal autonomous weapons, more commonly called killer robots or, more sinisterly, slaughterbots, are on the rise. As artificial intelligence advances rapidly, its use in warfare is growing, and we have already reached a point where multiple weapons can attack without human guidance or intervention.

One of these next-generation weapons is the KARGU rotary-wing strike drone made by Turkey's STM. It can reportedly fire at targets without a human operator's command, and according to a UN report it has already been used on the battlefield, in an autonomous attack on Khalifa Haftar's forces in Libya. The KARGU is a small, portable, rotary-wing kamikaze drone programmed with artificial intelligence and machine-learning algorithms.

Russia has deployed a similar UAV in Ukraine: the ZALA Lancet-3. It has an autonomous target-seeking capability, using cameras to find targets without human guidance, and carries a 3 kg warhead. These suicide drones fly into their targets and explode on impact.

Going a step further, an American company is developing fighter jets guided by artificial intelligence. Calspan is fitting L-39 Albatros jets with AI systems to create aircraft that can even engage in aerial combat without human pilots. A live dogfight between four such jets is planned for 2024.

Then there is the weaponised robot dog developed by SWORD International and Ghost Robotics, reportedly named SPUR, or Special Purpose Unmanned Rifle. It has an on-board sighting system and can be controlled via an app. A rifle is mounted on its back, and SPUR can remotely load and unload the first round of ammunition. It was displayed at the 2021 US Army trade show, and while reports suggest it is not yet autonomous, the robot dog looks like one of the prime platforms for weaponised AI in the near future.

So how exactly do these autonomous weapons work?

These so-called 'slaughterbots' use artificial intelligence to identify and attack targets, with the decision made by algorithms rather than human operators. The weapons are pre-programmed with specific 'target profiles'. The AI system then searches for a match using sensors and techniques such as facial recognition, and if an object matches the target profile, an attack is launched.

Lethal autonomous weapons offer several advantages on the battlefield. They lower the risk to human soldiers in conflict zones, increase attack accuracy and reduce the risk of collateral damage. Some of these weapons do not depend on GPS, so they remain effective even if a country's satellites are shot down by enemies.

However, the debate is over whether the risks outweigh the advantages.

Autonomous weapons could enable violence on a larger scale, since it would no longer be limited by the number of human soldiers available. A technical malfunction could also cause serious civilian casualties. During wars, soldiers on the ground can be held accountable, but machines cannot. These weapons also make it easier for aggressors to hide their identities. Another threat is that killer robots could make ethnic cleansing easier through tools such as facial recognition.

These threats have led many human rights and peace organisations to call for a complete ban on autonomous weapons. Some international discussions have been held on the issue.

In 2014, the United Nations organised an informal meeting of experts on the issue. In 2016, a decision was taken to establish a Group of Governmental Experts, or GGE. The next year, the GGE held its first meeting to assess questions around artificially intelligent weapons. In 2019, the UN Secretary-General called for a ban on autonomous arms, and the same year, 11 guiding principles on AI weapons were adopted. In 2021, a UN meeting took place to set the agenda for the regulation of such arms.

However, an international treaty to prohibit or impose strict controls on autonomous weapons is unlikely. This is because major powers like the United States of America, Russia, Australia, Israel, and South Korea are reportedly against a new pact. But it is not impossible. After all, countries have come together in the past to ban or control widely-used weapons.

In 1970, the Non-Proliferation Treaty came into effect to prevent the spread of nuclear arms. A 2017 treaty, the Treaty on the Prohibition of Nuclear Weapons, bans them outright, but it is considered toothless since the US and other nuclear powers are not party to it. Cluster munitions, bombs that disperse submunitions such as grenades and mines over a large area, are banned under the 2008 Oslo Convention. The 1997 Ottawa Convention bans anti-personnel landmines, prohibiting their use, stockpiling, production and transfer.

For now, the 11 guiding principles adopted at the 2019 UN meeting seem to govern the use of autonomous weapons.

The principles state that:

  • International humanitarian law applies fully to all weapon systems, including autonomous ones
  • Human responsibility must be retained for decisions on the use of weapons
  • Human-machine interaction must ensure compliance with international law
  • Accountability must be ensured, including through a human chain of command and control
  • States must determine whether new weapon systems would violate international law
  • States must consider the risk of acquisition by terrorist groups and the threat of proliferation of such systems
  • Risk assessments and mitigation measures must be part of the design and use of these arms
  • Consideration must be given to compliance with all applicable legal obligations
  • In crafting policies, autonomous arms should not be anthropomorphised
  • Discussions at the UN should not hamper the peaceful use of such technology
  • The final principle says the UN Convention on Certain Conventional Weapons, or CCW, provides an appropriate framework for dealing with AI weapons

Given human nature, our propensity for conflict and the destructive potential of technology, the future looks grim if countries prioritise short-term military gains and fail to control the rise of killer robots.
