
India’s Normative Stance on Lethal Autonomous Weapons Systems

Considering their use and deployment in recent conflicts, this essay summarizes India’s normative stance on lethal autonomous weapons systems at multilateral forums.

Published on February 26, 2024

This essay is part of a series that highlights the main takeaways from discussions that took place at Carnegie India’s eighth Global Technology Summit, co-hosted with the Ministry of External Affairs, Government of India.


Introduction

The Azerbaijan-Armenia, Russia-Ukraine, and Israel-Hamas conflicts have brought global attention to the use of emerging technologies such as artificial intelligence (AI) in warfare, including the use of lethal autonomous weapons systems (LAWS). The concerns around equipping machines with the capability to autonomously select and engage targets have led the international community to convene a global discussion on this issue. In 2016, the Group of Governmental Experts (GGE) on LAWS was established under the framework of the Convention on Certain Conventional Weapons (CCW) at the United Nations (UN). 

As a party to the CCW, India has been an active participant at the GGE. It also chaired the forum in 2017 and 2018, when the eleven guiding principles on LAWS were being developed. Now that India is planning to build autonomous systems for its security needs, assessing its normative position on this matter assumes significance. Doing so can help understand its contribution to the ongoing process at multilateral forums and its potential to shape the global normative debate on the issue.

The CCW Framework and Recent Developments in LAWS

With 127 state parties, the CCW framework has been a key instrument of international humanitarian law (IHL), regulating or prohibiting the use of certain conventional weapons. Currently, the CCW contains five annexed protocols, each covering a category of weapons deemed excessively injurious or indiscriminate. Further, Article 8 of the CCW enables the adoption of new protocols to extend the convention’s regulatory scope to new kinds of weapons. This presents the opportunity for creating a protocol to regulate LAWS in the future.

So far, the eleven principles released by the GGE guide state parties seeking to develop, deploy, and use these weapon systems and affirm the full application of IHL to LAWS. However, bodies like Human Rights Watch, Stop Killer Robots, and Article 36 have criticized the GGE for not making concrete progress in regulating LAWS. They claim that the consensus mechanism under the GGE has been used by some state parties that are developing autonomous weapons to stall the development of a legally binding international instrument that could regulate these weapons.

Several countries agree with the view that only a legally binding instrument can effectively regulate LAWS. Accordingly, the Belén Communiqué, the CARICOM declaration, and notably the UN secretary general’s New Agenda for Peace have all emphasized the urgent need to negotiate such an instrument. The GGE’s perceived inability to build effective protection against the use of LAWS has resulted in the debate expanding to other forums. In December 2023, the UN General Assembly (UNGA) adopted a resolution requesting that the UN secretary general seek the views of member and observer states on how to address the issues related to LAWS and submit a substantive report following this process. It also decided to include a provisional agenda item on LAWS in the assembly’s seventy-ninth session in 2024, initiating a parallel process on the issue.

India’s Normative Position on LAWS

India voted against the December 2023 resolution, arguing that a parallel UNGA process would lead to a duplication of resources and efforts. So far, it has considered the GGE to be the appropriate forum for discussing emerging technologies in LAWS and maintains that the group has succeeded in bringing all relevant stakeholders to the discussion. Relatedly, India emphasizes that the GGE has produced a substantial body of work that must be built upon.

For example, considering the lack of a commonly agreed definition of LAWS, the chair of GGE compiled a non-exhaustive list of definitions and characterizations as submitted by state parties in March 2023. The GGE is also working on a two-tier approach toward these systems, prohibiting those that are not compliant with IHL and regulating those that could potentially be made compliant with IHL. Further, in 2024 and 2025, the GGE plans to amplify its efforts by working toward formulating “a set of elements of an instrument” that is based on existing CCW protocols and accounts for legal, military, and technical expertise.

Additionally, India believes that IHL’s rules and principles take a technology-neutral approach, thus offering a sufficient framework for regulating LAWS. The five protocols of the CCW framework, which prohibit state parties from using specific kinds of weapons, illustrate the adequacy of IHL when it comes to regulating these weapons and their effects, regardless of the technology used to build them. Put simply, IHL regulates weapons based on the effects of their usage and not the underlying technology. For example, while lasers continue to be used in warfare, Protocol IV of the CCW prohibits the use of blinding laser weapons. Further, state parties to the CCW have already affirmed that the responsibility and accountability for using LAWS lie with their human operators, weakening the case for any alternative or new regulatory approach outside IHL.

India also views calls for a legally binding instrument that would regulate LAWS as premature, considering that a process to understand the impact of this technology is still underway. At present, member states are still working on a common definition as well as a characterization of LAWS at the GGE. To emphasize the importance of arriving at a definition first, India cited the example of the draft Comprehensive Convention on International Terrorism it tabled at the UN in 1996. That convention remains unadopted to this day, largely because states have been unable to agree on a definition of the term “international terrorism.”

Instead, based on the 2019 guiding principles, India has supported the call for a “political declaration,” noting that a high-level declaration will lead state parties to formulate national policies and regulatory frameworks through which these guidelines can be implemented. Such a view is shared by the United States, which released a voluntary political declaration in 2023 to promote the responsible military use of AI and autonomy. While the United States voted for the UNGA resolution, it does not support parallel processes on LAWS and regards the GGE as central to these discussions. Similarly, Russia has opposed any binding instrument, claiming the GGE to be the sole forum for dealing with this issue.

Finally, India considers the blanket condemnation of these technologies as counterproductive since it may stigmatize them. Rather, it stresses the positive effects of these technologies, noting that autonomy in weapons can bring greater precision and efficiency, thus reducing human error. This approach dovetails with the observation made in a recent report by the United Nations Institute for Disarmament Research, which emphasized that military use cases of AI go beyond autonomous weapons to areas such as logistics and data processing. As submitted by some state parties, AI-enabled autonomy is a function and not a weapon by itself. Therefore, regulations must concern certain end uses, such as weapon systems that cause superfluous injury, and not the underlying technology.

Reaching a Consensus

India’s decision to treat the GGE as the legitimate forum for regulating LAWS is convincing because it seeks to avoid a multiplicity of forums outside the GGE and attempts to build upon a vast body of prior work. It also accounts for the more beneficial aspects of using AI in the military. As a parallel process emerges at the UNGA in 2024, state parties, including India, must ensure that the momentum generated in the GGE translates into a consensus on understanding and regulating LAWS.

Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.