Surrogate warfare and the transformation of war in the 2020s



By Dr Jean-Marc Rickli, Head of Global and Emerging Risks

A snapshot of contemporary wars shows a picture drifting away from the Clausewitzian understanding of war.[1] The conflicts in Syria, Yemen, Afghanistan and Libya demonstrate instead that warfare is increasingly waged through surrogates.

The surrogate is an actor or a technological tool that absorbs the patron’s political, operational, or financial burden of conflict. A surrogate can therefore be a state or non-state actor, a commercial military company, a band of mercenaries, a criminal organisation, a terrorist organisation, or an insurgent group. Increasingly, however, because of the power unleashed by the military applications of emerging technologies, surrogates also take the form of cyber trolls and bots, as well as ever more autonomous technological platforms such as drones and robots.


The surrogate of the 21st century is an adjunct in a growing toolbox within full-spectrum, hybrid conflicts, where kinetic action is just one of many levers that protagonists employ to achieve political ends in contemporary globalised, privatised, securitised and mediatised conflicts. All major powers see themselves drawn to conflicts that are transnational in character, geographically remote from their metropolitan heartland, and waged against non-state actors with limited or no socio-political responsibility. They do this to contain risks that are often subjectively securitised: the ‘unknown unknown’ with no tangible, existing threat, in a security environment of extensive global, local and domestic scrutiny where strategic narratives of victory and defeat are often more decisive than strategic action. In this context, the surrogate can help the state patron to engage in protracted “everywhere wars” of choice to suppress intangible threats that pose potential risks to societies at home.

What surrogates can provide are the means to disrupt the battlespace kinetically, the information space subversively, and the willpower of the adversary psychologically, all without a major combat operation. What differentiates surrogate warfare from proxy warfare is that technology has become a surrogate in its own right. Developments in emerging technologies, notably in artificial intelligence, synthetic biology and neuroscience, will increasingly contribute to creating true technological surrogates. This is particularly the case with the growing role of autonomy (enabled by artificial intelligence) in weapon systems. The recent conflict in Nagorno-Karabakh demonstrated the disruptive power of weaponised drones equipped with increasing autonomy and surveillance capabilities. In six weeks of fighting, Armenia lost 47% of its combat vehicles and 93% of its artillery. What is striking here is that such an outcome could be achieved even by a small state.


It does not stop there. One of the characteristics of emerging technologies is that they proliferate very quickly. Though developing these technologies is costly, once they are developed they can scale up very rapidly. Once digital code is released, there is no way to stop it from proliferating. The democratisation of access to emerging technologies thus means proliferation. This proliferation, unlike previous ones, is both horizontal (across states) and vertical (from states to non-state actors). Consequently, non-state groups and even individuals can gain access to these technologies and potentially become de facto actors in international security with a strategic impact.

Will this trend continue during this decade? Yes, it will not only continue but accelerate, because emerging technologies are converging and reinforcing one another. This has been very visible during the current pandemic, where artificial intelligence and synthetic biology have helped to speed up the search for a vaccine, and where their combination offers the prospect of designing entirely new biological systems.


One of the key challenges of global governance in this decade will be to put in place safeguards so that emerging technologies are not repurposed for malicious ends. The Islamic State’s use during 2016-2017 of off-the-shelf commercial drones outfitted with explosives or with a weapon capable of firing a 40 mm munition, which saw a non-state actor challenge US tactical air superiority in a major combat zone for a defined period of time, should serve as a wake-up call about the power of emerging technologies in the hands of malicious actors.

We are probably only at the beginning of a major transformation in the way force and violence are used. The start of this century has demonstrated that warfare and the use of violence at scale are no longer the prerogative of states and great powers alone. Entering the third decade of this century, we are witnessing that human and technological surrogates are increasingly used by states and non-state actors alike to achieve their objectives or to deny those of their adversaries. What is very likely to happen during this decade is that humans and technology will be increasingly merged. For instance, the ethics committee of the French Ministry of Defence has recently given its green light to research on augmented soldiers. Human-machine teaming is becoming a reality and will increasingly be a disruptive force in our daily lives and in warfare alike.


Warfare in the 2020s is very likely to become more complex. The Clausewitzian way of war will not disappear, not least because we are very likely to see rising tensions between great powers, notably between the United States and China. However, the use of violence for strategic purposes will no longer be the prerogative of states alone. Access to emerging technologies will contribute to democratising access to the means of coercion. The picture of warfare that will emerge from this evolution, which we can label neo-trinitarian, is thus a very complex one, in which states, non-state actors and individuals will be able to wage kinetic and/or non-kinetic violence by relying on human and technological surrogates in order to manage a multiplicity of risks with limited resources, at low human, financial and thereby political cost, with a degree of deniability and local legitimacy.

[1] Clausewitz, one of the most influential political thinkers on warfare, started from the assumption that warfare is inherently a socio-political phenomenon. He conceptualised warfare on the basis of the social and political relationships between three interlinked actors that form a trinity. The trinity of society, state and soldier is tied to the delivery of security as the most essential of human needs, and is inextricably linked to warfare and the need of individuals and communities to secure their position in a competitive international system.

 

Disclaimer: This article was first published by the Observer Research Foundation (ORF).

Dr. Jean-Marc Rickli is the Head of Global and Emerging Risks at the Geneva Centre for Security Policy (GCSP) in Geneva, Switzerland. He is also a research fellow at King’s College London and a non-resident fellow in modern warfare and security at TRENDS Research and Advisory in Abu Dhabi. He is a senior advisor for the AI (Artificial Intelligence) Initiative at the Future Society at Harvard Kennedy School and an expert on autonomous weapons systems for the United Nations in the framework of the Group of Governmental Experts on Lethal Autonomous Weapons Systems (LAWS). He is also a member of The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems and the co-chair of the NATO Partnership for Peace Consortium Emerging Security Challenges Working Group.