

Digitalized Warfare: Responsibility, Intentionality and the Rule of Law

Project leader: Professor Gregor Noll

Increasingly, war is conducted by means of computer-controlled machines, such as drones. Likewise, most human decisions taken at times of war depend on digital technologies that identify targets or assess risks. Each of these technologies functions in accordance with algorithms – instructions expressed in the numerical language of computers. Our project investigates how the use of these algorithms affects traditional understandings of the rule of law at times of war. We do this by following three related strands of inquiry. First, we identify the extent to which the capabilities of digital technologies are shaped by their human creators. This is important for determining who should be held responsible for the conduct of these technologies. Second, we study how international law should respond when algorithms behave in a manner that is unexpected, that a human code-writer did not anticipate. This matters in order to prevent the use of digital technologies from leading to unintended consequences. Our third strand of inquiry focuses on the problem that it is often impossible to know how a digital machine will react before it has reacted. Whereas human conduct is regulated prospectively, the regulation of digital technologies often occurs retrospectively. This retrospective regulation creates legal uncertainty for developers and citizens. Our project therefore identifies strategies for containing these detrimental effects of digital technologies.

1. Assignment and Purpose

The digitalization of modes of warfare is in full swing. Increasingly, more or less independent machines carry out surveillance missions and attacks. Likewise, nearly every human decision taken at times of war is supported and conditioned by digital technologies. The individual digital gadgets that partake in these processes, such as drones, receive regular attention. We, however, aim to study a more fundamental set of issues that underpins not just one particular innovation but digitalized warfare in general. Specifically, our project is organised around three strands of inquiry:

First, we seek to identify how normative choices shape the architecture of digital innovation in the realm of the laws of war. The suggestive neutrality and rationality of the numerical code that controls digital gadgets could not be more deceptive. Every line of code is coloured by its human author’s convictions. And every line of code is interpreted by machines constructed by a different set of humans. While human intentionality thus shapes every stage of the process, it is an intentionality linked to and diffused by a remarkable plurality of human actors interacting with and through digital systems (Rouvroy 2011: 217). From a legal point of view, this makes it very difficult to identify to whom the authorship of a particular piece of code should be attributed and, consequently, to determine the locus of legal responsibility. In this first strand of inquiry we therefore seek to shed light on the complex relationship between human thought and digital code, and on the legal significance of their interaction.
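To make this point concrete, consider a deliberately simplified sketch of a decision-support routine. Everything in it – the function name, the thresholds, the recommendations – is invented for illustration and describes no real system:

```python
# Hypothetical sketch: even a trivial decision-support routine embeds its
# authors' normative choices. All names and numbers here are invented.

def assess_target(confidence: float, expected_civilian_harm: int) -> str:
    """Return a recommendation based on hard-coded normative thresholds."""
    # One author decided that 0.90 counts as "sufficient certainty".
    if confidence < 0.90:
        return "do not engage"
    # Another author decided that expected harm of 3 or more requires a human.
    if expected_civilian_harm >= 3:
        return "escalate to human commander"
    return "engage"
```

The output reads like neutral computation, yet each branch encodes a value judgement that may originate with a different person – which is precisely what diffuses authorship and, with it, legal responsibility.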

Whereas the first strand of inquiry focuses on the expression of human intentionality in digital code, the second strand seeks to demarcate the boundaries of normative voids within the digital architecture. To explain: as the complexity and interdependence of algorithmic processes increase, it becomes more difficult for human actors to fully understand how the programmes they wrote function and to anticipate how these algorithms will react to novel situations. The sphere of human intentions is not all-embracing and leaves ample normative space unoccupied. Over time, this normative void will be filled by the independent practices of digital algorithms as a result of what one may call a gradually developing machine intentionality. From a legal point of view this raises a multiplicity of questions, most importantly whether the practices resulting from machine intentionality can be bestowed with legal meaning, and to what extent emerging machine intentionality will affect the legal assessment of supposedly more subjective human intentionality. It is against this background that our project seeks to unearth the contours of digital normative voids in order to contain, as far as possible, any unintended consequences flowing from machine intentionality.

Finally, and building on the findings of the project’s first two strands, we seek to show how the utilization of digital war technologies, with the normative fuzziness they embody and the normative voids they colonize, conflicts with the traditional understanding of the rule of law in relation to the laws of war. In this regard, we pay particular attention to the link between temporality and legality. Traditionally, conduct is regulated prospectively: the law determines in advance which actions are lawful. With respect to decisions taken by machines from within a space unoccupied by human normativity, however, an assessment of a decision’s legality is only possible ex post facto – once the decision has been taken. In many situations, this might be a trivial matter. For example, when it was discovered that price-fixing algorithms on Amazon formed cartel-like structures, wronged customers could simply be compensated for any financial losses incurred (Ezrachi and Stucke 2016: 13). In the context of the laws of war, by contrast, it is often a question of life and death. Conversely, there are various actions whose legality the law assesses retrospectively, such as the proportionality assessments made by military commanders in the heat of the moment. With respect to decisions made by digital algorithms, however, the law must determine prospectively and with precision which responses it will consider lawful. This can be problematic if it causes the law to become rigid and intolerant towards alternative, but in principle equally defensible, responses to critical situations. This third strand therefore seeks to identify strategies for ensuring, to the largest possible extent, that the emergence and utilisation of digital technologies does not unduly impair the rule of law at times of war.
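The dynamic that Ezrachi and Stucke describe can be sketched in a few lines. The sketch below is a hypothetical illustration, not a reconstruction of the actual Amazon incident: two sellers whose pricing rules react only to each other produce an outcome that neither rule’s author intended.

```python
# Hypothetical illustration: two independently written pricing rules interact
# to produce a price spiral that neither author anticipated or intended.

def seller_a(rival_price: float) -> float:
    # Rule written by one developer: always match the rival's price.
    return rival_price

def seller_b(rival_price: float) -> float:
    # Rule written by another developer: stay 10% above the rival.
    return round(rival_price * 1.10, 2)

price_a, price_b = 10.00, 12.00
for _ in range(20):
    price_a = seller_a(price_b)
    price_b = seller_b(price_a)

# The interaction of the two rules ratchets both prices upward, far beyond
# either starting point, without any human deciding to raise them.
print(price_a, price_b)
```

The conduct emerges only from the interaction at run time, so its legality can be assessed only once it has occurred – exactly the temporal problem this strand addresses.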

2. Theory and Method

With respect to each of the three strands of inquiry we will engage in a critical analysis of the applicable jurisprudence and the available legal commentary. In doing so, we will combine empirical, normative and comparative approaches. Building on experience gained in previous research projects and drawing on our extensive international network, we will also reach across disciplinary boundaries and refer to related discourses in moral philosophy, theology and the history of science. Specifically, in order to explore the interface between digital technologies and the rule of law on the one hand, and the interface between humans and technology on the other, we will make use of theories of New Materialism. In particular, the debates regarding the agentic capacity of technology and the theorization of distributed agency between human and machine can be helpful for this project (Bennett 2009; Coole and Frost 2010).

3. Project Plan and Project Period

The project is planned to run for three years, with each year focused on one of the three strands of inquiry. Across the three years we intend to hold two conferences. The first conference will be held after the first eighteen months to take stock and to adjust our focus in light of conversations with, and feedback from, our international colleagues. The second conference will take place in the project’s final year to present and critically evaluate the results of our study. We aim to publish the insights of our project in two openly accessible monographs.

4. Contribution to International Research Front

Existing studies on the digitalization of the law focus on regulatory situations in peacetime (see e.g. Hildebrandt and Rouvroy 2011). The digitalization of warfare has so far been discussed only in limited areas, such as the development of so-called Lethal Autonomous Weapons Systems (see e.g. United Nations Office for Disarmament Affairs 2015, documenting military, legal and technical assessments). The ongoing discussion on cyberwarfare is only peripherally related to the problems emerging from digitalization.

Our project contributes to these existing studies in at least three ways. First, compared to many contemporary works that concern specific digital gadgets (see e.g. Barela 2015; O’Connell 2010), we assume a more comprehensive perspective that scrutinizes how changing weapon systems and military organization shape the future of warfare. Second, we move beyond conventional studies that focus on determining with certainty whether digital gadgets are ultimately controlled by humans or by technology (see e.g. Beard 2013; Crootof 2015). Instead, we consider the human-technology interface in a significantly more nuanced manner by unearthing the counter-dependencies between humans, analogue law and digital code. Finally, we specifically consider digitalized means of warfare with reference to the concept of the rule of law, asking what form this rule takes and what content it should have. This unique approach promises to inform research on digital technologies well beyond the context of the laws of war.

5. Researcher Profiles and Functions

Professor Gregor Noll has held the chair in international law at the Lund Faculty of Law since 2005 and has been instrumental in developing Lund as a brand for theoretically advanced international legal research, together with a group of internationally recruited research fellows. Noll held the prestigious Pufendorf chair from 2012 to 2016. He has published the first full-length article in a refereed A-list journal exploring the impact of neurotechnology and brain-machine interfaces in weapons systems on the ability to implement the laws of war (Noll 2014). The article was well received and ranked in the top 5% of all research outputs scored by Altmetric.

Dr. Valentin Jeutner has been a postdoctoral researcher at Lund’s Faculty of Law since 2016 and engages in foundational research on the legal characterisation of algorithm-driven decisions. He is a US-qualified attorney and holds First Class law degrees from Oxford University (BA) and Georgetown University (LLM). Upon completion of his doctoral studies at Cambridge in 2015, Jeutner was elected to a Research Fellowship at Oxford and appointed Visiting Professor at Katholieke Universiteit Leuven. He is the director in charge of data mining and visualising the jurisprudence of the UK Supreme Court for the UK Supreme Court Yearbook. Jeutner’s publications include the monograph The Concept of a Legal Dilemma (Oxford University Press 2017), which analyses legal decision-making under dilemmatic conditions.


References

Ariel Ezrachi and Maurice E Stucke, Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy (Harvard University Press 2016).

Diana Coole and Samantha Frost (eds), New Materialisms: Ontology, Agency, and Politics (Duke University Press 2010).

Gregor Noll, ‘Weaponizing Neurotechnology: International Humanitarian Law and the Loss of Language’ (2014) London Review of International Law 201–231.

Jack M Beard, ‘Autonomous Weapons and Human Responsibilities’ (2013) 45 Georgetown Journal of International Law 617.

Jane Bennett, Vibrant Matter: A Political Ecology of Things (Duke University Press 2009).

Mary Ellen O’Connell, ‘The Resort to Drones under International Law’ (2010) 39 Denver Journal of International Law and Policy 585.

Mireille Hildebrandt and Antoinette Rouvroy (eds), Law, Human Agency and Autonomic Computing (Glass House/Routledge 2011).

Rebecca Crootof, ‘War, Responsibility, and Killer Robots’ (2015) 40 North Carolina Journal of International Law and Commercial Regulation 909.

Antoinette Rouvroy, ‘Technological Mediation, and Human Agency as Recalcitrance’ in Mireille Hildebrandt and Antoinette Rouvroy (eds), Law, Human Agency and Autonomic Computing (Glass House/Routledge 2011), 217-222.

Steven J Barela (ed), Legitimacy and Drones: Investigating the Legality, Morality and Efficacy of UCAVs (Ashgate 2015).

United Nations Office for Disarmament Affairs 2015, available at www.un.org/disarmament/geneva/ccw/20015-meeting-of-experts-on-laws/ (accessed on 21 January 2017).

Page manager: Christine Forssell | Page updated: 2018-03-22
