Algorithmic predictions and pre-emptive violence: artificial intelligence and the future of unmanned aerial systems

Downey, Anthony (2023) Algorithmic predictions and pre-emptive violence: artificial intelligence and the future of unmanned aerial systems. Digital War. ISSN 2662-1975

s42984-023-00068-7.pdf - Published Version
Available under License Creative Commons Attribution.



The military rationale of a pre-emptive strike is predicated upon the calculation and anticipation of threat. The underlying principle of anticipation, or prediction, is foundational to the operative logic of AI. The deployment of predictive, algorithmically driven systems in unmanned aerial systems (UAS) would therefore appear to be all but inevitable. However, the fatal interlocking of martial paradigms of pre-emption and models of predictive analysis needs to be questioned insofar as the irreparable decisiveness of a pre-emptive military strike is often at odds with the probabilistic predictions of AI. The pursuit of a human right to protect communities from aerial threats therefore needs to consider the degree to which algorithmic auguries—often erroneous but nevertheless evident in the prophetic mechanisms that power autonomous aerial apparatuses—essentially authorise and further galvanise the long-standing martial strategy of pre-emption. In the context of unmanned aerial systems, this essay will outline how AI actualises and summons forth “threats” through (i) the propositional logic of algorithms (their inclination to yield actionable directives); (ii) the systematic training of neural networks (through habitually biased methods of data-labelling); and (iii) a systemic reliance on models of statistical analysis in the structural design of machine learning (which can and do produce so-called “hallucinations”). Through defining the deterministic intentionality, systematic biases and systemic dysfunction of algorithms, I will identify how individuals and communities—configured upon and erroneously flagged through the machinations of so-called “black box” instruments—are invariably exposed to the uncertainty (or brute certainty) of imminent death based on algorithmic projections of “threat”.

Item Type: Article
Identification Number:
Accepted: 6 November 2023
Published Online: 5 December 2023
Uncontrolled Keywords: Artificial intelligence, Image processing, Drone warfare, Prediction, Pre-emption, Autonomous weapons systems
Subjects: CAH25 - design, and creative and performing arts > CAH25-01 - creative arts and design > CAH25-01-02 - art
Divisions: Faculty of Arts, Design and Media > College of Art and Design
Depositing User: Gemma Tonks
Date Deposited: 09 Feb 2024 15:06
Last Modified: 09 Feb 2024 15:06


