Artificial Intelligence, Capitalism, and the Logic of Harm: Toward a Critical Criminology of AI

Hart, Max and Bavin, Kyla and Lynes, Adam (2025) Artificial Intelligence, Capitalism, and the Logic of Harm: Toward a Critical Criminology of AI. Critical Criminology, 33 (3). pp. 513-532. ISSN 1205-8629

Text: s10612-025-09837-0.pdf - Published Version. Available under a Creative Commons Attribution licence. Download (938kB)
Abstract

This paper seeks to advance a critical criminology of artificial intelligence (AI) by exploring how AI technologies function as mechanisms of systemic harm under late capitalism. Moving beyond sensationalist concerns regarding malicious actors utilising AI for nefarious purposes, we interrogate how AI reconfigures labour, governance, and social control in ways that intensify inequality and erode worker autonomy. Drawing from ultra-realism, zemiology, and social harm theory, we introduce a new typology of AI-related harms: Datafication, Algorithmic Governance, Operational, and Existential harms. These categories reveal how AI operates not as a neutral tool but as a mechanism of pseudo-pacification that consolidates elite power while masking deepening exploitation. Through a thematic analysis of 224 sources, we demonstrate that AI's telos—its intended good—has been corrupted by the capitalist logic of efficiency and control. We argue that criminology must urgently engage with AI's embedded harms to remain fit for purpose in an increasingly automated world.

Item Type: Article
Identification Number: 10.1007/s10612-025-09837-0
Dates:
Accepted: 26 June 2025
Published Online: 25 October 2025
Subjects: CAH15 - social sciences > CAH15-01 - sociology, social policy and anthropology > CAH15-01-02 - sociology
Divisions: Law and Social Sciences > Criminology and Sociology > Criminology
Depositing User: Gemma Tonks
Date Deposited: 03 Nov 2025 16:13
Last Modified: 03 Nov 2025 16:13
URI: https://www.open-access.bcu.ac.uk/id/eprint/16707
