Sentencing dangerous offenders in an era of predictive technologies: New skin, same old snake?
Gurnham, David (2021) Sentencing dangerous offenders in an era of predictive technologies: New skin, same old snake? In, Kohl, Uta and Eisler, Jacob (eds.) Data-Driven Personalisation in Markets, Politics and Law. Cambridge University Press, pp. 142-156.
Record type:
Book Section
Abstract
Predictive technologies are now used across the criminal justice system to inform risk-based decisions regarding bail, sentencing and parole, as well as offender management in prisons and in the community. However, public protection and risk considerations also provoke enduring concerns about ensuring proportionality in sentencing and about preventing unduly draconian, stigmatising and marginalising impacts on particular individuals and communities. If we take seriously the principle of individualised justice as desert in the liberal retributive sense, then we face serious (potentially intractable) difficulties in justifying any role for predictive risk profiling and assessment, let alone sentencing based on automated algorithms drawing on big data analytics. In this respect, predictive technologies present us not with genuinely new problems, but with a more sophisticated iteration of established actuarial risk assessment (ARA) techniques. This chapter describes some of the reasons why principled and social justice objections to predictive, risk-based sentencing make any genuinely synthetic resolution or compromise so elusive. The fundamental question regarding predictive technologies, therefore, is how such a resolution might even be conceived without seriously undermining fundamental principles of justice and fairness.
Text: 8 David Gurnham chapter (accepted manuscript) - Accepted Manuscript
More information
Published date: July 2021
Identifiers
Local EPrints ID: 450464
URI: http://eprints.soton.ac.uk/id/eprint/450464
PURE UUID: 6783c18d-5493-4659-b698-3c1a6f8dfe05
Catalogue record
Date deposited: 28 Jul 2021 16:32
Last modified: 17 Mar 2024 03:28
Contributors
Editor: Uta Kohl
Editor: Jacob Eisler