Formal verification for human-centred trust in AI: a critical examination of current paradigms
pp. 54-64
Salehi Fathabadi, Asieh
(2026) Formal verification for human-centred trust in AI: a critical examination of current paradigms. In Ahram, Tareq and Morales Casas, Adrian (eds.) Human Interaction and Emerging Technologies (IHIET-AI 2026): Artificial Intelligence and Future Applications. AHFE (2026) International Conference, vol. 1. AHFE International. (doi:10.54941/ahfe1007160)
Record type: Conference or Workshop Item (Paper)
Abstract
As artificial intelligence systems increasingly permeate critical societal infrastructures, the gap between technical verification and human-centred trust has become a fundamental challenge. This position paper argues that current formal verification approaches for AI systems are fundamentally inadequate to foster genuine public trust, particularly in settings involving human interaction and socio-technical complexity. We advance three critical arguments: (1) the Trust Verification Paradox: static verification approaches fail to capture the dynamic and adaptive nature of trust; (2) the Public Technical Trust Divide: technical correctness without human understanding risks “certification theatre”; and (3) the Distributed Responsibility Crisis: existing verification paradigms struggle to account for collective outcomes and accountability. We propose a shift toward Participatory Verification, in which formal methods are extended to embed stakeholder values, support verification of trust evolution, and enable responsibility attribution. Through a formal and illustrative autonomous vehicle coordination case study, we demonstrate the expressive power of Participatory Verification and outline how trust evolution, stakeholder values, and responsibility attribution can be embedded into verification frameworks. This vision paper calls for a research agenda that bridges formal methods, human-AI interaction, and social science to support AI systems that are not only technically correct, but genuinely trustworthy.
Text: 978-1-964867-77-9_5 - Version of Record
More information
Accepted/In Press date: 14 January 2026
e-pub ahead of print date: 1 April 2026
Identifiers
Local EPrints ID: 510662
URI: http://eprints.soton.ac.uk/id/eprint/510662
PURE UUID: bbe053d1-cdf3-4e49-b095-ce4d942c30b5
Catalogue record
Date deposited: 15 Apr 2026 16:56
Last modified: 16 Apr 2026 01:44
Contributors
Author: Asieh Salehi Fathabadi
Editor: Tareq Ahram
Editor: Adrian Morales Casas