University of Southampton Institutional Repository

Could artificial intelligence write mental health nursing care plans?


Woodnutt, Samuel, Allen, Chris, Snowden, Jasmine, Flynn, Matt, Hall, Simon, Libberton, Paula and Purvis, Francesca (2024) Could artificial intelligence write mental health nursing care plans? Journal of Psychiatric and Mental Health Nursing, 31 (1), 79-86. (doi:10.1111/jpm.12965).

Record type: Article

Abstract


Background: Artificial intelligence (AI) is increasingly used and discussed in care contexts. ChatGPT has gained significant attention in the popular and scientific literature, although how ChatGPT can be used in care delivery is not yet known.

Aims: To use artificial intelligence (ChatGPT) to create a mental health nursing care plan and to evaluate the quality of the output against the authors’ clinical experience and existing guidance.

Materials & methods: Basic text commands were input into ChatGPT about a fictitious person called ‘Emily’ who presents with self-injurious behaviour. The output from ChatGPT was then evaluated against the authors’ clinical experience and current (national) care guidance.
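
The abstract describes entering basic text commands into ChatGPT and does not state that programmatic access was used. Purely as an illustration, the sketch below shows how a comparable prompt could be issued through the OpenAI Python client; the model name, prompt wording and parameters are assumptions for demonstration, not the authors’ method.

# Illustrative sketch only (assumptions: API access, model name, prompt wording).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical prompt in the spirit of the study's fictitious scenario.
prompt = (
    "Write a mental health nursing care plan for 'Emily', a fictitious person "
    "who presents with self-injurious behaviour."
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model; not stated in the paper
    messages=[{"role": "user", "content": prompt}],
)

# Any generated plan would, as the study did, need to be evaluated against
# clinical experience and national guidance before any conceivable use.
print(response.choices[0].message.content)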

Results: ChatGPT was able to provide a care plan that incorporated some principles of dialectical behaviour therapy, but the output contained significant errors and limitations, so there is a reasonable likelihood of harm if it were used in this way.

Discussion: AI use is increasing in direct-care contexts through chatbots and other means. However, AI can inhibit engagement between clinicians and care recipients, ‘recycle’ existing stigma and introduce error, which may diminish the ability of care to uphold personhood and thereby lead to significant avoidable harms.

Conclusion: Use of AI in this context should be avoided until policy and guidance can safeguard the wellbeing of care recipients and the sophistication of AI output has increased. Given ChatGPT’s ability to produce superficially reasonable output, there is a risk that errors may go unnoticed and thereby increase the likelihood of patient harm. Further research evaluating AI output is needed to establish how AI may be used safely in care delivery.

Text
Psychiatric Ment Health Nurs - 2023 - Woodnutt - Could artificial intelligence write mental health nursing care plans - Version of Record
Available under License Creative Commons Attribution.
Download (307kB)

More information

Accepted/In Press date: 23 July 2023
e-pub ahead of print date: 4 August 2023
Published date: February 2024
Keywords: art of nursing, nursing role, quality of care, self-harm, therapeutic relationships

Identifiers

Local EPrints ID: 480625
URI: http://eprints.soton.ac.uk/id/eprint/480625
ISSN: 1351-0126
PURE UUID: ce0bb154-c8ca-415f-9bf4-108fc5251b06
ORCID for Samuel Woodnutt: orcid.org/0000-0001-6821-3158
ORCID for Chris Allen: orcid.org/0000-0002-1296-8989
ORCID for Jasmine Snowden: orcid.org/0000-0001-5290-4587
ORCID for Matt Flynn: orcid.org/0009-0005-7354-7490
ORCID for Paula Libberton: orcid.org/0000-0002-7512-2411
ORCID for Francesca Purvis: orcid.org/0009-0006-0151-2456

Catalogue record

Date deposited: 08 Aug 2023 16:31
Last modified: 10 Apr 2024 02:11


Contributors

Author: Samuel Woodnutt
Author: Chris Allen
Author: Jasmine Snowden
Author: Matt Flynn
Author: Simon Hall
Author: Paula Libberton
Author: Francesca Purvis
