REDAffectiveLM: leveraging affect enriched embedding and transformer-based neural language model for readers' emotion detection
Technological advancements in web platforms allow people to express and share emotions toward textual write-ups written and shared by others. This gives rise to two interesting domains for analysis: emotion expressed by the writer and emotion elicited from the readers. In this paper, we propose a novel approach for readers’ emotion detection from short-text documents using a deep learning model called REDAffectiveLM. Within state-of-the-art NLP tasks, it is well understood that utilizing context-specific representations from transformer-based pre-trained language models helps achieve improved performance. Within this affective computing task, we explore how incorporating affective information can further enhance performance. Toward this, we leverage context-specific and affect enriched representations by using a transformer-based pre-trained language model in tandem with an affect enriched Bi-LSTM+Attention network. For empirical evaluation, we procure a new dataset, REN-20k, besides using RENh-4k and SemEval-2007. We rigorously evaluate the performance of our REDAffectiveLM across these datasets against a vast set of state-of-the-art baselines, where our model consistently outperforms the baselines and obtains statistically significant results. Our results establish that utilizing affect enriched representation along with context-specific representation within a neural architecture can considerably enhance readers’ emotion detection. Since the impact of affect enrichment specifically in readers’ emotion detection is not well explored, we conduct a detailed analysis of affect enriched Bi-LSTM+Attention using qualitative and quantitative model behavior evaluation techniques. We observe that, compared to conventional semantic embedding, affect enriched embedding increases the ability of the network to effectively identify and assign weightage to the key terms responsible for readers’ emotion detection, thereby improving prediction.
Affect enriched embedding, Affective computing, Deep learning, Language model, Readers' emotion detection, Textual emotion detection
7495-7525
Kadan, Anoop
9cc17e26-a329-49fe-b73b-2fce75084966
Deepak, P.
80ebb63c-91a6-4500-8e03-9d806262049d
P. Gangan, Manjary
f1f79b4a-2662-4f0c-ad33-dbb0cbf2512b
Sam Abraham, Savitha
615cca2d-7df1-416d-9048-03749ecfa73e
Lajish, V.L.
034cc3e6-c98a-4e9c-ab30-4729948b55c2
19 August 2024
Kadan, Anoop, Deepak, P., P. Gangan, Manjary, Sam Abraham, Savitha and Lajish, V.L.
(2024)
REDAffectiveLM: leveraging affect enriched embedding and transformer-based neural language model for readers' emotion detection.
Knowledge and Information Systems, 66 (12), 7495-7525.
(doi:10.1007/s10115-024-02194-4).
Text: 8_REDAffectiveLM_KAIS-1 (Accepted Manuscript). Available under License Other.
More information
Accepted/In Press date: 23 July 2024
e-pub ahead of print date: 19 August 2024
Published date: 19 August 2024
Identifiers
Local EPrints ID: 495953
URI: http://eprints.soton.ac.uk/id/eprint/495953
ISSN: 0219-1377
PURE UUID: 4a4dc7c8-d499-4246-8a94-4c15b7726b8f
Catalogue record
Date deposited: 28 Nov 2024 17:31
Last modified: 19 Aug 2025 04:01
Contributors
Author:
Anoop Kadan
Author:
P. Deepak
Author:
Manjary P. Gangan
Author:
Savitha Sam Abraham
Author:
V.L. Lajish