University of Southampton Institutional Repository

Less but better: parameter-efficient fine-tuning of large language models for personality detection

arXiv
Shen, Lingzhi
Long, Yunfei
Cai, Xiaohao
Chen, Guanming
Razzak, Imran
Jameel, Shoaib


Abstract

Personality detection automatically identifies an individual's personality from various data sources, such as social media texts. However, as the parameter scale of language models continues to grow, the computational cost becomes increasingly difficult to manage. Fine-tuning also grows more complex, making it harder to justify the effort and reliably predict outcomes. We introduce a novel parameter-efficient fine-tuning framework, PersLLM, to address these challenges. In PersLLM, a large language model (LLM) extracts high-dimensional representations from raw data and stores them in a dynamic memory layer. PersLLM then updates the downstream layers with a replaceable output network, enabling flexible adaptation to various personality detection scenarios. By storing the features in the memory layer, we eliminate the need for repeated complex computations by the LLM. Meanwhile, the lightweight output network serves as a proxy for evaluating the overall effectiveness of the framework, improving the predictability of results. Experimental results on key benchmark datasets like Kaggle and Pandora show that PersLLM significantly reduces computational cost while maintaining competitive performance and strong adaptability.
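The caching idea in the abstract — run the expensive encoder once, store its features in a memory layer, and train only a small replaceable output head — can be sketched in miniature. This is an illustrative toy, not the authors' implementation: a fixed random projection stands in for the frozen LLM, a dictionary plays the role of the dynamic memory layer, and the head is a logistic-regression layer; all names and hyperparameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the frozen LLM encoder: in PersLLM this would be a large
# language model whose weights are never updated. Here a fixed random
# projection plays that role.
W_frozen = rng.normal(size=(32, 8))

def extract_features(x):
    """Expensive forward pass; in the framework this runs only once per sample."""
    return np.tanh(x @ W_frozen)

# "Dynamic memory layer": cache features so the encoder is never rerun.
memory = {}

def get_features(sample_id, x):
    if sample_id not in memory:
        memory[sample_id] = extract_features(x)
    return memory[sample_id]

# Lightweight, replaceable output head: here a single logistic layer.
head = np.zeros(8)

def train_head(data, labels, lr=0.1, epochs=200):
    global head
    for _ in range(epochs):
        for (sid, x), y in zip(data, labels):
            f = get_features(sid, x)            # cache hit after the first epoch
            p = 1.0 / (1.0 + np.exp(-f @ head))
            head += lr * (y - p) * f            # gradient step on the head only

# Toy data: two Gaussian clusters standing in for two personality classes.
X = [(i, rng.normal(loc=(1 if i % 2 else -1), size=32)) for i in range(40)]
y = [i % 2 for i in range(40)]
train_head(X, y)

preds = [int(1.0 / (1.0 + np.exp(-get_features(s, x) @ head)) > 0.5) for s, x in X]
acc = sum(p == t for p, t in zip(preds, y)) / len(y)
```

Because every sample's features are cached after the first pass, later epochs never invoke the expensive encoder again; only the small head is updated, which mirrors the claimed computational saving.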

Text
2504.05411v1 - Author's Original
Available under License Creative Commons Attribution.

More information

Published date: 7 April 2025
Keywords: cs.CL, cs.LG

Identifiers

Local EPrints ID: 502149
URI: http://eprints.soton.ac.uk/id/eprint/502149
PURE UUID: 51d064d8-8a05-44bd-a66f-d6d38c89bfce
ORCID for Xiaohao Cai: orcid.org/0000-0003-0924-2834

Catalogue record

Date deposited: 17 Jun 2025 16:48
Last modified: 18 Jun 2025 02:04


Contributors

Author: Lingzhi Shen
Author: Yunfei Long
Author: Xiaohao Cai
Author: Guanming Chen
Author: Imran Razzak
Author: Shoaib Jameel



Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2

