RG4LDL: renormalization group for label distribution learning
Label distribution learning (LDL) is an effective paradigm to address label ambiguity by modeling the relevance of multiple labels to an instance. However, existing LDL methods suffer from challenges such as high model complexity, slow convergence, and limited availability of label-distribution-annotated training data. To tackle these issues, we propose RG4LDL, a novel framework that integrates the renormalization group (RG) principle with LDL for the first time. RG4LDL employs a restricted Boltzmann machine (RBM)-based neural network to iteratively extract relevant degrees of freedom, thereby optimizing feature learning and improving predictive accuracy. By combining unsupervised RG learning and supervised LDL prediction in an end-to-end manner, RG4LDL achieves both efficiency and effectiveness. Experimental results on 13 real-world datasets and a synthetic toy dataset demonstrate that RG4LDL significantly outperforms state-of-the-art LDL methods in terms of predictive accuracy and computational efficiency. These results highlight the potential of RG4LDL as a benchmark solution for label distribution learning tasks.
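This record is metadata only, but the abstract's core mechanism — an RBM whose hidden layer acts as the coarse-grained, "relevant" degrees of freedom — can be sketched. The snippet below is an illustrative assumption, not the paper's implementation: the layer sizes, learning rate, and one-step contrastive divergence (CD-1) training are stand-ins chosen only to show how an RBM compresses visible features into a smaller hidden representation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Minimal Bernoulli RBM trained with one-step contrastive divergence (CD-1)."""

    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)  # visible-unit biases
        self.b_h = np.zeros(n_hidden)   # hidden-unit biases
        self.lr = lr

    def hidden_probs(self, v):
        # P(h=1 | v): the coarse-grained representation of the input.
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        # P(v=1 | h): reconstruction back to the fine-grained layer.
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_step(self, v0):
        # Positive phase: clamp the data, sample the hidden units.
        h0 = self.hidden_probs(v0)
        h_sample = (rng.random(h0.shape) < h0).astype(float)
        # Negative phase: one Gibbs step back through the visible layer.
        v1 = self.visible_probs(h_sample)
        h1 = self.hidden_probs(v1)
        # CD-1 gradient approximation of the log-likelihood.
        batch = v0.shape[0]
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / batch
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)
        return np.mean((v0 - v1) ** 2)  # reconstruction error

# Toy usage: compress 16 binary features into 4 hidden units.
X = (rng.random((64, 16)) < 0.5).astype(float)
rbm = RBM(n_visible=16, n_hidden=4)
errors = [rbm.cd1_step(X) for _ in range(50)]
coarse = rbm.hidden_probs(X)  # coarse-grained features, shape (64, 4)
```

In the RG4LDL framework as described by the abstract, such unsupervised coarse-graining would be applied iteratively, with a supervised LDL predictor trained end-to-end on the extracted representation; the sketch covers only the unsupervised step.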
Label distribution learning, Renormalization group, Restricted Boltzmann machine, Unsupervised neural network
Tan, Chao
Chen, Sheng
Zhang, Jiaxi
Xu, Zilong
Geng, Xin
Ji, Genlin
13 May 2025
Tan, Chao, Chen, Sheng, Zhang, Jiaxi, Xu, Zilong, Geng, Xin and Ji, Genlin
(2025)
RG4LDL: renormalization group for label distribution learning.
Knowledge-Based Systems, 320, [113666].
(doi:10.1016/j.knosys.2025.113666).
Text: KBS2025-accepted - Accepted Manuscript. Restricted to Repository staff only until 10 May 2027.
Text: KBS2025-Jun - Version of Record. Restricted to Repository staff only.
More information
Accepted/In Press date: 28 April 2025
e-pub ahead of print date: 10 May 2025
Published date: 13 May 2025
Identifiers
Local EPrints ID: 502217
URI: http://eprints.soton.ac.uk/id/eprint/502217
ISSN: 0950-7051
PURE UUID: 9973437b-b730-418b-abee-412fabfe43c3
Catalogue record
Date deposited: 18 Jun 2025 16:39
Last modified: 03 Sep 2025 16:50