University of Southampton Institutional Repository

A fully convolutional neural network for comprehensive compartmentalization of abdominal adipose tissue compartments in MRI

Kway, Yeshe Manuel
e8accaa8-9903-4d66-8ece-551e8fcdb2df
Thirumurugan, Kashthuri
4cb6410e-2206-4802-9504-cbe44c0420d4
Michael, Navin
fb8b79bb-696c-480c-8a52-cf5f930c4f30
Tan, Kok Hian
4714c94d-334a-42ad-b879-f3aa3a931def
Godfrey, Keith
0931701e-fe2c-44b5-8f0d-ec5c7477a6fd
Gluckman, Peter
5131bd9a-4f09-4907-9c58-a726e72a28a4
Chong, Yap-Seng
7043124b-e892-4d4b-8bb7-6d35ed94e136
Venkataraman, Kavita
48d9aa53-8792-4329-aa59-9ef67f5487d0
Khoo, Eric Yin Hao
e54fd9b7-3719-4a37-9b14-1fe5b82f0539
Khoo, Chin Meng
0c1a537b-4ccd-4971-84ee-14e478f28bea
Leow, Melvin Khee-Shing
b8c72ca5-6a27-4e19-83fe-216fe6f8c536
Tai, E Shyong
50b83bd6-3fd9-46ca-b0ea-98d79c1ba42c
Chan, Jerry KY
6553b2f7-4a35-40af-8efa-7be997bb94d2
Chan, Shiao-Yng
3c9d8970-2cc4-430a-86a7-96f6029a5293
Eriksson, Johan G.
eb96b1c5-af07-4a52-8a73-7541451d32cd
Fortier, Marielle V.
8b9dc5de-429c-4f04-908c-5b4125fa019a
Lee, Yung Seng
0e28a8d6-3085-4086-9fa1-ac0684783bcf
Velan, Sendhil
20621485-91f4-4cac-84f2-b39f51e80e45
Feng, Mengling
5487f056-fde1-460a-88f9-4471bc096682
Sadananthan, Suresh Anand
41601e35-0034-44a4-b37f-87fc92adfe79

Kway, Yeshe Manuel, Thirumurugan, Kashthuri, Michael, Navin, Tan, Kok Hian, Godfrey, Keith, Gluckman, Peter, Chong, Yap-Seng, Venkataraman, Kavita, Khoo, Eric Yin Hao, Khoo, Chin Meng, Leow, Melvin Khee-Shing, Tai, E Shyong, Chan, Jerry KY, Chan, Shiao-Yng, Eriksson, Johan G., Fortier, Marielle V., Lee, Yung Seng, Velan, Sendhil, Feng, Mengling and Sadananthan, Suresh Anand (2023) A fully convolutional neural network for comprehensive compartmentalization of abdominal adipose tissue compartments in MRI. Computers in Biology and Medicine, 167, [107608]. (doi:10.1016/j.compbiomed.2023.107608).

Record type: Article

Abstract

Background: existing literature has highlighted structural, physiological, and pathological disparities among abdominal adipose tissue (AAT) sub-depots. Accurate separation and quantification of these sub-depots are crucial for advancing our understanding of obesity and its comorbidities. However, the absence of clear boundaries between the sub-depots in medical imaging data has made their separation challenging, particularly for the internal adipose tissue (IAT) sub-depots. To date, quantification of AAT sub-depots remains a time-consuming, costly, and complex process.

Purpose: to implement and evaluate a convolutional neural network to enable granular assessment of AAT by compartmentalization of subcutaneous adipose tissue (SAT) into superficial subcutaneous (SSAT) and deep subcutaneous (DSAT) adipose tissue, and IAT into intraperitoneal (IPAT), retroperitoneal (RPAT), and paraspinal (PSAT) adipose tissue.

Material and methods: MRI datasets were retrospectively collected from the Singapore Preconception Study for Long-Term Maternal and Child Outcomes (S-PRESTO: 389 women aged 31.4 ± 3.9 years) and the Singapore Adult Metabolism Study (SAMS: 50 men aged 28.7 ± 5.7 years). For all datasets, ground truth segmentation masks were created through manual segmentation. A ResNet-based 3D U-Net was trained and evaluated via 5-fold cross-validation on S-PRESTO data (N = 300). The model's final performance was assessed on a hold-out test set (N = 89) and an external test set (N = 50, SAMS).

Results: the proposed method enabled reliable segmentation of individual AAT sub-depots in 3D MRI volumes, with high mean Dice similarity scores of 98.3%, 97.2%, 96.5%, 96.3%, and 95.9% for SSAT, DSAT, IPAT, RPAT, and PSAT, respectively.

Conclusion: convolutional neural networks can accurately subdivide abdominal SAT into SSAT and DSAT, and abdominal IAT into IPAT, RPAT, and PSAT. The presented method has the potential to contribute significantly to advances in obesity imaging and precision medicine.
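The Dice similarity score reported in the Results measures voxel overlap between a predicted mask and the manual ground truth, defined as 2|A∩B| / (|A| + |B|). A minimal sketch of how such a score can be computed for a 3D binary mask (the function name and toy volumes below are illustrative, not the authors' code):

```python
import numpy as np

def dice_score(pred, truth):
    """Dice similarity coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    # By convention, two empty masks are treated as a perfect match
    return 2.0 * intersection / total if total > 0 else 1.0

# Toy 3D volumes standing in for a predicted and a manual sub-depot mask
rng = np.random.default_rng(0)
truth = rng.random((4, 8, 8)) > 0.5
pred = truth.copy()
pred[0, 0, :4] = ~pred[0, 0, :4]  # flip a few voxels to mimic segmentation error
print(f"Dice: {dice_score(pred, truth):.3f}")
```

In a multi-class setting like this paper's, the score would be computed once per sub-depot label and averaged across subjects.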

Text
Manuscript - clean version - Accepted Manuscript
Available under License Creative Commons Attribution.

More information

Accepted/In Press date: 17 October 2023
e-pub ahead of print date: 18 October 2023
Published date: December 2023
Additional Information: For the purpose of Open Access, the author has applied a Creative Commons Attribution (CC BY) license to any Author Accepted Manuscript version arising from this submission. Publisher Copyright: © 2023 Elsevier Ltd
Keywords: Abdominal fat segmentation, Convolutional neural network, Deep learning, Water-fat MRI

Identifiers

Local EPrints ID: 483733
URI: http://eprints.soton.ac.uk/id/eprint/483733
ISSN: 0010-4825
PURE UUID: 31c60223-9544-4c2b-9d8c-8442e059d67e
ORCID for Keith Godfrey: orcid.org/0000-0002-4643-0618

Catalogue record

Date deposited: 03 Nov 2023 18:03
Last modified: 13 Apr 2024 01:33


Contributors

Author: Yeshe Manuel Kway
Author: Kashthuri Thirumurugan
Author: Navin Michael
Author: Kok Hian Tan
Author: Keith Godfrey
Author: Peter Gluckman
Author: Yap-Seng Chong
Author: Kavita Venkataraman
Author: Eric Yin Hao Khoo
Author: Chin Meng Khoo
Author: Melvin Khee-Shing Leow
Author: E Shyong Tai
Author: Jerry KY Chan
Author: Shiao-Yng Chan
Author: Johan G. Eriksson
Author: Marielle V. Fortier
Author: Yung Seng Lee
Author: Sendhil Velan
Author: Mengling Feng
Author: Suresh Anand Sadananthan


Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2

