Analysis and applications of deep cascade learning
This dissertation concerns the analysis and applications of a constructive architecture for training Deep Neural Networks, which are usually trained by End-to-End gradient propagation at fixed depths. End-to-End training of Deep Neural Networks has proven to offer impressive performance in a number of applications, such as computer vision, machine translation and playing complex games such as Go. Cascade Learning, the approach of interest here, trains networks in a layer-wise fashion and has been demonstrated to achieve satisfactory performance on large-scale tasks such as the popular ImageNet benchmark dataset, at substantially reduced computing and memory requirements. Here we focus on the nature of the features extracted by Cascade Learning. By attempting to explain the process of learning using Tishby et al.'s Information Bottleneck theory, we derive an empirical rule (the Information Transition Ratio) to automatically determine a satisfactory depth for Deep Neural Networks. We suggest that Cascade Learning packs information in a hierarchical manner, with coarse features in early layers and more task-specific features in later layers. This is verified by considering Transfer Learning, whereby features learned from a data-rich source domain assist in learning a data-sparse target domain. Using a wide range of inference problems in medical imaging, human activity recognition and inference from single-cell gene expression between mice and humans, we demonstrate that Transfer Learning from a cascade-trained model outperforms results reported by previous authors. An exception is the single-cell gene expression problem, where a single hidden layer network happens to be an adequate solution.
University of Southampton
Du, Xin
9629013b-b962-4a81-bf18-7797d581fdd8
Niranjan, Mahesan
5cbaeea8-7288-4b55-a89c-c43d212ddd4f
Du, Xin (2022) Analysis and applications of deep cascade learning. University of Southampton, Doctoral Thesis, 164pp.
Record type: Thesis (Doctoral)
Text: Thesis__XIN DU_PHD_VLC_11_MAR_2022 - Version of Record
More information
Submitted date: 11 March 2022
Identifiers
Local EPrints ID: 457254
URI: http://eprints.soton.ac.uk/id/eprint/457254
PURE UUID: e68a12a8-bfee-436e-87b7-40fbfb4a2240
Catalogue record
Date deposited: 30 May 2022 16:30
Last modified: 17 Mar 2024 03:11
Contributors
Author: Xin Du
Thesis advisor: Mahesan Niranjan