Mamalakis, M. orcid.org/0000-0002-4276-4119, Banerjee, A., Ray, S. et al. (5 more authors)
(2024)
Deep multi-metric training: the need of multi-metric curve evaluation to avoid weak learning.
Neural Computing and Applications, 36 (30).
pp. 18841-18862.
ISSN 0941-0643
Abstract
The development and application of artificial intelligence-based computer vision systems in medicine, the environment, and industry are playing an increasingly prominent role. Hence, optimal and efficient hyperparameter tuning strategies are crucial for delivering the highest performance of deep learning networks on large and demanding datasets. In our study, we have developed and evaluated a new training methodology named deep multi-metric training (DMMT) for enhanced training performance. DMMT delivers robust learning for deep networks by using a new multi-metric performance evaluation criterion. We have tested the DMMT methodology on multi-class (three, four, and ten classes), multi-vendor (different X-ray imaging devices), and multi-size (large, medium, and small) datasets. The validity of the DMMT methodology has been tested in three different classification problems: (i) medical disease classification, (ii) environmental classification, and (iii) ecological classification. For disease classification, we have used two large COVID-19 chest X-ray datasets, namely the BIMCV COVID-19+ and Sheffield hospital datasets. The environmental application involves classifying weather images into cloudy, rainy, shine, or sunrise conditions. The ecological classification task involves classifying three animal species (cat, dog, wild) and ten categories of animals and transportation vehicles (CIFAR-10). We have used the state-of-the-art networks DenseNet-121, ResNet-50, VGG-16, VGG-19, and DenResCov-19 (DenRes-131) to verify that our novel methodology is applicable to a variety of deep learning networks. To the best of our knowledge, this is the first work that proposes a training methodology to deliver robust learning across a variety of deep learning networks and multi-field classification problems.
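The abstract does not reproduce the evaluation criterion itself, but the core idea of judging training on several metric curves at once can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical example and not the paper's DMMT criterion: the metric panel (accuracy, macro F1/precision/recall, one-vs-rest AUC), the `floor` and `max_spread` thresholds, and the helper names `multi_metric_report` and `is_robust` are assumptions made purely for illustration.

```python
# Hypothetical sketch of a multi-metric evaluation check; NOT the authors' DMMT
# criterion. Metric set, thresholds, and the agreement rule are illustrative only.
import numpy as np
from sklearn.metrics import (accuracy_score, f1_score, precision_score,
                             recall_score, roc_auc_score)


def multi_metric_report(y_true, y_pred, y_prob):
    """Compute a panel of classification metrics for one validation pass."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "f1_macro": f1_score(y_true, y_pred, average="macro"),
        "precision_macro": precision_score(y_true, y_pred, average="macro",
                                           zero_division=0),
        "recall_macro": recall_score(y_true, y_pred, average="macro"),
        "auc_ovr": roc_auc_score(y_true, y_prob, multi_class="ovr"),
    }


def is_robust(report, floor=0.80, max_spread=0.10):
    """Illustrative joint criterion: every metric must clear a floor AND the
    metrics must agree with each other (small spread), so that a single strong
    curve cannot mask weak learning on the others."""
    values = np.array(list(report.values()))
    return bool(values.min() >= floor and (values.max() - values.min()) <= max_spread)


# Toy usage with random predictions for a 3-class problem.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 3, size=200)
y_prob = rng.dirichlet(np.ones(3), size=200)   # rows sum to 1, as required by roc_auc_score
y_pred = y_prob.argmax(axis=1)

report = multi_metric_report(y_true, y_pred, y_prob)
print(report, "robust:", is_robust(report))
```

In this sketch the "robust" flag would typically be evaluated per epoch on the validation set, so that training is only considered successful when all metric curves are simultaneously high and mutually consistent rather than when a single curve peaks.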
Metadata
Item Type: Article
Authors/Creators: Mamalakis, M., Banerjee, A., Ray, S. et al. (5 more authors)
Copyright, Publisher and Additional Information: © Crown 2024. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Keywords: DenRes-131; Classification; Chest X-rays; COVID-19; Robust learning; Weak learning
Dates:
Institution: The University of Sheffield
Academic Units: The University of Sheffield > Faculty of Engineering (Sheffield) > Department of Computer Science (Sheffield); The University of Sheffield > Faculty of Engineering (Sheffield) > School of Electrical and Electronic Engineering; The University of Sheffield > Faculty of Medicine, Dentistry and Health (Sheffield) > School of Medicine and Population Health
Funding Information: WELLCOME TRUST (THE), grant number 205188/Z/16/Z; Engineering and Physical Sciences Research Council, grant number EP/P006566/1
Depositing User: Symplectic Sheffield
Date Deposited: 23 Sep 2024 11:51
Last Modified: 23 Sep 2024 11:51
Status: Published
Publisher: Springer Science and Business Media LLC
Refereed: Yes
Identification Number: 10.1007/s00521-024-10182-6
Open Archives Initiative ID (OAI ID): oai:eprints.whiterose.ac.uk:217528
Download
Filename: s00521-024-10182-6.pdf
Licence: CC-BY 4.0