TY - JOUR
T1 - Dilated multilevel fused network for virus classification using transmission electron microscopy images
AU - Usman, Muhammad
AU - Sultan, Haseeb
AU - Hong, Jin Seong
AU - Kim, Seung Gu
AU - Akram, Rehan
AU - Gondal, Hafiz Ali Hamza
AU - Tariq, Muhammad Hamza
AU - Park, Kang Ryoung
N1 - Publisher Copyright:
© 2024 The Authors
PY - 2024/12
Y1 - 2024/12
N2 - Previous studies have demonstrated significant performance in the field of virus classification; however, they focused on the classification of a small number of virus classes, with a maximum of 16 classes. To address this limitation, this study aims to create a deep learning-based network that outperforms state-of-the-art (SOTA) models in the classification of 22 different virus classes with the fewest possible trainable parameters. We introduce an automatic identification system for virus classes based on our classification-driven retrieval framework. The proposed dilated multilevel fused network (DMLF-Net) utilizes the multilevel feature fusion concept within a network to exploit more abstract features for microscopic data analysis. A multi-stage training strategy was applied to achieve optimal model convergence without overfitting the training data. We evaluated the performance of DMLF-Net on three open databases, including two virus datasets and one bacterial species dataset. The results demonstrated an accuracy of 89.89%, a weighted harmonic mean of precision and recall (F1-score) of 83.39%, and an area under the curve (AUC) of 92.50% for the first virus dataset. For the second virus dataset, the accuracy was 80.70%, the F1-score was 81.20%, and the AUC was 86.20%. For the third (bacterial species) dataset, the accuracy was 95.93% and the F1-score was 96.24%. DMLF-Net outperforms SOTA methods in terms of classification accuracy while using nearly 5.3 times fewer trainable parameters (25.5 million) than the second-best model, visual geometry group (VGG)-16 (134.3 million).
AB - Previous studies have demonstrated significant performance in the field of virus classification; however, they focused on the classification of a small number of virus classes, with a maximum of 16 classes. To address this limitation, this study aims to create a deep learning-based network that outperforms state-of-the-art (SOTA) models in the classification of 22 different virus classes with the fewest possible trainable parameters. We introduce an automatic identification system for virus classes based on our classification-driven retrieval framework. The proposed dilated multilevel fused network (DMLF-Net) utilizes the multilevel feature fusion concept within a network to exploit more abstract features for microscopic data analysis. A multi-stage training strategy was applied to achieve optimal model convergence without overfitting the training data. We evaluated the performance of DMLF-Net on three open databases, including two virus datasets and one bacterial species dataset. The results demonstrated an accuracy of 89.89%, a weighted harmonic mean of precision and recall (F1-score) of 83.39%, and an area under the curve (AUC) of 92.50% for the first virus dataset. For the second virus dataset, the accuracy was 80.70%, the F1-score was 81.20%, and the AUC was 86.20%. For the third (bacterial species) dataset, the accuracy was 95.93% and the F1-score was 96.24%. DMLF-Net outperforms SOTA methods in terms of classification accuracy while using nearly 5.3 times fewer trainable parameters (25.5 million) than the second-best model, visual geometry group (VGG)-16 (134.3 million).
KW - Deep learning
KW - Dilated multilevel feature fusion
KW - Multi-stage training strategy
KW - Transmission electron microscopy
KW - Virus classification
UR - http://www.scopus.com/inward/record.url?scp=85204485228&partnerID=8YFLogxK
U2 - 10.1016/j.engappai.2024.109348
DO - 10.1016/j.engappai.2024.109348
M3 - Article
AN - SCOPUS:85204485228
SN - 0952-1976
VL - 138
JO - Engineering Applications of Artificial Intelligence
JF - Engineering Applications of Artificial Intelligence
M1 - 109348
ER -