Abstract
Introduction: proteins play a critical role in cellular functions, and the evaluation of protein patterns in microscope images is vital for biomedical research. This study introduces a hybrid framework for human protein classification that combines a Fast Spectral Convolutional Neural Network (CNN) with Multi-Head Attention and GAN augmentation. This approach aims to automate the analysis of microscope images containing mixed protein patterns, thereby accelerating biomedical research insights into human cells and diseases.
Method: the framework integrates spectral processing layers and attention mechanisms into the Fast Spec CNN architecture to enhance classification accuracy and interpretability. Through GAN augmentation, synthetic protein images are generated to complement the real dataset, bolstering model generalization and robustness. The Fast Spec CNN model, coupled with Multi-Head Attention, adeptly captures spectral features and discerns discriminative representations.
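To make the architectural idea concrete, the following is a minimal sketch of a spectral convolution block followed by multi-head attention, written in PyTorch. It is an illustrative assumption of how such a pipeline could be wired together; the layer names, sizes, and the 28-class output head are placeholders and do not reproduce the authors' exact configuration or the GAN augmentation stage.

import torch
import torch.nn as nn
import torch.fft


class SpectralBlock(nn.Module):
    """Learnable filtering in the frequency domain: FFT -> filter -> inverse FFT."""

    def __init__(self, channels: int, height: int, width: int):
        super().__init__()
        # Complex-valued filter over the half-spectrum returned by rfft2.
        self.weight = nn.Parameter(
            torch.randn(channels, height, width // 2 + 1, dtype=torch.cfloat) * 0.02
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        spec = torch.fft.rfft2(x, norm="ortho")          # spectral representation
        spec = spec * self.weight                        # element-wise spectral filtering
        return torch.fft.irfft2(spec, s=x.shape[-2:], norm="ortho")


class FastSpecCNNSketch(nn.Module):
    """Spectral convolution followed by multi-head self-attention over spatial tokens."""

    def __init__(self, in_channels=3, embed_dim=64, num_heads=4, num_classes=28, size=64):
        super().__init__()
        self.stem = nn.Conv2d(in_channels, embed_dim, kernel_size=3, padding=1)
        self.spectral = SpectralBlock(embed_dim, size, size)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.spectral(self.stem(x))                  # (B, C, H, W) spectral features
        tokens = x.flatten(2).transpose(1, 2)            # (B, H*W, C) attention tokens
        attended, _ = self.attn(tokens, tokens, tokens)  # multi-head self-attention
        pooled = attended.mean(dim=1)                    # global average over tokens
        return self.head(pooled)                         # multi-label protein logits


# Example: a batch of two 64x64 microscopy patches with 3 channels.
logits = FastSpecCNNSketch()(torch.randn(2, 3, 64, 64))
print(logits.shape)  # torch.Size([2, 28])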
Results: the study achieved an accuracy of 98.79% on the image segmentation task of the Human Protein Atlas dataset, outperforming prior methodologies. The results underscore the efficacy of the proposed model in accurately classifying proteins across various hierarchical levels simultaneously. GAN augmentation enriches dataset variability and strengthens model resilience.
Conclusion: this study makes significant contributions to automated biomedical image analysis, providing a valuable tool for the expedited exploration of human cells and diseases. The architectural flexibility of the model enables end-to-end processing of protein images, offering interpretable representations and deeper insights into cellular structures and functions. Compared to earlier approaches, such as UNet, DeepHiFam with ProtCNN, ProPythia, Protein Bert, ELM, and CNN, this framework achieves the highest accuracy among the compared methodologies at 98.79%.
Publisher
Salud, Ciencia y Tecnología