Transformation of PET raw data into images for event classification using convolutional neural networks
-
Published: 2023
Issue: 8
Volume: 20
Pages: 14938-14958
-
ISSN: 1551-0018
-
Container-title: Mathematical Biosciences and Engineering
-
Short-container-title: MBE
Author:
Konieczka Paweł 1; Raczyński Lech 1; Wiślicki Wojciech 1; Fedoruk Oleksandr 1; Klimaszewski Konrad 1; Kopka Przemysław 1; Krzemień Wojciech 2; Shopa Roman Y. 1; Baran Jakub 3,4; Coussat Aurélien 3,4; Chug Neha 3,4; Curceanu Catalina 5; Czerwiński Eryk 3,4; Dadgar Meysam 3,4; Dulski Kamil 3,4; Gajos Aleksander 3,4; Hiesmayr Beatrix C. 6; Kacprzak Krzysztof 3,4; Kapłon Łukasz 3,4; Korcyl Grzegorz 3,4; Kozik Tomasz 3,4; Kumar Deepak 3,4; Niedźwiecki Szymon 3,4; Parzych Szymon 3,4; Pérez del Río Elena 3,4; Sharma Sushil 3,4; Shivani Shivani 3,4; Skurzok Magdalena 3,4,5; Stępień Ewa Łucja 3,4; Tayefi Faranak 3,4; Moskal Paweł 3,4
Affiliation:
1. Department of Complex Systems, National Centre for Nuclear Research, 05-400 Świerk, Poland
2. High Energy Physics Division, National Centre for Nuclear Research, 05-400 Świerk, Poland
3. Marian Smoluchowski Institute of Physics, Jagiellonian University, 31-348 Cracow, Poland
4. Center for Theranostics, Jagiellonian University, 31-348 Cracow, Poland
5. INFN, National Laboratory of Frascati, 00044 Frascati, Italy
6. University of Vienna, Faculty of Physics, 1090 Vienna, Austria
Abstract
In positron emission tomography (PET) studies, convolutional neural networks (CNNs) may be applied as a pattern recognition tool directly to the reconstructed distribution of the radioactive tracer injected into the patient's body. However, unprocessed PET coincidence data exist in tabular format. This paper develops a transformation of tabular data into n-dimensional matrices as a preparation stage for CNN-based classification. The method explicitly introduces a nonlinear transformation at the feature engineering stage and then uses principal component analysis to create the images. We apply the proposed methodology to the classification of simulated PET coincidence events originating from the NEMA IEC and anthropomorphic XCAT phantoms. Comparative studies of neural network architectures, including multilayer perceptrons and convolutional networks, were conducted. The developed method increased the initial number of features from 6 to 209 and yielded the best precision (79.8%) for all tested neural network architectures; it also showed the smallest decrease in performance when the test data were changed to the other phantom.
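The pipeline described in the abstract (tabular coincidence data, nonlinear feature expansion from 6 to 209 features, PCA, image construction for a CNN) can be sketched as follows. This is a minimal, hypothetical illustration rather than the authors' implementation: the degree-4 polynomial expansion is only an assumption motivated by the fact that it maps 6 features to exactly 209, and the 14x14 image size and the reshaping of PCA scores into a per-event image are illustrative choices not taken from the paper.

```python
# Hypothetical sketch of a tabular-to-image pipeline for PET coincidence events.
# Assumptions (not from the paper): degree-4 polynomial expansion, 14x14 images.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))          # placeholder for 6 raw coincidence features per event

# Nonlinear feature engineering: a degree-4 polynomial expansion of 6 features
# (without the bias term) produces exactly 209 features, matching the count
# reported in the abstract; whether this is the authors' transformation is an assumption.
expand = PolynomialFeatures(degree=4, include_bias=False)
X_poly = expand.fit_transform(X)        # shape: (1000, 209)

# Standardize before PCA so all engineered features contribute comparably.
X_std = StandardScaler().fit_transform(X_poly)

# Project onto principal components and arrange the scores of each event
# into a small 2-D "image" suitable as CNN input (196 components -> 14x14).
pca = PCA(n_components=196)
X_pca = pca.fit_transform(X_std)

images = X_pca.reshape(-1, 14, 14, 1)   # per-event images, channels-last
print(images.shape)                     # (1000, 14, 14, 1)
```

In a sketch like this, the resulting image tensor would be fed to a standard 2-D CNN classifier, while the same 209-feature table could be passed directly to a multilayer perceptron for the comparative study mentioned in the abstract.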
Publisher
American Institute of Mathematical Sciences (AIMS)
Subject
Applied Mathematics, Computational Mathematics, General Agricultural and Biological Sciences, Modeling and Simulation, General Medicine
References: 46 articles.
Cited by: 1 article.