Abstract
Objective. Modern PET scanners offer precise time-of-flight (TOF) information, improving the SNR of the reconstructed images. Timing calibrations are performed to mitigate the degrading effects of the system components and to provide reliable TOF information. Traditional calibration procedures typically apply static or linear corrections, with the drawback that higher-order timing skews and event-to-event corrections are not addressed. Recent research demonstrated significant improvements in the achievable timing resolution when combining conventional calibration approaches with machine learning, at the cost of extensive calibration times that are infeasible for clinical application. In this work, we took the first steps towards an in-system application and analyzed the effects of varying data sparsity on a machine learning timing calibration, aiming to accelerate the calibration procedure. Furthermore, we demonstrated the versatility of our calibration concept by applying the procedure for the first time to analog readout technology. Approach. We modified experimentally acquired calibration data used for training with regard to their statistical and spatial sparsity, mimicking reduced measurement time and reduced variability of the training data. The trained models were tested on unseen test data characterized by fine spatial sampling and rich statistics. In total, 80 decision tree models with identical hyperparameter settings were trained and holistically evaluated against data-scientific, physics-based, and PET-based quality criteria. Main results. The calibration procedure can be shortened from several days to a few minutes without sacrificing quality, still significantly improving the timing resolution from
(304 ± 5) ps to (216 ± 1) ps
compared to conventionally used analytical calibration methods. Significance. This work serves as a first step towards making the developed machine learning-based calibration suitable for an in-system application, allowing the method's capabilities to be exploited at the system level. Furthermore, it demonstrates the functionality of the methodology on detectors using analog readout technology. The holistic evaluation criteria proposed here serve as a guideline for future evaluations of machine learning-based calibration approaches.