Open Access 18-06-2022 | Positron Emission Tomography | Original Article

Direct inference of Patlak parametric images in whole-body PET/CT imaging using convolutional neural networks

Authors: Neda Zaker, Kamal Haddad, Reza Faghihi, Hossein Arabi, Habib Zaidi

Published in: European Journal of Nuclear Medicine and Molecular Imaging | Issue 12/2022

Abstract

Purpose

This study proposed and investigated the feasibility of estimating the Patlak-derived influx rate constant (Ki) directly from standardized uptake value (SUV) and/or dynamic PET image series.
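For context, a standard formulation of Patlak graphical analysis (not reproduced from the article itself): for an irreversibly trapped tracer such as 18F-FDG, Ki is the slope of the Patlak plot relating the tissue time-activity curve C_T(t) to the plasma input function C_p(t) at late times t > t*, with intercept V_0 corresponding to the apparent distribution volume:

\[
\frac{C_T(t)}{C_p(t)} \;=\; K_i \,\frac{\int_0^{t} C_p(\tau)\, d\tau}{C_p(t)} \;+\; V_0, \qquad t > t^{*}.
\]

The reference Ki-Patlak maps described below correspond to voxel-wise estimates of this slope obtained from the full dynamic acquisition.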

Methods

Whole-body 18F-FDG dynamic PET images of 19 subjects, each consisting of 13 frames (passes), were used to train a residual deep learning model with SUV and/or dynamic series as input and Ki-Patlak (slope) images as output. Training and evaluation were performed using a nine-fold cross-validation scheme. Owing to the availability of SUV images acquired 60 min post-injection (20 min total acquisition time), the data sets used for model training were split into two groups: “With SUV” and “Without SUV.” For the “With SUV” group, the model was first trained using only SUV images, and the passes (starting from pass 13, the last pass, back to pass 9) were then added to the training one pass at a time. Six models were developed for this group, with input data consisting of SUV, SUV plus pass 13, SUV plus passes 13 and 12, SUV plus passes 13 to 11, SUV plus passes 13 to 10, and SUV plus passes 13 to 9. For the “Without SUV” group, the same procedure was followed without the SUV images (five models were developed, with input data of passes 13 to 9). For model performance evaluation, the mean absolute error (MAE), mean error (ME), mean relative absolute error (MRAE%), relative error (RE%), mean squared error (MSE), root mean squared error (RMSE), peak signal-to-noise ratio (PSNR), and structural similarity index (SSIM) were calculated between the Ki-Patlak images predicted by the two groups of models and the reference Ki-Patlak images generated through Patlak analysis of the whole acquired data sets. For region-level evaluation of the method, regions of interest (ROIs) were drawn on representative organs, including the lung, liver, brain, and heart, and around the identified malignant lesions.
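As a point of reference for the evaluation metrics listed above, the following Python sketch (illustrative only, not the authors' implementation; the function name, the use of scikit-image, and the small epsilon guard are assumptions) computes the voxel-wise MAE, ME, MRAE%, RE%, MSE, RMSE, PSNR, and SSIM between a predicted and a reference Ki-Patlak map stored as NumPy arrays:

import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def ki_map_metrics(ki_pred, ki_ref, eps=1e-8):
    # Voxel-wise error metrics between a predicted and a reference Ki-Patlak map.
    # ki_pred, ki_ref: NumPy arrays of identical shape; eps avoids division by
    # zero in the relative metrics. Illustrative sketch only.
    diff = ki_pred - ki_ref
    mae = np.mean(np.abs(diff))                                    # MAE
    me = np.mean(diff)                                             # ME (bias)
    mrae = 100.0 * np.mean(np.abs(diff) / (np.abs(ki_ref) + eps))  # MRAE%
    re = 100.0 * np.mean(diff / (ki_ref + eps))                    # RE%
    mse = np.mean(diff ** 2)                                       # MSE
    rmse = np.sqrt(mse)                                            # RMSE
    data_range = float(ki_ref.max() - ki_ref.min())
    psnr = peak_signal_noise_ratio(ki_ref, ki_pred, data_range=data_range)
    ssim = structural_similarity(ki_ref, ki_pred, data_range=data_range)
    return {"MAE": mae, "ME": me, "MRAE%": mrae, "RE%": re,
            "MSE": mse, "RMSE": rmse, "PSNR": psnr, "SSIM": ssim}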

Results

The MRAE%, RE%, PSNR, and SSIM indices across all patients were estimated as 7.45 ± 0.94%, 4.54 ± 2.93%, 46.89 ± 2.93, and 1.00 ± 6.7 × 10⁻⁷, respectively, for the models using SUV plus passes 13 to 9 as input. The parameters predicted using passes 13 to 11 as input were nearly identical to those predicted using SUV plus passes 13 to 9 as input. Nevertheless, the bias decreased steadily as passes were added down to pass 11, beyond which the further reduction in error was negligible. Hence, the model using SUV plus passes 13 to 9 as input had the lowest quantification bias. On visual inspection, lesions invisible in the SUV images, the Ki-Patlak images, or both appeared similarly in the predicted images, with tolerable bias.

Conclusion

This study demonstrated the feasibility of a direct deep learning-based approach to estimating Ki-Patlak parametric maps without requiring the input function and using fewer passes. This would lead to shorter acquisition times for whole-body (WB) dynamic imaging with acceptable bias and comparable lesion detectability performance.
Metadata
Title
Direct inference of Patlak parametric images in whole-body PET/CT imaging using convolutional neural networks
Authors
Neda Zaker
Kamal Haddad
Reza Faghihi
Hossein Arabi
Habib Zaidi
Publication date
18-06-2022
Publisher
Springer Berlin Heidelberg
Published in
European Journal of Nuclear Medicine and Molecular Imaging / Issue 12/2022
Print ISSN: 1619-7070
Electronic ISSN: 1619-7089
DOI
https://doi.org/10.1007/s00259-022-05867-w
