Published in: International Journal of Computer Assisted Radiology and Surgery 8/2023

06-02-2023 | Ultrasound | Original Article

Spatiotemporal analysis of speckle dynamics to track invisible needle in ultrasound sequences using convolutional neural networks: a phantom study

Authors: Amin Amiri Tehrani Zade, Maryam Jalili Aziz, Hossein Majedi, Alireza Mirbagheri, Alireza Ahmadian



Abstract

Purpose

Accurate needle placement at the target point is critical for ultrasound-guided interventions such as biopsies and epidural injections. However, keeping the needle aligned with the thin imaging plane of the transducer is challenging, as misalignment degrades its visibility to the naked eye. We have therefore developed a CNN-based framework that tracks the needle using the spatiotemporal features of speckle dynamics.

Methods

Three key techniques optimize the network for this application. First, we used the Gunnar-Farneback (GF) algorithm, a classical motion-field estimation technique, to augment the model input with spatiotemporal features extracted from a stack of consecutive frames. Second, we designed an efficient network based on the state-of-the-art Yolo framework (nYolo). Lastly, an Assisted Excitation (AE) module was added at the neck of the network to handle the class-imbalance problem.
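The input-augmentation step above can be sketched as follows: estimate a dense motion field between the two most recent frames and stack it with the raw frames to form a multi-channel spatiotemporal input. This is a minimal illustration, not the authors' implementation; in practice the paper uses the Gunnar-Farneback algorithm (e.g., OpenCV's calcOpticalFlowFarneback), for which a simple exhaustive block-matching estimator is substituted here so the sketch stays self-contained. The function names block_flow and build_input are hypothetical.

```python
import numpy as np


def block_flow(prev, nxt, block=8, search=2):
    """Dense motion field via exhaustive block matching (SSD).

    A deterministic stand-in for Gunnar-Farneback optical flow; a real
    pipeline would call cv2.calcOpticalFlowFarneback instead.
    Returns per-pixel horizontal (u) and vertical (v) displacement maps.
    """
    H, W = prev.shape
    u = np.zeros((H, W), np.float32)
    v = np.zeros((H, W), np.float32)
    # Iterate over interior blocks so every candidate shift stays in bounds.
    for y in range(search, H - block - search + 1, block):
        for x in range(search, W - block - search + 1, block):
            ref = prev[y:y + block, x:x + block]
            best, best_uv = np.inf, (0.0, 0.0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = nxt[y + dy:y + dy + block, x + dx:x + dx + block]
                    ssd = float(np.sum((ref - cand) ** 2))
                    if ssd < best:
                        best, best_uv = ssd, (float(dx), float(dy))
            u[y:y + block, x:x + block] = best_uv[0]
            v[y:y + block, x:x + block] = best_uv[1]
    return u, v


def build_input(frames, block=8, search=2):
    """Stack consecutive grayscale frames with the motion field of the
    newest frame pair, yielding a (T+2, H, W) spatiotemporal input."""
    u, v = block_flow(frames[-2], frames[-1], block, search)
    return np.stack(list(frames) + [u, v], axis=0)
```

For example, with two 32x32 frames where the second is the first shifted one pixel to the right, the estimated u-channel is +1 inside the interior blocks, and build_input returns a (4, 32, 32) tensor ready to feed a detector.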

Results

Fourteen freehand ultrasound sequences were collected by steeply inserting an injection needle into the Ultrasound Compatible Lumbar Epidural Simulator and Femoral Vascular Access Ezono test phantoms. The dataset was divided into two sub-categories. In the second, more challenging category, in which the needle is entirely invisible, the angle and tip localization errors were 2.43 ± 1.14° and 2.3 ± 1.76 mm using Yolov3+GF+AE, and 2.08 ± 1.18° and 2.12 ± 1.43 mm using nYolo+GF+AE.

Conclusion

The proposed method has the potential to track the needle more reliably than other state-of-the-art methods and can accurately localize it in 2D B-mode US images in real time, allowing its use in current ultrasound-guided intervention procedures.
Metadata
Title
Spatiotemporal analysis of speckle dynamics to track invisible needle in ultrasound sequences using convolutional neural networks: a phantom study
Authors
Amin Amiri Tehrani Zade
Maryam Jalili Aziz
Hossein Majedi
Alireza Mirbagheri
Alireza Ahmadian
Publication date
06-02-2023
Publisher
Springer International Publishing
Published in
International Journal of Computer Assisted Radiology and Surgery / Issue 8/2023
Print ISSN: 1861-6410
Electronic ISSN: 1861-6429
DOI
https://doi.org/10.1007/s11548-022-02812-y
