Published in: Neuroradiology 11/2020

01-11-2020 | Glioblastoma | Short Report

Implementation of model explainability for a basic brain tumor detection using convolutional neural networks on MRI slices

Authors: Paul Windisch, Pascal Weber, Christoph Fürweger, Felix Ehret, Markus Kufeld, Daniel Zwahlen, Alexander Muacevic


Abstract

Purpose

While neural networks are gaining popularity in medical research, attempts to make a model's decisions explainable are often made only toward the end of the development process, once high predictive accuracy has been achieved.

Methods

To assess the advantages of implementing explainability features early in the development process, we trained a neural network to differentiate between MRI slices containing a vestibular schwannoma, a glioblastoma, or no tumor.
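The report does not detail the architecture or the specific explainability features used. One commonly used, lightweight way to surface model uncertainty early in development is Monte-Carlo dropout (Gal & Ghahramani): dropout is left active at inference and the spread across stochastic forward passes signals how confident the model is. The toy numpy sketch below is an illustration of that idea on a three-class head, not the paper's implementation; all weights and dimensions are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def mc_dropout_predict(x, W1, W2, n_samples=200, p_drop=0.5):
    """Run the network n_samples times with dropout left ON and collect the
    class probabilities; their spread is a cheap per-slice uncertainty signal."""
    samples = []
    for _ in range(n_samples):
        h = np.maximum(x @ W1, 0.0)                              # hidden ReLU layer
        mask = (rng.random(h.shape) >= p_drop) / (1.0 - p_drop)  # inverted dropout
        samples.append(softmax((h * mask) @ W2))                 # 3-way softmax head
    samples = np.stack(samples)
    return samples.mean(axis=0), samples.std(axis=0)             # prediction, uncertainty

# Toy weights standing in for a trained slice classifier
x = rng.standard_normal(16)        # feature vector for one MRI slice (hypothetical)
W1 = rng.standard_normal((16, 8))
W2 = rng.standard_normal((8, 3))   # schwannoma / glioblastoma / no tumor
mean_p, std_p = mc_dropout_predict(x, W1, W2)
```

A high `std_p` for a slice would flag a prediction the network is unsure about, which is useful feedback long before final accuracy is reached.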

Results

Making the network's decisions more explainable helped to identify potential bias and to choose appropriate training data.
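A common way to spot such bias visually is Grad-CAM (Selvaraju et al.): the feature maps of the last convolutional layer are weighted by their spatially averaged gradients and summed into a heatmap, showing which image regions drove the prediction. If the heatmap highlights skull edges or annotations instead of tumor tissue, the training data likely carries a shortcut. The sketch below is a minimal numpy rendition of the technique with synthetic inputs, not the paper's code; a real pipeline would obtain the activations and gradients from the trained network.

```python
import numpy as np

def grad_cam(activations, gradients):
    """Grad-CAM: weight each feature map by its spatially averaged gradient,
    sum the weighted maps, and keep only positive evidence (ReLU)."""
    # activations: (H, W, K) last conv-layer outputs for one slice
    # gradients:   (H, W, K) gradient of the class score w.r.t. those outputs
    weights = gradients.mean(axis=(0, 1))                  # one alpha_k per map
    cam = np.tensordot(activations, weights, axes=([2], [0]))
    cam = np.maximum(cam, 0.0)                             # ReLU: positive evidence only
    if cam.max() > 0:
        cam /= cam.max()                                   # scale to [0, 1] for overlay
    return cam

# Synthetic activations/gradients standing in for a real backward pass
rng = np.random.default_rng(1)
acts = rng.random((7, 7, 32))
grads = rng.standard_normal((7, 7, 32))
heatmap = grad_cam(acts, grads)
```

The resulting `heatmap` can be upsampled to the slice resolution and overlaid on the MRI to check whether the network attends to anatomically plausible regions.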

Conclusion

Model explainability should be considered in the early stages of training a neural network for medical purposes, as it may save time in the long run and will ultimately help physicians integrate the network's predictions into clinical decision-making.
Metadata
Title
Implementation of model explainability for a basic brain tumor detection using convolutional neural networks on MRI slices
Authors
Paul Windisch
Pascal Weber
Christoph Fürweger
Felix Ehret
Markus Kufeld
Daniel Zwahlen
Alexander Muacevic
Publication date
01-11-2020
Publisher
Springer Berlin Heidelberg
Published in
Neuroradiology / Issue 11/2020
Print ISSN: 0028-3940
Electronic ISSN: 1432-1920
DOI
https://doi.org/10.1007/s00234-020-02465-1
