Journal of Ambient Intelligence and Humanized Computing (2021) 12:455–483
https://doi.org/10.1007/s12652-020-01998-w
ORIGINAL RESEARCH
Deep learning neural networks for medical image segmentation
of brain tumours for diagnosis: a recent review and taxonomy
Sindhu Devunooru1 · Abeer Alsadoon1 · P. W. C. Chandana1 · Azam Beg2
Received: 28 April 2019 / Accepted: 17 April 2020 / Published online: 6 May 2020
© Springer-Verlag GmbH Germany, part of Springer Nature 2020
Abstract
Brain tumour identification with traditional magnetic resonance imaging (MRI) tends to be time-consuming and in most cases,
reading of the resulting images by human agents is prone to error, making it desirable to use automated image segmentation.
This is a multi-step process involving: (a) collecting data in the form of raw or processed images, (b) removing bias by
using pre-processing, (c) processing the image and locating the brain tumour, and (d) showing the tumour affected areas
on a computer screen or projector. Several systems have been proposed for medical image segmentation but have not been
evaluated in the field. This may be due to ongoing issues of image clarity, grey and white matter present in a scan image,
lack of knowledge of the end user and constraints arising from MRI imaging systems. This makes it imperative to develop a
comprehensive technique for the accurate diagnosis of brain tumours in MRI images. In this paper, we introduce a taxonomy
consisting of ‘Data, Image segmentation processing, and View’ (DIV) which are the major components required to develop
a high-end system for brain tumour diagnosis based on deep learning neural networks. The DIV taxonomy is evaluated based
on system completeness and acceptance. The utility of the DIV taxonomy is demonstrated by classifying 30 state-of-the-art
publications in the domain of medical image segmentation systems based on deep neural networks. The results demonstrate
that, although several components of medical image segmentation systems have been evaluated by identifying their role and
efficiency in this domain, few have been validated.
Keywords Taxonomy · Medical image segmentation · Magnetic resonance imaging (MRI) · Brain tumour · Deep neural
networks (DNN) · Diagnosis · Image contrast · Image clustering · Re-clustering · Image pixels · Tumour boundaries
Abbreviations
MRI Magnetic resonance imaging
MCFM Modified fuzzy C-means
CLE Confocal laser endomicroscopy
CNN Convolutional neural networks
DCNN Deep convolutional neural network
ACM Active contour models
CRFs Conditional random fields
FCNN Fully convolutional neural network
LHNPSO Low-discrepancy sequence initialized particle swarm optimization algorithm with high-order nonlinear time-varying inertia weight
KFECSB Kernelized fuzzy entropy clustering with
spatial information and bias correction
RF Classifier Random forests classifier
1 Introduction
Brain image segmentation processing is the subject of a significant body of research that aims to develop systems for accurate cancer diagnosis, capable of differentiating tumour-affected from healthy tissue. This is achieved through image pre-processing, clustering, and post-segmentation: enhancing contrast in the raw or processed magnetic resonance imaging (MRI) data, using clustering algorithms to automatically segment images into different parts, and fine-tuning the output data to eliminate bias. This process enhances the accuracy of MRI images, making tumour-affected regions easily identifiable (Chen et al. 2017a, b) through greater clarity of images, thus eliminating the issues faced in manual segmentation processes.
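To make the pre-process/cluster/post-process pipeline described above concrete, the sketch below implements a deliberately minimal, intensity-only version in pure Python: normalisation as pre-processing, 1-D k-means as the clustering step, and selection of the brightest cluster as a stand-in segmentation mask. This is an illustrative toy on a synthetic 4×4 "scan", not the deep-learning methods surveyed in this review; all function names (`normalise`, `kmeans_1d`, `segment`) are our own, and real systems operate on full MRI volumes with far more sophisticated models.

```python
def normalise(image):
    """Pre-processing: scale pixel intensities to [0, 1]."""
    flat = [p for row in image for p in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1  # avoid division by zero on flat images
    return [[(p - lo) / span for p in row] for row in image]

def kmeans_1d(values, k=2, iters=20):
    """Clustering: simple 1-D k-means on intensities; returns cluster centres."""
    lo, hi = min(values), max(values)
    centres = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda c: abs(v - centres[c]))
            buckets[idx].append(v)
        centres = [sum(b) / len(b) if b else centres[i]
                   for i, b in enumerate(buckets)]
    return centres

def segment(image, k=2):
    """Post-segmentation: label each pixel; treat the brightest cluster as 'tumour'."""
    norm = normalise(image)
    flat = [p for row in norm for p in row]
    centres = kmeans_1d(flat, k)
    bright = max(range(k), key=lambda c: centres[c])
    return [[1 if min(range(k), key=lambda c: abs(p - centres[c])) == bright else 0
             for p in row] for row in norm]

# Synthetic 4x4 "scan": a bright 2x2 blob (high intensity) on a dark background.
scan = [[10, 12, 11, 10],
        [10, 90, 95, 11],
        [12, 92, 94, 10],
        [11, 10, 12, 11]]
mask = segment(scan)  # 1s mark the bright blob, 0s the background
```

The same three-stage structure (pre-process, cluster, refine) recurs throughout the systems classified in this review; the deep-learning approaches surveyed replace the hand-crafted clustering step with learned feature extraction and classification.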
* Abeer Alsadoon
aalsadoon@studygroup.com
1 School of Computing and Mathematics, Charles Sturt University, Sydney Campus, Sydney, Australia
2 College of Information Technology, United Arab Emirates University, Al Ain, UAE