Clinical and Experimental Medicine (2024) 24:110
https://doi.org/10.1007/s10238-024-01377-1
RESEARCH
The diagnostic value of multimodal imaging based on MR combined with ultrasound in benign and malignant breast diseases
Dong Bai1 · Nan Zhou2 · Xiaofei Liu3 · Yuanzi Liang1 · Xiaojun Lu1 · Jiajun Wang2 · Lei Liang2 · Zhiqun Wang1
Received: 11 March 2024 / Accepted: 13 May 2024
© The Author(s) 2024
Abstract
We aimed to construct and validate a multimodal MRI combined with ultrasound model based on radiomics for the evaluation of benign and malignant breast diseases. The preoperative enhanced MRI and ultrasound images of 131 patients with breast diseases confirmed by pathology at Aerospace Center Hospital from January 2021 to August 2023 were retrospectively analyzed, including 73 benign and 58 malignant diseases. All patients underwent ultrasound and 3.0 T multiparameter MRI. The data were then divided into a training set and a validation set at a 7:3 ratio. Regions of interest were drawn layer by layer on the ultrasound and enhanced MR sequences to extract radiomics features, and the optimal features were selected by the best feature-screening method. A logistic regression classifier was used to establish models from the selected features: an ultrasound model, an MRI model, and an ultrasound combined with MRI model. Model performance was evaluated by the area under the receiver operating characteristic curve (AUC), sensitivity, specificity, and accuracy. The F-test based on ANOVA screened out the 20 best ultrasound features, the 11 best MR features, and the 14 best features of the combined model; texture features accounted for the largest proportion (79% in the combined model). The ultrasound combined with MR image fusion model based on the logistic regression classifier had the best diagnostic performance: the AUCs of the training and validation groups were 0.92 and 0.91, the sensitivities 0.80 and 0.67, the specificities 0.90 and 0.94, and the accuracies 0.84 and 0.79, respectively. It outperformed the ultrasound-only model (validation AUC 0.82) and the MR-only model (validation AUC 0.85). Compared with conventional ultrasound or magnetic resonance diagnosis of breast diseases, the radiomics-based multimodal model of MRI combined with ultrasound can more accurately predict benign and malignant breast diseases, thus providing a better basis for clinical diagnosis and treatment.
Keywords Breast disease · Magnetic resonance imaging · Ultrasound · Radiomics · Multimodal
Introduction
The anatomy of the mammary gland consists of different
blood vessels, connective tissue, milk ducts, lobules, and
lymphatic vessels. When breast tissue grows abnormally
and cell differentiation becomes uncontrolled, tumors form
in the milk ducts or lobules. The developing tumor may be
benign or malignant [1]. Breast cancer is the most common type of cancer in women and their leading cause of cancer death [2]. According to statistics published by the World Health Organization (WHO), there are about 1,350,000 cases of breast cancer and 460,000 deaths worldwide every year [3]. The
incidence of breast cancer has been on the rise for most of
the last four decades, increasing by 0.5% per year over the most recent data years (2010–2019), driven primarily by localized breast cancer and hormone receptor-positive disease
Dong Bai and Nan Zhou have contributed equally to this work.
* Lei Liang
lianglei_csk@126.com
* Zhiqun Wang
439387525@qq.com
1 Department of Radiology, Aerospace Center Hospital, Beijing, China
2 Department of Ultrasound, Aerospace Center Hospital, Beijing, China
3 Department of Radiology, Liangxiang Hospital, Fangshan District, Beijing, China
Content courtesy of Springer Nature, terms of use apply. Rights reserved.
[4]. From 1975 to 1989, the overall breast cancer death rate increased by 0.4% per year, but from 1989 to 2020 it decreased by 43%. The decline in breast cancer mortality is attributed
to better and more targeted treatment and early detection
through screening breast imaging [5, 6].
With the development of imaging technology, a variety of imaging methods can be applied to the early screening of breast disease, such as X-ray mammography, magnetic resonance imaging, positron emission tomography, computed tomography, and ultrasound.
Histopathology is the gold standard for the diagnosis of breast disease. Ultrasound has gradually become the main examination method for breast diseases because it allows real-time evaluation from different angles and directions without radiation. MRI, especially enhanced MRI, can show the details of breast tissue more clearly and support the diagnosis of suspicious lesions and delineation of the extent of invasion. However, conventional ultrasound or MRI alone has many deficiencies in the diagnosis of breast cancer. The imaging appearances of benign and malignant breast tumors overlap to a certain extent, and the lack of specific signs of malignancy reduces the accuracy of clinical diagnosis, resulting in a low early detection rate of breast cancer and loss of the best window for treatment. This is also an important reason for the poor prognosis of some breast cancer patients. At the same time, manual review of breast images may reduce diagnostic efficiency and increase misdiagnosis or missed diagnosis because of the large number of images and variable reader experience. Therefore, it is very important to
find new techniques and methods for the diagnosis of breast masses. In recent years, artificial intelligence technology has made great progress in the automatic analysis and abnormality detection of medical images. Compared with manual inspection, automatic image analysis based on artificial intelligence avoids the tedious and time-consuming screening process and effectively captures valuable information from large amounts of image data [7, 8].
Radiomics technology, which is developed from com-
puter-aided detection/diagnosis (CAD) technology, can
characterize the heterogeneity of tumors by mining mas-
sive quantitative imaging features to assist physicians
in making clinical decisions [9]. In the processing and
analysis of breast tumor images, radiomics methods have
also been widely used. Generally speaking, the process-
ing flow mainly includes image preprocessing, tumor
segmentation, feature extraction and optimization, clas-
sification model construction and testing, and other steps
[10]. With the continuous development of modern imag-
ing technology, the comprehensive application of different
examination methods is helpful to improve the accuracy of
diagnosis, so as to provide better clinical services. Fusion multimodal imaging is a new examination scheme that can provide more information to support the diagnosis of breast diseases. Multimodal examination combined with artificial intelligence can exploit the advantages of different examination techniques and improve the accuracy of clinical diagnosis. However, there
are few studies on the multimodal examination methods
of MR combined with ultrasound based on radiomics in
benign and malignant breast diseases. ROC curve analy-
sis was not carried out in the existing studies, and the
diagnostic effectiveness of different diagnostic schemes
could not be clearly presented. In view of this research status, this study focuses on breast diseases and explores the diagnostic efficacy of radiomics-based multimodal imaging examinations using ultrasound alone, MRI alone, and ultrasound combined with MRI, in order to provide more reference for the clinical diagnosis of breast diseases.
Materials and methods
General information
The clinical and imaging data of 200 patients with breast
diseases confirmed by pathology in Aerospace Center Hos-
pital from January 2021 to August 2023 were retrospec-
tively analyzed, including 100 cases of benign diseases
and 100 cases of malignant diseases. Inclusion criteria
were: (1) benign and malignant breast diseases confirmed
by surgery or needle biopsy; (2) enhanced breast MR and
ultrasound before operation; (3) complete clinical data;
(4) no history of other tumors. Exclusion criteria were: (1) preoperative chemoradiotherapy or endocrine therapy; (2) incomplete clinicopathological data, poor image quality, or heavy artifacts; (3) other benign or malignant tumors; (4) previous unilateral mastectomy. Finally, 131 cases were included in the study (Fig. 1). The 131 patients were randomly divided into a training group (92 cases) and a validation group (39 cases) at a ratio of 7:3. Clinical information was collected, including age and pathological type of breast disease. This study was approved by the ethics committee of Aerospace Center Hospital; informed consent was waived because it was a retrospective study and patient information was kept confidential.
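The 7:3 random split described above can be reproduced with scikit-learn's stratified splitter. This is a sketch under stated assumptions: the feature matrix, labels, and `random_state` below are illustrative stand-ins, not the study's data or the platform's code.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical stand-ins for the 131 patients: a feature matrix and
# benign (0) / malignant (1) labels matching the reported case counts.
rng = np.random.default_rng(0)
features = rng.normal(size=(131, 10))
labels = np.array([0] * 73 + [1] * 58)

# 7:3 split; stratify keeps the benign/malignant ratio similar in both sets.
X_train, X_val, y_train, y_val = train_test_split(
    features, labels, test_size=0.3, stratify=labels, random_state=42
)
print(X_train.shape[0], X_val.shape[0])  # 91 training and 40 validation cases
```

Note that scikit-learn rounds the validation fraction up (here 91/40), while the paper reports a 92/39 split; the exact counts depend on the rounding convention of the splitting tool used.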
MRI and ultrasound equipment parameters
All patients underwent examination on a 3 T Siemens Skyra scanner with a 16-channel breast-specific phased-array surface coil. Patients were placed in the prone position, head first, with the breasts naturally suspended in the coil and the nipple at the center of the coil. The scanning range covered both breasts, the aorta, and the axilla. The specific scanning parameters were as follows: (1) axial T1WI: field of view (FOV) 360 mm × 360 mm, 3 excitations, matrix 320 × 320, interslice gap 1 mm, slice thickness 1.2 mm, flip angle 80°, TR 5.37 ms, TE 2.46 ms; axial STIR T2WI: TE 63 ms, TR 3700 ms, slice thickness 4 mm. (2) Diffusion-weighted imaging (DWI) used a single-shot spin-echo echo-planar imaging (SE-EPI) sequence: FOV 360 mm × 360 mm, slice thickness 5 mm, TE 58 ms, 6 excitations, gap 1 mm, TR 5980 ms, b values 0 and 1000 s/mm²; the apparent diffusion coefficient (ADC) was calculated. (3) Dynamic contrast-enhanced MRI used a 3D rapid gradient-echo sequence: NEX 0.7, FOV 350 mm × 350 mm, no interslice gap, slice thickness 1 mm, TR 4.36 ms, TE 1.56 ms. After the plain scan was completed, the dynamic enhanced scan was performed: a high-pressure syringe injected 0.2 mmol/kg of the contrast agent gadopentetic acid intravenously as a bolus at 2 mL/s, followed by 20 mL of normal saline at the same rate. Dynamic scanning was then performed in 7 phases, the first being the pre-contrast mask and the subsequent ones the enhanced phases. The acquisition time of each dynamic phase was 60 s.
All patients underwent breast ultrasound examination within 1 week before surgery, using a LOGIQ E9 ultrasonic diagnostic instrument with a probe frequency of 6–15 MHz. The patient was placed in the horizontal position, the probe was placed vertically on the breast skin surface, and radial sweeps centered on the nipple were performed in multiple planes. The frequency, focus, gain, and time-compensation curve were adjusted to achieve the best image quality. On the basis of a good two-dimensional image, the maximum section of each nodule was displayed by superimposing color and power Doppler blood flow imaging. According to the size and shape of the nodule, the maximum long-axis section of the lesion was stored as a DICOM image. All images were collected and evaluated by two physicians with more than 6 years of experience in breast ultrasound diagnosis, and the final classification results were obtained by consensus.
Image acquisition and radiomics feature extraction
The ultrasound and early enhanced MRI images were exported from the picture archiving and communication system (PACS) in DICOM format and imported into the Deepwise Multimodal Research Platform version 2.2 (https://keyan.deepwise.com) (Beijing Deepwise and League of PHD Technology Co., Ltd, Beijing, China), an integrated machine learning platform for medical data analysis built on the mature Python pyradiomics (version 3.0.1) and scikit-learn (version 0.22) packages. A radiologist and sonographer (Reader A, with 6 years of experience in breast disease diagnosis and no knowledge of the pathological findings) reviewed all images and semi-automatically delineated the regions of interest (ROI), and a senior radiologist and sonographer (Reader B, with 16 years of experience in breast disease interpretation, also blinded to the pathology) reviewed all of Reader A's segmentations. Readers A and B resolved controversial cases by consensus. Fifty lesions were randomly selected and re-segmented by the senior reader (Reader B) to calculate the intraclass correlation coefficient (ICC).
(ICC). Features with an ICC > 0.75 were considered to have
Fig. 1 Patients selection flow
chart
Content courtesy of Springer Nature, terms of use apply. Rights reserved.
Clinical and Experimental Medicine (2024) 24:110 110 Page 4 of 11
good agreement. The specific flow chart of radiomics analy-
sis in this study was as follows (Fig.2).
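The agreement filter described above (retain features with ICC > 0.75) can be sketched with a generic two-way random-effects, absolute-agreement ICC(2,1) built from the standard ANOVA decomposition; this is not the platform's own routine, and the ratings below are made-up values.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_targets, k_raters) array, e.g. one feature's value for each
    lesion as segmented by Reader A and Reader B.
    """
    n, k = ratings.shape
    grand = ratings.mean()
    # Mean squares from the two-way ANOVA decomposition.
    ms_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    resid = (ratings - ratings.mean(axis=1, keepdims=True)
             - ratings.mean(axis=0, keepdims=True) + grand)
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Two readers in near-perfect agreement on one feature across 6 lesions.
a = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
b = a + 0.01
print(icc_2_1(np.column_stack([a, b])))  # close to 1.0 -> feature kept
```

A feature whose two segmentations yield an ICC at or below 0.75 would be discarded before modeling.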
To eliminate scale differences among the image radiomics features, feature standardization was performed before feature selection. Ten image preprocessing methods were applied before extraction: the original image, Laplacian of Gaussian (LoG) filtering, wavelet filtering, gradient computation, local binary patterns (LBP) in both 2D and 3D, and square, square-root, logarithm, and exponential transforms. The features analyzed in this study included first-order features, shape-based features, and texture features, the latter comprising the gray level co-occurrence matrix (GLCM), gray level run-length matrix (GLRLM), gray level size zone matrix (GLSZM), neighborhood gray-tone difference matrix (NGTDM), and gray level dependence matrix (GLDM). We then conducted feature correlation analysis and eliminated redundant features using a correlation threshold of 0.9: when the linear correlation coefficient between any two independent variables on the training set exceeded this threshold, one of the two features was removed, with priority given to retaining the feature more strongly correlated with the dependent variable. The features surviving correlation analysis were used for further model training. After the correlation analysis, we chose the F test as the feature selection method; that is, features were divided into different populations according to their label categories, and the mean values of the features among the populations were tested for significant differences. Features with significant differences were considered discriminative for classification and were retained.
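The two-stage selection just described (drop one of any feature pair whose linear correlation exceeds 0.9, then keep the features whose group means differ significantly under the ANOVA F-test) can be sketched as follows; the toy feature table, column names, and `k` are illustrative assumptions, not the platform's implementation.

```python
import numpy as np
import pandas as pd
from sklearn.feature_selection import SelectKBest, f_classif

rng = np.random.default_rng(1)
y = np.array([0] * 73 + [1] * 58)  # benign = 0, malignant = 1
# Toy feature table: f0 is informative, f1 duplicates f0, f2 is noise.
X = pd.DataFrame({
    "f0": y + rng.normal(0, 0.3, 131),
    "f2": rng.normal(0, 1.0, 131),
})
X["f1"] = X["f0"] * 0.99 + rng.normal(0, 0.01, 131)

# Stage 1: redundancy filter, drop the later feature of any pair with |r| > 0.9.
corr = X.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
keep = [c for c in X.columns if not (upper[c] > 0.9).any()]
X_reduced = X[keep]

# Stage 2: ANOVA F-test, retain the k features with the largest F-values.
selector = SelectKBest(f_classif, k=1).fit(X_reduced, y)
selected = X_reduced.columns[selector.get_support()]
print(list(selected))  # the duplicated f1 and the noise f2 are filtered out
```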
Model building
Following feature selection, the radiomics models were built using the logistic regression machine learning algorithm. Logistic regression is a typical model that uses regression methods to obtain binary classification results. All data were randomly split into a training set and a validation set at a ratio of 7:3. Machine learning models for the recognition of benign and malignant breast diseases were constructed: an ultrasound model, an MRI model, and an ultrasound combined with MRI model. Model performance was evaluated by the area under the receiver operating characteristic (ROC) curve (AUC), sensitivity, specificity, and accuracy.
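A minimal version of this model-building step, assuming the selected radiomics features are already in hand; the synthetic feature matrix and hyperparameters are illustrative, not the study's.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
y = np.array([0] * 73 + [1] * 58)
# Stand-in for the selected radiomics features (e.g. the 14 combined features).
X = rng.normal(size=(131, 14)) + 0.8 * y[:, None]

X_tr, X_va, y_tr, y_va = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

# Standardize on the training set only, then fit the logistic regression.
scaler = StandardScaler().fit(X_tr)
clf = LogisticRegression(max_iter=1000).fit(scaler.transform(X_tr), y_tr)

# AUC on training and validation sets, as reported for each model.
auc_tr = roc_auc_score(y_tr, clf.predict_proba(scaler.transform(X_tr))[:, 1])
auc_va = roc_auc_score(y_va, clf.predict_proba(scaler.transform(X_va))[:, 1])
print(round(auc_tr, 2), round(auc_va, 2))
```

Fitting the scaler on the training set alone avoids leaking validation-set statistics into the model.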
Statistical analysis
Statistical analysis was performed using SPSS version 22.0 (SPSS Inc., Chicago, IL). Quantitative variables were reported as mean ± standard deviation or median (interquartile range), and categorical variables as counts and percentages. Normality was checked using the Kolmogorov–Smirnov test. To compare features between benign and malignant breast lesions, the Student t test or Mann–Whitney U test was used for quantitative variables and the Chi-square test for categorical variables. AUC, accuracy, sensitivity, and specificity were used to describe the classification performance of the different models. The intraclass correlation coefficient (ICC) was calculated for intra- and inter-observer agreement. A P-value < 0.05 was considered statistically significant.
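For reference, the reported metrics follow directly from the confusion matrix of a model's hard predictions; the labels below are hypothetical, with malignant coded as the positive class (an assumption about the authors' convention).

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical validation-set labels and predictions (malignant = 1).
y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 1])
y_pred = np.array([1, 1, 1, 0, 0, 0, 0, 1, 0, 0])

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # true positive rate (malignant lesions caught)
specificity = tn / (tn + fp)   # true negative rate (benign lesions cleared)
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(sensitivity, specificity, accuracy)  # 0.6 0.8 0.7
```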
Results
Clinical results
In this study, 200 patients were retrospectively analyzed, 69 of whom were excluded due to poor image quality (n = 20), lack of complete clinical or pathological data (n = 19), preoperative treatment (n = 20), or concomitant tumors at other sites (n = 10). Finally, 131 patients with breast disease were included in the study. The patients ranged in age from 26 to 76 years, with an average age of 49.11 ± 9.85 years. There were 73 benign breast lesions, including 45 fibroadenomas, 20 cases of adenopathy, and 8 intraductal papillomas, and 58 malignant lesions, including 26 invasive ductal carcinomas, 27 ductal carcinomas in situ, and 5 papillary carcinomas (Table 1).

Fig. 2 The workflow of the radiomics process
Radiomics feature analysis
The logistic regression classifier was used to establish three models: ultrasound, MR, and ultrasound combined with MR. The inter- and intra-class correlation coefficients of the ultrasound-derived features were analyzed, and 1316 robust features were obtained; the F test was then used to screen out the 20 optimal features, including 1 first-order feature, 4 shape features, and 15 texture features. For MRI, ICC analysis retained 1781 features with consistency greater than 0.75; after feature correlation analysis, the 11 best features were selected by F-test, including 5 first-order features, 1 shape feature, and 5 texture features. To model the fusion of ultrasound and enhanced MRI, the features of the two sequences were combined into 3097 joint features, from which the 14 features with the most diagnostic value were selected by F test: 2 first-order features, 1 shape feature, and 11 texture features. The weights of the features used for modeling are shown in Table 2 and Fig. 3.
Comparison of three machine learning models
The AUC, accuracy, sensitivity, and specificity values of the ultrasound model, MR model, and ultrasound combined with MR model are summarized in Table 3, and the ROC curves and waterfall plots of the training and validation sets of each model are shown in Fig. 4. In summary, the ultrasound combined with MR fusion model based on the logistic regression classifier had the best diagnostic efficiency, with AUCs of 0.92 and 0.91, sensitivities of 0.80 and 0.67, specificities of 0.90 and 0.94, and accuracies of 0.84 and 0.79 in the training and validation groups, respectively. It was superior to either the ultrasound model alone or the MR model alone (Fig. 5).
Discussion
Breast cancer is the most common cancer among women in
154 countries and the leading cause of cancer-related death
in 103 countries [11]. According to the American Cancer Society, the average lifetime risk of developing breast cancer for a U.S. woman is estimated to be 12.3% (or 1 in 8 women); in 2023, 300,590 invasive breast cancers (299,540 in women and 2,800 in men) and 55,720 in situ cancers in women were expected to be diagnosed in the U.S., and about 43,700 people were expected to die from breast cancer [12]. Breast cancer is a serious public health problem; early detection and control can reduce its death rate. At
present, the diagnosis and screening of breast diseases are
mainly through mammography, ultrasound and magnetic
resonance examination, and different examination methods
have their own advantages and disadvantages. At the same
time, because the analysis of image data involves the visual
perception and cognitive ability of radiologists, equivocal Breast Imaging-Reporting and Data System (BI-RADS)
Table 1 General clinical and pathological data of breast patients

Clinical factors        Benign cases (n = 73)           Malignant cases (n = 58)
Age (years)             40.02 ± 11.54                   58.20 ± 8.16
Pathological results    Fibroadenoma (n = 45)           Invasive ductal carcinoma (n = 26)
(n = 131)               Adenopathy (n = 20)             Ductal carcinoma in situ (n = 27)
                        Intraductal papilloma (n = 8)   Papillary carcinoma (n = 5)
Table 2 14 radiomics features and their weights in the combined model

Radiomics features                      Coef        Relative weight
gradient_glcm_MCC                       1.1439      0.9092
logarithm_glcm_MCC                      0.5373      0.4270
lbp-3D-m1_gldm_DependenceVariance       0.4559      0.3624
wavelet-HL_firstorder_Range             0.3598      0.2860
wavelet-LH_glcm_MCC                     0.2063      0.1639
gradient_glcm_Imc2                      0.0408      0.0324
lbp-2D_glszm_GrayLevelNonUniformity     − 0.1090    − 0.0866
wavelet-LL_ngtdm_Complexity             − 0.1716    − 0.1364
lbp-3D-k_ngtdm_Coarseness               − 0.1798    − 0.1429
squareroot_ngtdm_Complexity             − 0.2878    − 0.2288
original_shape_LeastAxisLength          − 0.4083    − 0.3245
logarithm_firstorder_Median             − 0.4607    − 0.3661
log-sigma-1-0-mm-3D_glcm_JointEntropy   − 1.1637    − 0.9248
wavelet-LL_glcm_MCC                     − 1.2582    − 1.0000
category 3 and 4a assessments often arise. Therefore, to reach a definitive diagnosis, patients often undergo painful and expensive biopsy procedures. In this setting, artificial intelligence can mine multiple quantitative features from single- or multi-modality medical images, highlight image characteristics invisible to the naked eye, and significantly enhance the discriminative and predictive potential of medical imaging.
In recent years, there have been many studies on non-inva-
sive discrimination of benign and malignant breast lesions
using artificial intelligence technology, including different
machine learning, deep learning and data mining algorithms
related to breast cancer prediction, detection of gene muta-
tions in breast cancer, and application of deep learning to
breast cancer with different imaging modalities. Their discrimination ability and efficiency are often better than those of radiologists [13–15].
Kim et al. [16] conducted an international study based on mammography, and the results showed that artificial intelligence models were superior to radiologists in tumor detection and in the recognition of asymmetries and architectural distortion. Using artificial intelligence technology, Hou Yin et al. [17] built a machine learning model of breast ultrasound images whose diagnostic accuracy was better than that of imaging doctors.
Artificial intelligence also plays an important role in improving the diagnostic performance of breast MRI: when an artificial intelligence breast classification system was applied to multi-parameter MRI examinations, the maximum AUC reached 0.852 and the diagnostic specificity was improved [18]. The study of Deng Qiuxiang et al. showed that ultrasound combined with mammography significantly improved sensitivity (74.07%), specificity (88.61%), and accuracy
(84.9%) [19]. At present, there are few studies on the appli-
cation of multi-modal imaging methods based on radiom-
ics in the diagnosis of breast lesions. Multi-modal imaging
technology adopts more than two methods to make up for the
shortcomings of a single imaging mode and improve diag-
nostic efficiency, which has become an important direction
of the development of imaging diagnosis.
In this study, early enhanced MRI and ultrasound images
were selected for feature extraction. MRI has great value
in breast cancer diagnosis because of its high resolution of
soft tissue and multi-sequence imaging. In view of the fact
that dynamic contrast enhanced MRI (DCE-MRI) can reflect
functional information such as hemodynamics of cancer
Fig. 3 Coefficients of the 14 radiomics features in the combined model
Table 3 Comparison of diagnostic efficiency of different models for breast diseases
US model MR model Combined model
Training set Validation set Training set Validation set Training set Validation set
AUC (95%CI) 0.87 (0.81–0.94) 0.82 (0.63–1.00) 0.93 (0.88–0.98) 0.85 (0.72–0.99) 0.92 (0.87–0.98) 0.91 (0.83–0.99)
Accuracy 0.76 0.70 0.85 0.83 0.84 0.79
Sensitivity 0.73 0.82 0.80 0.77 0.80 0.67
Specificity 0.78 0.56 0.90 0.89 0.90 0.94
Fig. 4 The ROC curves (A–F) and the waterfall plots (a–f) of the ultrasound model, MR model, and combined model in the training and validation sets
foci, most imaging omics studies are carried out based on
this [20, 21]. In the process of selecting the best features, six screening methods were available: F test, Pearson correlation, mutual information, L1-based, tree-based, and recursive elimination. The system selected the F test (ANOVA F-value) based on analysis of variance as the best feature-screening method: features are divided into different populations according to their label categories, and the mean values of the features across populations are tested for significant differences. Features with significant differences, considered discriminative for classification, were retained. Finally, 1316 robust
features were obtained from the ultrasound model, and 20
optimal features were selected by F test, among which tex-
ture features accounted for 70%. The MRI model retained 1781 features; after feature correlation analysis, the F-test selected the 11 best features, among which texture features accounted for 45%. A total of 3097 joint features were obtained for the ultrasound combined with MR model. Then,
14 features with the most diagnostic value were selected
by F test, and most of them were the features screened in
the MRI model. Texture features accounted for 73% of the
feature types. Texture features accounted for the majority in all three models, yet they are barely observable by the naked eye, suggesting that texture features have important clinical significance in the differential diagnosis
Fig. 4 (continued)

Fig. 5 ROC curves of the logistic regression models of the different data modalities (US model, MR model, and combined model)
of benign and malignant breast lesions. Texture features
represent the pixel arrangement and spatial pixel relation-
ship of the lesion image, and represent the texture hetero-
geneity of the growth and development of the lesion [22].
Texture features mainly include gray level co-occurrence
matrix (GLCM), gray level size zone matrix (GLSZM),
gray level run length matrix (GLRLM), neighbouring gray
tone difference matrix (NGTDM) and gray level dependence
matrix (GLDM). GLCM describes texture by counting how often pairs of pixels separated by a given distance take particular gray levels, i.e., the spatial correlation characteristics of the gray levels. GLSZM records the number of connected zones of identical gray level, by zone size, in a two-dimensional region of the image. GLRLM describes texture by recording runs of consecutive identical pixel values along a given direction, namely the gray-level run-length matrix. GLDM counts, for each pixel, the neighboring pixels whose gray-level difference from it is below a certain threshold, that is, the gray-level dependence matrix. NGTDM computes the difference between each pixel's gray value and the average gray value of its neighborhood, that is, the neighborhood gray-tone difference matrix. In this study, GLCM was the most
important texture feature type. Park et al. [23] studied radiomics features as predictors of distant disease-free survival in patients with invasive breast cancer and found that texture features based on GLCM and the gray-level size zone matrix (GLSZM) captured heterogeneous texture information better than histogram-based features. The reason is that GLCM/GLSZM-based features take the interaction between adjacent pixels into account, which is well suited to quantifying tumor texture and heterogeneity. The
results of this study are consistent with those reported in
the literature. In addition, the importance of wavelet features
is also high, which indicates that the overall gray level of
the image has a certain relationship with the differentiation
of benign and malignant breast lesions. Some studies have
reported that wavelet and Gaussian filters decompose the
original image from different directions, which further pre-
sents multi-dimensional spatial heterogeneity and helps to
reveal the tumor heterogeneity not detected in the original
image [24, 25]. Wavelet features can reflect more informa-
tion about tumor heterogeneity [26–29]. However, whether
texture features and wavelet features can be used as specific
imaging biomarkers to predict breast disease heterogeneity
still needs to be verified by further multi-center and large-
sample clinical trials.
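As a concrete illustration of the GLCM construction described above, the following toy example counts horizontal co-occurrences on a tiny two-level image. It is a generic single-offset sketch, not pyradiomics' implementation (which symmetrizes the matrix and aggregates over multiple angles).

```python
import numpy as np

def glcm(img, levels, offset=(0, 1)):
    """Count co-occurrences of gray levels at a fixed pixel offset."""
    m = np.zeros((levels, levels), dtype=int)
    dr, dc = offset
    rows, cols = img.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[img[r, c], img[r2, c2]] += 1
    return m

# 2 x 3 image with 2 gray levels; horizontal neighbors only.
img = np.array([[0, 0, 1],
                [1, 1, 0]])
print(glcm(img, levels=2))
# Entry (i, j) counts how often level j appears directly right of level i.
```

Texture statistics such as joint entropy or MCC (Table 2) are then computed from the normalized version of this matrix.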
In this study, the ultrasound model, MR model, and joint model were established based on the logistic regression classifier. Logistic regression is a classification model represented by a conditional probability distribution in the form of a parameterized logistic distribution: the random variable X takes real values, the random variable Y takes the value 1 or 0, and the model parameters are estimated by supervised learning. In their review, Yassin et al. [30] outlined the traditional machine learning algorithms used for breast cancer diagnosis in recent years, including decision trees (DT), random forests (RF), support vector machines (SVM), naive Bayes (NB), K-nearest neighbors (KNN), linear discriminant analysis (LDA), and logistic regression (LR). Some studies have shown that the logistic regression classifier has better diagnostic performance, and this study is consistent
with literature reports. In this study, the training set of these
three radiomics models showed high diagnostic efficiency,
with the minimum AUC of 0.87 and the maximum AUC of
0.93. The AUC values in the validation set were only slightly lower than those in the training set, indicating that the models have good generalization ability and that a model trained on the training group still performs well on the validation group.
The logistic regression-based ultrasound combined with MRI model showed the best diagnostic performance, with an AUC of 0.91 in the validation group, better than the single ultrasound model (AUC 0.82) or the single MRI model (AUC 0.85). These results indicate that radiomics-based multimodal imaging outperforms any single imaging modality in the diagnosis of breast diseases. Ultrasound and X-ray mammography are the preferred imaging methods for early screening and diagnosis of breast diseases. Owing to its real-time imaging, non-invasiveness, speed and simplicity, high cost-effectiveness, high patient compliance, and wide accessibility, breast ultrasound has become the most commonly used tool for breast tumor screening and diagnosis and can detect changes in organ tissue and blood flow. However, the limited resolution of ultrasound makes it difficult to distinguish small lesions accurately, and two-dimensional gray-scale ultrasound cannot display changes in tissue attenuation, which can lead to missed diagnoses and misdiagnoses.
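The modeling pipeline described above (ANOVA F-test feature screening, a logistic regression classifier, and AUC evaluation on a 7:3 train/validation split) can be sketched with scikit-learn. The data here are synthetic stand-ins for the study's radiomics features; the feature counts and random seed are illustrative assumptions only:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic cohort: 131 "patients", 100 candidate radiomics features
X, y = make_classification(n_samples=131, n_features=100, n_informative=10,
                           random_state=42)

# 7:3 train/validation split, stratified by benign/malignant label
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, stratify=y,
                                          random_state=42)

model = make_pipeline(StandardScaler(),
                      SelectKBest(f_classif, k=20),   # ANOVA F-test: keep 20 best features
                      LogisticRegression(max_iter=1000))
model.fit(X_tr, y_tr)

# Evaluate on the held-out validation set by ROC AUC
auc = roc_auc_score(y_va, model.predict_proba(X_va)[:, 1])
print(f"validation AUC = {auc:.2f}")
```

Fitting the scaler and the F-test selector inside the pipeline ensures they are learned on the training split only, which avoids leaking validation data into feature screening.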
MRI uses low-energy radio waves and strong magnetic fields to obtain detailed images of the internal structure of the breast and can detect malignant breast lesions that are missed by clinical examination, mammography, and ultrasound. MRI has high accuracy in the diagnosis of breast diseases, and MRI features accounted for a large proportion of the 11 texture features of the fusion model in this study. However, breast MRI still has certain limitations: it is insensitive to small calcifications, it requires a highly uniform magnetic field, and the examination is time-consuming and expensive. MRI also has high sensitivity but low specificity, which can produce false-positive diagnoses and overestimate the extent of a lesion. Therefore, combining MRI with ultrasound is more conducive to the diagnosis of breast diseases, and
the integration of radiomics further improves the accuracy
of lesion diagnosis. At present, multimodal radiomics studies remain scarce. Zhang et al. [31] used T1WI, T2WI, DWI, and DCE-MRI to establish a multiparametric MRI-based radiomics nomogram to predict lymphovascular invasion and clinical prognosis in patients with invasive ductal carcinoma of the breast. Mu [32] found that the AUC of ultrasound in the diagnosis of benign and malignant BI-RADS 4 nodules was 0.923 (95% CI 0.904–0.942) and that of mammography was 0.904 (95% CI 0.883–0.926), whereas the AUC of ultrasound combined with mammography in the diagnosis of breast BI-RADS 4 tumors was 0.950 (95% CI 0.936–0.965), higher than that of either modality alone.
Sui et al. [33] studied 70 patients with early breast cancer who underwent conventional ultrasound (routine group) and a combination of light-scattering imaging, elastography, contrast-enhanced ultrasound, molybdenum-target mammography, and magnetic resonance imaging (combined group), each compared against postoperative pathological results. They found that the combined application of multimodal imaging technology significantly improved the diagnostic accuracy of early breast diseases and merits wider adoption. However, few studies have examined the diagnostic value of a radiomics-based ultrasound combined with MRI model for breast diseases. The combined application of several methods allows them to complement one another, which is of great significance for improving diagnostic accuracy and constitutes the innovation of this study.
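One simple way to realize the complementarity described above is feature-level fusion: concatenating the ultrasound and MRI feature blocks before fitting a single classifier. The sketch below uses entirely synthetic features (the block sizes 20 and 11 merely echo the feature counts reported in this study) to compare single-modality models with a fused model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 131
y = (rng.random(n) < 0.44).astype(int)          # roughly 58/131 "malignant"
signal = y[:, None].astype(float)

# Each modality carries a noisy copy of the label signal
us = signal * 0.8 + rng.normal(size=(n, 20))    # synthetic "ultrasound" features
mr = signal * 0.8 + rng.normal(size=(n, 11))    # synthetic "MRI" features

def auc_of(X):
    """Fit a logistic regression on a 7:3 stratified split; return validation AUC."""
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3,
                                              stratify=y, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    return roc_auc_score(y_va, clf.predict_proba(X_va)[:, 1])

# Feature-level fusion: concatenate the two modality blocks column-wise
for name, X in [("US", us), ("MR", mr), ("US+MR", np.hstack([us, mr]))]:
    print(name, round(auc_of(X), 2))
```

Because each modality contributes independent noisy views of the same underlying label, the concatenated model typically recovers a stronger decision boundary than either block alone, mirroring the pattern of the fused model in this study.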
There are still some shortcomings in this study. First, it is a retrospective study with a small sample size and an unbalanced distribution of patient data, which may bias the results. Second, only contrast-enhanced T1-weighted images were selected for the MRI analysis, and other sequences were not included, so the advantages of mpMRI may not have been fully exploited. Finally, this is a single-center study, and multi-center studies on larger datasets are needed to improve the reliability of the results.
In summary, with the deepening integration of artificial intelligence and medical technology, machine learning has demonstrated important value in deep image processing and analysis. At the same time, combining the advantages of multiple imaging methods and fully exploiting the potential of AI-based multimodal imaging can further improve the value of imaging in the diagnosis of breast lesions.
Acknowledgements Zhiqun Wang and Lei Liang contributed toward the article, and Dr. Chen provided professional writing services or materials. The authors obtained permission from all those mentioned in the Acknowledgements section.
Author contributions DB and NZ wrote the main manuscript text. XL
and XL prepared figures. YL and JW analysed the data. ZW and LL
revised the paper. All authors reviewed the manuscript. All authors read
and approved the final manuscript.
Funding Not applicable.
Data availability The datasets used and analyzed during the current
study are available from the corresponding author on reasonable
request.
Declarations
Conflict of interest The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Ethical approval This study was approved by the ethics committee of the Aerospace Center Hospital.
Consent for publication Written informed consent was obtained from
the patient for publication of this study and any accompanying images.
Open Access This article is licensed under a Creative Commons Attri-
bution 4.0 International License, which permits use, sharing, adapta-
tion, distribution and reproduction in any medium or format, as long
as you give appropriate credit to the original author(s) and the source,
provide a link to the Creative Commons licence, and indicate if changes
were made. The images or other third party material in this article are
included in the article's Creative Commons licence, unless indicated
otherwise in a credit line to the material. If material is not included in
the article's Creative Commons licence and your intended use is not
permitted by statutory regulation or exceeds the permitted use, you will
need to obtain permission directly from the copyright holder. To view a
copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
References
1. Mahmood T, Li J, Pei Y, Akhtar F, Imran A, Rehman KU. A brief survey on breast cancer diagnostic with deep learning schemes using multi-image modalities. IEEE Access. 2020;8:165779–809.
2. Giaquinto AN, Miller KD, Tossas KY, et al. Cancer statistics for African American/Black people 2022. CA Cancer J Clin. 2022;72(3):202–29.
3. Miller KD, Ortiz AP, Pinheiro PS, et al. Cancer statistics for the US Hispanic/Latino population, 2021. CA Cancer J Clin. 2021;71(6):466–87.
4. Islami F, Goding Sauer A, Miller KD, et al. Proportion and number of cancer cases and deaths attributable to potentially modifiable risk factors in the United States. CA Cancer J Clin. 2018;68(1):31–54.
5. Munoz D, Near AM, Van Ravesteyn NT, et al. Effects of screening and systemic adjuvant therapy on ER-specific US breast cancer mortality. J Natl Cancer Inst. 2014;106(11):dju289.
6. Tong CWS, Wu M, Cho WCS, et al. Recent advances in the treatment of breast cancer. Front Oncol. 2018;8:227.
7. Talo M. Automated classification of histopathology images using transfer learning. Artif Intell Med. 2019;101:101743.
8. George K, Faziludeen S, Sankaran P. Breast cancer detection from biopsy images using nucleus guided transfer learning and belief based fusion. Comput Biol Med. 2020;124:103954.
9. Lambin P, Rios-Velazquez E, Leijenaar R, et al. Radiomics: extracting more information from medical images using advanced feature analysis. Eur J Cancer. 2012;48(4):441–6.
10. Valdora F, Houssami N, Rossi F, et al. Rapid review: radiomics and breast cancer. Breast Cancer Res Treat. 2018;169(2):217–29.
11. Bray F, Ferlay J, Soerjomataram I, et al. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J Clin. 2018;68(6):394–424.
12. Siegel RL, Miller KD, Wagle NS, et al. Cancer statistics, 2023. CA Cancer J Clin. 2023;73(1):17–48.
13. Fatima N, Liu L, Hong S, et al. Prediction of breast cancer, comparative review of machine learning techniques, and their analysis. IEEE Access. 2020;8:150360–76.
14. Wisesty UN, Mengko TR, Purwarianti A. Gene mutation detection for breast cancer disease: a review. IOP Conf Ser Mater Sci Eng. 2020;830(3):032051.
15. Pang T, Wong JHD, Ng WL, et al. Deep learning radiomics in breast cancer with different modalities: overview and future. Expert Syst Appl. 2020;158:113501.
16. Kim HE, Kim HH, Han BK, et al. Changes in cancer detection and false-positive recall in mammography using artificial intelligence: a retrospective, multireader study. Lancet Digit Health. 2020;2(3):e138–48.
17. Yin H, Panpan Z, Qingling Z. Construction and application of nomogram prediction model of breast mass based on quantitative characteristics of ultrasound images. J Clin Ultrasound Med. 2022;24(5):332–7.
18. Dalmis MU, Gubern-Mérida A, Vreemann S, et al. Artificial intelligence-based classification of breast lesions imaged with a multiparametric breast MRI protocol with ultrafast DCE-MRI, T2, and DWI. Invest Radiol. 2019;54(6):325–32.
19. Deng Qiuxiang Wu, Renrong SS, Xiaoliang Ge, Yuanwu P. Value of mammography combined with ultrasound for breast cancer detection. Chin J Mod Med. 2011;21(17):2074–6.
20. Wang Y, Wen SB, Zhou HY, et al. Application progress of MRI radiomics in predicting the prognosis of breast cancer. Chin J Magn Reson Imag. 2023;14(9):136–40.
21. Jiang YL, Edwards AV, Newstead GM. Artificial intelligence applied to breast MRI for improved diagnosis. Radiology. 2021;298(1):38–46.
22. Damascelli A, Gallivanone F, Cristel G, et al. Advanced imaging analysis in prostate MRI: building a radiomic signature to predict tumor aggressiveness. Diagnostics. 2021;11(4):594.
23. Aerts HJWL, Velazquez ER, Leijenaar RTH, et al. Decoding tumour phenotype by noninvasive imaging using a quantitative radiomics approach. Nat Commun. 2014;5(1):4006.
24. Gu D, Xie Y, Wei J, et al. MRI-based radiomics signature: a potential biomarker for identifying glypican 3-positive hepatocellular carcinoma. J Magn Reson Imaging. 2020;52(6):1679–87.
25. Jing R, Wang J, Li J, et al. A wavelet features derived radiomics nomogram for prediction of malignant and benign early-stage lung nodules. Sci Rep. 2021;11(1):22330.
26. Zheng Y, Zhou D, Liu H, et al. CT-based radiomics analysis of different machine learning models for differentiating benign and malignant parotid tumors. Eur Radiol. 2022;32(10):6953–64.
27. Xia W, Hu B, Li H, et al. Multiparametric-MRI-based radiomics model for differentiating primary central nervous system lymphoma from glioblastoma: development and cross-vendor validation. J Magn Reson Imaging. 2021;53(1):242–50.
28. Bai H, Xia W, Ji X, et al. Multiparametric magnetic resonance imaging-based peritumoral radiomics for preoperative prediction of the presence of extracapsular extension with prostate cancer. J Magn Reson Imaging. 2021;54(4):1222–30.
29. Park H, Lim Y, Ko ES, et al. Radiomics signature on magnetic resonance imaging: association with disease-free survival in patients with invasive breast cancer. Clin Cancer Res. 2018;24(19):4705–14.
30. Yassin NIR, Omran S, El Houby EMF, et al. Machine learning techniques for breast cancer computer aided diagnosis using different image modalities: a systematic review. Comput Methods Progr Biomed. 2018;156:25–45.
31. Zhang JJ, Wang GH, Ren JL, et al. Multiparametric MRI-based radiomics nomogram for preoperative prediction of lymphovascular invasion and clinical outcomes in patients with breast invasive ductal carcinoma. Eur Radiol. 2022;32(6):4079–89.
32. Xiaojing M. Value of nomogram model based on ultrasound and X-ray breast features in breast BI-RADS 4 lesions [dissertation]. Shanxi: Shanxi Medical University; 2023.
33. Guangping S, Dongmei L, Chunyue W, Shujuan G. Application analysis of multimodal imaging techniques in the diagnosis of early breast cancer. Imaging Res Med Appl. 2019;3(7):99–100.
Publisher's Note Springer Nature remains neutral with regard to
jurisdictional claims in published maps and institutional affiliations.