Chaos, Solitons and Fractals 155 (2022) 111736
Belief entropy-of-entropy and its application in the cardiac interbeat interval time series analysis

Huizi Cui a, Lingge Zhou a, Yan Li a, Bingyi Kang a,b,c,∗

a College of Information Engineering, Northwest A&F University, Yangling, Shaanxi 712100, China
b Key Laboratory of Agricultural Internet of Things, Ministry of Agriculture and Rural Affairs, Yangling, Shaanxi 712100, China
c Shaanxi Key Laboratory of Agricultural Information Perception and Intelligent Service, Yangling, Shaanxi 712100, China

∗ Corresponding author at: College of Information Engineering, Northwest A&F University, Yangling, Shaanxi 712100, China. E-mail addresses: bingyi.kang@nwsuaf.edu.cn, bingyi.kang@hotmail.com (B. Kang).
Article info

Article history:
Received 1 August 2021
Revised 15 December 2021
Accepted 16 December 2021

Keywords:
D-S evidence theory
Belief entropy
Deng entropy
Time series
Complexity

Abstract
How to measure the complexity of physiological signals in a biological system is an open problem. Various entropy algorithms have been presented, but most of them fail to account for the complexity of time series with high accuracy. In this paper, the concept of Belief Entropy-of-Entropy (BEoE) is introduced; it extends entropy of entropy (EoE) to the belief structure and computes a quadratic belief entropy to characterize the complexity of biological systems based on multiple time scales. The influence of inherent complex fluctuation, length bound, correlation of time windows, etc. is considered in the BEoE analysis. Application and discussion demonstrate that BEoE has better accuracy and applicability than many existing entropy algorithms.

© 2021 Elsevier Ltd. All rights reserved.
1. Introduction
Biological systems contain various kinds of dynamic information, and "complexity" [1–3] is usually used to describe the changeable time series in a system (e.g. biological signals); it plays an important role in reflecting the ability to handle large amounts of information so that the system can adapt and function normally in a changing environment. A "healthy" system needs to coordinate the overall situation and make corresponding adjustments to the constantly changing environment; its complex behavior differs from both a very regular and a highly random one, so a "healthy" biological signal is constantly changing and unpredictable. Note that the medical term "healthy" in this paper means that there are no obvious abnormalities and the system is in a good physiological condition most of the time. For example, a "healthy" system means that all the parts have satisfactory physiological functions, ignoring unobvious abnormalities that occur with a very small probability.
How to quantify the degree of complexity of physiological signals in a system is important. Most traditional entropies seem inappropriate. For example, Shannon entropy [4] measures the uncertainty of average information according to all specific events with their related probabilities, but it ignores the relationships between distinct events in a time series. Entropies specifically designed to evaluate the complexity of time series were then proposed, such as Kolmogorov–Sinai entropy [5,6], Approximate entropy [7], Sample entropy [8], Fuzzy entropy [9], Conditional entropy [10] and so on.
Kolmogorov–Sinai entropy [5,6], known as the mean rate of information creation or information dimension, takes into account the correlation between distinct events, so it is a useful parameter for characterizing the dynamics of a biological system. It reckons that the state of the system at a certain instant is greatly influenced by its previous states, and only the entropy values of limited order can be calculated. Once the number of states becomes large relative to the length of the given time series, its entropy will be underestimated and decay to 0. Therefore, in the real world, a time series of finite length cannot be estimated appropriately by Kolmogorov–Sinai entropy. In order to analyze short time series more reasonably, Pincus presented the concept of Approximate entropy [7], which has been widely applied in many areas like physiology and medicine, but the error of its final result may be large. Considering the disadvantages of the above entropy algorithms, Sample entropy was then presented [8]; it greatly reduces the dependence on the length and previous states of the time series, and the accuracy is significantly improved. There are many other entropies to measure time series; however, most of them assign a higher entropy value to pathological time series [11] (e.g. the pathological signal of atrial fibrillation (AF)) instead of to time series derived from a "healthy" system which functions normally and has no significant abnormalities. As a result, an increase in entropy value may not always be related to an increase in dynamic complexity. Similarly,
for a random time series, although the generation of alternative data destroys the correlation to a certain extent and reduces the information content, it always has a larger entropy value than the original time series [11].
To deal with the above contradiction, Costa et al. introduced the concept of time scale and proposed Multiscale entropy (MSE) [11,12] to measure cardiac interbeat interval time series from the perspective of complexity. According to MSE, it is complexity rather than irregularity that reflects the characteristics of a biological system through its output physiological signals. For example, a pathological AF time series is considered to be highly irregular, which results in a low MSE value and a high SE value. This reflects that the meanings of complexity and irregularity are different: most traditional measurements, represented by Kolmogorov–Sinai entropy, yield contradictory results that may be related to single-scale analysis, since they measure irregularity without considering the complex fluctuations contained in physiological time series.
The core of MSE is dividing a time series into a list of time windows of the same scale, each of which can be seen as a representative state. As a result, the original time series is converted into a coarse-grained sequence of representative states; the complexity is then ascertained according to the SampEn value, and the irregularity of the physiological time series is reduced by the procedure of averaging over the time scale.
Since multi-scale analysis can make the results of complexity analysis more reasonable and accurate, many kinds of entropy algorithms based on it have been presented, e.g. entropy of entropy (EoE). EoE, presented by Hsu et al., combines the core of MSE and the idea of super-information [13], describing a biological system from the point of view of the variation of information that is masked in physiological time series based on multiple time scales. Computing the quadratic entropy based on a certain time scale, the final EoE value reflects the degree of complexity of the time series. In [14], it gives $EoE_{AF} = EoE_{CHF} = 0.41$ and $EoE_{Healthy} = 1.40$, which means the complexity of the AF subject and the CHF subject is the same; this is obviously counterintuitive. Therefore, it can be seen that the EoE method can pick out "healthy" subjects, but it may fail to distinguish different pathological subjects correctly. Note that "healthy subjects" and "NSR subjects" [15] in this paper are both from the MIT-BIH Normal Sinus Rhythm Database; only the names are different.
Inspired by EoE, we propose Belief Entropy-of-Entropy (BEoE) to measure the complexity of time series in the framework of D-S evidence theory [16,17]. As an extension of probability theory, D-S evidence theory can better deal with unknown and imprecise information based on the basic probability assignment (BPA); it has greater flexibility and effectiveness in modeling and processing uncertain information. How to measure the degree of uncertainty of information under this framework is worthy of in-depth study, and various entropy algorithms, represented by Deng entropy [18], have enriched the development of this theory.
Deng entropy has attracted much attention, and various extensions [19,20] have been proposed for different complicated applications, such as fractals and quantum physics [21,22], complex networks [23], medical diagnosis [24], the Pseudo-Pascal Triangle [25] and so on. As a composite measure of both the disorder and the total nonspecificity of the mass function [26], it indicates that the quantity of information in every piece of evidence is related to its own degree of uncertainty. Moreover, Deng entropy takes the effect of the cardinality of focal elements into account and measures the uncertainty of a BPA, which can deal with simultaneous events in the probable field, just like microcosmic particles can exist at the same time and be entangled with each other in the quantum world (e.g. for the state $\{B_1, B_2\}$, Deng entropy can describe the logical relationship between $B_1$ and $B_2$ as AND). As a result, it is a "nonadditive" entropy under certain conditions, as it reckons with the influence of correlations of the states in a system, similar to the concept of Tsallis entropy [27]; for more information about Deng entropy, please refer to Deng [28].
Based on the effects of the time scale, the basic probability assignment, Deng entropy, etc., the proposed BEoE can appropriately measure the inherent complex fluctuation of physiological time series. With the aim of demonstrating the advantages of the BEoE method for complexity analysis, cardiac interbeat interval time series from normal sinus rhythm (NSR) subjects, atrial fibrillation (AF) subjects and congestive heart failure (CHF) subjects are analyzed. BEoE can not only accurately quantify the complexity of different physiological time series based on the BPA and time windows but also investigate the relationship between belief entropy and quadratic belief entropy. The main novelties of BEoE are summarized as below:
1). Based on D-S evidence theory, BEoE combines the BPA with time windows to analyze the characteristics of time series and explains the meaning and application value of the quadratic belief entropy. The hidden "information" and "changes" in the time series can be accurately extracted and then quantified as complexity to evaluate the comprehensive ability of a system.
2). In the first Deng entropy processing, for analyzing the amount of information contained in each time window, slice boundaries and boundary intervals can accommodate all the data better, especially "difficult data" distributed at the boundaries of the slices; thus the reasonable BPA makes the final result more accurate.
3). In the second Deng entropy processing, for analyzing the changes of the time series based on the quadratic belief entropy, a selection method for the level list is presented; belief possible levels and level intervals are used to arrange the entropy values of each time window, which solves the problem of how to determine the level list when the number of entropy values is larger than the number of levels in the list.
4). BEoE can quantify the complexity of different interbeat interval time series more accurately; it not only picks out NSR subjects but also distinguishes different pathological subjects. For the three typical subjects often used for heartbeat fluctuation analysis, it gives a comprehensive complexity ranking: NSR subject > AF subject > CHF subject.
5). The relationship between the quadratic belief entropy (BEoE) and belief entropy is roughly an inverted "U" shape with increasing disorder, and the maximum BEoE appears intermediate between order and extreme disorder.
The rest of the paper is organized as follows. In Section 2, some related background knowledge is reviewed, especially D-S evidence theory and Deng entropy. Section 3 mainly introduces the proposed method; its rationality and effectiveness are illustrated via the application to interbeat interval time series in Section 4. Some discussion about accuracy, important parameters, the relationship between BEoE and belief entropy, etc. is given in Section 5, and the conclusion and future study are given in Section 6.
2. Preliminaries
Before introducing the proposed method, some concepts are briefly reviewed in this section, including D-S evidence theory [16,17] and Deng entropy [18].
2.1. D-S evidence theory
How to measure the degree of uncertainty in order to make a rational decision has attracted a lot of attention, and various kinds of work have been presented, such as D-S evidence theory [16,17], Generalized evidence theory [29,30], D-numbers [31,32], Z-numbers [33–38] and so on.
Among them, D-S evidence theory, proposed by Dempster and developed by Shafer, is an effective tool for information fusion and uncertain information processing [28,39–42]. The study of uncertainty based on D-S evidence theory has received more and more attention; it can be considered a generalization of probability theory, but it satisfies a weaker axiom system than probability theory and has much wider applications in dealing with uncertain information [43–48].
D-S evidence theory has many advantages. First, compared with probability theory, in which probability mass can only be assigned to singleton subsets, it allows belief to be assigned to both singleton and compound sets. Second, it permits one to specify a degree of ignorance instead of being forced to assign probabilities. What's more, it doesn't require a prior distribution before combining information from individual information sources [19]. Due to these advantages, it has been widely used in various aspects such as conflict management [49,50], data fusion [51,52], risk analysis [53–55], multiple attribute decision analysis and making [56–58], expert systems [59,60], pattern classification [61–63] and so on.
Some basic concepts of D-S evidence theory are listed below:
2.1.1. The frame of discernment

Let $U$ be a finite set; in D-S evidence theory, it is also reckoned as the frame of discernment (FOD), defined as

$$U = \{u_1, u_2, u_3, \ldots, u_k\} \tag{1}$$

The power set of $U$, denoted $P(U)$, is composed of the $2^k$ subsets of $U$, as shown in Eq. (2):

$$P(U) = \{\phi, \{u_1\}, \{u_2\}, \ldots, \{u_k\}, \{u_1, u_2\}, \ldots, \{u_1, u_2, \ldots, u_m\}, \ldots, U\} \tag{2}$$

where $\phi$ is the empty set, and $A$ is any proposition of $U$ as long as it meets the condition $A \in P(U)$; such a structure is named a belief structure by Yager.
2.1.2. Basic probability assignment

Here, the FOD is denoted by $U$; any proposition of interest corresponds to a subset of the framework. A mapping $m: 2^U \to [0, 1]$ is a basic probability assignment (BPA) [65], also reckoned as a mass function, defined on $U$, when it meets the conditions [17,64]

$$m(\phi) = 0 \quad \text{and} \quad \sum_{A \subseteq U} m(A) = 1 \tag{3}$$

It delegates the degree to which the evidence supports the proposition $A$.
2.1.3. The belief function and the plausibility function

In DST, given a structure $m$ on $U$, the belief function and the plausibility function corresponding to $m$ are respectively the lower and upper limits of the degree of support for each proposition in a BPA, defined as

$$Bel(A) = \sum_{B \subseteq A} m(B) \tag{4}$$

$$Pl(A) = \sum_{B \cap A \neq \phi} m(B) = 1 - \sum_{B \cap A = \phi} m(B) \tag{5}$$

In Eq. (4), $Bel$ describes the total basic probability mass that is completely assigned to $A$ and its subsets, and in Eq. (5), $Pl$ indicates all the basic probability mass that could possibly be assigned to $A$. Therefore, the belief function and the plausibility function can be considered respectively as the lower and upper limits of the probability of supporting $A$. The belief interval is $BI(A) = [Bel(A), Pl(A)]$, and it indicates the uncertainty.
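To make Eqs. (4) and (5) concrete, here is a minimal Python sketch (our illustration, not code from the paper; the frame and mass values are made up) that computes Bel, Pl and the belief interval for a small BPA:

```python
def bel(mass, a):
    """Belief of A (Eq. 4): total mass committed to subsets of A."""
    return sum(m for b, m in mass.items() if b <= a)

def pl(mass, a):
    """Plausibility of A (Eq. 5): total mass of sets intersecting A."""
    return sum(m for b, m in mass.items() if b & a)

# A made-up BPA on the frame U = {a, b}: masses sum to 1, m(empty) = 0
mass = {frozenset({'a'}): 0.5,
        frozenset({'b'}): 0.2,
        frozenset({'a', 'b'}): 0.3}

a = frozenset({'a'})
print(bel(mass, a), pl(mass, a))   # 0.5 0.8, so BI({a}) = [0.5, 0.8]
```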
2.2. Deng entropy
Deng entropy [18], proposed by Deng, can be reckoned as a generalization of Shannon entropy. It is a very efficient tool for describing the degree of uncertainty, suitable for both probability theory and D-S evidence theory. The specific definition is shown as Eq. (6):

$$E_d = -\sum_{A \subseteq X} m(A) \log_2 \frac{m(A)}{2^{|A|} - 1} \tag{6}$$

where $X$ is the frame of discernment, $A$ is a focal element and $|A|$ is the cardinality of $A$; the belief value of each proposition is divided by the term $(2^{|A|} - 1)$, which describes the potential number of states in $A$ (excluding the empty set).

Deng entropy can measure not only the disorder but also the total nonspecificity of the mass function [26]. When the belief value is only assigned to the single elements of the FOD, so that $|A| = 1$, Deng entropy degenerates to Shannon entropy:

$$E_d = -\sum_{A \subseteq X} m(A) \log_2 \frac{m(A)}{2^{|A|} - 1} = -\sum_{A \subseteq X} m(A) \log_2 m(A) \tag{7}$$

For more information, refer to Deng [18], Abellán [26], Deng [28].
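As a minimal sketch of Eq. (6) (our illustration; the mass functions are made up), note how the $2^{|A|} - 1$ term leaves singleton masses untouched, reproducing the degeneration to Shannon entropy in Eq. (7):

```python
import math

def deng_entropy(mass):
    """Deng entropy (Eq. 6): -sum over focal elements A of
    m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(m * math.log2(m / (2 ** len(a) - 1))
                for a, m in mass.items() if m > 0)

# Mass only on singletons: Deng entropy equals Shannon entropy (Eq. 7)
singletons = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}
print(deng_entropy(singletons))   # 1.0, the Shannon entropy of (0.5, 0.5)

# Mass on a compound focal element is divided by 2^2 - 1 = 3
compound = {frozenset({'a'}): 0.5, frozenset({'b', 'c'}): 0.5}
print(deng_entropy(compound))     # 0.5 + 0.5 * log2(6) ≈ 1.7925
```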
3. The proposed BEoE method
3.1. The flowchart of BEoE
The process of BEoE analysis is mainly divided into two parts, similar to the EoE method [14]: divide the time series into a list of time windows of the same length, then calculate the belief entropy twice. Note that the main focus of the two parts of belief entropy processing is different: the first pass measures the disorder and the second pass quantifies the complexity. The flowchart of BEoE analysis is shown in Fig. 1.
3.2. The steps of BEoE method
BEoE is a two-part method to measure the complexity of a time series. During the first part of BEoE analysis, the time series $\{x_i\} = \{x_1, x_2, \ldots, x_m\}$ of length $m$ is divided into a list of consecutive non-overlapping windows $\{w_j^{(t)}\}$. Every window $w_j^{(t)} = \{x_{(j-1)t+1}, \ldots, x_{(j-1)t+t}\}$ contains $t$ time factors; in other words, the window length $t$ is also the scale factor.
Then, Deng entropy is used to measure the uncertain degree of each window $w_j^{(t)}$. Let $[x_{min}, x_{max}]$ be the range of the interbeat interval; divide it into $s_1$ slices of equal width, each slice determined by two adjacent boundary values $B_s$ and $B_{s+1}$ and representing an independent state of the interval. The BPA based on the time interval is defined as follows:

$$m_j(\{B_s, B_{s+1}\}) = \frac{\text{total number of } x_i \text{ over } w_j^{(t)} \text{ between } B_s \text{ and } B_{s+1}}{t} \tag{8}$$

$B_s$ and $B_{s+1}$ are the boundary values of a slice; the set $\{B_s, B_{s+1}\}$ is used to describe the $s$th slice, and the cardinality of each set is two if the adjacent borders don't coincide. What's more, the range of the parameter $s$ is $[1, s_1]$, and the parameter $j$ is the window index, whose range is $[1, m/t]$. As a result, the belief entropy value $E_j^{(t)}$ of window $w_j^{(t)}$ is

$$E_j^{(t)} = -\sum_{s=1}^{s_1} m_j(\{B_s, B_{s+1}\}) \ln \frac{m_j(\{B_s, B_{s+1}\})}{2^{|\{B_s, B_{s+1}\}|} - 1} = -\sum_{s=1}^{s_1} m_j(\{B_s, B_{s+1}\}) \ln \frac{m_j(\{B_s, B_{s+1}\})}{3} \tag{9}$$
Fig. 1. The specific analysis process of BEoE method.
Note that each belief entropy reflects the amount of information hidden in the corresponding time window during a certain period of time. Repeating the same processing operation for every window, the belief entropy sequence $\{E_j^{(t)}\}$ is obtained.
During the second part of BEoE analysis, Deng entropy is utilized again to measure the "changing" degree of the total time series. Each element of the belief entropy sequence $\{E_j^{(t)}\}$ is assigned to a certain belief possible level or level interval; the number of levels, $s_2(t)$, depends on the time scale. Referring to Hsu et al. [14], in general $s_2(4) = 5$, $s_2(5) = 7$ and $s_2(6) = 11$; more explanation of the relevant parameters is given in the next section.
The BPA based on the belief entropy sequence, for entropies occurring in the level interval $\{l_r, l_{r+1}\}$, is defined as follows:

$$m_f(\{l_r, l_{r+1}\}) = \frac{\text{total number of } E_j^{(t)} \text{ over } \{E_j^{(t)}\} \text{ between } l_r \text{ and } l_{r+1}}{m/t} \tag{10}$$

where $m/t$ is the number of windows and the range of the level index $r$ is $[1, s_2]$. Each level interval consists of two adjacent levels, denoted as the set $\{l_r, l_{r+1}\}$; the cardinality of each set is 2 if the values of the elements in the belief entropy sequence $\{E_j^{(t)}\}$ do not coincide with adjacent levels.
Regarding the selection rule for the level list: BEoE uses belief possible levels and level intervals to arrange the elements in $\{E_j^{(t)}\}$. Firstly, count the frequency of every possible belief entropy value and sort the values in descending order of frequency. Secondly, the first $s_2(t)$ values are selected as the belief possible levels of the level list, and all the elements of $\{E_j^{(t)}\}$ are assigned to a belief possible level or a level interval. BEoE thus solves the problem of determining a reasonable level list when the number of belief entropy values is larger than the number of belief possible levels, a problem which the EoE method ignores. A minimal sketch of this selection rule is given below.
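In the sketch (our illustration; ties in frequency are broken by order of first appearance, which the paper does not specify), the example sequence reuses the CHF belief entropies reported later in Table 3:

```python
from collections import Counter

def select_levels(entropy_seq, s2):
    """Take the s2 most frequent belief entropy values as the belief
    possible levels, then sort them ascending to form the level list."""
    most_frequent = [v for v, _ in Counter(entropy_seq).most_common(s2)]
    return sorted(most_frequent)

# CHF belief entropy sequence (14 windows); only 3 distinct values occur,
# so requesting s2(t=5) = 7 levels yields a 3-level list here.
chf = [1.5990] * 5 + [1.7716] * 2 + [1.5990] * 6 + [1.0986]
print(select_levels(chf, 7))   # [1.0986, 1.5990, 1.7716]
```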
As a result, the quadratic belief entropy value can be obtained; it is the final result of the complexity analysis. The BEoE value of the time series $\{x_i\} = \{x_1, x_2, \ldots, x_m\}$ is calculated as follows:

$$BEoE = -\sum_{r=1}^{s_2} m_f(\{l_r, l_{r+1}\}) \ln \frac{m_f(\{l_r, l_{r+1}\})}{2^{|\{l_r, l_{r+1}\}|} - 1} = -\sum_{r=1}^{s_2} m_f(\{l_r, l_{r+1}\}) \ln \frac{m_f(\{l_r, l_{r+1}\})}{3} \tag{11}$$
When all the data points are distributed on certain boundaries and all the elements of $\{E_j^{(t)}\}$ lie exactly on the levels, the proposed BEoE method degenerates into the EoE method.
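Putting the second pass together, the following sketch of Eqs. (10) and (11) is our illustration (the interval lookup assumes each entropy lies within the span of the level list). Fed with the AF belief entropy sequence and the level list of Section 4.3, it reproduces the value 1.2117 reported in Table 4:

```python
import math

def second_bpa(entropy_seq, levels):
    """Eq. (10): each window entropy is assigned to a belief possible level
    (cardinality 1) or to the interval between two adjacent levels
    (cardinality 2); masses are normalized by the number of windows m/t."""
    n = len(entropy_seq)
    bpa = {}
    for e in entropy_seq:
        if any(math.isclose(e, l) for l in levels):
            key = (min(levels, key=lambda l: abs(l - e)),)   # exactly on a level
        else:
            r = sum(l < e for l in levels) - 1               # interval {l_r, l_r+1}
            key = (levels[r], levels[r + 1])
        bpa[key] = bpa.get(key, 0.0) + 1.0 / n
    return bpa

def beoe(bpa):
    """Eq. (11): quadratic belief entropy, i.e. Deng entropy (natural log)."""
    return -sum(m * math.log(m / (2 ** len(k) - 1))
                for k, m in bpa.items() if m > 0)

# AF belief entropy sequence (Table 3) and the level list of Section 4.3
levels = [1.0986, 1.5990, 1.7716, 2.0489, 2.1535, 2.4308, 2.7081]
af = [2.7081, 2.4308, 2.4883, 2.7081, 2.7081, 2.7081, 2.7081,
      2.7081, 2.4308, 2.2111, 2.4308, 2.4308, 2.7081, 2.7081]
print(round(beoe(second_bpa(af, levels)), 4))   # 1.2117, as in Table 4
```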
To summarize the analysis process of the proposed BEoE method: since belief entropy describes the degree of uncertainty in D-S evidence theory, a time window containing a large amount of information is assigned a high disorder. So after the first part of the BEoE analysis via Eq. (9), the result is a coarse-grained sequence of representative states in terms of Deng entropy, which reflects the disorder, i.e. irregularity. Then, according to the belief entropy sequence and the selection rule for belief possible levels, the new BPA can be determined. Finally, the Belief Entropy-of-Entropy value is calculated via Eq. (11); the quadratic belief entropy measures the complexity of the time series, and the more complex the time series, the higher the BEoE.
4. Application in cardiac interbeat interval time series analysis
4.1. Cardiac interbeat interval time series
In order to exhibit the advantages of the BEoE method for complexity analysis and to compare its performance with other typical entropy algorithms, we apply the BEoE method to cardiac interbeat interval time series.
The sources of the data on PhysioNet [66] are listed below:
(1) BIDMC (Beth Israel Deaconess Medical Center) Congestive Heart Failure Database
(2) MIT (Massachusetts Institute of Technology)-BIH (Beth Israel Hospital) Normal Sinus Rhythm Database
(3) Long Term Atrial Fibrillation (AF) Database
For convenience, in this paper, the three databases are referred to as BCHF, NSR and LTAF, and the three kinds of subjects are referred to as CHF, normal sinus rhythm (NSR) [15] and AF subjects. These databases contain long-term (20–24 h) electrocardiography (ECG) records, and the numbers of subjects in each database are 15, 18 and 83 respectively. For all data sets in this paper, abnormal data (e.g. noise, errors) have been processed accordingly.
The reasons why these three cases are suitable and sufficient to verify the rationality, accuracy and advantages of the proposed method are summarized below:
i. Based on the data description of the original PhysioNet paper [66], the selected representative pathologies CHF and AF, with major public health implications, are reasonable and valuable for complexity analysis.
ii. BEoE is consistent with many other relevant entropy algorithms and the way of data processing is also similar, so the results of the comparative experiments are intuitive and uncontroversial.
iii. The similarity of these three cases is very low and each one has its own characteristics, which can clearly show that the main focus of belief entropy and of Belief Entropy-of-Entropy is different.
iv. Performing complexity analysis only on CHF, AF and NSR subjects, the results can fully demonstrate the advantages of BEoE over many other methods.
4.2. Data processing about the cardiac interbeat interval time series
The ECG records of congestive heart failure (CHF), atrial fibrillation (AF) and normal sinus rhythm (NSR) subjects are analyzed in this paper. Regarding the processing of the data: firstly, 500 data points are taken into account for each interbeat interval time series analysis. Specifically, for each of the 15 ECG records from the BCHF database and the 18 from the NSR database, the first 20% of data points from each 10,000 are extracted, having been subjected to error processing (e.g. detection of abnormal sequences or single short series errors), and all the data points are truncated into five short time series. For the data from the LTAF database, according to its description on PhysioNet [66], the data segments during AF episodes are first extracted from the 84 ECG records. Then, 72 extracted records are selected because their length is more than 500 data points; using the same processing method as for the other databases, the first 500 data points are extracted from every record. As a result, 105 ECG records are selected (15, 18 and 72 sets respectively from the BCHF, NSR and LTAF databases), yielding 237 sets of short time series (75, 90 and 72 sets respectively from the BCHF, NSR and LTAF databases) in total.
What's more, in order to show that the BEoE method is still effective for related short time series, referring to the EoE method [14], we also select 70 data points for testing. On the basis of the previous steps, the first 70 data points are extracted from the short time series of 500 data points in the above 237 groups. As there is no clear definition of series length so far, to facilitate the distinction, we call the time series with 70 data points the related short time series.
In addition, to ensure the correctness and rationality of the experiments, all the tested subjects are between 45 and 65 years old. Before the comparative experiments, the beat annotations were corrected by automated analysis with manual review and correction. The interbeat interval time series (70 data) of three representative subjects are shown in Fig. 2.
4.3. The experiment about the cardiac interbeat interval time series
To measure the complexity of the interbeat interval time series based on the EoE method, the analysis process is as follows. Firstly, divide all the data points into one thousand groups of equal length, and choose the values of the first and the 999th ordered groups as $x_{min}$ and $x_{max}$; the purpose of doing so is to prevent detection errors and noise mixed into the first or the last groups.
Fig. 2. The interbeat interval time series (70 data) of three representative subjects.
The interval boundaries are $x_{min} = 0.3$ and $x_{max} = 1.6$; that is, the range of the heartbeat interval is [0.3, 1.6]. Divide this range into 55 slices of the same width, denoted as $\{p_1, p_2, \ldots, p_{55}\}$, then split the original time series (total length $N = 70$) into $j = N/t = 14$ windows (time scale $t = 5$). All the data points are accommodated by the 55 slices, so the probability assignment can be obtained; the entropy values of every time window calculated by Shannon entropy are shown in Table 1.
Table 1 shows the disorder of the three coarse-grained sequences of 14 time windows based on Shannon entropy. It can be seen that after the first processing part of the EoE method, the information amount of the interbeat interval time series is characterized. The larger the entropy value, the more disordered the time series, which also means the more information the time window contains. Specifically, the ECG curve of the CHF subject is always flat and can even be roughly regarded as a horizontal line; the ECG curve of the NSR subject has more obvious fluctuations than the former, but the range is controllable. However, for the AF subject, the ECG curve often experiences sharp increases or decreases, so it should be given the highest disorder, and the Shannon entropy values of the AF sequence should be larger.
Then measure the complexity of the time series. Select $s_2(t = 5) = 7$ possible levels to hold all the values of Shannon entropy, so the set of levels is $\{l_i\} = \{0, 0.5004, 0.6730, 0.9503, 1.0549, 1.3322, 1.6094\}$, and the number of possible levels coincides with the number of possible entropy values. Thus, the second probability assignment and the EoE values of the three subjects are shown in Table 2.
Table 1
The entropy values of every window based on Shannon entropy.
Subject  Win 1   Win 2   Win 3   Win 4   Win 5   Win 6   Win 7   Win 8   Win 9   Win 10  Win 11  Win 12  Win 13  Win 14
CHF      0.5004  0.5004  0.5004  0.5004  0.5004  0.6730  0.6730  0.5004  0.5004  0.5004  0.5004  0.5004  0.5004  0
AF       1.6094  1.3322  1.6094  1.6094  1.6094  1.6094  1.6094  1.6094  1.3322  1.3322  1.3322  1.3322  1.6094  1.6094
NSR      1.3322  0.9503  1.3322  1.6094  1.6094  1.3322  1.0549  1.0549  0.9503  1.3322  1.6094  1.3322  0.5004  1.6094
Table 2
The final EoE results of the three different subjects.
Subject  Probability assignment                                                        EoE value
CHF      p1(l1) = 1/14, p1(l2) = 11/14, p1(l3) = 2/14                                  0.6560
AF       p2(l6) = 5/14, p2(l7) = 9/14                                                  0.6518
NSR      p3(l2) = 1/14, p3(l4) = 2/14, p3(l5) = 2/14, p3(l6) = 5/14, p3(l7) = 4/14     1.4701
The results in Table 2 represent the quantitative results of complexity via the EoE method; the complexity of the NSR subject is much higher than that of the pathological subjects. However, EoE could not correctly measure the complexity of the CHF and AF subjects. Based on the above-mentioned analysis, the ECG curve of the CHF subject is always flat while that of the AF subject often experiences rapid increases or decreases, so the complexity of the AF subject should be much higher; yet the EoE value of the CHF subject is larger, which is not reasonable.
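As a sanity check on Table 2 (our computation, not from the paper), the CHF value follows directly from Shannon entropy with the natural log applied to its probability assignment:

$$EoE_{CHF} = -\frac{1}{14}\ln\frac{1}{14} - \frac{11}{14}\ln\frac{11}{14} - \frac{2}{14}\ln\frac{2}{14} \approx 0.1885 + 0.1895 + 0.2780 = 0.6560$$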
Now, measure the complexity of different interbeat interval time series based on the proposed method; the specific process is expressed as Algorithm 1.

Algorithm 1: The proposed BEoE method.
Input: Time series $\{x_i\} = \{x_1, x_2, \ldots, x_m\}$
Output: BEoE value of the time series
1. Divide the time series $\{x_i\}$ into $j$ windows $\{w_j^{(t)}\}$;
2. Determine the range of the interbeat interval $[x_{min}, x_{max}]$;
3. Split the interval $[x_{min}, x_{max}]$ into $s_1$ slices of equal width;
4. Count the number of $x_i$ in each window over each time interval $\{B_s, B_{s+1}\}$ and generate the BPA; if a datum lies on a certain boundary, the cardinality of the set $\{B_s, B_{s+1}\}$ is 1, otherwise it is 2;
5. Calculate the uncertain degree of each time window via belief entropy and obtain the belief entropy sequence $\{E_j^{(t)}\}$;
6. Ascertain the finite number $s_2(t)$ of belief possible levels;
7. Count the number of belief entropy values over $\{l_r, l_{r+1}\}$ and generate the BPA again; if a belief entropy value lies on a certain level, the cardinality of the related set $\{l_r, l_{r+1}\}$ is 1, otherwise it is 2;
8. Calculate the Belief Entropy-of-Entropy;
9. Return the BEoE value.

The BEoE method could extract the hidden
"information" and "changes" in the time series based on the BPA and time windows, then quantify them as complexity to evaluate the comprehensive ability of the system. Specifically, the first belief entropy pass estimates the amount of information in every time window (i.e. disorder), and the second pass quantifies the changes of the total time series (i.e. complexity).
Firstly, as in the EoE method, divide the interbeat interval time series into one thousand groups of equal length, and choose the values of the first and the 999th ordered groups as $x_{min}$ and $x_{max}$; the purpose is to prevent detection errors and noise mixed into the first or the last groups. Keep the related parameters $x_{max}$, $x_{min}$, $s_1$ consistent with the EoE method [14]: $x_{min} = 0.3$, $x_{max} = 1.6$, $t = 5$ and $s_1 = 55$; that means the heartbeat interval range [0.3, 1.6] consists of 55 slices of the same width. The boundaries of the interval are $B_1$–$B_{56}$, and the original time series (the length of every series is $N = 70$) is divided into $j = N/t = 14$ windows (the length of every window is $t = 5$). Thus, according to the time series in the left column of Fig. 3, the belief entropy values of every window based on Deng entropy are shown in Table 3; all the belief entropy values form a belief entropy sequence.
Table 3 shows the disorder of the three coarse-grained sequences with 14 time windows based on Deng entropy. It can be found that after the first processing part of the BEoE method, the information amount of each time window is correctly characterized. The larger the belief entropy value, the more disordered the time series, which also means the more information the time window contains. Similar to the above-mentioned analysis, combining the characteristics of the three ECG curves: the CHF subject is always flat and can even be roughly regarded as a horizontal line; the NSR subject has more obvious fluctuations than the former, but the magnitude of the changes at two adjacent recording moments can be estimated. However, for the AF subject, the ECG curve often experiences sharp increases or decreases, so it has the highest disorder, and the belief entropy values of the AF sequence should be larger.
Secondly, measure the complexity of the interbeat interval time series via the BEoE method. There are $s_2(t = 5) = 7$ belief possible levels, denoted $l_1$–$l_7$, to hold the Deng entropies derived at $t = 5$, and the entropy values are converted into BPAs according to the belief possible levels. The selection of the levels is related to the possible Deng entropy values: the seven Deng entropy values which occur most frequently are considered as the belief possible levels, so the belief level list is $\{l_i\} = \{1.0986, 1.5990, 1.7716, 2.0489, 2.1535, 2.4308, 2.7081\}$. As a result, the secondary BPAs of the CHF, AF and NSR subjects and the final BEoE results are shown in Table 4.
According to Eq. (11), the BEoE values of the three cases are 0.6560, 1.2117 and 1.7273 respectively. The proposed method measures the complexity of the interbeat interval time series correctly: it can be found that the BEoE value of the NSR time series is the largest. This is in line with expectation, because the heartbeat fluctuation of the NSR subject is the most difficult to estimate, while its heart system is the best at regulation and control. In contrast, the BEoE values of the pathological groups are relatively low, because their heart systems are poor in regulation and control. What's more, the change range of the CHF time series is often small, while that of AF is relatively large, the curve often showing sharply increasing and decreasing trends.
Fig. 3. The comparative results based on EoE and BEoE analysis. The left column is the result via EoE, the right column via BEoE. Comparing the two pathological time series, the curve of the CHF subject is always flat during each recording, while the AF subject often experiences sharp increases or decreases and usually fluctuates greatly, so it should contain more information, and its complexity is higher than CHF. However, it can be seen that the effect of the EoE analysis is not ideal, $EoE_{CHF} > EoE_{AF}$, but the result via the BEoE method is consistent with expectations: $BEoE_{NSR} > BEoE_{AF} > BEoE_{CHF}$.
Table 3
The belief entropy values of every window based on belief entropy.
Subject  Win 1   Win 2   Win 3   Win 4   Win 5   Win 6   Win 7   Win 8   Win 9   Win 10  Win 11  Win 12  Win 13  Win 14
CHF      1.5990  1.5990  1.5990  1.5990  1.5990  1.7716  1.7716  1.5990  1.5990  1.5990  1.5990  1.5990  1.5990  1.0986
AF       2.7081  2.4308  2.4883  2.7081  2.7081  2.7081  2.7081  2.7081  2.4308  2.2111  2.4308  2.4308  2.7081  2.7081
NSR      2.4308  2.0489  2.4308  2.7081  2.7081  2.4308  2.1535  2.1535  2.0489  2.2111  2.7081  2.4308  1.5990  2.7081
Table 4
The final BEoE results of the three different subjects.
Subject  Belief probability assignment                                                                               BEoE value
CHF      m1({l1}) = 1/14, m1({l2}) = 11/14, m1({l3}) = 2/14                                                          0.6560
AF       m2({l5, l6}) = 1/14, m2({l6}) = 4/14, m2({l6, l7}) = 1/14, m2({l7}) = 8/14                                  1.2117
NSR      m3({l2}) = 1/14, m3({l4}) = 2/14, m3({l5}) = 2/14, m3({l5, l6}) = 1/14, m3({l6}) = 4/14, m3({l7}) = 4/14    1.7273
Fig. 4. The specific analysis process of the first belief entropy.

Therefore, the complexity of the three types of time series should be ranked as follows: NSR subject > AF subject > CHF subject. This is also one of the outstanding advantages of the proposed method: it can distinguish the three kinds of subjects, i.e. CHF, AF and NSR subjects, with high accuracy, which illustrates the efficiency and rationality of the BEoE method.
The comparative results for the interbeat interval time series (70 data) of the three typical subjects based on the EoE and BEoE methods are shown in Fig. 3.
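As a sanity check on Table 4 (our computation, not from the paper), the AF value follows from Eq. (11): masses on singleton levels are divided by $2^1 - 1 = 1$, while masses on level intervals of cardinality 2 are divided by $2^2 - 1 = 3$:

$$BEoE_{AF} = -\frac{1}{14}\ln\frac{1/14}{3} - \frac{4}{14}\ln\frac{4}{14} - \frac{1}{14}\ln\frac{1/14}{3} - \frac{8}{14}\ln\frac{8}{14} \approx 0.2670 + 0.3579 + 0.2670 + 0.3198 = 1.2117$$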
4.4. Analysis of comparison results based on “quadratic” entropy
Fig. 3 demonstrates that BEoE quantifies the complexity of time series more correctly and effectively, because BEoE analyzes the distribution of the data points in the time series more precisely during the first use of Deng entropy; the specific analysis is shown in Fig. 4. After dividing the range of the interbeat interval into slices, BEoE introduces the concept of the BPA and utilizes the boundaries of each slice to accommodate all the data points; it reckons that the data points distributed on the boundaries should be treated differently from the data points distributed between two adjacent boundaries.
In addition, BEoE considers the selection rule for belief possible levels during the second use of belief entropy. If there are more kinds of belief entropy values than the number of belief possible levels, EoE cannot make a choice, while BEoE gives a selection strategy: it utilizes the level intervals to accommodate all the belief entropies, reckoning that a belief entropy lying on a belief possible level should be treated differently from a belief entropy lying between two adjacent belief possible levels. The specific analysis is shown in Fig. 5.
Summarizing the above analysis: the EoE method ignores some key problems when quantifying the complexity of time series based on multiple time scales and Shannon entropy, so it cannot distinguish the complexity of the CHF subject and the AF subject very well and may lead to wrong results. The proposed BEoE method, however, introduces the concept of the BPA: the location of the data within each slice is refined, and the internal and edge data are processed differently. In addition, BEoE gives the selection rule for belief possible levels, and only the most possible levels are used for the complexity analysis, so that the final result is more reasonable.
4.5. Comparison with other related work

In order to demonstrate the outstanding advantages of BEoE, its complexity analysis results are compared with other methods. Among them, MSE, mvSE and mvMSE are either undefined or unreliable for short time series (70 sample points are selected here); a fairly large data set is a necessary condition for these methods to give a reasonable complexity analysis. Although there have been a couple of improved methods, their significance is still undetermined in short-term applications.
Table 5 shows the comparison results of some related methods. The method most worth comparing is EoE, because both methods measure the complexity of time series from the perspective of "quadratic entropy", while EoE cannot correctly distinguish the CHF and AF subjects. AE can reflect both the randomness and the probability distribution of all discrete values in the time series, but its result is unreliable. What's more, MDE and its improved version overcome many limitations of MSE and can measure the complexity of short time series with 300 data points, but when the number is 70, which means the time series is shorter, neither of them gives a correct result; MDE even reckons that the complexity of the CHF subject is higher than that of the NSR subject. The remaining methods have been mentioned before; they are less sensitive to short time series, and no reliable results are available based on 70 data points, since the SampEn used for irregularity in MSE, mvSE and mvMSE requires a large amount of data.
In summary, according to the comparative results, it can be found that for the complexity analysis of related short time series with 70 data points, BEoE shows obvious advantages over many other complexity methods.
5. Some discussion of the proposed BEoE method
5.1. The accuracy of BEoE
At $t = 5$ and $s_1 = 55$, the sensitivity for pathological subjects and the specificity for NSR subjects of the proposed BEoE analysis are evaluated based on Eqs. (12) and (13).

$$\text{Specificity}: \quad E_{spe} = \frac{T_{NSR}}{T_{NSR} + F_{NSR}} \tag{12}$$

$$\text{Sensitivity}: \quad E_{sen} = \frac{T_{PATHOLOGY}}{T_{PATHOLOGY} + F_{PATHOLOGY}} \tag{13}$$
Fig. 5. The specific analysis process of quadratic belief entropy.
Table 5
The complexity analysis based on the different methods.
Methods             C_CHF   C_AF    C_NSR   Order                   Accuracy
BEoE                0.6560  1.2117  1.7273  C_CHF < C_AF < C_NSR    reliable
EoE [14]            0.6560  0.6518  1.4701  C_AF < C_CHF < C_NSR    unreliable
AE [67]             0.4893  1.5104  1.2578  C_CHF < C_NSR < C_AF    unreliable
MDE (m = 3) [68]    3.5814  4.0564  3.5744  C_NSR < C_CHF < C_AF    unreliable
REMDE (m = 2) [69]  1.2736  2.0291  1.5998  C_CHF < C_NSR < C_AF    unreliable
REMDE (m = 3) [69]  1.8747  2.7967  2.1547  C_CHF < C_NSR < C_AF    unreliable
MSE [11]            ×       ×       ×       ×                       undefined
mvSE [70]           ×       ×       ×       ×                       undefined
mvMSE [71]          ×       ×       ×       ×                       undefined
Table 6
The sensitivity and specificity of the BEoE analysis.
                            70 data points  200 data points  500 data points
NSR subjects (specificity)  0.93            0.89             0.91
CHF subjects (sensitivity)  0.85            0.88             0.89
AF subjects (sensitivity)   0.86            0.88             0.90
where $T_{NSR}$ is the number of NSR subjects correctly classified as NSR subjects, $T_{PATHOLOGY}$ is the number of CHF or AF subjects correctly classified as pathological subjects, $F_{NSR}$ is the number of NSR subjects falsely classified as pathological subjects and $F_{PATHOLOGY}$ is the number of pathological subjects falsely classified as NSR subjects.

For the 237 sets of the cardiac interbeat interval time series with 70, 200 and 500 data points, calculate the sensitivity and specificity of the three cases respectively; the results are shown in Table 6.
Table 7
The accuracy of the BEoE analysis.
          70 data points  200 data points  500 data points
Accuracy  0.89            0.88             0.90
Then, regarding the feature of the BEoE method that it can separate NSR subjects from pathological subjects, we use Eq. (14) to test its accuracy, which is shown in Table 7.
$$\text{Accuracy}: \quad E_{acc} = \frac{T_{NSR} + T_{PATHOLOGY}}{T_{NSR} + F_{NSR} + T_{PATHOLOGY} + F_{PATHOLOGY}} \tag{14}$$
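The three measures reduce to simple ratios over the classification counts; the sketch below is our illustration, and the counts in the usage lines are hypothetical, chosen only to mimic the magnitudes of Tables 6 and 7:

```python
def specificity(t_nsr, f_nsr):
    """Eq. (12): fraction of NSR subjects classified as NSR."""
    return t_nsr / (t_nsr + f_nsr)

def sensitivity(t_path, f_path):
    """Eq. (13): fraction of pathological subjects classified as pathological."""
    return t_path / (t_path + f_path)

def accuracy(t_nsr, f_nsr, t_path, f_path):
    """Eq. (14): overall fraction of correctly classified subjects."""
    return (t_nsr + t_path) / (t_nsr + f_nsr + t_path + f_path)

# Hypothetical counts for 90 NSR and 147 pathological series (70 data points)
print(specificity(84, 6))           # 0.93...
print(sensitivity(127, 20))         # 0.86...
print(accuracy(84, 6, 127, 20))     # 0.89...
```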
Fig. 6. The selection of the parameter t.
Fig. 7. The accuracy of the BEoE method.
The longer the time series, the higher the sensitivity and specificity of the BEoE method. However, even when the data size is small, both sensitivity and specificity are not low, and the accuracy is still considerable. As a result, the proposed method can separate the NSR subjects from the CHF and AF subjects even when the time series is short. In other words, BEoE is not sensitive to the length of the time series and has no length limitation, which may be desirable in more applications.
5.2. The explanation of related parameter settings
The accuracy of BEoE is mainly affected by two important parameters: the time scale and the slice number, denoted as $t$ and $s_1$ respectively.
5.2.1. Parameter t
For the 237 series sets of the 15 CHF, 72 AF and 18 NSR subjects, the plot of the average value $\langle BEoE \rangle$ vs. the time scale is displayed in Fig. 6, for 70 and 500 data points respectively. It is clear that $\langle BEoE \rangle$ of both NSR and pathological subjects rises considerably between scales 0 and 10. Moreover, the $\langle BEoE \rangle$ values of NSR subjects always remain obviously higher than those of the pathological subjects at any given time scale, especially for $t \geq 5$.

The separation of the NSR subjects from the pathological subjects occurs when $t \geq 2$, and the difference is particularly obvious at $t = 5$. Thus, when measuring the complexity of time series via the BEoE method, as AF subjects are considered much more complex than CHF subjects, $t = 5$ meets the expectation.
5.2.2. Parameter $s_1$
Fig. 7 is a three-dimensional plot of the accuracy of the proposed method as a function of the parameters $t$ and $s_1$; we analyze the 237 sets of cardiac interbeat interval time series with 500 data points. The accuracy of the various divisions can be evaluated by the color of the three-dimensional image. When the scale range is [4,6] and the slice range is [45,60], the accuracy of the BEoE method is high. In addition, when the time scale is 5 and the slice number is 55, the accuracy reaches the center of the plateau and the corresponding color is deep red; thus the setting of parameter $s_1$ is rational.
5.3. The relationship between BEoE and belief entropy
In order to explore the relationship between the quadratic belief entropy and belief entropy, we conduct a deeper study. Keeping the relative parameters unchanged, set $t = 5$ and $s_1 = 55$. For the 70 and 500 data points in the 237 sets of short time series, draw a graph of the relationship between BEoE and Deng entropy. There are obvious differences among the CHF, AF and NSR subjects; the three kinds of points are in dispersed positions. Regarding position, CHF subjects are at the left-bottom, AF subjects are at the right-bottom and NSR subjects are at the middle-top, so the total distribution tends to be an inverted "U" shape; Fig. 8 describes their relationship.

Fig. 8. The inverted "U" curve (belief entropy vs. BEoE).
The 75 circles, 72 squares and 90 triangles are respectively from the 15 CHF subjects, 72 AF subjects and 18 NSR subjects, and the black arrows indicate the center of each group. Moreover, the fitted curve, like an inverted "U", is the quadratic fit; it shows that the relationship between BEoE (complexity) and belief entropy (disorder) is roughly an inverted "U" shape, with the maximum complexity appearing intermediate between order and extreme disorder. This finding is consistent with many studies about complexity [1,72–74], and we demonstrate it in the framework of D-S evidence theory.
6. Conclusion
Biological systems usually show complex dynamic changes on multiple time scales, and physiological signals containing plenty of information are very valuable for research. In this paper, Belief Entropy-of-Entropy (BEoE) is presented to measure the hidden information and changes in time series and to quantify them as the adjustment ability and adaptability of a system. The larger the BEoE value, the higher the complexity of the time series, and so the stronger the comprehensive ability of the system. The main contributions of BEoE are listed as follows:
1). Belief entropy under the background of D-S evidence theory is an effective tool for measuring uncertain information. Plenty of work has been presented, but there is limited work on the uncertainty of belief entropy itself (i.e. the quadratic belief entropy). In this paper, we have researched its meaning and value.
2). The proposed BEoE method introduces the concepts of the BPA and the time scale for complexity analysis, measuring the hidden "information" in each window and the "changes" of the time series. What's more, it is not sensitive to the length of the time series and has no length limitation, which may be desirable in more applications. (Refer to Section 5.1)
3). The BEoE method analyzes the complexity of time series from the perspective of belief entropy, with Deng entropy selected as the representative. Compared with the EoE method and many other relevant algorithms, BEoE can distinguish different pathological subjects more accurately because it divides the data in more detail and considers the complex temporal fluctuations inherent in physiological systems.
4). BEoE utilizes the boundaries of slices and a fixed number of belief possible levels as the level list to accommodate all the data, so the corresponding BPAs describe the allocation of "difficult points" (i.e. boundary data) more clearly and accurately. (Refer to Section 4.4)
5). Experiments demonstrated that the BEoE method can not only ascertain NSR groups like the EoE method, but also distinguish the typical pathological groups, giving the complexity ranking: NSR subject > AF subject > CHF subject. (Refer to Section 4.3)
6). The relationship between belief entropy and BEoE has been researched: as the belief entropy increases, they display an inverted "U" relationship, and the maximum BEoE appears intermediate between order and extreme disorder. (Refer to Section 5.3)
Correctly quantifying the characteristics of time series from the perspective of complexity may assist medical analysis. Further study will focus on developing the BEoE analysis further; we may apply it to many other physiological signals (e.g. EEG for Alzheimer's disease).
Declaration of Competing Interest
The authors declare that they have no conflict of interest.
CRediT authorship contribution statement
Huizi Cui: Conceptualization, Methodology, Software, Investigation, Formal analysis, Writing – original draft, Writing – review & editing, Visualization, Project administration. Lingge Zhou: Data curation, Writing – original draft, Validation. Yan Li: Validation, Visualization. Bingyi Kang: Conceptualization, Supervision, Resources, Writing – review & editing, Funding acquisition.
Acknowledgments
The work is partially supported by the National Natural Science Foundation of China (Grant no. 61903307), the China Postdoctoral Science Foundation (Grant no. 2020M683575), the Chinese Universities Scientific Fund (Grant no. 2452018066), and the National College Students Innovation and Entrepreneurship Training Program (Grant nos. 202110712143, 202110712146). The authors thank all the reviewers for their valuable comments and constructive suggestions.
References
[1] Huberman B , Hogg T . Complexity and adaptation. Phys D
1986;22(1–3):376–84 .
[2] Gell-Mann M. What is complexity. Complexity 1995;1. doi: 10.1002/bies.10192 .
[3] Mitchell M . Complexity: a guided tour. Oxford Univers ity Press; 2009 .
[4] Shannon CE . A mathematical theory of communication. Bell Syst Tech J
1948;27(3):379–423 .
[5] Sinai Y . Kolmogorov–Sinai entropy. Scholarpedia 2009;4(3):2034 .
[6] Grassberger P , Procaccia I . Estimation of the Kolmogorov entropy from a
chaotic signal. Phys Rev A 1983;28(4):2591–3 .
[7] Pincus SM . Approximate entropy as a measure of system complexity. Proc Natl
Acad Sci 1991;88(6):2297–301 .
[8] Richman JS , Moorman JR . Physiological time-series analysis using ap-
proximate entropy and sample entropy. Am J Physiol-Heart Circ Physiol
20 0 0;278(6):H2039–49 .
[9] Chen W , Zhuang J , Yu W , Wang Z . Measuring complexity using FuzzyEn, ApEn,
and SampEn. Med Eng Phys 2009;31(1):61–8 .
[10] Porta A , Castiglioni P , Bari V , Bassani T , Marchi A , Cividjian A ,
et al. K -nearest-neighbor conditional entropy approach for the assess-
ment of the short-term complexity of cardiovascular control. Physiol Meas
2012;34(1):17 .
[11] Costa M , Goldberger AL , Peng C-K
. Multiscale entropy analysis of complex
physiologic time series. Phys Rev Lett 2002;89:68102 .
[12] Costa M , Goldberger AL , Peng CK . Multiscale entropy analysis of biological sig-
nals. Phys Rev E 2005;71:1–18 .
[13] Bose R , Chouhan S . Alternate measure of information useful for DNA se-
quences. Phys Rev E 2011;83(5):051918 .
[14] Hsu C , We i SY , Huang HP , Hsu L , Chi S , Peng CK . Entropy of entropy: measure-
ment of dynamical complexity for biological systems. Entropy 2017;19(10):550 .
[15] Liu C , Gao R . Multiscale entropy analysis of
the differential RR interval time
series signal and its application in detecting congestive heart failure. Entropy
2017;19(6):251 .
[16] Dempster AP . Upper and lower probabilities induced by a multivalued map-
ping. Ann Math Stat 1967;38(2):325–39 .
[17] Shafer G. A mathematical theory of evidence. Princeton university press;
vol. 46. 1976 .
[18] Deng Y . Deng entropy. Chaos, Solitons Fractals 2016;91:549–53 .
[19] Kang B , Deng Y . The maximum Deng entropy. IEEE Access 2019;7:120758–65 .
[20] Zhang H , Deng Y . Entropy measure for orderable sets. Inf Sci 2021;561:141–51 .
[21] Huang Z , Ya ng L , Jiang
W . Uncertainty measurement with belief entropy on the
interference effect in the quantum-like Bayesian networks. Appl Math Comput
2019;347:417–28 .
[22] Wang M , Liao X , Deng Y , Li Z , Su Y , Zeng Y . Dynamics, synchronization and
circuit implementation of a simple fractional-order chaotic system with hidden
attractors. Chaos, Solitons Fracta ls 2020;130:109406 .
[23] Zhang Q , Li M , Deng Y . Measure the structure similarity of nodes in complex
networks based on relative entropy. Phys A 2018;4 91 :74 9–63 .
[24] Chen L , Diao L , Sang J . A novel weighted evidence combination rule based
on improved entropy function with a diagnosis application. Int J Distrib Sens
Netw 2019;15(1):1–13 .
[25] Gao X , Deng Y . The pseudo-pascal triangle of maximum Deng entropy. Int J
Comput Commun Control 2020;15(1):1006 .
[26] Abellán , Joaquín . Analyzing properties of Deng entropy in the theory of evi-
dence. Chaos Solitons Fractals 2017;95:195–9 .
13
H. Cui, L. Zhou, Y. Li et al. Chaos, Solitons and Fractals 155 (2022) 11173 6
[27] Tsallis C. Nonadditive entropy: the concept and its use. Eur Phys J A 2009;40(3):257–66.
[28] Deng Y. Uncertainty measure in evidence theory. Sci China Inf Sci 2020;63(11):1–19.
[29] Deng Y. Generalized evidence theory. Appl Intell 2015;43(3):530–43.
[30] Fan L, Deng Y. Determine the number of unknown targets in open world based on Elbow method. IEEE Trans Fuzzy Syst 2021;29(5):986–95.
[31] Deng X, Jiang W. D number theory based game-theoretic framework in adversarial decision making under a fuzzy environment. Int J Approx Reason 2019;106:194–213.
[32] Deng X, Jiang W. Evaluating green supply chain management practices under fuzzy environment: a novel method based on D number theory. Int J Fuzzy Syst 2019;21(5):1389–402.
[33] Kang B, Deng Y, Hewage K, Sadiq R. A method of measuring uncertainty for Z-number. IEEE Trans Fuzzy Syst 2019;27(4):731–8.
[34] Tian Y, Mi X, Cui H, Zhang P, Kang B. Using Z-number to measure the reliability of new information fusion method and its application in pattern recognition. Appl Soft Comput 2021;111:107658.
[35] Kang B, Zhang P, Gao Z, Chhipi-Shrestha G, Hewage K, Sadiq R. Environmental assessment under uncertainty using Dempster–Shafer theory and Z-numbers. J Ambient Intell Humaniz Comput 2020;11(5):2041–60.
[36] Liu Q, Tian Y, Kang B. Derive knowledge of Z-number from the perspective of Dempster–Shafer evidence theory. Eng Appl Artif Intell 2019;85:754–64.
[37] Liu Q, Cui H, Tian Y, Kang B. On the negation of discrete Z-numbers. Inf Sci 2020;537:18–29.
[38] Tian Y, Liu L, Mi X, Kang B. ZSLF: a new soft likelihood function based on Z-numbers and its application in expert decision system. IEEE Trans Fuzzy Syst 2020. doi:10.1109/TFUZZ.2020.2997328. Accepted.
[39] Deng Y. Information volume of mass function. Int J Comput Commun Control 2020;15. doi:10.15837/ijccc.2020.6.3983. Accepted.
[40] Gao X, Liu F, Pan L, Deng Y, Tsai S-B. Uncertainty measure based on Tsallis entropy in evidence theory. Int J Intell Syst 2019;34(11):3105–20.
[41] Zhang J, Liu R, Zhang J, Kang B. Extension of Yager's negation of a probability distribution based on Tsallis entropy. Int J Intell Syst 2019;35(1):72–84.
[42] Cui H, Liu Q, Zhang J, Kang B. An improved Deng entropy and its application in pattern recognition. IEEE Access 2019;7:18284–92.
[43] Wu Y, Kang B, Wu H. Strategies of attack-defense game for wireless sensor networks considering the effect of confidence level in fuzzy environment. Eng Appl Artif Intell 2021;102(11):104238.
[44] He Z, Jiang W. An evidential dynamical model to predict the interference effect of categorization on decision making. Knowl Based Syst 2018;150:139–49.
[45] Song Y, Zhu J, Lei L, Wang X. Self-adaptive combination method for temporal evidence based on negotiation strategy. Sci China Inf Sci 2020;63(11):1–3.
[46] Zhou L, Cui H, Huang C, Kang B. Counter deception in belief functions using Shapley value methodology. Int J Fuzzy Syst 2021. doi:10.1007/s40815-021-01139-1. Accepted.
[47] Huang C, Mi X, Kang B. Basic probability assignment to probability distribution function based on the Shapley value approach. Int J Intell Syst 2021. doi:10.1002/int.22456. Accepted.
[48] Mi X, Kang B. On the belief universal gravitation (BUG). Comput Ind Eng 2020;148:106685.
[49] Zadeh LA. A simple view of the Dempster–Shafer theory of evidence and its implication for the rule of combination. AI Mag 1986;7:85–90.
[50] Liu W. Analyzing the degree of conflict among belief functions. Artif Intell 2006;170(11):909–24.
[51] Xiao F. Evidence combination based on prospect theory for multi-sensor data fusion. ISA Trans 2020;106:253–61.
[52] Mi X, Lv T, Tian Y, Kang B. Multi-sensor data fusion based on soft likelihood functions and OWA aggregation and its application in target recognition system. ISA Trans 2020. doi:10.1016/j.isatra.2020.12.009. Accepted.
[53] Su X, Deng Y, Mahadevan S, Bao Q. An improved method for risk evaluation in failure modes and effects analysis of aircraft engine rotor blades. Eng Fail Anal 2012;26:164–74.
[54] Wang Y-M, Elhag TM. A comparison of neural network, evidential reasoning and multiple regression analysis in modelling bridge risks. Expert Syst Appl 2007;32(2):336–48.
[55] Su X, Mahadevan S, Xu P, Deng Y. Dependence assessment in human reliability analysis using evidence theory and AHP. Risk Anal 2015;35(7):1296–316.
[56] Du WS, Hu BQ. Attribute reduction in ordered decision tables via evidence theory. Inf Sci 2016;364–365:91–110.
[57] Xu DL, Yang JB, Wang YM. The evidential reasoning approach for multiple attribute decision analysis using interval belief degrees. Eur J Oper Res 2006;174(3):1914–43.
[58] Fei L, Lu J, Feng Y. An extended best-worst multi-criteria decision-making method by belief functions and its applications in hospital service evaluation. Comput Ind Eng 2020;142:106355.
[59] Boujelben MA, Smet YD, Frikha A, Chabchoub H. Building a binary outranking relation in uncertain, imprecise and multi-experts contexts: the application of evidence theory. Int J Approx Reason 2009;50(8):1259–78.
[60] Boujelben MA, De Smet Y, Frikha A, Chabchoub H. A ranking model in uncertain, imprecise and multi-experts contexts: the application of evidence theory. Int J Approx Reason 2011;52(8):1171–94.
[61] Denoeux T. A k-nearest neighbor classification rule based on Dempster–Shafer theory. IEEE Trans Syst Man Cybern 1995;25(5):804–13.
[62] Liu ZG, Pan Q, Dezert J, Martin A. Adaptive imputation of missing values for incomplete pattern classification. Pattern Recognit 2016;52:85–95.
[63] Xiao F. A distance measure for intuitionistic fuzzy sets and its application to pattern classification problems. IEEE Trans Syst Man Cybern 2019(99):1–13. doi:10.1109/TSMC.2019.2958635.
[64] Dempster AP. Upper and lower probabilities induced by a multivalued mapping. Ann Math Stat 1967;38(2):325–39.
[65] Smets P. Decision making in the TBM: the necessity of the pignistic transformation. Int J Approx Reason 2005;38(2):133–47.
[66] BIDMC Congestive Heart Failure Database, MIT-BIH Normal Sinus Rhythm Database and Long Term AF Database. http://www.physionet.org/physiobank/database/#ecg; accessed 23 December 2020.
[67] Hsu CF, Lin P-Y, Chao H-H, Hsu L, Chi S. Average entropy: measurement of disorder for cardiac RR interval signals. Phys A 2019;529:121533.
[68] Azami H, Fernández A, Escudero J. Multivariate multiscale dispersion entropy of biomedical times series. Entropy 2019;21(9):913.
[69] Azami H, Rostaghi M, Abásolo D, Escudero J. Refined composite multiscale dispersion entropy and its application to biomedical signals. IEEE Trans Biomed Eng 2017;64(12):2872–9.
[70] Ahmed MU, Mandic DP. Multivariate multiscale entropy analysis. IEEE Signal Process Lett 2011;19(2):91–4.
[71] Ahmed MU, Mandic DP. Multivariate multiscale entropy: a tool for complexity analysis of multichannel data. Phys Rev E 2011;84(6):061918.
[72] Peng CK, Costa M, Goldberger AL. Adaptive data analysis of complex fluctuations in physiologic time series. Adv Adapt Data Anal 2009;1:61–70.
[73] Gell-Mann M. What is complexity? Complexity 1995;1(1):16–19.
[74] Silva LEV, Cabella BCT, da Costa Neves UP, Murta Junior LO. Multiscale entropy-based methods for heart rate variability complexity analysis. Phys A 2015;422:143–52.