
A New Belief Entropy to Measure Uncertainty of Basic Probability Assignments Based on Belief Function and Plausibility Function

Lipeng Pan and Yong Deng *

Institute of Fundamental and Frontier Science, University of Electronic Science and Technology of China, Chengdu 610054, China
* Author to whom correspondence should be addressed.
Entropy 2018, 20(11), 842; https://doi.org/10.3390/e20110842
Submission received: 29 September 2018 / Revised: 28 October 2018 / Accepted: 31 October 2018 / Published: 3 November 2018
(This article belongs to the Special Issue Information Theory and 5G Technologies)

Abstract

How to measure the uncertainty of the basic probability assignment (BPA) function is an open issue in Dempster–Shafer (D–S) theory. The main work of this paper is to propose a new belief entropy, which is mainly used to measure the uncertainty of a BPA. The proposed belief entropy is based on Deng entropy and on the probability interval consisting of the lower and upper probabilities. In addition, under certain conditions, it degenerates into Shannon entropy. Numerical examples are used to illustrate the efficiency of the new belief entropy in measuring uncertainty.

1. Introduction

With the sharply growing interest in data fusion, evidence theory, also known as Dempster–Shafer (D–S) theory [1], which was first presented by Dempster [2] and then developed by Shafer [3], has attracted wide attention for its effectiveness in modeling and fusing uncertain information [4]. D–S theory assigns probabilities to the power set of events [5], so it has advantages in dealing with uncertain and unknown problems. In addition, it has wide applications, such as sensor network analysis [6], classification and clustering [7,8,9], decision-making [10,11,12], knowledge reasoning [13,14], risk assessment and evaluation [10,15], and others [9,11,16,17,18].
D–S theory is used to combine belief functions [2,19,20]. However, in D–S theory, how to measure the uncertainty of belief functions remains an open issue [2,5,21,22]. Uncertainty plays a significant role in many fields, since quantifying it is the foundation and prerequisite for studying a question quantitatively [3,23,24,25]. Shannon entropy has largely resolved the measurement of uncertainty in probability theory [26] and is widely used in many application systems [27,28,29]. Inspired by this idea, many researchers have devoted themselves to studying the uncertainty of belief functions [30], and several uncertainty measures for belief functions now exist [31]. These methods can be classified according to additivity [32,33]. Deng entropy [34] and Tsallis entropy [35] do not satisfy additivity and are non-extensive entropies. In contrast, Yager's specificity measure [31], Hartley entropy [36], Korner's specificity definition [37], Höhle's confusion measure [38], the discord measure [39] and the conflict measure [40] satisfy additivity. Generally speaking, these measures reduce to Shannon entropy under certain conditions. However, recent studies have made an important observation: belief function theory is not a successful generalization of probability theory [3,41]. When the basic probability assignment (BPA) function is converted into a probability distribution, information is lost. Hence, it is unreasonable to calculate the uncertainty of belief functions with measures that merely evolve from Shannon entropy, and it is very desirable to define a new way of measuring uncertainty that avoids this loss of information. To this end, several attempts have been made in the field. Deng [34] presented Deng entropy, which simplifies the calculation of the uncertainty of BPAs by considering total non-specificity and discord simultaneously, without the conversion from BPA to probability. Recently, the probability interval in a BPA has attracted wide attention because it is also a key factor for uncertainty. Yang and Han [41] defined a distance-based total uncertainty measure for BPAs based on the probability interval, and Deng et al. [42] improved this measure to avoid the counter-intuitive results it can produce. These measures overcome some shortcomings of traditional measurements; however, they are inconsistent with Shannon entropy when the BPA degenerates into a probability distribution.
In this paper, we analyze the uncertainty of a BPA based on probability intervals, which contain more information than single probabilities. We propose a new belief entropy that combines the probability interval with the idea of Deng entropy and degenerates into Shannon entropy when the BPA is a probability distribution. Thus, the proposed method can effectively measure uncertainty in both BPAs and probability distributions. Since there is no conversion between BPA and probability distribution, it overcomes the limitations of traditional measures, and it is therefore feasible to define an uncertainty measure for a BPA based on the probability interval.
The paper is organized as follows. The basics of D–S evidence theory for BPAs are briefly introduced in Section 2. Section 3 reviews existing uncertainty measures and presents the new belief entropy for BPAs. Some important examples are described in Section 4 in order to illustrate the efficiency of the new belief entropy. Finally, this paper is concluded in Section 5.

2. Preliminaries

In this section, some preliminaries are briefly introduced.

D–S Evidence Theory

Some basic definitions of D–S theory are briefly introduced [2,3]:
A frame of discernment Θ is a set of mutually exclusive [44] and collectively exhaustive hypotheses about a variable θ [43], defined as follows [2,3]:
$\Theta = \{\theta_1, \theta_2, \ldots, \theta_i, \ldots, \theta_N\}.$
The power set of Θ is denoted by $2^{\Theta}$ [45], and
$2^{\Theta} = \{\emptyset, \{\theta_1\}, \ldots, \{\theta_N\}, \{\theta_1, \theta_2\}, \ldots, \{\theta_1, \theta_2, \ldots, \theta_i\}, \ldots, \Theta\},$
where $\emptyset$ is the empty set [46].
A BPA function m is a mapping from $2^{\Theta}$ to the interval $[0, 1]$, formally defined by [2,3]:
$m: 2^{\Theta} \to [0, 1],$
which satisfies the following conditions [47]:
$m(\emptyset) = 0, \qquad \sum_{A \in 2^{\Theta}} m(A) = 1, \qquad 0 \le m(A) \le 1, \ \forall A \in 2^{\Theta}.$
The mass m ( A ) represents how strongly the evidence supports A.
The belief function ($Bel$) is a mapping from $2^{\Theta}$ to $[0, 1]$ that satisfies:
$Bel(A) = \sum_{B \subseteq A} m(B).$
When the BPA is $m(A) = 1$ for $A = \Theta$ and $m(A) = 0$ for $A \neq \Theta$, the corresponding $Bel$ is the simplest one: $Bel(A) = 1$ for $A = \Theta$ and $Bel(A) = 0$ for $A \neq \Theta$. This $Bel$ is called a vacuous belief function, which is suitable for situations without any evidence.
The plausibility function ($Pl$) is a mapping $2^{\Theta} \to [0, 1]$ that satisfies:
$Pl(A) = \sum_{B \cap A \neq \emptyset} m(B) = 1 - Bel(\bar{A}).$
$Pl(A)$ indicates the degree to which the evidence does not refute A.
As can be seen from the above, $\forall A \subseteq \Theta$, $Bel(A) \le Pl(A)$; $Bel(A)$ and $Pl(A)$ are, respectively, the lower and upper probability bounds of A, so $[Bel(A), Pl(A)]$ is the uncertainty interval of A.
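To make these definitions concrete, here is a minimal Python sketch (ours, not from the paper) that represents a BPA as a dictionary from focal elements to masses and computes Bel and Pl; the three-element frame and the masses are made-up examples.

```python
# Minimal sketch: a BPA as a dict from frozenset focal elements to masses (summing to 1).
m = {
    frozenset({"a"}): 0.5,
    frozenset({"a", "b"}): 0.3,
    frozenset({"a", "b", "c"}): 0.2,   # hypothetical frame: {a, b, c}
}

def bel(m, a):
    """Bel(A): total mass committed to subsets of A."""
    return sum(v for b, v in m.items() if b <= a)

def pl(m, a):
    """Pl(A): total mass of focal elements that intersect A."""
    return sum(v for b, v in m.items() if b & a)

A = frozenset({"a", "b"})
print(bel(m, A), pl(m, A))            # 0.8 1.0 -> uncertainty interval [0.8, 1.0]
print(1 - bel(m, frozenset({"c"})))   # Pl(A) = 1 - Bel(complement of A) = 1.0
```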
For the same frame of discernment, different BPAs may come from different evidence sources. Dempster's combination rule can be used to obtain the combined evidence [2,48]:
$m(\emptyset) = 0, \qquad m(A) = \frac{\sum_{B \cap C = A} m_1(B)\, m_2(C)}{1 - K},$
where $K = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C)$ measures the conflict between the two BPAs. It is remarkable that, if $K = 1$, the two BPAs are totally conflicting and Dempster's rule cannot be applied.
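A direct implementation of this rule might look as follows (our sketch); the two input BPAs are hypothetical and only meant to show the renormalization by 1 − K.

```python
from itertools import product

def dempster(m1, m2):
    """Dempster's rule: conjunctive combination, renormalized by the conflict K."""
    combined, k = {}, 0.0
    for (b, v1), (c, v2) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            k += v1 * v2                     # product mass falling on the empty set
    if k >= 1.0:
        raise ValueError("totally conflicting BPAs: Dempster's rule not applicable")
    return {a: v / (1.0 - k) for a, v in combined.items()}

m1 = {frozenset({"a"}): 0.6, frozenset({"a", "b"}): 0.4}
m2 = {frozenset({"b"}): 0.5, frozenset({"a", "b"}): 0.5}
print(dempster(m1, m2))   # conflict K = 0.3; remaining masses divided by 1 - K = 0.7
```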

3. Uncertainty Measures for Belief Structures

3.1. Existing Uncertainty Measures for Belief Structures

There are many methods to handle uncertainty [49]. In 1948, Shannon pointed out that "information is used to eliminate random uncertainty" and proposed the concept of "information entropy" (borrowing the concept of entropy from thermodynamics) to solve the problem of measuring information [50]. The concept of entropy is derived from physics [50,51]; it has long been a measure of uncertainty and disorder [52]. A system with higher uncertainty has greater entropy and contains more information [11].
The Shannon entropy H is defined as [26,53]:
$H = -\sum_{i=1}^{N} p_i \log_b p_i,$
where N is the number of basic states in a system and $p_i$ is the probability that state i appears, satisfying $\sum_{i=1}^{N} p_i = 1$.
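As a quick illustration (ours, not the paper's), Shannon entropy with base-2 logarithms, the base used in the numerical examples later in the paper, can be computed as:

```python
from math import log2

def shannon(p):
    """Shannon entropy (base 2) of a probability distribution."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

print(shannon([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: uniform over 4 states
print(shannon([1.0]))                     # 0.0 bits: a certain outcome
```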
Shannon entropy plays a key role in handling basic probability problems, but it has limitations in the framework of D–S theory [42], where the concept of entropy is still an open issue. Many researchers have extended measure functions based on it, such as the following:
Dubois and Prade. The weighted Hartley entropy of a BPA was given by Dubois and Prade [54]:
$H_{dp}(m) = \sum_{A \in 2^{\Theta}} m(A) \log(|A|).$
Höhle. One of the earliest confusion measures for D–S theory is due to Höhle [38]:
$H_o(m) = -\sum_{A \in 2^{\Theta}} m(A) \log(Bel(A)).$
Yager. The dissonance measure of a BPA was defined by Yager as follows [31]:
$H_y(m) = -\sum_{A \in 2^{\Theta}} m(A) \log Pl(A).$
Klir and Ramer. A discord measure of a BPA was defined by Klir and Ramer as follows [39]:
$H_{kr}(m) = -\sum_{A \subseteq \Theta} m(A) \log \sum_{B \subseteq \Theta} m(B) \frac{|A \cap B|}{|B|}.$
Klir and Parviz. Klir and Parviz defined the strife measure [40]:
$H_{kp}(m) = -\sum_{A \subseteq \Theta} m(A) \log \sum_{B \subseteq \Theta} m(B) \frac{|A \cap B|}{|A|}.$
George and Pal. George and Pal suggested a definition of a conflict measure [55]:
$H_{gp}(m) = \sum_{A \subseteq \Theta} m(A) \sum_{B \subseteq \Theta} m(B) \left( 1 - \frac{|A \cap B|}{|A \cup B|} \right).$
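For illustration, here is a sketch (ours) of two of these measures, the Dubois–Prade weighted Hartley entropy and Yager's dissonance measure, in the same dictionary representation of a BPA used above; the example BPA is hypothetical.

```python
from math import log2

def hartley_dp(m):
    """Dubois-Prade weighted Hartley entropy: sum of m(A) * log2|A| (non-specificity)."""
    return sum(v * log2(len(a)) for a, v in m.items())

def yager_dissonance(m):
    """Yager's dissonance: -sum of m(A) * log2 Pl(A)."""
    pl = lambda a: sum(v for b, v in m.items() if b & a)
    return -sum(v * log2(pl(a)) for a, v in m.items())

m = {frozenset({"a"}): 0.4, frozenset({"b"}): 0.3, frozenset({"a", "b"}): 0.3}
print(hartley_dp(m))        # 0.3: only the two-element focal set contributes
print(yager_dissonance(m))  # ~0.427 > 0: the singletons partially contradict each other
```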
It can clearly be seen that these methods are all based on Shannon entropy. Some surveys give a detailed introduction to these functions [49,56,57], and these entropies have their own basic properties, such as consistency with D–S theory semantics, non-negativity, probability consistency, etc. Later, Deng proposed the concept of Deng entropy [34], a new function for measuring uncertainty, described as follows [34]:
$H_d(m) = -\sum_{A \in 2^{\Theta}} m(A) \log \frac{m(A)}{2^{|A|} - 1},$
where |A| is the cardinality of A. As shown above, Deng entropy is very similar in form to Shannon entropy, but it uses the term $2^{|A|} - 1$ to handle multi-element focal sets, which gives it an advantage over Shannon entropy. In addition, the boundary and additivity requirements are extended.
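A minimal sketch of Deng entropy in the same representation (our illustration, assuming base-2 logarithms as in the paper's numerical examples):

```python
from math import log2

def deng_entropy(m):
    """Deng entropy: -sum of m(A) * log2[ m(A) / (2^|A| - 1) ]."""
    return -sum(v * log2(v / (2 ** len(a) - 1)) for a, v in m.items() if v > 0)

# Vacuous BPA on a 4-element frame: Deng entropy = log2(2^4 - 1) = log2 15.
m = {frozenset({1, 2, 3, 4}): 1.0}
print(deng_entropy(m))   # 3.9068...
```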

3.2. The New Belief Entropy

In D–S theory, the probability interval $[Bel(A), Pl(A)]$ provides more information than the basic probability assigned to each focal element alone. In this paper, we use the probability interval to derive a new method of measuring uncertainty, as follows:
$H_{bel}(m) = -\sum_{A \in 2^{\Theta}} \frac{Bel(A) + Pl(A)}{2} \log \frac{Bel(A) + Pl(A)}{2\,(2^{|A|} - 1)}.$
As mentioned above, the lower and upper bounds of this probability interval are $Bel$ and $Pl$, respectively [58,59]. Compared with a single probability distribution, the interval has the advantage of capturing both discord and non-specificity [60], and the central value of the probability interval can be used to compare uncertainty. Finally, the cardinality of every focal element of a BPA is very important for the measurement of uncertainty. Hence, the new belief entropy, which combines Deng entropy with the interval probability, can better measure the uncertainty of a BPA. Following the literature of Klir and Lewis [32] and Klir [33], the basic properties of the new belief entropy are explored as follows:
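The following sketch (ours) implements the new belief entropy under two assumptions consistent with the numerical examples in Section 4: the sum runs over the focal elements of m, and logarithms are base 2.

```python
from math import log2

def h_bel(m):
    """New belief entropy:
    -sum over focal A of [(Bel(A)+Pl(A))/2] * log2[ (Bel(A)+Pl(A)) / (2 * (2^|A| - 1)) ]."""
    bel = lambda a: sum(v for s, v in m.items() if s <= a)
    pl = lambda a: sum(v for s, v in m.items() if s & a)
    return -sum((bel(a) + pl(a)) / 2
                * log2((bel(a) + pl(a)) / (2 * (2 ** len(a) - 1)))
                for a in m)

# Bayesian BPA: Bel = Pl = m on singletons, so h_bel reduces to Shannon entropy.
m = {frozenset({i}): 0.25 for i in range(4)}
print(h_bel(m))   # 2.0
```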
P1 (consistency with D–S theory semantics): The new entropy is defined directly in terms of BPAs and their Bel and Pl functions, so it satisfies the consistency with D–S theory semantics property.
P2 (non-negativity): We know that $0 < \frac{Bel(x) + Pl(x)}{2} \le 1$ for every focal element x, so $H_{bel}(m) \ge 0$, and $H_{bel}(m) = 0$ holds if and only if $m(\{x\}) = 1$ for a single element x. Thus, the new entropy satisfies the non-negativity property.
P3 (probability consistency): If m is Bayesian, then $m(x) = Bel(x) = Pl(x)$ for all $x \in X$, so $H_{bel}$ reduces to Shannon entropy. Thus, the new entropy satisfies the probability consistency property.
P4 (subadditivity): To check that the new entropy does not satisfy the subadditivity property, we consider the following example:
Let $X \times Y$ be the product space of the sets $X = \{x_1, x_2, x_3\}$ and $Y = \{y_1, y_2\}$. Consider the joint BPA m on $X \times Y$ with masses
$m(\{z_{11}, z_{12}, z_{21}\}) = 0.5, \quad m(\{z_{31}, z_{32}\}) = 0.1, \quad m(\{z_{21}\}) = 0.1, \quad m(X \times Y) = 0.3,$
where $z_{ij} = (x_i, y_j)$. The marginal BPAs of m on X and Y are $m_1$ and $m_2$, respectively:
$m_1(\{x_1, x_2\}) = 0.5, \quad m_1(\{x_3\}) = 0.1, \quad m_1(\{x_2\}) = 0.1, \quad m_1(X) = 0.3,$
$m_2(\{y_1\}) = 0.1, \quad m_2(Y) = 0.9.$
Thus:
$Bel(\{x_1, x_2\}) = 0.6, \quad Pl(\{x_1, x_2\}) = 0.9, \quad Bel(\{x_3\}) = 0.1, \quad Pl(\{x_3\}) = 0.4,$
$Bel(\{x_2\}) = 0.1, \quad Pl(\{x_2\}) = 0.9, \quad Bel(X) = 1, \quad Pl(X) = 1,$
$Bel(\{y_1\}) = 0.1, \quad Pl(\{y_1\}) = 1, \quad Bel(Y) = 1, \quad Pl(Y) = 1,$
$Bel(\{z_{11}, z_{12}, z_{21}\}) = 0.6, \quad Pl(\{z_{11}, z_{12}, z_{21}\}) = 0.9, \quad Bel(\{z_{31}, z_{32}\}) = 0.1, \quad Pl(\{z_{31}, z_{32}\}) = 0.4,$
$Bel(\{z_{21}\}) = 0.1, \quad Pl(\{z_{21}\}) = 0.9, \quad Bel(X \times Y) = 1, \quad Pl(X \times Y) = 1,$
$H_{bel}(m_1) + H_{bel}(m_2) = 7.36669, \qquad H_{bel}(m) = 9.79031.$
Obviously, $H_{bel}(m) > H_{bel}(m_1) + H_{bel}(m_2)$, and the subadditivity property is not satisfied.
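These two values can be checked numerically. The script below (our verification sketch, repeating the h_bel helper from the sketch above and encoding $z_{ij}$ as index pairs) reproduces 7.36669 and 9.79031; the same two values recur in the additivity counterexample below.

```python
from math import log2

def h_bel(m):
    bel = lambda a: sum(v for s, v in m.items() if s <= a)
    pl = lambda a: sum(v for s, v in m.items() if s & a)
    return -sum((bel(a) + pl(a)) / 2
                * log2((bel(a) + pl(a)) / (2 * (2 ** len(a) - 1)))
                for a in m)

cells = [(i, j) for i in (1, 2, 3) for j in (1, 2)]     # z_ij encoded as (i, j)
m = {frozenset([(1, 1), (1, 2), (2, 1)]): 0.5,          # joint BPA on X x Y
     frozenset([(3, 1), (3, 2)]): 0.1,
     frozenset([(2, 1)]): 0.1,
     frozenset(cells): 0.3}
m1 = {frozenset({1, 2}): 0.5, frozenset({3}): 0.1,      # marginal on X
      frozenset({2}): 0.1, frozenset({1, 2, 3}): 0.3}
m2 = {frozenset({1}): 0.1, frozenset({1, 2}): 0.9}      # marginal on Y

print(round(h_bel(m1) + h_bel(m2), 5))   # 7.36669
print(round(h_bel(m), 5))                # 9.79031 > 7.36669: subadditivity fails
```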
P5 (additivity): The new entropy is also non-additive. It is easy to check that, in general, $2^{mn} - 1 \neq (2^m - 1) \times (2^n - 1)$. We can use the following counterexample to prove it in a more direct way:
Using the notation of the previous example, let $X \times Y$ be the product space of the sets $X = \{x_1, x_2, x_3\}$ and $Y = \{y_1, y_2\}$, with the marginal BPAs $m_1$ and $m_2$, respectively:
$m_1(\{x_1, x_2\}) = 0.5, \quad m_1(\{x_3\}) = 0.1, \quad m_1(\{x_2\}) = 0.1, \quad m_1(X) = 0.3,$
$m_2(\{y_1\}) = 0.1, \quad m_2(Y) = 0.9.$
Now, we build the BPA $m = m_1 \times m_2$ on $X \times Y$ (the marginal BPAs of m are $m_1$ and $m_2$, and they are non-interactive). The BPA m has the following masses:
$m(\{z_{11}, z_{12}, z_{21}\}) = 0.5, \quad m(\{z_{31}, z_{32}\}) = 0.1, \quad m(\{z_{21}\}) = 0.1, \quad m(X \times Y) = 0.3,$
where $z_{ij} = (x_i, y_j)$. Thus,
$H_{bel}(m_1) + H_{bel}(m_2) = 7.36669, \qquad H_{bel}(m) = 9.79031.$
Again, $H_{bel}(m) > H_{bel}(m_1) + H_{bel}(m_2)$, and the additivity property is not satisfied by the new belief entropy. Therefore, the new entropy satisfies consistency with D–S theory semantics, non-negativity, and probability consistency, and does not satisfy the additivity and subadditivity properties. The basic properties of several current entropies are given in Table 1.
In addition, BPA reflects more information than probability distribution in D–S theory. There is a classic example as follows:
Assume that 32 students take a course examination and that the teacher has the scores of these students. The teacher is only allowed to answer "Yes" or "No" to any question, and we want to know which student got the highest score. How many questions do we need to ask at most? Assume that this number is t; it is easy to answer the problem by calculating the information volume with information entropy: $t = \log_2 32 = 5$. However, suppose we are now told that two students are tied for first. Is the entropy still 5? How many questions do we need to ask at most to know who the first ones are? In this case, obviously $t \ge 5$.
It can be seen from this example that the uncertainty of a BPA is greater than that of a probability distribution. Thus, the uncertainty measure boundary for probability distributions should be extended.
On the other hand, recent research shows that the application of Tsallis entropy as a non-additive entropy is more and more extensive [61], and additive entropy is a special case of non-additive entropy. As a result, the two requirements above, namely the boundary and additivity, should be improved.

4. Numerical Experiments

In this section, some numerical examples are used to illustrate the application of our approach.

4.1. Example 1

Assume that the frame of discernment is $\Theta = \{A\}$ and we are given a BPA from a sensor as $m(\{A\}) = 1$. We can calculate $Bel$ and $Pl$ by Equations (5) and (6):
$Bel(A) = 1, \qquad Pl(A) = 1.$
Moreover, the classical Shannon entropy and the new belief entropy are calculated as follows:
$H(m) = -1 \times \log 1 = 0, \qquad H_{bel}(m) = -\frac{1 + 1}{2} \log \frac{1 + 1}{2\,(2^1 - 1)} = -1 \times \log 1 = 0.$
From the above, we can conclude that the new belief entropy degenerates into Shannon entropy if the frame of discernment has a single element. Under these circumstances, there is no uncertainty.

4.2. Example 2

Given that the frame of discernment is $\Theta = \{\theta_1, \theta_2, \theta_3, \theta_4\}$, for a mass function $m(\{\theta_1\}) = m(\{\theta_2\}) = m(\{\theta_3\}) = m(\{\theta_4\}) = \frac{1}{4}$, then:
$Bel(\{\theta_1\}) = Bel(\{\theta_2\}) = Bel(\{\theta_3\}) = Bel(\{\theta_4\}) = \frac{1}{4},$
$Pl(\{\theta_1\}) = Pl(\{\theta_2\}) = Pl(\{\theta_3\}) = Pl(\{\theta_4\}) = \frac{1}{4},$
$H(m) = -4 \times \frac{1}{4} \log_2 \frac{1}{4} = 2,$
$H_{bel}(m) = -4 \times \frac{\frac{1}{4} + \frac{1}{4}}{2} \log_2 \frac{\frac{1}{4} + \frac{1}{4}}{2\,(2^1 - 1)} = 2.$
Obviously, the Shannon entropy and the new belief entropy are the same when dealing with a mass function whose focal elements are all singletons. This further demonstrates the feasibility of the new belief entropy.

4.3. Example 3

Given a frame of discernment $\Theta = \{\theta_1, \theta_2, \theta_3, \theta_4\}$, for a mass function $m(\{\theta_1, \theta_2, \theta_3, \theta_4\}) = 1$, then:
$H_{bel}(m) = -\frac{1 + 1}{2} \log_2 \frac{1 + 1}{2\,(2^4 - 1)} = \log_2 15 = 3.90689.$
Compared with Example 2, the uncertainty of Example 3 is larger. This is because $m(\Theta) = 1$ means that the mass function is totally unknown to the system, whereas the probability distribution of Example 2 contains more information than $m(\Theta) = 1$. Therefore, the result is reasonable.

4.4. Example 4

Given a framework $\Theta = \{\theta_1, \theta_2, \theta_3, \theta_4\}$, for a mass function $m(\{\theta_1\}) = \frac{1}{4}$, $m(\{\theta_2\}) = \frac{1}{3}$, $m(\{\theta_1, \theta_2\}) = \frac{1}{6}$, $m(\{\theta_3\}) = \frac{1}{6}$, $m(\{\theta_4\}) = \frac{1}{12}$, $H_{bel}(m)$ is calculated as follows:
$Bel(\{\theta_1\}) = \frac{1}{4}, \quad Pl(\{\theta_1\}) = \frac{5}{12},$
$Bel(\{\theta_2\}) = \frac{1}{3}, \quad Pl(\{\theta_2\}) = \frac{1}{2},$
$Bel(\{\theta_1, \theta_2\}) = \frac{3}{4}, \quad Pl(\{\theta_1, \theta_2\}) = \frac{3}{4},$
$Bel(\{\theta_3\}) = \frac{1}{6}, \quad Pl(\{\theta_3\}) = \frac{1}{6},$
$Bel(\{\theta_4\}) = \frac{1}{12}, \quad Pl(\{\theta_4\}) = \frac{1}{12},$
$H_{bel}(m) = -\frac{\frac{1}{4} + \frac{5}{12}}{2} \log_2 \frac{\frac{1}{4} + \frac{5}{12}}{2(2^1 - 1)} - \frac{\frac{1}{3} + \frac{1}{2}}{2} \log_2 \frac{\frac{1}{3} + \frac{1}{2}}{2(2^1 - 1)} - \frac{\frac{3}{4} + \frac{3}{4}}{2} \log_2 \frac{\frac{3}{4} + \frac{3}{4}}{2(2^2 - 1)} - \frac{\frac{1}{6} + \frac{1}{6}}{2} \log_2 \frac{\frac{1}{6} + \frac{1}{6}}{2(2^1 - 1)} - \frac{\frac{1}{12} + \frac{1}{12}}{2} \log_2 \frac{\frac{1}{12} + \frac{1}{12}}{2(2^1 - 1)} = 3.28415.$
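As a check (ours), the following snippet reproduces Examples 2–4 with the h_bel sketch from Section 3.2, using exact fractions for the masses:

```python
from fractions import Fraction as F
from math import log2

def h_bel(m):
    bel = lambda a: sum(v for s, v in m.items() if s <= a)
    pl = lambda a: sum(v for s, v in m.items() if s & a)
    return -sum(float((bel(a) + pl(a)) / 2)
                * log2(float(bel(a) + pl(a)) / (2 * (2 ** len(a) - 1)))
                for a in m)

m2 = {frozenset({i}): F(1, 4) for i in range(1, 5)}                # Example 2
m3 = {frozenset({1, 2, 3, 4}): F(1)}                               # Example 3
m4 = {frozenset({1}): F(1, 4), frozenset({2}): F(1, 3),            # Example 4
      frozenset({1, 2}): F(1, 6), frozenset({3}): F(1, 6),
      frozenset({4}): F(1, 12)}
print(h_bel(m2), h_bel(m3), h_bel(m4))
# prints 2.0, 3.9068906..., 3.2841591..., matching Examples 2-4 up to rounding
```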

4.5. Example 5

Given a framework $\Theta = \{\theta_1, \theta_2, \theta_3, \theta_4\}$, for a mass function $m(\{\theta_1\}) = \frac{1}{4}$, $m(\{\theta_2\}) = \frac{1}{3}$, $m(\{\theta_3\}) = \frac{1}{6}$, $m(\{\theta_1, \theta_2, \theta_3\}) = \frac{1}{6}$, $m(\{\theta_4\}) = \frac{1}{12}$, $H_{bel}(m)$ is calculated as follows:
$Bel(\{\theta_1\}) = \frac{1}{4}, \quad Pl(\{\theta_1\}) = \frac{5}{12},$
$Bel(\{\theta_2\}) = \frac{1}{3}, \quad Pl(\{\theta_2\}) = \frac{1}{2},$
$Bel(\{\theta_3\}) = \frac{1}{6}, \quad Pl(\{\theta_3\}) = \frac{1}{3},$
$Bel(\{\theta_1, \theta_2, \theta_3\}) = \frac{11}{12}, \quad Pl(\{\theta_1, \theta_2, \theta_3\}) = \frac{11}{12},$
$Bel(\{\theta_4\}) = \frac{1}{12}, \quad Pl(\{\theta_4\}) = \frac{1}{12},$
$H_{bel}(m) = -\frac{\frac{1}{4} + \frac{5}{12}}{2} \log_2 \frac{\frac{1}{4} + \frac{5}{12}}{2(2^1 - 1)} - \frac{\frac{1}{3} + \frac{1}{2}}{2} \log_2 \frac{\frac{1}{3} + \frac{1}{2}}{2(2^1 - 1)} - \frac{\frac{1}{6} + \frac{1}{3}}{2} \log_2 \frac{\frac{1}{6} + \frac{1}{3}}{2(2^1 - 1)} - \frac{\frac{11}{12} + \frac{11}{12}}{2} \log_2 \frac{\frac{11}{12} + \frac{11}{12}}{2(2^3 - 1)} - \frac{\frac{1}{12} + \frac{1}{12}}{2} \log_2 \frac{\frac{1}{12} + \frac{1}{12}}{2(2^1 - 1)} = 3.31977.$
These are two examples we chose at random. It can be seen that $m(\{\theta_1, \theta_2, \theta_3\}) = \frac{1}{6}$ in Example 5 has one more element than $m(\{\theta_1, \theta_2\}) = \frac{1}{6}$ in Example 4, which causes the entropy of Example 5 to be larger than the entropy of Example 4. This result is reasonable.

4.6. Example 6

Given a frame of discernment $\Theta = \{\theta_1, \theta_2, \ldots, \theta_N\}$, there are three special cases of mass functions, as follows:
$m_1(A) = \frac{2^{|A|} - 1}{\sum_{B \subseteq \Theta} (2^{|B|} - 1)}, \quad \forall A \subseteq \Theta, \ A \neq \emptyset,$
$m_2(\Theta) = 1,$
$m_3(\{\theta_1\}) = m_3(\{\theta_2\}) = \cdots = m_3(\{\theta_N\}) = \frac{1}{N}.$
The new belief entropies of $m_1$, $m_2$, and $m_3$ as functions of N are shown in Figure 1. It can be seen from Figure 1 that, with the increase of N, the mass function $m_1$ has the maximum uncertainty, which grows very fast, while the Bayesian mass function $m_3$ has the minimal uncertainty. By comparison, we know that $m_1$ carries more uncertainty than $m_2$ and $m_3$.
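The curves in Figure 1 can be regenerated with the following sketch (ours); for $m_1$ the normalization enumerates all nonempty subsets, so we keep N small.

```python
from itertools import combinations
from math import log2

def h_bel(m):
    bel = lambda a: sum(v for s, v in m.items() if s <= a)
    pl = lambda a: sum(v for s, v in m.items() if s & a)
    return -sum((bel(a) + pl(a)) / 2
                * log2((bel(a) + pl(a)) / (2 * (2 ** len(a) - 1)))
                for a in m)

def nonempty_subsets(n):
    for k in range(1, n + 1):
        yield from (frozenset(c) for c in combinations(range(1, n + 1), k))

for n in range(2, 9):
    norm = sum(2 ** len(a) - 1 for a in nonempty_subsets(n))
    m1 = {a: (2 ** len(a) - 1) / norm for a in nonempty_subsets(n)}
    m2 = {frozenset(range(1, n + 1)): 1.0}
    m3 = {frozenset({i}): 1.0 / n for i in range(1, n + 1)}
    # m1 grows fastest; m2 gives log2(2^n - 1); m3 (Bayesian) gives log2(n).
    print(n, h_bel(m1), h_bel(m2), h_bel(m3))
```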

4.7. Example 7

Given a frame of discernment with 15 elements, $\Theta = \{1, 2, \ldots, 15\}$, and a variable subset A, the basic mass function is as follows:
$m(\{3, 4, 5\}) = 0.05, \quad m(\{7\}) = 0.05, \quad m(A) = 0.8, \quad m(\Theta) = 0.1.$
Table 2 reflects the trend of the new belief entropy as A changes, which can also be seen in Figure 2. The calculation results show that, as the elements in A continue to increase, the uncertainty of the BPA also increases. It is rational that there is more uncertainty with more elements.
Furthermore, in the experiment, we also used different methods to measure the uncertainty of the BPA, such as Dubois and Prade's weighted Hartley entropy [54], Höhle's confusion measure [38], Yager's dissonance measure [31], Klir and Ramer's discord [39], Klir and Parviz's strife [40], and George and Pal's conflict measure [55]. The experimental results are shown in Figure 3. It is obvious that only the new belief entropy and Dubois and Prade's weighted Hartley entropy increase constantly with the size of A. On the contrary, it can be seen from the inset in Figure 3 that the uncertainties obtained by the other methods decrease or change irregularly as A grows, which is obviously unreasonable. Therefore, the new entropy is effective for measuring the uncertainty of a BPA. Moreover, there are differences between the new belief entropy and Dubois and Prade's weighted Hartley entropy: the latter does not degenerate into Shannon entropy when the mass function defines a probability distribution. Therefore, the new belief entropy, which combines the probability interval and the cardinality of the multi-element focal sets of the BPA, is a reasonable and more flexible measure among the given uncertainty measures.
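Table 2 can be reproduced with the following sketch (ours), again reusing the h_bel helper; the frame is Θ = {1, …, 15} and A grows from {1} to {1, …, 14}:

```python
from math import log2

def h_bel(m):
    bel = lambda a: sum(v for s, v in m.items() if s <= a)
    pl = lambda a: sum(v for s, v in m.items() if s & a)
    return -sum((bel(a) + pl(a)) / 2
                * log2((bel(a) + pl(a)) / (2 * (2 ** len(a) - 1)))
                for a in m)

theta = frozenset(range(1, 16))                  # 15-element frame
for k in range(1, 15):
    a = frozenset(range(1, k + 1))               # A = {1, ..., k}
    m = {frozenset({3, 4, 5}): 0.05, frozenset({7}): 0.05, a: 0.8, theta: 0.1}
    print(k, round(h_bel(m), 4))
# prints 16.1444, 17.4916, 19.8608, ... matching Table 2 up to rounding of the last digit
```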

5. Conclusions

Shannon entropy can effectively measure the uncertainty of a probability distribution. For BPAs, although many methods have been proposed to measure uncertainty, the problem remains an open issue. The main work of this paper is to propose a new belief entropy, based on the probability interval and the cardinality of the multi-element focal sets of a BPA, that requires no conversion from BPA to probability. The new belief entropy can express more uncertainty than other entropies, and the boundary and additivity requirements have been improved. The new belief entropy is a generalization of Shannon entropy and degenerates into Shannon entropy when the BPA is a probability distribution. Moreover, some numerical examples were used to show the efficiency of the proposed new belief entropy.

Author Contributions

Y.D. and L.P. proposed the original idea and designed the research. L.P. wrote the manuscript.

Funding

The work is partially supported by the National Natural Science Foundation of China (Grant Nos. 61573290, 61503237).

Acknowledgments

The authors greatly appreciate the reviewers' suggestions and the editor's encouragement.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Huynh, V.N. Discounting and combination scheme in evidence theory for dealing with conflict in information fusion. In International Conference on Modeling Decisions for Artificial Intelligence; Springer: Berlin/Heidelberg, Germany, 2009; pp. 217–230. [Google Scholar]
  2. Dempster, A.P. Upper and lower probabilities induced by a multivalued mapping. In Classic Works of the Dempster–Shafer Theory of Belief Functions; Springer: Berlin/Heidelberg, Germany, 2008; pp. 57–72. [Google Scholar]
  3. Shafer, G. A Mathematical Theory of Evidence; Princeton University Press: Princeton, NJ, USA, 1976. [Google Scholar]
  4. He, Z.; Jiang, W. A new belief Markov chain model and its application in inventory prediction. Int. J. Prod. Res. 2018, 56, 2800–2817. [Google Scholar] [CrossRef]
  5. Deng, X. Analyzing the monotonicity of belief interval based uncertainty measures in belief function theory. Int. J. Intell. Syst. 2018, 33, 1869–1879. [Google Scholar] [CrossRef]
  6. Deng, X.; Jiang, W.; Wang, Z. Zero-sum polymatrix games with link uncertainty: A Dempster–Shafer theory solution. Appl. Math. Comput. 2019, 340, 101–112. [Google Scholar] [CrossRef]
  7. Denœux, T.; Sriboonchitta, S.; Kanjanatarakul, O. Evidential clustering of large dissimilarity data. Knowl. Based Syst. 2016, 106, 179–195. [Google Scholar] [CrossRef] [Green Version]
  8. Lian, C.; Ruan, S.; Denœux, T.; Li, H.; Vera, P. Spatial evidential clustering with adaptive distance metric for tumor segmentation in FDG-PET images. IEEE Trans. Biomed. Eng. 2018, 65, 21–30. [Google Scholar] [CrossRef] [PubMed]
  9. Deng, X.; Jiang, W. Dependence assessment in human reliability analysis using an evidential network approach extended by belief rules and uncertainty measures. Ann. Nucl. Energy 2018, 117, 183–193. [Google Scholar] [CrossRef]
  10. Han, Y.; Deng, Y. An enhanced fuzzy evidential DEMATEL method with its application to identify critical success factors. Soft Comput. 2018, 22, 5073–5090. [Google Scholar] [CrossRef]
  11. Zheng, H.; Deng, Y. Evaluation method based on fuzzy relations between Dempster–Shafer belief structure. Int. J. Intell. Syst. 2018, 33, 1343–1363. [Google Scholar] [CrossRef]
  12. Huynh, V.N. Recent advances of uncertainty management in knowledge modelling and decision making. Ann. Oper. Res. 2017, 256, 199–202. [Google Scholar] [CrossRef] [Green Version]
  13. Fei, L.; Deng, Y. A new divergence measure for basic probability assignment and its applications in extremely uncertain environments. Int. J. Intell. Syst. 2018. [Google Scholar] [CrossRef]
  14. Denoeux, T. Maximum likelihood estimation from uncertain data in the belief function framework. IEEE Trans. Knowl. Data Eng. 2013, 25, 119–130. [Google Scholar] [CrossRef]
  15. Duan, Y.; Cai, Y.; Wang, Z.; Deng, X. A novel network security risk assessment approach by combining subjective and objective weights under uncertainty. Appl. Sci. 2018, 8, 428. [Google Scholar] [CrossRef]
  16. Jiang, W. A correlation coefficient for belief functions. Int. J. Approx. Reason. 2018, 103, 94–106. [Google Scholar] [CrossRef]
  17. He, Z.; Jiang, W. An evidential dynamical model to predict the interference effect of categorization on decision making results. Knowl. Based Syst. 2018, 150, 139–149. [Google Scholar] [CrossRef]
  18. Chatterjee, K.; Zavadskas, E.K.; Tamošaitienė, J.; Adhikary, K.; Kar, S. A hybrid MCDM technique for risk management in construction projects. Symmetry 2018, 10, 46. [Google Scholar] [CrossRef]
  19. Yin, L.; Deng, X.; Deng, Y. The negation of a basic probability assignment. IEEE Trans. Fuzzy Syst. 2018. [Google Scholar] [CrossRef]
  20. Nguyen, V.D.; Huynh, V.N. Noise-Averse Combination Method. In Proceedings of the 2016 IEEE 28th International Conference on Tools with Artificial Intelligence (ICTAI), San Jose, CA, USA, 6–8 November 2016; pp. 86–90. [Google Scholar]
  21. Yager, R.R. On the fusion of non-independent belief structures. Int. J. Gen. Syst. 2009, 38, 505–531. [Google Scholar] [CrossRef] [Green Version]
  22. Zavadskas, E.K.; Antucheviciene, J.; Hajiagha, R.; Hossein, S.; Hashemi, S.S. The interval-valued intuitionistic fuzzy MULTIMOORA method for group decision making in engineering. Math. Probl. Eng. 2015, 2015, 560690. [Google Scholar] [CrossRef]
  23. Zhang, W.; Deng, Y. Combining conflicting evidence using the DEMATEL method. Soft Comput. 2018, 1–10. [Google Scholar] [CrossRef]
  24. Ferreira, F.A.F.; Meidutė-Kavaliauskienė, I.; Zavadskas, E.K.; Jalali, M.S.; Catarino, S.M. A Judgment-Based Risk Assessment Framework for Consumer Loans. Int. J. Inf. Technol. Decis. Mak. 2018, 1–27. [Google Scholar] [CrossRef]
  25. Pal, J.K.; Ray, S.S.; Cho, S.B.; Pal, S.K. Fuzzy-Rough Entropy Measure and Histogram Based Patient Selection for miRNA Ranking in Cancer. IEEE/ACM Trans. Comput. Biol. Bioinform. 2018, 15, 659–672. [Google Scholar] [CrossRef] [PubMed]
  26. Shannon, C.E. A mathematical theory of communication. ACM SIGMOBILE Mob. Comput. Commun. Rev. 2001, 5, 3–55. [Google Scholar] [CrossRef]
  27. Zavadskas, E.K.; Podvezko, V. Integrated determination of objective criteria weights in MCDM. Int. J. Inf. Technol. Decis. Mak. 2016, 15, 267–283. [Google Scholar] [CrossRef]
  28. Yin, L.; Deng, Y. Toward uncertainty of weighted networks: An entropy-based model. Phys. A Stat. Mech. Its Appl. 2018, 508, 176–186. [Google Scholar] [CrossRef]
  29. Krylovas, A.; Kosareva, N.; Zavadskas, E.K. WEBIRA-comparative analysis of weight balancing method. Int. J. Comput. Commun. Control. 2018, 12, 238–253. [Google Scholar] [CrossRef]
  30. Deng, W.; Deng, Y. Entropic methodology for entanglement measures. Phys. A Stat. Mech. Its Appl. 2018, 512, 693–697. [Google Scholar] [CrossRef]
  31. Yager, R.R. Entropy and specificity in a mathematical theory of evidence. Int. J. Gen. Syst. 1983, 9, 249–260. [Google Scholar] [CrossRef]
  32. Klir, G.J.; Lewis, H.W. Remarks on “Measuring ambiguity in the evidence theory”. IEEE Trans. Syst. Man Cybern. Part A 2008, 38, 995–999. [Google Scholar] [CrossRef]
  33. Klir, G.J. Uncertainty and Information: Foundations of Generalized Information Theory; John Wiley and Sons: Hoboken, NJ, USA, 2005. [Google Scholar]
  34. Deng, Y. Deng entropy. Chaos Solitons Fractals 2016, 91, 549–553. [Google Scholar] [CrossRef]
  35. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487. [Google Scholar] [CrossRef]
  36. Hartley, R.V.L. Transmission of information 1. Bell Syst. Tech. J. 1928, 7, 535–563. [Google Scholar] [CrossRef]
  37. Körner, R.; Näther, W. On the specificity of evidences. Fuzzy Sets Syst. 1995, 71, 183–196. [Google Scholar] [CrossRef]
  38. Höhle, U. Entropy with respect to plausibility measures. In Proceedings of the 12th IEEE International Symposium on Multiple Valued Logic, Paris, France, 25–27 May 1982. [Google Scholar]
  39. Klir, G.J.; Ramer, A. Uncertainty in the Dempster–Shafer theory: A critical re-examination. Int. J. Gen. Syst. 1990, 18, 155–166. [Google Scholar] [CrossRef]
  40. Pal, N.R.; Bezdek, J.C.; Hemasinha, R. Uncertainty measures for evidential reasoning I: A review. Int. J. Approx. Reason. 1992, 7, 165–183. [Google Scholar] [CrossRef]
  41. Yang, Y.; Han, D. A new distance-based total uncertainty measure in the theory of belief functions. Knowl. Based Syst. 2016, 94, 114–123. [Google Scholar] [CrossRef]
  42. Li, Y.; Deng, Y. Generalized Ordered Propositions Fusion Based on Belief Entropy. Int. J. Comput. Commun. Control. 2018, 13, 792–807. [Google Scholar] [CrossRef]
  43. Li, M.; Zhang, Q.; Deng, Y. Evidential identification of influential nodes in network of networks. Chaos Solitons Fractals 2018. [Google Scholar] [CrossRef]
  44. Deng, X.; Jiang, W.; Zhang, J. Zero-sum matrix game with payoffs of Dempster–Shafer belief structures and its applications on sensors. Sensors 2017, 17, 922. [Google Scholar] [CrossRef] [PubMed]
  45. Chen, L.; Deng, Y. A new failure mode and effects analysis model using Dempster–Shafer evidence theory and grey relational projection method. Eng. Appl. Artif. Intell. 2018, 76, 13–20. [Google Scholar] [CrossRef]
  46. Nguyen, V.D.; Huynh, V.N. Two-probabilities focused combination in recommender systems. Int. J. Approx. Reason. 2017, 80, 225–238. [Google Scholar] [CrossRef]
  47. Jiang, W.; Hu, W. An improved soft likelihood function for Dempster–Shafer belief structures. Int. J. Intell. Syst. 2018, 33, 1264–1282. [Google Scholar] [CrossRef]
  48. Chen, L.; Deng, X. A Modified Method for Evaluating Sustainable Transport Solutions Based on AHP and Dempster–Shafer Evidence Theory. Appl. Sci. 2018, 8, 563. [Google Scholar] [CrossRef]
  49. Jiroušek, R.; Shenoy, P.P. A new definition of entropy of belief functions in the Dempster–Shafer theory. Int. J. Approx. Reason. 2018, 92, 49–65. [Google Scholar] [CrossRef] [Green Version]
  50. Kang, B.; Deng, Y.; Hewage, K.; Sadiq, R. Generating Z-number based on OWA weights using maximum entropy. Int. J. Intell. Syst. 2018, 33, 1745–1755. [Google Scholar] [CrossRef]
  51. Yao, K.; Ke, H. Entropy operator for membership function of uncertain set. Appl. Math. Comput. 2014, 242, 898–906. [Google Scholar] [CrossRef]
  52. Harmanec, D.; Klir, G.J. Measuring total uncertainty in Dempster–Shafer theory: A novel approach. Int. J. Gen. Syst. 1994, 22, 405–419. [Google Scholar] [CrossRef]
  53. Lebowitz, J.L. Boltzmann’s entropy and time’s arrow. Phys. Today 1993, 46, 32. [Google Scholar] [CrossRef]
  54. Dubois, D.; Prade, H. A note on measures of specificity for fuzzy sets. Int. J. Gen. Syst. 1985, 10, 279–283. [Google Scholar] [CrossRef]
  55. George, T.; Pal, N.R. Quantification of conflict in Dempster–Shafer framework: A new approach. Int. J. Gen. Syst. 1996, 24, 407–423. [Google Scholar] [CrossRef]
  56. Abellán, J. Analyzing properties of Deng entropy in the theory of evidence. Chaos Solitons Fractals 2017, 95, 195–199. [Google Scholar] [CrossRef]
  57. Xiao, F. A Hybrid Fuzzy Soft Sets Decision Making Method in Medical Diagnosis. IEEE Access 2018, 6, 25300–25312. [Google Scholar]
  58. Han, Y.; Deng, Y. A novel matrix game with payoffs of Maxitive Belief Structure. Int. J. Intell. Syst. 2018. [Google Scholar] [CrossRef]
  59. Xu, H.; Deng, Y. Dependent evidence combination based on shearman coefficient and pearson coefficient. IEEE Access 2018, 6, 11634–11640. [Google Scholar] [CrossRef]
  60. Wang, X.; Song, Y. Uncertainty measure in evidence theory with its applications. Appl. Intell. 2017, 48, 1–17. [Google Scholar] [CrossRef]
  61. Tsallis, C. Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World; Springer Science and Business Media: Berlin/Heidelberg, Germany, 2009. [Google Scholar]
Figure 1. New belief entropy as a function of the size of the frame of discernment for three types of BPA.

Figure 2. New belief entropy as a function of changes of A.

Figure 3. Different measurements of uncertainty with changes of A of the BPA.
Table 1. This table is adapted from the article of Jiroušek and Shenoy [49]; for the sake of comparison, the last row adds the properties of the new entropy.

Definition         Cons. with D–S   Non-neg.   Prob. cons.   Additivity   Subadd.
Höhle              yes              no         yes           yes          no
Smets              yes              no         no            yes          no
Yager              yes              no         yes           yes          no
Nguyen             yes              no         yes           yes          no
Dubois–Prade       yes              no         no            yes          yes
Lamata–Moral       yes              yes        yes           yes          no
Klir–Ramer         yes              yes        yes           yes          no
Klir–Parviz        yes              yes        yes           yes          no
Pal et al.         yes              yes        yes           yes          no
Maeda–Ichihashi    no               no         yes           yes          yes
Harmanec–Klir      no               no         yes           yes          yes
Abellán–Moral      no               no         yes           yes          yes
Jousselme et al.   no               yes        yes           yes          no
Pouly et al.       no               yes        yes           yes          no
Deng               yes              yes        yes           no           no
New entropy        yes              yes        yes           no           no
Table 2. New belief entropy when A changes.

Cases               New Belief Entropy
A = {1}             16.1443
A = {1, 2}          17.4916
A = {1, 2, 3}       19.8608
A = {1, 2, 3, 4}    20.8229
A = {1, 2, ⋯, 5}    21.8314
A = {1, 2, ⋯, 6}    22.7521
A = {1, 2, ⋯, 7}    24.1131
A = {1, 2, ⋯, 8}    25.0685
A = {1, 2, ⋯, 9}    26.0212
A = {1, 2, ⋯, 10}   27.1947
A = {1, 2, ⋯, 11}   27.9232
A = {1, 2, ⋯, 12}   29.1370
A = {1, 2, ⋯, 13}   30.1231
A = {1, 2, ⋯, 14}   31.0732
