A New Total Uncertainty Measure from A Perspective of Maximum Entropy Requirement
Figure 1. Comparison results of Example 1.
Figure 2. Comparison results of Example 2.
Figure 3. Comparison results of Example 5.
Figure 4. Comparison results of Example 7. (a) The proposed method $H^1$; (b) the proposed method $H^2$; (c) the proposed method $H^3$; (d) Deng entropy; (e) AU; (f) weighted Hartley entropy; (g) Yang and Han's measure; (h) SU; (i) JS; (j) AM; (k) Deng's measure ($TU^I_E$); (l) Yager's dissonance entropy.
Figure 5. Probability density functions (PDFs) of different features of samples in the Iris dataset.
Figure 6. Average uncertainty of samples on each feature based on different uncertainty measures.
Abstract
1. Introduction
- We propose a new total uncertainty measure from the perspective of the maximum entropy requirement to quantify the uncertainty of BPAs in DST. In addition, properties of the proposed measure, such as non-negativity, monotonicity, and maximum entropy, are proved.
- We conduct numerical examples to evaluate the effectiveness of the proposed method. The simulation results indicate that the proposed total uncertainty measure degenerates to Shannon entropy when the BPA is a Bayesian mass function, and that the proposed entropy effectively handles the redundant information of focal elements.
2. Preliminaries
2.1. Dempster-Shafer Theory
- Let $m_1$ and $m_2$ be two BPAs; the Dempster combination rule is then defined as [9]: $m(A) = \frac{1}{1-K}\sum_{B \cap C = A} m_1(B) m_2(C)$ for $A \neq \emptyset$, with $m(\emptyset) = 0$, where the conflict coefficient is $K = \sum_{B \cap C = \emptyset} m_1(B) m_2(C)$.
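The rule above can be sketched in a few lines of Python. This is a minimal illustration, assuming BPAs are represented as dictionaries mapping focal elements (frozensets) to masses; the function and variable names are illustrative, not from the paper:

```python
def combine_dempster(m1, m2):
    """Combine two BPAs (dicts: frozenset -> mass) with Dempster's rule."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc  # mass falling on the empty set (the K term)
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    norm = 1.0 - conflict  # normalization factor 1 - K
    return {a: v / norm for a, v in combined.items()}

# hypothetical example over the FOD {a, b}
m1 = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
m2 = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}
m12 = combine_dempster(m1, m2)  # masses 5/7 on {a} and 2/7 on {b}
```

Here the conflict K = 0.3 (the product 0.6 × 0.5 assigned to {a} ∩ {b} = ∅), and the remaining masses are renormalized by 1 − K.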
2.2. Belief and Plausibility Function
2.3. Shannon Entropy
2.4. Some Existing Entropies in DST
- Nguyen’s entropy.
- Nguyen [34] proposed a belief entropy based on the original BPA: $H_N(m) = -\sum_{A \subseteq X} m(A) \log_2 m(A)$.
- Weighted Hartley entropy.
- Dubois and Prade [35] proposed an entropy for the non-specificity measure: $H_{DP}(m) = \sum_{A \subseteq X} m(A) \log_2 |A|$.
- Aggregated uncertainty measure (AU).
- Harmanec and Klir [43] proposed a total uncertainty measure of non-specificity and inconsistency: $AU(Bel) = \max_{P} \left[ -\sum_{x \in X} p(x) \log_2 p(x) \right]$, where the maximum is taken over all probability distributions consistent with $Bel$, i.e., $Bel(A) \le \sum_{x \in A} p(x)$ for all $A \subseteq X$.
- Yager’s entropy.
- Yager [37] proposed a dissonance measure of BPAs based on the plausibility function: $E_Y(m) = -\sum_{A \subseteq X} m(A) \log_2 Pl(A)$.
- Deng entropy.
- Deng [41] proposed a new uncertainty measurement method, namely, "Deng entropy". It is defined as: $E_d(m) = -\sum_{A \subseteq X} m(A) \log_2 \frac{m(A)}{2^{|A|} - 1}$.
- Höhle entropy.
- Höhle [36] proposed a belief entropy based on a belief function, which is defined as: $H_O(m) = -\sum_{A \subseteq X} m(A) \log_2 Bel(A)$.
- Yang and Han's measure.
- Yang and Han [42] defined a total uncertainty measure based on the distance between the belief intervals $[Bel(\theta), Pl(\theta)]$ and the maximally uncertain interval $[0, 1]$.
- Deng's measure ($TU^I_E$).
- In addition, Deng et al. [46] proposed an improved total uncertainty measurement method, $TU^I_E$, based on the belief intervals $[Bel(\theta), Pl(\theta)]$.
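Several of the measures above follow directly from their definitions. As a sketch (illustrative function names, BPAs as dictionaries mapping frozensets to masses), the Deng entropy, the weighted Hartley entropy, and Yager's dissonance entropy can be computed as:

```python
import math

def deng_entropy(m):
    # E_d(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) )
    return -sum(v * math.log2(v / (2**len(a) - 1)) for a, v in m.items() if v > 0)

def weighted_hartley(m):
    # H_DP(m) = sum_A m(A) * log2 |A|  (non-specificity)
    return sum(v * math.log2(len(a)) for a, v in m.items() if v > 0)

def yager_dissonance(m):
    # E_Y(m) = -sum_A m(A) * log2 Pl(A), with Pl(A) = sum_{B: A∩B≠∅} m(B)
    def pl(a):
        return sum(v for b, v in m.items() if a & b)
    return -sum(v * math.log2(pl(a)) for a, v in m.items() if v > 0)
```

For the vacuous BPA m(X) = 1 over a two-element FOD these give log2(3) ≈ 1.585, 1, and 0, respectively, and for a Bayesian BPA (all focal elements singletons) the Deng entropy reduces to Shannon entropy.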
3. Proposed Uncertainty Measure in DST
3.1. The Proposed Method
3.2. Properties of the Proposed Method
4. Numerical Examples
- Case 1.
- According to [45], the maximum entropy is $\log_2 |X|$, where X is a FOD. In this paper, the maximum entropy over S is used. Hence, based on the above analysis, one functional form of the proposed measure can be defined as:
- Case 2.
- According to [26], the maximum Deng entropy is $\log_2 \sum_{A \subseteq X, A \neq \emptyset} (2^{|A|} - 1)$. Theoretically, the maximum uncertainty should correspond to the vacuous BPA $m(X) = 1$. However, the Deng entropy attains its maximum at $m(A) = \frac{2^{|A|} - 1}{\sum_{B \subseteq X} (2^{|B|} - 1)}$, which is inconsistent with our idea. In this paper, we consider that the uncertainty should be a function of both the FOD and the uncertainty degree of the BPA. Therefore, another functional form of the proposed measure can be defined as:
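The maximizer of the Deng entropy mentioned above can be checked numerically. This sketch (illustrative helper names) builds the BPA with $m(A) \propto 2^{|A|} - 1$ over all non-empty subsets of a FOD and confirms that its Deng entropy equals $\log_2 \sum_A (2^{|A|} - 1)$:

```python
import math
from itertools import combinations

def deng_entropy(m):
    # E_d(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) )
    return -sum(v * math.log2(v / (2**len(a) - 1)) for a, v in m.items() if v > 0)

def max_deng_bpa(fod):
    # BPA proportional to 2^|A| - 1 over all non-empty subsets A of the FOD
    subsets = [frozenset(c) for r in range(1, len(fod) + 1)
               for c in combinations(sorted(fod), r)]
    total = sum(2**len(a) - 1 for a in subsets)
    return {a: (2**len(a) - 1) / total for a in subsets}, math.log2(total)

# for |X| = 2: subsets {a}, {b}, {a,b} get weights 1, 1, 3 -> total 5
m, h_max = max_deng_bpa({'a', 'b'})  # h_max = log2 5 ≈ 2.3219
```

Note that this maximum (≈ 2.3219 for a two-element FOD) is reached by a non-vacuous BPA, which is exactly the inconsistency the case analysis above addresses.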
- Case 3.
4.1. Example 1
4.2. Example 2
4.3. Example 3
4.4. Example 4
4.5. Example 5
4.6. Example 6
5. Application
- Step 1 (BPA generation).
- For each feature, we generate the BPA corresponding to each sample in the dataset according to [55].
- Step 2 (Uncertainty measurement of BPAs).
- For each feature, calculate the uncertainty of each BPA using all of the above uncertainty measures.
- Step 3 (Average uncertainty measurement).
- Calculate the average uncertainty value on each feature for each method.
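The three steps can be sketched as follows. This is a simplified illustration using the petal-length (PL) Gaussian statistics reported in the tables below; the singleton-only (Bayesian) BPA built from normalized class likelihoods is an assumption for illustration, not the exact BPA generation method of [55]:

```python
import math

# per-class (mean, std) of the PL feature, from the Iris statistics table
IRIS_PL_STATS = {
    'Setosa': (1.4640, 0.1735),
    'Versicolour': (4.2600, 0.4699),
    'Virginica': (5.5520, 0.5519),
}

def gauss_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def bpa_from_sample(x, stats):
    # Step 1 (simplified): normalized Gaussian likelihoods as a Bayesian BPA
    lik = {frozenset({c}): gauss_pdf(x, mu, sd) for c, (mu, sd) in stats.items()}
    total = sum(lik.values())
    return {a: v / total for a, v in lik.items()}

def uncertainty(m):
    # Step 2: for a Bayesian BPA, the total uncertainty reduces to Shannon entropy
    return -sum(v * math.log2(v) for v in m.values() if v > 0)

# Step 3: average uncertainty over a few hypothetical PL measurements
samples = [1.5, 4.1, 5.6]
avg = sum(uncertainty(bpa_from_sample(x, IRIS_PL_STATS)) for x in samples) / len(samples)
```

With three classes, the per-sample uncertainty is bounded by log2(3); samples lying between class means yield higher values than samples close to one mean.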
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
PT | probability theory |
DST | Dempster-Shafer evidence theory |
FOD | Frame of Discernment |
BPA | basic probability assignment |
BOE | body of evidence |
References
- Denoeux, T. A k-nearest neighbor classification rule based on Dempster-Shafer theory. IEEE Trans. Syst. Man Cybern. 1995, 25, 804–813.
- Xiao, F. Multi-sensor data fusion based on the belief divergence measure of evidences and the belief entropy. Inf. Fusion 2019, 46, 23–32.
- Farag, W. Kalman-filter-based sensor fusion applied to road-objects detection and tracking for autonomous vehicles. Proc. Inst. Mech. Eng. Part I J. Syst. Control Eng. 2021, 23, 1125–1138.
- Liu, X.; Zhou, B.; Huang, P.; Xue, W.; Li, Q.; Zhu, J.; Qiu, L. Kalman Filter-Based Data Fusion of Wi-Fi RTT and PDR for Indoor Localization. IEEE Sens. J. 2021, 21, 8479–8490.
- Xiao, F. CaFtR: A Fuzzy Complex Event Processing Method. Int. J. Fuzzy Syst. 2021.
- Sauta, E.; Demartini, A.; Vitali, F.; Riva, A.; Bellazzi, R. A Bayesian data fusion based approach for learning genome-wide transcriptional regulatory networks. BMC Bioinform. 2020, 21, 1–28.
- Chen, H.; Maduranga, D.A.K.; Mundra, P.; Zheng, J. Bayesian Data Fusion of Gene Expression and Histone Modification Profiles for Inference of Gene Regulatory Network. IEEE/ACM Trans. Comput. Biol. Bioinform. 2020, 17, 516–525.
- Holzinger, A.; Malle, B.; Saranti, A.; Pfeifer, B. Towards multi-modal causability with Graph Neural Networks enabling information fusion for explainable AI. Inf. Fusion 2021, 71, 28–37.
- Dempster, A. Upper and Lower Probabilities Induced by a Multivalued Mapping. Ann. Math. Stat. 1967, 38, 325–339.
- Shafer, G. A Mathematical Theory of Evidence; Princeton University Press: Princeton, NJ, USA, 1976.
- Lin, Y.; Li, Y.; Yin, X.; Dou, Z. Multisensor Fault Diagnosis Modeling Based on the Evidence Theory. IEEE Trans. Reliab. 2018, 67, 513–521.
- Zhang, Y.; Jiang, W.; Deng, X. Fault diagnosis method based on time domain weighted data aggregation and information fusion. Int. J. Distrib. Sens. Netw. 2019, 15.
- Liu, Z.; Xiao, F. An Intuitionistic Evidential Method for Weight Determination in FMEA Based on Belief Entropy. Entropy 2019, 21, 211.
- Ji, X.; Ren, Y.; Tang, H.; Shi, C.; Xiang, J. An intelligent fault diagnosis approach based on Dempster-Shafer theory for hydraulic valves. Measurement 2020, 165, 108129.
- Pisano, R.; Sozzo, S. A Unified Theory of Human Judgements and Decision-Making under Uncertainty. Entropy 2020, 22, 738.
- Zhang, H.; Jiang, W.; Deng, X. Data-driven multi-attribute decision-making by combining probability distributions based on compatibility and entropy. Appl. Intell. 2020, 50, 4081–4093.
- Seiti, H.; Hafezalkotob, A.; Najafi, S.E.; Khalaj, M. A risk-based fuzzy evidential framework for FMEA analysis under uncertainty: An interval-valued DS approach. J. Intell. Fuzzy Syst. 2018, 35, 1419–1430.
- Zhang, W.; Deng, Y. Combining conflicting evidence using the DEMATEL method. Soft Comput. 2018.
- Li, H.; Xiao, F. A method for combining conflicting evidences with improved distance function and Tsallis entropy. Int. J. Intell. Syst. 2020, 35, 1814–1830.
- Liang, H.; Cai, R. A new correlation coefficient of BPA based on generalized information quality. Int. J. Intell. Syst. 2021.
- Ni, S.; Lei, Y.; Tang, Y. Improved Base Belief Function-Based Conflict Data Fusion Approach Considering Belief Entropy in the Evidence Theory. Entropy 2020, 22, 801.
- Smets, P. The Combination of Evidence in the Transferable Belief Model. IEEE Trans. Pattern Anal. Mach. Intell. 1990, 12, 447–458.
- Zhan, J.; Jiang, W. A modified combination rule in generalized evidence theory. Appl. Intell. 2017, 46, 630–640.
- Wang, J.; Qiao, K.; Zhang, Z. An improvement for combination rule in evidence theory. Future Gener. Comput. Syst. 2019, 91, 1–9.
- Matsuyama, T. Belief formation from observation and belief integration using virtual belief space in Dempster-Shafer probability model. In Proceedings of the 1994 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI'94), Las Vegas, NV, USA, 2–5 October 1994; pp. 379–386.
- Deng, Y. Information Volume of Mass Function. arXiv 2020, arXiv:2012.07507.
- Zhou, Q.; Deng, Y. Higher order information volume of mass function. Int. J. Comput. Commun. Control 2020, 15.
- Xiao, F. CEQD: A complex mass function to predict interference effects. IEEE Trans. Cybern. 2021.
- Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability; University of California Press: Berkeley, CA, USA, 1961; pp. 547–561.
- Shannon, C. A mathematical theory of communication. ACM SIGMOBILE Mob. Comput. Commun. Rev. 2001, 5, 3–55.
- Prigogine, I. The End of Certainty; Free Press: New York, NY, USA, 1997; ISBN 9780684837055.
- Parker, M.C.; Jeynes, C. Maximum Entropy (Most Likely) Double Helical and Double Logarithmic Spiral Trajectories in Space-Time. Sci. Rep. 2019, 9, 10779.
- Zhou, M.; Liu, X.; Yang, J.; Chen, Y.; Wu, J. Evidential reasoning approach with multiple kinds of attributes and entropy-based weight assignment. Knowl.-Based Syst. 2019, 163, 358–375.
- Nguyen, H.T. On entropy of random sets and possibility distributions. Anal. Fuzzy Inf. 1987, 1, 145–156.
- Dubois, D.; Prade, H. Properties of measures of information in evidence and possibility theories. Fuzzy Sets Syst. 1987, 24, 161–182.
- Höhle, U. Entropy with respect to plausibility measures. In Proceedings of the 12th International Symposium on Multiple-Valued Logic, Paris, France, 25–27 May 1982.
- Yager, R. Entropy and specificity in a mathematical theory of evidence. Int. J. Gen. Syst. 1983, 9, 249–260.
- Klir, G.J.; Ramer, A. Uncertainty in the Dempster-Shafer theory: A critical re-examination. Int. J. Gen. Syst. 1990, 18, 155–166.
- Klir, G.J.; Parviz, B. A note on the measure of discord. In Proceedings of the Eighth Conference on Uncertainty in Artificial Intelligence, 1992; pp. 138–141.
- Jousselme, A.L.; Liu, C.; Grenier, D.; Bosse, E. Measuring ambiguity in the evidence theory. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2006, 36, 890–903.
- Deng, Y. Deng entropy. Chaos Solitons Fractals 2016, 91, 549–553.
- Yang, Y.; Han, D. A New Distance-Based Total Uncertainty Measure in the Theory of Belief Functions. Knowl.-Based Syst. 2016, 94, 114–123.
- Harmanec, D.; Klir, G.J. Measuring total uncertainty in Dempster-Shafer theory: A novel approach. Int. J. Gen. Syst. 1994, 22, 405–419.
- Wang, X.; Song, Y. Uncertainty measure in evidence theory with its applications. Appl. Intell. 2018, 48, 1672–1688.
- Jirousek, R.; Shenoy, P. A new definition of entropy of belief functions in the Dempster-Shafer theory. Int. J. Approx. Reason. 2018, 92, 49–65.
- Deng, X. Analyzing the monotonicity of belief interval based uncertainty measures in belief function theory. Int. J. Intell. Syst. 2018, 33, 1775–1985.
- Abellán, J.; Moral, S. Completing a total uncertainty measure in the Dempster-Shafer theory. Int. J. Gen. Syst. 1999, 28, 299–314.
- Yager, R.R. Interval valued entropies for Dempster-Shafer structures. Knowl.-Based Syst. 2018, 161, 390–397.
- Xue, Y.; Deng, Y. Interval-valued belief entropies for Dempster-Shafer structures. Soft Comput. 2021, 25, 8063–8071.
- Abellán, J.; Masegosa, A. Requirements for total uncertainty measures in Dempster-Shafer theory of evidence. Int. J. Gen. Syst. 2008, 37, 733–747.
- Deng, X.; Jiang, W.; Zhang, J. Zero-Sum Matrix Game with Payoffs of Dempster-Shafer Belief Structures and Its Applications on Sensors. Sensors 2017, 17, 922.
- Jiang, W.; Hu, W. An improved soft likelihood function for Dempster-Shafer belief structures. Int. J. Intell. Syst. 2018, 33, 1264–1282.
- Deng, Y. Uncertainty measure in evidence theory. Sci. China Inf. Sci. 2020, 63, 210201.
- Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern Classification; John Wiley & Sons: Hoboken, NJ, USA, 2012.
- Masson, M.; Denoeux, T. ECM: An evidential version of the fuzzy c-means algorithm. Pattern Recognit. 2008, 41, 1384–1397.
Uncertainty Measures (FOD size n) | n=1 | n=2 | n=3 | n=4 | n=5 | n=6 | n=7 | n=8 | n=9 | n=10 | n=11 | n=12 | n=13 | n=14
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
Deng entropy | 0 | 1.5850 | 2.8074 | 3.9069 | 4.9542 | 5.9773 | 6.9887 | 7.9944 | 8.9972 | 9.9986 | 10.9993 | 11.9996 | 12.9998 | 13.9999 |
AU | 0 | 1 | 1.5850 | 2 | 2.3219 | 2.5850 | 2.8074 | 3 | 3.1699 | 3.3219 | 3.4594 | 3.5850 | 3.7004 | 3.8074 |
Weighted Hartley entropy | 0 | 1 | 1.5850 | 2 | 2.3219 | 2.5850 | 2.8074 | 3 | 3.1699 | 3.3219 | 3.4594 | 3.5850 | 3.7004 | 3.8074 |
Yang and Han’s measure | 0 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 |
SU | 0 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 |
JS | 0 | 2 | 3.1699 | 4 | 4.6439 | 5.1699 | 5.6147 | 6 | 6.3399 | 6.6439 | 6.9189 | 7.1699 | 7.4009 | 7.6147 |
AM | 0 | 1 | 1.5850 | 2 | 2.3219 | 2.5850 | 2.8074 | 3 | 3.1699 | 3.3219 | 3.4594 | 3.5850 | 3.7004 | 3.8074 |
Deng's measure ($TU^I_E$) | 0 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 |
Yager’s dissonance entropy | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
Proposed method $H^1$ | 0 | 2 | 3.1699 | 4 | 4.6439 | 5.1699 | 5.6147 | 6 | 6.3399 | 6.6439 | 6.9189 | 7.1699 | 7.4009 | 7.6147 |
Proposed method $H^2$ | 0 | 1.5850 | 2.8074 | 3.9069 | 4.9542 | 5.9773 | 6.9887 | 7.9944 | 8.9972 | 9.9986 | 10.9993 | 11.9996 | 12.9998 | 13.9999 |
Proposed method $H^3$ | 0 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 |
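The pattern in the table above is consistent with the vacuous BPA m(X) = 1 over a FOD of size n: the Deng entropy row equals log2(2^n − 1), while the AU and weighted Hartley rows equal log2 n. A short check (illustrative names):

```python
import math

def vacuous_deng(n):
    # Deng entropy of the vacuous BPA m(X) = 1 with |X| = n
    return math.log2(2**n - 1)

def vacuous_hartley(n):
    # weighted Hartley entropy (and AU) of the vacuous BPA
    return math.log2(n)

row = [round(vacuous_deng(n), 4) for n in range(1, 15)]
# 0.0, 1.585, 2.8074, 3.9069, ... reproducing the Deng entropy row
```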
Uncertainty Measures | ||
---|---|---|
Deng entropy | 2.3219 | 2.3219 |
AU | 1 | 1.9710 |
Weighted Hartley entropy | 0.6 | 0.6 |
Yang and Han’s measure | 0.4 | 0.4394 |
SU | 1.6 | 2.5710 |
JS | 1.6 | 2.4113 |
AM | 1 | 1.9710 |
Deng's measure ($TU^I_E$) | 0.3586 | 0.3877 |
Yager’s dissonance entropy | 0.1288 | 1.3710 |
Uncertainty Measures | ||
---|---|---|
Deng entropy | 2.5559 | 2.5559 |
AU | 1.9710 | 1.9710 |
Weighted Hartley entropy | 1.0000 | 1.0000 |
Yang and Han’s measure | 0.5000 | 0.5000 |
SU | 2.9710 | 2.4855 |
JS | 2.9710 | 2.4855 |
AM | 1.9710 | 1.4855 |
Deng's measure ($TU^I_E$) | 0.5000 | 0.5000 |
Yager’s dissonance entropy | 0.9710 | 0 |
Proposed method $H^1$ | 2.9710 | 2.0000 |
Proposed method $H^2$ | 2.5559 | 1.5850 |
Proposed method $H^3$ | 2.9710 | 2.0000 |
Uncertainty Measures | |||||||
---|---|---|---|---|---|---|---|
Deng entropy | 2.6623 | 2.9303 | 4.9082 | 5.7878 | 6.6256 | 7.4441 | 8.2532 |
AU | 1.4328 | 2.2180 | 2.6707 | 2.9821 | 3.0378 | 3.2447 | 3.2956 |
Weighted Hartley entropy | 0.4699 | 1.2699 | 1.7379 | 2.0699 | 2.3275 | 2.5379 | 2.7158 |
Yang and Han’s measure | 0.1246 | 0.2216 | 0.2749 | 0.3283 | 0.3816 | 0.4349 | 0.4867 |
SU | 4.3583 | 5.7797 | 6.4096 | 7.0394 | 7.6693 | 8.3716 | 8.9394 |
JS | 3.8322 | 4.4789 | 4.8870 | 5.2250 | 5.5200 | 5.8059 | 6.0425 |
AM | 1.3461 | 2.1037 | 2.4623 | 2.7011 | 2.8762 | 3.0684 | 3.1083 |
Deng's measure ($TU^I_E$) | 0.1195 | 0.2199 | 0.2732 | 0.3266 | 0.3799 | 0.4332 | 0.4853 |
Yager’s dissonance entropy | 0.3953 | 0.3953 | 0.1997 | 0.1997 | 0.1997 | 0.1997 | 0.0074 |
Proposed method $H^1$ | 1.3352 | 2.9352 | 3.6756 | 4.3396 | 4.8547 | 5.2756 | 5.4390 |
Proposed method $H^2$ | 2.0357 | 3.3036 | 4.0860 | 4.9656 | 5.8035 | 6.6219 | 7.2387 |
Proposed method $H^3$ | 2.0453 | 3.6453 | 4.2497 | 5.0497 | 5.8497 | 6.6497 | 7.2574 |
Uncertainty Measures | |||||||
---|---|---|---|---|---|---|---|
Deng entropy | 9.0578 | 9.8600 | 10.6612 | 11.4617 | 12.2620 | 13.0622 | 13.8622 |
AU | 3.4497 | 3.5796 | 3.6909 | 3.7824 | 3.8538 | 3.8986 | 3.9069 |
Weighted Hartley entropy | 2.8699 | 3.0059 | 3.1275 | 3.2375 | 3.3379 | 3.4303 | 3.5158 |
Yang and Han’s measure | 0.5400 | 0.5933 | 0.6467 | 0.7000 | 0.7533 | 0.8067 | 0.8600 |
SU | 9.6417 | 10.3440 | 11.0463 | 11.7486 | 12.4510 | 13.1533 | 13.8556 |
JS | 6.2772 | 6.4921 | 6.6903 | 6.8743 | 7.0461 | 7.2071 | 7.3587 |
AM | 3.2511 | 3.3747 | 3.4833 | 3.5797 | 3.6663 | 3.7446 | 3.8160 |
Deng’s measure () | 0.5386 | 0.5920 | 0.6453 | 0.6986 | 0.7520 | 0.8053 | 0.8586 |
Yager’s dissonance entropy | 0.0074 | 0.0074 | 0.0074 | 0.0074 | 0.0074 | 0.0074 | 0.0074 |
Proposed method $H^1$ | 5.7473 | 6.0192 | 6.2624 | 6.4824 | 6.6832 | 6.8680 | 7.0390 |
Proposed method $H^2$ | 8.0432 | 8.8455 | 9.6466 | 10.4472 | 11.2475 | 12.0476 | 12.8477 |
Proposed method $H^3$ | 8.0574 | 8.8574 | 9.6574 | 10.4574 | 11.2574 | 12.0574 | 12.8574 |
Features | Statistic | Setosa | Versicolour | Virginica
---|---|---|---|---
SL | Mean | 5.0060 | 5.9360 | 6.5880
SL | Standard deviation | 0.3525 | 0.5162 | 0.6359
SW | Mean | 3.4180 | 2.7700 | 2.9740
SW | Standard deviation | 0.3810 | 0.3138 | 0.3225
PL | Mean | 1.4640 | 4.2600 | 5.5520
PL | Standard deviation | 0.1735 | 0.4699 | 0.5519
PW | Mean | 0.2440 | 1.3260 | 2.0260
PW | Standard deviation | 0.1072 | 0.1978 | 0.2747
Uncertainty Measures | SL | SW | PL | PW |
---|---|---|---|---|
Weighted Hartley entropy | 0.6642 | 0.6660 | 0.5540 | 0.6556 |
Deng entropy | 3.7394 | 3.8863 | 3.1025 | 3.7501 |
AU | 1.5850 | 1.5850 | 1.4869 | 1.5850 |
AM | 1.5333 | 1.5756 | 1.2764 | 1.5183 |
SU | 2.1950 | 2.2364 | 1.8396 | 2.1723 |
JS | 2.2226 | 2.2463 | 1.9536 | 2.2068 |
Yang and Han’s measure | 0.6072 | 0.6227 | 0.4787 | 0.5986 |
Deng's measure ($TU^I_E$) | 1.6208 | 1.6532 | 1.3023 | 1.5981 |
Proposed method $H^1$ | 1.7057 | 1.7509 | 1.3665 | 1.6833 |
Proposed method $H^2$ | 1.4729 | 1.5178 | 1.1709 | 1.4542 |
Proposed method $H^3$ | 1.6810 | 1.7260 | 1.3467 | 1.6586 |
Share and Cite
Zhang, Y.; Huang, F.; Deng, X.; Jiang, W. A New Total Uncertainty Measure from A Perspective of Maximum Entropy Requirement. Entropy 2021, 23, 1061. https://doi.org/10.3390/e23081061