Several Basic Elements of Entropic Statistics
Abstract
1. Introduction and Summary
2. Things Entropic
2.1. Sample Spaces in Different Resolutions
2.2. Entropic Objects
- A function h = h(p↓), for all p↓ = {p_(k); k ≥ 1}, the decreasingly ordered rearrangement of a distribution p = {p_k; k ≥ 1} on the alphabet, is an entropy.
- The elements of p↓ are the entropic parameters, as compared to the elements of p, which are multinomial parameters.
- The elements of p̂↓ = {p̂_(k); k ≥ 1}, or equivalently of the decreasingly ordered observed counts, are entropic statistics, as compared to the elements of p̂ = {p̂_k; k ≥ 1}, or equivalently the observed counts of the individual letters, which are multinomial statistics (a small illustration follows this list).
- Entropic statistics is the collection of statistical methodologies that help to make inference on the characteristics of a random system exclusively via entropies.
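The distinction can be made concrete with a short sketch. It is illustrative only: Shannon's entropy stands in for a generic entropy, NumPy is used for convenience, and all names are chosen here rather than taken from the paper. Relabeling the alphabet permutes the multinomial parameters p but leaves the entropic parameters p↓, and hence every entropy, unchanged.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy, one familiar example of a label-independent functional."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # the convention 0 * log 0 = 0
    return -np.sum(p * np.log(p))

# Multinomial parameters: a distribution on a 4-letter alphabet.
p = np.array([0.4, 0.3, 0.2, 0.1])

# Entropic parameters: the same probabilities in decreasing order, labels discarded.
p_down = np.sort(p)[::-1]

# Relabeling the alphabet permutes p but not p_down, so no entropy can tell the difference.
rng = np.random.default_rng(0)
p_relabeled = rng.permutation(p)

assert np.allclose(np.sort(p_relabeled)[::-1], p_down)
assert np.isclose(shannon_entropy(p), shannon_entropy(p_relabeled))
print(shannon_entropy(p))              # ≈ 1.2799 nats
```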
2.3. Examples of Entropic Statistics
1. First, a random sample of size n, X_1, …, X_n, is taken from the alphabet under an unknown distribution p = {p_k; k ≥ 1}, which is summarized into the sample relative frequencies p̂ = {p̂_k; k ≥ 1}.
2. The data-based local classification rule is as follows: the next observation, X_{n+1}, is predicted to be the letter which is observed most frequently in the sample of size n. For simplicity, it is assumed that a letter with the sample maximum frequency is unique (if not, some randomization may be employed). A simulation sketch of this rule follows the list.
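The sketch below simulates the rule on an illustrative binary alphabet with p = (2/3, 1/3); the sample size, the number of trials, and all names are chosen here for demonstration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2023)

def majority_rule_prediction(sample, rng):
    """Predict the next observation to be a letter with the sample maximum frequency,
    breaking ties (if any) by randomization."""
    letters, counts = np.unique(sample, return_counts=True)
    best = letters[counts == counts.max()]
    return rng.choice(best)

alphabet, p = np.array(["a", "b"]), np.array([2 / 3, 1 / 3])

n, trials, hits = 7, 50_000, 0
for _ in range(trials):
    sample = rng.choice(alphabet, size=n, p=p)
    x_next = rng.choice(alphabet, p=p)            # the (n+1)-th observation
    hits += majority_rule_prediction(sample, rng) == x_next

print(hits / trials)                              # relative frequency of correct predictions
```

With p₁ = 2/3 and n = 7, the rule selects the more probable letter with probability 1808/2187 ≈ 0.827, so the printed frequency of correct predictions settles near 0.827·(2/3) + 0.173·(1/3) ≈ 0.61.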
3. Entropic Characterization
4. A Basic Convergence Theorem
1. …, and
2. ….
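A small simulation sketch follows, under the reading that the theorem concerns the almost sure convergence of the decreasingly ordered sample relative frequencies p̂↓ to the ordered probabilities p↓, which are the objects of the covering argument in Appendix A; the truncated geometric-type distribution and all names are illustrative assumptions, not part of the theorem.

```python
import numpy as np

rng = np.random.default_rng(7)

# An illustrative distribution on a countable alphabet, truncated so it can be sampled:
# p_k proportional to 2^(-k) for k = 1, ..., 30.
K = 30
p = 2.0 ** -np.arange(1, K + 1)
p /= p.sum()
p_down = np.sort(p)[::-1]                          # ordered probabilities p↓

for n in (100, 10_000, 1_000_000):
    sample = rng.choice(K, size=n, p=p)            # letters encoded as 0, ..., K-1
    counts = np.bincount(sample, minlength=K)
    phat_down = np.sort(counts / n)[::-1]          # ordered relative frequencies p̂↓
    print(n, np.max(np.abs(phat_down - p_down)))   # sup_k |p̂_(k) - p_(k)| shrinks with n
```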
5. Conclusions and Discussion
Funding
Data Availability Statement
Conflicts of Interest
Appendix A
- , that is, are disjoint for all .
- For every k, , and it is the only observed relative frequency in .
- , that is, are disjoint for all i, .
- For every given k, and therefore an implied i, there are exactly relative frequencies among found in .
- The first m probabilities of , , are covered, respectively, by m disjoint intervals, , .
- The relative frequencies corresponding to , namely, , are also covered, respectively, by the same disjoint intervals, , .
- The first probabilities of , , are covered, respectively, by disjoint intervals, , .
- The relative frequencies corresponding to , namely, , are also covered, respectively, by the same disjoint intervals, , .
- but is not necessarily equal component-wise;
- for all ;
- In particular, .
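The bullets above outline a covering argument. The following is a minimal rendering of its key step for the case p_(1) > p_(2) > ⋯ > p_(m) > p_(m+1); the radius ε_m and the interval names I_k are introduced here for illustration and need not coincide with the Appendix's own notation.

```latex
% Covering step (sketch), assuming p_{(1)} > p_{(2)} > \dots > p_{(m)} > p_{(m+1)}.
\[
  \varepsilon_m \;=\; \tfrac{1}{3}\,\min_{1 \le k \le m}\bigl(p_{(k)} - p_{(k+1)}\bigr),
  \qquad
  I_k \;=\; \bigl(p_{(k)} - \varepsilon_m,\; p_{(k)} + \varepsilon_m\bigr),
  \quad k = 1, \dots, m,
\]
% so that I_1, ..., I_m are disjoint. Almost surely \hat{p}_j \to p_j for every letter
% simultaneously, hence \sum_j |\hat{p}_j - p_j| \to 0 (Scheffe's lemma, both sequences
% summing to one) and \sup_j |\hat{p}_j - p_j| \to 0. Therefore, for all sufficiently
% large n, the relative frequency of the letter carrying p_{(k)} lies in I_k while every
% other letter's relative frequency stays below p_{(m)} - \varepsilon_m, so the k-th
% largest relative frequency is the one in I_k:
\[
  \bigl|\hat{p}_{(k)} - p_{(k)}\bigr| \;<\; \varepsilon_m,
  \qquad k = 1, \dots, m,
\]
% almost surely for all sufficiently large n. Letting m grow gives the componentwise
% convergence of \hat{p}^{\downarrow} to p^{\downarrow}; ties among the p_{(k)} are
% handled by covering each distinct value with a single interval, as described in the
% bullets above.
```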
Probability that a letter with probability p₁ on a binary alphabet attains the sample majority among n observations, i.e., P(Binomial(n, p₁) > n/2), cf. the classification rule of Section 2.3:

| p₁ | n = 3 | n = 5 | n = 7 |
|-----|--------|--------|--------|
| 1/2 | 0.5000 | 0.5000 | 0.5000 |
| 2/3 | 0.7407 | 0.7901 | 0.8267 |
| 3/4 | 0.8438 | 0.8965 | 0.9294 |
| 4/5 | 0.8960 | 0.9421 | 0.9667 |
| 5/6 | 0.9259 | 0.9645 | 0.9824 |
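The entries can be reproduced as binomial tail probabilities; the check below is written for this document (not code from the paper) under the interpretation stated above the table.

```python
from math import comb

def prob_majority(n, p1):
    """P(a letter with probability p1 appears more than n/2 times among n draws)."""
    return sum(comb(n, k) * p1**k * (1 - p1)**(n - k) for k in range(n // 2 + 1, n + 1))

for p1 in (1/2, 2/3, 3/4, 4/5, 5/6):
    print([round(prob_majority(n, p1), 4) for n in (3, 5, 7)])
# [0.5, 0.5, 0.5]
# [0.7407, 0.7901, 0.8267]
# [0.8438, 0.8965, 0.9294]
# [0.896, 0.9421, 0.9667]
# [0.9259, 0.9645, 0.9824]
```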