On measures of entropy and information

A Rényi - Proceedings of the fourth Berkeley symposium on …, 1961 - projecteuclid.org
1. Characterization of Shannon's measure of entropy

Let $\mathcal{P} = (p_1, p_2, \ldots, p_n)$ be a finite discrete probability distribution, that is, suppose $p_k \ge 0$ $(k = 1, 2, \ldots, n)$ and $\sum_{k=1}^{n} p_k = 1$. The amount of uncertainty of the distribution $\mathcal{P}$, that is, the amount of uncertainty concerning the outcome of an experiment, the possible results of which have the probabilities $p_1, p_2, \ldots, p_n$, is called the entropy of the distribution $\mathcal{P}$ and is usually measured by the quantity $H[\mathcal{P}] = H(p_1, p_2, \ldots, p_n)$, introduced by Shannon [1] and defined by

$$H(p_1, p_2, \ldots, p_n) = \sum_{k=1}^{n} p_k \log_2 \frac{1}{p_k}. \tag{1.1}$$
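As a quick illustration of definition (1.1), the following sketch computes $H(p_1, \ldots, p_n)$ in base-2 logarithms (so the result is in bits); the function name and the zero-probability convention $0 \cdot \log(1/0) = 0$ are assumptions for this example, not part of the original text.

```python
import math

def shannon_entropy(probs):
    """Entropy H(p1, ..., pn) = sum_k p_k * log2(1/p_k), per (1.1).

    Terms with p_k = 0 are skipped, following the usual convention
    that 0 * log(1/0) = 0. (Convention assumed here for illustration.)
    """
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A uniform distribution over 4 outcomes carries log2(4) = 2 bits
# of uncertainty, while a degenerate distribution carries none.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # -> 2.0
print(shannon_entropy([1.0]))                     # -> 0.0
```

Note that the uniform distribution maximizes this quantity, matching the intuition that uncertainty is greatest when all outcomes are equally likely.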