A Review of Shannon and Differential Entropy Rate Estimation

Entropy (Basel). 2021 Aug 13;23(8):1046. doi: 10.3390/e23081046.

Abstract

In this paper, we present a review of Shannon and differential entropy rate estimation techniques. The entropy rate, which measures the average information gain from a stochastic process, quantifies the uncertainty and complexity of that process. We discuss the estimation of entropy rate from empirical data and review both parametric and non-parametric techniques. For parametric estimation, we consider a range of assumptions on the properties of the process, focusing in particular on Markov and Gaussian assumptions. Non-parametric estimation relies on limit theorems that relate the entropy rate to quantities computable from observations; to discuss these, we introduce the relevant theory and the practical implementation of estimators of this type.
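To illustrate the parametric approach under a first-order Markov assumption, the following minimal sketch (not taken from the paper) computes a plug-in estimate of the Shannon entropy rate of a discrete sequence: it estimates the transition matrix from observed transition counts, uses empirical state frequencies as a stand-in for the stationary distribution, and evaluates H = -Σ_i π_i Σ_j P_ij log P_ij in nats per symbol. The function name `markov_entropy_rate_plugin` and the two-state example chain are illustrative assumptions, not from the reviewed work.

```python
import numpy as np

def markov_entropy_rate_plugin(sequence, n_states):
    """Plug-in Shannon entropy rate estimate (nats/symbol) under a first-order Markov assumption."""
    seq = np.asarray(sequence)
    # Count observed transitions and normalize rows to get the estimated transition matrix.
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    P_hat = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
    # Empirical state frequencies stand in for the stationary distribution.
    pi_hat = np.bincount(seq, minlength=n_states) / len(seq)
    with np.errstate(divide="ignore", invalid="ignore"):
        log_P = np.where(P_hat > 0, np.log(P_hat), 0.0)
    return float(-np.sum(pi_hat[:, None] * P_hat * log_P))

if __name__ == "__main__":
    # Hypothetical two-state chain used only to exercise the estimator.
    rng = np.random.default_rng(0)
    P = np.array([[0.9, 0.1], [0.2, 0.8]])
    x = [0]
    for _ in range(10_000):
        x.append(int(rng.choice(2, p=P[x[-1]])))
    # True entropy rate of this chain is about 0.38 nats/symbol.
    print(markov_entropy_rate_plugin(x, 2))
```

This sketch covers only the discrete Markov case; the Gaussian-process and non-parametric estimators surveyed in the paper require different machinery (spectral densities and limit theorems, respectively).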

Keywords: entropy rate; estimation; non-parametric; parametric.

Publication types

  • Review