Relative entropy between Markov transition rate matrices
The relative entropy between two Markov transition rate matrices is derived from sample
path considerations. This relative entropy is interpreted as a level-2.5 large-deviations action
functional. That is, the level-two large-deviations action functional for empirical distributions
of continuous-time Markov chains can be derived from the relative entropy using the
contraction principle.
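The paper's derivation proceeds from sample paths, but the resulting quantity can be illustrated numerically. The sketch below computes the commonly cited relative entropy rate between two stationary continuous-time Markov chains with rate matrices Q and Q', H = Σ_i π_i Σ_{j≠i} [q_ij log(q_ij/q'_ij) − q_ij + q'_ij], where π is the stationary distribution of Q. The function names and the exact form of the formula are assumptions for illustration, not taken from the paper itself.

```python
import numpy as np

def stationary_distribution(Q):
    """Solve pi @ Q = 0 with sum(pi) = 1, for a rate matrix Q
    (rows sum to zero, off-diagonal entries nonnegative)."""
    n = Q.shape[0]
    # Append the normalization constraint as an extra equation.
    A = np.vstack([Q.T, np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

def relative_entropy_rate(Q, Qp):
    """Relative entropy per unit time of the chain with rates Q
    relative to the chain with rates Qp, weighted by the stationary
    law of Q (an assumed standard formula, not the paper's notation):
        H = sum_i pi_i * sum_{j != i} [q_ij log(q_ij / q'_ij)
                                        - q_ij + q'_ij].
    """
    pi = stationary_distribution(Q)
    n = Q.shape[0]
    H = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            q, qp = Q[i, j], Qp[i, j]
            if q > 0.0:
                H += pi[i] * (q * np.log(q / qp) - q + qp)
            else:
                # x log x -> 0 as x -> 0; only the +q' term survives.
                H += pi[i] * qp
    return H
```

As expected of a relative entropy, the rate is zero when Q equals Q' and strictly positive otherwise, consistent with its role as a large-deviations action functional.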