Excess Entropy Tutorial

This tutorial walks through the steps for calculating the excess entropy of an \eM. CMPy includes a method that robustly computes the excess entropy, \EE, of a machine.

We first instantiate a machine to work with, and then compute the excess entropy:

from cmpy.machines import NRPS   # example machine; module path assumed

m = NRPS()
E = m.excess_entropy()

Behind the scenes, excess_entropy tries several methods of computing the excess entropy, in decreasing order of exactness. It first attempts to compute the mutual information I[\CausalState^+ : \CausalState^-] between the forward and reverse causal states by constructing the bi-directional machine. This exact method is also available on its own:

E = m.excess_entropy_exact()
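
To see what this quantity is, here is a small self-contained sketch (independent of CMPy) that computes a mutual information I[\CausalState^+ : \CausalState^-] directly from a joint distribution over forward and reverse causal states; the joint table used below is purely illustrative:

import numpy as np

def mutual_information(joint):
    # I[S+ : S-] in bits from a joint probability table p(s+, s-).
    joint = np.asarray(joint, dtype=float)
    p_plus = joint.sum(axis=1)      # marginal over forward causal states
    p_minus = joint.sum(axis=0)     # marginal over reverse causal states
    mi = 0.0
    for i, pi in enumerate(p_plus):
        for j, pj in enumerate(p_minus):
            pij = joint[i, j]
            if pij > 0:
                mi += pij * np.log2(pij / (pi * pj))
    return mi

# Purely illustrative joint distribution over (forward, reverse) state pairs.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(mutual_information(joint))    # ~0.278 bits for this toy table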

If this method fails (generally because the process is explosively irreversible), we move on to another method that can still give the exact value of the excess entropy. We know that if \ell \ge \MOrder, then H[\MeasSymbols{0}{\ell}] - \ell \hmu = \EE, and similarly, if \ell \ge \COrder, then H[\MeasSymbols{0}{\ell}, \CausalState_\ell] - \ell \hmu = \EE. Since \COrder \le \MOrder, we make use of the second fact:

import numpy as np

k = m.cryptic_order()
if k < np.inf:                        # the cryptic order must be finite
    hmu = m.entropy_rate()
    # block-state entropy H[X_{0:k}, S_k] minus k * hmu gives E
    E = m.block_entropies([k], HX0LSL=True)[1] - k * hmu
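
As an independent check of the first identity, the following sketch (plain Python, not the CMPy API) enumerates the length-\ell words of a two-state Golden Mean \eM, whose Markov order is 1, and shows that H[\MeasSymbols{0}{\ell}] - \ell \hmu stops changing, at \EE \approx 0.2516 bits, once \ell reaches the Markov order:

import numpy as np
from itertools import product

# Two-state Golden Mean eM: transitions[state][symbol] = (probability, next state).
transitions = {
    'A': {'0': (0.5, 'A'), '1': (0.5, 'B')},
    'B': {'0': (1.0, 'A')},
}
pi = {'A': 2/3, 'B': 1/3}            # stationary distribution over causal states

def word_probability(word):
    # Stationary probability that the machine emits `word`.
    total = 0.0
    for start in pi:
        p, state = pi[start], start
        for symbol in word:
            if symbol not in transitions[state]:
                p = 0.0
                break
            prob, state = transitions[state][symbol]
            p *= prob
        total += p
    return total

def block_entropy(ell):
    # H[X_{0:ell}] in bits, by summing over all length-ell words.
    probs = [word_probability(w) for w in product('01', repeat=ell)]
    return -sum(p * np.log2(p) for p in probs if p > 0)

hmu = 2/3                            # entropy rate of this machine, in bits
for ell in range(1, 6):
    print(ell, block_entropy(ell) - ell * hmu)
# Every line prints ~0.2516 bits, since the Markov order here is 1.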

If both of these exact methods fail, we must fall back to approximation. The approach is to compute the block entropy and the block-state entropy at increasing lengths until the two resulting estimates of \EE are within some tolerance of each other, and then return their average:

tolerance = 1e-6   # convergence threshold for the two estimates
E = sum(m.excess_entropy_bounds(tolerance)) / 2   # average of the two bounds
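
For intuition, a schematic version of such a convergence loop is sketched below. It is written against two hypothetical helpers, block_entropy(ell) and block_state_entropy(ell), rather than the actual CMPy calls, and the precise bounds CMPy uses may differ:

def excess_entropy_approx(block_entropy, block_state_entropy, hmu,
                          tolerance=1e-6, max_length=1000):
    # Schematic sketch: stop once the two estimates of E agree to within
    # `tolerance`, then return their average.
    for ell in range(1, max_length + 1):
        est_block = block_entropy(ell) - ell * hmu        # estimate from H[X_{0:ell}]
        est_state = block_state_entropy(ell) - ell * hmu  # estimate from H[X_{0:ell}, S_ell]
        if abs(est_block - est_state) < tolerance:
            return (est_block + est_state) / 2
    raise RuntimeError("estimates did not converge within max_length")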

Any of these three methods can be used to compute the excess entropy, but if you just want the best answer available without worrying about which method applies, simply call excess_entropy.
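
Schematically, the fallback chain that excess_entropy works through can be pictured as below. This is only an illustration of the order of attempts described above; in particular, the assumption that a failed exact computation surfaces as an exception is mine, not CMPy's documented behavior:

import numpy as np

def robust_excess_entropy(m, tolerance=1e-6):
    # 1. Exact: mutual information between forward and reverse causal states.
    try:
        return m.excess_entropy_exact()
    except Exception:
        pass
    # 2. Exact: block-state entropy at the cryptic order, when it is finite.
    k = m.cryptic_order()
    if k < np.inf:
        return m.block_entropies([k], HX0LSL=True)[1] - k * m.entropy_rate()
    # 3. Approximate: average the two converging bounds.
    return sum(m.excess_entropy_bounds(tolerance)) / 2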