# Decomposition of a Single Observation

The entropy of a single observed symbol from a time series can be decomposed into several parts. After observing many symbols, we may come to understand the structure of the series, and so the next symbol does not stand apart, but rather shares some information with our past observations. This shared amount is known as the total correlation rate, $\rho_\mu$, because it is the asymptotic rate of the block total correlation curve. Carving $\rho_\mu$ out of the single-symbol entropy leaves us with the "surprising" amount of information in the observation, the entropy rate $h_\mu$, known as such because it is the rate at which the block entropy curve grows asymptotically.
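This split can be checked numerically. The sketch below is not from the original text: it assumes a particular toy process (a binary first-order Markov chain, the "golden mean" process), for which one symbol of history suffices, making the rates exact rather than asymptotic estimates. It computes the single-symbol entropy, the entropy rate, and their difference, the total correlation rate.

```python
from math import log2

# Assumed toy example: the "golden mean" process, a binary Markov chain
# in which a 1 is always followed by a 0. Any stationary process works;
# for a Markov chain one symbol of context makes the rates exact.
P = {0: {0: 0.5, 1: 0.5}, 1: {0: 1.0, 1: 0.0}}
pi = {0: 2 / 3, 1: 1 / 3}  # stationary distribution: pi satisfies pi @ P == pi

def H(dist):
    """Shannon entropy (in bits) of a probability mapping."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Single-symbol entropy H[X_0].
H1 = H(pi)

# Entropy rate: for a first-order Markov chain,
#   h_mu = H[X_1 | X_0] = sum_a pi(a) * H[P(. | a)].
h_mu = sum(pi[a] * H(P[a]) for a in pi)

# Total correlation rate: the part of H[X_0] already shared with the past,
#   rho_mu = H[X_0] - h_mu.
rho_mu = H1 - h_mu

print(f"H[X_0] = {H1:.4f}, h_mu = {h_mu:.4f}, rho_mu = {rho_mu:.4f}")
```

For this chain the values come out to roughly $H[X_0] \approx 0.9183$, $h_\mu \approx 0.6667$, and $\rho_\mu \approx 0.2516$ bits, and the two pieces sum back to the single-symbol entropy by construction.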

Abdallah and Plumbley [abdallah2010] defined the predictive information rate as a measure of complexity for discrete time series. Their particular use case was a measure of "interestingness" for music, but the quantity stands on its own outside that domain. In fact, we can decompose the entropy rate $h_\mu$ into two pieces: a part that stands on its own, neither influencing nor influenced by past or future observations, and a part that is shared with future structure. These parts are known as the residual entropy rate, $r_\mu$, and the predictive information rate, $b_\mu$, respectively.
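The second split can be computed the same way. This sketch again assumes the golden mean process as a toy example (not part of the original text); because the chain is first-order Markov, conditioning on one symbol of past and one of future is sufficient, so $r_\mu = H[X_0 \mid X_{-1}, X_1]$ is exact here, and $b_\mu$ follows as the remainder of the entropy rate.

```python
from math import log2

# Assumed toy example: the "golden mean" binary Markov chain.
P = {0: {0: 0.5, 1: 0.5}, 1: {0: 1.0, 1: 0.0}}
pi = {0: 2 / 3, 1: 1 / 3}  # stationary distribution

def H(dist):
    """Shannon entropy (in bits) of a probability mapping."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Joint distribution of a length-3 block (X_{-1}, X_0, X_1).
triple = {(a, b, c): pi[a] * P[a][b] * P[b][c]
          for a in (0, 1) for b in (0, 1) for c in (0, 1)}

# Marginal over the middle symbol: the context (X_{-1}, X_1).
context = {}
for (a, b, c), p in triple.items():
    context[(a, c)] = context.get((a, c), 0.0) + p

# Residual entropy rate: r_mu = H[X_0 | X_{-1}, X_1]
#                             = H[X_{-1}, X_0, X_1] - H[X_{-1}, X_1].
r_mu = H(triple) - H(context)

# Entropy rate h_mu = H[X_0 | X_{-1}]; the predictive information rate
# is the part of h_mu shared with the future: b_mu = h_mu - r_mu.
h_mu = sum(pi[a] * H(P[a]) for a in pi)
b_mu = h_mu - r_mu

print(f"h_mu = {h_mu:.4f} = r_mu ({r_mu:.4f}) + b_mu ({b_mu:.4f})")
```

For this example $r_\mu \approx 0.4591$ and $b_\mu \approx 0.2075$ bits, recovering $h_\mu = r_\mu + b_\mu$, so the full single-symbol decomposition reads $H[X_0] = \rho_\mu + r_\mu + b_\mu$.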