Maximally reliable Markov chains under energy constraints

Sean Escola, Michael Eisele, Ken Miller, and Liam Paninski

Neural Computation 21: 1863-1912.

Signal-to-noise ratios in physical systems can be significantly degraded if the output of a system is highly variable. Biological processes for which highly stereotyped signal generation is a necessary feature appear to have reduced their signal variabilities by employing multiple processing steps. To better understand why this multi-step cascade structure might be desirable, we prove that the reliability of a signal generated by a multi-state system with no memory (i.e., a Markov chain) is maximal if and only if the system topology is such that the process steps irreversibly through each state, with transition rates chosen such that an equal fraction of the total signal is generated in each state. Furthermore, our result indicates that by increasing the number of states, it is possible to arbitrarily increase the reliability of the system. In a physical system, however, there is an energy cost associated with maintaining irreversible transitions, and this cost increases with the number of such transitions (i.e., the number of states). Thus an infinite-length chain, which would be perfectly reliable, is infeasible. To model the effects of energy demands on the maximally reliable solution, we numerically optimize the topology under two distinct energy functions that penalize either irreversible transitions or incommunicability between states, respectively. In both cases, the solutions are essentially irreversible linear chains, but with upper bounds on the number of states set by the amount of available energy. We therefore conclude that a physical system for which signal reliability is important should employ a linear architecture, with the number of states (and thus the reliability) determined by the intrinsic energy constraints of the system.
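The following is a minimal numerical sketch (not part of the paper) illustrating the headline result for the simplest maximally reliable topology: if a process steps irreversibly through N states with equal exponential dwell rates, the total transit time is Erlang-distributed, so its coefficient of variation is 1/sqrt(N) and shrinks as states are added. The function name and parameters below are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def transit_time_cv(n_states, rate=1.0, n_samples=100_000):
    """Simulate the time to step irreversibly through n_states states,
    each with an exponential dwell time of the given rate, and return
    the coefficient of variation (std / mean) of the total transit time."""
    dwell_times = rng.exponential(1.0 / rate, size=(n_samples, n_states))
    total = dwell_times.sum(axis=1)
    return total.std() / total.mean()

for n in (1, 4, 16, 64):
    print(f"N = {n:3d}: simulated CV = {transit_time_cv(n):.3f}, "
          f"theoretical 1/sqrt(N) = {1 / np.sqrt(n):.3f}")
```

The simulated coefficients of variation track 1/sqrt(N), consistent with the claim that reliability can be increased arbitrarily by adding states, absent energy constraints.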

*NOTE: After publication of this paper, David Aldous graciously pointed out that our basic inequality on the reliability of a Markov chain has in fact been known for some time. See Aldous, D. and Shepp, L. (1987), "The least variable phase type distribution is Erlang," Stochastic Models 3: 467-473, for a proof, or more recently the nice review paper by Arnold (2007), "Majorization: Here, There and Everywhere," Statistical Science 22(3): 407-413, for a discussion of a number of related results.
