Spatio-temporal correlations and visual signalling in a complete neuronal population

Jonathan Pillow, Jonathon Shlens, Liam Paninski, Alexander Sher, Alan Litke, Eero Simoncelli, and E.J. Chichilnisky

Nature 454: 995-999 (2008).

Sensory encoding in spiking neurons depends on both the spatiotemporal integration of sensory inputs and the intrinsic mechanisms governing the dynamics and variability of spike generation. We show that the stimulus selectivity, reliability, and timing precision of primate retinal ganglion cell (RGC) light responses can be reproduced accurately with a simple model consisting of a leaky integrate-and-fire spike generator driven by a linearly filtered stimulus, a post-spike current, and a Gaussian noise current. We fit model parameters for individual RGCs by maximizing the likelihood of observed spike responses to a stochastic visual stimulus. Though compact, the fitted model predicts the detailed time structure of responses to novel stimuli, accurately capturing the interaction between spiking history and the encoding of the sensory stimulus. The model also accounts for the variability in responses to repeated stimuli, even when fit to data from a single (non-repeating) stimulus sequence. Finally, the model can be used to derive an explicit, maximum-likelihood decoding rule for neural spike trains, thus providing a tool for assessing the limitations that spiking variability imposes on sensory performance.
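The model class described in the abstract — a leaky integrate-and-fire spike generator driven by a linearly filtered stimulus, a post-spike current, and a Gaussian noise current — can be sketched in a few lines of NumPy. The function below is a minimal illustrative simulation, not the authors' implementation; all parameter names and default values (filter `k`, post-spike current `h`, membrane time constant `tau`, threshold, noise level) are hypothetical placeholders.

```python
import numpy as np

def simulate_lif(stimulus, k, h, dt=0.001, tau=0.01, v_thresh=1.0,
                 v_reset=0.0, noise_sd=0.1, seed=0):
    """Noisy leaky integrate-and-fire neuron driven by a linearly
    filtered stimulus plus a post-spike current (illustrative sketch;
    parameter names and values are hypothetical)."""
    rng = np.random.default_rng(seed)
    # Linear stage: convolve the stimulus with the filter k
    drive = np.convolve(stimulus, k)[: len(stimulus)]
    # Buffer holding the summed post-spike currents injected by past spikes
    i_post = np.zeros(len(stimulus) + len(h))
    v = v_reset
    spikes = []
    for t in range(len(stimulus)):
        # Leaky integration of stimulus drive, post-spike current,
        # and a Gaussian noise current (Euler step)
        dv = (-v / tau + drive[t] + i_post[t]) * dt
        dv += noise_sd * np.sqrt(dt) * rng.standard_normal()
        v += dv
        if v >= v_thresh:
            spikes.append(t)          # record spike time (in bins)
            v = v_reset               # reset membrane potential
            # Inject the post-spike current h into the future
            i_post[t + 1 : t + 1 + len(h)] += h
    return np.array(spikes)
```

In the fitting procedure the abstract describes, parameters such as `k`, `h`, and the noise level would be chosen by maximizing the likelihood of recorded spike trains under this generative model, rather than set by hand as here.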
Preprint (pdf, 1.6M)