Curriculum Vitae
Liam Paninski
December 28, 2012



Current position

Associate Professor, Department of Statistics, Center for Theoretical Neuroscience, Doctoral Program in Neurobiology and Behavior, and Kavli Institute for Brain Science, Columbia University.

Co-director, Grossman Center for the Statistics of Mind.



Education

New York University; Ph.D., Neural Science (2003).
Brown University; B.S., Neuroscience (1999).



Previous experience

Assistant Professor, Department of Statistics, Center for Theoretical Neuroscience, and Doctoral Program in Neurobiology and Behavior, Columbia University (2005-8).
Senior research fellow, Gatsby Computational Neuroscience Unit, University College London (2004-5).
Postdoctoral fellow, Center for Neural Science, HHMI, NYU (2003).



Papers

[78] Ramirez, A. & Paninski, L. (2012). Fast generalized linear model estimation via expected log-likelihoods. Under review.

[77] Smith, C. & Paninski, L. (2012). Computing loss of efficiency in optimal Bayesian decoders given noisy or incomplete spike trains. Under review.

[76] Pakman, A. & Paninski, L. (2012). Efficient multivariate truncated normal sampling via exact Hamiltonian Monte Carlo. Under minor revision.

[75] Pakman, A., Huggins, J., & Paninski, L. (2012). Fast penalized state-space methods for inferring dendritic synaptic connectivity. Under minor revision.

[74] Sadeghi et al. (2012). Monte Carlo methods for localization of cones given multielectrode retinal ganglion cell recordings. In press, Network: Computation in Neural Systems.

[73] Pnevmatikakis, E., Rahnama Rad, K., Huggins, J., & Paninski, L. (2012). Fast Kalman filtering and forward-backward smoothing via a low-rank perturbative approach. In press, J. Comput. Graph. Stat.

[72] Doi et al. (2012). Efficient coding of spatial information in the primate retina. Journal of Neuroscience 32: 16256-16264.

[71] Pnevmatikakis, E., Kelleher, K., Chen, R., Josic, K., Saggau, P. & Paninski, L. (2012). Fast nonnegative spatiotemporal calcium smoothing in dendritic trees. PLoS Comp. Bio. 8: e1002569.

[70] Paninski, L., Rahnama Rad, K. & Vidne, M. (2012). Robust particle filters via sequential pairwise reparameterized Gibbs sampling. CISS '12.

[69] Mishchenko, Y. & Paninski, L. (2012) Bayesian compressed sensing approach to reconstructing neural connectivity from subsampled anatomical data. J. Comput. Neuro. 33: 371-88.

[68] Pnevmatikakis, E. & Paninski, L. (2012). Fast interior-point inference in high-dimensional sparse, penalized state-space models. AISTATS '12.

[67] Smith, C., Wood, F. & Paninski, L. (2012). Low rank continuous-space graphical models. AISTATS '12.

[66] Vidne et al. (2012). The impact of common noise on the activity of a large network of retinal ganglion cells. J. Comput. Neuro. 33: 97-121.

[65] Paninski, L., Vidne, M., DePasquale, B., & Ferreira, D. (2012). Inferring synaptic inputs given a noisy voltage trace. J. Comput. Neuro. 33: 1-19.

[64] Huggins, J. & Paninski, L. (2012). Optimal experimental design for sampling voltage on dendritic trees. J. Comput. Neuro. 32: 347-66.

[63] Nazarpour, K., Ethier, C., Paninski, L., Rebesco, J., Miall, C., & Miller, L. (2011). EMG prediction from motor cortical recordings via a non-negative point process filter. IEEE Transactions on Biomedical Engineering 59: 1829-1838.

[62] Rahnama Rad, K. & Paninski, L. (2011). Information rates and optimal decoding in large neural populations. NIPS.

[61] Mishchenko, Y. & Paninski, L. (2011). Efficient methods for sampling spike trains in networks of coupled neurons. Annals of Applied Statistics 5: 1893-1919.

[60] Ahmadian, Y., Packer, A., Yuste, R. & Paninski, L. (2011). Designing optimal stimuli to control neuronal spike timing. J. Neurophys. 106: 1038-1053.

[59] Butts, D., Weng, C., Jin, J., Alonso, J.-M. & Paninski, L. (2011). Temporal precision in the visual pathway through the interplay of excitation and stimulus-driven suppression. J. Neurosci. 31: 11313-11327.

[58] Mishchenko, Y., Vogelstein, J. & Paninski, L. (2011). A Bayesian approach for inferring neuronal connectivity from calcium fluorescent imaging data. Annals of Applied Statistics 5: 1229-1261.

[57] Ramirez, A., Ahmadian, Y., Schumacher, J., Schneider, D., Woolley, S. & Paninski, L. (2011). Incorporating naturalistic correlation structure improves spectrogram reconstruction from neuronal activity in the songbird auditory midbrain. J. Neurosci. 31: 3828-42.

[56] Escola, S., Fontanini, A., Katz, D. & Paninski, L. (2011). Hidden Markov models for the inference of neural states and improved estimation of linear receptive fields. Neural Computation 23: 1071-1132.

[55] Calabrese, A. & Paninski, L. (2011). Kalman filter mixture model for spike sorting of non-stationary data. J. Neurosci. Methods 196: 159-169.

[54] Calabrese, A., Schumacher, J., Schneider, D., Woolley, S. & Paninski, L. (2011). A penalized GLM approach for estimating spectrotemporal receptive fields from responses to natural sounds. PLoS One 6(1): e16104.

[53] Lewi, J., Schneider, D., Woolley, S. & Paninski, L. (2011). Automating the design of informative sequences of sensory stimuli. Journal of Computational Neuroscience 30: 181-200 (special issue on methods of information theory in neuroscience research).

[52] Ahmadian, Y., Pillow, J. & Paninski, L. (2011). Efficient Markov Chain Monte Carlo methods for decoding population spike trains. Neural Computation 23: 46-96.

[51] Pillow, J., Ahmadian, Y. & Paninski, L. (2011). Model-based decoding, information estimation, and change-point detection in multi-neuron spike trains. Neural Computation 23: 1-45.

[50] Vogelstein, J., Packer, A., Machado, T., Sippy, T., Babadi, B., Yuste, R. & Paninski, L. (2010). Fast non-negative deconvolution for spike train inference from calcium imaging. J. Neurophys. 104: 3691-3704.

[49] Field, G., Gauthier, J., Sher, A. et al. (2010). Functional connectivity in the retina at the resolution of photoreceptors. Nature 467: 673-677.

[48] Rahnama Rad, K. & Paninski, L. (2010). Efficient estimation of two-dimensional firing rate surfaces via Gaussian process methods. Network: Computation in Neural Systems 21: 142-68.

[47] Paninski, L., Ahmadian, Y., Ferreira, D., Koyama, S., Rahnama Rad, K., Vidne, M., Vogelstein, J. & Wu, W. (2010). A new look at state-space models for neural data. Journal of Computational Neuroscience (special issue on statistical analysis of neural data) 29: 107-126.

[46] Koyama, S. & Paninski, L. (2010). Efficient computation of the most likely path in integrate-and-fire and more general state-space models. Journal of Computational Neuroscience 29: 89-105.

[45] Lawhern, V., Wu, W., Hatsopoulos, N. & Paninski, L. (2010). Population neuronal decoding using a generalized linear model with hidden states. J. Neurosci. Methods 189: 267-280.

[44] Babadi, B., Casti, A., Xiao, Y. & Paninski, L. (2010). A generalized linear model of the impact of direct and indirect inputs to the lateral geniculate nucleus. Journal of Vision 10: 22.

[43] Field, R., Lary, J., Cohn, J., Paninski, L. & Shepard, K. (2010). A low-noise, single-photon avalanche diode in standard 0.13 micron complementary metal-oxide-semiconductor process. Applied Physics Letters 97: 211111.

[42] Paninski, L. (2010). Fast Kalman filtering on quasilinear dendritic trees. Journal of Computational Neuroscience 28: 211-28.

[41] Lalor, E., Ahmadian, Y. & Paninski, L. (2009). The relationship between optimal and biologically plausible decoding of stimulus velocity in the retina. Journal of the Optical Society of America A (special issue on ideal observers and efficiency) 26: B25-42.

[40] Vogelstein, J., Watson, B., Packer, A., Yuste, R., Jedynak, B. & Paninski, L. (2009). Spike inference from calcium imaging using sequential Monte Carlo methods. Biophysical Journal 97: 636-655.

[39] Wu, W., Kulkarni, J., Hatsopoulos, N. & Paninski, L. (2009). Neural decoding of goal-directed movements using a linear state-space model with hidden states. IEEE Trans. Neural Systems and Rehabilitation Engineering 17: 370-378.

[38] Escola, S., Eisele, M., Miller, K. & Paninski, L. (2009). Maximally reliable Markov chains under energy constraints. Neural Computation 21: 1863-912.

[37] Toyoizumi, T., Rahnama Rad, K. & Paninski, L. (2009). Mean-field approximations for coupled populations of generalized linear model spiking neurons. Neural Computation 21: 1203-1243.

[36] Huys, Q. & Paninski, L. (2009). Smoothing of, and parameter estimation from, noisy biophysical recordings. PLOS Computational Biology 5: e1000379.

[35] Lewi, J., Butera, R. & Paninski, L. (2009). Sequential optimal design of neurophysiology experiments. Neural Computation 21: 619-687.

[34] Fudenberg, G. & Paninski, L. (2009). Bayesian image recovery for low-SNR dendritic structures. IEEE Trans. Image Processing 18: 471-482.

[33] Lewi, J., Butera, R., Schneider, D., Woolley, S. & Paninski, L. (2008). Designing neurophysiology experiments to optimally constrain receptive field models along parametric submanifolds. NIPS.

[32] Paninski, L. (2008). A coincidence-based test for uniformity given very sparsely-sampled discrete data. IEEE Transactions on Information Theory 54: 4750-4755.

[31] Pillow, J., Shlens, J., Paninski, L., Sher, A., Litke, A., Chichilnisky, E. & Simoncelli, E. (2008). Spatiotemporal correlations and visual signaling in a complete neuronal population. Nature 454: 995-999.

[30] Paninski, L. & Yajima, M. (2008). Undersmoothed kernel entropy estimators. IEEE Transactions on Information Theory 54: 4384-4388.

[29] Kulkarni, J. & Paninski, L. (2008). Efficient analytic computational methods for state-space decoding of goal-directed movements. IEEE Signal Processing Magazine 25 (special issue on brain-computer interfaces): 78-86.

[28] Ahrens, M., Paninski, L. & Sahani, M. (2008). Inferring input nonlinearities in neural encoding models. Network: Computation in Neural Systems 19: 35-67.

[27] Paninski, L., Haith, A. & Szirtes, G. (2008). Differentiable integral equation methods for computing likelihoods in the stochastic integrate-and-fire model. J. Comput. Neuroscience 24: 69-79.

[26] Kulkarni, J. & Paninski, L. (2007). Common-input models for multiple neural spike train data. Network: Computation in Neural Systems 18: 375-407.

[25] Lewi, J., Butera, R. & Paninski, L. (2007). Efficient active learning with generalized linear models. Artificial Intelligence and Statistics (AISTATS) 11.

[24] Townsend, B., Paninski, L. & Lemon, R. (2006). Linear encoding of muscle activity in primary motor cortex and cerebellum. J. Neurophys. 96: 2578-92.

[23] Huys, Q., Ahrens, M. & Paninski, L. (2006). Efficient estimation of detailed single-neuron models. Journal of Neurophysiology 96: 872-890.

[22] Paninski, L. (2006). The spike-triggered average of the integrate-and-fire cell driven by Gaussian white noise. Neural Computation 18: 2592-2616.

[21] Paninski, L. (2006). The most likely voltage path and large deviations approximations for integrate-and-fire neurons. Journal of Computational Neuroscience 21: 71-87.

[20] Pillow, J., Paninski, L., Uzzell, V., Simoncelli, E. & Chichilnisky, E. (2005). Structure and precision of retinal responses analyzed with a noisy integrate-and-fire model. J. Neurosci. 25: 11003-13.

[19] Paninski, L. (2005). Inferring prior probabilities from Bayes-optimal behavior. Advances in Neural Information Processing 18.

[18] Shoham, S., Paninski, L., Fellows, M., Hatsopoulos, N., Donoghue, J. & Normann, R. (2005). Optimal decoding for a primary motor cortical brain-computer interface. IEEE Transactions on Biomedical Engineering 52: 1312-1322.

[17] Paninski, L. (2005). Asymptotic theory of information-theoretic experimental design. Neural Computation 17: 1480-1507.

[16] Paninski, L. (2004). Log-concavity results on Gaussian process methods for supervised and unsupervised learning. Advances in Neural Information Processing 17.

[15] Paninski, L. (2004). Variational minimax estimation of discrete distributions under Kullback-Leibler loss. Advances in Neural Information Processing 17.

[14] Paninski, L. (2004). Maximum likelihood estimation of cascade point-process neural encoding models. Network: Computation in Neural Systems 15: 243-262.

[13] Paninski, L., Pillow, J. & Simoncelli, E. (2004). Comparing integrate-and-fire-like models estimated using intra- and extra-cellular data. Neurocomputing 65: 379-385.

[12] Paninski, L., Pillow, J. & Simoncelli, E. (2004). Maximum likelihood estimation of a stochastic integrate-and-fire neural encoding model. Neural Computation 16: 2533-2561.

[11] Paninski, L. et al. (2004). Superlinear population encoding of dynamic hand trajectory in primary motor cortex. Journal of Neuroscience 24: 8551-8561.

[10] Paninski, L. (2004). Estimating entropy on m bins given fewer than m samples. IEEE Transactions on Information Theory 50: 2200-2203.

[9] Paninski, L., Fellows, M., Hatsopoulos, N. & Donoghue, J. (2004). Spatiotemporal tuning properties for hand position and velocity in motor cortical neurons. Journal of Neurophysiology 91: 515-532.

[8] Hatsopoulos, N., Paninski, L. & Donoghue, J. (2003). Sequential movement representations based on correlated neuronal activity. Experimental Brain Research 149: 478-486.

[7] Serruya, M., Hatsopoulos, N., Paninski, L., Fellows, M. & Donoghue, J. (2003). Robustness of neuroprosthetic decoding algorithms. Biological Cybernetics 88: 219-228.

[6] Paninski, L. (2003). Estimation of entropy and mutual information. Neural Computation 15: 1191-1253.

[5] Paninski, L. (2003). Convergence properties of three spike-triggered analysis techniques. Network: Computation in Neural Systems 14: 437-464. (Special issue on natural scene statistics and neural codes.)

[4] Paninski, L., Lau, B. & Reyes, A. (2003). Noise-driven adaptation: in vitro and mathematical analysis. Neurocomputing 52: 877-883.

[3] Serruya, M., Hatsopoulos, N., Paninski, L., Fellows, M. & Donoghue, J. (2002). Instant neural control of a movement signal. Nature 416: 141-142.

[2] Paninski, L. & Hawken, M. (2001). Stochastic optimal control and the human oculomotor system. Neurocomputing 38-40: 1511-1517.

[1] Hatsopoulos, N., Ojakangas, C., Paninski, L. & Donoghue, J. (1998). Information about movement direction obtained from synchronous activity of motor cortical neurons. PNAS 95: 15706-11.



Books

Paninski, L., Eden, U., Brown, E. & Kass, R. Statistical analysis of neurophysiological data. Under contract, Springer.

Gerstner, W., Kistler, W., Naud, R. & Paninski, L. (2013). Spiking neuron models (2nd ed.). Cambridge U. Press.



Invited book chapters

Yuste, R., Watson, B., Paninski, L., Vogelstein, J. (2009). Imaging action potentials with calcium indicators. Imaging Neurons: A Laboratory Manual, 2nd ed., eds. Yuste, R. & Konnerth, A., CSHL Press.

Paninski, L., Kass, R., Brown, E. & Iyengar, I. (2008). Statistical analysis of neuronal data via integrate-and-fire models. Stochastic Methods in Neuroscience, eds. Laing, C. & Lord, G., Oxford.

Paninski, L., Pillow, J. & Lewi, J. (2007). Statistical models for neural encoding, decoding, and optimal stimulus design. Computational Neuroscience: Progress in Brain Research, eds. Cisek, P., Drew, T. & Kalaska, J.

Simoncelli, E., Paninski, L., Pillow, J. & Schwartz, O. (2004). Characterization of neural responses with stochastic stimuli. Chapter 23 of The New Cognitive Neurosciences, 3rd ed., ed. Gazzaniga, M.



Grants

Collaborative Research in Computational Neuroscience, NEI R01 EY018003, co-PI w/ E. Simoncelli and E.J. Chichilnisky, 2006-12.
Gatsby Initiative in Brain Circuitry Pilot Grant, co-PI w/ S. Woolley, 2006-8.
Alfred P. Sloan Research Fellowship, 2007.
NSF Faculty Early Career Development (CAREER) IOS-0641912, 2007-.
McKnight Scholar award, 2008-.
Collaborative Research in Computational Neuroscience, NSF IIS-0904353, co-PI w/ R. Yuste, 2009-.
DARPA award, Reliable Neural-interface Technology program, co-PI w/ B. Pesaran, 2011-.
MURI award, "Imaging how a neuron computes," co-PI w/ R. Yuste et al., 2012-.



Other awards and honors

"Scientist to watch," The Scientist magazine, June 2007.
Named one of the top 35 innovators under 35 by MIT's Technology Review, 2006.
Honorable mention, outstanding student paper award (to J. Lewi), NIPS, 2006.
Royal Society International Research Fellowship, 2004.
Best student paper award (w/ J. Pillow), NIPS, 2003.
Howard Hughes Medical Institute Predoctoral Fellowship in Biological Sciences, 1999.
National Science Foundation Predoctoral Fellowship, 1999.
Royce Fellowship, Brown University, 1998.



Selected teaching

Co-instructor, Statistical analysis and modeling of neural systems (NYU), 2002.
Invited lecturer, Advanced European computational neuroscience course (Obidos), 2004.
STAT4107, Statistical inference, Columbia University, 2005.
STAT4315, Linear regression models, Columbia University, 2006.
STAT4109, Probability and statistical inference, Columbia University, 2006-2010.
STAT8285, Statistical analysis and modeling of neural spike train data, Columbia, 2007, 2009, 2011, 2012.
STAT6104, Computational statistics, Columbia, 2012-13.
Invited lecturer, Program in Comput. Bio., Gulbenkian Science Institute, Lisbon, 2007.
Invited lecturer, Computational Modeling of Neuronal Systems, NYU, 2007.
Invited lecturer, Ignorance, Columbia University Biology Dept., 2008.
Invited lecturer, Princeton PICASso program, 2008.
Invited lecturer, Okinawa Computational Neuroscience Course, 2009.
Invited lecturer, Kyoto University workshop on state-space analysis in neuroscience, 2010.



Advising

Postdoctoral research advisor: J. Kulkarni, Q. Huys, Y. Ahmadian, Y. Mishchenko, L. Badel, E. Pnevmatikakis, K. Sadeghi, A. Pakman
Ph.D. research advisor: S. Escola, J. Vogelstein, J. Lewi, M. Nikitchenko, K. Rahnama Rad, M. Vidne, A. Ramirez, D. Ferreira, A. Calabrese, C. Smith, T. Machado, D. Pfau, J. Merel
M.A. research advisor: M. Yajima, C. Gohil, J. Bahk, S. Keshri
Undergraduate research advisor: G. Fudenberg, J. Huggins



Other duties

Action editor: J. Comput. Neuro.

Program committee: COSYNE.

Reviewer: Bayesian Analysis; Biometrika; COLT08; CRC Press; Frontiers in Comput. Neuro.; IEEE Transactions on: Biomedical Engineering, Information Theory, Pattern Analysis and Machine Intelligence, Neural Networks, and Signal Processing; ISIT08; J. Comput. Neuro.; J. Machine Learning Research; J. Neurophysiology; J. Neuroscience; J. Optical Soc. Am. A; J. Physics A; J. Vision; Machine Learning; Nature; Nature Neurosci.; Network: Computation in Neural Systems; Neural Computation; Neuron; NIPS; Oxford University Press; PNAS; PLOS Comp. Bio.; Science; SIAM J. Appl. Math; Statistics in Medicine; Technometrics.

NSF review panelist: 2007-.

Co-organizer: Statistical analysis of neural data meeting, 2010-.

Co-organizer: COSYNE workshop on new techniques for online neural characterization and optimal control, 2011.


