Statistical concept / technique | Neuroscience application |
---|---|
Point processes; conditional intensity functions | Neural spike trains; photon-limited image data |
Time-rescaling theorem for point processes | Fast simulation of network models; goodness-of-fit tests for spiking models |
Bias, consistency, principal components | Spike-triggered averaging; spike-triggered covariance |
Generalized linear models | Neural encoding models including spike-history effects; inferring network connectivity |
Regularization; shrinkage estimation | Maximum a posteriori estimation of high-dimensional neural encoding models |
Laplace approximation; Fisher information | Model-based decoding and information estimation; adaptive design of optimal stimuli |
Mixture models; EM algorithm; Dirichlet processes | Spike-sorting / clustering |
Optimization and convexity techniques | Spike-train decoding; ML estimation of encoding models |
Markov chain Monte Carlo: Metropolis-Hastings and hit-and-run algorithms | Firing rate estimation and spike-train decoding |
State-space models; sequential Monte Carlo / particle filtering | Decoding spike trains; optimal voltage smoothing |
Fast high-dimensional Kalman filtering | Optimal smoothing of voltage and calcium signals on large dendritic trees |
Markov processes; first-passage times; Fokker-Planck equation | Integrate-and-fire-based neural models |
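As a concrete illustration of the time-rescaling row above: if a point process has conditional intensity λ(t), then the rescaled interspike intervals z_k = Λ(t_k) − Λ(t_{k−1}), where Λ is the integrated intensity, are i.i.d. Exp(1) under the true model, which yields a Kolmogorov-Smirnov goodness-of-fit test. The sketch below uses a made-up sinusoidal intensity (all parameter values are hypothetical) and only numpy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical time-varying intensity (spikes/s) on [0, T]
T = 100.0
lam = lambda t: 20.0 + 15.0 * np.sin(2 * np.pi * t / 10.0)
lam_max = 35.0  # upper bound on lam, needed for thinning

# Simulate an inhomogeneous Poisson process by thinning:
# draw candidates from a rate-lam_max homogeneous process, keep each
# candidate t with probability lam(t) / lam_max
cand = rng.uniform(0, T, rng.poisson(lam_max * T))
cand.sort()
spikes = cand[rng.uniform(0, lam_max, cand.size) < lam(cand)]

# Time-rescaling: z_k = Lambda(t_k) - Lambda(t_{k-1}) should be iid Exp(1),
# where Lambda is the integrated intensity (here a Riemann sum on a fine grid)
grid = np.linspace(0, T, 100001)
Lam = np.concatenate([[0.0], np.cumsum(lam(grid[:-1]) * np.diff(grid))])
z = np.diff(np.interp(spikes, grid, Lam))

# Under the correct model, u = 1 - exp(-z) is Uniform(0, 1); compare its
# empirical CDF to the uniform CDF via a Kolmogorov-Smirnov statistic
u = np.sort(1.0 - np.exp(-z))
n = u.size
ks = np.max(np.abs(u - (np.arange(1, n + 1) - 0.5) / n))
```

With the true intensity plugged in, `ks` stays near zero (well below the usual 1.36/√n critical value); rescaling with a wrong intensity model inflates it, which is the basis of the goodness-of-fit test.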
Date | Topic | Reading | Notes |
---|---|---|---|
Sept 4 | Intro and overview | Paninski and Cunningham '18; International Brain Lab '17 | |
Sept 11 | Signal acquisition: spike sorting | Lewicki '98; Pachitariu et al '16; Lee et al '17; Calabrese and Paninski '11 | EM notes; Blei et al review on variational inference |
Sept 18 - Oct 9 | Signal acquisition: single-cell-resolution functional imaging | Overview: Pnevmatikakis and Paninski '18; Compression and denoising: Buchanan et al '18; Demixing: Pnevmatikakis et al '16; Zhou et al '18; Friedrich et al '17b; Lu et al '17; Giovanucci et al '17; Deconvolution: Deneux et al '16; Picardo et al '16; Friedrich et al '17a; Berens et al '18 | HMM tutorial by Rabiner; HMM notes |
Oct 16, 23 | Poisson regression models; estimating time-varying firing rates; hierarchical models for sharing information across cells | Kass et al '03; Wallstrom et al '08; Batty et al '17; Cadena et al '17; Seely et al '17 | Generalized linear model notes |
Oct 23 | Presentations of project ideas | | Just two minutes each |
Oct 30 | Expected log-likelihood; network models; optimal experimental design | Ramirez and Paninski '14; Field et al '10; Lewi et al '09; Shababo et al '13; Soudry et al '15 | |
Nov 6 | No class (University holiday) | | |
Nov 13, 20 | Point processes: Poisson process, renewal process, self-exciting process, Cox process; time-rescaling for goodness-of-fit tests and fast simulation of network models | Brown et al '01; Mena and Paninski '14 | Uri Eden's point process notes; supplementary notes |
Nov 27 | State space models; autoregressive models; Kalman filter; extended Kalman filter; fast tridiagonal methods. Applications in neural prosthetics, optimal smoothing of voltage/calcium traces, fitting common-input models for population spike train data | HMM tutorial by Rabiner; Kalman filter notes by Minka; Roweis and Ghahramani '99; Huys et al '06; Paninski et al '04; Jolivet et al '04; Beeman's notes on conductance-based neural modeling; Wu et al '05; Brown et al '98; Smith et al '04; Yu et al '05; Kulkarni and Paninski '08; Paninski et al '10; Vidne et al '12; Pfau et al '13; Gao et al '16 | state-space notes (need updating) |
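To make the state-space row above concrete, here is a minimal scalar Kalman filter sketch: a latent AR(1) state (e.g. a subthreshold voltage trace) observed in Gaussian noise, with the standard predict/update recursion. All parameter values are made up for illustration; the applications listed above (dendritic trees, population recordings) use high-dimensional versions with the fast tridiagonal solvers mentioned in the topic column.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical linear-Gaussian state-space model:
#   x_t = A x_{t-1} + w_t,  w_t ~ N(0, Q)   (latent state)
#   y_t = C x_t + v_t,      v_t ~ N(0, R)   (noisy observation)
A, Q, C, R = 0.98, 0.05, 1.0, 0.5
T = 400

# Simulate the latent trajectory and the observations
x = np.zeros(T)
for t in range(1, T):
    x[t] = A * x[t - 1] + np.sqrt(Q) * rng.normal()
y = C * x + np.sqrt(R) * rng.normal(size=T)

# Kalman filter: forward recursion over filtered mean m and variance P
m = np.zeros(T)
P = np.zeros(T)
m_pred, P_pred = 0.0, 1.0          # prior on the initial state
for t in range(T):
    K = P_pred * C / (C * P_pred * C + R)      # Kalman gain
    m[t] = m_pred + K * (y[t] - C * m_pred)    # update with observation y_t
    P[t] = (1 - K * C) * P_pred
    m_pred, P_pred = A * m[t], A * P[t] * A + Q  # predict one step ahead
```

The filtered estimate `m` has lower mean-squared error against the true state than the raw observations `y`, since the recursion optimally trades off the dynamics prior against each noisy measurement; the smoother (backward pass) improves this further.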