Parameterization and Bayesian Modeling

My talk at the Institut Henri Poincaré tomorrow at 2pm:

Progress in statistical computation often leads to advances in statistical modeling. For example, it is surprisingly common that an existing model is reparameterized solely for computational purposes, but the new configuration then motivates a new family of models that is useful in applied statistics. One reason this phenomenon may not have been noticed in statistics is that reparameterizations do not change the likelihood. In a Bayesian framework, however, a transformation of parameters typically suggests a new family of prior distributions. We discuss examples in censored and truncated data, mixture modeling, multivariate imputation, stochastic processes, and multilevel models.
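To spell out the point about priors (my own illustration, not part of the abstract): suppose we reparameterize $\theta = g(\phi)$ with $g$ smooth and invertible. The likelihood is unchanged,
\[
p(y \mid \phi) = p\bigl(y \mid \theta = g(\phi)\bigr),
\]
but a prior placed directly on the new parameter, $\phi \sim p_\phi$, corresponds on the original scale to
\[
p(\theta) = p_\phi\bigl(g^{-1}(\theta)\bigr)\,\left|\frac{d}{d\theta}\,g^{-1}(\theta)\right|,
\]
which in general falls outside the family of priors one would naturally have written down for $\theta$. So a purely computational reparameterization, once a natural prior is attached to the new parameter, yields a genuinely new model.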

Here’s the first slide:

\begin{itemize}
\item<2-> The folk theorem
\item<3-> The Pinocchio principle
\item<4-> Latent variables, transformations, and Bayes
\item<5-> Examples
\begin{itemize}
\item<6-> Truncated and censored data
\item<7-> Modeling continuous data using an underlying discrete distirbution
\item<8-> Modeling discrete data using an underlying continuous distirbution
\item<9-> Parameter expansion for hierarchical models
\item<10-> Iterative algorithms and time processes
\end{itemize}
\end{itemize}