The problem of overestimation of group-level variance parameters

John Lawson writes:

I have been experimenting with Bayesian methods to estimate variance components, and I have noticed that even when I use a noninformative prior, my estimates are never close to the method-of-moments or REML estimates. In every case I have tried, the sum of the Bayesian estimated variance components is larger than the sum of the estimates obtained by method of moments or REML.

For data sets I have used that arise from a simple one-way random effects model, the Bayesian estimate of the between-groups variance component is usually larger than the method-of-moments or REML estimate. When I use a uniform prior on the between-group standard deviation (as you recommended in your 2006 paper) rather than an inverse-gamma prior on the between-group variance component, the between-group variance component estimate is usually reduced. However, for the dyestuff data in Davies (1949, p. 74), the opposite appears to be the case.

I am worried that the Bayesian estimators of the variance components are too large. Do you have any comments or advice for me on this topic, such as what noninformative prior I should use? Any response from you would be greatly appreciated.
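As a point of reference for the comparison Lawson is making, the method-of-moments (ANOVA) estimator for a balanced one-way random-effects model can be sketched as follows. This is a minimal illustration with made-up data, not the dyestuff data from Davies; the function name is my own.

```python
import numpy as np

def mom_variance_components(groups):
    """Method-of-moments (ANOVA) estimates for a balanced one-way
    random-effects model y_ij = mu + a_j + e_ij.

    groups: 2D array of shape (J, n) -- J groups, n observations each.
    Returns (sigma2_within, sigma2_between). Note the between-group
    estimate can come out negative, which is the classic pathology
    that motivates REML and Bayesian alternatives.
    """
    groups = np.asarray(groups, dtype=float)
    J, n = groups.shape
    group_means = groups.mean(axis=1)
    grand_mean = groups.mean()
    # Mean squares from the one-way ANOVA decomposition
    ms_within = ((groups - group_means[:, None]) ** 2).sum() / (J * (n - 1))
    ms_between = n * ((group_means - grand_mean) ** 2).sum() / (J - 1)
    # E[MS_between] = sigma2_within + n * sigma2_between, so solve for it
    sigma2_between = (ms_between - ms_within) / n
    return ms_within, sigma2_between

# Illustrative data: 3 groups of 4 observations each (invented numbers)
data = np.array([[10., 12., 11., 13.],
                 [15., 14., 16., 17.],
                 [ 9., 10.,  8., 11.]])
print(mom_variance_components(data))
```

These are the point estimates that the Bayesian posterior summaries (posterior mean or median of each variance component) would be compared against.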

My reply:

In my 2006 paper I recommend a half-Cauchy prior distribution. If you set the scale on this prior to a reasonable value, it should reduce the tendency to overestimate the group-level variance. I’d also note that many statisticians actually like to overestimate the group-level variance, as this results in less shrinkage, and many statisticians are uncomfortable with shrinkage.
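The effect of the prior can be made concrete with a grid approximation to the marginal posterior of the between-group standard deviation tau, for the simple model y_j ~ N(theta_j, s_j^2), theta_j ~ N(mu, tau^2) with known standard errors s_j and a flat prior on mu (with mu integrated out analytically). This is a sketch under those assumptions, not code from the 2006 paper; the data are the familiar eight-schools numbers, used purely as an illustration.

```python
import numpy as np

def tau_marginal_posterior(y, s, tau_grid, log_prior):
    """Normalized grid approximation to p(tau | y) for the model
    y_j ~ N(theta_j, s_j^2), theta_j ~ N(mu, tau^2), flat prior on mu,
    with mu and the theta_j integrated out analytically."""
    y, s = np.asarray(y, float), np.asarray(s, float)
    logp = np.empty_like(tau_grid)
    for i, tau in enumerate(tau_grid):
        v = s ** 2 + tau ** 2                 # marginal variances of the y_j
        v_mu = 1.0 / np.sum(1.0 / v)          # posterior variance of mu given tau
        mu_hat = v_mu * np.sum(y / v)         # posterior mean of mu given tau
        logp[i] = (log_prior(tau)
                   + 0.5 * np.log(v_mu)
                   - 0.5 * np.sum(np.log(v))
                   - 0.5 * np.sum((y - mu_hat) ** 2 / v))
    p = np.exp(logp - logp.max())
    return p / p.sum()

# Priors on tau (log densities, up to additive constants)
half_cauchy = lambda A: (lambda tau: -np.log(1.0 + (tau / A) ** 2))
flat = lambda tau: 0.0   # uniform on the grid

# Eight-schools data: group effect estimates y_j with standard errors s_j
y = [28, 8, -3, 7, -1, 1, 18, 12]
s = [15, 10, 16, 11, 9, 11, 10, 18]
tau_grid = np.linspace(0.01, 100, 2000)
p_hc = tau_marginal_posterior(y, s, tau_grid, half_cauchy(25.0))
p_flat = tau_marginal_posterior(y, s, tau_grid, flat)
# The half-Cauchy prior pulls the posterior mean of tau down
# relative to the flat prior on the same grid
print((tau_grid * p_hc).sum(), (tau_grid * p_flat).sum())
```

Because the half-Cauchy density is decreasing in tau, its posterior is stochastically smaller than the flat-prior posterior; shrinking the scale A pulls the posterior mean of tau down further, which is the mechanism behind the advice above.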

1 thought on “The problem of overestimation of group-level variance parameters”

  1. In chapter 7 of his new book, Bayesian Analysis for the Social Sciences, Simon Jackman writes that using mcmcsamp to sample from the posterior for the variance from a model fit with lme4::lmer (using REML) underestimates the uncertainty about the variance and, at least in his example, also underestimates the variance itself.
