A funny thing happened when fitting a hierarchical model . . .

Peter Selb writes:

I have fitted a hierarchical model to a variable, SF, that is bounded on the unit interval and heavily skewed (therefore I'm assuming a beta distribution), using your extremely helpful bugs package for R. The data have a balanced panel structure, with 52 units each observed 10 times.
The location equation contains random unit intercepts and random slopes for the linear time index, T (plus a control, LQ2).
The model converges quite quickly. Both random effects vary greatly across units and exhibit strong positive correlation.

Now the puzzle:
Theory suggests that both random effects are functions of a group-level predictor, LNM.
When I extract mean random coefficients from the Bugs output and plot them against LNM, very strong positive and linear relationships show up in both cases.
However, when I include LNM as a group-level predictor in the model itself, its coefficients are indistinguishable from zero. I have no clue why this happens. I first thought it was due to the uncertainty in the random-effects estimates, but those are estimated quite precisely.

My reply:

I agree that this sounds strange. The first thing I would do (maybe you've already done this) is check your code against your scatterplots of the betas vs. LNM, right before the second Bugs run that includes LNM. (When this sort of thing has happened to me, it's often because I've accidentally changed one of the data vectors.)

The next step, I think, is to set up a fake-data simulation: simulate data from the model with known group-level coefficients on LNM, refit, and check that you recover them. See how that goes. Good luck!
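To make the fake-data idea concrete, here is a minimal sketch in Python (rather than the R/Bugs setup in the email, which the sketch does not reproduce). It simulates a balanced panel with the same shape as the one described, 52 units observed 10 times, with unit intercepts and slopes that truly depend on a group-level predictor, then runs the two-stage check the reader describes: fit each unit separately and regress the estimates on the predictor. All names and numeric values (`lnm`, the coefficients `g1` and `h1`, the noise scales) are made up for illustration, and the outcome is simulated on an unbounded Gaussian scale rather than as a beta-distributed variable on [0, 1].

```python
import numpy as np

rng = np.random.default_rng(0)

J, T = 52, 10                       # 52 units, 10 time points, as in the panel
lnm = rng.normal(0.0, 1.0, J)       # hypothetical group-level predictor "LNM"

# True group-level relationships (made-up values for the sketch):
g0, g1 = 1.0, 0.8                   # intercepts: a_j = g0 + g1 * lnm_j + noise
h0, h1 = 0.5, 0.6                   # slopes:     b_j = h0 + h1 * lnm_j + noise
a = g0 + g1 * lnm + rng.normal(0.0, 0.1, J)
b = h0 + h1 * lnm + rng.normal(0.0, 0.1, J)

# Simulate the panel (Gaussian noise; the real data are beta-distributed,
# which this sketch ignores).
t = np.arange(T)
y = a[:, None] + b[:, None] * t + rng.normal(0.0, 0.5, (J, T))

# Stage 1: per-unit least-squares fits, mimicking "extract the random
# coefficients" from the fitted model.
X = np.column_stack([np.ones(T), t])
coefs = np.array([np.linalg.lstsq(X, y[j], rcond=None)[0] for j in range(J)])

# Stage 2: regress the unit-level estimates on the group-level predictor.
Z = np.column_stack([np.ones(J), lnm])
slope_of_intercepts = np.linalg.lstsq(Z, coefs[:, 0], rcond=None)[0][1]
slope_of_slopes = np.linalg.lstsq(Z, coefs[:, 1], rcond=None)[0][1]
print(slope_of_intercepts, slope_of_slopes)   # should land near g1 and h1
```

If the full hierarchical fit on data simulated this way still returns a near-zero coefficient for the group-level predictor, the problem is in the model code; if it recovers the true values, the issue lies somewhere in how the real data are passed in.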