informative priorParam in random slope

Posted: Mon Jun 13, 2022 10:06 am
by wfleming
I'm trying to run a random slope model with MCMC estimation:

Code:

 runMLwiN(outcome ~ 1 + lev1_var + (1 + lev1_var | group_ID) + (1 | individual_ID),
          data = data,
          estoptions = list(EstM = 1, mcmcMeth = list(burnin = 500)))
However, the model won't converge because the 'prior variance matrix is not positive definite'. This happens because the variances and covariance in the level-2 matrix from IGLS are all 0.0000. While I'm not expecting variance at this level, I would still like to run the MCMC estimation and report its results, for the sake of reporting comparable results rather than reporting non-convergence or just the IGLS estimates. (I'm running a series of related models: some converge with very low variance (<0.1), while others fail because of the 0 variance in IGLS.)

To achieve this I believe I can set an informative prior (say, variance = 0.001) to get the chains going in the first place.

My best shot was:

Code:

 estoptions = list(EstM = 1,
                   mcmcMeth = list(burnin = 500,
                                   priorParam = list(rp2 = list("RP2_cov_Intercept_lev1_var" = c(estimate = 0.001, size = 500),
                                                                "RP2_var_lev1_var" = c(estimate = 0.001, size = 500)))))
How does the code look setting the priorParam estimate option?

Many thanks

Re: informative priorParam in random slope

Posted: Tue Jun 14, 2022 11:05 am
by ChrisCharlton
As you state, the 'prior variance matrix is not positive definite' message usually occurs when the starting values for the covariance matrix, used as the prior for MCMC, are invalid. Normally this matrix is obtained from the IGLS estimates; however, you can specify it manually via the startval option. You can see an example of this in the script replicating chapter 5 of the MCMC guide: https://www.bristol.ac.uk/cmm/media/r2m ... CGuide05.R.
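In case it helps later readers, here is a minimal sketch of how startval might be passed in estoptions. The variable names are taken from the original post; the element names (FP.b, RP.b), the starting values, and the ordering of the random parameters are assumptions to be checked against your own IGLS output before use:

```r
# Sketch only: replace the values below with ones sensible for your model.
# FP.b = starting values for the fixed part (intercept, slope) - assumed values.
# RP.b = starting values for the random parameters, in the order they appear
#        in the model output (here: RP2 var(Intercept), RP2 cov(Intercept, slope),
#        RP2 var(slope), RP1 var). The small positive entries make the level-2
#        covariance matrix positive definite instead of the 0.0000 IGLS estimates
#        (0.001^2 < 0.1 * 0.001, so the 2x2 block is positive definite).
startval <- list(
  FP.b = c(0.5, 0.1),
  RP.b = c(0.1, 0.001, 0.001, 0.5)
)

model <- runMLwiN(outcome ~ 1 + lev1_var + (1 + lev1_var | group_ID) + (1 | individual_ID),
                  data = data,
                  estoptions = list(EstM = 1,
                                    mcmcMeth = list(burnin = 500),
                                    startval = startval))
```

The chapter 5 script linked above shows the exact element names and ordering expected by the package, so it is worth mirroring that example rather than this sketch directly.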

Re: informative priorParam in random slope

Posted: Mon Jul 25, 2022 3:09 pm
by wfleming
This worked exactly as I wanted. Thanks very much for the help.