Covariances using orthogonal polynomials
Posted: Wed Apr 20, 2016 10:05 am
Hi all,
I am fitting a 3-level repeated measures model with a random intercept and slope at levels 2 (ID) and 3 (group), and a fixed quadratic time predictor that takes 3 values corresponding to the 3 measurement occasions.
Something like:
y_ijk = B_0ijk*cons + B_1jk * Time_ijk + B_2 * Time^2_ijk
B_0ijk = B_0 + v_0k + u_0jk + e_0ijk
B_1jk = B_1 + v_1k + u_1jk
I am fitting this model in MLwiN and in R to evaluate whether there are any differences. Now, if I fit the model with the time component "as is" (i.e., Time_ijk = {0,1,2} and Time^2_ijk = {0,1,4}), I run into the usual collinearity issues, but I get exactly the same output from both programmes (using IGLS), as expected.
If I use orthogonal coefficients, MLwiN and R use different contrasts, but the outcomes are comparable in everything (including -2*logLik) except for variances and covariances.
Specifically:
- The total level-2 and level-3 variances are comparable, but they are decomposed differently between the intercept and Time. For instance, MLwiN may return something like Var(Intercept|Group) = 35 and Var(Time|Group) = 5, whereas R returns Var(Intercept|Group) = 20 and Var(Time|Group) = 21. The sums are almost identical, but the decompositions differ.
- Even more puzzling, the covariances are not comparable at all. R outputs level-2 and level-3 covariances that are identical to what both programmes find when using "standard" (i.e., non-orthogonal) coefficients, whereas MLwiN returns something completely different (in this instance, both are rather large and negative in R but not different from 0 in MLwiN).
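For concreteness, the two kinds of orthogonal coding I have in mind can be sketched like this (a numpy sketch; the exact contrasts each package uses are my assumption, not something I have verified in either programme's source):

```python
import numpy as np

# Raw quadratic time design for the 3 occasions: columns (1, t, t^2).
t = np.array([0.0, 1.0, 2.0])
X = np.column_stack([np.ones(3), t, t**2])

# The raw linear and quadratic columns are highly correlated:
r = np.corrcoef(X[:, 1], X[:, 2])[0, 1]

# One orthogonal coding: QR decomposition, which yields orthonormal
# columns (roughly what R's poly() does, up to sign and scaling).
Q, _ = np.linalg.qr(X)

# Another orthogonal coding: integer-valued contrasts that are
# orthogonal to each other and to the constant, but not unit-length.
C = np.array([[-1.0,  1.0],
              [ 0.0, -2.0],
              [ 1.0,  1.0]])

print(round(r, 3))            # near-collinearity of the raw coding
print(np.round(Q.T @ Q, 10))  # identity: QR columns are orthonormal
print(C[:, 0] @ C[:, 1])      # 0.0: the integer contrasts are orthogonal too
```

Both codings remove the collinearity, but they differ by centring and scaling, which is why I would still expect the fits to agree.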
To summarise the covariance issue, the situation looks as follows (level-2 covariance shown; values illustrative):

Code:
........................|___R___|_MLwiN_|
standard coefficients...|___-2__|___-2__|
orthogonal coefficients.|___-2__|___0.1_|
I appreciate that the two programmes use different contrasts, but I fail to see how this may affect the sign of the covariances or the decomposition of the variances.
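Writing out the algebra, a recoding Time = a + c*Time_new transforms the random effects as (u0 + a*u1, c*u1), so the covariance matrix becomes A Omega A'. On a toy example this does move the variances and the covariance around (numpy sketch; the numbers are made up, not my actual estimates):

```python
import numpy as np

# Toy level-2 covariance matrix of (intercept, slope) under the raw
# coding Time = {0, 1, 2}.  Made-up numbers for illustration only.
omega = np.array([[35.0, -2.0],
                  [-2.0,  5.0]])

def recoded_cov(omega, a, c):
    """Random-effects covariance after recoding Time = a + c*Time_new.

    u0 + u1*Time = (u0 + a*u1) + (c*u1)*Time_new, so the new random
    effects are A @ (u0, u1) with A = [[1, a], [0, c]].
    """
    A = np.array([[1.0, a],
                  [0.0, c]])
    return A @ omega @ A.T

# Centring only (Time_new = Time - 1):
print(recoded_cov(omega, a=1.0, c=1.0))          # covariance flips from -2 to +3
# Centring and rescaling (Time_new = (Time - 1)/sqrt(2)):
print(recoded_cov(omega, a=1.0, c=np.sqrt(2.0))) # slope variance doubles as well
```

So a recoding can in principle change both the decomposition and the sign of the covariance; what I cannot reconcile is why R's orthogonal fit reproduces the raw-coding covariance exactly while MLwiN's does not.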
Thank you for any thoughts,
k.