Hi JMPers!
I'm trying to get a deeper understanding of the loglinear variance models, so I worked through the InjectionMolding example from the JMP documentation. Within JMP I can reproduce the results from the website, of course.
Additionally, I implemented an ML estimator in R (based on the Harvey paper), and I also tried the remlscore function from the statmod package (REML estimation as described by Smyth 2002). The odd result is that both R functions give quite similar estimates for the residual variance (~1.88 and ~1.90), but the corresponding number in JMP is quite different: ~6.69.
Is this just a difference in parametrization? All the other numbers match quite nicely.
Here is the R code, in case it helps:
library(statmod)

# Load the InjectionMolding sample data
d <- read.csv("Desktop/InjectionMolding.csv")

# Mean-model design matrix: intercept, MoldTemp, ScrewSpeed, and their 2FI
x <- cbind(rep(1, nrow(d)), d[, c(2, 3)], d[, 2] * d[, 3])
x <- apply(x, 2, as.numeric)
colnames(x) <- c("Const", "MoldTemp", "ScrewSpeed", "MoldXScrew")

# Variance-model design matrix: intercept and HoldTime
z <- cbind(rep(1, nrow(d)), d[, 4])
colnames(z) <- c("Const", "HoldTime")

# Response (column 9, Shrinkage)
y <- as.matrix(d[, 9])
colnames(y) <- c("Strength")

# REML estimation of the loglinear variance model (Smyth 2002)
reml_est <- remlscore(y = y, X = x, Z = z)
reml_est
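
One observation that makes me suspect a parametrization difference: if remlscore reports the variance intercept on the log scale while JMP reports it on the original scale, the numbers line up, since exp(1.90) ≈ 6.69. A quick check (assuming the dispersion coefficients are returned in the gamma component of the remlscore result, per the statmod documentation):

# If the discrepancy is only parametrization, exponentiating the
# log-scale variance intercept should reproduce JMP's value.
exp(1.90)            # ~6.69, which would match the JMP output
exp(reml_est$gamma)  # variance coefficients back-transformed to the original scale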