Hey @LEP !
In terms of how you set up the design, you're spot on! I'm glad to see those "easy buttons" were useful for you.
Regarding the analysis, you did pick the correct model. However, the ordering of the factors in the grouping section does matter, and the default order isn't correct here. The Operator factor should be at the bottom of the list, with BuildMachine at the top. I found this by using the Heterogeneity of Variance Tests and checking that the nesting structure was correct. It seems the platform treats Nested then Crossed as the first factor nested within the second, with both then crossed with the third. I'll be opening an internal ticket to address this issue for JMP 18. Thankfully, it's easy to fix interactively: just click and drag Operator to the bottom of the X, Grouping list box.
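If it helps to see it in JSL, here's a rough sketch (not a script saved from your table): the column names (:Y, :BuildMachine, :Part, :Operator) are placeholders, and you should compare the X and Model arguments against a script saved from your own Variability Chart. The key point is that the left-to-right order inside X() is the top-to-bottom order of the X, Grouping list, so listing Operator last is the same as dragging it to the bottom.

```jsl
Variability Chart(
	Y( :Y ),
	X( :BuildMachine, :Part, :Operator ), // Operator last = bottom of the X, Grouping list
	Model( "Nested then Crossed" )        // the model choice discussed above
);
```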
For testing the differences in means, I'm inclined to think you may need a different platform. The EMP table script opens the EMP Measurement Systems Analysis platform, which has a Bias Comparisons red triangle menu item (under the second red triangle, in the Measurement Systems Analysis for Y outline). This performs a graphical comparison, but not necessarily a formal hypothesis test. I'm not sure the script has the right structure for that analysis either, so I'll include it in my internal ticket. The other option is to run the Fit Model script with Operator and Part[BuildMachine] as random effects (Attributes >> Random Effect) and BuildMachine as a fixed effect. You can include all the crossed terms as random effects as well; there's a rough JSL sketch of this below.
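Here's roughly what that Fit Model setup looks like in JSL. This is only a sketch: the column names (:Y, :BuildMachine, :Operator, :Part) are placeholders for your own columns, so verify the effect list against the Fit Model dialog on your table before running it.

```jsl
Fit Model(
	Y( :Y ),
	Effects( :BuildMachine ),          // fixed effect we want to test
	Random Effects(
		:Operator,                     // random
		:Part[:BuildMachine],          // Part nested within BuildMachine, random
		:BuildMachine * :Operator      // add the other crossed terms the same way
	),
	Personality( "Standard Least Squares" ),
	Method( "REML" ),
	Run()
);
```

With that model, the Fixed Effect Tests report gives you the formal test for BuildMachine, and the REML Variance Component Estimates report covers the random terms.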
A quick note: if the variance component for BuildMachine is a significant contributor to the overall variation, that in itself would indicate meaningful differences among the machines. But if you're looking to quantify what those differences are, see my suggestions above.
Hope that helps!
Caleb