
"Quality time" with the Ramírez family

Award-winning authors Brenda Ramírez and José G. Ramírez were our guests for the March installment of Analytically Speaking. If you missed their webcast, it’s now available on demand. During the interview, they discussed how traditional quality techniques – when updated with the latest advances in analytics and data visualization – can yield new insights into processes, products and services.

Because of the large volume of questions submitted by our viewing audience during the live webcast, we were unable to address every inquiry. But the Ramírezes were kind enough to follow up on a few of them in this post-event blog interview.

Question: Sometimes the EMP process doesn't provide the equivalent of a PT ratio for evaluating the ability to discriminate conforming from nonconforming parts. What do you recommend for obtaining an assessment comparable to the classic AIAG-based PT ratio when using the EMP method?

Answer: The EMP platform in JMP can generate the traditional gauge R&R results by selecting “EMP Gauge R&R Results.” The precision-to-tolerance (PT) ratio can then be easily computed using the formula 6*(Gauge R&R Std Dev)/(USL-LSL). In the EMP context, for discriminating between conforming and nonconforming parts, the probable error can be used to define manufacturing specifications that provide a given likelihood that the measured part is within the customer specifications.
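For readers who want to see the arithmetic, here is a minimal sketch (in Python, with made-up numbers rather than real gauge data) of the PT ratio formula above and of manufacturing specifications tightened by the probable error; the tightening multiplier is an assumption chosen purely for illustration.

```python
# Illustrative sketch (not JMP code): PT ratio and probable error from the
# answer above. All numbers are hypothetical.

gauge_rr_sd = 0.8        # Gauge R&R standard deviation (hypothetical)
usl, lsl = 110.0, 90.0   # customer specification limits (hypothetical)

# PT ratio from the formula in the answer: 6 * (Gauge R&R Std Dev) / (USL - LSL)
pt_ratio = 6 * gauge_rr_sd / (usl - lsl)

# Probable error of a single measurement (EMP convention: 0.675 * measurement SD);
# manufacturing specs can be tightened inward by a multiple of the probable error.
probable_error = 0.675 * gauge_rr_sd
tightening = 1.0         # multiplier chosen from the desired likelihood (assumption)
mfg_lsl = lsl + tightening * probable_error
mfg_usl = usl - tightening * probable_error

print(f"PT ratio: {pt_ratio:.3f}")
print(f"Probable error: {probable_error:.3f}")
print(f"Manufacturing specs: [{mfg_lsl:.2f}, {mfg_usl:.2f}]")
```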

Question: Does an output that is unstable and unpredictable challenge us to evaluate whether we are measuring the right characteristics? Please give an example showing the difference between a process parameter, such as injection pressure, and a product characteristic, such as part thickness.

Answer: In our book, we describe the use of a process flowchart, which depicts the four dimensions of any process step: inputs, outputs, process knobs, and noise factors. Figure 2.1 illustrates these dimensions for a lamination step. The process knobs (process parameters) may include speed, temperature and pressure, while the outputs (product characteristics) may include the bond strength, thickness and delamination resistance. The bond strength of a laminate, for example, is determined by the inputs and process knobs and can also be affected by the noise factors. Design of experiments is used to characterize and optimize the process parameters, while manufacturing discipline and adherence to quality systems are needed to run the process in a repeatable manner over time. Ideally, an effective strategy monitors and controls the process inputs, process knobs and noise factors, ensuring that the process outputs meet the customer’s specifications.
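As a rough illustration of those four dimensions (a hypothetical sketch, not an excerpt from the book or its Figure 2.1), a lamination step could be described with a simple structure like this:

```python
# Minimal sketch of the four dimensions of a process step, using the lamination
# example from the answer above. The specific entries are illustrative only.

lamination_step = {
    "inputs": ["substrate", "adhesive film"],                          # what enters the step
    "process_knobs": ["speed", "temperature", "pressure"],             # parameters we can set
    "noise_factors": ["ambient humidity", "material lot"],             # variation we don't control
    "outputs": ["bond strength", "thickness", "delamination resistance"],  # product characteristics
}

for dimension, items in lamination_step.items():
    print(f"{dimension}: {', '.join(items)}")
```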

Question: Can your methodologies be used in a contact center environment?

Answer: Reliability methods can be used, for example, to estimate the median resolution time of a particular customer issue. These methodologies can also be used to evaluate the impact of eliminating particular causes (waiting for parts, wrong information, weather issues, etc.) on the estimated median resolution time.
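Here is a minimal sketch of that idea, assuming hypothetical resolution times and a lognormal life distribution (a common choice in reliability work, though not necessarily the one the Ramírezes would use for a given data set):

```python
# Illustrative sketch (assumptions, not the authors' workflow): fit a lognormal
# distribution to contact-center resolution times and estimate the median.
import numpy as np
from scipy import stats

# Hypothetical resolution times, in hours, for one issue type
times = np.array([2.1, 3.5, 1.8, 6.2, 4.0, 2.9, 8.5, 3.3, 5.1, 2.4])

# Fit a lognormal distribution with the location fixed at zero
shape, loc, scale = stats.lognorm.fit(times, floc=0)
median = stats.lognorm.median(shape, loc=loc, scale=scale)
print(f"Estimated median resolution time: {median:.2f} hours")

# Repeating the fit on data with a particular cause removed (e.g., "waiting for
# parts") would show the change in the estimated median attributable to that cause.
```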

Question: Can you comment on ways we might use JMP to analyze and optimize RF electronics data from antennas and microwave systems across multiple dimensions such as frequency spectrum, input power range, temperature range, etc.?

Answer: This is an area of application that is more suited to functional data analysis since the responses of interest (VSWR, return loss, attenuation, etc.) are curves rather than points. However, in my experience, there are certain frequencies within the frequency spectrum that are key for certain applications. These key frequencies can be used to monitor the process using process behavior charts, and experiments can be designed to optimize responses at those particular frequencies. There are also ways of summarizing the information within a given curve.
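As a rough sketch of monitoring a key frequency with a process behavior chart, the example below computes individuals-chart limits from hypothetical return-loss readings; the data and the choice of response are assumptions made for illustration.

```python
# Illustrative sketch (not JMP): individuals / moving-range limits for a response
# (e.g., return loss in dB) tracked at one key frequency. Data are hypothetical.
import numpy as np

readings = np.array([-18.2, -18.5, -17.9, -18.1, -18.7, -18.0, -18.4, -18.3])

mr = np.abs(np.diff(readings))   # moving ranges between consecutive readings
mr_bar = mr.mean()               # average moving range
center = readings.mean()

# Standard individuals-chart limits: center +/- 2.66 * average moving range
ucl = center + 2.66 * mr_bar
lcl = center - 2.66 * mr_bar
print(f"Center: {center:.2f} dB, limits: [{lcl:.2f}, {ucl:.2f}] dB")
```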

Question: In your experience, how do you handle an experiment run incorrectly by other parties, for example, one with so many factors that the data are difficult to narrow down by conventional cluster analysis or principal components?

Answer: In an experiment with too many factors and too few runs, we may not be able to estimate all of the main effects and important two-factor interactions that we care about. For those effects we can estimate, we need to look at the confounding structure to understand what we are actually estimating. For two-level designs, the JMP Screening platform (under Analyze > Modeling) is good at picking out big effects, but it does not formally test each effect. We can use this information, along with our engineering and scientific knowledge, to decide whether the effects are practically significant.
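One common way to pick out big effects in a saturated two-level design is Lenth's pseudo standard error; the sketch below illustrates that calculation with hypothetical effect estimates (it shows the general idea, not necessarily the exact computation JMP performs).

```python
# Illustrative sketch: Lenth's pseudo standard error (PSE) for judging which
# estimated effects stand out from the noise. Effect values are hypothetical.
import numpy as np

effects = np.array([12.5, -1.3, 0.8, 7.9, -0.5, 1.1, -0.9])  # hypothetical contrasts

# Initial scale estimate, then re-estimate after trimming unusually large effects
s0 = 1.5 * np.median(np.abs(effects))
pse = 1.5 * np.median(np.abs(effects)[np.abs(effects) < 2.5 * s0])

# Effects with a large ratio relative to the PSE are candidates for being
# practically important, subject to the confounding structure.
for name, effect in zip("ABCDEFG", effects):
    print(f"{name}: effect = {effect:6.1f}, ratio = {effect / pse:6.2f}")
```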

Question: You both have spent your entire careers working with engineers and scientists. What do you like most about working with them?

Answer: Yes, we have been collaborating with and learning from engineers and scientists for many years, working on some very interesting projects. These collaborations always challenge us to put the problem first and to use statistics as a catalyst for sensible solutions that have a business impact. Through this process, we have learned more about the subject matter, found the right balance of statistical sophistication and practical value, and honed our communication skills. As a result, we have found fun and interesting ways to teach statistical content in a relevant and useful way.

Question: What are your favorite features in JMP?

Answer: Some of the features we use the most are Summary within the Tables menu, the Graph Builder, the Control Chart Builder, the Custom Designer, and the Distribution platform, which we use to fit different distributions and to calculate Ppk with confidence intervals.
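For reference, the Ppk point estimate mentioned above comes from the overall standard deviation; a minimal sketch with hypothetical data follows (JMP's Distribution platform also reports the confidence intervals, which this sketch omits).

```python
# Illustrative sketch (hypothetical data, not JMP output): the Ppk point estimate,
# computed from the overall (long-term) sample standard deviation.
import numpy as np

data = np.array([10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 10.0, 10.4, 9.7, 10.2])
usl, lsl = 11.0, 9.0      # hypothetical specification limits

mean = data.mean()
s = data.std(ddof=1)      # overall sample standard deviation

ppk = min((usl - mean) / (3 * s), (mean - lsl) / (3 * s))
print(f"Ppk = {ppk:.2f}")
```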