In the semiconductor manufacturing environment, it is critical to maintain product yield, cycle time, and quality because the process is costly and complex. Maintaining process stability and the ability to recover from out-of-control situations are of utmost importance for product quality, cost, and delivery.

Using a structured problem-solving approach has been highly successful in quickly and systematically identifying the root cause and solution, returning a product or process to normal in a disciplined way and within a minimal timeframe.

NXP has been developing and enhancing the capability and definition of structured problem solving, integrated with the well-known Six Sigma methodology.

This presentation shares how to enhance the effectiveness and efficiency of narrowing down multiple potential root causes into a few most likely ones by using statistical modeling and screening techniques in JMP that are integrated with basic Six Sigma tools, such as 8D, is/is-not analysis, and fault tree analysis.


Hello, everyone. My name is Akira Abe from NXP Semiconductors. Before we begin the presentation, I would like to briefly introduce myself. I lead the systematic problem-solving methodologies at NXP as a Master Black Belt.

Reflecting on the past 10 years, our environment has changed significantly with the breakthroughs in big data, machine learning, and AI. These methods were unimaginable when I took my Black Belt training. They are now easily accessible and easy to use with JMP. Today, I would like to share with you how these relatively new methods in JMP can contribute to problem-solving.

In the semiconductor manufacturing environment, it is critical to maintain product yield, cycle time, and quality because the process is costly and complex. Maintaining process stability and the ability to recover from out-of-control situations are of utmost importance for product quality, cost, and delivery.

Using a structured problem-solving approach has been highly successful in quickly and systematically identifying the root cause and solution, returning a product or process to normal in a disciplined way and within a minimal time frame. NXP has been developing and enhancing the capability and definition of structured problem-solving, integrated with the well-known Six Sigma methodology.

This presentation will share how to enhance the effectiveness and efficiency of narrowing down multiple potential root causes into a few most likely root causes by utilizing modeling and screening techniques in JMP, integrated with basic Six Sigma tools such as 8D, is/is-not analysis, and fault tree analysis (FTA).

Before the case study, here is a quick overview of semiconductor manufacturing. It involves hundreds of steps, such as ion implantation and patterning, to build circuits on silicon wafers. The passing chips become components, which we test before shipping to customers in markets such as automotive and IoT.

When electrical test failures occur, caused by unknown or complex factors, we apply the 8D problem-solving method. This allows us to systematically analyze all process steps, including materials, to identify the root cause.

The Six Sigma methodology visualizes the root cause identification process as a funnel. We begin with a broad set of potential root causes, then systematically narrow them down to ultimately identify the real root cause. This table summarizes the Six Sigma tools and the JMP tools used at each phase of the root cause identification process.

After identifying the potential root causes, the next step is identifying the most likely root causes, highlighted in orange. Then we use DOE to pinpoint the real root cause. The focus of this presentation is the JMP tools used in the most-likely-root-cause identification phase. We will demonstrate how to narrow down from potential root causes to the most likely root causes using the statistical screening and modeling capabilities in JMP.

Specifically, we will explore Predictor Screening, Lasso regression, PLS, and MDMCC, the Model Driven Multivariate Control Chart. This slide presents a simplified flow diagram of the 8D process. The initial step of root cause investigation is problem definition. In this phase, we analyze performance indicators using a control chart to visualize how performance degraded.
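The problem-definition step, visualizing degradation on a control chart, can be sketched as follows. This is a hypothetical Python analogue of what JMP's Control Chart Builder does interactively; the yield data are synthetic, with a deliberate degradation at index 80:

```python
# Minimal sketch of an individuals control chart with moving-range limits.
# Synthetic data: a stable period followed by a degraded period at t=80.
import numpy as np

rng = np.random.default_rng(4)
yield_pct = np.concatenate([rng.normal(98.5, 0.2, 80),    # stable period
                            rng.normal(97.8, 0.2, 20)])   # degraded period

center = yield_pct[:80].mean()                # limits from the stable period
mr_bar = np.abs(np.diff(yield_pct[:80])).mean()
sigma = mr_bar / 1.128                        # d2 constant for subgroup n=2
lcl, ucl = center - 3 * sigma, center + 3 * sigma
ooc = np.flatnonzero((yield_pct < lcl) | (yield_pct > ucl))
print(f"first out-of-control point at index {ooc[0]}")
```

Points falling outside the three-sigma limits mark where the indicator left its stable behavior, which is the signal that triggers the 8D investigation.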

The next step is identifying potential root causes. We use the cause-and-effect diagram, for example, to summarize possible contributing factors based on engineering knowledge. Then we collect the relevant data and apply the is/is-not analysis tool to narrow down the root causes. Then, as pre-analysis, we use FTA to consider the relationships.

The final step is root cause verification with DOE. In semiconductor manufacturing, even after narrowing down the most likely root causes, challenges remain. With many interactions among process output parameters and tool input parameters, these parameters may exhibit behavior that complicates the analysis. Additionally, PPM-level problems often lack sufficient statistical power for analysis. In such cases, a simple analysis may fail to detect critical differences, requiring more time and human resources to identify the actual root cause.

This table provides an example of how to choose the optimal statistical analysis tool based on the problem occurrence rate and the data type. Earlier, we briefly touched on machine learning. Modeling techniques such as decision trees and neural networks are not only useful for prediction but can also be used for root cause analysis.

JMP offers both screening and advanced modeling capabilities, which can also be used for identifying root causes. There are many screening and modeling tools available in JMP, each with its own strengths and limitations. Therefore, it is important to select the most appropriate tool or method case by case.

The first case study focuses on a high-frequency issue and the use of Predictor Screening analysis. Predictor Screening supports the narrowing-down-factors phase, which is highlighted in blue in the problem-solving flow diagram. It uses a tree-based modeling technique. It is an innovative approach that enables the identification of the most likely root causes from hundreds or even thousands of variables in big data.

The bootstrap forest can evaluate the contribution of each factor and screen out less significant factors. Predictor Screening is especially effective in cases where traditional multiple regression analysis struggles, such as when the signal in the response is weak or when variables are highly correlated.

Here is a case study using Predictor Screening. This example involves identifying the root cause of yield loss due to a shift in electrical probing parameters. Based on engineering knowledge, we selected 54 potential factors. Each factor has 500 observations for the analysis. As preparation, we checked for outliers and missing data to ensure data quality. Then we applied Predictor Screening to narrow down the most likely root causes from the data set.

In the Predictor Screening platform, we assigned the electrical probing parameter as Y and selected all 54 factors as X. Here are the analysis results from Predictor Screening. The contribution column indicates the relative contribution of each factor; factors with a higher contribution portion are more likely to be the most likely root causes. Based on this result, we selected the top 20 contributing factors and performed detailed modeling analysis afterward.
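A rough Python analogue of this screening step looks like the sketch below. The data are synthetic (54 factors, 500 observations, only three factors actually driving the response), and a single random forest's importance ranking stands in for JMP's bootstrap-forest contribution:

```python
# Hypothetical analogue of JMP's Predictor Screening: rank 54 candidate
# factors by random forest importance and keep the top 20 for modeling.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n, p = 500, 54
X = pd.DataFrame(rng.normal(size=(n, p)),
                 columns=[f"factor_{i:02d}" for i in range(p)])
# Only three factors truly drive the response (weak signal plus noise).
y = (0.8 * X["factor_03"] - 0.5 * X["factor_17"]
     + 0.3 * X["factor_40"] + rng.normal(scale=1.0, size=n))

forest = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
ranking = pd.Series(forest.feature_importances_,
                    index=X.columns).sort_values(ascending=False)
top20 = ranking.head(20)          # candidates for detailed modeling
print(ranking.index[0], "is the top-ranked factor")
```

The ranking plays the role of the contribution column: the true drivers float to the top even when the response signal is weak relative to the noise.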

Next is generalized regression, specifically Lasso regression, for high-frequency issues. It also supports the narrowing-down-factors phase. Lasso applies a penalty term that shrinks some regression coefficients to zero. As a result, it automatically selects the most likely root cause factors.
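The L1 shrinkage idea can be sketched in a few lines. This is an illustrative example on synthetic data, not the presenter's workflow; JMP's Generalized Regression platform performs the equivalent selection with validation-based tuning of the penalty:

```python
# Hedged sketch of Lasso selection: only three of 54 factors truly drive
# the response, and the L1 penalty drops most of the rest by shrinking
# their coefficients exactly to zero.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)
n, p = 500, 54
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[[3, 17, 40]] = [0.8, -0.5, 0.4]              # the only real effects
y = X @ beta + rng.normal(scale=1.0, size=n)

lasso = LassoCV(cv=5, random_state=1).fit(X, y)   # cross-validated penalty
selected = np.flatnonzero(lasso.coef_)            # survivors of the penalty
print(f"{len(selected)} of {p} factors retained:", selected)
```

Because the penalty drives weak coefficients exactly to zero, the surviving factors form the shortlist automatically, with no manual stepwise elimination.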

As we will see on the next slide, the Profiler platform in JMP includes the variable importance report, which helps assess the impact of each factor on the problem. Lasso regression is especially effective under these conditions.

Here is a case study. This is the same case as before: identifying the root cause of yield loss using 54 factors and a total of 500 observations. Lasso regression automatically shrinks some estimated regression coefficients to zero, effectively removing less important factors from the model.

In the variable importance report, the factor with the highlighted total effect percentage is considered to have the greatest impact on the yield loss. The variable importance index quantifies how much each root cause factor contributes to the shift. Compared to the previous Predictor Screening result, which narrowed the list down to 20 factors, Lasso automatically reduced it to just six key factors.

Next, PLS regression is particularly useful when many factors are highly correlated. When traditional multiple regression fails due to multicollinearity or a wide-and-shallow data set, PLS regression is the right tool. PLS used to be difficult to apply in real-world operations, but JMP has removed these barriers, making PLS modeling accessible at the manufacturing level.

In our semiconductor manufacturing environment, we often encounter strong correlations among variables such as process temperatures and pressures. In such cases, PLS regression has proven extremely valuable in identifying the most likely root causes.

Now, let us look at the case study using PLS regression. This is the same case as before: identifying the root cause of yield loss using 54 factors and a total of 500 observations. To begin the analysis, under Fit Model we selected this platform. For the personality, we selected PLS. Since the initial goal is screening, we selected main effects only in this case.

These are the VIP and coefficients reports from PLS regression. This plot is centered and standardized, allowing us to visually understand which variables are most likely to be root causes. Here is a color-enhanced version of the report for better clarity. Darker red, in this area, indicates the most likely root cause factors. From this analysis, we could identify six key factors. We then used these six key factors to perform a response surface DOE, including interaction terms, for deeper analysis.

The final case study focuses on a low-frequency issue and the use of MDMCC. MDMCC offers a comprehensive platform that supports narrowing down potential root cause factors, detecting the shift point, and pinpointing the real root cause parameters, all within a single integrated platform. It is a truly powerful tool. MDMCC uses data from the in-control period as a baseline and constructs a single control chart from hundreds or thousands of multivariate process variables.

This method helps detect process shifts, identify breakdowns in multivariate relationships or correlations, and determine which process parameters changed, leading to the issue. MDMCC is especially effective in these situations: when the issue is a PPM-level event, or when the response signal is weak.

Here is a case study. The first step in the MDMCC analysis is building a baseline model. We used 250 observations from the normal, in-control period to construct the baseline model. We then analyzed 1,250 observations using MDMCC to detect the shift point; any data point above the upper control limit indicates a deviation from the baseline model, suggesting a potential process shift.
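The baseline-then-monitor logic can be approximated with Hotelling's T-squared. This is a simplified sketch on synthetic data (250 baseline observations, 1,250 monitored, with a deliberate shift injected in one variable); JMP's MDMCC platform adds PCA-based modeling and the interactive drill-down described next:

```python
# Simplified analogue of a model driven multivariate control chart:
# Hotelling's T^2 of each monitored point against a baseline model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
p = 10                                         # process variables
baseline = rng.normal(size=(250, p))           # in-control reference period
mu = baseline.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

def t2(x):
    d = x - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)   # T^2 per observation

monitor = rng.normal(size=(1250, p))
monitor[-50:, 4] += 8.0                        # clear shift in one variable

n = len(baseline)                              # UCL for future observations
ucl = p * (n - 1) * (n + 1) / (n * (n - p)) * stats.f.ppf(0.999, p, n - p)
alarms = np.flatnonzero(t2(monitor) > ucl)
print(f"{len(alarms)} points above UCL; first alarm at index {alarms[0]}")
```

Points exceeding the F-distribution-based upper control limit are the shift candidates; decomposing each alarmed point's T-squared by variable is what the Pareto drill-down then visualizes.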

When hovering over a specific data point, a Pareto chart appears. The Pareto chart ranks the variables by their contribution to the model breakdown. The individual control charts are displayed by hovering over each bar in the Pareto chart. Through this review, we successfully identified which process step was contributing to the PPM-level yield loss. This approach enabled us to resolve a PPM-level yield loss issue in the semiconductor process with remarkable efficiency.

In summary, for high-frequency issues, Predictor Screening quickly identifies and ranks the most likely root causes; Lasso regression automatically excludes low-impact factors; and PLS regression handles correlated data and provides great visualization of factor influence. For low-frequency issues, MDMCC is especially effective for identifying the root cause without being affected by factor interactions. It also detects the shift point and the root cause in a single integrated workflow.

Also, these JMP tools help minimize complex preparation tasks such as checking for multicollinearity, manual data pre-processing, and recalculating control limits, greatly improving efficiency and usability in real-world problem-solving.

In conclusion, by integrating Six Sigma tools with JMP's advanced analytics, we can significantly enhance our problem-solving capabilities, enabling more effective and rapid resolution of complex challenges.

Key takeaways from today's session: combining the structured approach of Six Sigma with JMP's statistical modeling and screening tools allows for more efficient and impactful problem-solving. JMP handles large data sets with complex relationships, including multicollinearity, making it ideal for modern manufacturing and engineering environments.

Using JMP's screening and modeling capabilities, we can effectively narrow down from dozens of potential root causes to a few key factors, saving time and improving accuracy. We hope this presentation will support your efforts to drive faster and more reliable problem-solving.

Lastly, I thank everyone who supported me in this publication.

Presented At Discovery Summit 2025

Skill level

Intermediate

Published on 07-09-2025 08:58 AM by Community Manager | Updated on 10-28-2025 11:41 AM




Start:
Sun, Jun 1, 2025 09:00 AM EDT
End:
Sun, Jun 1, 2025 10:00 AM EDT