Tuesday, October 22, 2024
Executive Briefing Center 150
The semiconductor manufacturing industry stands on the brink of a transformative era, powered by advanced analytical techniques. This presentation delves into the application of predictive modeling and diagnostic analysis within JMP software to significantly enhance manufacturing outcomes, particularly during the crucial early sort and class test phases. By leveraging comprehensive parametric data collected across various stages of the semiconductor production process, we embark on a journey to refine the prediction of unit-level pass/fail outcomes and unearth the underlying causes of potential defects. Our study highlights the strategic use of JMP’s predictive modeling capabilities to accurately forecast the final system-level test status of semiconductor products. This approach not only allows for early detection of issues but also facilitates the implementation of corrective measures in a timely manner, thus ensuring higher yield rates and superior product quality. In parallel, diagnostic analysis within JMP offers a deep dive into the data, enabling manufacturers to identify and address root causes of failures across the intricate web of production processes. This presentation showcases real-world applications of these JMP features, demonstrating their pivotal role in streamlining semiconductor manufacturing workflows. See how predictive modeling and diagnostic analysis can be effectively employed to optimize production outcomes, reduce costs, and enhance product reliability. Join us in exploring the cutting-edge analytical strategies that promise to redefine the future of semiconductor manufacturing.
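The pass/fail forecasting idea can be sketched in miniature. The toy below simulates parametric sort data for a batch of units and fits a simple logistic model by gradient descent; the two parametrics, their distributions, and the failure mechanism are all invented for illustration (in the talk, this modeling is done with JMP's predictive modeling platforms, not hand-rolled code).

```python
import math
import random

random.seed(42)

# Simulate parametric sort data for 200 units. Units with high leakage and
# low drive current are (by construction) more likely to fail final test.
units = []
for _ in range(200):
    leakage = random.gauss(1.0, 0.3)   # hypothetical normalized parametric
    drive = random.gauss(1.0, 0.2)     # hypothetical normalized parametric
    true_logit = 4.0 * (leakage - drive)
    fail = random.random() < 1 / (1 + math.exp(-true_logit))
    units.append((leakage, drive, fail))

# Fit P(fail) = sigmoid(b0 + b1*leakage + b2*drive) by batch gradient descent.
b0 = b1 = b2 = 0.0
for _ in range(2000):
    g0 = g1 = g2 = 0.0
    for leakage, drive, fail in units:
        p = 1 / (1 + math.exp(-(b0 + b1 * leakage + b2 * drive)))
        err = p - fail                 # bool is treated as 0/1
        g0 += err
        g1 += err * leakage
        g2 += err * drive
    n = len(units)
    b0 -= 0.5 * g0 / n
    b1 -= 0.5 * g1 / n
    b2 -= 0.5 * g2 / n

# In-sample accuracy of the fitted pass/fail forecast (illustration only).
predict = lambda l, d: 1 / (1 + math.exp(-(b0 + b1 * l + b2 * d))) >= 0.5
accuracy = sum(predict(l, d) == f for l, d, f in units) / len(units)
```

A model like this, trained on early sort data, is what enables flagging at-risk units before they reach system-level test.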
Tuesday, October 22, 2024
Executive Briefing Center 8
Once you’ve learned how easy it is to design an experiment in JMP, you never look at the world around you the same. Everything becomes an opportunity for an experiment! This presentation uses a practical example to demonstrate the process of design of experiments (DOE), including designing the experiment, modeling the results, and optimizing the inputs to provide the most desirable output. Attendees at last year’s Discovery conference were treated to an evening of unique fun: hitting glow-in-the-dark golf balls on the driving range at Indian Wells Golf Resort. The driving range has Toptracer technology that monitors each shot. Total distance, carry, ball speed, launch angle, and curve are some of the variables reported with each shot. A driving range that provides so much data offered a perfect opportunity to design an experiment using JMP! After an evening with fellow JMP users and friends, an experiment was designed using the Custom Designer in JMP. The design took only minutes to create. Input variables based on the golf stance setup were used in the design. These included variables such as grip, club head alignment, stance width, and ball location. The designed experiment was executed on the driving range, a model was created, and optimum settings to create the longest and straightest shot were discovered. The modeling and optimization were completed in minutes, while still on the driving range! This allowed confirmation runs to be performed immediately. The benefits were later transferred to the golf course as well.
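The design–model–optimize loop described above can be mimicked in a few lines. The sketch below uses a hypothetical two-level factorial on the four stance factors and a simulated distance response standing in for the real Toptracer data; the "true" effect sizes are invented, and JMP's Custom Designer and Profiler do all of this interactively.

```python
import itertools
import random

random.seed(1)
factors = ["grip", "alignment", "stance_width", "ball_location"]

# Hidden "true" main effects on carry distance (yards) -- simulation only.
true_effects = {"grip": 6.0, "alignment": 10.0,
                "stance_width": -3.0, "ball_location": 4.0}

def shot_distance(run):
    """Simulated Toptracer distance for one shot at coded settings (-1/+1)."""
    return 230 + sum(true_effects[f] * run[f] for f in factors) + random.gauss(0, 2)

# Full 2^4 factorial design: one run (shot) per corner, 16 shots total.
design = [dict(zip(factors, levels))
          for levels in itertools.product([-1, +1], repeat=len(factors))]
results = [(run, shot_distance(run)) for run in design]

# Main-effect estimate for each factor: half the high-mean minus low-mean gap.
effects = {}
for f in factors:
    hi = [y for run, y in results if run[f] == +1]
    lo = [y for run, y in results if run[f] == -1]
    effects[f] = (sum(hi) / len(hi) - sum(lo) / len(lo)) / 2

# "Optimize": set each factor to the side its estimated effect favors.
best_settings = {f: (+1 if effects[f] > 0 else -1) for f in factors}
```

The estimated effects recover the simulated ones closely, which is exactly why confirmation runs at `best_settings` could be hit immediately on the range.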
Tuesday, October 22, 2024
Executive Briefing Center 9
There have been numerous studies showing the efficacy of strategies in process optimization. The most common comparison is between ‘one factor at a time’ (OFAT) experiments and a ‘design of experiments’ (DOE) approach. When faced with an unfamiliar, high-dimensional process space (e.g., >10 factors), researchers often resort to OFAT methods because they are easy to interpret. Generally, it would be cost-prohibitive and logistically challenging to run multiple experiments geared toward the same objective just to evaluate which strategy outperforms the others. To circumvent these issues, we used a polymerase chain reaction (PCR) simulator with 12 unfamiliar continuous and categorical factors to explore these questions. Our team brings decades of experience in process optimization in the electronic materials industry (former employees of Apple and others). We intentionally sought and selected a simulator from a research area completely unknown to us that can simulate a large number of factors and their complex interactions on many responses. To automate experimentation, we used a Python web automation script. By using a simulator and our script, we can run through many experiments while mimicking real-life constraints and experimental budgets as seen in our own professional careers. While adhering to run budget rules, we compare the efficiency and accuracy of four strategies: two OFAT-type strategies as commonly used in industry, and two strategies from the DOE and advanced DOE genre. JMP is used for all experimental analyses and modeling, and an objective attempt is made to compare the strategies.
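A minimal illustration of the OFAT-versus-DOE comparison at the heart of this talk: on a toy two-factor process with a strong interaction (invented here; the actual study used a 12-factor PCR simulator), OFAT can stall at its baseline corner while a factorial on the same run budget finds the true optimum.

```python
import itertools

def yield_pct(a, b):
    """Deterministic toy process (coded factors) with a strong A*B interaction."""
    return 60 + 2 * a + 2 * b + 10 * a * b

# --- OFAT: start from a baseline recipe and move one factor at a time ---
a, b = -1, -1                              # baseline run
if yield_pct(+1, b) > yield_pct(a, b):     # trial run moving A alone
    a = +1
if yield_pct(a, +1) > yield_pct(a, b):     # trial run moving B alone
    b = +1
ofat_best = yield_pct(a, b)

# --- 2^2 factorial: all four corners, a comparable 4-run budget ---
factorial_best = max(yield_pct(x, y)
                     for x, y in itertools.product([-1, +1], repeat=2))
```

Here moving either factor alone away from the baseline hurts yield, so OFAT never reaches the (+1, +1) corner that the factorial design evaluates as a matter of course.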
Wednesday, October 23, 2024
Executive Briefing Center 7
Design of experiments (DOE) is a statistical method that guides the execution of experiments, analyzes them to detect the relevant variables, and optimizes the process or phenomenon under investigation. The use of DOE in product development can result in products that are easier and cheaper to manufacture, have enhanced performance and reliability, and require shorter product design and development times. Nowadays, machine learning (ML) is widely adopted as a data analytics tool due to the increasing availability of large and complex sets of data. However, not all applications can afford big data. For example, in the pharma and chemical industries, experimental data sets are typically small due to cost constraints and the time needed to generate the valuable data. Nevertheless, incorporating machine learning into experimental design has proved to be an effective way to optimize formulations with small data sets that can be collected more cheaply and quickly. This presentation has three parts. First, the literature relevant to machine learning-assisted experimental design is briefly summarized. Next, an adhesive case is presented to illustrate the efficiency of combining experimental design and machine learning to reduce the number of experiments needed for identifying the design space with an optimized catalyst package. In the third part, which pertains to an industrial sealant application, we use response surface data to compare the prediction error of the RSM model with models from various machine learning algorithms (RF, SVR, Lasso, SVEM, and XGBoost) using validation data runs within and outside the design space.
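The third-part comparison can be sketched in miniature: fit an RSM-style quadratic and a flexible learner (a crude 1-nearest-neighbor stand-in here for RF/SVR/etc.) to the same small design, then compare validation error inside and outside the design space. The response surface, design points, and validation runs below are all hypothetical.

```python
import math

truth = lambda x: 2 + 3 * x - 1.5 * x * x   # hidden response surface (1 factor)
train_x = [-1.0, -0.5, 0.0, 0.5, 1.0]        # small symmetric design, coded units
train_y = [truth(x) for x in train_x]

# RSM-style quadratic fit. The symmetric design makes sum(x) = sum(x^3) = 0,
# so the normal equations decouple into a 1x1 solve and a 2x2 solve.
n = len(train_x)
sx2 = sum(x * x for x in train_x)
sx4 = sum(x ** 4 for x in train_x)
sy = sum(train_y)
sxy = sum(x * y for x, y in zip(train_x, train_y))
sx2y = sum(x * x * y for x, y in zip(train_x, train_y))
b1 = sxy / sx2
det = n * sx4 - sx2 * sx2
b0 = (sy * sx4 - sx2 * sx2y) / det
b2 = (n * sx2y - sx2 * sy) / det
rsm = lambda x: b0 + b1 * x + b2 * x * x

# 1-nearest-neighbor: a simple stand-in for the flexible ML learners.
nn = lambda x: train_y[min(range(n), key=lambda i: abs(train_x[i] - x))]

def rmse(model, xs):
    return math.sqrt(sum((model(x) - truth(x)) ** 2 for x in xs) / len(xs))

inside = [-0.75, -0.25, 0.25, 0.75]   # validation runs within the design space
outside = [1.5, 2.0]                   # validation runs outside it

rsm_in, nn_in = rmse(rsm, inside), rmse(nn, inside)
rsm_out, nn_out = rmse(rsm, outside), rmse(nn, outside)
```

With a truly quadratic response the RSM model wins everywhere, and the nearest-neighbor model degrades further outside the design space, where it can only repeat the closest training value; real data with lack-of-fit is exactly where the comparison becomes interesting.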
Wednesday, October 23, 2024
Executive Briefing Center 9
Excursions can lead to significant costs at a manufacturing facility. In the semiconductor industry, production downtime, scrapped and low-yield wafers, and decreased output can result in substantial revenue losses. Engineers tasked with investigating these excursions must quickly uncover actionable insights for data-driven decisions that minimize downtime and improve product quality. JMP's Analytic Workflow can help you outline the steps and tools needed to efficiently investigate excursions. Explore how Query Builder facilitates data access from databases and how JMP's powerful Tables menu aids in data manipulation. Discover optimal table formats for visualizing and analyzing wafer map data. Utilize exploratory and analytical techniques to discover hidden relationships across manufacturing process steps. Enhance your analysis with JMP Pro, leveraging features like image analysis in the new Torch Deep Learning Add-In for JMP Pro 18 to gain advanced insights. Automate and rerun your entire analysis using Workflow Builder in JMP, ensuring speed, repeatability, flexibility, and analytical power without the need for coding. Learn how leveraging these tools and techniques in JMP and JMP Pro can lead to efficient resolution and substantial cost savings in a fast-paced manufacturing environment. Note: The Workflow contained in the "Semiconductor Defects Workflow.zip" folder uses CSV import instead of Query Builder, although Query Builder will be used in the presentation.
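The wafer-map table-format step can be illustrated outside JMP: stacked (x, y, bin) records, the shape a database query typically returns, are pivoted into a grid suitable for a heat map. The coordinates and bin codes below are hypothetical; in JMP this corresponds roughly to a Tables-menu split feeding Graph Builder.

```python
# Stacked records as a wafer-sort query might return them: (die_x, die_y, bin).
# Hypothetical data; bin 1 = pass, bin 7 = defect.
records = [
    (0, 0, 1), (1, 0, 1), (2, 0, 7),
    (0, 1, 1), (1, 1, 7), (2, 1, 7),
    (0, 2, 1), (1, 2, 1), (2, 2, 1),
]

# Pivot long-format records into a 2-D grid (the heat-map-ready layout).
width = max(x for x, _, _ in records) + 1
height = max(y for _, y, _ in records) + 1
wafer = [[None] * width for _ in range(height)]
for x, y, b in records:
    wafer[y][x] = b

# A quick excursion check: here the defects cluster on one edge of the map,
# the kind of spatial signature that points back to a specific process step.
defect_count = sum(row.count(7) for row in wafer)
```

Once the data is in this wide layout, spatial patterns such as edge rings or scratches become visible at a glance.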
Wednesday, October 23, 2024
Executive Briefing Center 150
Coral reefs across the planet are threatened by the rising seawater temperatures driven by climate change. Although many corals indeed "bleach," and consequently perish, as a result of prolonged exposure to abnormally high temperatures, some species (or even genotypes within a single species) maintain a marked level of climate resilience. Historically, we have identified these "super corals" in post-hoc fashion: searching through the proverbial rubble of a highly impacted reef to find the survivors. For coral reef restoration and other purposes, a more targeted, proactive means of identifying climate-resilient corals would be preferred to this "needle in a haystack" approach. To this end, I showcase a rich coral eco-physiological data set acquired during a month-long research expedition to the most remote corners of the Micronesian nation of Palau. After some rudimentary data processing and visualization, I show, using JMP Pro 17, how predictive models of coral resilience can be built relatively easily. I then demonstrate how GUIs derived from the models' prediction profilers can be embedded on web pages so that they can be used by scientists as a planning tool. Specifically, the model-based profilers allow researchers to predict the environmental conditions (e.g., depth, type of coral reef, salinity) at which they are most likely to find resilient corals during their bioprospecting surveys. This analytical tool will therefore aid marine biologists in locating corals with high climate tolerance that should be propagated in efforts to restore degraded reefs. 
Often involving sensitive reagents and complex, unstable products, the synthesis of organometallic catalysts can be challenging. Relatively weak bonds between metal centers and coordinating groups mean that aquo and dioxygen ligands can interrupt the desired molecular structure, frequently necessitating oxygen- and water-free working conditions. Following synthesis and characterization, the optimal conditions for these catalysts must be found. Costly and unsustainable metals such as rhodium, iridium, and palladium often form the centers of catalysts, and thus their consumption must be minimized. In this work, the relevance of Easy DOE to the optimization and analysis of three iridium catalysts is discussed in two groups of variables. The first set of conditions informs what the best working conditions of the catalysts are and helps outline their capabilities. The variables tested are the catalyst substituent (crown ether, methoxyethyl, or methyl), the substrate, the addition of sodium or lithium salts, and the addition of water. The second set of conditions forms an evaluation of environmental friendliness, nodding to Anastas and Warner’s criteria for green chemistry. The variables tested are the solvent (traditional solvents such as dichloromethane vs. greener choices such as acetonitrile), the pressure of hydrogen, and the sensitivity to oxygen. Easy DOE is employed to design these runs, slimming the input required to obtain meaningful data. Together, these two sets of conditions give a picture of the chemical environment that best suits the catalysts, as well as how to tune this chemistry toward greener practice.
Wednesday, October 23, 2024
Ped 2
Microtiter plate maps are a standard tool in laboratory experiments, allowing scientists to investigate physical, chemical, and/or biological reactions of test articles in various assays. Traditional data visualization methods of microtiter plates are often inefficient when conveying relationships unique to plate data, including capturing both the spatial and temporal sources of variability. To address this problem, we created a JMP dashboard to visualize plate maps, providing users with unique insights into the spatial distribution and elements of their data. The dashboard facilitates easy visualization and exploratory data analysis through multiple interactive views of heat maps, scatter plots, and dose response curves on data simulated to highlight typical issues encountered in plate experiments. Users can dynamically switch between views, customize visualizations, and interact with individual or groups of data points that warrant probing. Additionally, the dashboard supports data filtering and annotation through row labeling, enhancing the interpretability and utility of plate map visualizations. The dashboard also serves as an effective tool in communicating data quality and potential areas of concern, such as plate and/or lab variability, or other process errors that may exist in the data. By facilitating flexible exploration and analysis of complex data sets, the dashboard empowers users to gain deeper insights from their plate-based experiments, accelerating scientific discovery and knowledge generation.
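The spatial bookkeeping behind such a dashboard can be sketched briefly: well IDs like "B7" are decoded to (row, column) indices so measurements can be laid out as an 8 x 12 heat map, and a simple edge-versus-interior comparison flags a common plate artifact. The simulated evaporation effect and all numbers below are illustrative, not from the talk's data.

```python
import random
import statistics

random.seed(7)
ROWS, COLS = "ABCDEFGH", range(1, 13)   # standard 96-well geometry

def well_to_rc(well):
    """Decode a well ID like 'B7' into zero-based (row, column) indices."""
    return ROWS.index(well[0]), int(well[1:]) - 1

# Simulated plate with a classic artifact: evaporation depresses edge wells.
plate = {}
for r in ROWS:
    for c in COLS:
        on_edge = r in "AH" or c in (1, 12)
        plate[f"{r}{c}"] = random.gauss(80 if on_edge else 100, 3)

# Pivot the well dictionary into the 8 x 12 matrix a heat map displays.
grid = [[0.0] * 12 for _ in range(8)]
for well, value in plate.items():
    i, j = well_to_rc(well)
    grid[i][j] = value

# A simple spatial QC summary: edge wells versus interior wells.
def is_edge(well):
    i, j = well_to_rc(well)
    return i in (0, 7) or j in (0, 11)

edge_mean = statistics.mean(v for w, v in plate.items() if is_edge(w))
inner_mean = statistics.mean(v for w, v in plate.items() if not is_edge(w))
```

A heat map built on `grid` makes the depressed perimeter visible immediately, which is the kind of spatial insight a flat table of well values hides.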
Wednesday, October 23, 2024
Ped 7
A marine drilling riser system is used in offshore exploration as a conduit connecting the drilling vessel with the subsea well. It is a complex structural subsea piping system, commonly constructed from 75-90-foot-long joints that are sequentially assembled until they reach the wellhead, sometimes at water depths exceeding 10,000 feet. In a recertification project of a riser system meant to ensure compliance with regulatory requirements, inspection findings strongly indicated that the system had been exposed to an accelerated corrosion process. Corrosion rates for carbon steel in a seawater-submerged application are normally measured at 0.1-0.4 mm per year. The inspection data showed localized corrosion rates exceeding 4 mm per year. Thirty riser joints were completely disassembled and inspected. However, 65 riser joints were inaccessible, as they were located offshore and already in service. To quantify the operational risks and estimate the probability of non-compliance with the governing code, it became urgently necessary to extrapolate the corrosion data from the 30 inspected units to the 65 inaccessible units. Data distributions from the sample of 30 riser joints were used to run Monte Carlo simulations, using transfer equations modelled through a Fast Flexible Filling Design DOE in which the responses were generated through deterministic computer simulations. While the results of the simulations showed that the risk of non-compliance was unacceptable if the system was utilized to its design limits, even a slight reduction of the pressure level in the pipes reduced the risk of non-compliance to acceptable levels.
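The extrapolation step can be sketched as a plain Monte Carlo: resample a corrosion-rate distribution fitted to the inspected joints and estimate the probability that wall loss on an uninspected joint exceeds a code allowance. Every number below (the lognormal parameters, service years, and the 6 mm and 8 mm allowances) is hypothetical; the actual study drew its transfer equations from a Fast Flexible Filling design.

```python
import random

random.seed(0)

# Hypothetical lognormal fit to the localized corrosion rates (mm/year)
# measured on the 30 inspected joints.
def sample_rate():
    return random.lognormvariate(0.3, 0.6)

YEARS_IN_SERVICE = 3     # assumed exposure time for the uninspected joints
N = 100_000              # Monte Carlo draws

losses = [sample_rate() * YEARS_IN_SERVICE for _ in range(N)]

# Probability of exceeding the corrosion allowance at full design pressure
# (6 mm, hypothetical) versus after a pressure derating that raises the
# tolerable wall loss to 8 mm (also hypothetical).
p_design = sum(loss > 6.0 for loss in losses) / N
p_derated = sum(loss > 8.0 for loss in losses) / N
```

The derated probability is substantially lower than the design-limit probability, mirroring the study's finding that a modest pressure reduction moved the non-compliance risk into acceptable territory.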
Wednesday, October 23, 2024
Ped 7
Fiber photometry is a cutting-edge technique that captures real-time in-vivo brain activity in laboratory animals. Often aligned with video-tracked behavioral data, fiber photometry allows scientists to directly pair observed behavior and brain signaling. While fiber photometry is useful for novel behavioral experiments, each experimental step (data collection, processing, analysis) introduces the possibility of error. As such, ensuring research is reproducible is critical, but the size of the data sets (up to 0.25 gigabytes) makes quality control a cumbersome task. JMP Graph Builder provides an excellent interface for dynamically and efficiently identifying quality control issues. In a project using fiber photometry to understand how norepinephrine signaling accompanies fear behavior in rodents, we used JMP to detect misalignment between the Ethovision XT-tracked behavior data and the fiber photometry data, to provide easy identification of behavioral tracking disruptions, and to ensure that expected patterns were present. These quality control checks allowed for timely understanding of, intervention in, and correction of the data set, thus promoting research integrity. Additionally, the quality control plots gave scientists a novel and insightful way to understand their experiments. Graph Builder provided data quality checks for a sound analysis via JMP’s modeling platform. The scientists were also able to use JMP scripts, and the process motivated them to learn JMP themselves. In this talk, we showcase the problem of quality control for large, complicated data sets, as exemplified through fiber photometry, and how JMP’s graphing capabilities allowed us to ensure quality data and reproducible research.
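One of the alignment checks described can be sketched simply: compare matched event timestamps from the two streams and flag the recording when their mean offset exceeds a tolerance. The event times, the 0.5 s clock shift, and the 0.1 s tolerance below are hypothetical stand-ins for the Ethovision XT and photometry clocks.

```python
# Matched event times (seconds) recorded by the two systems; the behavioral
# stream and the 0.5 s clock shift are invented for illustration.
behavior_events = [2.0, 5.5, 9.0, 14.2]                  # video-tracking clock
photometry_events = [t + 0.5 for t in behavior_events]   # photometry clock

TOLERANCE = 0.1   # seconds of clock offset considered acceptable

offsets = [p - b for b, p in zip(behavior_events, photometry_events)]
mean_offset = sum(offsets) / len(offsets)
misaligned = abs(mean_offset) > TOLERANCE   # flag this recording for correction
```

In practice the same comparison is made visually in Graph Builder by overlaying the two event streams, so a systematic shift like this one is obvious at a glance.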