Analytic work in Industry 4.0 applications: A checklist

It’s important to make sure all analytic work is reviewed properly to avoid deriving any misleading conclusions.

Consider this hypothetical: You work for a company developing and manufacturing medical devices. The COVID-19 pandemic created a worldwide shortage of ventilators. Since your company has recently implemented a major digital transformation strategy to meet Industry 4.0 standards, you are able to predict operating failures in alternative assembly lines, provide online monitoring of wave soldering processes, and gather focused statistics on assembly defects from automated visual inspection robots.

The flexibility acquired by this digital transformation permitted the rapid conversion of the company’s production lines to make the much-needed mechanical ventilators. A major element in this transformation is the application of analytics, since the flexibility described above requires high-level analytic capabilities. Keep reading for additional background and a checklist for reviewing analytic-based reports.

Industry 4.0, the so-called fourth industrial revolution, relies on three basic elements:

  1. Sensor technology that can extensively measure products and processes online.
  2. Flexible manufacturing capabilities – such as 3D printing – that can efficiently produce batches of varying size.
  3. Analytics that power the industrial engine with the capability to monitor, diagnose, predict and optimize decisions.

One significant analytic challenge is data integration, since sensors may collect data in different time cycles. Dynamic time warping (DTW) or other techniques may be the best way to get a holistic view.
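To make the data-integration point concrete, here is a minimal from-scratch sketch of DTW aligning two synthetic sensor series sampled at different rates. The data and function are illustrative only; production work would typically use a dedicated library such as dtaidistance or tslearn.

```python
# Minimal dynamic time warping (DTW) sketch for comparing two sensor
# series collected in different time cycles. Illustrative only.

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) DTW with a full cost matrix."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # match step
    return cost[n][m]

# Two sensors observing the same ramp, one at twice the sampling rate:
fast = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
slow = [0.0, 1.0, 2.0, 3.0]
print(dtw_distance(fast, slow))  # small despite the unequal lengths
```

Even though the two series have different lengths, the warping path lines up corresponding points, which is what makes a holistic view across mismatched sensor clocks possible.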

Most likely, any analytic work done in industry means that the data has been collected actively or passively and that models have been developed with empirical methods, first principles or hybrid models. The industrial cycle provides short-term opportunities to try out new products or new process set-ups and, based on the results, determine follow-up actions. But it’s important to make sure all analytic work is reviewed properly to avoid deriving any misleading conclusions, which could be very costly and/or time-consuming. For example, a lithium battery manufacturer discovered it had uncalibrated test equipment evaluating end-of-the-line products. The company was able to avoid a major recall by using the plant’s control charts to precisely identify the problematic batches. To avoid shipping immature products or defective batches, good diagnostic capabilities are vital for identifying the cause of any reported problems.
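As a rough illustration of the control-chart idea in the battery example, the sketch below builds an individuals chart with 3-sigma limits estimated from the average moving range and flags batches that fall outside the limits. The yield numbers are invented for the example.

```python
# Hedged sketch of an individuals (I) control chart used to flag
# out-of-control batches. Limits are mean +/- 3 * sigma, with sigma
# estimated from the average moving range; the data are made up.

def control_limits(x):
    """Center line and 3-sigma limits from the average moving range."""
    mean = sum(x) / len(x)
    mr = [abs(x[i] - x[i - 1]) for i in range(1, len(x))]
    sigma = (sum(mr) / len(mr)) / 1.128  # d2 = 1.128 for subgroups of 2
    return mean, mean - 3 * sigma, mean + 3 * sigma

batch_yield = [98.1, 97.9, 98.0, 98.2, 97.8, 92.5, 98.1, 98.0]
mean, lcl, ucl = control_limits(batch_yield)
flagged = [i for i, v in enumerate(batch_yield) if not (lcl <= v <= ucl)]
print(flagged)  # indices of batches outside the 3-sigma limits
```

In practice the limits would be computed from an in-control baseline period rather than from data that already contains the suspect batches, but the principle of pinpointing problematic batches is the same.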

Specific analytic challenges in systems engineering and industrial applications include:

  • Engineering design
  • Manufacturing systems
  • Decision-support systems
  • Shop-floor control and layout
  • Fault detection and quality improvement
  • Condition-based maintenance
  • Customer and supplier relationship management
  • Energy and infrastructure management
  • Cybersecurity

These challenges generate projects for better monitoring of products and processes; designing new or converting existing products and processes; and improving products and processes.

To address these problems with data, proper access to analytic methods is necessary. One approach to reviewing analytic work is to use the information quality framework that I presented at JMP Discovery Summit Prague. Information quality is determined by the goal of the analysis, the methods of analysis used in the study, the data resolution with respect to the study goal, data integration, and communicating the findings. (For more on this topic, including how to set up an information quality evaluation workshop, you may be interested in The Real Work of Data Science.)

Information quality (InfoQ), a concept Galit Shmueli and I introduced in Information Quality: The Potential of Data and Analytics to Generate Knowledge, is a general framework for planning, tracking and assessing the value of data analytics. InfoQ is based on four components and eight dimensions. It is defined as “the utility of a particular data set for achieving a given analysis goal by employing statistical analysis or data mining.” InfoQ is affected by the analysis goal (g), the available data (X), the implemented data analysis (f) and study utility (U). In evaluating an analytic project, each of these four components and the eight information quality dimensions must be examined.
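In symbols, with analysis goal g, data X, analysis method f and utility U, this definition is commonly written as:

```latex
\mathrm{InfoQ}(f, X, g) = U\!\left( f(X \mid g) \right)
```

That is, information quality is the utility of applying the analysis f to the data X, conditional on the stated goal g; weakness in any one of the four components lowers the overall information quality.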

The eight InfoQ dimensions are: 

  • Data resolution
  • Data structure
  • Data integration
  • Temporal relevance
  • Generalizability
  • Chronology of data and goal
  • Operationalization
  • Communication

Data resolution needs to match the study goal, and the outcomes of the analysis must be provided at the right time to the right decision maker. For example, image processing analytics used to evaluate the quality of a fast-running paper production process must produce timely results so that the quality engineer can act properly. If a precise imaging classifier, used to identify quality issues hidden in a paper roll, is delayed because of computational constraints, the information it produces will be of very low quality.

The checklist below can be used to assess the eight information quality dimensions of a specific study. A JMP add-in for summarizing the information quality assessment is available in the Community. Examples of how to use JMP to generate information quality assessments can be downloaded here.

The Checklist

These are the eight questions to ask when reviewing an analytic study after clarifying the goals and utility.



Data resolution

Is the data granularity adequate for the intended job? Has measurement uncertainty been evaluated and found appropriate?

Data structure

Is it possible to use data from different sources that reflect on the problem at hand?

Data integration

How is data from different sources integrated? Are there linkage issues that lead to dropping crucial information?

Temporal relevance

Does the time gap between data collection and analysis cause any concern?

Chronology of data and goal

Are the analytic findings communicated to the right persons in a timely manner?


Generalizability

Can general conclusions be derived beyond what was explicitly studied? For example, can the conclusions be applied to other products or processes?


Operationalization

Are the measured variables themselves relevant to the study goal? Are there any stated action item recommendations derived from the study?


Communication

Are findings properly communicated to the intended audience?
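The JMP add-in mentioned earlier summarizes such an assessment; as a rough illustration of the idea, the sketch below rates each of the eight dimensions on a 1–5 scale and combines the ratings into a single 0–100 score. The geometric-mean scoring rule is a common choice for this kind of summary, but the exact rule here is an assumption, not the add-in's method.

```python
# Hypothetical InfoQ assessment summary: rate the eight dimensions 1-5,
# combine with a geometric mean, rescale to 0-100. Illustrative only.

DIMENSIONS = [
    "Data resolution", "Data structure", "Data integration",
    "Temporal relevance", "Chronology of data and goal",
    "Generalizability", "Operationalization", "Communication",
]

def infoq_score(ratings):
    """Geometric mean of eight 1-5 ratings, rescaled to 0-100."""
    assert len(ratings) == len(DIMENSIONS)
    product = 1.0
    for r in ratings:
        product *= r
    gmean = product ** (1.0 / len(ratings))  # geometric mean in [1, 5]
    return round(100 * (gmean - 1) / 4, 1)   # map [1, 5] onto [0, 100]

# Hypothetical reviewer ratings for one study, in DIMENSIONS order:
print(infoq_score([4, 3, 3, 5, 4, 2, 3, 4]))
```

A geometric mean is deliberately unforgiving: a single very weak dimension (say, poor data integration) drags the overall score down, which matches the intuition that information quality is only as good as its weakest link.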


These questions should help reviewers who need to assess an analytics-based study in Industry 4.0 applications and beyond. However, they are not meant to be prescriptive and will need to be adapted to specific situations. In 2020, the Fourth Industrial Revolution continues to grow in advanced manufacturing, combining extensive sensor data with flexible manufacturing and advanced analytics. A new comprehensive book, covering state-of-the-art elements of systems engineering and the Fourth Industrial Revolution, is available here.

For More Information

Learn more through these JMP resources related to Industry 4.0:

K-means clustering analysis of seven sensors tracking possible deformation under stress, shown in the ENBIS webinar listed above.
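As a loose echo of the seven-sensor clustering in the figure caption, here is a tiny from-scratch k-means sketch on synthetic sensor summaries. The data, the (mean, standard deviation) feature choice, and the hand-picked initial centers are all assumptions for determinism in this sketch; real work would use a library such as scikit-learn's KMeans.

```python
# Illustrative k-means sketch grouping seven sensors by the mean and
# standard deviation of their readings. Synthetic data; hand-picked
# initial centers keep the toy example deterministic.

def kmeans(points, centers, iters=10):
    """Lloyd's algorithm with given initial centers."""
    centers = list(centers)
    k = len(centers)
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:  # assign each point to its nearest center
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # recompute each center as the mean of its group
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g
                   else centers[j] for j, g in enumerate(groups)]
    return centers, groups

# Seven sensors summarized as (mean reading, std of reading):
sensors = [(1.0, 0.1), (1.1, 0.2), (0.9, 0.1),   # stable group
           (5.0, 1.0), (5.2, 1.1), (4.8, 0.9),   # drifting group
           (9.5, 3.0)]                           # possible deformation
centers, groups = kmeans(sensors, [sensors[0], sensors[3], sensors[6]])
print([len(g) for g in groups])  # prints [3, 3, 1]
```

The lone sensor in its own cluster is the kind of signal that would prompt a closer look for deformation under stress.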

1 Comment

I think we gain appreciable support for data analytics by better communicating the opportunity to turn the data chaos often associated with passive data collection into helpful information, through the integration of subject matter knowledge, data management, and data analytics.