Want scientists and engineers to make more discoveries? Here's how to help them

Business and production systems have become far more capable of collecting data. Equipment collects a variety of sensor and parametric data, and all kinds of information on buying habits and consumer preferences is now available. This level of detail cannot be analyzed and comprehended with static, conventional reporting. Instead, business analysts, engineers and scientists can unlock insights and make discoveries with the leverage provided by interactive, visual analytical software.

Analytical software has opened a new world of analytics characterized by these important traits:

  1. Data are “self-provisioned.” Users are able to get the data they need without assistance and without delay.
  2. The analytics are visual and interactive. As a result …
  3. Users can now conduct advanced analytics without a PhD in statistics.
  4. Analysts conduct their work “in the moment.” Insights often surface new questions that analysts explore immediately, creating an active dynamic that further spawns discovery.
  5. Analytical thinking is tightly coupled to business thinking.
  6. More than descriptive, analytics are inferential.

An 'aha' moment

Consider this insurance example. Demographic information from many thousands of current and potential clients was collected and maintained in a database. The insurance company could download the data into a spreadsheet and summarize it, but did they get the best exploitable insights? Answering even the simplest questions took days of acquiring, splicing and arranging the data.

Today, with integrated, interactive and visual analytics, insights are revealed in seconds. The big question with prospective clients is how many of them were converted to new business, and what factors drive the conversion? Knowing this, the company can focus on the business practices that lead to higher rates of success.

[Screenshot: overall conversion rate in JMP]

We started by loading the data in JMP. With only a few clicks, tens of thousands of prospective client encounters, including demographic information such as income, education, age, marital status, etc., were loaded. You can see from the image above that overall about 12.5% (the blue area) of these prospects were converted into paying customers.

Now to the question at hand: What factors determine success in winning new business? One more click (on the Split button in the lower left) and an “aha” moment ensued.

[Screenshot: conversion rates after splitting on factor Xn]

The JMP chart above shows that a particular factor (which, due to confidentiality, I can’t disclose, so we’ll call it “factor Xn”) leads to an incredibly high conversion rate (about 90%, as seen in the blue bar on the right) for a good number of prospects, and that the remaining prospects had little chance of converting.

The analysts were stunned at seeing this. The insight had eluded them because the overall conversion rate was masking a major distinction among the prospects, identified by factor Xn. Keep in mind that these analysts spend day in and day out poring over data, yet this important insight, and others that were to follow, had remained locked within it.
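The masking effect can be sketched with a few lines of arithmetic. The numbers below are synthetic and purely illustrative (the real factor and counts are confidential), but they show how a blended 12.5% conversion rate can hide a sharp split between two segments:

```python
# Hypothetical, synthetic counts -- not the insurer's real data.
# One segment (factor Xn present) converts at ~90%; the rest almost never do.
segment_a = {"prospects": 1000, "converted": 900}   # factor Xn present
segment_b = {"prospects": 9000, "converted": 350}   # factor Xn absent

overall = (segment_a["converted"] + segment_b["converted"]) / (
    segment_a["prospects"] + segment_b["prospects"]
)
print(f"Overall conversion:  {overall:.1%}")   # blended rate looks unremarkable
print(f"With factor Xn:    {segment_a['converted'] / segment_a['prospects']:.1%}")
print(f"Without factor Xn: {segment_b['converted'] / segment_b['prospects']:.1%}")
```

A static report showing only the overall rate gives no hint that the two segments behave so differently; the split is what surfaces it.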

This insight spawned a number of questions. First, it appeared that changes to sales representative instructions were in order. Second, why was the conversion rate for the other prospects so incredibly low? This led to questions about pricing, packaging and the like, in combination with demographics, that would be investigated with designed experiments.

Why it worked

Looking back at the six traits above, we can see that in this case:

  1. IT established systems that allowed users to get the data themselves: "self-provisioned data."
  2. Indeed the analytics were highly visual. Yes, all the statistical information is provided, but it is made accessible through graphics and interactivity.
  3. No PhD in statistics was necessary. The analysis above involves recursive partitioning with cross-validation. A mouthful to be sure, but that complexity (and statistical jargon) does not get in the way of a business analyst or engineer gaining the highest possible number and quality of exploitable insights. They can focus on their subject matter unfettered. In fact, my experience is that the tool almost becomes invisible as the focus is on the subject matter.
  4. Unlike the old days, when I started in this game, there was no need to submit a request instructing IT programmers to amend a report that would arrive several days later. The elapsed time between question and answer was gone, and so was the dependency.
  5. The old division of labor between analytics and business was gone. The two must be welded together to be effective and efficient at finding exploitable business, engineering and scientific insights.
  6. Notice that the analysis is not simply descriptive, as it was in the old days. It is inferential because it leads analysts to predict future outcomes and ask further questions.
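The technique named in point 3, recursive partitioning with cross-validation, can be sketched generically. The code below is an illustrative scikit-learn example on synthetic data; it is not JMP's implementation or the insurer's real analysis, and the variable names (factor Xn, income) are placeholders:

```python
# A generic sketch of recursive partitioning (a decision tree) validated
# with 5-fold cross-validation, on synthetic data mimicking the example.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 10_000
factor_xn = rng.random(n) < 0.10            # hypothetical driving factor
income = rng.normal(50_000, 15_000, n)      # a demographic column, for realism
# Conversion is ~90% when factor Xn is present, ~4% otherwise.
converted = np.where(factor_xn, rng.random(n) < 0.9, rng.random(n) < 0.04)

X = np.column_stack([factor_xn, income])
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
scores = cross_val_score(tree, X, converted, cv=5)   # 5-fold cross-validation
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

The tree's first split lands on factor Xn because it separates the high- and low-conversion groups, which is exactly the kind of "aha" the Split button surfaced, while cross-validation guards against splits that only fit noise.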

Not only were the analysts impressed with the insight, but they were also excited about how readily it was derived.

Build your own culture of analytics

What does it take to bring the new world of analytics into your organization and support a culture of analytics?

This is where IT comes in -- obviously, they have a major role to play. IT no longer needs to worry about conducting analytics; that is best left to the analysts. Instead, IT is now an enabler of analytics. It can do this by:

  1. Maintaining the hardware and software infrastructure that supports operational and analytical needs.
  2. Making data available in an analytically friendly way so that data may be self-provisioned. We do a lot of work in this area to ensure that analytical data demands do not affect operations. For example, in the pharmaceutical, semiconductor, solar and other industries, unimpeded real-time data must be collected for traceability. Analytical demand on IT infrastructure cannot be allowed to affect operational systems.
  3. Supporting the likes of our company, Predictum, in developing integrated analytical applications that further facilitate analysis, store and transfer knowledge and insights, and gain other efficiencies and cost savings in operations, research and compliance.
  4. Securing all systems.

Securing systems is a rapidly growing and increasingly demanding responsibility for IT -- so much so that we find IT folks are usually very happy to be relieved of the burden of conducting analytics, or of involving themselves in analytics that analysts can better support themselves. The enabling role is much more consistent with their other activities and responsibilities. For example, IT supports order, shipping and billing systems, but IT staff do not order, ship or bill anything themselves -- so why should they conduct business, science or engineering analytics?

With the Internet of Things, new and more capable equipment, and the internet’s expanding reach, we can expect an exponential increase in the amount and quality of data well into the future. It’s best to prepare now for the opportunities this presents by building a culture of analytics: designing the right data architecture, providing JMP, and enabling business analysts, scientists and engineers to advance their subject matter expertise with analytics.

Editor's Note: A version of this blog post first appeared in the Predictum blog. Thanks to Wayne Levin for sharing it here as well.

Last Modified: May 8, 2017 3:45 PM