
Keys to avoiding pitfalls in analytical models: testing, relevancy

Predictive modeling can lead to some pretty bad insights when done poorly, but overcoming some common issues can help users sidestep problems on predictive analytics projects.

Predictive modeling can be a powerful tool to help businesses see problems and opportunities that are coming their way, but when done poorly, it can lead them down a path of error and uncertainty. Understanding where the pitfalls lie is a must for getting the most out of your analytical models.

For example, speaking at the 2014 Predictive Analytics World conference in Boston, John Elder, president of consulting firm Elder Research Inc., said that the human brain is naturally inclined to see patterns. But that can become a problem when assessing predictive modeling results, he added, cautioning that it's easy to see some sort of correlation that isn't statistically significant. "There are some serious mistakes that happen when people look at data and they find something that isn't there," Elder said.

One of the most common errors made in predictive analytics projects is being biased in favor of positive results, he said. That often happens when data analysts favor modeling results that confirm their expectations. But just because one test of a predictive model shows positive results doesn't mean those results are truly meaningful.

"Interpretability is a dangerous thing," Elder said. "You can explain anything." He noted that we "often look for data to justify our decisions," when it should be the other way around.
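Elder's warning about finding patterns that aren't there is easy to reproduce. In this hypothetical Python sketch (the numbers are illustrative, not from the talk), screening many unrelated random "features" against a target almost always turns up a correlation that looks meaningful but is pure chance:

```python
import random
import statistics

random.seed(0)

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sx, sy = statistics.pstdev(xs), statistics.pstdev(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (sx * sy)

n = 30
target = [random.gauss(0, 1) for _ in range(n)]

# Screen 200 random "features" that have no real relationship to the target.
best = max(
    abs(pearson_r([random.gauss(0, 1) for _ in range(n)], target))
    for _ in range(200)
)

# With enough candidate features, a sizable correlation shows up by chance alone.
print(f"strongest spurious correlation: {best:.2f}")
```

The cure is to adjust for how many hypotheses were screened, or to confirm any "discovery" on fresh data the model has never seen.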

Put predictive models to the test

One solution to that problem is to test models over and over again before deploying them for an operational function. Elder said it's better to catch mistakes in analytical models during testing than to wait for business users to find problems.
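One way to make that repeated testing concrete is plain holdout evaluation: split the data several times, fit on one part, and score on the part the model never saw. A minimal stdlib-only sketch, where the data, fit and score functions are hypothetical stand-ins for a real pipeline:

```python
import random

def evaluate_on_holdouts(data, fit, score, k=5, seed=42):
    """Repeatedly split data, fit on 80%, and score on the held-out 20%."""
    rng = random.Random(seed)
    scores = []
    for _ in range(k):
        shuffled = data[:]
        rng.shuffle(shuffled)
        cut = int(0.8 * len(shuffled))
        train, holdout = shuffled[:cut], shuffled[cut:]
        model = fit(train)                 # train only on the training split
        scores.append(score(model, holdout))  # judge only on unseen rows
    return scores

# Toy stand-ins: the "model" is just the training mean of y;
# the score is mean absolute error on the holdout rows.
data = [(x, 2 * x + random.Random(x).gauss(0, 1)) for x in range(50)]
fit = lambda rows: sum(y for _, y in rows) / len(rows)
score = lambda m, rows: sum(abs(y - m) for _, y in rows) / len(rows)

print(evaluate_on_holdouts(data, fit, score))
```

A model that looks strong in training but scores much worse across the holdouts is exactly the kind of mistake this testing is meant to catch before business users do.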

Sameer Chopra, vice president of advanced analytics at Chicago-based travel reservations website operator Orbitz Worldwide Inc., said testing needs to be at the heart of any predictive analytics project. He recommended that organizations conduct A/B tests to quantify the effect of a model's recommendations against a control group. "Make [testing] a part of the DNA of your organization," Chopra said.
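The A/B framing Chopra recommends boils down to comparing a conversion rate in the group exposed to the model's recommendations against a control group. A minimal sketch using a pooled two-proportion z-test; the counts below are hypothetical:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference in conversion rates between two groups."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical counts: control group vs. model-driven recommendations.
z = two_proportion_ztest(conv_a=120, n_a=5000, conv_b=165, n_b=5000)
print(f"z = {z:.2f}")  # |z| > 1.96 -> significant at the 5% level
```

To avoid the confirmation-bias trap Elder describes, the metric and sample size should be fixed before the test runs, not chosen after peeking at the results.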


Aside from ensuring that models work properly before putting them into production, well-planned testing can offer new insights. For example, Chopra said his team found that it's possible to identify unsatisfied customers before customer service issues escalate. The team made that discovery when it was testing different ways to translate text-based customer service notes into quantifiable analytics scores. That wasn't the outcome they were looking for at the time, but it has since helped limit customer service issues.

Another common roadblock to successful predictive modeling is that models sometimes generate insights that are irrelevant to an organization's business operations. If the output of a model doesn't connect to an operational process that can be changed and improved, you're just producing trivia.

"To me, [analytics] is like the dog [that] chases the car: You have to know what you're going to do with it once you get it," said Jack Levis, senior director of process management at shipping and delivery company UPS.

Analytical models fuel new business approach

Over the years, UPS has transitioned its business from a traditional approach to one that relies on analytics for things like directing drivers to stops in a specific order to conserve fuel and predicting how many customers will need to ship packages on a given day so the company can optimally allocate its resources.

Levis said it would have been difficult to move in that direction if managers on the business side didn't believe in the business value of analytical models. And a lack of faith would have been justified if the models delivered useless insights. That's why it's important for predictive models to deliver relevant results.

Anything can be modeled, and if you look at data long enough, you're bound to see some kind of correlation. But getting these softer issues straightened out at the beginning of a predictive analytics project can help prevent problems down the road.

"It's more likely your project will die because of human factors than technical factors," Elder said.

Ed Burns is site editor of SearchBusinessAnalytics. Email him and follow him on Twitter: @EdBurnsTT.

Next Steps

No need for big data in predictive model development?

Predictive modeling needs business engagement, good technology

What can uplift modeling do for you?


