Predictive analytics tools make it possible for companies to forecast customer behavior, litigation risks, product demand and other business scenarios. But it isn't as easy as gazing into a crystal ball: To succeed in creating effective predictive analytics models that generate actionable findings, organizations need to develop a strategy that ties the technology to business operations in a concrete and comprehensive way.
Otherwise, consultants warn, deployments are likely to turn into technical exercises with no real connection to business goals and processes -- and no predictive payoffs in the end.
For starters, the goals of a predictive analytics program need to be quantifiable and measurable, said Scott Schlesinger, senior vice president and head of the business information management consulting group at New York-based IT services provider Capgemini North America.
"While 'improving customer insights' is a laudable goal," Schlesinger said, "the predictive analytics target should be something like 'Increase, by cross selling [or] upselling, the number of customers who own two or more of our products by X%, thereby increasing profitability by Y%.'"
Using such targeted metrics, in turn, creates a need to measure return on investment (ROI) effectively, according to Schlesinger. For example, a retailer looking to use predictive analytics to more accurately forecast demand for products could track stockouts -- the times when inventories are exhausted and products are temporarily unavailable. The existing number of out-of-stock incidents would serve as the basis for ROI measurements once the analytics effort begins, he said. After predictive models are developed and put into use, he added, "the effect on stockouts needs to be captured and documented to either justify or change the predictive analytics initiative."
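The stockout example Schlesinger describes boils down to simple arithmetic. As a minimal sketch -- the function name, per-incident loss figure and project cost below are illustrative assumptions, not figures from any real deployment -- the ROI calculation could look like this:

```python
# Hypothetical retailer scenario: stockout counts before and after a
# demand-forecasting project serve as the basis for ROI measurement.

def stockout_roi(baseline_stockouts, current_stockouts,
                 revenue_lost_per_stockout, project_cost):
    """Estimate ROI from the reduction in out-of-stock incidents."""
    avoided = baseline_stockouts - current_stockouts
    recovered_revenue = avoided * revenue_lost_per_stockout
    return (recovered_revenue - project_cost) / project_cost

# Example: 400 stockouts per year before the project, 250 after,
# $1,200 in lost sales per incident, $100,000 project cost.
roi = stockout_roi(400, 250, 1200, 100000)
print(f"Estimated ROI: {roi:.0%}")  # prints "Estimated ROI: 80%"
```

The point is not the formula itself but that the baseline -- the pre-project stockout count -- is captured before the models go live, so the "justify or change" decision Schlesinger describes has something concrete to rest on.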
Look before you leap into predictive analytics
Organizations should begin by doing a comprehensive assessment of their analytics requirements and capabilities, said Eric King, president of The Modeling Agency LLC, a Pittsburgh-based predictive analytics and data mining consultancy. For example, the assessment process could touch on such things as the size of the data sets to be analyzed; the experience and skill levels of the analytics team; and how quickly particular data sets are updated, including whether there is a real-time element that needs to be taken into account.
King said a well-executed assessment will help project managers develop a deployment roadmap for predictive analytics tools that incorporates both technical requirements and business objectives. It can also pay downstream dividends, he said, citing benefits such as more efficient modeling processes and the ability to avoid having to retrofit systems to accommodate new models.
Once projects are under way, King said, analytics teams should avoid focusing on what he calls "artificial metrics" -- findings that are accurate statistically but don't correspond to real business needs or objectives. Measuring the success of predictive models based simply on their accuracy is a road to nowhere if the models address the wrong questions to begin with, he said.
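King's point about "artificial metrics" is easy to demonstrate with a hypothetical churn scenario (the numbers and the scenario are illustrative assumptions, not from his remarks): when the outcome being predicted is rare, raw accuracy can look excellent while the model answers the wrong business question entirely.

```python
# With rare events, accuracy alone is an "artificial metric": a model
# that never predicts the rare outcome still scores highly.
actual    = [0] * 95 + [1] * 5   # 5% of customers actually churn
predicted = [0] * 100            # model predicts "no churn" for everyone

accuracy = sum(a == p for a, p in zip(actual, predicted)) / len(actual)
caught = sum(a == 1 and p == 1 for a, p in zip(actual, predicted)) / sum(actual)
print(f"accuracy={accuracy:.0%}, churners caught={caught:.0%}")
# prints "accuracy=95%, churners caught=0%"
```

A 95% "accurate" model that catches none of the customers a retention program needs to reach is statistically sound and commercially worthless -- exactly the trap King describes.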
Analytics teams can also go wrong if they jump directly into the data without fully understanding its business context, King cautioned. For example, outliers in data sets are often downplayed or ignored, he said: "What a lot of rookies do is think it's an error and censor its value, or that it shouldn't be there or is skewing the results."
Going to analytics extremes pays off
But such extreme data points sometimes can be the most useful from a business standpoint, according to King. "Rare events may have the greatest value," he said. "Predictive analytics is more about low-incidence, high-impact occurrences. It comes back to understanding the business objectives."
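One practical way to follow King's advice is to flag extreme data points for business review rather than silently deleting them. The sketch below uses standard interquartile-range fences; the insurance-claim amounts are hypothetical.

```python
import statistics

def flag_outliers(values, k=1.5):
    """Flag extreme points for review instead of silently dropping them,
    using Tukey's interquartile-range fences."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < low or v > high]

# Hypothetical claim amounts: the $50,000 claim is exactly the kind of
# rare, high-impact event worth investigating, not censoring.
claims = [120, 95, 130, 110, 105, 50000, 98, 115]
print(flag_outliers(claims))  # prints "[50000]"
```

Whether a flagged point is a data-entry error or a genuinely rare, high-value event is a business judgment -- which is why the routine surfaces it instead of deciding on its own.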
Another mistake is resting on your laurels. Just because predictive models are effective doesn't mean the work on a project is over -- data sets and models need to be reviewed periodically and revised when necessary as business conditions and strategies change. "No model is a one-shot deal," said Rick Sherman, founder of consultancy Athena IT Solutions in Maynard, Mass.
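The periodic review Sherman recommends can be given a simple automated trigger. As a hedged sketch -- the error metric, tolerance and figures below are illustrative assumptions -- a team could compare recent forecast error against the error the model showed at deployment and flag it for review when performance drifts:

```python
def needs_review(baseline_error, recent_errors, tolerance=0.10):
    """Return True when average recent error has drifted more than
    `tolerance` (relative) above the error recorded at deployment."""
    recent = sum(recent_errors) / len(recent_errors)
    return recent > baseline_error * (1 + tolerance)

# Forecast error (e.g., mean absolute percentage error) logged monthly.
print(needs_review(0.08, [0.081, 0.079, 0.083]))  # prints "False" -- stable
print(needs_review(0.08, [0.09, 0.11, 0.12]))     # prints "True" -- revisit
```

A check like this doesn't replace human review; it simply ensures that no model quietly becomes a "one-shot deal" as business conditions change.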
And while business connections are crucial, once they're in place, IT and business managers should get out of the way of the data scientists, statisticians and other analytics professionals tasked with building and running the predictive models, Sherman said. "Set up an environment where they can be creative," he said. "In order for them to come up with models that are truly of value, they have to have the freedom to explore" and experiment with the available data.
About the author:
Christine Parizo is a freelance writer who specializes in covering business and technology issues. She writes for a variety of publications, including several TechTarget websites; she also works as a copy editor for Copyediting, a newsletter for editing professionals. Email her at email@example.com.
This was first published in July 2013