
Building predictive models requires business engagement, mix of skills

To generate useful predictive analytics results, predictive modeling teams need a combination of analytical and business know-how plus close ties to business users, say analytics pros and consultants.

A growing number of companies are setting up predictive analytics programs and investing in predictive modeling tools. But fully understanding the nuances of the discipline and honing internal processes for building effective predictive models can be elusive goals for many organizations.

“There are only a few companies that have mastered this,” said John Elder, CEO of Elder Research Inc., a consulting and training company in Charlottesville, Va., that focuses on predictive analytics and data mining. Elder sees predictive modeling as more art than science but thinks most users fail to grasp the distinction. “It’s almost like companies think it works like magic -- that you run this algorithm on this data and you’ll get an answer back that helps you do something to run your business better,” he said. “But you really have to think hard about setting up the problem right.”

To put a business in position to properly frame predictive analytics models, two things are required up front, according to analytics professionals and consultants. First, the company has to have a clearly defined business reason for pursuing an analytics program, as opposed to simply taking a joy ride with the technology because it’s in vogue. And second, the organization needs to invest in the right people to spearhead the predictive analytics process. Oftentimes, that means hiring statistically savvy analysts or tapping into the new category of data scientists, not just sending a group of mathematically inclined employees to get some basic training in analytics.

So what kind of skill set forms the best foundation to drive a successful predictive modeling effort? A strong statistical background is important, said Rick Sherman, managing partner at Athena IT Solutions, a business intelligence (BI) and analytics consultancy in Stow, Mass. But he added that the best candidates aren’t purely academic -- they have experience applying mathematical concepts to real business problems.

“The typical person building these models is someone who is a statistician or actuary -- not just business people who’ve used SAS in their MBA program,” Sherman said. “At the same time, it can’t just be a backroom Ph.D. mathematician. People who are developing these models really have to understand the business, understand the industry and understand the macro- and microeconomics, depending on what they’re trying to do.”

Predictive modelers, meet the business
Fully understanding business strategies and needs requires modelers to engage closely with business managers and workers, both during and after the development of predictive models, Sherman and others advised.

Business engagement is a practice that the predictive modeling team at Paychex Inc. takes seriously, according to Erika McBride, manager of modeling and risk review at the Rochester, N.Y.-based provider of human resources and payroll outsourcing services. “We bring the business units in from the get-go to get an understanding of what exactly it is they’re interested in,” McBride said, explaining that the modeling team adopted the process after previous experience with a less collaborative approach.

For instance, early on in the company’s five-year-old predictive analytics initiative, a model was developed to identify sales potential based on the geographic territories that sales representatives were assigned to cover. What the modelers didn’t realize at the time was that the sales department frequently changed its territorial alignments. As a result, McBride said, the territory-based predictive model wasn’t as useful as one at, say, the ZIP code level would have been. “If we understood that initially, our approach would have been different,” she noted. “If we had had more conversations, we would have known it would not be useful.”

Putting processes in place for testing and evaluating predictive models is another key component of modeling initiatives, consultants say -- and, they emphasize, that should be an ongoing process of continuous improvement, not a one-off exercise.

“There are well-known processes for constantly evolving models,” said David Menninger, a vice president and research director at Ventana Research in San Ramon, Calif. For example, Menninger, who focuses on analytics, BI and information management technologies, cited the “champion/challenger” approach, in which predictive modelers continually create new models to challenge the findings of an existing one.
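The champion/challenger idea can be sketched in a few lines. This is a minimal illustration, not Ventana's or any vendor's implementation: the model representations, the holdout data and the accuracy metric are all hypothetical, and the comparison logic is simplified to a single score on one held-out sample.

```python
# Hypothetical champion/challenger check: the challenger replaces the
# champion only if it scores better on the same held-out data.
def accuracy(model, holdout):
    """Fraction of held-out examples the model predicts correctly."""
    correct = sum(1 for x, y in holdout if model(x) == y)
    return correct / len(holdout)

def champion_challenger(champion, challenger, holdout):
    """Return whichever model scores higher on the holdout set."""
    if accuracy(challenger, holdout) > accuracy(champion, holdout):
        return challenger
    return champion

# Toy example: predict whether a customer responds (1) from one feature.
holdout = [(0.2, 0), (0.4, 0), (0.6, 1), (0.8, 1)]
champion = lambda x: 1 if x > 0.7 else 0     # current production model
challenger = lambda x: 1 if x > 0.5 else 0   # newly built candidate

winner = champion_challenger(champion, challenger, holdout)
```

In practice the comparison would run continually against fresh data, so a challenger that degrades over time never displaces a still-sound champion.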

Another set of eyes on predictive models
In addition, Menninger recommended having other modelers check the work of their colleagues to help avoid problems with model integrity. “For the same reason you do peer reviews in software development, you should have a peer review method for [data] sample selection and model development,” he said. “You can get blinded by what you’re close to, and a fresh set of eyes is always a good thing.”

Data issues that can affect modeling success also need to be taken into consideration, Sherman warned. Sometimes, he said, the data available in corporate systems is incomplete, requiring modeling teams to incorporate external sources of information in order to develop well-rounded predictive models with “enough depth to come up with something that’s actionable.” Dirty and inconsistent data is often another stumbling block, he added.
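A simple pre-modeling screen illustrates the kind of completeness check Sherman describes. This is a hedged sketch, not any firm's actual pipeline: the record layout, field names and notion of "dirty" (missing or empty required fields) are assumptions for illustration.

```python
# Hypothetical data screen: flag records with missing or empty required
# fields before they reach the modeling step.
def screen_records(records, required_fields):
    """Split records into usable rows and rows needing cleanup."""
    clean, dirty = [], []
    for rec in records:
        missing = [f for f in required_fields if rec.get(f) in (None, "")]
        if missing:
            dirty.append((rec, missing))   # keep the reason for triage
        else:
            clean.append(rec)
    return clean, dirty

records = [
    {"customer_id": 1, "zip": "14604", "revenue": 1200},
    {"customer_id": 2, "zip": "", "revenue": 800},       # empty ZIP
    {"customer_id": 3, "zip": "33701", "revenue": None}, # no revenue
]
clean, dirty = screen_records(records, ["customer_id", "zip", "revenue"])
```

The `dirty` list is where external data sources would come in: rather than dropping incomplete records, a modeling team can enrich them until the model has, in Sherman's phrase, enough depth to produce something actionable.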

In six years of building up what he calls a “modeling factory” at Catalina Marketing in St. Petersburg, Fla., Ryan Carr has adopted some of the practices detailed above and come up with his own list of effective predictive modeling techniques. Carr, vice president of global modeling and analytics at the provider of behavior-based marketing and advertising services, said his six-person team of statisticians creates about 1,000 models annually and is looking to expand its output to 10,000-plus models this year.

Given that lofty goal, Carr has worked to develop a scalable modeling operation by designing standardized processes and templates, exploring automated technologies and implementing training procedures aimed at enabling newly hired modelers to come up to speed in a matter of days.

“We’ve created templates that make the process easier,” Carr explained. “Now, people building models don’t need to be subject matter experts -- we can bring someone in and within five days, they are trained and building models and can turn out a standard model in three days.”
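One way to picture a modeling template like the ones Carr describes is a spec that new hires fill in while a shared pipeline handles the repeatable steps. This sketch is purely illustrative: the `ModelSpec` fields, the pipeline stages and the trivial majority-class fit are assumptions, not Catalina's actual tooling.

```python
# Hypothetical modeling template: a new modeler supplies a name, a target
# column and a feature list; the shared pipeline does the rest.
from dataclasses import dataclass
from typing import Callable

@dataclass
class ModelSpec:
    name: str
    target: str        # column the model predicts
    features: list     # input columns
    fit: Callable      # (X, y) -> predict function

def run_template(spec, rows):
    """Standardized pipeline: extract features and target, then fit."""
    X = [[row[f] for f in spec.features] for row in rows]
    y = [row[spec.target] for row in rows]
    return spec.fit(X, y)

# Trivial fit for illustration: always predict the majority class.
def majority_fit(X, y):
    majority = max(set(y), key=y.count)
    return lambda x: majority

spec = ModelSpec(name="redemption", target="redeemed",
                 features=["visits"], fit=majority_fit)
rows = [{"visits": 3, "redeemed": 1},
        {"visits": 1, "redeemed": 0},
        {"visits": 4, "redeemed": 1}]
predict = run_template(spec, rows)
```

Standardizing the spec is what makes the operation scale: the modeler's judgment goes into choosing the target and features, while the mechanics stay identical across hundreds of models.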

Beth Stackpole is a freelance writer who has been covering the intersection of technology and business for 25-plus years for a variety of trade and business publications and websites.
