Business intelligence (BI) software has become widely used – even to the point of being pervasive in many organizations. But for the most part, predictive analytics tools are still used by only the most sophisticated user organizations.
In addition, whereas IT groups typically develop BI dashboards and reports for business users, predictive analytics models usually are created by a handful of highly skilled end users. It can be an eye-opening experience for IT workers to realize that the people who build predictive models are more data-savvy and technically oriented than they are. In fact, predictive model builders often view the IT staff merely as data gatherers whose purpose is to feed their data-hungry models.
The industries that pioneered the use of predictive analytics software are insurance, financial services and retail. Companies in those industries share the need to understand who their customers and prospects are, how to up-sell and cross-sell products and services, and how to predict customer behavior (including bad behavior, through processes such as fraud detection). Predictive analytics tools can help in all of those areas. Other industries that have benefited from the technology include telecommunications, travel, healthcare and pharmaceuticals.
Across industries, there are common approaches that can be taken in building the required predictive models, selecting technology and staffing up for successful predictive analytics projects.
Building predictive models is a combination of science and art. It’s an iterative process in which a model is created from an initial hypothesis and then refined until it produces a valuable business outcome – or is discarded in favor of another model with more potential. Developing and then using predictive models involves the following tasks:
- Scope and define the predictive analytics project. What business processes will be analyzed as part of the initiative, and what are the desired business outcomes?
- Explore and profile your data. Because predictive analytics is a data-intensive application, considerable effort is required to determine the data that’s needed for the project, where it’s stored and whether it’s readily accessible, and its current state.
- Gather, cleanse and integrate the data. Once the necessary data is located and evaluated, work often needs to be done to turn it into a clean, consistent and comprehensive set of information that is ready to be analyzed. That process may be minimized if an enterprise data warehouse is leveraged as the primary data source. But external and unstructured data is often used to augment warehoused information, which can add to the data integration and cleansing work.
- Build the predictive models. The model builders take over here, testing models and their underlying hypotheses through steps such as including and ruling out different variables and factors; back-testing the models against historical data; and determining the potential business value of the analytical results produced by the models.
- Incorporate analytics into business processes. Predictive analytics tools and models are of no business value unless they’re incorporated into business processes so that they can be used to help manage (and hopefully grow) business operations.
- Monitor the models and measure their business results. Predictive models need to adapt to changing business conditions and data. And the results they’re producing need to be tracked so that you know which models are providing the most value to your organization.
- Manage the models. Prune the models with little business value, improve the ones that may not yet be delivering on their expected outcome but still have potential, and tune the ones that are producing valuable results to further improve them.
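The build, back-test and monitor steps above can be sketched in code. The following Python example, using scikit-learn, is a minimal illustration only: the customer data is synthetic, and the acceptance threshold is a hypothetical stand-in for whatever business-value measure an organization actually uses.

```python
# Hedged sketch of the iterative build/back-test loop described above.
# The data is synthetic; in practice it would be the cleansed, integrated
# data set prepared in the earlier steps.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic "historical" customer data: two behavioral variables and a
# binary outcome (e.g., whether a customer accepted a cross-sell offer).
X = rng.normal(size=(1000, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

# Hold out part of the history so the model can be back-tested on data
# it has never seen.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)

# Back-test: score the model on the held-out historical data.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

# Hypothetical acceptance threshold: models below it get refined or
# discarded; models above it get promoted into business processes.
ACCEPTABLE_AUC = 0.7
keep_model = auc >= ACCEPTABLE_AUC
print(f"back-test AUC = {auc:.2f}, keep = {keep_model}")
```

In a real deployment, the loop would repeat: variables are added or ruled out, the model is re-fit and re-scored, and only the versions that clear the business-value bar move forward.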
With a typical BI project, business users define their report requirements to the IT or BI group, which then identifies the required data, creates the reports and hands them off to the users. Similarly, in predictive analytics deployments, a joint business-IT team must scope and define the project, after which IT assesses, cleanses and integrates the required data. At this point, though, predictive analytics projects deviate from conventional BI projects because it is the users – for example, statisticians, mathematicians and quantitative analysts – who take over the process of building the predictive models.
The IT or BI group re-enters the picture after the models have been developed and start being used by business and data analysts. For example, IT or BI teams might incorporate the predictive analytics results into dashboards or reports for more pervasive BI use within their organizations. They might also take over the physical management of predictive models and their associated technology infrastructure.
To run predictive models, companies require statistical analysis, data mining or data visualization tools. Typically, predictive analytics software and other types of advanced data analytics tools are used by experienced analytics practitioners who are well versed in statistical techniques such as multivariate linear regression and survival analysis.
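To make one of those techniques concrete, here is a minimal sketch of multivariate linear regression – fitting an outcome to several predictor variables at once – using ordinary least squares in NumPy. The data and coefficients are synthetic, chosen only so the fit can be checked against known values.

```python
# Minimal multivariate linear regression sketch; data is synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Two hypothetical predictor variables (e.g., tenure and monthly spend).
X = rng.normal(size=(n, 2))
# Known underlying relationship: y = 3 + 2*x1 - 1*x2 + noise
y = 3.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.1, size=n)

# Ordinary least squares: prepend an intercept column and solve.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # approximately [3, 2, -1]
```

Dedicated statistical packages add what this sketch omits – significance tests, diagnostics and support for techniques such as survival analysis – which is why experienced practitioners reach for them rather than rolling their own.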
Most BI vendors sell integrated product suites that include query tools, dashboards and reporting software. But if they offer predictive analytics software, it tends to be sold as a separate and distinct product. While that’s starting to change, the predictive analytics tools now being used primarily come from vendors that specialize in statistical analysis, data mining or other advanced analytics.
Predictive analytics tools turn the BI software selection process on its head
Compared with a typical BI software evaluation, where the IT or BI group drives the software selection process while soliciting input and feedback from business users, an evaluation of predictive analytics tools is turned upside down – or at least it should be. Ideally, the statisticians and other users who build the predictive models take the lead in evaluating the predictive analytics tools that are being considered, with IT providing input on the software’s potential impact on the organization’s technology infrastructure. In this case, the users are likely to be the only ones who understand the statistical or data mining techniques they need and whether the various tools can support those requirements.
Predictive model builders and users must have a strong knowledge of data, statistics, an organization’s business operations and the industry in which it competes. Companies, even very large ones, often have only a small number of people with such skills. As a result, predictive modelers and analysts are likely to be viewed as the star players on a data analytics team.
The typical organizational structure places predictive analytics experts in individual business units or departments. The analysts work with business executives to determine the business requirements for specific predictive models and then go to the IT or BI group to get access to the required data. In this kind of structure, IT and BI workers are enablers: Their primary tasks are to gather, cleanse and integrate the data that the predictive analytics gurus need to run their models.
In conclusion, the critical success factors for successful deployments of predictive analytics tools include having the right expertise (i.e., predictive modelers with a statistical pedigree); delivering a comprehensive and consistent set of data for predictive analytics uses; and properly incorporating the predictive models into business processes so that they can be used to help improve business results.
About the author: Rick Sherman is the founder of Athena IT Solutions, a Stow, Mass.-based firm that provides data warehouse and business intelligence consulting, training and vendor services. In addition to having more than 20 years of experience in the IT business, Sherman is a published author of more than 50 articles, a frequent industry speaker, an Information Management Innovative Solution Awards judge and an expert contributor to both SearchBusinessAnalytics.com and SearchDataManagement.com. He blogs at The Data Doghouse and can be reached at firstname.lastname@example.org.