Data scientists apply machine learning and artificial intelligence algorithms to ever-increasing mounds of data in an attempt to tease out gold nuggets of information that can be used for business benefit. But while there are case studies in which companies boast about their analytics achievements, one might ask how successful the data science practice is in general.
An interesting perspective on that can be found in a December 2016 Harvard Business Review article titled, "Why You're Not Getting Value from Your Data Science." In the article, author Kalyan Veeramachaneni shared a story that sheds some light on the ability of organizations to take advantage of their data science models and programs.
Veeramachaneni recounted that, during a panel discussion on machine learning, he first asked the 150 audience members if they had built a machine learning model, and about one-third raised their hands. He then asked how many had used a model to generate business value and gone on to quantify the value that was produced. This time, no hands were raised. His interpretation was that analytics professionals typically spend most of their time at work building and tuning models, and very little, if any, time examining how their analyses could be targeted at solving specific business problems.
In retrospect, that isn't an uncommon operational pattern. If we were to replace references to data science, analytics, machine learning and artificial intelligence with ones to any other emerging technology within recent memory, we'd probably find similar results. There's a predisposition for technologists to focus on using technology for its own sake rather than focusing on how it can be wielded for business value. In many cases, project plans are adopted that focus more on achieving milestones for installing and implementing a technology than on ones for achieving business results.
Veeramachaneni, who is a principal research scientist at MIT's Laboratory for Information and Decision Systems, provided some suggestions for applying data science models to achieve positive business benefits. They included focusing on simpler models; broadening the number of business issues an analytics team explores; trying to glean useful information from data samples, instead of entire data sets; and automating repeatable data processing steps to help reduce the time it takes to build models.
Small things pay off in data science
Those recommendations all point to a potential strategy for boosting data science effectiveness by applying tenets of the 80/20 rule -- in this case, under the premise that a large percentage of the potential benefits come from a relatively small investment of time and effort in individual analytics projects. Here are some tactical steps to take as part of such a strategy.
Triage your analytics projects upfront. Devise a method for identifying, cataloging and assessing business issues in terms of the potential positive impact that analytics applications can have. That will help to find and prioritize opportunities for gaining real business benefits from your data science initiatives.
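One way to make such a triage method concrete is a simple scoring model. The sketch below is an illustration, not a prescription: the project names, the 1-to-5 scales and the impact-times-feasibility-over-effort formula are all invented assumptions, to be replaced with whatever criteria fit your organization.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A business issue proposed as an analytics project."""
    name: str
    impact: int        # estimated business impact, 1 (low) to 5 (high)
    feasibility: int   # data availability and modeling feasibility, 1 to 5
    effort_weeks: int  # rough effort estimate

    def score(self) -> float:
        # Favor high-impact, feasible projects that need little effort.
        return (self.impact * self.feasibility) / self.effort_weeks

# Hypothetical backlog of business issues to triage.
backlog = [
    Candidate("Reduce customer churn", impact=5, feasibility=3, effort_weeks=8),
    Candidate("Optimize ad spend", impact=4, feasibility=4, effort_weeks=4),
    Candidate("Forecast warehouse demand", impact=3, feasibility=5, effort_weeks=6),
]

# Highest-scoring issues first: the triage order for the analytics team.
for c in sorted(backlog, key=Candidate.score, reverse=True):
    print(f"{c.score():.2f}  {c.name}")
```

Even a crude score like this forces the conversation about impact and effort to happen before modeling starts, which is the point of the triage step.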
Focus on measurable business objectives. For each selected business issue, specify quantifiable goals for business improvements or process optimization that the analytics efforts are intended to help achieve. Some simple examples include increased revenue, higher sales margins, reduced costs and faster business cycles. More specific ones could involve increasing the effectiveness of online advertising or better segmenting customers for personalized marketing.
Be practical in building data science models. Instead of spending all your time creating ever-more complex analytical algorithms or scaling predictive models to fit massive amounts of data, concentrate on how simpler models can be applied faster to more problems. Getting reasonable benefits from three analytics projects may be significantly better than getting perfect results from one.
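A minimal sketch of that "simple first" mindset: before investing in complex algorithms, reuse one trivially simple model -- here, a majority-class baseline -- across several problems to establish benchmarks quickly. The problem names and label samples below are invented for illustration.

```python
from collections import Counter

def majority_baseline(train_labels, test_labels):
    """Fit the simplest possible classifier -- always predict the most
    common class in the training labels -- and report its accuracy."""
    prediction = Counter(train_labels).most_common(1)[0][0]
    correct = sum(1 for y in test_labels if y == prediction)
    return prediction, correct / len(test_labels)

# Hypothetical label samples from three different business problems.
problems = {
    "churn": (["stay", "stay", "leave", "stay"], ["stay", "leave", "stay"]),
    "fraud": (["ok", "ok", "ok", "fraud"], ["ok", "ok", "fraud"]),
    "late_delivery": (["on_time", "late", "on_time"],
                      ["on_time", "on_time", "late"]),
}

for name, (train, test) in problems.items():
    pred, acc = majority_baseline(train, test)
    print(f"{name}: always predict '{pred}', accuracy {acc:.0%}")
```

A baseline like this takes minutes per problem and tells you which of the three projects a slightly smarter model is most likely to improve on -- often more valuable than perfecting one model in isolation.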
Adapt analytics results to different business units. Establish working relationships with the business teams that are expected to benefit from the analyses, and then devise the right methods for presenting analytics results to each of them to help motivate more informed business decisions and strategies that can generate the anticipated business benefits. If you generate results that aren't actionable, they're effectively worthless.
Know when to bail. Use concrete metrics as a gauge for continued investments of time and resources. If you aren't getting useful results, abandon the effort. Do the same if you are getting actionable results but don't see how the business outcomes will ever meet the expected goals -- or how the intended business users can be convinced to operationalize the results in the first place.
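That gauge can be as simple as a rule comparing measured results against the stated goal. The sketch below assumes an invented policy -- give a project at least three measurement periods, then cut it if measured uplift is negative or below half the goal; those thresholds are placeholders for whatever your team agrees on.

```python
def review_project(goal_uplift: float, measured_uplift: float,
                   periods_measured: int, min_periods: int = 3) -> str:
    """Decide whether to keep funding an analytics project based on
    measured business results versus the quantified goal."""
    if periods_measured < min_periods:
        return "continue"  # not enough evidence to judge yet
    if measured_uplift <= 0:
        return "abandon"   # no useful results
    if measured_uplift < 0.5 * goal_uplift:
        return "abandon"   # actionable, but unlikely to ever meet the goal
    return "continue"

# A project aiming for a 10% revenue uplift that has delivered only 2%
# over four periods is cut; 7% against the same goal keeps its funding.
print(review_project(goal_uplift=0.10, measured_uplift=0.02, periods_measured=4))
print(review_project(goal_uplift=0.10, measured_uplift=0.07, periods_measured=4))
```

Writing the bail-out rule down before the project starts makes abandoning a weak effort a routine decision rather than a contentious one.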
To effectively apply data science techniques, analytics managers need to blend the skills of data scientists with the functionality of analytics tools to create a factory -- one that generates data science models that can be rapidly put into a production environment with clear business benefits in mind. But proper oversight must be applied to maintain control of the data science factory, and to ensure that it doesn't spiral downward into an unproductive process that produces models with little business impact.