Last month, former NBA player and current television basketball analyst Charles Barkley made some telling comments about his perception of analytics. In criticizing Daryl Morey, the general manager of the Houston Rockets who is vocal about basing many of his player decisions on data analysis, Barkley described Morey as "one of those idiots who believe in analytics."
Barkley isn't alone in his understanding of how analytics works. At the MIT Sloan Sports Analytics Conference held in Boston in February, there was plenty of talk about whether to "believe" in analytics. ESPN SportsCenter anchor John Anderson made a point of stating several times that he does believe in analytics and that he "wants to convert people and make them believe."
I wasn't aware data analysis was something in which you could either believe or disbelieve. It's odd to think that the validity of data analysis methods that have been around for hundreds of years and been proven too many times to count would still be up for debate.
But the sentiments expressed by these commentators show how the public understands data analysis -- almost as something magical that requires belief -- and how much work data scientists need to do to improve the public's understanding.
Doubts come from the C-suite
It's not just the jocks who are confused. Data science teams within businesses often express frustration that their analyses are only considered as one factor among many in the decision-making process. Rather than making truly data-driven decisions, executives still defer to the "highest paid person's opinion."
The unfortunate thing is that it only takes one analytics misfire for some executives to lose their faith. Yet misfires are inevitable. Even the strongest data science teams can't be 100% accurate. The goal of basing decisions on data is simply to make the right decision more often than you would when relying on intuition. But even if a predictive model points to the right decision 75% of the time, the remaining 25% of cases could give it a black eye within the organization.
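The trade-off above can be made concrete with a quick simulation. This is an illustrative sketch with assumed numbers (a model that is right 75% of the time versus intuition that is right 60% of the time), not data from any real organization:

```python
import random

random.seed(42)

def simulate(accuracy, n_decisions=1000):
    """Count correct calls for a decision process with the given hit rate."""
    return sum(random.random() < accuracy for _ in range(n_decisions))

# Assumed, illustrative hit rates: 75% for the model, 60% for gut feel.
model_hits = simulate(0.75)
gut_hits = simulate(0.60)
model_misses = 1000 - model_hits

print(f"Model:     {model_hits} right, {model_misses} wrong")
print(f"Intuition: {gut_hits} right")
```

The model comes out well ahead on net, yet over 1,000 decisions it still produces roughly 250 visible misses, and each one is an opportunity for skeptics to lose faith.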
Executive support is key for analytics programs, and there are ways data science teams can improve executives' understanding of analytics and, with it, their support. First, they need to get better at communicating their findings. Rather than simply telling a decision maker that a certain strategy is the correct option because the data say so, they should explain why the data support the recommendation.
At the TDWI Executive Summit held in Las Vegas in February, Mike Lampa, managing partner of Archipelago Information Strategies, recommended hiring a data journalist -- someone who understands analytics concepts but whose primary skill is communication. It shouldn't always be the responsibility of data scientists to explain their work.
Own up to data's parameters
Analytics teams should also admit the limitations of their findings. Executives will feel less let down by a model that fails to deliver the right recommendation if they understand at the outset that there is a quantifiable chance the model will be wrong.
Acknowledging biases and limitations in datasets is also important. The underlying data analysis methods may be perfectly sound, but these factors can limit the effectiveness of any analysis, and they don't always get discussed. Accounting for this kind of limitation up front can help build trust with executives and hedge against analytics misfires.
Data scientists are already asked to do a lot these days, but maybe it's time to add one more job qualification: educator. The fact is, much of the public today does not fully grasp what data analysis is all about. Like it or not, data scientists need to put in greater effort to ensure everyone understands why their work is valuable.