Technical issues are only part of the equation when it comes to deploying advanced business and data analytics software, according to industry analysts.
Quite often, whether an analytics project succeeds or fails depends on human factors – for example, whether an organization’s IT department collaborates sufficiently with the business users who will rely on the data analytics tools, and whether those users feel comfortable with the software once it’s up and running.
“You can’t deploy these technologies without considering people and process issues,” said Gartner Inc. analyst Rita Sallam. In fact, Sallam’s take is that meeting the needs of users is more important than the particular features and capabilities of whatever analytics technology is chosen for a project.
“To me, the technology itself is almost irrelevant – it’s almost a red herring,” she said. “Business users couldn’t care less about in-memory analytics. What they care about is being able to rapidly and intuitively analyze large amounts of data.”
Buying and installing a particular kind of business analytics software simply because it’s the latest and greatest technology on the market is a recipe for failure, Sallam added. To avoid problems, she advised, advanced data analytics tools have to be deployed in close partnership with end users.
“A lot of times in the past, the IT department has said, ‘Oh, we need a data warehouse,’ and they just went off and built the biggest, baddest data warehouse ever without talking to business users,” Sallam said. “And then nobody ever used it.”
Keep the lines of communication open on data analytics tools projects
Howard Dresner, a former Gartner analyst who now is president of Dresner Advisory Services LLC in Nashua, N.H., agreed with Sallam. IT departments and business intelligence (BI) teams that support deployments of analytic applications must communicate closely and openly with business users as technology decisions are made, he said.
“IT needs to be responsive. They need to be joined at the hip with the end user,” said Dresner, who coined the term “business intelligence” while working for Gartner.
Of course, business analytics tools must also meet an organization’s overall needs and requirements to justify investments in them – a fact that isn’t lost on Sri Vemparala, manager of reporting and BI at Stanford University in Palo Alto, Calif. New technologies are always alluring – but Vemparala said that at Stanford, where the BI group supports the university’s admissions, research and finance operations, in-memory analytics and other advanced data analytics tools aren’t really needed yet.
“I would say 80% of our BI is operational reporting at this point,” Vemparala said, adding that technologies such as in-memory analytics would be “a step beyond that.” And while Vemparala is interested in exploring ideas for taking Stanford’s BI program to the next level via analytics and performance metrics, he’s only looking for now.
He said that would likely change only if an analytics software vendor proved that a technology could deliver significant value to the university and be deployed easily – via a bundled appliance, for example.
Success with data analytics tools requires proper data management
In addition to ensuring that business analytics technology is the right fit for specific end users and an organization as a whole, IT and BI teams need to make sure that they’ve fully addressed data management issues before deployment. In the case of in-memory analytics, data governance policies need to be put in place in order to ensure that data definitions, dimensions and calculations are consistent, Sallam said.
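One narrow way to picture the consistency requirement Sallam describes is a pre-load check that a shared calculation is defined the same way across source systems. The sketch below is purely illustrative – the field names, the "revenue" formula and the tolerance are assumptions for the example, not details from any governance program mentioned in this article.

```python
# Illustrative sketch: verify that a governed calculation ("revenue")
# yields consistent results across two data sources before loading
# them into an analytics store. All field names are assumptions.

def revenue(record):
    # The canonical calculation agreed on under the governance policy.
    return record["units"] * record["unit_price"] - record["discount"]

def check_consistency(source_a, source_b, key="order_id", tolerance=0.01):
    """Return the keys of records whose revenue differs between sources."""
    b_by_key = {r[key]: r for r in source_b}
    mismatches = []
    for rec in source_a:
        other = b_by_key.get(rec[key])
        if other is not None and abs(revenue(rec) - revenue(other)) > tolerance:
            mismatches.append(rec[key])
    return mismatches

# Example: order 1002 carries a different discount in the second source.
a = [{"order_id": 1001, "units": 2, "unit_price": 10.0, "discount": 1.0},
     {"order_id": 1002, "units": 5, "unit_price": 4.0, "discount": 0.0}]
b = [{"order_id": 1001, "units": 2, "unit_price": 10.0, "discount": 1.0},
     {"order_id": 1002, "units": 5, "unit_price": 4.0, "discount": 2.0}]
print(check_consistency(a, b))  # [1002]
```

In practice such checks would run against the real data pipeline rather than in-memory dictionaries, but the principle – agree on one definition, then test every source against it – is the same.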
Proper change management procedures are also critical, from both technology and business-process standpoints, she added. That means putting enough thought and resources into training and supporting end users so that an analytics investment pays off in terms of adoption, usage and business results.
Sallam cited a 2006 Gartner case study on a business activity monitoring project at Euro Disney, which operates the Disneyland Paris theme park, as an example of an organization successfully deploying BI and data analytics tools thanks to effective change management. The BI system was designed to predict and then monitor the length of lines at the park’s rides and restaurants; when problems were identified, more workers were sent to the affected locations, helping to boost customer satisfaction, she said.
Instead of trying to make those staffing decisions based on experience or managerial intuition, park administrators learned to trust what the BI and analytics data was telling them – a shift in organizational culture that Sallam said Euro Disney was able to instill as part of the project.
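The monitoring loop described in the case study – predict wait times, watch the actual lines, and flag locations that fall behind – can be sketched in a few lines. This is a hypothetical illustration only; the location names, wait values and tolerance threshold are invented for the example and are not drawn from the Euro Disney project.

```python
# Hypothetical sketch of a threshold-based monitoring check:
# compare observed wait times against predictions and flag
# locations that need additional staff. All values are invented.

def flag_understaffed(predicted_waits, observed_waits, tolerance_minutes=5):
    """Return locations whose observed wait (in minutes) exceeds the
    predicted wait by more than the allowed tolerance."""
    flagged = []
    for location, predicted in predicted_waits.items():
        observed = observed_waits.get(location, 0)
        if observed - predicted > tolerance_minutes:
            flagged.append(location)
    return flagged

predicted = {"Space Ride": 20, "Cafe": 10, "Carousel": 15}
observed = {"Space Ride": 32, "Cafe": 12, "Carousel": 14}
print(flag_understaffed(predicted, observed))  # ['Space Ride']
```

The cultural shift Sallam describes is precisely about letting an automated check like this, rather than managerial intuition, trigger the staffing decision.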
Corporate BI standards could affect choices of data analytics tools
Businesses that have adopted a specific BI suite as a corporate standard should carefully consider the implications of buying data analytics technology that’s outside of their designated standard, according to Rick Sherman, founder of Stow, Mass.-based consulting firm Athena IT Solutions.
“If they do that, then they need to look at what the issues would be of having another technology and another data stack,” Sherman said. He explained that the need to coordinate data between different BI and data analytics tools could create complications for IT and end users alike.
Allowing ample time to make sure that all of the kinks have been worked out before any new technology is deployed sounds like simple advice – but it’s something that many organizations overlook on BI and analytics projects, said Mark Smith, an analyst at Ventana Research in Pleasanton, Calif.
“You have to do a certain level of testing,” Smith said. “Some of that’s obvious, but some [people] think you can deploy the technology right away.”