In-memory processing can serve as a high-octane fuel for supercharging big data analytics applications. But organizations should weigh factors such as additional systems infrastructure costs and the readiness of their business processes before gassing up with in-memory analytics technology.
Another key step in greasing the deployment skids is identifying big data analytics problems that have proven unsolvable or that could benefit from the performance boost typically provided by in-memory analysis applications.
"The integration of in-memory capabilities and big data boils down to use case and benefits," said Paul Barth, co-founder of data management and analytics consultancy NewVantage Partners. "You need to consider the business value of accelerating time to answer -- is it a matter of convenience, or is it a case when rapid turnaround and rapid analysis really benefits the decision-making process."
Detecting patterns in large stockpiles of data is one application where using in-memory analytics tools makes sense, Barth said, as are scenarios in which traditional business intelligence (BI) tools hit their limits on data volumes and processing speeds. He cited another example that favors in-memory technology: building an online recommendation engine, which can be accelerated by running its business rules engine and analytics algorithms in memory.
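The recommendation-engine example Barth describes boils down to keeping precomputed scores resident in memory so each request is served by a lookup rather than a database round trip. A minimal sketch of that idea, using hypothetical toy data and a made-up `recommend` function (not any specific vendor's product):

```python
# Hypothetical sketch: serving recommendations from an in-memory score
# table, rather than recomputing or fetching scores per request.

# Assumed toy data: user -> {item: affinity score}, built offline by an
# analytics job and loaded into memory at startup.
SCORES = {
    "alice": {"tent": 0.9, "stove": 0.7, "boots": 0.4},
    "bob": {"boots": 0.8, "tent": 0.2},
}

def recommend(user, top_n=2):
    """Return the top-N items for a user from the in-memory score table.

    Because the scores are already in memory, each request costs only a
    dictionary lookup plus a small sort -- no disk or network I/O.
    """
    scores = SCORES.get(user, {})
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    return [item for item, _ in ranked[:top_n]]

print(recommend("alice"))  # ['tent', 'stove']
```

The same shape scales up in real deployments: the offline analytics job refreshes the score table periodically, while the serving path stays memory-only.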
Personal touch triggers big data flood
At ContactLab, an email marketing services provider in Milan, Italy, the need for in-memory analytics capabilities became apparent when its business model shifted from broad-based marketing campaigns to a more individualized outreach approach, said Massimo Fubini, the company's founder and director. ContactLab, which manages an average of 60,000 to 70,000 email and outbound SMS messages daily, found itself faced with a big data challenge as it tried to sort through hundreds of millions of data points on click-throughs, website visits and other actions to analyze customer behavior and serve up relevant marketing messages on the fly.
Conventional BI tools worked fine up until that point, Fubini said. But the change in business strategy changed the analytics game as well and opened the door to the deployment of a Hadoop system that captures the data and feeds it into in-memory analytics software -- in this case, SAS Visual Analytics.
As part of the big data environment, ContactLab also collects data from a variety of other sources, including mobile apps, social media sites, transactional systems and external marketing information services. The plethora of data makes it harder for marketing managers and other executives at the company's clients to know what questions to ask. But Fubini said the SAS tool's combination of in-memory analytics and data visualization capabilities lets ContactLab's analysts explore the data and come up with insights nearly instantaneously.
"This world is really changing," he said. "In the past, people knew what data was available and would ask for specific analytics. Now the amount of data we're collecting is huge, and the requirements around analysis are much more interactive. You can't give someone an answer in a day or two."
Know your people
Knowing your user base is another factor in determining whether in-memory applications are the right fit for a big data analytics initiative. "It's a bit of a judgment call, so you need to understand if your users can take advantage of the additional performance," said William McKnight, president of McKnight Consulting Group. "If you have data scientists on staff, you don't want them sitting there drilling and drilling into data only to get frustrated [by slow response times] and walk away. With super-fast performance, you can give them the advanced analytics capabilities they need."
Business process maturity is another issue to consider. Tapping in-memory technology to deliver self-service capabilities to analytics users or as a means to accelerate the performance of big data analytics processes is an admirable goal -- but it's a lost opportunity if business users can't quickly initiate actions based on the analytical insights that the software produces.
"The question is, are your business systems ready to take the results from the data mining exercise," said Tapan Patel, global product marketing manager for predictive analytics and data mining at SAS. "If the end goal is to make quicker, better decisions and you're getting insights quickly, but your CRM system is not ready to execute on near-real-time alerts with price changes or customer offers, the value [of in-memory analytics] may not be achieved."
Cindi Howson, founder of BI Scorecard, a research and consulting company that publishes technical evaluations of BI and analytics tools, said in-memory analysis has a range of potential uses, from speeding up the performance of existing databases to enabling the addition of new visual data discovery capabilities. "In-memory should be part of everyone's analytical environment," she said. "The question is where and how?"
Beth Stackpole is a freelance writer who has been covering the intersection of technology and business for more than 25 years.