In-memory processing for analytics: Can you make the business case?

In-memory analytics promises streamlined data analysis processes for business users. But, analysts caution, the numerous in-memory technology options require careful planning and evaluation.

Once you’ve decided that in-memory analytics could be a valuable technology for your organization, the next steps are building a business case and then evaluating and selecting software.

While in-memory processing has great potential for business intelligence (BI) uses, Boris Evelson, an analyst at Forrester Research Inc. in Cambridge, Mass., warns that organizations should take a hard look at whether it’s really necessary for their operations. “If you’re at a traditional, fairly slow-moving business, where things don’t change very fast, you may not need in-memory,” he said. “However, if your business is fast-paced and changes frequently -- particularly in front-office areas such as marketing -- in-memory can be very helpful.”

Evelson recommends thinking of the business case for an in-memory BI project partly in terms of increased self-sufficiency for business users. Everything that can be done with in-memory tools can also be done with traditional data warehousing and BI systems, he said -- but in most cases, the latter approach requires more formalized processes and greater involvement on the part of IT, BI and data warehousing professionals.

More on evaluating analytics technologies

Learn about some of the key issues to consider in planning big data analytics projects

Read tips from consultant William McKnight on selecting BI tools for your organization

Get advice from IT pros and consultants on building a business case for real-time BI

“What is the value to your organization,” Evelson asked, “of not having to engage a professional programmer [to work on an analytics project]? And what are the things he or she could do if they weren’t tied up with that traditional analytics technology?”

Going from those kinds of considerations to selecting in-memory tools can be complicated. In a recent Forrester report, Evelson pointed out the wide variations in functionality and features offered by in-memory analytics vendors and said it’s important to understand the distinctions between the different types of technologies as well as their pros and cons.

Some of the specific issues he cited for evaluation include the extent to which different products are able to handle data sets too large for a given memory space; whether an in-memory application’s database can be accessed by BI tools from other vendors; and whether in-memory software uses the multidimensional cubes common in traditional online analytical processing or instead contains a fully loaded index “that does not require a fixed OLAP model.”
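
To make that last distinction concrete, here is a minimal sketch in Python (the sample data and the aggregate function are hypothetical illustrations, not drawn from any vendor's product): a traditional OLAP cube precomputes totals along fixed dimensions, while an in-memory engine that holds a fully loaded copy of the detail data can answer new groupings on demand, with no cube rebuild.

    from collections import defaultdict

    # Hypothetical sales records held entirely in memory
    rows = [
        {"region": "East", "product": "A", "quarter": "Q1", "sales": 120},
        {"region": "East", "product": "B", "quarter": "Q1", "sales": 80},
        {"region": "West", "product": "A", "quarter": "Q2", "sales": 200},
    ]

    # OLAP-cube style: aggregates precomputed over fixed dimensions;
    # a question outside (region, product, quarter) means a rebuilt cube
    cube = defaultdict(int)
    for r in rows:
        cube[(r["region"], r["product"], r["quarter"])] += r["sales"]

    # In-memory style: keep the detail rows and aggregate on demand,
    # so any ad hoc grouping works without a fixed OLAP model
    def aggregate(rows, dims):
        totals = defaultdict(int)
        for r in rows:
            totals[tuple(r[d] for d in dims)] += r["sales"]
        return dict(totals)

    print(cube[("East", "A", "Q1")])    # fixed-cube lookup: 120
    print(aggregate(rows, ["region"]))  # ad hoc roll-up by region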

A proper mix for in-memory processing
To ensure that an in-memory processing investment delivers useful analytical results, the system should provide business users with rapid access to data across a variety of sources, according to Rob Fiorillo, senior vice president for BI at Cincinnati-based consulting firm Itelligence Inc. “It should be very open and not focused on just one data source,” Fiorillo said.

He added that in-memory business intelligence software should be tightly integrated with an organization’s conventional BI tools to provide a combination of analytics capabilities. And when comparison shopping among in-memory products, Fiorillo suggested looking closely at their data compression ratios; companies “should expect to get anywhere from a 20:1 to 50:1 reduction” in space requirements via compression, he said.
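
Those ratios translate directly into memory sizing. As a rough, back-of-the-envelope sketch (the 1 TB starting figure below is a hypothetical example, not a number from Fiorillo):

    # Rough memory sizing from a vendor's claimed compression ratio;
    # the raw data volume here is a hypothetical example
    raw_gb = 1000  # 1 TB of source data

    for ratio in (20, 50):
        print(f"At {ratio}:1 compression, {raw_gb} GB needs ~{raw_gb / ratio:.0f} GB of RAM")

    # So 1 TB of raw data would occupy roughly 20 GB to 50 GB in memory,
    # a figure that drives how much server RAM the project must budget for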

Julie Lockner, an analyst at Enterprise Strategy Group in Milford, Mass., said that most of the organizations she has had contact with are trying to push the BI envelope and become faster at analyzing data and more responsive to business developments. But she added that in-memory analytics projects need to be thought through, beyond simply saying the technology makes sense for an organization. “Your business requirements should come with cost justifications before you even consider in-memory,” Lockner said.

When it comes to implementation, Lockner advised structuring an in-memory analytics project much as you would any other development program, with agility as a key goal. She recommended using agile development methodologies that deliver features and functionality to end users incrementally, rather than the traditional waterfall approach. To tackle the deployment with maximum effectiveness, the project team needs to be skilled in programming as well as analytics, architecture and perhaps even statistical analysis, she said.

However, Lockner noted that people who have all the required skills don’t come cheap: Typically, they can command salaries of $150,000 and up, or hundreds of dollars per hour for external consultants. Like other types of BI initiatives, she said, in-memory analytics deployments can get expensive -- another good reason to build a solid business case before getting started.

Alan R. Earls is a Boston-area freelance writer focused on business and technology.
