
Flexibility of in-memory business intelligence requires some vigilance

In-memory analytics tools give business users more control over how they use business intelligence data. But IT and BI pros need to be on guard to ensure that problems don't result, analysts warn.

In-memory analytics holds out the promise of powerful data analysis capabilities for business users, with little need for IT support in building queries and creating reports. But there is a possible dark side to the in-memory business intelligence (BI) equation to keep in mind, according to analysts.

It can be summed up this way: with the added capabilities comes increased responsibility for end users.

The schema-less approaches supported by some in-memory analytics tools offer users lots of flexibility for querying and analyzing data, said Forrester Research Inc. analyst Boris Evelson. In traditional BI applications in which data is pulled from disk drives, the fixed data models, star schemas and aggregated tables built by IT and data warehousing teams control how information can be utilized as part of the analytics process. By contrast, Evelson compares in-memory analytics to the use of Excel spreadsheets.


“It’s up to you how you use it,” he said, referring to business users. “But if you don’t know what you’re doing, it can be a problem. You can make mistakes.”

Products at the high end of the in-memory processing scale can further exacerbate that issue. With them, “the real challenge is fundamentally that this kind of firepower and the quantity of data it can handle require companies to take a different look at their business analytics processes,” said Joshua Greenbaum, principal analyst at Enterprise Applications Consulting in Berkeley, Calif. He added that for many organizations, dealing with cultural issues can be the most difficult aspect of successfully using in-memory software.

Often, Greenbaum noted, companies don’t have a huge base of experience for dealing with the potential ramifications of in-memory BI. He cited the example of a large water and sewer utility with which he recently worked as part of a consulting engagement. The organization currently doesn’t use “smart” meters, but it plans to implement them in the near future. “Once they have that technology in place, with more data and more up-to-date data, the question is what will they measure and what will they do from an analytics standpoint,” he said. “It’s not intuitive -- you must think about it.”

Putting in-memory power to good use
The same question applies to the adoption of in-memory analytics, Greenbaum said: “What do we do now? That is the real crux of the problem. With in-memory, you have to figure out what this tremendous power means to running your business.”

And the decision to implement in-memory analytics applications should be tied to specific business problems, said Mike Ferguson, managing director of Intelligent Business Strategies Ltd., a U.K.-based research and consulting firm that focuses on BI, data management and data integration technologies.


“I think the use of in-memory analytics is still in its early days -- it’s not what you would call mainstream yet,” Ferguson said. But he sees in-memory technology as a natural next step for many BI applications, such as complex event processing (CEP) systems that capture real-time or near-real-time streams of data. In-memory BI tools that avoid round-trip I/O excursions to disks can enable users to “analyze data faster and be more automated and responsive, which could yield a significant return on investment,” he said.

In a manufacturing company, for example, if a large customer cancels or changes an order, that can have implications for production schedules and inventory levels. “It may force you to reschedule work for other customers, so being able to respond quickly may avoid potential business disruptions,” Ferguson said.

However, he pointed to another potential problem: In many cases, it’s hard to determine in advance the full value that an in-memory analytics deployment will provide to an organization. “You couldn’t necessarily say how much retail banking risk management is improved by in-memory, though it is likely to make a contribution,” Ferguson said, adding that he sees the technology “as part of the natural, evolutionary improvement in database and analytic practice rather than as an instant game changer.”

The big picture is that in-memory deployments generally are doable, Evelson said. But even though in-memory tools should be more user-friendly than conventional BI software, he thinks it’s still a good idea for IT to keep its hand firmly on the tiller on issues such as security and data cleansing.

Alan R. Earls is a Boston-area freelance writer focused on business and technology.
