In-memory analytics is getting a lot of attention from business intelligence (BI) users and vendors alike these days, and that isn’t surprising. In-memory tools look at data sets stored entirely within a computer system’s memory instead of requiring information to be pulled from disk drives, with the potential to dramatically reduce I/O cycles and thus greatly improve query response times for business users.
In addition, some forms of in-memory analytics software can simplify the data analysis process by doing away with the need for the indexes, aggregated tables and multidimensional cubes commonly required by traditional data warehousing and BI applications. That can ease the data management workload for IT teams and make it easier to develop and run queries, opening more analytics activities to end users without specialized skills and creating opportunities to implement self-service BI capabilities.
BI vendors offer a variety of in-memory analytics tools with vastly different architectures, ranging from spreadsheet-based products to high-end platforms that can handle large amounts of data. But the common thread between them, according to BI industry analysts, is speed. “In-memory is for people who really need superb performance and very low latency -- but that is not most people,” said Gartner Inc. analyst Merv Adrian.
Joshua Greenbaum, principal analyst at Enterprise Applications Consulting in Berkeley, Calif., said the expanding interest in in-memory analytics is the result of several converging developments in the IT industry. Some software vendors pioneered in-memory processing as much as 10 years ago, but the technology met with limited success until recently, Greenbaum said. One thing that is now helping to boost its adoption is the fact that memory has become “insanely cheap,” he noted.
The shift toward 64-bit systems has also helped, since they can support vastly more than the 4 GB addressable memory limit of 32-bit machines. Also aiding the in-memory BI cause are high-performance but reasonably priced blade servers that work well for analytical processing and can be easily combined to produce powerful multi-node systems, according to Greenbaum. Yet another factor, he said, is the development and growing use of columnar databases as alternatives to conventional row-based relational databases.
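The 4 GB figure follows directly from pointer width: an N-bit address can name at most 2^N distinct bytes. A minimal back-of-the-envelope sketch (the function name is just for illustration):

```python
# An N-bit pointer can address at most 2**N distinct bytes,
# which is where the 32-bit 4 GB ceiling comes from.
def max_addressable_bytes(pointer_bits):
    return 2 ** pointer_bits

GIB = 2 ** 30  # bytes per gibibyte

print(max_addressable_bytes(32) // GIB)  # 4 GiB ceiling on 32-bit systems
print(max_addressable_bytes(64) // GIB)  # roughly 17 billion GiB in theory on 64-bit
```

In practice, operating systems and chipsets cap usable memory well below the theoretical 64-bit limit, but the order-of-magnitude jump is what makes holding entire data sets in RAM feasible.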
“Columnar technology has really matured, so putting that together with the hardware developments has created a perfect storm [for the in-memory approach],” Greenbaum said. He added that storing information in columns instead of rows enables substantial levels of data compression, making it more feasible to load large data sets into memory. It’s possible to run mainstream relational databases in memory as well, but Greenbaum said they don’t provide the increased compression efficiencies of columnar software.
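To see why column orientation compresses so well, consider a toy run-length encoding over a low-cardinality column. The sample table, column names and encoding scheme below are illustrative assumptions, not any particular vendor's storage format:

```python
# Minimal sketch of why column-oriented storage compresses well.

def run_length_encode(values):
    """Collapse consecutive repeats into (value, count) pairs."""
    encoded = []
    for v in values:
        if encoded and encoded[-1][0] == v:
            encoded[-1] = (v, encoded[-1][1] + 1)
        else:
            encoded.append((v, 1))
    return encoded

# Row-oriented layout: each record is stored whole, so repeated
# column values are interleaved with other fields and compress poorly.
rows = [
    ("2023-01-01", "EAST", 100),
    ("2023-01-01", "EAST", 250),
    ("2023-01-01", "WEST", 75),
    ("2023-01-02", "WEST", 80),
]

# Column-oriented layout: all values of one column sit together,
# so low-cardinality columns collapse into a few (value, count) pairs.
dates   = [r[0] for r in rows]
regions = [r[1] for r in rows]

print(run_length_encode(dates))    # [("2023-01-01", 3), ("2023-01-02", 1)]
print(run_length_encode(regions))  # [("EAST", 2), ("WEST", 2)]
```

A real columnar engine layers dictionary encoding, bit-packing and similar tricks on top of this, but the principle is the same: repetitive columns shrink dramatically, so far more of the data set fits in memory.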
Bring on the data for in-memory analytics
Because of the potential advantages of in-memory analytics, Greenbaum is seeing interest in the approach from large retailers, financial services firms and utilities as well as the U.S. military and other government agencies. For such organizations, “the use case is lots of data and the need to efficiently and quickly process that data,” he said.
Forrester Research Inc. analyst Boris Evelson thinks in-memory analytics is an important element in a broader trend toward making BI more central to how databases are designed, implemented and used. In addition to enabling business users to get information “at the speed of thought,” in-memory applications that minimize or even eliminate the need to aggregate data and build schemas can foster a more agile analytical process that is better able to adapt to changing business requirements than traditional BI is, Evelson said.
With such tools, “you bypass all those steps” involved in optimizing data for analysis and reporting, he added. “Your reports and dashboards are one and the same as the data models. Just by changing the reports or dashboards, you change the data model -- it isn’t separate.”
However, Evelson cautioned that the functionality offered by in-memory analytics vendors ranges widely. Before making any buying decisions, organizations should be sure they know about the different categories of in-memory tools and what each is best suited for, he said. He also recommends asking vendors about specific product features and whether IT will need to be involved in the setup and maintenance process.
A frequent point of confusion about in-memory business intelligence is that it involves more than just eliminating disk drives, Evelson said. “One of the most common questions I get is whether in-memory can be accomplished by simply running analytics on solid-state drives. That will certainly be faster than relying on disk, but you’re retaining all the data modeling and all the I/O steps.”
Even so, IT and BI managers shouldn’t discount solid-state drives as a potential option, said Julie Lockner, an analyst at Enterprise Strategy Group in Milford, Mass. “You could get the answer you need in microseconds from an in-memory system, but running a similar query through solid-state drives might get you the response in milliseconds,” she said. And for some organizations, Lockner added, that likely will be good enough.
Alan R. Earls is a Boston-area freelance writer focused on business and technology.