With pressure growing by the day to make their products quicker to deploy and easier to use, business intelligence (BI) and data warehouse vendors are increasingly turning to in-memory technology in place of traditional disk-based storage to speed up implementations and extend self-service capabilities.
QlikTech, for example, claims that by utilizing in-memory storage, its flagship BI app, QlikView 9, can refresh data in real time to support operational BI environments, such as monitoring financial transactions, and can be deployed in a matter of days, giving analysts and non-power users quick access to timely analysis with little IT assistance.
TIBCO Spotfire, meanwhile, recently released the latest version of its in-memory BI platform, while mega-vendors SAP and IBM have in-memory BI offerings of their own -- SAP NetWeaver Business Warehouse Accelerator and Cognos Now!, respectively. Even Microsoft is getting in on the action, with in-memory capabilities expected to be part of its Project Gemini release next year.
Traditional BI technology loads data onto disk, often in the form of intricately modeled tables and multidimensional cubes, which can take weeks or months to develop. Queries are then made against the tables and cubes on disk. In-memory technology removes these steps, as data is loaded into random access memory and queried in the application or database itself. This greatly increases query speed and lessens the amount of data modeling needed, experts agree, meaning that in-memory BI apps can be up and running significantly faster than disk-based tools.
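The contrast described above can be sketched with SQLite, which supports both storage models through the same API. This is an illustrative example, not drawn from any vendor mentioned in the article: the only difference between the two modes is the connection string, but the in-memory database keeps every page in RAM, so queries never touch disk.

```python
import sqlite3

def build_and_query(conn):
    """Create a small table, load rows, and run an aggregate query."""
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("east", 100.0), ("west", 250.0), ("east", 75.0)],
    )
    cur = conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
    )
    return cur.fetchall()

# Disk-based: data pages live in a file and are read back from disk
# on the query path (filename here is hypothetical):
# disk_conn = sqlite3.connect("sales.db")

# In-memory: identical schema and SQL, but all pages stay in RAM,
# so there is no disk I/O when the query runs.
mem_conn = sqlite3.connect(":memory:")
print(build_and_query(mem_conn))  # [('east', 175.0), ('west', 250.0)]
```

On a toy data set the difference is invisible; the point is that the query path changes while the application code does not, which is roughly the trade-off the in-memory BI vendors are making at much larger scale.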
In-memory technology is emerging now thanks both to increased customer demand for fast, flexible operational BI and data analysis capabilities and to technological innovation -- specifically, the emergence of 64-bit processors, said Anthony Deighton, senior vice president for marketing at QlikTech.
Sixty-four-bit processors, which began to replace 32-bit processors in personal computers earlier this decade, significantly increased the amount of data that could be stored in-memory and ultimately helped reduce the price of memory, which traditionally had been much more expensive than disk, Deighton said, spurring its use in enterprise applications.
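The arithmetic behind that shift is simple: addressable memory is bounded by pointer width. A quick back-of-the-envelope calculation (not from the article) shows why 32-bit systems were a hard ceiling for in-memory analytics:

```python
def addressable_gib(pointer_bits: int) -> float:
    """Maximum bytes addressable with a pointer of the given width, in GiB."""
    return 2**pointer_bits / 2**30

# A 32-bit process can address at most 4 GiB -- too small for
# enterprise-scale data sets held entirely in RAM.
print(addressable_gib(32))   # 4.0

# A 64-bit address space is 2**34 GiB (16 EiB), so the practical
# limit becomes how much physical RAM you can afford, not the CPU.
print(addressable_gib(64))   # 17179869184.0
```

In practice operating systems reserve part of the address space, so usable memory is somewhat lower, but the four-orders-of-magnitude jump is what made multi-gigabyte and, eventually, multi-terabyte in-memory stores feasible.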
In-memory BI technology could prove a particular boon to operational workers, like call center representatives who need near-instant access to data while speaking with a customer, for example, or warehouse managers monitoring up-to-the-minute inventory levels, according to Philip Russom, senior manager of research and services at The Data Warehousing Institute. "These people need to make these kinds of decisions in only a few minutes," Russom said.
But besides operational BI, there are other use cases for in-memory software, he said. Corporate performance management dashboards augmented with in-memory technology, for example, can be more easily updated throughout the day with fresh data, he said, either at predetermined times or on demand.
In-memory technology can also improve data warehouse performance, Russom said, especially when faced with unpredictable, ad hoc queries against large data sets -- a particularly popular activity among power users and business analysts. A number of data warehouse vendors have adopted in-memory technology for just such purposes, and more may follow.
"We should expect to see more in-memory database functions that relate directly to data warehousing, like very large multidimensional cubes and star schema in memory -- eventually multi-terabyte enterprise data warehouses in-memory -- updated via streaming data, plus outbound data-event-driven alerts," Russom said.
One database vendor that has embraced in-memory technology is Kognitio, a U.K.-based data warehouse appliance vendor with a small but growing U.S. presence. The company's WX2 data warehouse appliance uses in-memory technology along with a massively parallel processing (MPP) architecture to query and return results on multi-terabytes of data in a matter of seconds, according to John Thompson, the company's general manager of U.S. operations.
"The laws of physics do apply. You can pull data off spinning disks only so fast," Thompson said. "Once you have a database in-memory, you can get query response times down to 200ths of a second."
British Telecom, the U.K.-based telecommunications giant, has been using WX2 for more than a decade to analyze customer behavior and set pricing plans. According to Arthur Winn, head of business pricing at BT, the company's analysts use SAP BusinessObjects front-end tools to query the Kognitio in-memory data warehouse appliance to test what-if pricing scenarios.
That is how BT developed its Business OnePlan Plus offering, which allows customers free intra-business calling, Winn said. To predict the revenue impact of the plan, BT analysts queried thousands of customer call records -- 3.5 terabytes of data dating back 15 months -- stored in-memory on WX2 without IT assistance and received responses in near-real time.
Developing pricing plans, as done at BT, doesn't necessarily require the instant query results that in-memory technology allows, but some operational BI environments do. Even outside those environments, faster responses have a positive impact on worker performance, Russom said. "If you have to take a coffee break for every run of that query [before results are returned], you're just going to lose your train of thought."
Ultimately, Russom expects in-memory to overtake disk as the storage technology of choice for most BI tools and data warehouses as user demand for ever-faster query response time increases and the technology itself becomes even less expensive.
Organizations that have already invested in BI technology are therefore likely to see in-memory capabilities added to their existing tools in coming months and years, Russom said. And he suggested that those organizations evaluating BI vendors should take in-memory capabilities into account "if you know the BI platform will be part of a solution for operational BI or an OLAP application that demands really fast query response."
"People like to do in-memory processing to avoid the input-output problem [of disk-based queries]," Russom said. "In-memory is far, far faster."