Two years ago, Bill Powell and other executives at Automotive Resources International got some gentle prodding from one of the company's vehicle fleet management customers regarding shortcomings in ARI's analytics capabilities.
The customer, a utility company, was looking for more detailed information to help it identify steps that could be taken to cut vehicle costs and increase efficiencies in its fleet operations, according to Powell, ARI's IT director. The analytical reports that ARI was providing "didn't go as deep as they wanted," he said. "A lot of the reports were more aggregate because they took so long to run."
There was no lack of data available to analyze. ARI collects information on fuel consumption, idling time, driving speeds and other metrics every few minutes from GPS devices installed in vehicles. The Mount Laurel, N.J., services provider also receives volumes of data on gas purchases, which offers further insight into fuel efficiency and costs. "You can imagine the volume of information that's coming back," Powell said. The problem was that ARI's reporting and analytics infrastructure wasn't up to the challenge of processing all the incoming data on a timely basis, forcing the company to run many of the reports overnight and limiting what it could squeeze into them.
The customer complaints prompted ARI to take a hard look at its analytics systems. Already a user of SAP AG's BusinessObjects business intelligence (BI) software, the company turned to SAP for help in devising a more robust architecture that could deliver more detailed reports to its corporate clients -- and in real time, using a self-service approach that would enable customers to analyze data on their own. That ultimately led to a deployment of SAP's HANA computing appliance, which uses an in-memory database instead of storing data on conventional disks. The result of the shift to in-memory technology: a big acceleration in report-generation speeds and a significant expansion in data analysis capabilities.
"With HANA, we don't have to schedule reports anymore," Powell said. "We just design a report, put it out in front of the customer and then they can click it and it runs instantaneously, on the fly. If they want to start getting into the details and navigate down to anomalies or [data that points toward new] opportunities, they can."
In-memory processing turns hours into seconds
ARI also is tapping the HANA device itself for faster internal reporting on things such as profitability analysis and its customer relationship management processes. For example, reports on what individual customers spent on its fleet management services in a given year or month, and how that affected ARI's bottom line, previously took hours to run; now they come back in seconds. "HANA allows us to tear through that information rather quickly," Powell said.
In-memory computing used to be the domain of niche application workloads in industries such as telecommunications and financial services -- for example, sub-second database transactions for stock trading. But a perfect storm of vastly reduced memory prices, the move to 64-bit computing, and BI querying and reporting being pushed down to business users in more and more organizations has broadened the horizons of in-memory technology. In particular, it has made the leap into business intelligence uses, often in the form of self-service BI applications.
A variety of in-memory products is available, including main-memory databases such as Oracle's TimesTen and a plethora of BI tools, ranging from Microsoft's Excel-based PowerPivot technology to data discovery applications such as QlikView, Tableau and Spotfire, and on up to higher-end online analytical processing software. But HANA and Oracle's rival Exalytics appliance occupy the highest-performance corner of the in-memory world, offering prospective users turnkey systems tailored specifically for in-memory computing.
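The core idea behind all of these products -- keeping the working dataset in RAM so queries never wait on disk I/O -- can be sketched with SQLite, which supports a pure in-memory mode. This is an illustrative stand-in only, not how HANA, Exalytics or TimesTen are implemented, and the fuel-log table is invented for the example:

```python
import sqlite3

# An in-memory database: all pages live in RAM, so queries never touch disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fuel_log (vehicle_id INTEGER, gallons REAL)")
conn.executemany(
    "INSERT INTO fuel_log VALUES (?, ?)",
    [(1, 12.5), (1, 9.0), (2, 14.2)],
)

# The aggregation runs entirely against RAM-resident data -- the same query
# pattern a disk-backed database would satisfy with buffer-cache misses
# and I/O waits on a larger dataset.
rows = conn.execute(
    "SELECT vehicle_id, SUM(gallons) FROM fuel_log GROUP BY vehicle_id"
).fetchall()
print(rows)  # [(1, 21.5), (2, 14.2)]
conn.close()
```

At toy scale the difference is invisible; the appliances discussed here apply the same principle to terabytes of data, which is where the hours-to-seconds gains come from.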
SAP made HANA generally available in June 2011, and Oracle followed suit with Exalytics in February 2012. In addition to accelerating existing applications, in-memory appliances like HANA and Exalytics can make feasible applications that once might have been deemed impractical. For example, one of the customers highlighted by SAP is Bigpoint GmbH, a Hamburg, Germany, online game developer that uses HANA to analyze the behavior of players of its Battlestar Galactica Online game. That involves processing data on thousands of discrete events a second; Bigpoint then uses the in-memory analytics findings to pitch targeted promotions for add-ons or virtual items to players in real time, according to SAP.
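The pattern SAP describes -- tallying a high-rate event stream per player and reacting the moment a condition is met -- can be sketched in a few lines. The event schema, the threshold and the promotion trigger below are all invented for illustration; Bigpoint's actual pipeline is not public:

```python
from collections import Counter

# Hypothetical event stream: (player_id, event_type) pairs arriving continuously.
events = [
    ("p1", "battle_lost"), ("p2", "item_viewed"),
    ("p1", "battle_lost"), ("p1", "battle_lost"),
]

LOSS_THRESHOLD = 3  # invented rule: after 3 losses, pitch a repair-kit offer
losses = Counter()
promotions = []

for player, event_type in events:
    if event_type == "battle_lost":
        losses[player] += 1
        if losses[player] == LOSS_THRESHOLD:
            promotions.append(player)  # targeted offer goes out in real time

print(promotions)  # ['p1']
```

The point of an in-memory system here is that the running counts and the lookups behind the offer can be maintained against live data at thousands of events a second, rather than in an overnight batch.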
The vendor's website also touts a variety of other real-world uses for HANA, from analyzing genome data in cancer patients and the root cause of manufacturing faults to monitoring the consumption of draft beer in bars and restaurants. Oracle, meanwhile, points to advanced data visualizations, complex financial and operational planning applications, and BI processes involving diverse data sources as uses that Exalytics was designed to handle.
DIY on BI queries
Self-service BI was one of the motivating factors in buying Exalytics for Savvis, an IT hosting and colocation services provider in Town & Country, Mo. The company, one of Oracle's technology preview customers for Exalytics, deployed the system primarily to support ad hoc analysis by its customer service representatives. BI dashboards show up-to-date service-level agreements, call logs and the number of incoming help desk tickets; the service reps then can run their own queries against the data. "We wanted to give them a nice metadata layer, and they could write the reports themselves," said David Heilig, director of BI at Savvis.
But Heilig added that the IT department has had to put the brakes on in-memory processing and reporting in some cases. For one thing, he said, memory is still more expensive than disk storage, which can make using Exalytics costly. In addition, using the in-memory system to analyze data in real time might not make sense when information is being updated frequently -- for example, financial data at the close of a quarter.
"It's always a push: Users always want real-time [data]," Heilig said. "This allows us to do it when it makes business sense. But what people don't grasp fully is that if you're doing data analysis and the data is constantly changing underneath you, sometimes it doesn't work well."
Big gains possible for small businesses
Although appliances such as HANA and Exalytics are primarily oriented to large companies, some smaller ones are also finding uses for in-memory computing systems. In SAP's case, that access has come through the vendor's on-demand enterprise resource planning (ERP) software, Business ByDesign, which uses HANA to run its analytics processes.
WL Plastics Inc., a manufacturer and seller of polyethylene pipes in Fort Worth, Texas, has seen dramatic improvements in data loading times for business intelligence and analytics reporting since deploying the HANA-enabled version of Business ByDesign, according to IT manager Brad Crimin. Some reports used to take five minutes to run; others took as long as 24 hours. Now response times are measured in seconds, Crimin said.
"The performance is significantly better," he added. "Ever since the upgrade that introduced the HANA-backed analytics, every report in our system is more or less real time." SAP upped the ante in the quest to attract small and midsize businesses to HANA in March 2013, announcing a version of its on-premises Business One ERP software that runs entirely in memory on the appliance platform.
Joshua Greenbaum, an analyst at Enterprise Applications Consulting in Berkeley, Calif., said in-memory systems aren't a magic bullet. "With my clients, I'm always talking about the problem we want to solve and if there's a technology to do it," Greenbaum said. "If HANA is part of the [initial] question that's being formed, you're starting out with the wrong question." HANA is best suited, he added, to applications that involve "lots and lots -- huge quantities -- of data that needs to be analyzed, and re-analyzed, iteratively."
Adding up the cost of in-memory technology
Pricing is also an issue: "The basic story on the cost of HANA," Greenbaum said half-jokingly, "is, 'How much do you want to spend?' " Oracle's starting price on Exalytics is $135,000 for a system with four 10-core Intel Xeon processors and 1 terabyte of memory -- but the cost likewise can go much higher as systems expand. Nathaniel Rowe, an analyst at Boston-based Aberdeen Group Inc., wrote in a report published in March 2013 that in-memory appliances "are almost exclusively meant for large enterprises and carry a price tag commensurate with the size of their intended customer[s]."
But SAP, for one, is doing a lot of negotiating with prospective customers on price, according to Greenbaum, who said the vendor has a lot riding on HANA and is willing to bargain in order to grow the number of companies using the system. Greenbaum sees supply chain planning and optimization as a sweet spot for in-memory technologies, which can help users run planning and forecasting models much faster than they could previously, enabling them to test a variety of what-if scenarios. "Being able to do multiple planning scenarios in minutes as opposed to hours really changes things," he said.
By the numbers
35: The percentage of midsize and large organizations expected to adopt in-memory computing technology by 2015
Source: Gartner Inc.
$185: The retail per-GB price of computer memory in April 2005
$4.30: The retail per-GB price of computer memory in January 2013
Prices for chips on NewEgg.com; source: John C. McCallum, http://www.jcmit.com/
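Taken together, those two price points imply roughly a 43-fold drop in under eight years -- the arithmetic, using only the figures quoted above:

```python
# Per-GB memory prices quoted in the sidebar.
price_2005 = 185.00  # April 2005, $/GB
price_2013 = 4.30    # January 2013, $/GB

decline_factor = price_2005 / price_2013
print(round(decline_factor, 1))  # 43.0 -- memory got ~43x cheaper
```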
Jon Reed, an independent SAP analyst and operator of the website JonERP.com, is skeptical that the faster analytical performance provided by in-memory systems will be a big benefit to companies in the long run, particularly as more organizations invest in the technology. "Right now, speed is a competitive advantage," he said. "But five years from now, speed will be a commodity. If you're slow, you'll be eliminated. But it won't give you any particular edge."
But for now, at least, users like ARI's Powell will gladly take the speed boost that in-memory computing has given their organizations. And there's more to it than accelerated analytical throughput, he said. Many of ARI's fleet management clients also use competing services; Powell said his company can pull the data those rival services providers supply to clients into its HANA system for analysis.
"You can use our business intelligence stack to manage everything," he said. "That's been a big hit for us, and it's been a big time-saver for our customers." And hopefully, that kind of improved customer service is something clients will remember when it comes time to renew their contracts.