Challenges lurk in in-memory BI projects -- proper preparation needed

Alan Earls

Like other types of technology projects, deployments of in-memory analytics software present some potential challenges. But IT analysts and consultants say that for well-prepared organizations, those challenges shouldn’t be too daunting.

Steve Kent, director of business intelligence at Burlington, Mass.-based Collaborative Consulting, said one of the big requirements for successful in-memory BI initiatives is avoiding data quality and integrity issues. “The key is to have good information,” he said. “You must implement [data quality] business rules and have clean data if you are going to get good information out of the process.”
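
Kent didn’t point to a specific toolset, but the idea of enforcing data quality business rules can be pictured as a validation pass that runs before records ever reach the in-memory store. The sketch below is a hypothetical illustration in Python; the field names and thresholds are assumptions for the example, not rules Kent prescribed.

    # Hypothetical sketch: simple data quality rules applied before loading
    # records into an in-memory analytics store. Field names ("customer_id",
    # "order_total", "order_date") are illustrative assumptions.
    from datetime import datetime

    def is_clean(record):
        """Return True only if the record passes all basic quality rules."""
        try:
            # Rule 1: required keys must be present and non-empty
            if not record.get("customer_id"):
                return False
            # Rule 2: numeric fields must parse and fall in a sane range
            if not (0 <= float(record["order_total"]) < 1_000_000):
                return False
            # Rule 3: dates must be valid ISO-8601 and not in the future
            if datetime.fromisoformat(record["order_date"]) > datetime.now():
                return False
        except (KeyError, ValueError, TypeError):
            return False
        return True

    raw_records = [
        {"customer_id": "C001", "order_total": "129.95", "order_date": "2012-03-01"},
        {"customer_id": "",     "order_total": "49.00",  "order_date": "2012-03-02"},
    ]

    clean, rejected = [], []
    for rec in raw_records:
        (clean if is_clean(rec) else rejected).append(rec)

    print(f"{len(clean)} clean, {len(rejected)} rejected")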

Managing the relationship between in-memory analytics tools and data warehouses is another area that requires proper attention, according to Kent. And there are basic data management issues to take into account. For example, some in-memory applications eliminate the need for data aggregation and the building of multidimensional cubes to support data analysis. But, Kent said, “in the case of extremely large data sets -- for example, in the pharmaceutical field -- you might still want to aggregate data.”
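
As a loose illustration of the trade-off Kent describes, a very large detail table can be pre-aggregated to the grain analysts actually query before it is loaded, rather than pushing every raw row into memory. The column names and the use of pandas below are assumptions chosen for the example.

    # Hypothetical sketch: pre-aggregating a very large detail table before it
    # is loaded for in-memory analysis, instead of loading every raw row.
    # Column names and the pandas toolchain are assumptions for illustration.
    import pandas as pd

    # In practice this would stream from the warehouse; a small frame stands in here.
    detail = pd.DataFrame({
        "product": ["A", "A", "B", "B", "B"],
        "region":  ["East", "West", "East", "East", "West"],
        "units":   [10, 4, 7, 3, 9],
        "revenue": [100.0, 40.0, 84.0, 36.0, 108.0],
    })

    # Aggregate to the grain analysts actually query (product x region),
    # shrinking what has to fit in memory.
    summary = (
        detail.groupby(["product", "region"], as_index=False)
              .agg(units=("units", "sum"), revenue=("revenue", "sum"))
    )

    print(summary)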

More info on BI and analytics challenges

Learn about the new challenges that “big data” brings to analytics initiatives

Watch a video Q&A on how auto insurer American Access overcame the pitfalls of deploying a BI system

Read consultant Rick Sherman’s list of big data analytics worst practices

Another key to success is doing your homework on business needs prior to an in-memory analytics deployment, said Julie Lockner, an analyst at Enterprise Strategy Group in Milford, Mass. In her view, organizations should work to make sure they have a solid set of business requirements that, barring unforeseen circumstances, won’t change too much during the course of a project.

“Everyone is talking about agility and BI, and agile data warehouses and agile analytics, but being agile is costly,” Lockner said. If you know in advance that your business needs are likely to change frequently, she added, you should try to create a development environment that will make it easier to manage those changes. Overall, Lockner advised, it’s important for an organization as a whole to realize the level of investment that will be required to make in-memory processing work for BI and analytics.

Once you’ve got the technical and business-requirements details mastered, the challenge of training is next, right? Actually, said Forrester Research Inc. analyst Boris Evelson, one of the premises of in-memory business intelligence is that in many cases it shouldn’t involve as much end-user training as conventional BI technologies do.

Know what you’re doing on in-memory BI

“If your users know their data, and if they know something like Excel, in-memory should require as little training as possible,” Evelson said. To help avoid any potential problems, he added, in-memory analytics should be targeted primarily for use by more experienced business users -- particularly if you’re implementing one of the higher-end forms of in-memory software. “It isn’t something to give to someone who doesn’t know what he or she is doing,” he said.

On the other hand, there may be more training and education issues for IT professionals themselves, said Mike Ferguson, managing director at Intelligent Business Strategies Ltd., a U.K.-based research and consulting firm. IT staffers involved in managing in-memory analytics applications need to assess the impact that the new tools will have on server and network performance, Ferguson noted. That includes developing an understanding of whether any IT infrastructure components need to be strengthened to support in-memory operations and whether some elements are no longer required.
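
The sizing question Ferguson raises can often be reduced to back-of-the-envelope arithmetic: estimate the compressed in-memory footprint of the data and compare it with the RAM actually available. The figures and compression ratio in the sketch below are placeholders for illustration, not vendor benchmarks.

    # Hypothetical back-of-the-envelope sizing check for an in-memory deployment.
    # Row counts, average row width and compression ratio are assumed figures.
    rows = 2_000_000_000        # fact rows to hold in memory
    avg_row_bytes = 250         # average uncompressed row width
    compression_ratio = 5       # columnar compression is often quoted at 3-10x
    overhead = 1.3              # indexes, dictionaries, working space

    needed_gb = rows * avg_row_bytes / compression_ratio * overhead / 1024**3
    available_gb = 128          # RAM on the analytics server

    print(f"Estimated footprint: {needed_gb:.1f} GB of {available_gb} GB available")
    if needed_gb > 0.8 * available_gb:
        print("Plan for more memory, aggregation, or tiering to disk/SSD.")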

For example, Ferguson said that integrating in-memory online analytical processing (OLAP) databases with data warehouse appliances could make it possible to eliminate some physical data marts. “That simplifies the architecture in a data warehouse environment, because it removes the need to extract data from a data warehouse and move it into data marts and on into cube data stores,” he said. As a result, there are fewer steps involved in preparing data for analysis and fewer data stores to manage, which should lower the cost of supporting analytics activities within an organization.
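
One way to picture the simplification Ferguson describes: instead of extracting warehouse data into marts and then into cube stores, the in-memory layer queries the warehouse tables directly at analysis time. In the hypothetical sketch below, SQLite stands in for the warehouse appliance and pandas for the in-memory layer; neither is a product Ferguson named.

    # Hypothetical sketch: querying warehouse tables straight into an in-memory
    # layer, skipping the extract-to-data-mart and cube-build steps.
    # SQLite stands in for the warehouse; pandas is the in-memory layer.
    import sqlite3
    import pandas as pd

    conn = sqlite3.connect(":memory:")          # stand-in for the warehouse
    conn.execute("CREATE TABLE sales (region TEXT, product TEXT, revenue REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?, ?)",
        [("East", "A", 100.0), ("West", "A", 40.0), ("East", "B", 120.0)],
    )

    # One query replaces the mart/cube pipeline: the slice analysts need is
    # pulled directly into memory at analysis time.
    frame = pd.read_sql(
        "SELECT region, SUM(revenue) AS revenue FROM sales GROUP BY region", conn
    )
    print(frame)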

Over the next few years, Ferguson expects to see the development of massively parallel processing appliances that can support a combination of solid-state drives (SSDs) and in-memory data storage for BI and analytics uses. In such systems, he said, SSDs would replace conventional disk drives as the primary storage medium, with the most heavily accessed data going into memory. But that “will not happen overnight,” he added. “We need the prices of SSDs to fall.”

Ultimately, no matter how an in-memory analytics system is structured from a technical standpoint, the success of a project depends on choosing the right use case for the in-memory tools, Evelson said. BI applications and business processes that require a lot of rigor and adherence to standardized analytical routines -- ones in which you really don’t want business users to step out of well-defined boundaries -- are not good candidates for the in-memory approach, he cautioned.

Alan R. Earls is a Boston-area freelance writer focused on business and technology.

