Improving BI performance without breaking the bank

Get tips from consultant Rick Sherman on designing and deploying a cost-effective business intelligence system that meets or exceeds your company’s BI performance expectations.

Demand from business users for actionable information is ever-expanding. And with business managers and workers seemingly connected 24/7 via their iPhones, BlackBerrys, iPads and other devices, expectations are being raised even further for business intelligence (BI) systems that can go beyond basic reporting and provide analytics capabilities at the speed of thought.

On the other hand, the still-sluggish economy has caused many organizations to become more cost-conscious and tighten their IT budgets. That raises a question: How can IT departments and BI teams deliver the BI and analytics capabilities that their business users crave without breaking the corporate bank?

Of course, ensuring that your BI system meets performance expectations is always easier when you’re rich – i.e., when you have a big budget for hardware, especially large amounts of memory and fast storage arrays. In fact, infrastructure spending can mask many a poorly designed BI system. The design can be flawed from the overall architecture all the way down to the BI dashboards and reports that end users see – but with enough money to throw at the system, it will perform well.

Not having big bucks to spend on hardware might make the BI performance effort seem like an uphill battle. But the corollary also holds: just as heavy spending can mask a poor design, good design practices can help compensate for a modest hardware budget.

Designing a BI system to provide great performance on a shoestring budget means getting the BI architecture right; establishing standards to ensure that dashboards, reports and multidimensional data cubes are developed correctly; and making sure that your system is set up and deployed according to sound operational practices.

BI performance booster: the hub-and-spoke architecture
Let’s consider those three things in order. The hub-and-spoke architecture has been established as a best practice for data warehousing and BI implementations since the 1990s. The underlying design concept is that the data warehouse (i.e., the hub) is built and tuned for loading, cleansing and storing large volumes of information from operational systems and other data sources. Meanwhile, various data marts (the spokes) are designed and tuned for specific BI uses: processing the data needed to populate dashboards and reports and to answer online analytical processing (OLAP) queries.

The hub-and-spoke design is fundamental to good BI performance. Trying to do everything with a data warehouse alone requires you to tune it for data integration, loading and querying, which is a tall task. That might work when your BI workload is light, but it won’t scale as the number of business users grows or the volume of queries increases.

An even better information architecture that can scale along with increased BI demand without sending costs spiraling out of control is a “hub-and-spoke plus” design, in which the spokes themselves become hubs for smaller data marts or OLAP cubes (see diagram below). That approach enables the underlying databases and cubes to be refined to meet increasingly specific query patterns, thus potentially reducing the need to buy more hardware in order to keep query response times from bogging down.

Diagram: 'Hub-and-spoke plus' BI architecture
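The hub-and-spoke split can be sketched in a few lines of SQL. The following toy example (table and column names are illustrative, not from the article) uses Python's built-in sqlite3 module: a detailed, load-optimized "hub" table feeds a pre-aggregated "spoke" mart, so dashboard queries never have to scan the detailed data.

```python
import sqlite3

# Hub: a detailed warehouse table, tuned for loading and storing volume.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE warehouse_sales (
    sale_date TEXT, region TEXT, product TEXT, amount REAL)""")
con.executemany(
    "INSERT INTO warehouse_sales VALUES (?, ?, ?, ?)",
    [("2011-01-05", "East", "Widget", 100.0),
     ("2011-01-06", "East", "Widget", 150.0),
     ("2011-01-06", "West", "Gadget", 200.0)])

# Spoke: a mart pre-aggregated for one dashboard's specific query pattern,
# so the dashboard reads a small summary table instead of the detailed hub.
con.execute("""CREATE TABLE mart_region_sales AS
    SELECT region, SUM(amount) AS total_sales
    FROM warehouse_sales
    GROUP BY region""")

for row in con.execute(
        "SELECT region, total_sales FROM mart_region_sales ORDER BY region"):
    print(row)
# ('East', 250.0)
# ('West', 200.0)
```

In a real deployment the hub and spokes would live in separate, separately tuned databases; the point here is only the division of labor between them.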

As you move on to designing your BI system for optimal performance, some of the tasks that you need to consider include dimensional modeling, OLAP cube design, and database design and tuning. Although it might add to your costs, involving a “real” database administrator, as opposed to an application programmer playing the role of a DBA, is crucial to ensuring that your databases are set up to support the required levels of BI performance.
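As a minimal illustration of dimensional modeling, the hypothetical star schema below (all names invented for this sketch) pairs a fact table with dimension tables and adds the kind of index a "real" DBA would insist on for BI query performance:

```python
import sqlite3

# Hypothetical star schema: one fact table joined to dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);
    -- Indexing the fact table's join keys is basic DBA-level tuning
    -- for the aggregate queries BI tools generate.
    CREATE INDEX ix_fact_keys ON fact_sales (date_key, product_key);
""")
con.execute("INSERT INTO dim_date VALUES (20110601, 'June')")
con.execute("INSERT INTO dim_product VALUES (1, 'Widgets')")
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(20110601, 1, 40.0), (20110601, 1, 60.0)])

# A typical BI query: aggregate the facts, sliced by dimension attributes.
row = con.execute("""SELECT d.month, p.category, SUM(f.amount)
                     FROM fact_sales f
                     JOIN dim_date d USING (date_key)
                     JOIN dim_product p USING (product_key)
                     GROUP BY d.month, p.category""").fetchone()
print(row)  # ('June', 'Widgets', 100.0)
```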

Another typical key to success on BI projects is deploying a portfolio of different BI tools to your community of business users, including dashboards, performance scorecards, ad hoc query tools, reporting software and spreadsheet integration capabilities. In the “old days,” doing so would often be cost-prohibitive because the various products had to be bought from different BI vendors. Now many vendors offer full BI suites that include all of those technologies, which can make them more affordable. That’s important, because offering a variety of tools that match different needs within an organization should help keep business users from getting frustrated with the BI system and going back to relying solely on their spreadsheets.

Creating BI development standards is another way to help keep costs under control while also making your IT and BI staff more productive. Using common templates or style sheets to design BI applications frees your developers from having to reinvent the wheel, potentially improving their ability to deliver new functionality on time – and reducing the likelihood that business users will be confused by inconsistencies between different applications. Standard approaches can also be set up for repetitive BI processes, such as data aggregation.
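One way to picture such a standard is a single shared layout that every report reuses. The sketch below is purely hypothetical (the template fields and function are invented for illustration), using Python's string.Template:

```python
from string import Template

# Hypothetical corporate-standard report layout: every report fills in the
# same template instead of each developer reinventing headers and footers.
REPORT_TEMPLATE = Template(
    "=== $title ===\n"
    "Prepared for: $audience\n"
    "Refreshed:    $refresh_date\n"
    "$body\n"
    "--- Source: $source_system ---"
)

def render_report(title, audience, refresh_date, body, source_system):
    """Apply the standard layout to any report's content."""
    return REPORT_TEMPLATE.substitute(
        title=title, audience=audience, refresh_date=refresh_date,
        body=body, source_system=source_system)

print(render_report("Weekly Sales", "East region managers", "2011-06-13",
                    "Sales rose 4% week over week.", "sales_mart"))
```

Because every report shares one layout, a change to the standard propagates everywhere, and users see the same header and sourcing conventions in every application.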

And when you’re ready to deploy the BI system, it isn’t always the most expensive technology configurations that provide great BI performance. Here are some important things to keep in mind:

  • Spread out the databases as well as your extract, transform and load (ETL) processes and BI querying across various physical or virtual servers. The trick is to distribute data workloads and processing in an optimal way, taking into consideration how they fluctuate throughout the day, week and month.
  • Get as much memory as you can afford, and leverage in-memory analytics and ETL caching technologies when possible. Using a 64-bit architecture is beneficial, especially for in-memory functionality.
  • Data storage typically is an area that you can’t avoid spending on – so just buy wisely.
  • Data virtualization, which pulls together data on the fly from operational systems, is a cost-effective method of integrating data for BI uses. But there are caveats: Virtualization can increase the amount of storage that’s needed, and it’s still best suited for development and test environments.
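The data virtualization idea in the last bullet can be illustrated with a toy example. This is not a real virtualization product, just a sketch of the concept using sqlite3's ATTACH to stand in for two separate operational systems: the data stays in its sources and is combined at query time rather than being copied into a warehouse first.

```python
import sqlite3

# Two pretend operational stores: an orders system and a separate CRM system.
con = sqlite3.connect(":memory:")
con.execute("ATTACH DATABASE ':memory:' AS crm")

con.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 50.0), (1, 25.0), (2, 80.0)])
con.execute("CREATE TABLE crm.customers (customer_id INTEGER, name TEXT)")
con.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                [(1, "Acme Corp"), (2, "Globex")])

# The "virtual" integration: a cross-source join resolved on the fly,
# with no data physically consolidated beforehand.
query = """SELECT c.name, SUM(o.amount) AS total
           FROM orders o JOIN crm.customers c USING (customer_id)
           GROUP BY c.name ORDER BY c.name"""
for row in con.execute(query):
    print(row)
# ('Acme Corp', 75.0)
# ('Globex', 80.0)
```

The convenience comes at a price, per the caveats above: every query hits the source systems live, which is why this approach tends to fit development and test environments better than heavy production BI workloads.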

BI systems that meet business-user expectations regarding functionality and performance do require an investment, but the cost doesn’t need to blow your budget. Be forewarned, though: Cheaping out too much on the initial design might force you to pay for it later, whether through increased infrastructure costs or your BI system failing under the weight of unmet expectations. In the end, an inexpensive system that the business doesn’t use is not cost-effective.

About the author: Rick Sherman is the founder of Athena IT Solutions, a Stow, Mass.-based firm that provides data warehouse and business intelligence consulting, training and vendor services. In addition to having more than 20 years of experience in the IT business, Sherman is a published author of more than 50 articles, a frequent industry speaker and an Information Management Innovative Solution Awards judge. He blogs at The Data Doghouse.
