Information delivery snags create big data analytics bottleneck
This article is part of the November 2012 issue (Volume 1, Issue 11) of BI Trends + Strategies
Big data is all the rage these days because of a combination of forces, including the continued growth of data volumes, the increased velocity of data creation and updates from a variety of internal and external sources, and the availability of easily installed tools for building scalable analytics platforms using commodity hardware.

Similar to the boom in automobile use driven by the increased capacity of the interstate highway system, the improvements in computational power and speed for business intelligence (BI) and analytics applications enable broader dissemination of actionable knowledge in organizations. When that's coupled with demands from business users for faster access to information to speed up decision making, the pressure to provide right-time intelligence capabilities grows exponentially.

But in many organizations, there is a bottleneck in the technology infrastructure causing unwanted delays in the delivery of information. What can be done to break that bottleneck? Batch extract, transform and load (ETL) ...
Features in this issue
With data visualization tools offering ever more functionality, managing data visualization projects is becoming a bigger challenge for BI teams.
Data visualizations have become key components of BI applications. But there's more than meets the eye to enabling end users to 'see' information.
Stewarding data can be a tough nut to crack: lots of effort for a reward that isn't always apparent. To succeed, strong project management is needed.
Columns in this issue
Big data analytics applications require high-speed data availability. But one key part of many IT infrastructures isn't prepared for that challenge.