Public sector needs efficient data analytics infrastructure

The public sector has been a big IT spender in recent years, but a lot of that investment went to on-premises big data infrastructure. It's time for an update, says one expert.

Government agencies -- whether at the federal, state or local level -- are big spenders when it comes to IT. But not everyone thinks these organizations spend efficiently.

"If the government doesn't rethink how it's going to manage and maintain what it has, it's going to continue to spend more than it needs to," said Ashwini Chharia, senior director of public sector services at NTT Data Inc., headquartered in Plano, Texas. The consulting, IT services and outsourcing company is the North American arm of Tokyo-based NTT Data Corp.

Efficient use of data analytics infrastructure needs to be at the center of public sector agencies' efforts to improve efficiency, Chharia said. And in some cases, they are taking this approach. The federal government, for example, has scaled back IT spending in recent years, opting mainly for cloud-hosted data infrastructure systems that carry lower upfront capital expense and ongoing maintenance costs.

Some cities, too, have made targeted investments in analytics. For example, the city of Boston has implemented several analytics-centric projects to do things like make street repairs more efficient and improve emergency response times. Much of the infrastructure is built around API connections to prebuilt services and partnerships with private companies.

But not every public sector entity is as progressive when it comes to implementing data analytics infrastructure. A June 2016 forecast from Gartner predicted that government IT spending would remain flat for the remainder of this year, including on analytics and data infrastructure tools -- and this is after spending fell by 5.2% in 2015.

Chharia said the last decade or so saw public sector agencies make massive investments in on-premises technology that comes with huge management costs. Additionally, agencies often don't get as much value out of these systems as they could because they lack the staff resources. This has led to a pullback in new spending, locking some agencies into situations where they pay too much just to sit on potentially valuable data.

Chharia contrasted this situation with what we're seeing today from leading tech-centric companies like Uber, whose data analytics infrastructure is largely built around hosted services like Amazon Simple Storage Service. This type of setup has the advantage of being quick and inexpensive to implement, while the downside is ongoing service fees. It's this kind of infrastructure Chharia thinks public sector agencies need to adopt if they are going to stay relevant in the 21st century.

"The payoff might be as simple as survival," he said. "Revenue is decreasing. Efficiency is key to survival. The ability to attract Millennials to a city will continue to erode if the jurisdiction doesn't become more efficient."

Next Steps

Vendors vie to be big data infrastructure of choice for businesses

Leading-edge big data companies shouldn't influence enterprises' big data decisions

Big data is a whole new hardware and software world

Dig Deeper on Big data analytics