Real-estate agents sometimes joke that the three primary considerations in buying property are location, location, location. These days, however, that phrase could just as easily be uttered by business analysts talking about geospatial data.
"We've seen geospatial technology out in the market for a few decades," said Matt Gentile, a principal at Deloitte Financial Advisory Services LLP. "What we've witnessed in the last couple of years [is that] the pace at which data is being created that has a location element has grown exponentially."
Gentile describes the geospatial data explosion as a "mass proliferation" generated by sensors, Internet use, geocodes attached to digital images and handheld devices with GPS capabilities. Analyzing these data sources could help organizations provide better service to customers, according to Gentile's new report.
In the report, Gentile makes the case that geospatial data has matured into a resource that can benefit the public sector, and describes how government agencies can get started without breaking the bank.
The power of zoom
As the title indicates, the report's focus is "the power of zoom" -- Deloitte's term for combining the expertise and information government agencies already possess with geospatial data and visualization techniques. Taken together, traditional and new forms of data have the potential to generate rich, layered maps for historical and even predictive analysis, the report says. "Location-based data can be used to create policy at a human scale, allowing decision makers to 'zoom in' to understand events in our communities and 'zoom out' for broader context at the national and even global scale," it says.
Government agencies that incorporate geospatial data with existing information can expect three main benefits: the ability to see the bigger picture through visualization techniques, which could be useful when creating public policy; the ability to find a common focus by, for example, using "crowdsourced" information to determine where to allocate resources; and the ability to create new models for delivering services.
"Think about the questions you ask in your daily operations that include the question of 'where,'" Gentile said. "Any time you find that question, you are a prime candidate to begin to look at location and geospatial analytics to help you solve those problems."
Government data tends to have a location component already -- where power lines and water pipes lie, for example, Gentile said. Mapping this information in combination with, say, the geospatial data generated by a natural disaster can add a layer of detail for decision makers.
In the report, Gentile and his colleagues cite as an example a cholera outbreak in Haiti after a recent earthquake. Knowing cholera can spread from contaminated water sources, the international medical organization Doctors Without Borders and Google Inc.'s Crisis Response team mapped where the outbreak was occurring against a schematic of the water system. The analysis helped build a framework for deploying resources.
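The kind of overlay analysis described above can be sketched in a few lines of Python. This is an illustration only, not anything from the report: the coordinates, distance threshold and case data below are invented, and a real analysis would use actual case reports, a genuine water-system schematic and a GIS library rather than a hand-rolled distance function.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical water sources and reported case locations (lat, lon).
water_sources = {"well_A": (18.55, -72.30), "well_B": (18.60, -72.25)}
cases = [(18.551, -72.301), (18.552, -72.299), (18.601, -72.251), (18.70, -72.10)]

# Count cases within 1 km of each source -- a crude proxy for mapping
# an outbreak against a schematic of the water system.
counts = {name: 0 for name in water_sources}
for lat, lon in cases:
    for name, (slat, slon) in water_sources.items():
        if haversine_km(lat, lon, slat, slon) <= 1.0:
            counts[name] += 1

print(counts)  # clusters near well_A suggest where to deploy resources first
```

Even a toy overlay like this shows the principle: once two data sets share a location component, proximity itself becomes a question the data can answer.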
"It begins to take you along this progression of starting with a question, looking [at the data and information] within your own organization, [and] looking outside of your organization for other data sources to help tell the story," Gentile said. "You are building this rich tapestry to look at how you're doing things and how you [can] transform."
Answering the question 'Where?'
Collecting and analyzing geospatial data edges government into "big data" territory, which can be intimidating for those new to the terrain. The good news is that federal, state and even local governments can get started without a huge investment in technology or personnel, according to Gentile. "Unlike a decade ago," he said, "this is now out in the open, and anyone with any kind of computer programming background can pick this up and get started."
In fact, rather than going out and purchasing an expensive, enterprise-ready product, the biggest initial investment will be time, Gentile said. An organization can start by finding a tech-savvy employee who can take on the task. In the beginning stages, the employee doesn't need advanced skills or significant training. "As you progress, and you want to get into predictive or deeper analysis, you'll find yourself walking down a path where more investment is necessary," he said. "But you can get started fairly easily."
Talent is one thing, but other pieces, such as the geospatial data itself and the technology, also need to be addressed. Here again, Gentile suggests starting with a smaller, tangible question -- one that could be answered by combining and visualizing internal data in ways that haven't been tried before. As the program matures, look for external and even "open" data to add to the mix. Just don't get stuck waiting for perfect data. "By visualizing data for preliminary analysis to predict, speculate and infer (not conclude), patterns emerge that may lead to more detailed investigation," the report states.
As with acquiring external data, the initial technology investment doesn't have to be an expensive one. Instead, look for free or open source tools or mashup technologies, such as Google Maps, Gentile said. "Start small, start discrete and with a well-defined challenge you want to tackle," he said. "My feeling is that people are caught at the beginning -- thinking the first step has to get them to the top of the mountain."
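One low-cost starting point along these lines (a sketch under assumed data, not a recommendation from the report): export internal location records as GeoJSON, a plain-text format that Google Maps, Google Earth and many free and open source viewers can display directly. The record fields and coordinates below are invented for illustration.

```python
import json

# Hypothetical internal records: service requests with coordinates.
requests = [
    {"id": 101, "type": "water_main_break", "lat": 38.90, "lon": -77.03},
    {"id": 102, "type": "pothole", "lat": 38.91, "lon": -77.04},
]

# Build a GeoJSON FeatureCollection. Note that GeoJSON orders
# coordinates as [longitude, latitude], not the reverse.
geojson = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
            "properties": {"id": r["id"], "type": r["type"]},
        }
        for r in requests
    ],
}

# The resulting file can be dropped into most free mapping tools as-is.
with open("requests.geojson", "w") as f:
    json.dump(geojson, f, indent=2)
```

No purchased software is involved: the Python standard library produces the file, and free mapping tools handle the visualization.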
Finally, it's important to create a culture that values geospatial data. So, incorporate it into the overarching vision and create a geospatial roadmap to outline where it's going, especially as technology continues to improve.