This article originally appeared on the BeyeNETWORK
For me, 2008 started in a funny way. The sun was out and no matter how much I wanted to believe that it was midnight, my mind was just having trouble accepting it. But, it was true. I celebrated this New Year’s Eve in Antarctica, and it got me thinking about the vagaries of time and its importance to business intelligence (BI).
I always like to lay things out as simply as possible to assist me in the process of problem solving, which is the main reason to do business intelligence. BI is about getting answers to the classical questions: what, where, when, who, why, how, how many, how much, etc. Time is the variable that allows us to answer the question: when. It is a critical parameter in problem solving that is best understood by remembering George Santayana’s famous maxim, “Those who cannot remember the past are condemned to repeat it.”
Precisely! One of the most useful functions of business intelligence is that it allows us to learn from history. We cannot prevent something that has already happened from happening; but by analyzing our business history, we might be able to identify ways to prevent the same thing from happening again in the future. This forces us to think about time as a business intelligence dimension and to develop the metrics that we need for purposes of analysis. As a first cut, we think of time in a very simple way: the past, the present and the future. But for time to be a useful dimension, it demands substantially more precision.
Time and IT are intricately and intimately intertwined as we learned well from the goings-on around the coming of the new millennium and the Y2K phenomenon. This showed both the difficulties that the variable “time” could cause in the IT domain, as well as the effectiveness of thinking things through in the context of problem solving. At its core, Y2K was a basic problem with time as an analytical dimension, so let’s look at this issue in some detail.
First of all, how do we measure time? Years, months, weeks, days, hours, minutes, seconds. Right? Are we now ready to establish the hierarchy and design our time dimension table using Kimball’s star schema? Not really. For starters, as we all know, there are a finite number of named time intervals larger than a year (e.g., a score, a century, a millennium) or smaller than a second (e.g., millisecond, microsecond, nanosecond), and an infinite number of nameless time intervals on both sides of the spectrum. We need to determine the intervals required by our business intelligence environment. However, we also have to deal with the complexities of time in the context of the laws of physics – and with the fact that many of our most useful time metrics don’t convert easily into one another.
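To make this concrete, here is a minimal sketch – in Python, with illustrative column names of my own choosing, not Kimball’s actual schema – of how the day grain of such a time dimension might be generated, carrying the hierarchy attributes (year, quarter, month, day of week) that analyses typically roll up to:

```python
from datetime import date, timedelta

def build_date_dimension(start, end):
    """Generate one row per calendar day, with the hierarchy
    levels a star-schema time dimension typically carries."""
    rows = []
    d = start
    while d <= end:
        rows.append({
            "date_key": d.strftime("%Y%m%d"),     # surrogate-style key
            "year": d.year,
            "quarter": (d.month - 1) // 3 + 1,    # calendar quarter 1-4
            "month": d.month,
            "day": d.day,
            "day_of_week": d.strftime("%A"),
            "is_weekend": d.weekday() >= 5,       # Saturday or Sunday
        })
        d += timedelta(days=1)
    return rows

# One week of rows, starting New Year's Day 2008.
dim = build_date_dimension(date(2008, 1, 1), date(2008, 1, 7))
```

In a real warehouse this table would be populated once, far into the past and future, so that every fact row can join to it by the date key.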
Historically, we developed a context for time as a result of our cosmic reality. Early man noticed that there was a periodic progression of light (which he named day) followed by no light (that became night). Eventually, we were able to figure out as a species that “a day” was defined by the sun – it was the amount of time it took for the earth to complete a full rotation on its axis. But man would soon learn that the moon’s orbit around the earth was also very important in terms of its impact on tides, the length of gestation and other natural and social rhythms. So important were some of these factors that the “moonth” became the first established category into which days were aggregated. And with this, some of our conversion dilemmas started, because a year – the time it takes the earth to go around the sun once – is composed of 365 days and a fraction. We are all familiar with the different number of days in our months, best illustrated by the well-known ditty “30 days has September, April, June and November; all the rest have 31, except for February which has 28 or 29.” Leap year was brought in, of course, to deal with the accumulation of fractions of a day that over long periods of time forced a readjustment of our calendars.
Religions were the first enterprises to lay claim over time. It is no accident that many of our days are named for deities – Wednesday for Odin, Thursday for Thor. Even today, in our own Western culture, we anchor our count on the traditional year of the birth of Christ, and we label any other year B.C. or A.D. depending on whether it came before or after. Furthermore, arithmetic issues forced us to change from the old Julian calendar dating back to pre-Christian days to the Gregorian calendar developed by the Vatican, which serves us into the present. (Incidentally, the changeover occurred in 1582 for much of Catholic Europe, but not until 1752 for England and its colonies. Days had to be dropped outright in the conversion – ten in 1582, eleven by 1752 – which caused the equivalent of a Y2K problem at the time.)
There are many non-Christian religions, of course, that use other markers for their own count. Hence we have the Jewish, Islamic, Hindu or Baha’i calendars, each with seminal starting points, such as the biblical “beginning of time” or Mohammed’s Hegira from Mecca to Medina. All these calendars are either solar, lunar or a combination of both and still need periodic adjustments by adding extra days or months at specific intervals.
Going beyond religion, measuring time has always been of keen interest to those who wield power – governments above all: think of the ability to predict the tides, so important to navigation and trade; or the seasons, so important to farming; or eclipses, with the fear they brought to early societies. And later on, as has been so well documented in books on naval affairs, there was the need to determine where we were on the planet in terms of longitude. From early times, we could more or less figure out how close to a pole we were – latitude – by measuring the angle of the sun (or the pole star) above the horizon. But longitude, how far we were from a starting point along any cross-sectional circumference of our planetary sphere, was something else. For this we needed to keep accurate time on board a vessel so we could relate back to our original port. This was not accomplished until well into the 18th century.
And governments cannot keep away from time. They fiddle with it to their heart’s content: making time zones follow political borders (for example, all of China is on Beijing time); changing it to fit economic purposes, as with daylight saving time, which Arizona still declines to adopt; or reworking the vocabulary itself, as when revolutionary France renamed all the months of the year to fit its new logic.
While it’s important to start with the planetary basis of time in order to understand how we measure it, time really goes beyond that and permeates virtually all aspects of human activity. Let’s start with the concept of age. How old are you? How old am I? People’s ages tend to group them into cohorts, generations and other categories that are useful in economic and social endeavors. But the concept of age is not exclusive to people, since a horse, a bridge, a building, a school and a program can all have ages. Age tends to be measured starting with the exact moment of “birth” and counting in the appropriate time units until the present. Depending on our needs, we will round off as appropriate so that we can say John Doe is 39, rather than 39 years, 2 months, 5 days, 2 hours, etc.
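Even that rounding is easy to get subtly wrong in a reporting system: the naive subtraction of years overstates an age whenever this year’s birthday has not yet arrived. A small sketch in Python (the function name is my own, purely illustrative):

```python
from datetime import date

def age_in_years(born, today):
    """Whole years elapsed since birth, rounded down the way
    ages are quoted colloquially ("John Doe is 39")."""
    years = today.year - born.year
    # Subtract one if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (born.month, born.day):
        years -= 1
    return years

# Someone born October 5, 1968 is still 39 on January 1, 2008,
# even though 2008 - 1968 = 40.
```

The same comparison works for the age of a bridge, a program or anything else with a dated “birth.”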
Time is also a critical dimension we use in analytical comparisons. One example of this is the need to analyze the importance of seasonality. How do specific products (e.g., swimming suits, ski boots, diving gear, ice skates, Christmas cards) sell during the different parts of the year? Time is also a critical dimension if we want to understand business activity during the week as opposed to during the weekend.
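As a sketch of the idea – using entirely made-up sales figures and product names – daily facts can be rolled up along the time dimension to the quarter level for a seasonality comparison:

```python
from collections import defaultdict
from datetime import date

# Hypothetical daily sales facts: (date, product, units sold).
sales = [
    (date(2007, 7, 4),   "swim_suits",      80),
    (date(2007, 7, 15),  "swim_suits",      95),
    (date(2007, 12, 20), "christmas_cards", 120),
    (date(2007, 12, 22), "ski_boots",       45),
]

# Roll the daily grain up to (product, calendar quarter).
by_quarter = defaultdict(int)
for d, product, units in sales:
    quarter = (d.month - 1) // 3 + 1
    by_quarter[(product, quarter)] += units
```

The same grouping key swapped for day-of-week would answer the weekday-versus-weekend question in the same way.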
While time is a key analytical dimension – one of the principal column headings in reports – most often the results of what we measure get counted in either units or dollars. And this brings us back, especially in government, to the concept of a budget. Budgets are all important in the federal government, as we know, because without an authorization and an appropriation approved by Congress into a budget, the Executive Branch is prohibited from spending money. Hence, it cannot operate without a budget; and budgets are the principal control mechanism. Since budgets are approved on an annual basis, this means that plans and expenditures have to be tracked against specific budget years. This creates some very complex requirements in the context of accounting programs, and also a very rich arena for business intelligence.
The color of money is one way that agencies refer to their need for tracking the budget year of the dollars that are being used to fund a specific project or program. It can get extremely complicated when you take into account the complexity of the appropriations processes and the ways that different agencies receive the money that they can actually use to accomplish their missions in any given year.
Did I say “any given year”? Please be sure you understood that to be a fiscal year, not a chronological year. So now we add more complexity into the pot since time in our real world works in chronological years – remember the earth revolving once around the sun – whereas most enterprises work with fiscal years that start and end on often discretionary dates. The Federal government’s fiscal year, as we know, starts on October 1 and ends on September 30 of the next chronological year. So when we speak about 1st quarter results for the government, we need to map back because we’re not talking January through March but rather October through December.
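That mapping is mechanical enough to express in code. Assuming the federal October-through-September convention described above (the function names here are my own, for illustration):

```python
from datetime import date

def federal_fiscal_year(d):
    """U.S. federal fiscal year: it begins October 1 of the
    preceding calendar year, so FY2008 runs 2007-10-01 through
    2008-09-30."""
    return d.year + 1 if d.month >= 10 else d.year

def federal_fiscal_quarter(d):
    """Q1 = Oct-Dec, Q2 = Jan-Mar, Q3 = Apr-Jun, Q4 = Jul-Sep."""
    return (d.month - 10) % 12 // 3 + 1

# November 2007 falls in Q1 of fiscal year 2008 -- not Q4 of 2007.
```

A warehouse that carries both calendar and fiscal attributes on the same time-dimension row spares every report writer from redoing this arithmetic.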
Last, we often trade off time and distance in our minds. We do this because we may find it more useful to say that New York and Washington are 45 minutes apart (flying time) rather than 240 miles, or that Philadelphia and San Francisco are 3 time zones away from each other. The most interesting illustration of the trade-off is that fascinating metric called the light year, which despite its name is not a measure of time but of distance. A light year is the distance that light travels in a vacuum during one year, or approximately 5.88 trillion miles. The equivalent of a geometric point on a line is a specific unitary time/date (e.g., December 31, 2007 at 11h 35m 23s PM), and the interval between two such points maps, of course, to a length. Hence, time and distance are geometrically equivalent.
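The 5.88 trillion mile figure is easy to verify with a line of arithmetic – the speed of light times the number of seconds in a year (using the astronomers’ Julian year of 365.25 days):

```python
# Rough check of the ~5.88 trillion mile figure quoted above.
SPEED_OF_LIGHT_MPS = 186_282            # miles per second, in vacuum
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # Julian year: 31,557,600 s

light_year_miles = SPEED_OF_LIGHT_MPS * SECONDS_PER_YEAR
# Comes out to roughly 5.88e12 miles, matching the figure in the text.
```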
But fear not, the confusion generated by the sun and the moon and their complicated orbits may be coming to an end. For the last few decades, physicists have been defining time using a different metric. A second of atomic time is defined as “the duration of 9,192,631,770 cycles of radiation corresponding to the transition between two hyperfine levels of the ground state of cesium 133.” And since we know from Einstein’s E = mc² that energy, mass and the speed of light are bound together, it may be that in the future we can also trade off time against those variables.
And that is why having the sun out at midnight on New Year’s Eve reminded me of time, budgets, analytics and business intelligence.
Dr. Barquin has been the President of Barquin International, a consulting firm, since 1994. He specializes in developing information systems strategies, particularly data warehousing, customer relationship management, business intelligence and knowledge management, for public and private sector enterprises. He has consulted for the U.S. military, many government agencies and international governments and corporations.
Dr. Barquin is a member of the E-Gov (Electronic Government) Advisory Board, and chair of its knowledge management conference series; member of the Digital Government Institute Advisory Board; and has been the Program Chair for E-Government and Knowledge Management programs at the Brookings Institution. He was also the co-founder and first president of The Data Warehousing Institute, and president of the Computer Ethics Institute. His PhD is from MIT. Dr. Barquin can be reached at email@example.com.