What's the difference between data and information?
It's not a trick question. Data are (or "is", depending on your grammarian tendencies) unadorned facts, while information consists of facts evaluated and interpreted to provide a particular meaning. It's the difference between plotted points on a graph and a trend line drawn to connect them. The points are interesting, but the line tells a story. Data points have value, but to really get answers you want information. So to answer business questions, a typical reporting user needs and wants plenty of information at their disposal, all based on dependable data.
So then why is it that these users typically end up with a sea of data points, but are left on their own to derive any usable information?
Tough one...
I know there are some organizations that simply don't have the mandate or mechanisms to find and fix inefficient processes -- including reporting processes. The scope and depth of a problem process simply sits in an organizational "blind spot". Northridge has been called in to "fix" some serious manual-extract, chicken-wire-and-duct-tape Excel situations, fed through a shadowy underworld of data connections invisible to the non-adept, where if users can get any kind of realistic data at all, they're thankful. Of course, if that kind of solution is already causing pain, you're too late for duct tape. But somehow the company can't see the larger-scale syndrome at work.
I've also seen more than one "bottom-up" reporting system, where the shape and structure of the user-facing solution is based largely on the data and the structures it occupies, with little or no recognition of business-based reporting needs. In these situations, the user frequently must sift through arbitrarily-named (from their perspective) metrics, segmented in senseless ways (again, from their perspective) in order to get answers. Or they get data access along lines skewed to the business roles they fill, restricting them along axes of data volume or visibility. "Data, data everywhere, nor any conclusions to be derived." Of course, by the time this has been built, there's neither budget nor guts to actually build a bridge to the users.
And of course you can never underestimate the sheer inertial power of "We've always shown it that way" ...
But in all cases, let's face it -- the goal is to get OUT of that situation where you're giving the right stuff in the wrong format, and the right people can't find it anyway. So, the key maneuver is to evaluate one's reporting offerings through the lens of business context. Context, which sums up in one word the difference between data and information, gives users the background they need to derive conclusions, make decisions, and ultimately take action when they look at a report. In practical terms, you're showing users graphs with trend lines instead of grids of numbers, or using indicator graphics to show whether costs are up for the current period over the previous one. It also means organizing your reporting data in a way amenable to reporting needs.
Detailed data will still have its place, since certain types of analysis may require a deeper look into the data behind an informational trend. But that's why the good lord invented drill-down logic. I refer to it as a "tip of the spear" strategy: you start with a summary, and allow further exploration along established data relationships. Click the "up vs. down over prior period" indicator, and it shows a grid-style report with daily details.
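To make the idea concrete, here's a minimal sketch of that "tip of the spear" pattern in Python. Everything in it -- the sample cost figures, the function names, the period labels -- is hypothetical and purely illustrative, not taken from any particular reporting tool: a one-line summary with an up/down indicator, backed by a drill-down into the daily details behind the trend.

```python
# Hypothetical example: period-over-period cost indicator with drill-down.
# All data and names are illustrative, not from any specific reporting tool.

daily_costs = {
    "2024-05": [120.0, 135.5, 110.0, 142.25],  # prior period, daily details
    "2024-06": [150.0, 128.75, 160.5, 155.0],  # current period, daily details
}

def indicator(current_total, prior_total):
    """Reduce the period-over-period comparison to one up/down/flat symbol."""
    if current_total > prior_total:
        return "▲"
    if current_total < prior_total:
        return "▼"
    return "◆"

def summary(costs, current, prior):
    """The 'tip of the spear': the single summary line the user sees first."""
    cur, pri = sum(costs[current]), sum(costs[prior])
    return f"Costs {current}: {cur:.2f} {indicator(cur, pri)} vs {prior}"

def drill_down(costs, period):
    """Clicking the indicator reveals the daily grid behind the trend."""
    return [(f"{period}-{day + 1:02d}", amount)
            for day, amount in enumerate(costs[period])]

print(summary(daily_costs, "2024-06", "2024-05"))
for date, amount in drill_down(daily_costs, "2024-06"):
    print(date, amount)
```

The point of the structure is that the summary and the detail are linked along an established data relationship (period, then day), so the user moves from information back to data only when they choose to.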
Accounting for users' workflow-oriented interactions with data can help establish a more coherent and useful representation of it. This structuring of information based on users' needs is called Information Architecture, which the Information Architecture Institute defines as "The art and science of organizing and labeling web sites, intranets, online communities, and software to support findability and usability." And it can mean the difference between efficient, effective information delivery and a hippie's backyard.