Our industry has to get much more analytical in how we make decisions about the analytic tool sets and techniques we deploy.
I recently pinged 2,500 senior decision-makers to ask how their organizations made decisions about analytics investments. I was surprised that for more than 60% of the respondents, decision-making around analytic strategy, architecture, tool sets, base platforms, techniques and capability development is basically ad hoc — they’ve been winging it. I needn’t point out the deep irony in that.
The transcendent importance of analytics has been clear for some time. In my book The New Know: Innovation Powered by Analytics (Wiley, 2009), I argued that analytics was emerging as an affordable and accessible source of competitive advantage. In the seven years since then, almost a thousand books and tens of thousands of blog posts, articles and webinars have piled on the proposition that analytics is a good thing. Washington insiders take it as fact that investment in analytics — or lack of investment — was the difference maker in the 2008 and 2012 presidential elections. Enough already. I don’t think we need any more surveys documenting “analytics, good; no analytics, bad.” It is time we added a little more nuance to the discussion.
Let’s start with the basics, though. Analytics is a big sandbox that encompasses the entire decision spectrum — from operational decisions to tactical decisions to strategic decisions (those with huge impact but low frequency).
But analytics is a heavily modified term. “Descriptive analytics” is for understanding what happened in the past. “Diagnostic analytics” is for unearthing why something happened. “Predictive analytics” applies statistical and machine-learning models to historical data to forecast what is likely to happen in the future. And “prescriptive analytics” considers what we should do next. To all of this I add big data and data science as part of the analytics superset.
Big data, of course, is a big part of why analytics has become essential. The existence of big data is not something we have a choice about. It simply is. Our choice is between ordering and exploiting it, on the one hand, and being overwhelmed by it, on the other. In the next four years, something in the neighborhood of 60 zettabytes of new digital information will be created. That is a number so big, it might require explication, even for the readers of Computerworld. The prefix zetta indicates multiplication by the seventh power of 1,000. For perspective, consider that half a zettabyte is thought to approximate the entire World Wide Web in 2009.
When information is being created at a rate of 15 zettabytes per year, being overwhelmed can seem like the only option, especially given the fact that the digital storage industry manufactures about half a zettabyte of storage capacity a year. What do we do when we are creating more digital data than we have places to store it?
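The gap described above is worth making concrete. A minimal sketch, using only the figures cited in this column (60 zettabytes of new data over four years, and roughly half a zettabyte of manufactured storage capacity per year), shows just how lopsided creation and capacity are:

```python
# Back-of-the-envelope arithmetic using the column's own estimates.
# zetta- means the seventh power of 1,000, i.e. 10^21.
ZETTA = 1000 ** 7

new_data_zb_per_year = 60 / 4        # 60 ZB created over four years
storage_built_zb_per_year = 0.5      # storage manufactured annually

gap_ratio = new_data_zb_per_year / storage_built_zb_per_year

print(f"Data created per year: {new_data_zb_per_year} ZB "
      f"({new_data_zb_per_year * ZETTA:.1e} bytes)")
print(f"Data created vs. storage built: {gap_ratio:.0f}x")
# Data created per year: 15.0 ZB (1.5e+22 bytes)
# Data created vs. storage built: 30x
```

By these estimates, the world creates on the order of 30 times more digital data each year than it builds storage to hold, which is exactly why organizations must decide which data to keep and which to let slip away.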
Which data has what value?
Organizations need to analyze what they want to know before investing in analytic platforms, tool sets and techniques. The first step on the path to analytic mastery is to become street smart about data resources — you have to know what data you are collecting, what data you can safely allow to slip away, how you are using the captured data, and how your collection and use practices compare with others in your industry.
The second step is to decide who in your organization should be charged with rethinking existing business processes based on the new analytic tool sets. You don’t want a pure number-cruncher for this. You want someone who can think entrepreneurially about creating new revenue streams based on the new analytic tool sets.
For example, in the retailing vertical market, analytics has historically been applied to physical products — forecasting which products on the shelves might be approaching out-of-stock situations, for instance.
But in the new world of analytics, savvy retailers will use ever more data to move from being shelf-centric to shopper-centric as they learn to assess what shoppers’ true and recurring needs and wants are. What items, and in what quantities, do shoppers typically buy? Do they prefer self-service checkout lanes or human-assisted transactions? What time of day do they shop? Are they brand loyalists or price-sensitive bargain shoppers? Do they prefer to pay in cash, by debit card or by credit card with a reward incentive?
It can take a subtle mind to see what kind of data has real value. An illustration of this can be drawn from the Age of Sail. In the 19th century, Matthew Maury used “dusty old ship logs” (a data source previously thought to be useless) to plot the ocean’s currents. Maury was a pioneer of datafication (see Big Data: A Revolution That Will Transform How We Live, Work, and Think, by Kenneth Cukier and Viktor Mayer-Schönberger). Today, some of the hardest-working analysts in the world build on his legacy. They labor in a little-known branch of the U.S. Navy, the Naval Meteorology and Oceanography Command, using oceanographic and atmospheric data to create information products that improve mission performance and marine safety.
Futurist Thornton A. May is a speaker, educator and adviser and the author of The New Know: Innovation Powered by Analytics. Visit his website at thorntonamay.com, and contact him at email@example.com.