Big Data, a relatively new term invented by a raft of PR-savvy technology companies eager to sell their software and hardware, has become a favorite hot topic at technology conferences around the world, received a tremendous amount of press in the mainstream media, and been the subject of endless speculation in scores of analyst blogs in recent years.
I submit, however, that these discussions should really be about Big Value Data -- data that’s truly mission-critical to the enterprise -- rather than merely Big Data, the vast majority of which, according to a recent post by Ash Ashutosh, CEO of Actifio, a provider of data management software, is “either duplicated data or synthesized data,” i.e., unactionable intelligence that isn’t critical to most missions.
The financial services industry has traditionally been far ahead of other industries when it comes to using large volumes of data for a variety of purposes -- including segmentation, marketing offers, and fraud detection -- and their initial Big Data initiatives pre-date the coining of the term by a number of years.
However, except when these initiatives have been applied to fraud detection, wherein millions of transactions are analyzed and abnormal patterns are detected, they’ve tended to focus on broader strategic issues -- items discussed semiannually in the boardroom -- rather than daily operations.
The industry is still taking the lead today, showing marked interest in leveraging new sources of data -- primarily social media and other public data -- in order to accommodate the data and analytics tools their stakeholders prefer, to better understand their markets and customer sentiment, and to make sound business decisions about products and services.
When you consider the unprecedented challenge of dealing with this unstructured data, the need for a refreshed focus on data governance becomes obvious.
Still, unstructured data is almost never truly mission-critical Big Value Data that keeps the daily operations of the institution moving -- on the fly, at any time, and with great agility.
That type of data needs to be governed when it’s at rest in institution-controlled databases, when it’s in motion between internal applications, when it’s en route from one application to an employee’s or customer’s device, and when it’s in transit from those devices to the Internet.
Big Value Data, then, is best described as small-but-important packets of data -- data whose very existence virtually demands governance -- that facilitate high-value transactions. I see three categories of it, all of which demand continued focus, investment, and governance by financial services institutions: pure transactional data, customer reporting data, and regulatory reporting data.
Pure transactional data
It’s no exaggeration to say that this type of data keeps global commerce moving. In the U.S. alone, financial services institutions move trillions of dollars every day on behalf of their customers, which represents very big value in very small amounts of data. You need look no further than the RBS meltdown of July 2012 -- when a software upgrade failed, delayed the processing of some 100 million payment transactions, and affected millions of customers -- to recognize the mission-critical nature of the pure transactional data category.
Customer reporting data
As important as pure transactional data is the reporting of financial data to customers -- particularly commercial and institutional customers, for whom timely, accurate transaction reporting has as much value as (if not more than) the transaction processing itself. These customers know that if they don’t know where their money is, they can’t effectively manage it, and timely, reliable reporting from financial institutions ensures that their daily financial operations are optimized.
Regulatory reporting data
This is becoming a bigger concern due to the introduction of myriad new regulations compounding pre-existing regulations. Promptness, data quality assurance, and the ability to integrate data from multiple silos are unquestionably mission critical. While silo multiplicity doesn’t necessarily mean we’re talking about Big Data, the regulatory requirements attached to that data -- no matter how little or how much of the data there is -- do mean we’re talking about Big Value Data.
Many technology enthusiasts, analysts, and IT organizations have become downright enchanted by the Big Data concept. Their imaginations run wild with the possibilities. “What if,” they wonder, “the data one company in one industry collects on one customer could be processed, crunched, and conditioned into something another company in another industry would find valuable?” Could organizations with vast stores of personal data separate the valuable-to-them wheat from the valuable-to-someone-else chaff and then sell that chaff to the highest bidder? It’s a tantalizing question.
But while PR-savvy technology companies, technology conferences, the mainstream media, and analysts continue to tackle that question, I believe that financial services institutions owe it to themselves and their customers to pay less attention to that speculation and more attention to guaranteeing that they have secured their Big Value Data. They should not rest until they’re ready to sustain critical financial flows, ensure regulatory compliance, and guarantee that the specter of penalties and operational restrictions in no way looms overhead.