
Analytics Modernization: Data as Your Competitive Advantage


You are probably counting on your analytics to give you a competitive advantage. But your analytics strategy is only as good as the data that fuels it. Here are five key things to keep in mind as you design your new data management architecture.


If your business strategy involves competing on analytics, then now is the time to rethink your data management architecture. While overall IT spending is set to grow by 0.6 percent this year, analytics spending is expected to grow much faster. The reason is simple: almost every company sees analytics as a key way to deliver strategic advantage. Computerworld estimated that business analytics spending grew 38 percent last year, and a recent CIO survey by a leading IT research firm lists “BI/Analytics” as the top CIO spending priority for the third year in a row.

But your analytics strategy is only as good as the data that fuels it. Here are five things to keep in mind as you design your new data management architecture.

1.     The Data Warehouse Has a Lot of Life Left in It

A survey by Computerworld last year showed that 70 percent of respondents planned to spend more on their data warehouse, not less. The point is that, for many organizations, the data warehouse is where new analytics insights are operationalized, and where the data behind critical management, operational, and strategic decisions is managed and prepared.

Many mature organizations are moving into big data as a means of augmenting their existing data warehouse investment. The big data environment will be used to create a sandbox for analytics experimentation and innovation. When useful insights are found, they can be brought back to the DW/BI environment to be operationalized.
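To make that loop concrete, here is a minimal sketch of promoting an insight from a big data sandbox back into the warehouse. It assumes a PySpark sandbox over raw Parquet files and a JDBC-reachable warehouse; the paths, column names, table, and credentials are hypothetical and stand in for whatever your environment actually uses.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("sandbox-to-warehouse").getOrCreate()

    # Explore raw clickstream events in the big data sandbox (hypothetical path).
    clicks = spark.read.parquet("hdfs:///sandbox/raw/clickstream/")

    # An insight worth operationalizing: daily conversion rate by marketing channel.
    daily_conversion = (
        clicks
        .groupBy("channel", F.to_date("event_ts").alias("event_date"))
        .agg(F.avg(F.col("converted").cast("double")).alias("conversion_rate"))
    )

    # Promote the validated result set into the existing DW/BI environment.
    daily_conversion.write.jdbc(
        url="jdbc:postgresql://warehouse.example.com:5432/analytics",  # hypothetical
        table="mart.daily_conversion_by_channel",
        mode="overwrite",
        properties={"user": "etl_user", "password": "***",
                    "driver": "org.postgresql.Driver"},
    )

Once an aggregate like this proves its value, it can be scheduled, monitored, and governed alongside the rest of the DW/BI workload rather than living only in the sandbox.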

For organizations with a mature DW/BI environment supported by mature data-management practices, this is the best environment to operationalize analytics for the near term. It may also be the source of many good data management best practices. Other, less-mature organizations have a choice of either restarting their DW/BI initiatives and the data management to support them, or jumping directly to big data and starting from there.

2.     New Analytics Technology Will Be Cumulative, Not a Replacement

For most large organizations with a significant investment in DW/BI and analytics, the new technologies for analytics will be cumulative.

  • For analytics persistence, there will be a choice of relational, Hadoop, columnar, and NoSQL (“Not Only SQL”) stores, with others sure to follow. Data will be persisted in the technology that best fits the needs of the business initiative, which means even more data management complexity as all of these technologies are integrated.
  • For analytics tools and platforms, there will be significant change as users are given their analytics and data visualization tools of choice, business self-service grows, and the environment evolves to include real-time, streaming, and predictive analytics.
  • For organizations that are looking to move quickly, it is far easier to stand up new data warehouses and analytics platforms in the cloud than on-premises, but that also increases the complexity of data management.

The bottom line is that change will be the norm. More mature technologies will persist and newer technologies will be integrated into the environment to augment what currently exists.

3.     Hadoop Does Not Mean that Data Structure and Metadata Don’t Matter

The current emphasis in the Hadoop world is on the rapid ingestion of data. But that does not mean the requirements for data structures, data definitions that capture meaning and context, data quality management, and data security will go away.

For innovation, “pretty good” data may be good enough to determine whether an analytics question is useful. But as that insight is used for more business-critical purposes, it will be more important than ever to ensure that the data has all the properties mentioned above. In short, data should never be used to support important business decisions and processes unless its quality has been verified.

The emerging requirement here is data provenance. With a majority of data coming from outside the organization, according to some analysts, it will also be important to track and rank the reliability of each data source. If you are making a multi-billion-dollar investment decision on a new fab, for example, it is essential to use clean, complete, and timely data.
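As a simple illustration of that kind of verification, the sketch below gates a data set on completeness, duplication, and freshness, and records where it came from, before it is allowed to feed a decision. It assumes pandas; the column name, thresholds, and source labels are hypothetical placeholders, not a prescribed standard.

    import pandas as pd

    def quality_gate(df: pd.DataFrame, source: str) -> dict:
        """Return basic completeness/freshness metrics plus the source, for provenance."""
        age_days = (pd.Timestamp.utcnow()
                    - pd.to_datetime(df["load_ts"], utc=True)).dt.days
        report = {
            "source": source,                               # track where the data came from
            "row_count": len(df),
            "null_rate": float(df.isna().mean().mean()),    # share of missing cells
            "duplicate_rate": float(df.duplicated().mean()),
            "max_age_days": int(age_days.max()),
        }
        report["passed"] = (
            report["null_rate"] < 0.02          # hypothetical thresholds
            and report["duplicate_rate"] < 0.01
            and report["max_age_days"] <= 1
        )
        return report

    # Refuse to use data that fails the gate for a business-critical decision.
    # report = quality_gate(supplier_prices, source="vendor-feed-A")
    # if not report["passed"]:
    #     raise ValueError(f"{report['source']} failed quality checks: {report}")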

4.     Your Data Architecture Will Determine Your Destiny

The next generation of predictive and prescriptive analytics will require data from a wide variety of sources, both internal and external to your organization.

  • Big data requires managing unstructured data with all the familiar challenges of data volume, variety, velocity, and veracity.
  • But traditional, structured data is getting “big” as well. The volume of this data is growing so quickly that it can choke your analytics processes if you are not careful.
  • Data may reside on-premises or in the cloud, in storage systems, data warehouses, or analytics applications.

The point is this: if you are using different tools to manage each of these different types of data, you may create new “data silos” which will slow down the delivery of business value from your analytics initiatives. This brings me to my final point.

5.     You Need an Enterprise-Wide Data Management Architecture for Analytics

If you are going to use analytics as a key market differentiator for your organization, you will need an organization-wide data management architecture to fuel your strategy in a highly consistent and reliable way.

Here are the three key attributes for a next-generation data management architecture: 

1.     Productive: IT departments are struggling to deliver trusted data in the required timeframes. The data management environment must become much faster and more productive in order to best serve the business.

2.     Flexible: As discussed above, the new data management architecture must be designed to quickly on-board new data and incorporate new analytics technologies. What’s more, all this must happen without disrupting the current delivery of data. Next-generation analytics requires a flexible data management architecture that works with any data and connects with any analytics engine or platform. New data technologies must “plug & play” in this world.

3.     Standardized and Automated: Let’s face it: you are not going to get additional staff to deal with data complexity and changing technology. The only way to accelerate the delivery of clean, complete, timely, and secure data is to use a standardized data management platform and tools. The era of hand-coded data management and “tool of choice” approaches is over.
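One way to picture “standardized and automated” versus hand coding is a single, parameterized routine driven by configuration and applied to every source in the same way. The sketch below assumes pandas and SQLAlchemy; the source list, file paths, and staging table names are hypothetical.

    import pandas as pd
    from sqlalchemy import create_engine

    # One configuration entry per source -- not one bespoke script per source.
    SOURCES = [
        {"name": "crm_accounts", "path": "landing/crm/accounts.csv",
         "table": "stg_crm_accounts"},
        {"name": "web_sessions", "path": "landing/web/sessions.csv",
         "table": "stg_web_sessions"},
    ]

    def load_source(cfg: dict, engine) -> None:
        """Apply the same standardized cleansing and loading steps to every source."""
        df = pd.read_csv(cfg["path"])
        df = df.drop_duplicates()                              # standard cleansing step
        df.columns = [c.strip().lower() for c in df.columns]   # consistent naming
        df.to_sql(cfg["table"], engine, if_exists="replace", index=False)

    engine = create_engine("postgresql://etl_user:***@warehouse.example.com/analytics")
    for cfg in SOURCES:
        load_source(cfg, engine)

The point of the pattern is that adding a new source means adding a configuration entry, not writing and maintaining another one-off script.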

Investing in big data technology and new analytics tools is not enough to ensure the success of your analytics strategy. Success depends on building a highly productive and flexible data management architecture to fuel that strategy with clean, complete, timely, and secure data.

Informatica provides an industry-leading, end-to-end data management solution that touches all aspects of data management, including traditional analytics, cloud analytics, and big data analytics.

To take the next step on your analytics journey, see the workbook “Laying the Foundations for Next-Generation Analytics” and read “How to Build a Sustainable Data Infrastructure for Better Analytics: A Step-by-Step.”
