Enterprise BI models undergo radical transformation

'New analytics' provides access, tools directly to line-of-business users

With traditional analytic processing tools, analysts have to first develop a set of questions and then wait for IT to aggregate the relevant data, cleanse it and build paths between different data elements to enable analysis, said Church, who is responsible for managing 120 to 140 projects every year.

QlikView, on the other hand, lets analysts freely compare data elements and look for associations between them on the fly and on an ad hoc basis, she said.
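
QlikView's associative engine is proprietary, but the general pattern Church describes, posing a question and cross-filtering data on the spot instead of waiting for IT to build aggregation paths, can be sketched in a few lines of Python with pandas. The data set and column names below are hypothetical and purely illustrative.

```python
import pandas as pd

# Hypothetical extract standing in for data that would otherwise be
# pre-aggregated by IT before analysts could touch it.
sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "South"],
    "channel": ["web", "retail", "web", "retail", "web"],
    "revenue": [1200, 800, 950, 400, 700],
})

# An ad hoc question posed on the fly: how does revenue split by region
# and channel? No pre-built path between these elements is required.
print(sales.pivot_table(index="region", columns="channel",
                        values="revenue", aggfunc="sum", fill_value=0))

# A follow-up selection narrows the view interactively, roughly the way
# clicking a value in an associative BI tool filters every linked chart.
print(sales[sales["region"] == "East"].groupby("channel")["revenue"].sum())
```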

Another organization that is taking advantage of similar capabilities is The CementBloc, a New York-based company that helps large drug companies fine-tune and optimize their communications and marketing strategies. The company uses Tibco's Spotfire analytics platform to integrate and analyze data from multiple information sources.

"With traditional BI tools, you have to know what you are going to predict," said Ira Haimowitz, executive vice president, intelligence and analytics at CementBloc. "You need to know what you are going to predict by customer segment, or by geography, and map that out to a program, and then you generate queries and reports," Haimowitz said.

Spotfire's in-memory database technology and its search and data visualization capabilities eliminate such requirements. The technology has allowed CementBloc to explore big and diverse data sets at will and find relationships between data elements that its analysts didn't know existed, he said.
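
Spotfire's internals aren't detailed here, but the kind of open-ended exploration Haimowitz describes, scanning a data set for relationships nobody specified up front, can be illustrated with a small Python sketch; the marketing metrics and figures below are invented for the example.

```python
import numpy as np
import pandas as pd

# Invented marketing-activity data; in practice this would come from the
# multiple sources CementBloc integrates.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "email_opens":  rng.integers(0, 50, 500),
    "rep_visits":   rng.integers(0, 20, 500),
    "sample_drops": rng.integers(0, 10, 500),
})
df["prescriptions"] = 0.6 * df["rep_visits"] + rng.normal(0, 2, 500)

# Scan every pair of columns for correlation and rank the strongest,
# surfacing relationships that were never written into a report spec.
corr = df.corr().abs()
upper = np.triu(np.ones(corr.shape, dtype=bool), k=1)
print(corr.where(upper).stack().sort_values(ascending=False).head())
```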

QlikTech and Tibco are not the only vendors offering BI, data visualization and data analytics tools. Over the past few years, many vendors, including Birst, Tableau, Datameer and Splunk, have joined traditional enterprise players such as IBM, Teradata and SAS in delivering capabilities that are driving new BI applications.

The tools offer enterprises "more and more ways to capture, move around, scrub and analyze data," said Bill Abbott, principal of applied analytics at PwC. Some companies are applying these tools to integrate, extract and analyze existing data sets. Many others are using them on top of brand-new data infrastructures based on big data technologies such as Hadoop, Abbott said.
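
Abbott's point about layering these tools on Hadoop-based infrastructure can be made concrete with a short sketch using PySpark, one common way to scrub and analyze data sitting in such a cluster; the HDFS path, file format and column names are assumptions for the example, not a prescribed setup.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scrub-and-analyze").getOrCreate()

# Hypothetical file in an HDFS-backed data lake; path and schema are illustrative.
raw = spark.read.csv("hdfs:///data/web_events.csv", header=True, inferSchema=True)

# "Scrub": drop records missing a customer id and normalize a text field.
clean = (raw.dropna(subset=["customer_id"])
            .withColumn("channel", F.lower(F.trim(F.col("channel")))))

# "Analyze": aggregate directly over the full data set rather than a
# pre-computed summary cube.
clean.groupBy("channel").agg(
    F.countDistinct("customer_id").alias("customers"),
    F.sum("revenue").alias("revenue"),
).show()
```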

"Twenty years ago, there was this heavy emphasis on requirements-gathering because you wanted to precalculate all the answers," said Anthony Deighton, CTO at QlikTech. "You needed to work with users upfront to get a feel for all the questions they would likely ask. It led to a service-heavy implementation model for BI projects," he said.

New analytics "is about detecting opportunities and threats you hadn't anticipated, or finding people you didn't know existed who could be your next customers," PwC noted in its report. "It's about learning what's really important, rather than what you thought was important. It's about identifying, committing, and following through on what your enterprise must change most."

Jaikumar Vijayan covers data security and privacy issues, financial services security and e-voting for Computerworld. Follow Jaikumar on Twitter at @jaivijayan or subscribe to Jaikumar's RSS feed. His email address is jvijayan@computerworld.com.

See more by Jaikumar Vijayan on Computerworld.com.

Copyright © 2012 IDG Communications, Inc.
