Keeping Users Close
Bryan Jones started on a shoestring budget -- but that's not why his first effort at predictive analytics failed. Jones, director of countermeasures and performance evaluations in the Office of the Inspector General at the U.S. Postal Service, wanted to help investigators determine which healthcare claims were most likely to be fraudulent.
After eight months, he had a working model, but the independent analytics group behind the project wasn't fully engaged with the department that would be using the tool. As a result, investigators largely ignored the model's raw spreadsheet output.
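The article doesn't describe the model's internals, but the general shape of such a system is worth sketching. Below is a minimal, hypothetical Python illustration: a classifier trained on labeled historical claims, then used to rank new claims by fraud probability. The features, data and threshold are all invented for the example; dumped into a spreadsheet as-is, a ranking like this is exactly the kind of raw output that failed to land with investigators.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Invented stand-in features (e.g., claim amount, provider volume,
# days to file); a real model would draw on the agency's case data.
X = rng.normal(size=(1000, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 1.5).astype(int)

# Train on historical claims labeled fraudulent (1) or legitimate (0).
model = GradientBoostingClassifier(random_state=0).fit(X, y)

# Score incoming claims and surface the riskiest first.
new_claims = rng.normal(size=(10, 3))
scores = model.predict_proba(new_claims)[:, 1]
for i in np.argsort(scores)[::-1][:5]:
    print(f"claim {i}: fraud risk {scores[i]:.2f}")
```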
Best Practices
9 Steps to Success With Predictive Analytics
Follow these best practices to ensure a successful foray into predictive analytics.
1. Define the business proposition. What is the business problem you're trying to solve?
2. Recruit allies on the business side. Having the support of a key executive and a business stakeholder is crucial.
3. Start off with a quick win. Find a well-defined business problem where analytics can deliver measurable results.
4. Know the data you have. Do you have enough data -- with enough history and enough granularity -- to feed your model?
5. Get professional help. Creating predictive models is different from traditional descriptive analytics, and it's as much of an art as it is a science.
6. Be sure the decision-maker is prepared to act. An action plan alone isn't enough -- someone has to carry it out.
7. Don't get ahead of yourself. Stay within the scope of the defined project, even if success breeds pressure to expand the use of your current model.
8. Communicate the results in business language. Talk about things like revenue impact and fulfillment of business objectives.
9. Test, revise, repeat. Conduct A/B testing to demonstrate value (a simple sketch of such a test follows this list). Present the results, gain support, then scale out.
Sources: Guy Peri, P&G; George Roumeliotis, Intuit; Dean Abbott, Abbott Analytics; Eric Siegel, Prediction Impact; John Elder, Elder Research; Anne Robinson, The Institute for Operations Research and the Management Sciences.
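For step 9, the experts don't prescribe a particular readout; one conventional way to score an A/B test is a two-proportion z-test. The Python sketch below uses invented numbers to compare the confirmed-fraud rate in cases ranked by a model against cases worked in the usual order; the function and figures are hypothetical.

```python
# Hypothetical A/B readout: did model-ranked cases (group A) yield a
# higher confirmed-fraud rate than cases worked as usual (group B)?
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Return the z statistic and two-sided p-value for p_A vs. p_B."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Illustrative numbers only: 48 confirmed frauds in 200 model-ranked
# cases vs. 30 in 200 cases worked the traditional way.
z, p = two_proportion_z_test(48, 200, 30, 200)
print(f"z = {z:.2f}, p = {p:.4f}")
```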
Fortunately, Jones' group had the support of the inspector general. "You're dead in the water if you don't have support from the top," he says.
The second time around, Jones hired a consultant to help with modeling and data prep, and embedded an analyst within the group that would be using the results.
And they made those results more "real" to users. For an investigation of contract fraud, for example, his team presented the results in a Web-based interactive heat map that showed each contract as a circle, with larger circles representing the biggest costs and red circles marking the highest risks for fraud.
Investigators could click on a circle to see the details of that contract and of related contracts at risk. "That's when people started to notice that we really had something that could help them," says Jones.
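The encoding the team describes -- circle size for cost, color for risk, click-through for details -- is straightforward to sketch. Below is a rough, hypothetical Python/Plotly version, not the OIG's actual tool: the column names, layout coordinates, and data are invented, and hover details stand in for the click-through behavior.

```python
import pandas as pd
import plotly.express as px

# Invented contract data; a real map would pull from case records.
contracts = pd.DataFrame({
    "contract_id": ["C-101", "C-102", "C-103", "C-104"],
    "cost_musd":   [12.0, 3.5, 25.0, 8.2],     # contract cost, $M
    "risk_score":  [0.85, 0.20, 0.95, 0.40],   # modeled fraud risk, 0-1
    "x": [1, 2, 3, 4],                         # arbitrary layout positions
    "y": [1, 3, 2, 4],
})

fig = px.scatter(
    contracts,
    x="x", y="y",
    size="cost_musd",                 # bigger circle = bigger cost
    color="risk_score",               # redder circle = higher fraud risk
    color_continuous_scale="RdYlGn_r",
    hover_data=["contract_id", "cost_musd", "risk_score"],
)
fig.show()
```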
Jones' advice: Get close to your customer, get professional help building your first model, and present the results in a compelling, easy-to-understand way. "We didn't have the right people or expertise to begin with. We didn't know what we didn't know," he says, so he turned to an outside data-mining expert to help with the models. "That relationship helped us understand why we failed and kept us from making the same mistakes again," Jones says.