So Abbott went looking for more data -- from the technicians themselves. "The secret sauce is taking the data you have and augmenting it so that the attributes have more information in them," he says. After speaking with the domain experts, his team came up with a successful approach.
"Instead of having hundreds of sparsely populated variables, we condensed this into dozens more information-rich variables, each tied to the historic relationships to parts being needed," Abbott explains. Essentially, they matched up the occurrence of certain keywords in repair histories to discover what percent of the time a part had been needed.
"What we were doing was reworking the data to be more aligned with what an expert would be thinking, instead of relying just on the algorithms to pull things together. This is a trick we use a lot because the algorithms are only so good at pulling together those patterns," he says.
9. Just assume that the keepers of the data will be fully on board and cooperative.
Many big predictive analytics projects fail because the initiators didn't cover all of the political bases before proceeding. One of the biggest obstacles can be the people who own or control the data, or who control how business stakeholders can use it. One Elder Research client -- a payday lending firm, which offers short-term loans to tide people over until their next paycheck -- never got past the project kickoff meeting due to internal dissent.
"All along the way we were challenged by the IT person, who was insulted that he had not been asked to do the work," Deal says. All of the key people who were integral to the project should have been on board before the first meeting started, he says.
Then there was the case of a debt collection firm that had big plans for figuring out how to improve its success rate. Abbott attended the launch meeting. "The IT people had control of the data and they were loath to relinquish any control to the business intelligence and data mining groups," he says.
The firm spent hundreds of thousands of dollars developing the models, only to have management put the project into a holding pattern "for evaluation" -- for three years. Since by then the information would have been useless, "holding pattern" was effectively a euphemism for killing the project. "They ran the model and collected statistics on its predictions, but it was never used to change decisions in the organization, so [it] was a complete waste of time."
"The models were developed but never used because the political hoops weren't connected," Abbott says. So if you want to succeed, build a consensus -- and have C-suite support.