12 predictive analytics screw-ups

Make these mistakes and you won't need an algorithm to predict the outcome


10. If you build it they will come: Don't worry about how to serve it up.

OK, you've finally got a predictive model that actually works. Now what?

Organizations often talk extensively about the types of models they want built and the return on investment they expect, but then fail to deploy those models successfully to the business.

When consultants at Elder Research ask how the business will deploy the models in the work environment, the response often is "What do you mean by deployment? Don't I just have models that are suddenly working for me?" The answer is no, says Deal.

Deployment strategies, or how the models will be used in the business environment once they are built, can range from very simple -- a spreadsheet or results list given to one person -- to very complex systems where data from multiple sources must be fed into the model.


Most organizations fall into the latter category, Deal says: They have complex processes and huge data sets that require more than just a spreadsheet or results list to make use of the output. Not only do companies have to invest in appropriate analytics software, which could cost $50,000 to $300,000 or more, but they may need software engineering work performed to connect the data source to the software that runs the models.

Finally, they may need to integrate the outputs into a visualization or business intelligence tool that people can use to read and interpret the results. "The deployment of a successful model is sometimes more work than building the model itself," he says.
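The pipeline Deal describes -- feeding data from multiple sources into the model, then handing scored output to a tool people can read -- can be sketched in a few lines. This is a minimal illustration only; the file names, field names, and the stand-in scoring function are all hypothetical, not anything from Elder Research:

```python
# Minimal sketch of a model-deployment pipeline (all names hypothetical):
# pull records from several source extracts, score them with a model,
# and write ranked results where a BI or visualization tool can read them.

import csv
import json


def load_records(paths):
    """Merge rows from several CSV extracts into one list of dicts."""
    records = []
    for path in paths:
        with open(path, newline="") as f:
            records.extend(csv.DictReader(f))
    return records


def score(record):
    """Stand-in for the trained model: a weighted sum of two risk fields."""
    return 0.6 * float(record["amount_anomaly"]) + 0.4 * float(record["vendor_risk"])


def run_pipeline(paths, out_path):
    """Score every record and export them highest-risk first."""
    records = load_records(paths)
    for r in records:
        r["fraud_score"] = round(score(r), 3)
    # Rank so users of the output see the highest-risk items first.
    records.sort(key=lambda r: r["fraud_score"], reverse=True)
    with open(out_path, "w") as f:
        json.dump(records, f, indent=2)
    return records
```

In practice each of these three steps -- ingestion, scoring, and export -- is where the software engineering cost Deal mentions tends to land, since real data sources rarely arrive as clean files.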

Even then, the deployment strategy may need to be tweaked to meet the needs of users. For example, the Office of Inspector General for the U.S. Postal Service worked with Elder Research to develop a model for scoring suspicious activities for contract-fraud investigators.

At first the investigators ignored the predictive models. But the tool also gave them access to data they needed for their investigations.

Then the team decided to present the information in a more compelling way, creating heat maps to show which contracts on a map had the highest probability of fraud. Gradually, investigators started to appreciate the head start the scoring gave to their investigations.
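Behind a heat map like that sits a simple aggregation: contract-level scores rolled up to one intensity value per location, which the mapping layer then shades. A rough sketch, with hypothetical field names rather than anything from the actual USPS tool:

```python
# Sketch of the aggregation behind a fraud heat map (hypothetical fields):
# reduce contract-level fraud probabilities to one value per region,
# which a mapping layer would then render as color intensity.

from collections import defaultdict


def heat_map_values(contracts):
    """Return the highest fraud probability seen in each region."""
    by_region = defaultdict(float)
    for c in contracts:
        by_region[c["region"]] = max(by_region[c["region"]], c["fraud_prob"])
    return dict(by_region)
```

Using the maximum per region is one plausible choice; an average or a count of contracts above a threshold would produce a different, equally valid picture.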

Today, some 1,000 investigators are using it. It was a learning moment even for the experts at Elder Research. "We learned a lot about how people use the results, and how they develop an appreciation for the predictive models," Deal says.
