Nowhere is more customer feedback needed than with mobile apps. What works? What do you like? What is needed that isn't there? And, most critically, why do you like or hate whatever you like and hate? A report crossed my desk the other day that came tantalizingly close — until I realized it didn't.
The report came from Applause, a mobile analytics firm. Applause says that it generates "actionable insights and quantifiable metrics based on what users are actually saying about their app experiences." And on the plus side, this report was based on more than 4,000 user reviews of the 95 most popular mobile apps. But the report — not unlike almost every other report I've seen trying to examine mobile app quality — suffers from a lack of "why?" and "in what way?" It isn't based on interviews with these consumers. And the consumers were limited to choosing from five ratings: poor, fair, good, excellent and winning.
The goal of actionable insights is the right one, but what does it tell us that, for example, McDonald's had the lowest overall score and that sports shop Fanatics had the highest? I asked the author of the report, Ben Gray, and he correctly said that he didn't know because the consumers weren't asked why. There are lots of charts plotting who was high and who was low, but without the particulars, what are these retailers to do about it?
Did consumers love the Fanatics app because it was wonderfully designed, with great response time and uptime and the delivery of everything needed? Or did many of them like it because they like the store in general? There was also a big difference in mobile operating system support. Nordstrom, for example, had 2,285 iOS reviews and 21 Android reviews. Forever21 had 11,522 iOS reviews and 1,536 Android reviews. How should those be weighted? (It wasn't all favoring iOS. Foot Locker had 76 iOS reviews and 323 Android reviews, while Victoria's Secret had 1,640 iOS reviews and 2,870 Android reviews.)
That said, let's not reject the good while seeking the perfect. The apps that fared the best were Fanatics, Domino's, Groupon, HauteLook, Overstock, REI and CVS, and the ones that got the worst scores were (from bad to slightly less bad) McDonald's, Michaels, Jimmy John's, Burger King, Macy's and Old Navy.
Mobile developers everywhere want to know what Fanatics and Domino's did — so they can reproduce it — and what McDonald's and Michaels did — so they can avoid it. But we don't know.
The report looked at the list of the weakest performers and observed: "Note that three of the six are quick service restaurants (QSRs). Killer mobile apps can increase restaurant traffic and drive business growth. Look no further than Domino’s Pizza, Five Guys, Pizza Hut, Starbucks and Taco Bell for evidence — and inspiration." That's hard to argue with, but what did Domino's do differently from McDonald's? The problem is that the answer could easily be "Nothing." Perhaps the people who participated just happened to really like Domino's and hate McDonald's. It could be more of a reaction to the food than to how the mobile app is designed.
But just because this report isn't doing it doesn't mean that your retail shop can't. How about awarding healthy-sized gift certificates to users who discuss their mobile experience on the phone? Callers would not be restricted to a script and would be instructed to focus on open-ended questions, with lots of "Why?" and "Can you cite any more examples?" and "What would you have preferred?" and "When you said the site was slow, slower than what? Do you recall if other mobile apps were also slow at that same time?" If the analyst community won't tell you what you're doing right and wrong with your mobile app, you can ask your customers yourself.
This article is published as part of the IDG Contributor Network.