The Starbucks passwords-in-the-clear boo-boo has begun to fade now that the company has fixed the errant mobile-payment app. But it's clear that the episode is a major heads-up, not so much for Starbucks security as for all developers of mobile apps.
It looks more and more like Starbucks committed a sin of omission. The coffee chain tested its app before it was released back in May 2013, but it didn't look at everything the app was saving. It was primarily checking that everything worked as it was supposed to.
That kind of security sin of omission is one that a huge number of large companies are guilty of, especially when it comes to mobile apps.
Starbucks' official stance is that management did not know that an app-crash analytics program -- called Crashlytics and purchased by Twitter last year -- was storing passwords in clear text when it captured data to facilitate recovery after a crash. As spokesperson Linda Mills said, "We knew in May 2013 of the existence of the temporary crash log file for crash diagnostics, but did not know that the clear text contained account user name and password." I have talked, however, to a person familiar with the early testing who told me that management did know that passwords were being retained in clear text. What's the truth? I don't know. It could be that there were people in IT who knew about the password problem but didn't communicate it properly through channels.
But there's a bigger issue here, and it's about a lot more than just Starbucks. I'm talking about an assumption that I believe a lot of companies are prone to make when developing apps: There's no reason to suspect that sensitive information is being captured and displayed. With that assumption, no one looks for that sort of thing. And then it comes as a big surprise that, hey, yeah, that very thing is in fact happening.
Here's my message to all companies that might take that attitude: You have to view mobile apps as the informationally dangerous critters that they are. That means a lot more testing, including peering deeply into what these apps are actually doing.
You say you don't have the time or money to do that? I disagree. When security researcher Daniel Wood examined the Starbucks files and performed static analysis, he didn't have a large budget, and it didn't take a huge amount of time. "This type of testing focuses on what type of data elements are being stored on the disk, whether in temporary cache files, memory, or permanent data files that are used for normal operation," Wood said. "It also includes analyzing application files for improper programmatic methods or functions being utilized and usually includes what is called secure code review to determine if an application is sanitizing input and output properly, among other concepts."
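To make that concrete, here's a minimal sketch of the kind of on-disk check Wood describes: create a test account with known credentials, exercise the app on a test device, then scan a copy of its sandbox for those values in clear text. The directory path and the credentials below are hypothetical stand-ins for illustration, not Wood's actual tooling.

```python
# Minimal sketch: scan an app's local storage (cache files, databases,
# logs) for a known test credential stored in clear text.
import os

TEST_USERNAME = b"testuser@example.com"   # hypothetical test-account value
TEST_PASSWORD = b"KnownTestPassword123"   # hypothetical test-account value

def scan_app_storage(app_dir):
    """Return files under app_dir that contain the test credentials."""
    hits = []
    for root, _dirs, files in os.walk(app_dir):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "rb") as f:
                    blob = f.read()
            except OSError:
                continue  # unreadable file; skip it
            if TEST_USERNAME in blob or TEST_PASSWORD in blob:
                hits.append(path)
    return hits

if __name__ == "__main__":
    # Point this at a copy of the app's sandbox pulled from a test device.
    for path in scan_app_storage("./app_sandbox_dump"):
        print("Clear-text credential found in:", path)
```

A run like this takes minutes, which is Wood's point: the basic version of this testing is cheap.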
And you have ample reason to come up with the time and money for this sort of testing: failing to do it can lead to embarrassment and customer anger. So start thinking about what you need to check. First, how does your code interact with the mobile OS on each device it will run on? What about third-party code, the supposed culprit in the Starbucks situation? There, the third-party component did a few things it wasn't supposed to do, and that caused other parts of the app to do things -- like revealing passwords -- that no one had apparently considered possible.
But you have to assume that sort of thing is possible, and you have to specifically look for it.
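One way to specifically look for it is to make the check a standing part of your test suite rather than a one-time audit. Here's a hedged sketch of such a regression test: force a crash on a test device, pull whatever log your crash reporter writes, and assert it contains no clear-text credential. The log path and password are assumptions for illustration.

```python
# Sketch of a regression test: after a forced crash on a test device,
# the harness copies the crash reporter's log to artifacts/crash_log.txt
# (a hypothetical path), and this test asserts it holds no credentials.
import unittest

TEST_PASSWORD = "KnownTestPassword123"  # hypothetical test-account value

class CrashLogHygieneTest(unittest.TestCase):
    def test_crash_log_has_no_clear_text_password(self):
        with open("artifacts/crash_log.txt", "r", errors="replace") as f:
            log = f.read()
        self.assertNotIn(TEST_PASSWORD, log,
                         "crash log captured a clear-text password")

if __name__ == "__main__":
    unittest.main()
```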
You have to look for things that would be in conflict with industry and government regulations like PCI, HIPAA and Sarbanes-Oxley. You don't want to release an app that reveals payment-card data, restricted healthcare information or sensitive financial data. Until you run the kind of exhaustive tests Wood is talking about, you can't be sure that those things aren't happening.
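To illustrate just the payment-card piece: a scanner can flag digit runs that pass the Luhn checksum, the arithmetic check that every valid card number satisfies. What follows is a rough heuristic sketch, not a PCI compliance tool, and the directory it scans is again a hypothetical stand-in.

```python
# Heuristic sketch: flag files containing digit runs that pass the Luhn
# checksum, a sign that payment-card numbers may be stored on disk.
import os
import re

CARD_PATTERN = re.compile(rb"\b\d{13,19}\b")  # typical card-number lengths

def luhn_ok(digits):
    """Standard Luhn checksum used by payment-card numbers."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:      # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(app_dir):
    """Return (file, candidate) pairs whose digit runs pass the Luhn check."""
    findings = []
    for root, _dirs, files in os.walk(app_dir):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "rb") as f:
                    blob = f.read()
            except OSError:
                continue
            for match in CARD_PATTERN.findall(blob):
                if luhn_ok(match.decode()):
                    findings.append((path, match.decode()))
    return findings
```

Expect false positives from a check like this; the point is that it surfaces candidates a human can then vet.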
Wood, who is now working closely with Starbucks as a consultant (unpaid for the moment), has talked with a lot of Starbucks people involved in this, and he has come to his own conclusion about how it happened. "From my review, I could tell that they tested the application and ensured that there were security protections in place. Unfortunately, whether it was a miss or oversight on their part, the application exposed those sensitive user data elements. This is where I reference how the business just doesn't understand the IT or security side of the house. This is the norm in almost every organization, small or large. Having a security architect/engineer brought into the project from the beginning during the planning and initiation phases of the life cycle would ensure that mishaps like these won't occur as much. Conducting a security impact analysis on something like including a third-party service should have been conducted and security testing should have focused specifically on this piece. If it was, it most likely would have caught something like this. Since I wasn't in the room when this decision and others were made, this is only conjecture, however, I believe this was probably the case."
So, no, this isn't really about Starbucks. "This is the norm in almost every organization, small or large." Ask your team today about the level of security testing that they perform for their mobile apps. Not functionality testing, but true security testing. You just might save yourself from a very big headache.
Evan Schuman has covered IT issues for a lot longer than he'll ever admit. The founding editor of retail technology site StorefrontBacktalk, he's been a columnist for CBSNews.com, RetailWeek and eWeek. Evan can be reached at eschuman@thecontentfirm.com and he can be followed at twitter.com/eschuman. Look for his column every Tuesday.