When it comes to mobile, you pretty much have no privacy rights

Police are very persistent in trying to gain access to suspects' devices.

As law enforcement continues to battle for access to mobile devices, police are being advised to not even look at a suspect's phone. The idea is that a phone that authenticates via facial recognition could fail to unlock for the officer repeatedly and then default to password/PIN.

This advice, contained in a series of vendor slides accessed by Motherboard, refers to iPhone's security lockout, which kicks in after five failed biometric authentication attempts. On the one hand, this could be an issue with Face ID. Unlike finger scans, it's hard to determine when one facial-recognition attempt ends and a second begins. If someone looks at the phone, looks away and looks again, does that constitute two attempts? What if the person just looks at the phone for a relatively long time? Will the phone eventually conclude that this constitutes more than one failed authentication attempt?
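The lockout behavior being described — count each failed biometric match, and after five failures disable biometrics and demand the passcode — can be sketched roughly as follows. This is an illustrative model only, not Apple's actual implementation; the class and method names are hypothetical.

```python
class BiometricLock:
    """Illustrative sketch of a five-strikes biometric lockout.
    Not Apple's implementation; names are hypothetical."""

    MAX_ATTEMPTS = 5  # after five failures, fall back to passcode

    def __init__(self):
        self.failed_attempts = 0
        self.biometrics_disabled = False

    def try_face_unlock(self, face_matches: bool) -> str:
        # Once biometrics are disabled, only the passcode will do.
        if self.biometrics_disabled:
            return "passcode required"
        if face_matches:
            self.failed_attempts = 0  # success resets the counter
            return "unlocked"
        self.failed_attempts += 1
        if self.failed_attempts >= self.MAX_ATTEMPTS:
            self.biometrics_disabled = True
            return "passcode required"
        return "try again"
```

Under this model, the officer's problem is exactly the one the slides warn about: whether an officer's glance counts as one failed attempt or several depends entirely on how the device segments attempts, which is opaque to the person holding it.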

On the other hand, this seems unlikely. With the phone locked, the detective isn't going to learn much by staring at the screen. Maybe he or she might glance again, but five times?

Note: The Motherboard story includes a BBC link to a story about British law enforcement officers waiting for a suspect to unlock the phone and then quickly grabbing it, being careful to continually swipe the screen so it won't time out and ask for reauthentication.

All of this is of interest because, when dealing with police, there have been suggestions that PINs/passwords cannot be compelled but biometric authentication can. The argument typically speaks to practicality, meaning that law enforcement can physically force a suspect's fingers onto a mobile device — or place the suspect's face in front of the screen — but officers can't force suspects to speak or type their password/PIN.

Even at the practicality level, I don't see this holding up. First off, sometimes law enforcement can get, how shall we say, hands on? Legal or not, with the ever-present threat of physical violence or prolonged detention, it's going to be a rare suspect who will stick with a refusal. Even if the suspect does consistently refuse, can the suspect be jailed until the password/PIN is revealed?

I find it unlikely that most law enforcement will ask for a PIN/password and, when told "no," simply accept it with a shrug and an "Oh well. I tried. Such is life." More likely, the phone will be confiscated and held for a remarkably long time.

What the U.S. Supreme Court has ruled on is that a warrant is needed to search a phone. But a warrant isn't especially hard to get, so we're back to the core question: Does law enforcement have the right to search a phone's contents, even with a warrant? The answer appears to be yes, so the distinction of PIN/password versus biometric authentication will not ultimately make much of a difference.

Still, common advice is to deactivate biometric authentication when going into an environment where this issue will likely crop up, such as attending a demonstration or going to the airport for a visit to another country. It's an easy enough piece of advice, but I doubt it will offer protection for very long.

One good aspect of a related legal ruling — the U.S. Supreme Court case Riley v. California — is that it tried to draw a distinction between data stored on the mobile device and data stored in the cloud.

It's worth quoting from the Supreme Court decision itself:

"Treating a cell phone as a container whose contents may be searched incident to an arrest is a bit strained as an initial matter. But the analogy crumbles entirely when a cell phone is used to access data located elsewhere, at the tap of a screen. That is what cell phones, with increasing frequency, are designed to do by taking advantage of cloud computing.  Cell phone users often may not know whether particular information is stored on the device or in the cloud, and it generally makes little difference. Moreover, the same type of data may be stored locally on the device for one user and in the cloud for another. The United States concedes that the search incident to arrest exception may not be stretched to cover a search of files accessed remotely — that is, a search of files stored in the cloud. Such a search would be like finding a key in a suspect's pocket and arguing that it allowed law enforcement to unlock and search a house. But officers searching a phone's data would not typically know whether the information they are viewing was stored locally at the time of the arrest or has been pulled from the cloud. Although the Government recognizes the problem, its proposed solutions are unclear. It suggests that officers could disconnect a phone from the network before searching the device — the very solution whose feasibility it contested with respect to the threat of remote wiping. Alternatively, the Government proposes that law enforcement agencies 'develop protocols to address' concerns raised by cloud computing. Probably a good idea, but the Founders did not fight a revolution to gain the right to government agency protocols."

This cloud issue is crucial, and it's still very much undecided. This is especially critical when the device at issue is a corporate-issued phone or, even more likely to be the case, a BYOD phone owned by the consumer but including extensive sensitive corporate files, whether partitioned or not.

Another possibility for protecting the phone is to have two tiers of access. The first, unlocked by a simple password, would be sanitized: enough innocuous messages, photos and audio files to look legitimate. But another password — a much longer one — would unlock the real data.

This operates on a similar premise to a security system's hostage alert, where the business operator or homeowner has two codes. Both codes shut off the alarm, but one silently sends a signal to police. This placates a burglar who orders the owner to deactivate the alarm.
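The duress-code scheme described above can be sketched in a few lines. This is a hypothetical illustration of the concept, not any real alarm vendor's protocol; the codes, salt and function names are invented for the example.

```python
import hashlib
import hmac

# Hypothetical duress-code check: two codes both "work," but one
# silently raises an alert. Codes and salt are invented examples.

def _hash(code: str, salt: bytes) -> bytes:
    # Derive a comparison hash so plaintext codes aren't stored.
    return hashlib.pbkdf2_hmac("sha256", code.encode(), salt, 100_000)

SALT = b"example-salt"             # use a random per-device salt in practice
NORMAL_HASH = _hash("1234", SALT)  # deactivates the alarm normally
DURESS_HASH = _hash("4321", SALT)  # deactivates it AND alerts police

def disarm(entered: str) -> tuple[bool, bool]:
    """Returns (alarm_off, silent_alert_sent)."""
    h = _hash(entered, SALT)
    if hmac.compare_digest(h, NORMAL_HASH):
        return True, False
    if hmac.compare_digest(h, DURESS_HASH):
        # To the intruder watching, this looks identical to a normal disarm.
        return True, True
    return False, False
```

A decoy phone password would work the same way: both passwords visibly "succeed," and only the device knows which environment it just unlocked.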

One could envision an entire cottage industry, with companies creating elaborate fake environments to placate law enforcement (or a suspicious spouse) while in reality revealing nothing. Handset makers would need to get behind this approach, but I am guessing Apple and Google would put up little resistance if it meant making their phones more attractive to consumers and businesses.

Alternatively, travel to the border with a throwaway phone. If border agents insist on a PIN, give it to them; the seized phone will reveal little. But of course travelers on business will probably want to use their real phone.

To summarize: law enforcement can use brute-force attacks to try to guess your password; they can force you to unlock your phone biometrically; they can ask for, but not compel, a PIN/password; and they can seize your phone and keep you in detention until you provide it. A judge could later order you to reveal your PIN/password.

Bottom line: If there's something your company doesn't want to share with the government — local, state or federal — you better hope it’s never touched a mobile device. (Pause.) Yeah, that's what I thought.

Copyright © 2018 IDG Communications, Inc.
