Mobile security forces difficult questions

As governments consider COVID-19 contact tracing and its privacy implications, it's not a bad idea for companies to take the opportunity to look more closely at their mobile agreements with employees.

[Image: A mobile phone displays a 'Help Flatten the Curve' COVID-19 alert. Credit: Erik Mclean]

The timing is right: just this week, Apple rolled out its latest iOS update, which included two COVID-19-related changes. According to Apple, "iOS 13.5 speeds up access to the passcode field on devices with Face ID when you are wearing a face mask and introduces the Exposure Notification API to support COVID-19 contact tracing apps from public health authorities."

Today, IT has to deal with one of two mobile scenarios: BYOD, where employees use their personally owned devices to perform enterprise business; and company-owned phones, which present the opposite problem: employees, even if told not to, will use the phone for personal matters as well as business.

When it comes to security, compliance and what IT or security teams have the right to do, neither approach is demonstrably better, unless you're willing to put rights and restrictions in writing and (this is the hard part) enforce them.

The biggest worry in either mode involves remote wipe. When a device is suspected stolen, it needs to be wiped remotely to reduce the chance of enterprise data being exposed or the device being used to stage an attack. That question becomes difficult when the employee owns the device: does the enterprise have the right to wipe it, permanently deleting any personal data, images, messages and videos?

We'll get back to BYOD deletions in a moment. For corporate devices, the deletion would seem to be much easier. And yet, it's not. Many companies encourage employees not to use the corporate mobile device for anything other than work, but few put it in writing, stress that the company may have to obliterate everything on the phone in a perceived security emergency, and insist that the agreement be signed before the phone is handed out.

Tanya Forsheit is an attorney and chair of the privacy and data security group at law firm Frankfurt Kurnit Klein & Selz. Forsheit argues that it's "not realistic to use a company device for only business," but that companies are hesitant to tell employees directly what could happen if they save personal information on the phone: 1) it could be deleted, and 2) it could be seen by colleagues in IT, security, telecom or other departments.

"A lot of companies don't want to say that and they haven't taken the chance to update their policies," Forsheit said in a Computerworld interview. "There is often no contract at all that says, 'If you use a personal phone, these are the rules.'"

This goes beyond remote wipe. What about whitelisting and blacklisting apps? Can an enterprise even legitimately make that demand on a device owned by the employee? Arguing "yes, because a risky app could threaten access to corporate data" is not necessarily going to hold up. The case is much easier to make for a corporate-owned phone, but what can a company do when employees download risky apps on a corporate device anyway? Fine them? Take away the phone? Terminate their employment? Don't threaten a penalty unless you're prepared to impose it and stick with it.
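To see why the two device modes demand different enforcement postures, consider a minimal sketch of the decision logic an MDM policy engine might apply. This is purely illustrative; the bundle IDs, list names and default-deny/default-allow rules are assumptions, not any real product's API.

```python
# Hypothetical sketch: deciding whether an installed app is permitted
# under policy. All identifiers here are illustrative.

ALLOWLIST = {"com.company.mail", "com.company.vpn"}   # explicitly approved apps
BLOCKLIST = {"com.example.riskygame"}                 # explicitly banned apps

def app_allowed(bundle_id: str, corporate_device: bool) -> bool:
    """Return True if the app may stay installed under this sketch's policy."""
    if bundle_id in BLOCKLIST:
        # Banned everywhere, regardless of who owns the device.
        return False
    if corporate_device:
        # Corporate phones: default-deny; only allowlisted apps are permitted.
        return bundle_id in ALLOWLIST
    # BYOD: default-allow; only blocklisted apps are refused.
    return True

print(app_allowed("com.example.riskygame", corporate_device=False))  # False
print(app_allowed("com.company.mail", corporate_device=True))        # True
```

The asymmetry in the two branches mirrors the policy question above: a default-deny stance is defensible on a phone the company owns, while on a BYOD device the company can realistically only block specific known-bad apps.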

Forsheit maintains that partitions are often not the answer: "Partitions work somewhat, not completely, and that is often because of human error."

The risk with company-owned devices and improper behavior is not legal (the law tends to grant full rights to the device's owner) but retention. And that speaks to punitive actions. If your company fires someone for downloading a non-whitelisted game app, will that lead to rebellion? And if you don't punish, why would anyone abide by your inconvenient rules?

Executives "don't want to upset the culture," Forsheit said.

There are no easy answers here, but you need to think through what you want your rules and enforcement actions to be. And you need to do it now, before an incident happens.

Copyright © 2020 IDG Communications, Inc.
