New attacks secretly use smartphone cameras, speakers and microphones

Do you regard your smartphone cameras and speakers as a security threat? You might after checking out presentations from the 8th USENIX Workshop on Offensive Technologies (WOOT). If you were a target, you would neither see nor hear these stealthy smartphone hacks happening.

Hear no evil, see no evil

Covert sound-based attacks using smartphone speakers and microphone

Our ears don’t hear ultrasonic sound, but speakers on our phones can produce those inaudible frequencies that can be exploited to exfiltrate data. A mobile device would first have to be infected with a Trojan, like from a tainted app, but even if the device is locked down so that data can’t be stolen over the network, covert sound-based attacks could still steal the data.

Luke Deshotels, from North Carolina State University, presented “Inaudible Sound as a Covert Channel in Mobile Devices” (pdf). The paper focuses on two proof-of-concept sound-based attacks that bypass Android security mechanisms: one using isolated sound and another using ultrasonic sound.

Unlike Bluetooth access, which is “listed as a dangerous permission on Android that users must explicitly allow,” ultrasonic sound “does not require permission and can emit sounds to anything that can hear them regardless of a pairing process.” The “ultrasonic sound can be received by a microphone on the same device or on another device.” The researchers “implemented an ultrasonic modem for Android and found that it could send signals up to 100 feet away.” If you think you might notice due to battery drain, the researchers noted that the transmission of the ultrasonic signal didn’t seem to use much power.
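The article doesn't describe how the researchers' ultrasonic modem encodes data. As a minimal sketch of the idea, here is one way such a channel could work, using on-off keying of a near-ultrasonic carrier; the sample rate, carrier frequency, bit rate, and function names are illustrative assumptions, not the researchers' values:

```python
import math

SAMPLE_RATE = 44100   # standard audio rate supported by most phone speakers/mics
CARRIER_HZ = 20000    # near-ultrasonic: inaudible to most adults
BIT_SAMPLES = 2205    # 50 ms per bit, i.e. 20 bits/s (illustrative, not the paper's rate)

def modulate(bits):
    """On-off keying: emit the carrier during a 1 bit, silence during a 0 bit."""
    samples = []
    for bit in bits:
        for n in range(BIT_SAMPLES):
            t = n / SAMPLE_RATE
            samples.append(math.sin(2 * math.pi * CARRIER_HZ * t) if bit else 0.0)
    return samples

def goertzel_power(samples, freq):
    """Energy of `samples` at `freq`, via the standard Goertzel algorithm."""
    coeff = 2 * math.cos(2 * math.pi * freq / SAMPLE_RATE)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def demodulate(samples):
    """Recover bits by thresholding carrier energy in each bit-length slot."""
    bits = []
    for i in range(0, len(samples), BIT_SAMPLES):
        power = goertzel_power(samples[i:i + BIT_SAMPLES], CARRIER_HZ)
        bits.append(1 if power > BIT_SAMPLES else 0)
    return bits

message = [1, 0, 1, 1, 0, 0, 1, 0]
assert demodulate(modulate(message)) == message
```

A real attack would play `modulate`'s samples through the speaker and run `demodulate` on microphone input; the Goertzel step is a common way to measure energy at a single frequency without a full FFT.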

The researchers gave short vibrations, which can be felt but not heard, as an example of isolated sound on Android devices. “These vibrations can be detected by the accelerometer, but they are not loud enough for humans to hear. If performed while the user is not holding the device, the vibrations will not be noticed.” Yet those short vibrations “can be detected by the accelerometer or microphone of the same device.” In an experiment to test isolated sound, the researchers used a Samsung Galaxy S4 as a transmitter running a vibration loop and a Google Nexus 7 as a receiver running an accelerometer monitor.
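The article doesn't spell out the vibration encoding either. A minimal simulated sketch of how a vibrate/idle channel might be read back from accelerometer readings follows; all rates, noise levels, and thresholds here are illustrative assumptions:

```python
import random

SAMPLES_PER_BIT = 50   # accelerometer readings per bit window (assumed rate)
GRAVITY = 9.81         # m/s^2 baseline on a device lying flat
NOISE = 0.02           # idle sensor noise amplitude (illustrative)
BUZZ = 0.5             # extra jitter the vibration motor adds (illustrative)

def transmit(bits, rng=random.Random(0)):
    """Simulate accelerometer readings: vibrate during 1 bits, stay idle during 0 bits."""
    trace = []
    for bit in bits:
        for _ in range(SAMPLES_PER_BIT):
            jitter = rng.uniform(-BUZZ, BUZZ) if bit else 0.0
            trace.append(GRAVITY + rng.uniform(-NOISE, NOISE) + jitter)
    return trace

def receive(trace):
    """Recover bits by measuring each window's average deviation from gravity."""
    bits = []
    for i in range(0, len(trace), SAMPLES_PER_BIT):
        window = trace[i:i + SAMPLES_PER_BIT]
        deviation = sum(abs(x - GRAVITY) for x in window) / len(window)
        bits.append(1 if deviation > 2 * NOISE else 0)
    return bits

secret = [1, 1, 0, 1, 0, 0, 1, 0]
assert receive(transmit(secret)) == secret
```

In the researchers' setup the transmitter and receiver were separate devices; the same thresholding idea applies whether the vibrations are picked up by the same phone's accelerometer or a neighboring one.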

Using these sound-based attacks, the bit rate the researchers chose for distance experiments was “fairly low, but it is still sufficient for leaking sensitive data;” in fact, “IDs, social security numbers, credit card numbers, coordinates of locations visited, passwords, and more could be leaked in less than one minute.”
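The article doesn't give the researchers' exact bit rates, but the “less than one minute” claim survives a back-of-envelope check even at a very slow rate. Assuming, for illustration only, a channel of just 20 bits per second:

```python
# Back-of-envelope: seconds needed to leak small secrets over a slow covert
# channel. The 20 bit/s figure is an illustrative assumption, not the paper's.
BIT_RATE = 20  # bits per second

def leak_seconds(num_chars, bits_per_char=8):
    """Seconds needed to exfiltrate num_chars characters at BIT_RATE."""
    return num_chars * bits_per_char / BIT_RATE

print(leak_seconds(16))  # 16-digit credit card number: 6.4 s
print(leak_seconds(9))   # 9-digit social security number: 3.6 s
print(leak_seconds(64))  # a long password: 25.6 s, still well under a minute
```

Even this pessimistic rate leaks any of the listed secrets in seconds; the ranges the researchers measured only make the attack more practical.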

They warned, “Data exfiltration via sound on mobile devices is a practical attack. The ranges supported by our implementation are more than sufficient for an attacker to stealthily record from an infected device. The maximum bitrates we recorded are also sufficient for sharing images and documents in intra-device attacks.”

The sound-based attacks could also be utilized by sensory malware that abuses the sensors of the infected device. Examples of such malware include PlaceRaider, which remotely exploits the Android camera and secretly snaps a picture every two seconds; Soundcomber, an Android app that listens in on phone calls and steals credit card data; and TapPrints (pdf), which was 80–90 percent accurate at determining what was being typed on a smartphone or tablet.

“Many of the sensors and actuators on mobile devices are grossly underestimated in terms of their impact on security,” the research paper states. “No explicit permission is required to access the accelerometer despite the many potential ways to abuse it. The same can be said for the speakers and vibrator.”

As network security increases, and network connections are more closely monitored, the researchers said attackers will start using unconventional methods to steal data.

Covert attacks using smartphone cameras

Some workplaces deal with military contracts and a high level of secrecy, so employees are not allowed to have a camera in their phones; the camera must be removed by a technician and the phone certified before it is allowed in the building. While that policy is likely aimed at preventing photography, researchers demonstrated further threats: front-facing and rear-facing smartphone cameras can be used to steal keystrokes and fingerprints.

Tobias Fiebig, Jan Krissler, and Ronny Hänsch from the Berlin University of Technology presented “Security Impact of High Resolution Smartphone Cameras” (pdf) at WOOT. Their attacks would get around anti-malware and anti-keylogging mechanisms, such as the separate OS compartments found on high-security smartphones.

Have you ever stopped to think of the front-facing camera on your phone as a keylogger? How far away do you hold your smartphone from your face? The researchers demonstrated how an “attacker can use reflections in the user’s face to perform keylogging with a smartphone’s front camera.” This attack works even on phones with low-resolution cameras; phone “cameras with only 2MP are already sufficient for corneal keylogging if the phone is held in not more than 30 centimeters (11.8 inches) distance. Cameras of 32MP even allow for keylogging operations if the phone is held at 60 cm (23.6 inches) distance.”

Let’s say your phone is facing down on your desk and you reach to pick it up. In the instant your finger touches the rear-facing camera, an attacker could nab your fingerprint. After that photo of your fingerprint has been extracted, it can then be used to create forgeries.

The researchers described an attack scenario against a high-profile target, such as the secretary of defense, who would have an encrypted phone with two OS compartments, one for work and one for personal use. An attacker could use the front-facing camera as a “facial reflection-based keylogger to extract the pin-code.” The rear-facing camera could grab the fingerprint needed for the secretary of defense’s biometric security code.

Even if the target notices the theft within 15 minutes and issues a remote wipe, it’s too late. The researchers suggested, “To hide their actions behind confusion, the attackers then use the forged fingerprints on a knife that is used in a murder. With the secretary of defense implicated in a crime, the whole incident goes unnoticed within the ensuing scandal.” While that might sound far-fetched, or like a plot from a movie, the attacks were not mere theory; the researchers actually tested them…just not on the secretary of defense.

They suggested capping the cameras with hardware lids to prevent attackers from stealing sensitive info and penetrating high-security environments.

Copyright © 2014 IDG Communications, Inc.
