If a hotel room has an Apple accessory docking station, would you plug in your iPhone? “Apple accessories, especially dock stations and alarm clocks become more and more popular. Nowadays, it is common to find such devices in hotel rooms,” wrote French security consultant and pentester Mathieu Renard. But can we really trust them? What if an alarm clock could silently jailbreak your iDevice while you sleep? “Wake up, Neo,” warned Renard. “Your phone got pwnd!”
At Hackito Ergo Sum 2013, an international security and hacking conference recently held in Paris, Renard presented iPown: Hacking Apple accessories to pwn iDevices. He started by looking at what an attacker would consider to be the most interesting Apple services before describing “how they can be exploited in order to retrieve confidential information or to deploy the evasi0n jailbreak.”
Some of those interesting Apple services have been exploited in the past. One example is how FinFisher spyware used an Apple iTunes flaw that “allowed government spying for three years.” Renard also examined the diagnostics relay, a service that lets paired devices request data sources such as “Apple Support, Network, WiFi, System Configuration, VPN, UserDatabases” and others. “All the files returned are stored in clear text in a CPIO archive.” Renard added that the UserDatabases source retrieves SMS, Contacts, Calendar, and email, all in clear text. “A malicious docking station could retrieve and inject SMS, call logs, application data, default preferences and data stored in the keychain.”
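The CPIO container Renard mentions offers no protection at all; it is a trivially parseable archive format. As an illustration (this is not Renard's tooling, and the file name and payload below are invented for the demo), here is a minimal Python sketch that builds and then lists entries in a “newc”-format cpio archive, the kind of flat, clear-text container the relay returns:

```python
def build_newc_entry(name, data):
    """Build one 'newc' cpio entry (illustrative; most fields zeroed)."""
    name_b = name.encode() + b"\x00"
    header = b"070701" + b"".join(
        b"%08X" % v for v in (
            0, 0o100644, 0, 0, 1, 0,   # ino, mode, uid, gid, nlink, mtime
            len(data), 0, 0, 0, 0,     # filesize, dev/rdev major/minor
            len(name_b), 0,            # namesize, checksum (unused in newc)
        )
    )
    out = header + name_b
    out += b"\x00" * (-len(out) % 4)   # header + name padded to 4 bytes
    out += data + b"\x00" * (-len(data) % 4)
    return out

def list_cpio(blob):
    """Walk a newc cpio archive, returning (filename, payload) pairs."""
    entries, off = [], 0
    while True:
        assert blob[off:off + 6] == b"070701", "not a newc archive"
        fields = [int(blob[off + 6 + i * 8:off + 14 + i * 8], 16)
                  for i in range(13)]
        filesize, namesize = fields[6], fields[11]
        name_start = off + 110                  # fixed 110-byte header
        name = blob[name_start:name_start + namesize - 1].decode()
        data_start = name_start + namesize
        data_start += -data_start % 4           # data is 4-byte aligned
        if name == "TRAILER!!!":                # end-of-archive marker
            return entries
        entries.append((name, blob[data_start:data_start + filesize]))
        off = data_start + filesize
        off += -off % 4                         # next entry is 4-byte aligned

# Demo: a made-up SMS database entry, readable with no key or password.
archive = (build_newc_entry("Library/SMS/sms.db", b"SQLite format 3\x00")
           + build_newc_entry("TRAILER!!!", b""))
for name, payload in list_cpio(archive):
    print(name, len(payload))
```

The point is not the format itself but what it implies: anything the relay hands over can be read by whatever is on the other end of the dock connector, with no decryption step.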
If the device is not jailbroken, an attacker could still unlock the iDevice with a specially crafted MFi alarm clock. In theory, Apple accessories should be “safe” because of MFi (“Made for iPhone/iPod/iPad”), Apple's licensing and certification program for electronic accessories that connect to iDevices. But with a short list of hardware, including a Raspberry Pi to power a docking station, Renard showed how to “Weaponize an Apple MFI accessory.” The payload could then be triggered by the alarm.
Renard concluded, “Apple made the choice of user experience instead of security. It is possible to build up a malicious device in order to get both the data and the control of iDevices. Don’t connect your device to an untrusted dock station.”
iSniff GPS for iStalking
Now for some other not-too-happy Apple news . . . iPhones and iPads collect data about wireless access points, even if users don’t connect to them, so Apple can use this “crowd-sourced” data for its location services. Although the data is not supposed to be public, the proof-of-concept tool iSniff GPS gets around that and could potentially make stalking easy.
Australian security researcher Hubert Seiwert said that by using his free iSniff GPS Python app, “You can send Apple a single MAC address of a Wi-Fi router and they will send back a result set including the GPS coordinates of that MAC address and about 400 others.” Seiwert told SC Magazine, “This could be used to locate where people live.”
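iSniff GPS itself speaks Apple's undocumented binary location protocol; the sketch below does not reproduce that wire format. It only illustrates the lookup idea Seiwert describes, using a mocked result set (the BSSIDs and coordinates are invented): normalize the target router's MAC address, then pick its coordinates out of the rows the service would return.

```python
def canon(mac):
    """Normalize a BSSID written in any common style to aa:bb:cc:dd:ee:ff."""
    hexdigits = "".join(c for c in mac.lower() if c in "0123456789abcdef")
    if len(hexdigits) != 12:
        raise ValueError("not a MAC address: %r" % mac)
    return ":".join(hexdigits[i:i + 2] for i in range(0, 12, 2))

def locate(target_bssid, result_set):
    """Find the target's coordinates in a result set of (bssid, lat, lon)
    rows, standing in for the ~400 access points a real query returns."""
    wanted = canon(target_bssid)
    for bssid, lat, lon in result_set:
        if canon(bssid) == wanted:
            return lat, lon
    return None  # target AP not in this result set

# Mocked response rows; BSSIDs and coordinates are made up.
mock_rows = [
    ("aa:bb:cc:dd:ee:ff", 48.8567, 2.3508),
    ("00:de:ad:be:ef:00", 48.8570, 2.3511),
]
print(locate("AA-BB-CC-DD-EE-FF", mock_rows))
```

One known router MAC address is enough: if the target's home access point appears in the result set, its coordinates are a street address.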
Forensic analysis on encrypted iDevices
Apple has been swamped with so many law enforcement requests to decrypt seized iPhones that the company created a “waiting list,” reported CNET. The waiting time is about seven weeks. Other documents left it unclear whether “Apple has created a backdoor for police,” but an Ars Technica investigation pointed to “no.” However, Apple “holds the key to encrypted iCloud data on its own servers—if law enforcement sent the appropriate subpoena, the company can easily decrypt your cloud-stored data and send it off to the authorities.”
Could Apple “security” checks make identity theft easy?
Lastly, you know those suspicious emails asking for financial or personal data that scream phishing? The Register reported that Apple has been sending them out, asking some customers who want a new “shiny iThing” to verify their identity first. This process is part of Apple's spot security checks. “The Apple Online Store's Terms and Conditions state that Apple reserves the right to verify the identity of the genuine credit card holder by requesting appropriate documentation.” Apple’s email says to:
Please scan a copy or take a photo of the following documentation in jpeg format and email it to email@example.com:
1. Card holder's Drivers license or National Identity Card or Passport and
2. Recent Credit Card / Bank Statement showing card holder name, address and card number.
One customer was told by the police that this was definitely a phishing email. Her bank had never heard of a private company asking for such personal data via email. Yet she found out the request was legitimately from Apple when Apple replied to her email, complaining that her passport scan must be in color.
While this may serve to verify the identity of the credit card holder, calling it a security check is over the top. It also gathers enough private, detailed information that, if the unencrypted email were intercepted, an attacker could steal the Apple customer’s identity.