While the device still appears to be widely available across most markets, Apple this morning confirmed that first-weekend iPhone pre-sales should break 10 million. I’ve been exploring the camera technology improvements inside the new iPhones, which arrive Sept. 25.
Equipped with more memory (2GB) and a 70 percent faster processor, the new iPhone 6s and 6s Plus also have new cameras that promise better detail and image accuracy than any previous iPhone.
Both the front and rear cameras have been improved with higher pixel counts: 12 megapixels for the rear iSight camera and 5 megapixels for the front-facing, video-conferencing “FaceTime” camera.
Squeezing more pixels onto a camera sensor means the individual light receptors must shrink to fit the space. That makes them less sensitive and slightly less accurate – you may have lots of pixels, but image quality suffers. iPhones used 8-megapixel cameras for years while Apple focused on optimizing the capabilities of those sensors, enabling them to deliver better pictures than many higher-resolution cameras. To maintain that reputation, Apple has introduced a range of new technologies to optimize image capture in its new cameras:
Apple’s new smartphones boast larger iSight camera sensors. The size increase means they should focus faster than previous cameras could, and should reduce the number of image artifacts captured when taking a shot.
These cameras offer 50 percent more Focus Pixels, which translates into much faster autofocus. First introduced on the iPhone 6, Focus Pixels are Apple’s name for on-sensor phase-detection autofocus, which is faster and more accurate than the contrast-detection autofocus older cameras used.
Apple has designed its own image signal processor. The company claims it has improved color accuracy by placing red, green and blue filters closer to the top of the pixels than before. This promises “truer colors and sharper, more detailed photos.”
Apple’s use of deep trench isolation technology helps resolve a fundamental problem: Higher sensor resolutions force more camera pixels into the same space, allowing electrical signals to leak between neighboring photodiodes and creating noise and discoloration in the captured photo. Popular Mechanics has put together the best description of how deep trench isolation prevents this:
“In order to prevent leakage between photodiodes, they [Apple] etched literal trenches in between each one, then filled the trenches with insulating material that stops electric current. This way, light still comes into the circuits the same way it always does, but once it's in the photodiode, it has nowhere to go, making for a cleaner, clearer photo once it's processed.”
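The pixel-size trade-off described earlier is easy to quantify. Here’s a minimal Python sketch using an assumed sensor size — the dimensions below are illustrative, not Apple’s published specs — showing how pixel pitch shrinks when resolution rises on a fixed sensor:

```python
# Illustrative only: assumed 4:3 sensor active area, not Apple's actual specs.
SENSOR_WIDTH_MM = 4.8
SENSOR_HEIGHT_MM = 3.6

def pixel_pitch_um(megapixels, aspect=(4, 3)):
    """Approximate pixel pitch in microns for a given resolution on the assumed sensor."""
    # Horizontal pixel count for a 4:3 image with the given megapixel total.
    width_px = (megapixels * 1e6 * aspect[0] / aspect[1]) ** 0.5
    return SENSOR_WIDTH_MM * 1000 / width_px

for mp in (8, 12):
    print(f"{mp} MP -> ~{pixel_pitch_um(mp):.2f} µm pixel pitch")
# 8 MP -> ~1.47 µm; 12 MP -> ~1.20 µm
```

Going from 8 to 12 megapixels on the same sensor area shrinks each pixel’s light-gathering area by a third — exactly the loss that deep trench isolation and Apple’s other sensor work aim to offset.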
FaceTime HD camera
The FaceTime camera has been improved from 1 to 5 megapixels, while built-in face detection means the camera will focus on your face -- so you can focus on what you want to discuss.
To make a better picture, Retina Flash temporarily turns the display into a flash, using a built-in display driver to make the screen three times brighter than normal. It’s not just about bright light, of course – a plain burst of white light would reflect off your face and look simply awful to whoever you were chatting with on FaceTime. To avoid this, Apple uses True Tone lighting technology, which creates a customized flash tone in response to ambient light conditions. Apple claims this customized flash is smart enough to match natural lighting, so your face should look good on camera.
Live Photos are a new-to-Apple technology that captures 1.5 seconds of motion before and after an image is taken. Then, when you look at the image, you can also replay the moment, with audio. These single JPEGs consist of a series of images, said Canalys analyst Daniel Matte. This means you will be able to use these images as personalized watch faces on Apple Watch and explore them on other iOS devices running iOS 9.1.
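For a sense of scale, here’s the simple arithmetic behind the Live Photo capture window. The 1.5-second figures come from Apple; the frame rate below is purely a hypothetical assumption for illustration:

```python
# Live Photo capture window, per Apple's stated figures.
before_s = 1.5   # motion recorded before the shutter press
after_s = 1.5    # motion recorded after the shutter press

# Hypothetical placeholder frame rate, not a confirmed spec.
assumed_fps = 15

window_s = before_s + after_s
approx_frames = round(window_s * assumed_fps)
print(window_s)       # 3.0 -- seconds of motion around each shot
print(approx_frames)  # 45 -- frames at the assumed rate
```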
Optical image stabilization
Optical image stabilization, for both still images and video, is available only on the iPhone 6s Plus.
High-definition video improves with support for 4K — over 8 million pixels per frame. Note that still photos taken while shooting 4K video are captured at 8 megapixels.
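The “over 8 million pixels per frame” figure falls straight out of the standard 4K (UHD) frame dimensions:

```python
# 4K UHD frame dimensions
width, height = 3840, 2160
pixels_per_frame = width * height
print(pixels_per_frame)  # 8294400 -- just over 8 million
# For comparison, 1080p has exactly a quarter as many pixels per frame.
print(1920 * 1080)       # 2073600
```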
Apple’s marketing team has published a gallery of images captured by professional photographers using the smartphones, with little or no other processing.
While Live Photos mean the camera is recording whenever the Camera app is open, nothing is saved until you take a picture, so you aren’t being spied on, as some have suggested.