12/09/2019 17:00
Apple's new iPhones shift smartphone camera battleground to AI
When Apple Inc (AAPL.O) introduced its triple-camera iPhone this week, marketing chief Phil Schiller waxed on about the device’s ability to create the perfect photograph by weaving the main shot together with eight separate exposures captured before it, a feat of “computational photography mad science.”
“When you press the shutter button it takes one long exposure, and then in just one second the neural engine analyzes the fused combination of long and short images, picking the best among them, selecting all the pixels, and pixel by pixel, going through 24 million pixels to optimize for detail and low noise,” Schiller said, describing a feature called “Deep Fusion” that will ship later this fall.
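The fusion Schiller describes — blending long and short exposures and choosing the best data pixel by pixel — can be illustrated with a toy example. The sketch below is purely illustrative and is not Apple's Deep Fusion pipeline; it weights each exposure by how well-exposed each pixel is (close to mid-gray) and blends accordingly, a simplified version of a standard exposure-fusion idea:

```python
import numpy as np

def fuse_exposures(short_exp, long_exp):
    """Toy per-pixel exposure fusion (illustrative only, not Deep Fusion).

    Each exposure gets a per-pixel weight favoring well-exposed values
    (near mid-gray, 0.5); the frames are then blended by those weights.
    """
    def weight(img):
        # Gaussian weight peaking at mid-gray: penalizes blown-out
        # highlights (long exposure) and crushed shadows (short exposure).
        return np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2))

    w_s, w_l = weight(short_exp), weight(long_exp)
    total = w_s + w_l + 1e-8  # avoid division by zero
    return (w_s * short_exp + w_l * long_exp) / total

# Usage: two 4x4 grayscale frames with values in [0, 1]
rng = np.random.default_rng(0)
short = rng.random((4, 4)) * 0.4       # underexposed frame
long = np.clip(short * 2.5, 0.0, 1.0)  # brighter frame of the same scene
fused = fuse_exposures(short, long)
```

A real pipeline would operate on raw sensor data, align the frames first, and use a learned model rather than a fixed weighting, but the per-pixel selection-and-blend structure is the same basic idea.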
It was the kind of technical digression that, in years past, might have been reserved for design chief Jony Ive’s narration of a precision aluminum milling process to produce the iPhone’s clean lines. But in this case, Schiller, the company’s most enthusiastic photographer, was heaping his highest praise on custom silicon and artificial intelligence software.
The technology industry’s battleground for smartphone cameras has moved inside the phone, where sophisticated artificial intelligence software and special chips play a major role in how a phone’s photos look.
“Cameras and displays sell phones,” said Julie Ask, vice president and principal analyst at Forrester.
Apple added a third lens to the iPhone 11 Pro model, matching the three-camera setup already found on flagship models from rivals Samsung Electronics Co Ltd (005930.KS) and Huawei Technologies Co Ltd [HWT.UL].