On paper, almost everything inside the iPhone XS, launched on Wednesday, can also be found inside last year's iPhone X. In looks and design, the two phones are near identical. Or rather, 99 per cent identical. The differences, like three holes on the bottom edge instead of the six on the iPhone X, are so minute that they don't matter. The screen is almost the same. Under the hood there are some bigger changes: the iPhone XS, for example, comes with the more powerful A12 Bionic processor. But then the iPhone X is hardly a slow phone, and both phones run iOS 12. So where is the new stuff? Primarily in the camera: the iPhone finally has a camera that looks like a winner.
Let's get real here. The iPhone has a lot of goodness inside it, and over the years there have been a number of areas in which Apple has left its Android competitors behind. But there has also been one area in which it has lagged: camera performance. For the last three-odd years, since the days of the Nexus 6P, the HTC 10 and the Galaxy S7, it is companies like Google, HTC and Samsung that have set the benchmarks for phone cameras. A lot of that has happened because these companies moved to bigger image sensors in their cameras. Until now Apple resisted, largely because it optimised the sensors it already had in the iPhones very well.
This year, that changes. Apple has finally moved to a much bigger image sensor in the iPhone XS and the iPhone XS Max. In the iPhone X, the company used a 12-megapixel image sensor with 1.22um pixels. In the new iPhones it moves to a sensor that still clicks 12-megapixel images but now has pixels that measure 1.4um. In other words, the sensor in the iPhone XS and the iPhone XS Max is physically bigger. Its size is now more or less comparable to that of the image sensors in phones like the Google Pixel 2 and the HTC U12, which also use 12-megapixel sensors with a pixel size of 1.4um.
The increase in image sensor size should bring significant improvements to the iPhone's camera performance. Bigger sensors with bigger pixels capture more light and more data. The result is that it is easier to click photos with better dynamic range, better detail, and less noise and grain in low-light shots.
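For those who like numbers, the light a pixel gathers scales with its area, so the jump from 1.22um to 1.4um pixels is bigger than it sounds. A quick back-of-the-envelope sketch, assuming the quoted pixel pitches and everything else being equal:

```python
# Back-of-the-envelope: light gathered per pixel scales with pixel area.
old_pitch_um = 1.22   # iPhone X pixel pitch, as quoted above
new_pitch_um = 1.40   # iPhone XS / XS Max pixel pitch

area_ratio = (new_pitch_um / old_pitch_um) ** 2
print(f"Each new pixel gathers roughly {100 * (area_ratio - 1):.0f}% more light")
# Prints: Each new pixel gathers roughly 32% more light
```

That extra third of light per pixel is exactly the kind of headroom that shows up in dim scenes.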
Low light photography is one area where the iPhone 8 and the iPhone X struggle against the likes of the Pixel 2 and the Samsung Galaxy S9. The iPhone XS should fix this problem.
In addition to the bigger image sensor, Apple also seems to be using a few other tricks to improve the performance of the iPhone XS camera. Both the iPhone XS and the iPhone XS Max come with optical image stabilisation on both of their rear cameras. And they likely have better lenses that will make them less susceptible to flare, even though the number of elements remains the same at six and the apertures are still F1.8 and F2.4.
Then there are improvements in the software. One of the reasons the Google Pixel 2 clicks such wonderful photos is its HDR+ mode, which is always on. Since the days of the Nexus 6P, Google has used this HDR+ technology, in which the cameras in its phones click multiple photos at different exposures as the user shoots a scene. These photos are then combined into one picture with more detail and a higher dynamic range. Professional photographers know the technique as bracketing, but Google brought it to the masses and, with its smart algorithms, made it seamless.
Apple is introducing something similar in the iPhone XS, the iPhone XS Max and the iPhone XR. The company calls the technology Smart HDR. It says that with the additional processing power of the A12 Bionic processor, Smart HDR will be fast, happening within a second, and seamless.
The idea is again the same: combine images taken at multiple exposures to create a photo that is far more vibrant and full of detail.
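To make that concrete, here is a heavily simplified, Mertens-style exposure fusion sketch in Python. It is not Apple's Smart HDR or Google's HDR+ pipeline, both of which also align frames, denoise and tone-map; it assumes the frames are already aligned and normalised to the range 0 to 1.

```python
import numpy as np

def exposure_fusion(frames):
    """Blend a bracketed stack of same-size frames (floats in [0, 1]) by
    weighting every pixel toward well-exposed values. A toy, Mertens-style
    sketch, not the actual Smart HDR or HDR+ pipeline."""
    stack = np.stack(frames).astype(np.float64)    # shape (N, H, W, C)
    # Pixels near mid-grey (0.5) get the highest weight: they keep detail
    # that blown-out or crushed pixels in the other frames have lost.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalise per pixel
    return (weights * stack).sum(axis=0)           # fused (H, W, C) image

# Usage: fuse an under-, normally- and over-exposed capture of one scene.
under, normal, over = [np.random.rand(4, 6, 3) for _ in range(3)]
fused = exposure_fusion([under, normal, over])
```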
Finally, there is Apple's "bokeh" talk. Almost all high-end phones nowadays come with a portrait mode that blurs the background. In a DSLR camera, the blurred background comes from optics: the aperture, the focal length and where the lens is focused. In smartphones, blurred backgrounds are created using algorithms.
Apple is now saying that even though algorithms can do a nice job, the effect is unreal, not natural. To bring a natural look to the blurred background in photos clicked with the iPhone XS (think of those nice, round, colourful lights in many night scenes), the company studied expensive DSLR cameras and ultra-expensive lenses. When Apple executives introduced the iPhone XS at the "Gather Round" event, they dropped no names. But the impression they gave was that Apple's engineers and imaging group really took apart those $5,000 Zeiss and Leica lenses and then worked to replicate in their algorithms the way those lenses render light, so that the iPhone XS can create the same effect.
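For intuition only, here is a toy Python sketch of how a synthetic depth-of-field effect can be built from an image and a depth map. Nothing in it comes from Apple: the disc-shaped blur kernel is simply one way to get those round highlights, and the focus_depth and radius parameters are made up for illustration.

```python
import numpy as np
from scipy.ndimage import convolve

def fake_bokeh(image, depth, focus_depth=0.2, radius=6):
    """Toy synthetic depth-of-field: blur with a disc kernel (which turns
    bright points into round highlights, unlike a Gaussian blur) and keep
    pixels near the focal plane sharp via the depth map. Illustrative only;
    real portrait modes vary blur with depth and model lens behaviour."""
    # A disc kernel mimics a round lens aperture.
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    disc = (x ** 2 + y ** 2 <= radius ** 2).astype(np.float64)
    disc /= disc.sum()
    # Blur each colour channel with the disc.
    blurred = np.stack([convolve(image[..., c], disc, mode="nearest")
                        for c in range(image.shape[-1])], axis=-1)
    # 1.0 at the focal plane, falling to 0.0 as depth moves away from it.
    sharp = np.clip(1.0 - 4.0 * np.abs(depth - focus_depth), 0.0, 1.0)
    return sharp[..., None] * image + (1 - sharp[..., None]) * blurred

# Usage with random stand-ins for an RGB image and its depth map.
rgb, depth_map = np.random.rand(32, 32, 3), np.random.rand(32, 32)
out = fake_bokeh(rgb, depth_map)
```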
As I often write, the proof of the pudding is in the eating. In the coming days we hope to try the iPhone XS camera in detail. But at least on paper, Apple seems to be doing much more this time around to improve the camera performance of the iPhone. And about time too, because in the last couple of years the iPhone has fallen slightly behind the Pixels and the Galaxys when it comes to clicking a sunset, or a street lit up on a rainy night.