Is it true that the new iPhone's camera shoots worse than Android smartphones?


For many people, the smartphone camera has long since replaced amateur and even professional cameras. And that makes sense: the smartphone is always with you, so you don't need to carry extra equipment. But because of size constraints, professional optics can't fit inside, and the modules' shortcomings have to be compensated by software processing algorithms. No mobile camera can do without them, so they can be considered mandatory in every smartphone.

But the algorithms Apple has built into its cameras often don't improve the final pictures but, on the contrary, spoil them. Let's look at how the iPhone processes photos compared to the Google Pixel, and what the Cupertino company needs to do to win back the title of best mobile camera for the iPhone.

Photo processing on the iPhone

Recently we talked about a blind test that Marques Brownlee ran on a dedicated site. According to the voting results, the new iPhone 14 Pro even lost to the budget Pixel 6a. How can Apple's brand-new flagship lose to a cheap Pixel? These smartphones seem to sit at opposite poles, at least in price. As a result, Brownlee recorded a separate video in which he tries to figure out how the iPhone's algorithms change the final picture.

Naturally, software alone is not enough to get a good photo; you also need good optics. The iPhone has no problem there: the camera system Apple puts into the iPhone 14 Pro is one of the best on the market, so there is no reason to doubt the optics. The questions are about how the lack of a large sensor is compensated.

After the camera takes a picture, algorithms reduce noise to smooth the image, adjust the white balance, and raise brightness to reveal more detail in low light. In recent years, many companies, including Apple, have taken this processing to a whole new level. For example, the introduction of Smart HDR allowed the iPhone to combine several photos taken with different settings into one bright, high-quality picture.
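To make the multi-exposure idea concrete, here is a minimal Swift sketch using AVFoundation's public bracketed-capture API. Apple's actual Smart HDR pipeline is proprietary and runs automatically; this only illustrates the underlying step of grabbing several frames at different exposures, which then get merged into one image. The `photoOutput` and `delegate` parameters are assumptions: an AVCapturePhotoOutput already attached to a running session, and your own capture delegate.

```swift
import AVFoundation

// Sketch: capture a three-frame exposure bracket (under, normal, over),
// the raw material that HDR-style merging works from.
// Assumes `photoOutput` is attached to a configured, running AVCaptureSession.
func captureExposureBracket(with photoOutput: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    // Exposure bias for each frame, in stops.
    let biases: [Float] = [-2.0, 0.0, 2.0]
    let bracketedSettings = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: $0)
    }

    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0, // 0 = no RAW, processed frames only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracketedSettings
    )

    // Each frame is delivered to the delegate in
    // photoOutput(_:didFinishProcessingPhoto:error:); aligning and
    // tone-mapping the frames into one picture is a separate step.
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```

The difference with Smart HDR is that there the capture, frame selection, and merge all happen automatically, with no input from the photographer.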

At the same time, the person does not take part in selecting these frames; everything is left to automation. And with heavy intervention from the algorithms, the final merge can turn out completely different from what you were counting on when you pressed the shutter. This is exactly the problem that arises with the iPhone camera.

You are unlikely to run into trouble on a clear sunny day, or simply in good light. But if you try to shoot a scene with many different colors and textures, and on top of that in twilight or darkness, problems can appear. The algorithms have to decide precisely which parts of the photo can be brightened, which colors can be shifted, and in which direction to adjust the white balance.

How the iPhone shoots

This is exactly what Google's smartphones get right with the company's own algorithms. Just look at the sample photos below. See how the iPhone emphasized the outline of the hood on the jacket and what it did to the sky. In the photos from the Pixel 7 Pro everything looks much more realistic, without any overexposure or odd color palette.

Or take a look at this pair of photos. Honestly, if I had only seen the shot from the iPhone 14 Pro, I would have decided the person was sick: that is how strongly the iPhone's camera tinted the skin yellow. On top of that, the Apple smartphone also decided to emphasize every feature of the face, rendering all the wrinkles and imperfections in painstaking detail.

For years, iPhone users made fun of Android smartphone owners because their photos looked too artificial. With the arrival of Smart HDR, the opposite has happened, and in many cases it is the pictures from the iPhone that look unnatural. I think many iPhone owners would do better to disable Smart HDR and use the photos they get without it. Standard algorithms without any bells and whistles will be enough for most users.

You can, of course, download the Halide app, shoot in RAW format without any processing, and then dial in all the settings yourself. But that is not convenient for everyone: such a photo definitely needs to be finished in an editor, and that takes time. And you shouldn't expect the built-in ProRAW on the iPhone Pro models to give you clean photos either; even those undergo a little processing. This approach also has one big drawback: pictures in this format take up much more space. And if your smartphone is running out of memory, or simply doesn't have much, that can become a serious problem.
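For the curious, here is a rough Swift sketch of how a third-party app can request Apple ProRAW through the public capture API (iOS 14.3+, Pro models only). Whether Halide uses exactly this path is an assumption on my part; the point is that even this "raw" output is a linear-demosaiced DNG that has already seen some processing, and each file weighs tens of megabytes.

```swift
import AVFoundation

// Sketch: opt in to Apple ProRAW and request one capture.
// Assumes `photoOutput` is attached to a configured, running AVCaptureSession.
func captureProRAW(with photoOutput: AVCapturePhotoOutput,
                   delegate: AVCapturePhotoCaptureDelegate) {
    // ProRAW is only offered on supported (Pro) hardware.
    guard photoOutput.isAppleProRAWSupported else { return }
    photoOutput.isAppleProRAWEnabled = true

    // Pick a ProRAW pixel format from what the output advertises.
    guard let proRAWFormat = photoOutput.availableRawPhotoPixelFormatTypes.first(where: {
        AVCapturePhotoOutput.isAppleProRAWPixelFormat($0)
    }) else { return }

    let settings = AVCapturePhotoSettings(rawPixelFormatType: proRAWFormat)
    // The resulting DNG arrives in photoOutput(_:didFinishProcessingPhoto:error:)
    // via photo.fileDataRepresentation() and still needs editing afterwards.
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```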

Honestly, I don't fully understand how such a famous blogger got such terrible photos on an iPhone. It feels like they were shot not on an iPhone 14 Pro but on an iPhone 4. It seems to me that even I, with my iPhone XR, would take better pictures. Maybe it's a matter of clumsy hands, or maybe I just don't understand something about photography. In any case, I would like Apple to rework its photo processing algorithms to produce more realistic shots. I really hope this will happen in the iPhone 15, and the iPhone camera will once again become the best mobile camera on the market. After all, the upcoming iOS 17 already has me thinking about switching from Android to the iPhone, and I would like to get a better camera along with it.
