Introduced in September 2017, the iPhone X is the most advanced of all the "apple" smartphones presented so far.
Along with an updated design, an edge-to-edge display and many other changes, the anniversary iPhone also received an innovative TrueDepth camera system, support for Slow Sync Flash technology, and a new dual main camera with improved sensors.
In this article, we will tell you what the new iPhone X cameras are and how they work.
- How the TrueDepth Camera System Works
- What improvements the main iPhone X camera module received
- What is Slow Sync Flash Technology
How the TrueDepth Camera System Works
TrueDepth is an innovative iPhone X camera system that includes a 7-megapixel front camera, an infrared camera, a dot projector and an infrared emitter.
The dot projector projects more than 30,000 invisible dots onto the user's face to build a map of it. This ultra-precise map is then used to authenticate the owner of the iPhone X.
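Systems of this kind are usually described as structured light: each projected dot appears shifted in the infrared camera's view, and the size of the shift reveals how far away the surface is. The sketch below illustrates that triangulation idea with hypothetical numbers (the baseline and focal length are illustrative, not Apple's actual specifications):

```python
def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Structured-light triangulation: a projected dot appears shifted
    ("disparity") in the camera's view, and the shift is inversely
    proportional to the surface depth:
        depth = baseline * focal_length / disparity
    """
    return baseline_m * focal_px / disparity_px

def depth_map(baseline_m, focal_px, disparities):
    """Turn a grid of per-dot disparities into a grid of depths."""
    return [[depth_from_disparity(baseline_m, focal_px, d) for d in row]
            for row in disparities]

# With a hypothetical 2 cm baseline and 600 px focal length, a dot
# shifted by 24 px lies about 0.5 m away; a dot shifted by only 12 px
# lies twice as far, at about 1 m.
grid = depth_map(0.02, 600, [[24, 12]])
```

Repeating this for all 30,000 dots yields the dense depth map of the face.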
The infrared camera reads the dot pattern on the user's face and creates an infrared image from the received data, which is sent to the Secure Enclave, a dedicated security module built into the A11 Bionic processor. The Secure Enclave compares the user's face with the previously saved map.
It is important to note that the comparison is performed in a special way: the Neural Engine of the A11 Bionic converts the infrared image and the depth map into a mathematical representation, which is then compared against the enrolled face data.
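The idea of comparing "mathematical representations" rather than raw images can be sketched as an embedding comparison. The toy `embed` function below is a stand-in for the trained neural network (the real one is far more sophisticated); the point is that two captures match only if their vectors are close enough:

```python
import math

def embed(ir_image, depth_map):
    """Toy stand-in for the Neural Engine: reduce the infrared image and
    depth map to a fixed-length, normalized feature vector. Here we just
    use per-row averages; the real system uses a trained network."""
    feats = [sum(row) / len(row) for row in ir_image]
    feats += [sum(row) / len(row) for row in depth_map]
    norm = math.sqrt(sum(f * f for f in feats)) or 1.0
    return [f / norm for f in feats]

def matches(enrolled, candidate, threshold=0.98):
    """Compare two embeddings by cosine similarity; unlock only if the
    similarity clears the threshold."""
    cos = sum(a * b for a, b in zip(enrolled, candidate))
    return cos >= threshold
```

Storing only the enrolled vector (not the face image itself) is part of what keeps this scheme privacy-friendly.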
The infrared emitter is responsible for recognizing the owner's face in all conditions, even in the dark: it directs a beam of invisible infrared light at the user's face.
In addition to unlocking the smartphone and paying for goods in digital stores with a glance, TrueDepth can also be used in a number of other situations.
When you take a photo with the front camera, the iPhone X also uses TrueDepth, letting you get better selfies even in difficult shooting conditions.
TrueDepth uses a variety of sensors to accurately determine facial contours and collects the data in three dimensions. Thanks to this data, third-party applications that apply masks and filters to your face work even more effectively.
For example, in Snapchat you can select masks that move the way your face moves and change depending on your expression.
What improvements the main iPhone X camera module received
The iPhone X received a dual 12-megapixel camera, which consists of a module with a wide-angle lens and a module with a telephoto lens. Unlike the cameras in the iPhone 8 and 8 Plus, the iPhone X uses different sensors to provide even higher-quality photos and videos.
The main 12-megapixel wide-angle camera has a lens with an f/1.8 aperture, which is enough for most situations, including night shooting. The 12-megapixel telephoto camera now offers an f/2.4 aperture, whereas the telephoto lens of the iPhone 8 Plus is f/2.8. The second major innovation is optical image stabilization on both modules, something the telephoto cameras of the iPhone 7 Plus and iPhone 8 Plus clearly lacked. Now at 2x zoom, not only is quality preserved (thanks to less noise), but the risk of getting a blurry picture is also significantly reduced. It is the presence of an optical stabilizer in both photo modules at once that forced Apple to turn the camera bump from horizontal to vertical.
Another innovation in the iPhone X is support for shooting smooth high-definition video in 4K resolution. Users can now shoot 4K video at frame rates from 24 to 60 frames per second. Moreover, slow-mo video is now recorded in Full HD (1080p) at up to 240 fps. The front camera records video only at 1080p and 30 fps.
For video recording, Apple switched to the HEVC (H.265) standard, thanks to which the manufacturer was able to reduce the size of the final file at the same quality.
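The savings are easy to estimate with back-of-the-envelope arithmetic: file size is just bitrate times duration. The bitrates below are illustrative assumptions, not Apple's published figures; HEVC is generally cited as needing roughly half the bitrate of H.264 for comparable quality:

```python
def file_size_mb(bitrate_mbps, seconds):
    """Approximate file size in megabytes:
    bitrate (megabits/s) x duration (s) / 8 bits per byte."""
    return bitrate_mbps * seconds / 8

# Hypothetical bitrates for one minute of 4K video:
h264_size = file_size_mb(50, 60)  # H.264 at an assumed 50 Mbit/s
hevc_size = file_size_mb(25, 60)  # HEVC at roughly half that bitrate
```

Under these assumptions a minute of 4K drops from about 375 MB to about 188 MB, which matters a lot on a phone with fixed storage.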
What is Slow Sync Flash Technology
At the iPhone X presentation, Apple casually mentioned that the smartphone received a new shooting technology, Slow Sync Flash, thanks to which photos taken in the dark will look better.
To begin with, let's figure out what Slow Sync is. It is actually not a new technique: other smartphone manufacturers have been using it for several years.
Every camera has a shutter between the lens and the sensor. Most of the time it is closed. Its work can be compared to an eyelid that covers the human eye and opens only to let in the light needed to create a photograph.
By default, cameras shoot at fast shutter speeds. Such a short "session" of work "freezes" motion in the frame and, as a result, makes the photo sharper.
At a slow shutter speed everything happens the other way around: the shutter stays open longer, more light passes through, and motion does not "freeze". The main disadvantage of shooting in this mode is that, because of moving subjects or the shake of the photographer's hands, the picture may come out blurry.
However, if you know what you are doing, you can get a really interesting photo in the end. If you wish, you can set the shutter so that it stays open for hours on end, "absorbing" every particle of light and combining everything it collects into a single shot. This technique is used when photographing the starry sky.
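This "absorbing every particle of light" can be simulated by stacking many short frames, which is in fact how digital long exposures are often made. A minimal sketch, using made-up brightness values:

```python
def long_exposure(frames):
    """Simulate a long exposure by accumulating the light from many
    short frames: each pixel sums everything it saw over the whole
    interval, clipped to the sensor's maximum of 1.0 (pure white)."""
    acc = [0.0] * len(frames[0])
    for frame in frames:
        acc = [min(a + p, 1.0) for a, p in zip(acc, frame)]
    return acc

# A faint star contributes 0.05 per short frame; a dark-sky pixel only
# 0.001. After 10 stacked frames the star is clearly visible while the
# sky stays near black.
frames = [[0.05, 0.001]] * 10
stacked = long_exposure(frames)
```

The same accumulation explains the blur problem: anything that moves spreads its light across many pixels instead of piling it up in one place.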
But what happens if you add a flash to this? There are two main methods of flash synchronization: front-curtain and rear-curtain.
For example, suppose the shutter opens for 0.5 seconds. With front-curtain sync, the flash fires immediately after the shutter opens, and the shutter then stays open for the remaining half second. With rear-curtain sync, the shutter opens for half a second, and the flash fires moments before it closes.
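The timing difference between the two modes can be written down in a few lines. This is a sketch of the scheduling logic only; the flash duration is an assumed value:

```python
def flash_time(sync, shutter_opens_at, shutter_duration, flash_duration=0.01):
    """Return the moment the flash fires for a given sync mode.

    "front": the flash fires as soon as the shutter opens.
    "rear":  the flash fires just before the shutter closes.
    """
    if sync == "front":
        return shutter_opens_at
    if sync == "rear":
        return shutter_opens_at + shutter_duration - flash_duration
    raise ValueError("sync must be 'front' or 'rear'")
```

For the 0.5-second example above, front-curtain fires at t = 0.0 s and rear-curtain at t = 0.49 s. With moving subjects, rear-curtain sync puts the motion trail behind the sharp flash-frozen image instead of in front of it.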
Now add exposure. This setting determines how light or dark the picture will be. If you shoot at night without a flash (for example, a landscape), the camera can expose the background correctly. But if there is a person or object in the foreground, it will be underexposed and come out too dark.
To see the person or object in the foreground, you have to turn on the flash. The camera, in turn, lowers the exposure so that the flash-lit subject in the foreground is not overexposed.
The problem with this traditional flash technique is that the rest of the scene becomes darker, and as a result no details are visible in the background.
Now let's look at how Slow Sync shooting works on the iPhone X.
If the iPhone X camera "knows" that it is going to use the flash, it first lowers the exposure. The shutter opens, but does not close in the blink of an eye, as it did before. The flash fires just before the shutter closes, thereby "freezing" the image.
Since the shutter stays open for a longer period, it collects more light from the background, so the background becomes brighter. And since the shutter also collects more light from the person (or object) in the foreground, the flash no longer needs to fire at maximum power: it can be dimmer than before. As a result, the person or object in the foreground is no longer "blown out" the way it could be before.
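The trade-off described above can be captured in a toy brightness model: ambient light accumulates for the whole shutter time, while the flash adds a fixed amount only to subjects close enough to be lit by it. All numbers here are illustrative assumptions, not measured values:

```python
def exposure(shutter_s, ambient, flash_power=0.0, lit_by_flash=False):
    """Toy brightness model: ambient light accumulates over the shutter
    time; the flash adds a fixed boost, but only to nearby subjects.
    Brightness clips at 1.0 (pure white)."""
    light = ambient * shutter_s
    if lit_by_flash:
        light += flash_power
    return min(light, 1.0)

# Traditional flash: fast shutter + strong flash.
fg_fast = exposure(1 / 60, ambient=2.0, flash_power=0.8, lit_by_flash=True)
bg_fast = exposure(1 / 60, ambient=2.0)   # background: almost black

# Slow Sync: longer shutter + weaker flash.
fg_slow = exposure(1 / 4, ambient=2.0, flash_power=0.4, lit_by_flash=True)
bg_slow = exposure(1 / 4, ambient=2.0)    # background: visibly brighter
```

Even in this crude model the Slow Sync background comes out many times brighter while the foreground stays below clipping, which is exactly the effect the technique is after.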
The iPhone X is the first Apple smartphone with a truly "smart" camera. Thanks to the new modules, as well as the TrueDepth and Slow Sync Flash technologies, photos taken on the iPhone X are in most cases clear, sharp and high-quality. Another important innovation is support for smooth high-definition 4K video at frame rates from 24 to 60 frames per second. The new "apple" smartphone takes the best flash pictures in Apple's history, so if you like shooting in the dark, that is one more argument in favor of buying it.