On iPhone X, like iPhone 8 Plus and iPhone 7 Plus, Apple uses a dual-lens camera system to capture both the image and a layered depth map. It then uses machine learning to separate the subject and apply a custom disc-blur to the background and foreground layers. (It was 9 layers on iOS 10; it may be more by now, including foreground and background layers.) Because of the layers, it can apply the custom disc-blur to lesser and greater degrees depending on the depth data. So, closer background elements can receive less blur than background elements that are further away.

Apple can display the portrait mode effect live during capture, and stores depth data as part of the HEIF (high-efficiency image format) or stuffs it into the header for JPG images. That way, it's non-destructive and you can toggle depth mode on or off at any time; see the sketch below for one way to read that data back out.

In practice, Apple's Portrait Mode looks overly "warm" to me. It appears as though the iPhone's camera system is allowing highlights to blow out in an effort to preserve skin tones. It's generally consistent in how it applies the blur effect, but it can be far too soft around the edges. In low light, the custom disc-blur can look gorgeous, and the noise seems deliberately pushed away from a mechanical pattern and into an artistic grain. The result is imperfect images that pack powerful emotional characteristics.

On Pixel 2 and Pixel 2 XL, Google uses machine learning to analyze the image and create a segmentation mask to separate the subject from the background.
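As a side note on the Apple pipeline above: the depth payload written into those HEIF/JPG files is readable with public iOS APIs. Here's a minimal Swift sketch, assuming iOS 11 or later, that pulls the embedded disparity map out of a portrait photo and uses it to drive a depth-weighted blur. The function name `portraitBlurred(url:)` is hypothetical, and Core Image's `CIMaskedVariableBlur` stands in for Apple's own disc blur rather than reproducing it.

```swift
import AVFoundation
import CoreImage
import ImageIO

/// Hypothetical helper: read the disparity map embedded in a Portrait Mode
/// HEIF/JPG and blur the background more the farther away it is.
func portraitBlurred(url: URL, maxRadius: Double = 12.0) -> CIImage? {
    // Open the file and look for the auxiliary disparity data iOS stores
    // alongside the main image.
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
          let depthData = try? AVDepthData(fromDictionaryRepresentation: auxInfo),
          let image = CIImage(contentsOf: url) else {
        return nil // Not a portrait photo, or no depth payload present.
    }

    // Normalize to 32-bit disparity; larger values mean closer to the camera.
    let disparity = depthData.converting(toDepthDataType: kCVPixelFormatType_DisparityFloat32)
    var mask = CIImage(cvPixelBuffer: disparity.depthDataMap)

    // The disparity map is lower resolution than the photo, so scale it up.
    mask = mask.transformed(by: CGAffineTransform(
        scaleX: image.extent.width / mask.extent.width,
        y: image.extent.height / mask.extent.height))

    // Clamp to [0, 1] and invert so near subjects get little blur and the
    // far background gets the most, echoing the layered behavior above.
    mask = mask.applyingFilter("CIColorClamp", parameters: [:])
               .applyingFilter("CIColorInvert", parameters: [:])

    // CIMaskedVariableBlur scales its radius per pixel by the mask value;
    // it's a rough stand-in for Apple's custom disc blur, not the same thing.
    return image.applyingFilter("CIMaskedVariableBlur", parameters: [
        "inputMask": mask,
        kCIInputRadiusKey: maxRadius
    ])
}
```

The inversion step is the whole trick: disparity is high for near pixels and low for far ones, so flipping it gives a mask where distant background gets the strongest blur, which is exactly the depth-dependent behavior described above.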