News of Google’s Pixel 4 camera is everywhere: it’s essentially a new and improved camera. The improvements were explained by Professor Marc Levoy, who leads camera technology development at Google Research. Levoy said that what matters most to a good photo is, first, the subject, then the lighting, and only after that your hardware (lens and camera body). He and his team believe a different equation is now at play, one that replaces the camera-body component with something else: software.
Levoy also noted that the lens still matters, and the Pixel 4 reflects that by adding a telephoto lens alongside its existing wide-angle one.
Levoy calls their approach a “software-defined camera,” which most of the time just means capturing multiple photos, and combining data from each in order to produce a better, single, final picture.
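One common form of that multi-frame merging is exposure bracketing: combine a short and a long exposure so that clipped highlights come from the short frame and cleaner shadows from the long one. The sketch below is a toy illustration of the idea, not Google’s actual HDR+ pipeline; the gain and clipping threshold are assumed values.

```python
import numpy as np

def merge_exposures(short_exp, long_exp, gain=4.0, threshold=0.95):
    """Toy two-frame HDR merge (not Google's HDR+ algorithm).

    Both frames are floats in [0, 1]. The long exposure gathered
    `gain` times the light, so it is cleaner in the shadows but
    clips in bright areas; where it clipped, fall back to the
    short frame. The result is a linear radiance estimate on the
    short frame's scale.
    """
    clipped = long_exp >= threshold
    return np.where(clipped, short_exp, long_exp / gain)

# a bright sky clips in the long frame; the dark ground benefits from it
short = np.array([0.30, 0.05])             # sky, ground at 1x exposure
long_ = np.clip(short * 4.0, 0.0, 1.0)     # [1.0, 0.2] -- sky clipped
merged = merge_exposures(short, long_)     # [0.30, 0.05]
```

The key design point is that each output pixel draws on whichever frame measured it best, which is the “combining data from each” Levoy describes.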
Four new Pixel 4 features are powered by computational photography. The first is Live HDR with dual-exposure controls: instead of handing you a final shot that looks very different from the preview, the viewfinder shows a real-time approximation of what the photo will look like with the HDR treatment applied. It also bakes in exposure controls that let you adjust the highlights and shadows in the image on the fly, which is useful if you want bolder highlights or, say, to render shadows as silhouettes.
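Separate highlight and shadow sliders can be thought of as two luminance-weighted gains applied along a tone curve. A minimal sketch under an assumed linear weighting scheme (not the actual Live HDR implementation):

```python
import numpy as np

def tone_adjust(img, highlights=0.0, shadows=0.0):
    """Apply separate gains to highlights and shadows.

    img: float array in [0, 1]. Positive `shadows` lifts dark
    regions; negative crushes them toward silhouette. `highlights`
    likewise scales the bright end. The two gains blend smoothly
    by pixel luminance, so midtones get a mix of both.
    """
    w_hi = img          # bright pixels weigh toward the highlight gain
    w_lo = 1.0 - img    # dark pixels weigh toward the shadow gain
    gain = 1.0 + highlights * w_hi + shadows * w_lo
    return np.clip(img * gain, 0.0, 1.0)

pix = np.array([0.1, 0.5, 0.9])
silhouette = tone_adjust(pix, shadows=-0.8)  # crush shadows, keep highlights
```

With `shadows=-0.8`, the dark pixel drops near black while the bright pixel barely moves, which is exactly the silhouette case the article mentions.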
There is also a “learning-based white balance,” which tackles the problem of getting white balance right, correcting the color cast that tricky lighting can introduce. It also helps when the lighting is poor.
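Google’s version is learning-based, but the classical baseline such approaches improve on is the gray-world heuristic: assume the average scene color is neutral gray and rescale each channel so the per-channel means match. A minimal sketch of that heuristic, for contrast:

```python
import numpy as np

def gray_world_wb(img):
    """Gray-world white balance (classical heuristic, not the
    Pixel 4's learned model): assume the average scene color is
    gray and scale each channel so the channel means are equal."""
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel averages
    gains = means.mean() / means              # push each toward the gray mean
    return np.clip(img * gains, 0.0, 1.0)

# an image with a blue cast: the blue channel mean is too high
img = np.full((2, 2, 3), [0.4, 0.4, 0.6])
balanced = gray_world_wb(img)   # all channel means equalized
```

The heuristic fails when the scene genuinely isn’t gray on average (a snow field, a sunset), which is precisely where a learned model has the advantage.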
The new wide-range portrait mode uses depth information from both the dual-pixel image sensor the Pixel 4 employs and the new second lens, deriving more depth data to provide an expanded, more accurate portrait mode that separates the subject from the background. It now works on large objects and on portraits where the person in focus stands farther back, and it renders better bokeh shapes (the shapes of the defocused elements in the background) and better definition of strands of hair and fur, which have always been tricky for software background blur.
Finally, Night Sight mode is improved, and there is a new astrophotography mode for capturing the night sky and star fields. Levoy also said that the team plans to improve the camera over time via software updates, so this is just the start for the Pixel 4.
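Astrophotography modes generally rest on frame stacking: capture many aligned long exposures and average them, which cuts random sensor noise by roughly the square root of the frame count. A toy demonstration with synthetic noise (not Night Sight’s actual pipeline, which also aligns frames and rejects motion):

```python
import numpy as np

rng = np.random.default_rng(0)
true_sky = 0.02     # faint, constant star-field brightness
# 16 noisy frames of the same 1000-pixel strip of sky
frames = true_sky + rng.normal(0.0, 0.05, size=(16, 1000))

single_noise = frames[0].std()     # noise level of one frame (~0.05)
stacked = frames.mean(axis=0)      # average the aligned frames
stacked_noise = stacked.std()      # ~ single_noise / sqrt(16)
```

A 4-minute astrophotography capture built from sixteen 15-second frames therefore sees far fainter stars than any single frame could, without the star trails a single 4-minute exposure would smear.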