Google's Pixel 4 and 4 XL are the first of the company's phones, across both the Pixel and Nexus lineups, to use dual rear cameras. In its latest AI Blog post, Google explains how it improved depth estimation with the dual cameras — the distance information Portrait Mode needs to decide what to blur and by how much, so results more closely match the natural bokeh of an SLR lens. Rather than relying on the second camera alone, Google combines it with the dual-pixel autofocus system it already used for depth. On the Pixel 2 and Pixel 3, Google split each pixel on the sensor in two, and the tiny viewpoint difference between the two halves provided the parallax needed to estimate depth.
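The benefit of the second camera comes down to baseline: the two halves of a dual pixel sit a fraction of a millimeter apart, while the two rear cameras sit roughly a centimeter apart, so the same subject shifts by far more pixels between the camera views. A minimal sketch using the standard stereo triangulation relation (depth = f · B / d) makes this concrete; the focal length and baseline figures here are illustrative assumptions, not Google's published specifications.

```python
def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Stereo triangulation: depth = f * B / d (result in mm)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

def disparity_for_depth(focal_px: float, baseline_mm: float, depth_mm: float) -> float:
    """Inverse relation: pixel shift of a point at depth_mm between the two views."""
    return focal_px * baseline_mm / depth_mm

# Illustrative (assumed) numbers: ~1 mm baseline between dual-pixel halves,
# ~13 mm between the two rear cameras, focal length ~3000 px.
F, DUAL_PIXEL_B, DUAL_CAM_B = 3000.0, 1.0, 13.0
for depth_m in (0.5, 2.0, 5.0):
    d_dp = disparity_for_depth(F, DUAL_PIXEL_B, depth_m * 1000)
    d_dc = disparity_for_depth(F, DUAL_CAM_B, depth_m * 1000)
    print(f"{depth_m} m away: dual-pixel {d_dp:.2f} px, dual-camera {d_dc:.2f} px")
```

With these assumed figures, a subject 5 m away shifts by under a pixel between dual-pixel views but by several pixels between the two cameras, which is why the wider baseline helps depth estimation for distant subjects in particular.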
Source: https://www.gsmarena.com/google_outlines_how_the_pixel_4s_dual_cameras_capture_depth_in_portrait_photos-news-40597.php