
Pixel 3/3 XL: How The Camera Works

2018-10-15
Google has been leveraging a lot of computational photography, machine learning, and AI to make its smartphone cameras better. We saw this in last year's Pixel 2 and Pixel 2 XL, and we're seeing even more of it this year with the Pixel 3 and Pixel 3 XL. There are a handful of new features in the Pixel 3 that make this camera better than last year's, so let's talk about what those features are, how they work, and how you can benefit from them. Before I dive in, you'll notice one very common theme among all of them: they all capture multiple photos and then leverage machine learning to achieve the desired result.

One of the new features Google talked about during its press event is called Top Shot. Top Shot isn't necessarily new; we saw Sony do something very similar with Predictive Capture. Basically, it takes a series of photos and then picks the one it thinks looks best. It uses a bunch of criteria to make that choice, like whether somebody is blinking, not looking directly at the camera, or smiling, and that's how it selects the recommended photo. You also have the option of picking any of the other photos it took. The other benefit of Top Shot is that even if you don't press the shutter button at exactly the right moment, you may still have captured the moment you wanted, because it's taking a series of photos instead of just one.

The next feature Google made a pretty big deal about is called Super Res Zoom. Super Res Zoom is Google's way of delivering a 2x zoom without a secondary telephoto lens. It takes advantage of the tiny movements of your hand while you're shooting: it captures a series of photos, each ever so slightly different, and then merges them using machine learning to create an image that is much more detailed and sharper when zoomed in than the standard digital zoom you'd get on other smartphone cameras. It's going to be pretty interesting to see how Super Res Zoom stacks up against smartphones that actually have a telephoto zoom lens. And the cool thing is that if the phone is stabilized on something like a tripod, it will deliberately introduce tiny movements of its own to mimic that natural hand shake, so it can still capture the extra detail when you're zoomed in.

With the Pixel 3 and Pixel 3 XL, Google is also adding a new feature called Night Sight. Night Sight isn't available on the Pixel 3 just yet at the time of recording, but it's going to let you take better low-light and nighttime photos. The way it works is that it sacrifices the nine-frame buffer and the zero shutter lag, and instead requires you to hold the phone steady while it captures up to a maximum of 15 frames, then merges those photos together to create an image with the equivalent of a five-second exposure. You're probably wondering: if you have to hold the phone steady, will the image come out blurry? Theoretically, the answer is no, because Google's merging algorithm is really smart and is able to discard anything with motion blur or unnecessary movement, so you should still get a photo that is very sharp, very detailed, and, most importantly, very well exposed.
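To give a rough sense of how that kind of burst merging works, here is a minimal Python sketch of the general idea behind features like Super Res Zoom and Night Sight: align a handful of frames to a reference, throw away anything that moved too much, and average what's left. This is only an illustration of the concept, not Google's actual pipeline; the phase-correlation alignment, the rejection threshold, the simple averaging, and the toy scene are all stand-ins chosen for the example.

```python
import numpy as np

def estimate_shift(reference, frame):
    """Estimate how far `frame` is translated relative to `reference`
    (in whole pixels) using phase correlation."""
    cross = np.fft.fft2(frame) * np.conj(np.fft.fft2(reference))
    cross /= np.abs(cross) + 1e-8
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = reference.shape
    if dy > h // 2:
        dy -= h   # wrap large positive shifts around to negative ones
    if dx > w // 2:
        dx -= w
    return dy, dx

def merge_burst(frames, max_shift=8):
    """Align every frame in a burst to the first one, drop frames that
    moved too much (treated as motion blur), and average the rest."""
    reference = frames[0].astype(np.float64)
    accumulator = reference.copy()
    used = 1
    for frame in frames[1:]:
        frame = frame.astype(np.float64)
        dy, dx = estimate_shift(reference, frame)
        if abs(dy) > max_shift or abs(dx) > max_shift:
            continue                      # too much motion: discard this frame
        accumulator += np.roll(frame, (-dy, -dx), axis=(0, 1))
        used += 1
    return accumulator / used             # averaging N frames cuts noise by ~sqrt(N)

# Toy example: one steady reference frame plus nine "hand-shaken" frames
# of the same scene, each with its own sensor noise.
rng = np.random.default_rng(0)
scene = rng.random((128, 128))
burst = [scene + rng.normal(0.0, 0.2, scene.shape)]
for _ in range(9):
    shift = rng.integers(-3, 4, size=2)
    burst.append(np.roll(scene, shift, axis=(0, 1)) + rng.normal(0.0, 0.2, scene.shape))

merged = merge_burst(burst)
print("single-frame noise:", np.std(burst[0] - scene))   # roughly 0.20
print("merged noise:      ", np.std(merged - scene))     # noticeably lower
```

The point of the toy example is just to show why merging helps: each frame has the same scene buried under different noise, so aligning and averaging keeps the scene and washes out the noise, which is the same basic reason a merged Night Sight shot looks cleaner than any single short exposure.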
The next feature I want to talk about is Portrait Mode. Portrait Mode isn't new to the Pixel; we saw it on the Pixel 2 and Pixel 2 XL, but Google has made one very significant change on the Pixel 3 and Pixel 3 XL that makes it even better. Without getting into a super long-winded explanation: the Pixel 2 used a stereo pair of images, leveraging the split pixels of its dual-pixel sensor to create two images that were ever so slightly different. That's how Google mimicked the effect of having two lenses with just one lens, and it used those two images to create a depth map to properly separate the foreground from the background. With the Pixel 3, Google is using what it calls a learning-based algorithm, and the benefit is much more accurate depth mapping, better separation of the foreground from the background, and better background defocusing (there's a rough sketch of how a depth map can drive the blur at the end of this post).

Let's take a look at this example for a second, with the Pixel 2 on the left and the Pixel 3 on the right. You'll notice right away that the Pixel 3 does a much better job of sensing depth and how far away the background is from the foreground, whereas the Pixel 2 sort of aggressively blurs everything out as if the background were much farther away from the subject, and some objects that are relatively on the same plane as the helmet are also being blurred out when they really shouldn't be. If you zoom in and look at my desk, you'll notice a very aggressive cutoff between in focus and out of focus on the Pixel 2, whereas on the Pixel 3 the transition from in focus to out of focus is much more gradual.

The last feature I want to mention is computational RAW. I don't think Google actually mentioned this during its press event, but the Pixel 3 and Pixel 3 XL can now take RAW photos. If you've ever edited photos in a program like Lightroom or Photoshop, you'll know how much more beneficial shooting in RAW can be, because you can push the colors, highlights, and black levels a lot further without the image falling apart the way a compressed JPEG would. The way the Pixel does it is by combining up to 15 images and merging them together, and this has a few benefits: one, you get better low-light performance; two, you get better dynamic range; and three, you get an image that is much closer to what you'd get on a DSLR, even though you're working with a really tiny sensor. This is a very big deal, especially if you take photography, or smartphone photography in particular, very seriously. You'll be able to edit these photos to look however you want, and you'll be able to access them from pretty much anywhere because they're available in your Google Photos app.

So that's it. Thank you for watching this video. I hope you all found it helpful and enjoyed it; if you did, give it a thumbs up and subscribe to the channel down below. And of course, keep it tuned here to Android Authority for more videos like this, because we are your source for all things Android.
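As promised above, here is a minimal Python sketch of the idea behind the Portrait Mode blur: given an image and a depth map, keep pixels near the focus plane sharp and blur everything else more strongly the farther it sits from that plane. This is only a conceptual illustration, not Google's code; the depth map is just an input array here (on the Pixel it would come from the dual-pixel data plus the learned model), and the stack of Gaussian blurs is a crude stand-in for a real lens blur.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def synthetic_bokeh(image, depth, focus_depth, max_sigma=6.0):
    """Blend progressively blurrier copies of `image` based on how far
    each pixel's depth is from the in-focus plane.

    image:       (H, W) grayscale image, float values in [0, 1]
    depth:       (H, W) normalized depth map, 0 = near, 1 = far
    focus_depth: depth value that should stay perfectly sharp
    """
    # Blur strength per pixel: 0 at the focus plane, max_sigma at the
    # point farthest from it.
    defocus = np.abs(depth - focus_depth)
    defocus = defocus / (defocus.max() + 1e-8)

    # Precompute a small stack of uniformly blurred images, then pick,
    # per pixel, between the two nearest blur levels.
    sigmas = np.linspace(0.0, max_sigma, 8)
    stack = np.stack([gaussian_filter(image, s) if s > 0 else image
                      for s in sigmas])
    level = defocus * (len(sigmas) - 1)
    low = np.floor(level).astype(int)
    high = np.clip(low + 1, 0, len(sigmas) - 1)
    frac = level - low
    rows, cols = np.indices(image.shape)
    return (1 - frac) * stack[low, rows, cols] + frac * stack[high, rows, cols]

# Example: a bright square "subject" close to the camera (depth ~0.2)
# against a far background (depth ~0.9). The subject stays sharp while
# the background gets the strongest blur.
image = np.full((120, 120), 0.3)
image[40:80, 40:80] = 1.0
depth = np.full((120, 120), 0.9)
depth[40:80, 40:80] = 0.2
result = synthetic_bokeh(image, depth, focus_depth=0.2)
```

The blending between neighbouring blur levels is what gives the gradual in-focus to out-of-focus falloff described in the desk comparison above, which is exactly where an accurate depth map matters most.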