Jimmy Westenberg / Android Authority
I firmly believe that for most users, the camera drives the smartphone upgrade cycle. Better performance is always welcome, but now that even mid-range hardware runs smoothly, performance metrics are no longer the key to purchasing decisions. Imaging, on the other hand, delivers the most obvious year-on-year improvement, and Google has leaned on photography since the very first Pixel launched.
The irony is that although its phones’ popularity rests largely on their imaging capabilities, Google’s camera hardware development has been unexpectedly sluggish.
Did you know the Pixel series has used the same primary camera sensor since the Pixel 3 launched in 2018? That sensor was no different from the one in the Pixel 2 before it. Or take the Pixel 5: it finally added an ultra-wide camera, but skipped a table-stakes feature like a telephoto lens. Instead, Google stuck with its software-based Super Res Zoom, which works to a point but can’t match true optical zoom. Conversely, the company included a telephoto lens on the Pixel 4 the year before, but chose to leave out an ultra-wide camera, something you absolutely cannot replicate in software.
The Pixel series is a classic case of Google building consumer products with an engineer’s mindset.
Google’s imaging strategy, and its smartphone strategy overall, runs counter to the spec-driven approach of nearly every other Android OEM. For most of the past four years, Google has squeezed everything it could out of the Pixel’s camera sensor in an almost Apple-like manner, building consumer products with an engineer’s mindset. Yet even Apple opted for hardware upgrades rather than reinventing the wheel in software.
All of this changes with the Pixel 6 and Pixel 6 Pro, and that is a very exciting prospect.
Why Google is behind the camera curve
Robert Triggs/Android Authority
Let’s start with the obvious: Google has been pushing the aging IMX363 sensor to its limits. Our own tests revealed how far the Pixel 5 lags behind the competition. From HDR noise to zoom capability to the lackluster ultra-wide camera, some things even software cannot overcome.
Read more: Google Pixel 5 camera versus the best Android camera phones
You can blame Marc Levoy, the former head of the Pixel camera team, for this aversion to change. In an interview published around the Pixel 5, Levoy said he didn’t believe pixel binning, and the resulting boost in signal-to-noise ratio from high-resolution sensors, would bring tangible imaging improvements. That may have been true in 2019, but phones making extensive use of these sensors have since proven otherwise.
Although few can match Google’s software capabilities, sensor improvements have let competitors overcome many hardware limitations. Huawei was first to use RYYB sensors to achieve night-vision-like low-light capture, while Sony has drawn on its camera division’s expertise to improve color science. Others, such as OnePlus, have partnered with traditional camera makers like Hasselblad to stay competitive.
Where Google’s software once compensated for hardware deficits, camera sensors have since caught up and surpassed what it can do.
Elsewhere, the BBK Group has invested heavily in imaging; phones like the Oppo Find X3 pack an array of camera sensors to cover every conceivable use case. Xiaomi has joined the fray too. The Mi 11 Ultra is one of the best-equipped camera flagships around, not just in hardware but in its excellent camera tuning.
Where Google was once a mile ahead, it is now at best keeping pace with its competitors, and lagging behind in more ways than one.
A new sensor gives Google’s software the hardware it needs
If the thinking behind the Pixel series shows us one thing, it’s that Google isn’t interested in competing a quarter-mile at a time. It prefers to take one big leap forward and then perfect the hardware. With Levoy no longer at the helm, Google seems to have recognized the flaws in its earlier thinking.
An upgraded camera sensor is exactly what the Pixel 6 series needs to up its game, and it’s exactly what it gets. Don’t get me wrong: Google’s software prowess continues to ensure Pixel phones rank among the best camera phones. That software pushes the hardware to peak performance, but we already know Google’s imaging algorithms truly shine on high-end hardware.
Read more: What to expect from the Google Pixel 6 camera
Ports of the Google Camera app already run on phones with newer sensors, and the results are convincing. With Google adopting the latest sensors, its already excellent software can make the most of years of hardware advancement. Google’s commitment to baking artificial intelligence and machine learning into its future through the Tensor chipset will only improve this further. But this goes beyond the tangible, if expected, upgrade in image quality.
Newer sensors bring fundamental enhancements such as faster focusing. Although Google hasn’t revealed exactly which camera sensor it will use, the latest Android 12 beta leak suggests the Pixel 6 series may use Samsung’s ISOCELL GN1 as its main wide-angle camera. Notably, this isn’t Samsung’s newest sensor; that would be the ISOCELL GN2, which can use a full sensor readout to detect phase changes and compare pixels in multiple directions to aid focusing. Remember, this is still Google, so it’s no surprise we may not see the very latest tech. We take what we can get.
In any case, the GN1 has plenty to offer. For example, Google’s astrophotography mode can already capture images of the stars. Increasing the effective pixel size through pixel binning greatly increases the amount of light falling on the sensor. More light capture means you should be able to achieve similar results in less time, perhaps even handheld.
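To see why binning helps, here’s a rough sketch of the arithmetic behind 2x2 pixel binning. This is my own toy illustration with made-up signal and noise figures, not Google’s or Samsung’s actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 8x8 sensor patch: a constant scene signal plus
# uncorrelated read noise at each photosite (arbitrary units).
signal = 100.0
read_noise = 20.0
raw = signal + rng.normal(0.0, read_noise, size=(8, 8))

# 2x2 pixel binning: sum each 2x2 block into one larger effective pixel.
binned = raw.reshape(4, 2, 4, 2).sum(axis=(1, 3))

# Signal grows 4x, but uncorrelated noise grows only sqrt(4) = 2x,
# so the signal-to-noise ratio roughly doubles.
snr_single = signal / read_noise              # 5.0
snr_binned = (4 * signal) / (2 * read_noise)  # 10.0
print(snr_single, snr_binned)
```

That doubled SNR is what lets a binned sensor reach the same image quality with a shorter exposure, which is exactly the handheld-astrophotography argument above.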
Or how about Night Sight? In low light, the Pixel 5’s exposure times can run from three to five seconds, sometimes even longer. Higher sensitivity reduces the exposure time required, and combined with Google’s software techniques, you could get near-instant low-light photos with the same fidelity that has made the Pixel a favorite for low-light photography.
Related: The best camera phones you can get
The natural depth of field of a high-resolution sensor, combined with Google’s excellent portrait-mode algorithms, should in theory produce more natural bokeh. And the 4x optical zoom on the Pixel 6 Pro, combined with Super Res Zoom enhancements and the high-resolution sensor, could enable seamless zooming from 1x to well beyond the optical limit without a visible drop in quality along the way.
Speaking of zoom, the Pixel 6 Pro at last gets a standard triple-camera setup with the aforementioned telephoto and ultra-wide lenses. This has been common on almost every recent high-end flagship for a few years now, but Google wavered between zoom and ultra-wide photography across the Pixel 4 and Pixel 5 generations. The Pixel 6 may miss out on the telephoto camera, but at least the top model covers all the bases.
A custom Tensor chip plays to Google’s software strengths
Robert Triggs/Android Authority
Hardware isn’t the only area where the Pixel 6 series gets attention: its legendary computational photography is receiving upgrades too.
From the beginning, the Pixel photography experience has been steeped in AI, from HDR+ on the original Pixel to the Pixel Visual Core that accelerated HDR processing on the Pixel 2. The tight AI integration in Google’s custom Tensor chip should take this to a new level and enable even more features.
Aligning and merging up to ten high-resolution frames can strain a general-purpose processor. A chip designed specifically for the task? Not so much. Imagine a burst mode that captures every individual frame with the same fidelity as a single HDR+ shot.
In fact, Pixel 6 demos hint at a very interesting future. One demo, for example, showed data from a secondary camera being used to sharpen a blurred face. The same technique could extend to better object removal.
The upgraded camera hardware finally puts the Pixel series on a level playing field with competitors.
Video has never been the Pixel’s strong suit, but the upgraded sensors should help bring it closer to par in both quality and features. It took Google years to join the 4K/60fps capture trend, and slow-motion capture is still limited to Full HD at 120fps. Newer sensors and processors could push that all the way to 480fps, at better quality to boot.
Meanwhile, Samsung, Xiaomi, and others have already pushed into 8K video. Do you need 8K footage? Probably not. But the possibility of downsampling 8K material to 4K for better detail and color is exciting, and a faster ISP paired with the Tensor chipset could make that future a reality.
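Why would 8K-captured footage downsampled to 4K beat native 4K? Each output pixel gets to combine several source samples, which suppresses noise and preserves fine detail. A minimal sketch, using a tiny stand-in array rather than a real 7680x4320 frame:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "8K" frame (shrunk to 8x8x3 for the demo; a real 8K frame is 7680x4320).
frame_8k = rng.random((8, 8, 3))

# Downsample 2x per axis by averaging 2x2 pixel blocks: every output pixel
# combines four source samples, averaging away noise in the process.
h, w, c = frame_8k.shape
frame_4k = frame_8k.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

print(frame_4k.shape)  # (4, 4, 3)
```

Doing this block averaging at 30 frames per second on full-size frames is what demands the faster ISP mentioned above.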
Or what about HDR video imbued with the same software magic that makes the Pixel’s still camera so special?
Pixel cameras have always been reliable, but the Pixel 6 makes them exciting again
Robert Triggs/Android Authority
I’m excited about the possibilities here. It’s been a while since Google did something genuinely new with the Pixel series (no, I’m not counting the failed Soli experiment). The Pixel 6 Pro has been completely redesigned inside and out, and it’s exactly the breath of fresh air Google’s hardware effort needed; the same goes for its camera. It remains to be seen how much difference fancy neural networks will make, but the upgraded camera hardware alone puts the Pixel 6 Pro on a level where it can genuinely compete with its rivals. That should go a long way toward camera innovation in the years ahead.
Are you excited about the Pixel 6 series, or is your current phone’s camera good enough for your needs? Let us know in the comments.