Image sensors, especially ones with small pixels like those in smartphones, feature an on-chip microlens array that guides the arriving light onto each pixel’s photodiode. Until now each pixel has had its own lens, but Sony has developed a way for four adjacent pixels to share one.
This is aimed at Quad Bayer sensors, where groups of four pixels already share a color filter. Differences in sensitivity between pixels under a shared lens previously prevented such designs, but Sony says it has solved the problem and calls its technology 2×2 On-Chip Lens (OCL).
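For illustration, here is a minimal Python sketch of a Quad Bayer layout: each 2×2 block of pixels sits under one color filter, and the blocks themselves follow the familiar RGGB Bayer arrangement, so the full pattern repeats every 4×4 pixels. The helper name and the RGGB block order are assumptions for illustration, not Sony’s exact layout.

```python
import numpy as np

def quad_bayer_cfa(height, width):
    """Build a Quad Bayer color filter array pattern.

    Each 2x2 block of pixels shares one color filter, and the
    blocks themselves follow a standard Bayer (RGGB) arrangement,
    so the full pattern repeats every 4x4 pixels.
    """
    # One 4x4 tile: a 2x2 red block, two 2x2 green blocks, a 2x2 blue block
    tile = np.array([
        ['R', 'R', 'G', 'G'],
        ['R', 'R', 'G', 'G'],
        ['G', 'G', 'B', 'B'],
        ['G', 'G', 'B', 'B'],
    ])
    reps_y = -(-height // 4)  # ceiling division
    reps_x = -(-width // 4)
    return np.tile(tile, (reps_y, reps_x))[:height, :width]

print(quad_bayer_cfa(4, 8))
```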
The main advantage of this design seems to be improved autofocus performance, especially in the dark. Conventional Phase Detection Autofocus relies on dedicated focus pixels whose output is not part of the final image – which means they have to be spaced out sparsely, reducing coverage.
With 2×2 OCL, all pixels can be used as part of the focus detection system. And because the four pixels under one lens are split both horizontally and vertically, phase differences can be detected in both directions, which solves a related issue where scenes with little horizontal detail degrade the AF system’s performance.
Also, keep in mind that focus pixels experience noise just like their image-capturing neighbors, which makes calculating the right focus tricky.
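To make the idea concrete, here is a rough, hypothetical sketch of how phase detection works: the pixels on the left and right halves of each shared lens see the scene from slightly different angles, and the shift that best aligns the two views indicates the defocus. Averaging across many pixel rows before comparing is one simple way to suppress the per-pixel noise mentioned above. The function name and approach here are an illustration of the general principle, not Sony’s actual algorithm.

```python
import numpy as np

def pdaf_disparity(left, right, max_shift=8):
    """Estimate the phase shift between the 'left' and 'right'
    sub-pixel views seen under the shared micro lenses.

    left, right: 1D intensity profiles, e.g. built by averaging
    many sub-pixel rows; averaging suppresses per-pixel noise,
    which is what makes dense (all-pixel) phase detection robust.
    Returns the integer shift that best aligns the two profiles.
    """
    best_shift, best_err = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        # Compare only the overlapping region for this candidate shift
        if shift >= 0:
            a, b = left[shift:], right[:len(right) - shift]
        else:
            a, b = left[:shift], right[-shift:]
        err = np.mean((a - b) ** 2)  # mean squared difference
        if err < best_err:
            best_shift, best_err = shift, err
    return best_shift  # 0 means in focus; sign gives defocus direction

# Toy usage: a defocused edge appears shifted between the two views
x = np.linspace(0, 1, 100)
scene = np.clip((x - 0.5) * 10, 0, 1)                 # a soft edge
left_view = scene + np.random.normal(0, 0.02, 100)    # noisy sub-pixels
right_view = np.roll(scene, -3) + np.random.normal(0, 0.02, 100)
print(pdaf_disparity(left_view, right_view))          # ~3 when defocused
```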
Here’s a quick demo of the new tech in action:
[Demo: conventional sensor]
[Demo: Sony’s 2×2 OCL sensor]
Sony also claims that the 2×2 lens system improves light-gathering efficiency when pixel binning is used.
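As a rough sketch of what binning does, the toy function below sums each 2×2 block of a raw frame into one output pixel; since the four binned pixels on a Quad Bayer sensor already share a color filter (and, with 2×2 OCL, a lens), this trades resolution for sensitivity. This is an illustration of the concept, not Sony’s implementation.

```python
import numpy as np

def bin_2x2(raw):
    """Sum each 2x2 block of same-color pixels into one output pixel.

    Binning yields a quarter-resolution image where each output
    pixel collects the light of four input pixels, improving
    sensitivity in low light.
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    return raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# Toy usage: an 8x8 raw frame bins down to 4x4
raw = np.random.randint(0, 1024, size=(8, 8))
print(bin_2x2(raw).shape)  # (4, 4)
```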
It’s not clear how long it will take for this new technology to make its way into smartphones, but things have been moving very quickly recently (we went from 48MP through 64MP to 108MP resolution in under a year).