It hasn’t been long since the 108MP camera sensor debuted on the Xiaomi Mi Note 10, and more recently it appeared on the Samsung Galaxy S20 Ultra. This is the highest-resolution camera sensor on smartphones at the moment, and only a few devices have it. Still, Samsung believes 108MP is not enough and is aiming for a 600MP camera sensor, which reportedly would capture more detail than the human eye.
According to scientist and photographer Dr Roger Clark, the human eye is capable of capturing up to 576 megapixels. That is huge when compared to the 108MP cameras on the Mi Note 10 and Galaxy S20 Ultra. A 600MP camera sensor in a smartphone may sound “crazy”, but that is exactly what Samsung is working towards.
To create its latest 108MP image sensor, Samsung used its Nonacell technology, which increases the amount of light its pixels can absorb. Now the company wants to push its pixel technology further, to the point where it can build a 600MP sensor.
Yongin Park, Samsung’s EVP and Head of the Sensor Business Team, System LSI Business, stated:
Taking pictures or videos throughout the day has become part of our normal lifestyles and is no longer done just to capture special events. Whip out your mobile camera to immortalize a delectable-looking meal, to record your latest dance moves, or even just when you’re having a good hair day, and you’re ready to share your images with friends right away. These seamless experiences have become possible thanks to remarkable advancements in recent mobile photography, and at the very heart of this revolution are the mobile chips that transform light into digital data – image sensors.
The image sensors we ourselves perceive the world through – our eyes – are said to match a resolution of around 500 megapixels (Mp). Compared to most DSLR cameras today that offer 40Mp resolution and flagship smartphones with 12Mp, we as an industry still have a long way to go to be able to match human perception capabilities.
Simply putting as many pixels as possible together into a sensor might seem like the easy fix, but this would result in a massive image sensor that takes over the entirety of a device. To fit millions of pixels in today’s smartphones that feature other cutting-edge specs like high screen-to-body ratios and slim designs, pixels inevitably have to shrink so that sensors can be as compact as possible.
On the flip side, smaller pixels can result in fuzzy or dull pictures, due to the smaller area that each pixel receives light information from. The impasse between the number of pixels a sensor has and pixels’ sizes has become a balancing act that requires solid technological prowess.
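The sensor-size problem described above is easy to see with some back-of-the-envelope arithmetic. The sketch below (my own illustration, not Samsung’s figures) estimates the die area a 600MP sensor would need at a few pixel pitches, assuming a 4:3 aspect ratio and ignoring peripheral circuitry:

```python
def sensor_dimensions_mm(megapixels, pixel_pitch_um, aspect=(4, 3)):
    """Return (width_mm, height_mm) of a sensor with the given pixel
    count and pitch, assuming a rectangular grid at the given aspect ratio."""
    total_pixels = megapixels * 1_000_000
    aw, ah = aspect
    # width_px / height_px = aw / ah  and  width_px * height_px = total_pixels
    height_px = (total_pixels * ah / aw) ** 0.5
    width_px = height_px * aw / ah
    to_mm = pixel_pitch_um / 1000.0  # μm per pixel -> mm
    return width_px * to_mm, height_px * to_mm

for pitch in (1.4, 0.8, 0.7):
    w, h = sensor_dimensions_mm(600, pitch)
    print(f"600MP at {pitch}um pixels: {w:.1f} x {h:.1f} mm")
```

Even at the HM1’s 0.8μm pitch, 600 million pixels work out to a die of roughly 22.6 × 17.0 mm – several times the area of today’s largest phone sensors – which is why shrinking the pixel itself is the only way forward for a slim device.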
Cutting-Edge Pixel Technologies
Drawing from the technology leadership and experience our memory business possesses, Samsung has managed to navigate this balance expertly in our image sensors. In May 2019, we were able to announce the industry’s first 64Mp sensor, and just six months later, brought 108Mp sensors to the market.
For our latest 108Mp image sensor, the ISOCELL Bright HM1, we implemented our proprietary ‘Nonacell technology,’ which dramatically increases the amount of light the pixels are capable of absorbing. Compared to the previous Tetracell technology, which features a 2×2 array, the 3×3 pixel structure of Nonacell technology allows, for instance, nine 0.8μm pixels to function as one 2.4μm pixel. This also mitigates issues in low-light settings, where light information is often scarce.
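The 3×3 grouping described above is, in essence, pixel binning. The sketch below illustrates the general idea only – the simple summing step and array shapes are my assumptions for demonstration, not Samsung’s ISOCELL pipeline:

```python
import numpy as np

def bin_pixels(raw, factor=3):
    """Combine factor x factor blocks of pixels into single output pixels
    by summing their values, trading resolution for light sensitivity."""
    h, w = raw.shape
    assert h % factor == 0 and w % factor == 0, "dimensions must divide evenly"
    return raw.reshape(h // factor, factor, w // factor, factor).sum(axis=(1, 3))

# A 6x6 grid of small, dim pixels becomes a 2x2 grid of brighter ones:
rng = np.random.default_rng(0)
raw = rng.poisson(lam=4, size=(6, 6))  # simulated photon counts per 0.8μm pixel
binned = bin_pixels(raw)               # each output pixel gathers 9 pixels' light
print(binned.shape)  # (2, 2)
```

Each binned pixel collects roughly nine times the signal of a single small pixel, which is why the technique helps in low light even though the output resolution drops by a factor of nine.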
In 2019, Samsung was also the first to introduce image sensors based on 0.7μm pixels. The industry had considered 0.8μm as the smallest possible size pixels could be reduced to, but to our engineers, ‘technological limitations’ are just another challenge that motivates their innovation.
Sensors that Go Beyond Our Senses
Most cameras today can only take pictures that are visible to the human eye at wavelengths between 450 and 750 nanometers (nm). Sensors able to detect light wavelengths outside of that range are hard to come by, but their use can benefit a wide range of areas. For example, image sensors equipped for ultraviolet light perception can be used for diagnosing skin cancer by capturing pictures to showcase healthy cells and cancerous cells in different colours. Infrared image sensors can also be harnessed for more efficient quality control in agriculture and other industries. Somewhere in the future, we might even be able to have sensors that can see microbes not visible to the naked eye.
Not only are we developing image sensors, but we are also looking into other types of sensors that can register smells or tastes. Sensors that even go beyond human senses will soon become an integral part of our daily lives, and we are excited by the potential such sensors have to make the invisible visible and help people by going beyond what our own senses are capable of.
Aiming for 600MP for All
To date, the major applications for image sensors have been in the smartphone field, but this is expected to expand soon into other rapidly emerging fields such as autonomous vehicles, IoT and drones. Samsung is proud to have been leading the small-pixel, high-resolution sensor trend that will continue through 2020 and beyond, and is prepared to ride the next wave of technological innovation with a comprehensive product portfolio that addresses the diverse needs of device manufacturers. Through relentless innovation, we are determined to open up endless possibilities in pixel technologies that might even deliver image sensors that can capture more detail than the human eye.