Sony Embeds AI in Camera Module
May 17, 2020
Sony unveiled the IMX500 and IMX501, two 12.3-megapixel sensors with onboard AI processing chips, designed to handle "light" machine learning tasks — like recognizing if a stray dog or cat enters your backyard — on their own, without sending any video to the cloud or another system. Instead, they can deliver anonymous metadata pings to alert you about what they've seen.
Figure 1: Sony’s Intelligent Visual Sensor
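The on-device flow described above, running inference inside the sensor and emitting only a small metadata record instead of video, can be sketched as follows. This is a hypothetical illustration: the field names and functions are invented for clarity and do not reflect Sony's actual IMX500 output format.

```python
# Hypothetical sketch of a metadata-only pipeline: the sensor keeps the
# image on-chip and emits only a small descriptive record. Field names
# are illustrative, not Sony's actual IMX500 output format.

def on_sensor_inference(frame):
    """Stand-in for the on-chip model: returns labels, never pixels."""
    # A real IMX500 would run a lightweight neural network here; we fake it.
    detected = "dog" if frame.get("contains_dog") else "none"
    return {"label": detected, "confidence": 0.93, "timestamp_ms": 1589700000000}

def alert_if_animal(metadata):
    """Downstream consumer sees metadata only -- no video leaves the camera."""
    return metadata["label"] in {"dog", "cat"}

ping = on_sensor_inference({"contains_dog": True})
print(alert_if_animal(ping))  # True
```

The privacy argument falls out of the structure: the only object crossing the network boundary is the small `ping` dictionary, never the frame itself.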
Mark Hanson, Sony's VP of Technology and Business Innovation, said the technology could make cameras significantly more useful. It also improves privacy, since the sensors handle AI tasks entirely on-device.
Onboard AI also means the information is processed in real-time. A camera at the front of a store could count the number of people entering and detect the presence of a gun. Another camera could keep track of item stock on store shelves, while others could monitor the flow of foot traffic to determine shopping "hot spots."
These image sensors won't replace the work of sophisticated cloud-based machine learning tools, which can go far beyond just recognizing objects. They're a step towards a future where cameras can function more like self-contained computers.
And they can also assist more complex computer vision systems. In a cashier-less store like Amazon's, where huge volumes of video are captured and processed by dozens of cameras, Sony's sensors could pinpoint exactly where customers are located.

Samples of the IMX500 have already been sent to Sony's partners, while the IMX501, which is packaged to fit onto devices like smart boards, will start shipping in June. Cameras built around the sensors should start shipping by the end of the year and throughout 2021.
The LiDAR camera of the new iPad Pro is the first ever consumer CMOS Image Sensor (CIS) product with in-pixel connection – and a single photon avalanche diode (SPAD) array. There was a space race for this new generation of 3D sensing chip, but most believed ST Microelectronics would be the first producer.
Figure 2: Single-Photon Avalanche Diode (SPAD) Technology
Source: Yole Développement
ST Microelectronics was chosen for the TrueDepth camera in the iPhone X. The SPAD array, or direct Time-of-Flight (dToF), camera was on the roadmap of all CIS players. ST Microelectronics had the lead and was shipping millions of SPAD detectors, until Sony's technology entered the race. In three years' time, Sony's process engineers brought to reality a product that will have long-lasting consequences not only in the consumer world but also in industrial robotics and automotive: in short, all sensing markets.
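The dToF principle behind the SPAD array is simple to state: the sensor times a laser pulse's round trip and halves it to get distance. A minimal sketch, with illustrative numbers rather than measured iPad Pro figures:

```python
# Direct Time-of-Flight (dToF): distance = (speed of light x round-trip time) / 2.
# Timing values here are illustrative, not measured iPad Pro figures.

C = 299_792_458.0  # speed of light, m/s

def dtof_distance_m(round_trip_s: float) -> float:
    """Convert a photon round-trip time into a one-way distance in metres."""
    return C * round_trip_s / 2.0

# A pulse returning after ~33.4 ns corresponds to roughly 5 m:
print(round(dtof_distance_m(33.356e-9), 2))  # ~5.0
```

The directness is what distinguishes dToF from the indirect (iToF) approach mentioned later, which infers distance from the phase shift of a modulated signal rather than from a timed pulse.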
A year ago, when Sony renamed its semiconductor division “Imaging & Sensing”, the sensing part was limited to Industrial Machine Vision image sensors, which remained a high-end niche market. Then it made two separate moves. The first was the supply of iToF sensors to Huawei and Samsung, generating $300M in 2019. The second was a design win of dToF sensors for Apple iPads, which could eventually end up in iPhones. Sony’s sensing revenues will probably exceed $1B in 2020 out of a business just surpassing the $10B landmark. This successful transition from imaging to sensing has been instrumental and will be a building block for the prosperous future of the division.
Figure 3: Pixel Size Roadmap
Source: Yole Développement
ST Microelectronics’ ams division is still doing well, but the dToF array would have required capital beyond what ST Microelectronics has already invested in the CIS business. The Apple–Sony deal may yet open opportunities for ST Microelectronics to launch an unexpected mission of its own. In terms of technology innovation, the LiDAR module of the new iPad Pro is as densely packed as the Apollo program’s pioneering Eagle lander: in a small footprint it is a complex optical module, holding the sensor, a Vertical Cavity Surface Emitting Laser (VCSEL) and a VCSEL driver.
Figure 4: Apple iPad Pro LiDAR Cross Section
Source: SystemPlus Consulting
Sony’s 30K-resolution, 10µm-pixel sensor uses dToF technology with a SPAD array. The in-pixel connection between the CIS and the logic wafer is realized with hybrid-bonding Direct Bonding Interconnect technology, the first time Sony has used 3D stacking for its ToF sensors. Metal-filled deep trench isolation completely isolates the pixels from one another.
Figure 5: Apple iPad LiDAR 3D Sensor Cross-Section
Source: SystemPlus Consulting
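Because a SPAD pixel fires on single photons, a dToF sensor like the one above typically accumulates returns from many laser pulses into a histogram of arrival-time bins and takes the peak bin as the range. A hedged sketch of that peak-picking step; the bin width and photon counts are invented, not taken from the Sony sensor:

```python
# Illustrative SPAD range extraction: histogram photon arrival times,
# take the modal bin, convert its centre to distance. Bin width and
# photon counts are invented for the example.
from collections import Counter

BIN_WIDTH_S = 1e-9           # 1 ns timing bins (assumed)
C = 299_792_458.0            # speed of light, m/s

def range_from_timestamps(arrival_times_s):
    """Histogram SPAD photon arrival times; convert the peak bin to distance."""
    bins = Counter(int(t / BIN_WIDTH_S) for t in arrival_times_s)
    peak_bin, _ = bins.most_common(1)[0]
    round_trip = (peak_bin + 0.5) * BIN_WIDTH_S  # use the bin centre
    return C * round_trip / 2.0

# Mostly ~13 ns returns (signal) plus a few stray ambient photons (noise):
photons = [13.2e-9] * 50 + [13.4e-9] * 30 + [2.0e-9, 25.0e-9, 7.7e-9]
print(round(range_from_timestamps(photons), 2))  # ~2.02 m
```

The histogram is what makes the scheme robust to ambient light: stray photons scatter across bins, while true returns pile up in one.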
Apart from the CIS from Sony, the LiDAR is equipped with a VCSEL from Lumentum. The laser is designed to have multiple electrodes connected separately to the emitter array.
Figure 6: Apple iPad Pro LiDAR VCSEL
Source: SystemPlus Consulting
A wafer level chip scale packaging (WLCSP), five-side molded driver integrated circuit from Texas Instruments generates the pulse and drives the VCSEL power and beam shape. Finally, a diffractive optical element is assembled on top of the VCSEL to generate a dot pattern.
Barry Young