Multiband Imaging in One Device: There is a growing demand for integrated imaging systems that can simultaneously capture both red-green-blue (RGB) visible light and near-infrared (NIR) wavelengths, which carry range-finding, or depth-of-field, information. In medicine, for example, the ability to capture all of these wavelengths simultaneously with one compact device would make it easier and less time-consuming to identify and pinpoint a wide range of targets in different parts of the body, such as pathological lesions. Until now, however, attempts to detect both RGB and NIR signals on the same chip have compromised one or the other. Researchers at Olympus will detail how they used 3D wafer-stacking technology to integrate two separate CMOS imagers into one device, each optimized for either RGB or NIR through a careful balance of active silicon thickness and pixel size. The top imager is optimized for visible detection with an array of small pixels and a thinned 3 µm active silicon layer. NIR signals pass through it to reach the bottom imager, which is optimized for NIR detection with an array of larger pixels and thick active silicon. The researchers say there is no degradation in color reproduction, sensitivity, or resolution.

Image (a) is a conceptual drawing of the multi-storied photodiode CMOS imager, and (b) is a scanning electron microscope image showing a cross-section of an actual device. The two images underneath were obtained by the image sensor. The color image at left is a visible-light RGB image reconstructed from the signals of the photodiode arrays in the device's top and bottom substrates, while the black-and-white image at right is an infrared image obtained using the signals from the bottom photodiode array.

(Paper 30.1, Multi-Storied Photodiode CMOS Image Sensor for Multiband Imaging with 3D Technology; Y. Takemoto et al., Olympus)
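The wavelength selectivity of the stacked design follows from the strong wavelength dependence of silicon's absorption depth: visible photons are absorbed within a few micrometers, while NIR photons penetrate tens of micrometers. The sketch below illustrates this with a simple Beer-Lambert estimate. The absorption depths and the bottom-layer thickness are approximate values assumed for illustration only, not parameters reported by the authors; the only figure taken from the summary above is the roughly 3 µm top-layer thickness.

```python
# Rough sketch of why a thinned top photodiode layer captures visible light while
# passing NIR to a thicker bottom layer. Absorption depths are approximate textbook
# values for crystalline silicon, not figures from the Olympus paper.
import math

# Approximate 1/e absorption depths in silicon (micrometers), by wavelength band.
absorption_depth_um = {
    "blue (450 nm)": 0.4,
    "green (550 nm)": 1.5,
    "red (650 nm)": 3.5,
    "NIR (850 nm)": 18.0,
}

def fraction_absorbed(thickness_um: float, depth_um: float) -> float:
    """Beer-Lambert estimate of the fraction of light absorbed in a silicon layer."""
    return 1.0 - math.exp(-thickness_um / depth_um)

top_layer_um = 3.0      # thinned active silicon of the top (RGB) imager, per the summary
bottom_layer_um = 20.0  # assumed thickness for the "thick" bottom (NIR) imager, illustrative only

for band, depth in absorption_depth_um.items():
    absorbed_top = fraction_absorbed(top_layer_um, depth)
    # Light that escapes the top layer is what reaches the bottom NIR imager.
    reaching_bottom = 1.0 - absorbed_top
    absorbed_bottom = reaching_bottom * fraction_absorbed(bottom_layer_um, depth)
    print(f"{band:>15}: top absorbs {absorbed_top:5.1%}, bottom absorbs {absorbed_bottom:5.1%}")
```

Under these assumed numbers, the 3 µm top layer absorbs nearly all blue and green light and roughly half of the red, but only a small fraction of the NIR, most of which is left for the thicker bottom imager; this is the qualitative behavior the stacked architecture exploits.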