[Patent illustration | Source: USPTO]
Apple's U.S. Patent No. 8,497,897 for "Image capture using luminance and chrominance sensors" describes a unique multi-sensor camera system that can be used in portable devices like the iPhone.
The main thrust of the patent is to combine three separate images: one generated by a luminance sensor and two generated by the chrominance sensors that flank it. Each sensor sits behind its own "lens train," or lens assembly, that directs light onto the sensor surface. The document notes that the sensors can be disposed on a single circuit board or on separate ones.
Important to the system's functionality is the sensor layout. In most embodiments, the luminance sensor is flanked on either side by the chrominance sensors. This positioning allows the camera to compare information sourced from the three generated images. For example, an image processing module can take raw data from the three sensors, comprising luminance, color, and other data, to form a composite color picture. The resulting photo would be of higher quality than one produced by a single unified sensor.
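The patent does not spell out the fusion math, but the idea maps naturally onto standard luma-chroma composition. Below is a minimal sketch assuming the three sensor planes have already been aligned and resampled to a common resolution; the function name and the BT.601 conversion coefficients are illustrative choices, not details from the patent.

```python
import numpy as np

def compose_color_image(luma, chroma_cb, chroma_cr):
    """Fuse a full-resolution luminance (Y) plane with two chroma planes
    (Cb, Cr) into one RGB composite. Assumes all three inputs are aligned,
    equally sized 2-D arrays with values in [0, 255]."""
    y = luma.astype(np.float32)
    cb = chroma_cb.astype(np.float32) - 128.0  # center chroma on zero
    cr = chroma_cr.astype(np.float32) - 128.0

    # Standard BT.601 YCbCr -> RGB conversion (not specified by the patent)
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    rgb = np.stack([r, g, b], axis=-1)
    return np.clip(rgb, 0, 255).astype(np.uint8)
```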
To execute an effective comparison of the two chrominance sensor images, a stereo disparity map is created so that differences, or redundancies, between them can be measured. Depending on the situation and system setup (filters, pixel count, etc.), the stereo map is processed and combined with data from the luminance sensor to create an accurate scene representation.
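The patent leaves the matching algorithm open. As one concrete stand-in, OpenCV's block matcher can build a disparity map from two single-channel images; the file names below are hypothetical placeholders for the two chrominance sensor outputs.

```python
import cv2

# Hypothetical file names standing in for the two chrominance sensor images;
# StereoBM requires single-channel 8-bit inputs.
left = cv2.imread("chroma_left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("chroma_right.png", cv2.IMREAD_GRAYSCALE)

# numDisparities must be a multiple of 16 and blockSize an odd number;
# good values depend on the baseline, pixel count, and scene depth range.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right)            # fixed-point, scaled by 16
disparity = disparity.astype("float32") / 16.0      # per-pixel offsets in pixels
```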
[Patent illustration | Source: USPTO]
The stereo map also solves a "blind spot" issue that arises when using three sensors with three lens trains. The patent offers the example of an object in the foreground obscuring an object in the background (as seen in the first illustration). Depending on the scene, color information may be non-existent for one sensor, which would negatively affect a photo's resolution.
To overcome this inherent flaw, one embodiment proposes that the two chrominance sensors be offset so that their blind regions do not overlap. If a nearby object creates a blind region for the first sensor, the offset allows the image processor to replace the compromised image data with information from the second sensor.
Further, the image processor can use the stereo disparity map derived from the two chrominance sensor images to compensate for distortion between the viewpoints.
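Taken together, those two steps amount to warping one chroma plane into the other sensor's viewpoint and falling back on the second sensor wherever the first is occluded. The sketch below illustrates the idea with a toy nearest-pixel warp; the function name, mask, and shift convention are assumptions, not the patent's actual compensation method.

```python
import numpy as np

def fill_blind_region(chroma_a, chroma_b, disparity, blind_a):
    """Replace sensor A's compromised chroma (flagged by the boolean
    blind_a mask) with sensor B's chroma shifted into A's viewpoint.
    disparity holds per-pixel offsets along the baseline axis, in pixels;
    the sign convention depends on sensor geometry. This is a toy
    nearest-pixel warp; the patent does not specify the resampling."""
    h, w = chroma_a.shape
    ys, xs = np.indices((h, w))
    src_x = np.clip(np.round(xs + disparity).astype(int), 0, w - 1)
    warped_b = chroma_b[ys, src_x]  # sensor B's data as seen from A's position
    return np.where(blind_a, warped_b, chroma_a)
```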
Other embodiments call for varied resolutions or lens configurations for the chrominance and luminance sensors, including larger apertures, different filters, or modified image data collection. These features could enhance low-light photography, for example, by compensating for a lack of luminance with information provided by a modified chrominance sensor. Here, as with the above embodiments, the image processor must compile data from all three sensors.
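One way to picture that compensation is a per-pixel blend: where the dedicated luminance sensor's reading sits near its noise floor, lean on a brightness estimate derived from the larger-aperture chrominance sensor. The weighting scheme and threshold below are illustrative guesses, not values from the patent.

```python
import numpy as np

def blend_low_light_luma(luma, chroma_brightness, noise_floor=16.0):
    """Blend the luminance sensor's signal with a brightness estimate
    from a modified (e.g., larger-aperture) chrominance sensor. Dim
    pixels fall back toward the chroma-derived estimate."""
    y = luma.astype(np.float32)
    proxy = chroma_brightness.astype(np.float32)
    # Confidence in the luma sensor ramps from 0 to 1 above the noise floor.
    weight = np.clip(y / (2.0 * noise_floor), 0.0, 1.0)
    return weight * y + (1.0 - weight) * proxy
```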
While Apple is unlikely to implement the three-sensor camera tech anytime soon, a future iPhone could theoretically carry such a platform.
Apple's luminance and chrominance sensor patent was first filed in 2010 and credits David S. Gere as its inventor.
Source: AppleInsider (by Mikey Campbell)