A recent study published in Nature proposes an integrated scanning high-speed imaging sensor, called a meta-imaging sensor, that achieves aberration-corrected high-speed 3D photography across a broad range of applications without requiring additional hardware modifications.
Planar digital image sensors underpin applications in a wide variety of fields, and the pixel counts of digital images have risen rapidly in recent years. However, spatially non-uniform optical aberrations, caused by imperfect lenses or external disturbances, severely limit the practical performance of imaging systems.
Rather than directly recording a 2D intensity projection, the meta-imaging sensor captures finely sampled 4D light-field distributions through a scanning micro-lens array. This allows complex-modulated images to be formed flexibly and accurately during post-processing.
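The 4D measurement can be pictured as re-indexing the raw sensor frame into spatial and angular coordinates. The sketch below, using made-up toy dimensions rather than the real sensor's, shows the reshaping step and how summing over the angular axes recovers an ordinary 2D photograph:

```python
import numpy as np

# Hypothetical dimensions: a 4 x 4 micro-lens grid, each lens covering
# 3 x 3 sensor pixels (the real sensor is far larger).
n_lens_y, n_lens_x = 4, 4   # spatial samples (one per micro-lens)
n_u, n_v = 3, 3             # angular samples (pixels behind each lens)

# Simulated raw 2D frame recorded behind the micro-lens array.
raw = np.arange(n_lens_y * n_u * n_lens_x * n_v, dtype=float)
raw = raw.reshape(n_lens_y * n_u, n_lens_x * n_v)

# Re-index the 2D frame into the 4D light field L(y, x, u, v):
# split rows into (lens row, u) and columns into (lens column, v).
light_field = raw.reshape(n_lens_y, n_u, n_lens_x, n_v).transpose(0, 2, 1, 3)

# Summing over the angular axes recovers an ordinary 2D photograph.
photo_2d = light_field.sum(axis=(2, 3))
print(light_field.shape)  # (4, 4, 3, 3)
print(photo_2d.shape)     # (4, 4)
```

Because the angular information is kept rather than integrated away on the chip, refocusing and aberration correction can be applied to `light_field` after capture.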
Using just a single spherical lens, the meta-imaging sensor captures photographs of up to one gigapixel, lowering system and optics costs by orders of magnitude. It provides aberration correction across a field of view of about a thousand arcseconds without compromising acquisition speed.
Optical Constraints of Two-Dimensional Imaging Sensors
Two-dimensional imaging sensors have transformed numerous industries, including mobile technology, autonomous vehicles, industrial inspection, medical diagnostics, surveillance, biology, and astronomy.
The pixel counts of digital sensors have increased significantly over the past decade thanks to the explosive growth of the semiconductor industry. However, most imaging systems now operate at a practical bottleneck imposed by optics rather than electronics.
For instance, an imaging system's effective pixel count is typically constrained to the megapixel range even with a gigapixel sensor. This is because optical aberrations, introduced by imperfect lenses or external disturbances, spread the light emitted from a single point over a large area of the 2D sensor.
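This spreading is commonly modelled as convolution with a point spread function (PSF). The toy example below, with a hypothetical uniform 3 x 3 PSF (not from the paper), shows how a point source's energy is conserved but dispersed, reducing effective resolution:

```python
import numpy as np

# A point source imaged by an ideal system lands on a single pixel.
scene = np.zeros((9, 9))
scene[4, 4] = 1.0

# A hypothetical aberrated point spread function (PSF) spreads that
# energy uniformly over a 3 x 3 neighbourhood instead.
psf = np.ones((3, 3)) / 9.0

# Direct 2D convolution (adequate for this tiny example).
blurred = np.zeros_like(scene)
for dy in (-1, 0, 1):
    for dx in (-1, 0, 1):
        blurred += psf[dy + 1, dx + 1] * np.roll(np.roll(scene, dy, axis=0), dx, axis=1)

# Total energy is conserved, but the peak drops to 1/9: effective
# resolution falls even though the pixel count is unchanged.
print(blurred[4, 4])   # 1/9 of the original peak
```

With a real aberrated lens the PSF also varies across the field, which is exactly the spatially non-uniform case that defeats a single global correction.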
Various degrees of freedom of the light field, including local coherence and depth, are lost when a three-dimensional (3D) scene is projected onto a 2D plane. As a result, obtaining high-accuracy depth maps with an integrated sensor becomes challenging.
Limitations of Current Optical Engineering Techniques for Aberration Correction
For hundreds of years, optical engineers have pursued ideal imaging systems that correct aberrations with many precisely fabricated lenses arranged in sequence. The degrees of freedom of an optical system are expressed by its space-bandwidth product (SBP), which sets a diffraction-limited upper bound on the effective pixel count. Increasing the SBP, however, causes the difficulty of optical design and manufacturing to grow exponentially.
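A back-of-envelope SBP estimate counts how many diffraction-limited spots fit across the sensor. All numbers below are illustrative assumptions, not values from the paper:

```python
# Illustrative, assumed values (not taken from the paper):
wavelength = 550e-9   # green light, metres
aperture_d = 0.01     # 10 mm lens aperture
focal_len = 0.05      # 50 mm focal length
sensor_side = 0.024   # 24 mm sensor side length

# Diffraction-limited spot (Airy disk) diameter on the sensor.
spot = 2.44 * wavelength * focal_len / aperture_d

# SBP: number of resolvable spots that fit across the sensor, squared.
sbp = (sensor_side / spot) ** 2
print(f"{sbp / 1e6:.1f} megapixel-equivalent SBP")
```

Even a modest lens therefore supports only tens of megapixels of genuinely resolvable detail; aberrations push the practical figure far lower still.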
Free-form optics and metalenses can mitigate this issue by producing optimized lens surfaces, given sufficient machining precision at scale. Image-deblurring algorithms increase image contrast by accurately estimating the point spread function, and variants with an implicit aperture preserve more information by reducing the nulls in the frequency domain.
However, recovering high-frequency information lost to a low modulation transfer function is extremely challenging, and these methods typically require specific calibration data for spatially non-uniform aberrations. Moreover, all optical-engineering techniques for aberration correction remain susceptible to dynamic environmental aberrations, even at shallow depths of field.
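As a concrete illustration of PSF-based deblurring and its limits, the sketch below applies Wiener deconvolution to a toy 1D signal with an assumed, perfectly known blur kernel. Frequencies near the kernel's spectral null cannot be fully recovered, which is precisely where high-frequency detail is lost:

```python
import numpy as np

n = 64
scene = np.zeros(n)
scene[20], scene[40] = 1.0, 0.5   # toy 1D "scene" with two point sources

# Hypothetical, perfectly known blur kernel, centred at index 0 to fit
# a circular-convolution model.
kernel = np.zeros(n)
kernel[[63, 0, 1]] = [0.25, 0.5, 0.25]

H = np.fft.fft(kernel)            # transfer function (has a spectral null)
blurred = np.fft.ifft(np.fft.fft(scene) * H).real

# Wiener filter: invert H where it is strong, damp it near the null
# instead of amplifying noise. The noise-to-signal ratio is assumed.
nsr = 1e-3
wiener = np.conj(H) / (np.abs(H) ** 2 + nsr)
restored = np.fft.ifft(np.fft.fft(blurred) * wiener).real

# The restored peak is much closer to the original than the blurred one,
# but content at the spectral null is never fully recovered.
```

This is a generic textbook technique used here for illustration, not the specific algorithm of the paper.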
Adaptive Optics for Active Aberration Corrections
Adaptive optics can actively compensate for aberrations by using a spatial light modulator or deformable mirror array to redirect the light beams emanating from one point, at various angles, to the same location on the sensor. Aberrated wavefronts can be measured with a wavefront sensor or estimated from specific assessment metrics.
Because of spatially non-uniform aberrations, the effective field of view (FOV) of existing adaptive-optics techniques is quite narrow. Atmospheric turbulence restricts the corrected FOV to a diameter of around 40 arcseconds, which is insufficient for a large synoptic survey telescope.
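The narrow corrected FOV follows from the correction being a single conjugate phase. The idealized 1D sketch below (toy polynomial aberrations and a perfect wavefront sensor, both assumptions) cancels the on-axis wavefront exactly, while a nearby field point with a slightly different aberration is left with a residual error:

```python
import numpy as np

# Toy 1D wavefronts over a normalised pupil coordinate x in [-1, 1].
x = np.linspace(-1.0, 1.0, 32)

# Hypothetical on-axis phase aberration (radians) and a slightly
# different aberration seen by a neighbouring field point.
aberration_on = 2.0 * x**2 - 0.5 * x**3
aberration_off = 2.0 * (x + 0.1)**2 - 0.5 * (x + 0.1)**3

# Assume a perfect wavefront sensor; the deformable mirror applies the
# conjugate of the measured on-axis wavefront.
mirror = -aberration_on

residual_on = aberration_on + mirror      # exactly flat on-axis
residual_off = aberration_off + mirror    # leftover error off-axis
print(np.max(np.abs(residual_on)))        # 0.0
```

One mirror shape can flatten only one direction's wavefront, which is why hardware adaptive optics corrects such a small patch of sky at a time.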
The design of portable devices or lightweight systems is challenging because of the complexity, cost, and mass of most adaptive optics systems today.
Development of Meta-Imaging Sensor for Aberration-Corrected 3D Imaging
Wu et al. presented an integrated scanning high-speed imaging framework, called a meta-imaging sensor, that combines software and hardware to deliver aberration-corrected high-speed 3D imaging at low cost.
The meta-imaging sensor measures the light field at high speed through a scanning coded micro-lens array, achieving significantly higher accuracy than conventional light-field techniques.
The researchers applied digital adaptive optics on the integrated sensor and achieved high-performance 3D imaging with multisite aberration correction. To avoid motion artifacts while preserving imaging speed up to the camera frame rate, they devised an optical-flow-based correction algorithm that exploits spatiotemporal continuity.
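The paper's correction uses a full optical-flow algorithm; as a much-simplified stand-in, the sketch below estimates the global shift between two scan frames by phase correlation and undoes it before the frames are fused:

```python
import numpy as np

rng = np.random.default_rng(0)
frame_a = rng.random((16, 16))
# Simulated inter-frame motion: a circular shift of (+2, -3) pixels.
frame_b = np.roll(frame_a, shift=(2, -3), axis=(0, 1))

# Phase correlation: the peak of the inverse FFT of the normalised
# cross-power spectrum sits at the translation between the frames.
Fa, Fb = np.fft.fft2(frame_a), np.fft.fft2(frame_b)
cross = Fa * np.conj(Fb)
corr = np.fft.ifft2(cross / np.abs(cross)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)

# Wrap the estimates into a signed range, then undo the motion.
dy = dy - 16 if dy > 8 else dy
dx = dx - 16 if dx > 8 else dx
aligned = np.roll(frame_b, shift=(dy, dx), axis=(0, 1))
```

Real optical flow estimates a dense, per-pixel motion field rather than a single global shift; this sketch only conveys the register-then-fuse idea.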
They also carried out quantitative studies across applications in industrial inspection, photography, video surveillance, autonomous driving, and astronomy to demonstrate the meta-imaging sensor's capabilities.
Potential of the Proposed Meta-Imaging Sensor
The meta-imaging sensor offers nearly an order-of-magnitude resolution improvement, particularly under severe non-uniform aberrations. It enables multisite aberration correction on an 80-cm-aperture telescope over a field of view of more than 1,000 arcseconds, opening the door to high-resolution ground-based synoptic surveys.
For various industrial applications, megapixel depth maps can be acquired simultaneously at millisecond scale, with greater precision and resolution than current light-field cameras.
The proposed meta-imaging sensor opens new frontiers for computational imaging in practical, general-purpose applications. By digitally applying flexible, precise optical modulations to incoherent light, it delivers orders-of-magnitude performance gains that are unreachable for conventional 2D sensors.
Wu, J., Guo, Y., Deng, C., Zhang, A., Qiao, H., Lu, Z., Xie, J., Fang, L., & Dai, Q. (2022). An integrated imaging sensor for aberration-corrected 3D photography. Nature, 1–10. https://www.nature.com/articles/s41586-022-05306-8