For the first time, a team of researchers has managed to combine two technologies extensively used in applications such as bio-imaging, optical communications, and Light Detection and Ranging (LIDAR) systems that scan the surroundings of self-driving trucks and cars.
In a joint effort between the U.S. Department of Energy’s (DOE) Argonne National Laboratory and Harvard University, researchers effectively crafted a metasurface-based lens on a Micro-Electro-Mechanical System (MEMS) platform. The outcome is a new infrared light-focusing system that integrates the best features of both technologies while decreasing the size of the optical system.
Metasurfaces can be structured at the nanoscale to function like lenses. These metalenses were pioneered by Federico Capasso, Harvard’s Robert L. Wallace Professor of Applied Physics, and his group at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS). The lenses are quickly finding applications because they are much thinner and less bulky than conventional lenses, and they can be fabricated with the same technology used to make computer chips. MEMS devices, meanwhile, are small mechanical devices that contain miniature, movable mirrors.
“These devices are key today for many technologies. They have become technologically pervasive and have been adopted for everything from activating automobile airbags to the global positioning systems of smartphones,” said Daniel Lopez, Nanofabrication & Devices Group Leader at Argonne’s Center for Nanoscale Materials, a DOE Office of Science User Facility.
Lopez, Capasso and four co-authors explain how they fabricated and tested their new device in an article in APL Photonics, titled “Dynamic metasurface lens based on MEMS technology.” The device measures 900 µm in diameter and 10 µm in thickness (a human hair is roughly 50 µm thick).
The collaboration’s continuing work to further develop unique applications for the two technologies is conducted at Argonne’s Center for Nanoscale Materials, SEAS and the Harvard Center for Nanoscale Systems, which is part of the National Nanotechnology Coordinated Infrastructure.
In the technologically merged optical system, MEMS mirrors reflect scanned light, which the metalens then focuses without the need for an extra optical component such as a focusing lens. The challenge that the Argonne/Harvard team solved was to incorporate the two technologies without compromising their performance.
The ultimate goal would be to fabricate all parts of an optical system — the light source, the MEMS, and the metasurface-based optics — with the same technology used to manufacture electronics today.
“Then, in principle, optical systems could be made as thin as credit cards,” Lopez said.
These lens-on-MEMS devices could advance the LIDAR systems used to guide self-driving cars. Current LIDAR systems, which scan their surroundings for obstacles, are, by contrast, several feet in diameter.
“You need specific, big, bulky lenses, and you need mechanical objects to move them around, which is slow and expensive,” said Lopez.
This first successful integration of metalenses and MEMS, made possible by their highly compatible fabrication technologies, will bring high speed and agility to optical systems, as well as unprecedented functionalities.
This research also received support from the Air Force Office of Scientific Research, the National Science Foundation and the Singapore Agency for Science, Technology and Research’s National Science Scholarship.