Metasurfaces Expand a Camera’s Vision

Two researchers holding a metasurface

Xingjie Ni (left) and Zhiwen Liu display a camera sensor integrated with a 3-mm × 3-mm metasurface, which turns a conventional camera into a hyperspectro-polarimetric camera. [Image: Kate Myers, Penn State]

All light carries spectral and polarization data that human eyes cannot register. Efforts to detect those data along with intensity usually require bulky and expensive equipment.

Now, a team at a US university has designed a metasurface element that attaches to a standard camera and simultaneously captures intensity, hyperspectral and full-Stokes polarization information of near-infrared light (Sci. Adv., doi: 10.1126/sciadv.adp5192). Paired with the metasurface encoder is a machine-learning neural network decoder that recovers all the image data in real time.

Metasurfaces to the rescue

According to the researchers at Pennsylvania State University, conventional imaging systems that capture spectral and polarization data rely on diffractive optics, stacked layers of filters or linear micropolarizer arrays, and those devices detect only linear polarization states over a few wavelength channels. Other groups have tried using a metasurface lens or rotating metasurface filters to analyze the incoming light, but those approaches come with drawbacks of their own.

The Penn State metasurface is organized as a grid on three scales. At the smallest scale are “meta-atoms,” which lead author Xingjie Ni describes as “the fundamental building block or unit cell of the metasurface.”

“These smallest repeating structures, when arranged in specific patterns, determine the metasurface's overall optical properties,” Ni adds. The team selected 100 patterns from an existing meta-atom library, with an eye toward the patterns’ differing responses to polarization and wavelength. “The more distinct the responses, the better the resolution and the less crosstalk we achieve,” Ni says. Ultimately, the researchers chose “split-ring” and “split-door” designs, which reveal sharp spectral changes from Fano resonances and clearly distinguish between circular polarizations.
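The article does not describe how the 100 patterns were chosen from the library, only the goal: responses as distinct as possible, to improve resolution and reduce crosstalk. As an illustration of that selection criterion, here is a generic greedy max-min procedure on a synthetic library; the library, channel count and method are all assumptions, not the authors' procedure.

```python
# Illustrative only: greedily pick k patterns from a candidate library so
# that each new pick is as uncorrelated as possible with those already
# chosen. The synthetic library stands in for measured/simulated
# spectro-polarimetric responses of real meta-atom designs.
import numpy as np

rng = np.random.default_rng(1)
library = rng.uniform(size=(500, 40))  # 500 candidates x 40 response channels

def greedy_select(responses, k):
    """Pick k rows with low mutual correlation (greedy max-min selection)."""
    # Normalize rows so that correlation reduces to a dot product.
    r = responses / np.linalg.norm(responses, axis=1, keepdims=True)
    chosen = [0]                        # seed with an arbitrary first pattern
    while len(chosen) < k:
        corr = np.abs(r @ r[chosen].T)  # |correlation| with the chosen set
        worst = corr.max(axis=1)        # each candidate's closest match
        worst[chosen] = np.inf          # never re-pick a chosen pattern
        chosen.append(int(np.argmin(worst)))
    return chosen

picked = greedy_select(library, 100)
print(len(set(picked)))  # 100 distinct patterns
```

More distinct (less correlated) responses make the downstream decoding problem better conditioned, which is the “less crosstalk” Ni refers to.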

Next, the researchers arranged the designs into 100 arrays of meta-atoms, each 9 μm × 9 μm, and then placed the arrays into 90 μm × 90 μm “superpixels.” Finally, the team assembled a grid of 20 × 20 superpixels to make the metasurface. The entire collection was created by etching amorphous silicon on a substrate of indium–tin–oxide glass.
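The three-level hierarchy above is self-consistent, as a quick check of the stated dimensions shows (the variable names here are illustrative, not from the paper):

```python
# Sanity check of the grid hierarchy using the dimensions in the article.
META_ATOM_ARRAY_UM = 9       # each meta-atom array is 9 um x 9 um
ARRAYS_PER_SUPERPIXEL = 100  # 100 distinct designs per superpixel
SUPERPIXEL_UM = 90           # each superpixel is 90 um x 90 um
GRID = 20                    # the metasurface is 20 x 20 superpixels

# 100 arrays tile one superpixel as a 10 x 10 grid: 10 * 9 um = 90 um.
arrays_per_side = int(ARRAYS_PER_SUPERPIXEL ** 0.5)
assert arrays_per_side * META_ATOM_ARRAY_UM == SUPERPIXEL_UM

# Active superpixel area: 20 * 90 um = 1800 um, i.e. 1.8 mm on a side.
# (The 3-mm x 3-mm chip in the photo presumably includes inactive margin;
# the article does not specify.)
side_um = GRID * SUPERPIXEL_UM
print(side_um)  # 1800
```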

Sorting out the information

The metasurface encodes the spectral and polarization data into the intensity pattern that a standard camera picks up, so how does it all get decoded? The team built training and validation datasets and used them to train the machine-learning decoder.
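To see why decoding is possible at all, consider a simplified linear picture (this is a sketch of the general principle, not the authors' pipeline): each superpixel's 100 meta-atom arrays apply distinct, known responses to the incoming light, so the 100 recorded intensities are approximately a linear mixture of the unknown spectro-polarimetric channels. With known responses, the mixture can be inverted; here by least squares, whereas the paper trains a neural-network decoder, which also handles noise and nonidealities. All numbers below are made up.

```python
# Minimal linear encode/decode sketch for one superpixel.
import numpy as np

rng = np.random.default_rng(0)

n_arrays = 100    # meta-atom arrays per superpixel (one measurement each)
n_channels = 40   # e.g. 10 wavelength bins x 4 Stokes parameters (assumed)

# Known encoder matrix: row i is array i's response to each channel.
# Distinct (low-correlation) rows make the inverse problem well posed.
T = rng.uniform(size=(n_arrays, n_channels))

s_true = rng.uniform(size=n_channels)  # unknown spectro-polarimetric state
intensities = T @ s_true               # what the camera pixels record

# Decode: least-squares inversion, a linear stand-in for the NN decoder.
s_hat, *_ = np.linalg.lstsq(T, intensities, rcond=None)
print(np.allclose(s_hat, s_true))      # True in this noise-free toy case
```

With 100 measurements and fewer unknowns per superpixel, the system is overdetermined, which is what lets a single exposure yield intensity, spectrum and full-Stokes polarization at once.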

As a final demonstration, the researchers performed “hyperspectro-polarization imaging” on a scarab beetle under 750-nm laser illumination. The sequence of images shows that at the shorter wavelengths, the left- and right-circularly polarized images of the beetle’s striped back look different, but the difference between the two polarizations fades out at 890 nm.

“Currently, we are developing other metasurface encoders to capture additional aspects of the light field using standard cameras,” Ni says. “For instance, we are working on a metasurface that can encode the temporal information of incoming light. Before our prototype becomes fully practical, there are still technical challenges to address, such as achieving repeatable integration of the metasurface encoder with the camera sensor and reducing manufacturing costs.”

Possible future applications of the technology, according to Ni, include biomedical imaging and precision agriculture. In the latter, spectral and polarization data from crop imaging could help assess plants’ needs for irrigation and fertilizers.

Publish Date: 11 September 2024
