References and Resources

Acknowledgments: L.T.V. acknowledges support from DARPA Young Faculty Award D19AP00036. D.G.S. acknowledges support from AFOSR/EOARD grant FA9550-19-1-7005. G.L.B. acknowledges support under NSF contract CCF-0926148 and Air Force contract FA8651-06-C-0127.

Image credits, p. 31. Clockwise from top left: T. Shahan / Getty Images; G. Horváth et al., in G. Horváth (ed.), Polarized Light and Polarization Vision in Animal Sciences, pp. 147–170, Springer (2014), published with permission of Springer Nature; A. Stöckl et al., Proc. R. Soc. B 289, 20220758 (2022), CC-BY 4.0; J.P. Kumar, in A. Singh and M. Kango-Singh (eds), Molecular Genetics of Axial Patterning, Growth and Disease in Drosophila Eye, pp. 97–120, Springer (2020), published with permission of Springer Nature.

Complete References and Resources

Insect nanostructures

D.G. Stavenga et al. “Photoreceptor spectral tuning by colorful, multilayered facet lenses in long-legged fly eyes (Dolichopodidae),” J. Comp. Physiol. A 203, 23 (2017).

D.G. Stavenga et al. “Light on the moth-eye corneal nipple array of butterflies,” Proc. R. Soc. B 273, 661 (2005).

E.P. Ivanova et al. “Molecular organization of the nanoscale surface structures of the dragonfly Hemianax papuensis wing epicuticle,” PLOS ONE 8, e67893 (2013).

K. Lee et al. “Mesostructure of ordered corneal nano-nipple arrays: The role of 5–7 coordination defects,” Sci. Rep. 6, 28342 (2016).

C. Bernhard et al. “Function of the corneal nipples in the compound eyes of insects,” Acta Physiol. Scand. 58, 381 (1963).

Neural circuits and small brains

J. Haag et al. “Fly motion vision is based on Reichardt detectors regardless of the signal-to-noise ratio,” Proc. Natl. Acad. Sci. U.S.A. 101, 16333 (2004).

H.S. Cheong et al. “Multi-regional circuits underlying visually guided decision-making in Drosophila,” Curr. Opin. Neurobiol. 65, 77 (2020).

H. Haberkern and V. Jayaraman. “Studying small brains to understand the building blocks of cognition,” Curr. Opin. Neurobiol. 37, 59 (2016).

R. Behnia and C. Desplan. “Visual circuits in flies: Beginning to see the whole picture,” Curr. Opin. Neurobiol. 34, 125 (2015).

E.J. Warrant. “The remarkable visual capacities of nocturnal insects: Vision at the limits with small eyes and tiny brains,” Philos. Trans. R. Soc. B 372, 20160063 (2017).

Insect polarization vision

T. Heinloth et al. “Insect responses to linearly polarized reflections: Orphan behaviors without neural circuits,” Front. Cell. Neurosci. 12, 50 (2018).

R. Schwind. “Evidence for true polarization vision based on a two-channel analyzer system in the eye of the water bug, Notonecta glauca,” J. Comp. Physiol. A 154, 53 (1984).

A. Meglič et al. “Horsefly object-directed polarotaxis is mediated by a stochastically distributed ommatidial subtype in the ventral retina,” Proc. Natl. Acad. Sci. U.S.A. 116, 21843 (2019).

G. Horváth (ed.). Polarized Light and Polarization Vision in Animal Sciences, Springer (2014).

Insect spectral and coherent information processing

K. Hamdorf et al. “Insect visual pigment sensitive to ultraviolet light,” Nature 231, 458 (1971).

T.W. Cronin and M.J. Bok. “Photoreception and vision in the ultraviolet,” J. Exp. Biol. 219, 2790 (2016).

C.J. van der Kooi et al. “Evolution of insect color vision: From spectral sensitivity to visual ecology,” Annu. Rev. Entomol. 66, 435 (2021).

D.G. Stavenga. “Partial coherence and other optical delicacies of lepidopteran superposition eyes,” J. Exp. Biol. 209, 1904 (2006).

G.D. Bernard. “Evidence for visual function of corneal interference filters,” J. Insect Physiol. 17, 2287 (1971).

G. Belušič et al. “A cute and highly contrast-sensitive superposition eye—the diurnal owlfly Libelloides macaronius,” J. Exp. Biol. 216, 2081 (2013).

Insect polarization signal processing

T. Heinloth et al. “Insect responses to linearly polarized reflections: Orphan behaviors without neural circuits,” Front. Cell. Neurosci. 12, 50 (2018).

G. Belušič et al. “A cute and highly contrast-sensitive superposition eye—the diurnal owlfly Libelloides macaronius,” J. Exp. Biol. 216, 2081 (2013).

N.J. Marshall et al. “Polarisation signals: A new currency for communication,” J. Exp. Biol. 222, jeb134213 (2019).

R.A.R. Childers et al. “A hypothesis for robust polarization vision: An example from the Australian imperial blue butterfly, Jalmenus evagoras,” J. Exp. Biol. 226, jeb244515 (2023).

Visual acuity in insects

J. Kemppainen et al. “High-speed imaging of light-induced photoreceptor microsaccades in compound eyes,” Commun. Biol. 5, 203 (2022).

J. Theobald. “Insect flight: Navigating with smooth turns and quick saccades,” Curr. Biol. 27, R1125 (2017).

M. Juusola et al. “Microsaccadic sampling of moving image information provides Drosophila hyperacute vision,” eLife 6, e26117 (2017).

Engineered 3D and colorimetric imaging

J. Chang and G. Wetzstein. “Deep optics for monocular depth estimation and 3D object detection,” Proc. IEEE/CVF Intl. Conf. Comp. Vis. 2019, 10193 (2019).

Q. Guo et al. “Compact single-shot metalens depth sensors inspired by eyes of jumping spiders,” Proc. Natl. Acad. Sci. U.S.A. 116, 22959 (2019).

N. Antipa et al. “DiffuserCam: Lensless single-exposure 3D imaging,” Optica 5, 1 (2018).

V. Boominathan et al. “Recent advances in lensless imaging,” Optica 9, 1 (2022).

Polarimetric imaging

E. Arbabi et al. “Full-Stokes imaging polarimetry using dielectric metasurfaces,” ACS Photonics 5, 3132 (2018).

N.A. Rubin et al. “Matrix Fourier optics enables a compact full-Stokes polarization camera,” Science 365, eaax1839 (2019).

J. Feng et al. “Insect-inspired nanofibrous polyaniline multi-scale films for hybrid polarimetric imaging with scattered light,” Nanoscale Horiz. 7, 319 (2022).

X. Weng et al. “Non-line-of-sight full-Stokes polarimetric imaging with solution-processed metagratings and shallow neural networks,” ACS Photonics 10, 2570 (2023).

Y.Y. Schechner and N. Karpel. “Recovery of underwater visibility and structure by polarization analysis,” IEEE J. Ocean. Eng. 30, 570 (2005).

Encoded coherent signal processing

J.W. Goodman. Introduction to Fourier Optics, McGraw-Hill Physical and Quantum Electronics Series, W.H. Freeman (2005).

S. Divitt and L. Novotny. “Spatial coherence of sunlight and its implications for light management in photovoltaics,” Optica 2, 95 (2015).

G.S. Agarwal et al. “Coherence properties of sunlight,” Opt. Lett. 29, 459 (2004).

Y. Zhou et al. “Flat optics for image differentiation,” Nat. Photonics 14, 316 (2020).

H. Mashaal et al. “First direct measurement of the spatial coherence of sunlight,” Opt. Lett. 37, 3516 (2012).

G. Wetzstein et al., “Inference in artificial intelligence with deep optics and photonics,” Nature 588, 39 (2020).

B. Muminov and L.T. Vuong. “Fourier optical preprocessing in lieu of deep learning,” Optica 7, 1079 (2020).

Compressive sensing

E.J. Candes and M.B. Wakin. “An introduction to compressive sampling,” IEEE Signal Process. Mag. 25, 21 (2008).

J. Feng et al. “Polarimetric compressed sensing from insect-inspired meso-ordered encoders,” submitted.

A. Liutkus et al. “Imaging with nature: Compressive imaging using a multiply scattering medium,” Sci. Rep. 4, 5552 (2014).

T. Wang et al. “Image sensing with multilayer nonlinear optical neural networks,” Nat. Photonics 17, 408 (2023).

B. Muminov and L.T. Vuong. “Vortex Fourier encoding for small-brain classification of MNIST digits with no hidden layers,” in Proc. SPIE Vol. 11388, Image Sensing Technologies: Materials, Devices, Systems, and Applications VII, 113880T, pp. 79–84 (2020).

Toward real-time applications for UAS

G.L. Barrows et al. “Biologically inspired visual sensing and flight control,” Aeronaut. J. 107, 159 (2003).

G.L. Barrows and C. Neely. “Mixed-mode VLSI optic flow sensors for in-flight control of a micro air vehicle,” in Proc. SPIE Vol. 4109, Critical Technologies for the Future of Computing, pp. 52–63 (2000).

L.F. Tammero and M.H. Dickinson. “The influence of visual landscape on the free flight behavior of the fruit fly Drosophila melanogaster,” J. Exp. Biol. 205, 327 (2002).

A.M. Hyslop and J.S. Humbert. “Autonomous navigation in three-dimensional urban environments using wide-field integration of optic flow,” J. Guid. Control Dyn. 33, 147 (2010).

F. Rodriguez et al. “Sequentially trained, shallow neural networks for real-time 3D odometry,” in Proc. SPIE, Artificial Intelligence for Security and Defence Applications VII (2023).

H.G. Krapp et al. “Dendritic structure and receptive-field organization of optic flow processing interneurons in the fly,” J. Neurophysiol. 79, 1902 (1998).

G.L. Barrows et al. “Vision based hover in place,” US Patent 20120197461A1, Aug. 02, 2012.
