
nature communications

Article https://doi.org/10.1038/s41467-022-33072-8

Miniature optoelectronic compound eye camera

Zhi-Yong Hu1, Yong-Lai Zhang1, Chong Pan2, Jian-Yu Dou2, Zhen-Ze Li1, Zhen-Nan Tian1, Jiang-Wei Mao1, Qi-Dai Chen1 & Hong-Bo Sun1,3

Received: 10 November 2021
Accepted: 31 August 2022

Inspired by insect compound eyes (CEs) that feature unique optical schemes for imaging, there has recently been growing interest in developing optoelectronic CE cameras with comparable size and functions. However, considering the mismatch between the complex 3D configuration of CEs and the planar nature of available imaging sensors, it is currently challenging to reach this end. Here, we report a paradigm for miniature optoelectronic integrated CE cameras by manufacturing polymer CEs with 19–160 logarithmic-profile ommatidia via femtosecond laser two-photon polymerization. In contrast to μ-CEs with spherical ommatidia that suffer from defocusing problems, the as-obtained μ-CEs with logarithmic ommatidia permit direct integration with a commercial CMOS detector, because the depth-of-field and focus range of all the logarithmic ommatidia are significantly increased. The optoelectronic integrated μ-CE camera enables large field-of-view imaging (90°), spatial position identification, and sensitive trajectory monitoring of moving targets. Moreover, the miniature μ-CE camera can be integrated with a microfluidic chip and serve as an on-chip camera for real-time microorganism monitoring. The insect-scale optoelectronic μ-CE camera provides a practical route for integrating well-developed planar imaging sensors with complex micro-optics elements, holding great promise for cutting-edge applications in endoscopy and robot vision.

Natural compound eyes (CEs) of insects are advanced and complex imaging systems that consist of numerous closely packed and hemispherically distributed ommatidia. Each ommatidium works as an independent photosensitive unit, and all of them cooperate as a whole to realize prey recognition and enemy defense. Notably, insect CEs feature small size, distortion-free imaging, wide field-of-view (FOV), and sensitive motion tracking ability1–4, which inspires the innovation of artificial CEs, aiming to overcome the limitations of existing imaging technologies and promote their performance in medical endoscopy, panoramic imaging, micro navigation and robot vision5–11.

In the past decade, great efforts have been devoted to the development of artificial CEs through bionic manufacturing. Planar CE cameras were first implemented by integrating a microlens array (MLA) with commercial CCD/CMOS detectors, by which high-resolution imaging is realized, whereas the FOV is limited due to the planar structure12. To achieve a large FOV, three-dimensional (3D) CEs with insect-like structures have been successfully prepared with the help of advanced micro-nano fabrication technologies. For instance, Lee et al. prepared a bionic CE that is anatomically as well as functionally close to natural CEs by integrating curved MLAs, polymer cones, and waveguide cores fabricated via microlens templating, reconfigurable microtemplating, and self-writing in a photoresist, respectively13,14.

1 State Key Laboratory of Integrated Optoelectronics, College of Electronic Science and Engineering, Jilin University, 2699 Qianjin Street, 130012 Changchun, China. 2 Fluid Mechanics Key Laboratory of Ministry of Education, Institute of Fluid Mechanics, Beihang University, 100191 Beijing, China. 3 State Key Laboratory of Precision Measurement Technology and Instruments, Department of Precision Instrument, Tsinghua University, Haidian District, 100084 Beijing, China. e-mail: [email protected]


Besides, 3D miniature CEs with hundreds of ommatidia have also been fabricated via femtosecond laser additive/subtractive manufacturing15,16. However, these CEs can only function as unique 3D MLAs; their imaging performance is usually evaluated with the image acquisition system of a microscope by continuously tuning the image distance of ommatidia at different positions. The lack of integrated photodetectors significantly limits their portability and real-time monitoring ability.

To develop an optoelectronic integrated CE system, independent ommatidia (microlenses) can be attached to individual photodetectors. The as-formed camera array with a curved surface distribution can work together and function as an optoelectronic CE. Typically, Floreano et al. prepared an artificial CE camera (2.2 cm3 and 1.75 g) by cutting and assembling hard microlenses and photodetectors, achieving an enhanced FOV in a single direction17. Rogers et al. combined a flexible lens array with a deformable hemispherical silicon photodiode array, forming a CE camera (14.72 mm in size) with a FOV as high as 140–180°2,3. Since each ommatidium only contributes a single pixel to the image, the imaging resolution is relatively low, and the total size is much larger than that of an insect CE. These pioneering works lay a solid foundation for developing advanced CE cameras. Nevertheless, further miniaturization of the whole CE system remains challenging due to the incompatibility of complex CEs and available imaging sensors. At present, reports on optoelectronic integrated CE (μ-CE) cameras with a feature size comparable to insect ones are still rare.

In this paper, we report a bionic μ-CE camera with an integrated optoelectronic system that enables large-FOV imaging and real-time 3D monitoring of microorganisms. To overcome the defocusing problem, we fabricate a polymer CE with special surface profiles and a feature size similar to mosquito CEs via femtosecond laser two-photon polymerization (FL-TPP). Specifically, the profile of each ommatidium is designed following a logarithmic function. In this way, the depth-of-field and focus range of the ommatidia can be significantly increased, and the as-obtained μ-CE can be suitably integrated with a commercial CMOS detector (OV9734), forming an optoelectronic integrated μ-CE camera.

Results
Design principle of an optoelectronic μ-CE camera
Natural CE systems provide the inspiration for developing advanced imaging technologies. For instance, the CEs of a dragonfly consist of thousands of sophisticated, closely packed, and 3D-distributed ommatidia that comprise facet lenses, crystalline cones, rhabdoms, and photoreceptor neural networks underneath (Fig. 1a). The cooperation of ommatidia with different orientations enables large-FOV and distortion-free imaging, as well as sensitive prey/enemy detection. However, in the case of artificial CE cameras, we have to combine micro-optic elements of complex 3D configurations (e.g., an MLA distributed on a hemispherical dome) with digital photodetectors. Especially when the overall size of a CE is close to that of insect ones, all ommatidia need to share a single image sensor. In this case, the integration of optical and electrical components becomes very tricky, since the planar detector cannot match the curved image plane.

Fig. 1 | Design and fabrication of the optoelectronic CE camera. a Photograph of a dragonfly; the insets are the microscopic image of its CE and the schematic diagram of the organism underneath. b Schematic illustration of the defocusing issue in optoelectronic integration; there is a mismatch between the curved image surface and the planar image sensor. c, d Simulated light fields of CEs with spherical ommatidia (c) and logarithmic ommatidia (d) via COMSOL Multiphysics simulation software; the incident light of the ommatidium was set as a plane wave. e The schematic diagram for the fabrication of CEs using FL-TPP; the inset is a photograph of an as-prepared μ-CE and the head of a mosquito. f The schematic diagram for the optoelectronic integration; the inset is a photograph of an optoelectronic integrated μ-CE camera.


Consequently, the photodetector cannot receive all of the images formed by the ommatidia, which results in a defocusing effect (Fig. 1b). The mismatch between the curved image plane and the planar photodetector constitutes the main bottleneck for developing optoelectronic μ-CE cameras. In this regard, it is crucial to design the optical and electrical components as a whole and make the CEs more suitable for commercial planar detectors.

For a microlens, the surface profile is the decisive factor that governs its optical characteristics. Nevertheless, to the best of our knowledge, most of the ommatidia of CEs resort to a simple spherical profile. Generally, to correct the optical aberrations (including spherical aberration, field curvature, coma, astigmatism, and distortion) of traditional spherical lenses, various aspheric lenses (paraboloid, hyperboloid, conical, and free-form surfaces) have been designed and prepared. Furthermore, to improve the correction of off-axis optical aberrations and achieve uniform imaging with a large FOV, aspheric-based multi-lens systems and multi-aperture imaging systems are ideal solutions18–22. The logarithmic lens, a special aspherical lens, can produce a focal line with uniform intensity distribution, thereby effectively expanding the focus range and depth of field23. In this work, we design the μ-CE using logarithmic lenses as ommatidia instead of traditional spherical lenses. To make a direct comparison between CEs with spherical ommatidia and with logarithmic ones, we simulated the focused light fields of the two CEs, as shown in Fig. 1c and d. Notably, the spherical CE forms a curved focal array that can hardly match a planar detector unless the ommatidia are precisely designed with different focal lengths24. In that case, the design and fabrication difficulties would increase sharply with the number of ommatidia, and the image size from different ommatidia would vary obviously. By comparison, the logarithmic CE with uniform ommatidia shows a significantly elongated focus range at the cost of energy dispersion in the focal spot, enabling planar detection.

Nevertheless, the use of aspheric microlenses as ommatidia significantly increases the fabrication difficulties of μ-CEs. Conventional technologies capable of μ-CE fabrication, for instance, photoresist thermal reflow25, inkjet printing26, laser-processing-assisted wet etching and thermal embossing15,16, cannot provide precise control over the surface profiles. To address this issue, FL-TPP27–34, which enables arbitrary 3D fabrication, is employed to produce μ-CEs with ommatidia of function-defined surface profiles. For instance, we have previously reported a μ-CE with aspheric lens ommatidia to reduce spherical aberration for high-quality imaging35. Nevertheless, the resultant μ-CE is still incompatible with planar CCD/CMOS detectors due to the defocusing problem. The feature size of the as-obtained μ-CE is comparable to that of a mosquito (Fig. 1e). Interestingly, the μ-CE can be directly integrated with a commercial CMOS detector (OV9734, OmniVision), working as an optoelectronic integrated μ-CE camera (Fig. 1f).

Comparisons between spherical and logarithmic CEs
To make a comprehensive comparison between spherical and logarithmic ommatidia, we first fabricated a single spherical microlens and a single logarithmic microlens via two-photon polymerization using a negative-tone photoresist, SZ2080, and evaluated their focusing properties. The surface profile and focal length of the spherical ommatidium can be described as:

H_{SL}(r, \theta) = \sqrt{R^{2} - r^{2}}    (1)

f_{SL} = \frac{1}{(n_{L} - n_{0}) \cdot \frac{1}{R}}    (2)

And the surface profile function of the logarithmic lens can be described as:

a = \frac{d_{2} - d_{1}}{(D/2)^{2}}    (3)

\varphi_{LL}(r, \theta) = \frac{1}{2a} \ln\!\left[ 2a\left(a^{2}r^{4} + (1 + 2ad_{1})r^{2} + d_{1}^{2}\right)^{1/2} + 2a^{2}r^{2} + 1 + 2ad_{1} \right] \cdot \frac{2\pi}{\lambda}    (4)

H_{LL}(r, \theta) = \frac{\varphi_{LL}(r, \theta)}{n_{L} - n_{0}} \cdot \frac{\lambda}{2\pi}    (5)

where r is the radial coordinate; θ, the azimuthal coordinate; D, the lens diameter; nL and n0, the lens refractive index and the environmental refractive index, respectively; R, the curvature radius; fSL, the focal length of the spherical lens; and d1 and d2, the start and end points of the logarithmic lens's focal line, respectively. To compare the focusing range and depth-of-field of these two lenses, a spherical ommatidium (D = 50 μm, fSL = 355 μm) and a logarithmic ommatidium (D = 50 μm, d1 = 100 μm, d2 = 800 μm) were prepared (Fig. 2a, b). With reasonably high 3D processing precision, FL-TPP can guarantee the high fidelity of their profiles, and the experimental profiles match well with the theoretical functions. In the focusing tests, a continuous-wave He-Ne laser with a wavelength of 632.8 nm was used for focusing and imaging in water. Notably, both the theoretical simulations and the experimental results confirm that the logarithmic lens shows a much larger focus range and depth-of-field than the spherical lens. The focus ranges of the spherical lens and the logarithmic one are 360 and 705 μm, respectively. Correspondingly, the depth-of-field values of the spherical lens and the logarithmic one are 200 and 440 μm, respectively (Supplementary Fig. 1).

The unique focusing properties of the logarithmic lens make it possible to solve the defocusing problem of CEs with spherical ommatidia. To provide solid evidence, we fabricated a spherical CE (control experiment) and a logarithmic one (Fig. 2c, d), respectively. These two CEs were designed by closely arranging ~160 ommatidia of uniform size (spherical ommatidium: D = 40 μm, fSL = 240 μm; logarithmic ommatidium: D = 40 μm, d1 = 50 μm, d2 = 600 μm) on a spherical lens dome (400 μm in diameter and 90 μm in height). In this way, the fill factor can reach 100%, which provides a higher light utilization rate and signal-to-noise ratio. Scanning electron microscope (SEM) images confirm the high surface smoothness and the distinct profiles of these two CEs. The cross-section profiles extracted from the laser scanning confocal microscope (LSCM, Supplementary Fig. 2) along the dotted lines provide more details about the size of the CEs and the profiles of their ommatidia. Besides, statistics on the uniformity of the ommatidia of these two CEs are also provided (standard deviations for the diameter of spherical and logarithmic ommatidia are 0.62 and 0.57 μm, respectively, and standard deviations for the height of spherical and logarithmic ommatidia are 0.16 and 0.15 μm, respectively; Supplementary Fig. 3), indicating the uniformity of the ommatidia.

To compare their imaging performance, we first investigated their focusing properties using a microscopic imaging system with a ×10/0.25 NA objective lens (Supplementary Fig. 4). By tuning the image distance, the focusing region of the spherical CE varies from the center to the outer region (Fig. 2e, see the dotted circle). The relative focus-spot intensity distributions along the orange dotted line further confirm the change of focus region. Obviously, for the spherical CE, it is impossible to detect clear focal spots from all the ommatidia at one time, indicating the defocusing problem. In contrast, due to the enlarged focus range, all the clear focus spots can be collected at one time in the case of the logarithmic CE (Fig. 2f), as confirmed by the focus-spot intensity distribution. The subsequent imaging tests further confirm their different imaging abilities.


Fig. 2 | Comparisons between spherical and logarithmic CEs. a, b Optical properties of a a single spherical lens (SL, radius: 25 μm, focal length: 355 μm) and b a logarithmic lens (LL, radius: 25 μm, focusing range: 100–800 μm). The top-left images are microscopic images; the top-right results are cross-sectional profiles; the bottom images are simulative (Sim) and experimental (Exp) focusing intensity distributions along the optical axis. c, d Morphologies of the as-prepared spherical (c) and logarithmic (d) μ-CEs; top images are overall and magnified SEM images, and the bottom results are relative cross-section profiles. e, f Focusing images (top) and normalized intensity distributions along the dotted line (bottom) of spherical (e) and logarithmic (f) μ-CEs. g, h Imaging results of spherical (g) and logarithmic (h) μ-CEs; the insets are magnified images of different ommatidia.
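For readers who want to reproduce the two profiles compared in Fig. 2a, b, the short Python sketch below evaluates Eqs. (1)–(5) with the parameters quoted in the caption (SL: D = 50 μm, fSL = 355 μm; LL: D = 50 μm, d1 = 100 μm, d2 = 800 μm) and the 632.8 nm He-Ne test wavelength. The refractive indices nL (polymerized SZ2080) and n0 (water immersion) are not specified in the text, so the values used here are assumptions for illustration only.

```python
import numpy as np

# Lens parameters quoted for Fig. 2a, b; the refractive indices are assumptions.
D = 50e-6                  # lens diameter (m)
f_SL = 355e-6              # focal length of the spherical lens (m)
d1, d2 = 100e-6, 800e-6    # start/end points of the logarithmic lens's focal line (m)
lam = 632.8e-9             # He-Ne test wavelength (m)
n_L, n_0 = 1.50, 1.33      # assumed: polymerized SZ2080 resin, water environment

r = np.linspace(0.0, D / 2, 251)                      # radial coordinate (m)

R = f_SL * (n_L - n_0)                                # Eq. (2) rearranged: R = f_SL (n_L - n_0)
H_SL = np.sqrt(R**2 - r**2)                           # Eq. (1): spherical profile

a = (d2 - d1) / (D / 2) ** 2                          # Eq. (3)
phi_LL = (1.0 / (2.0 * a)) * np.log(                  # Eq. (4): logarithmic-lens phase
    2.0 * a * np.sqrt(a**2 * r**4 + (1.0 + 2.0 * a * d1) * r**2 + d1**2)
    + 2.0 * a**2 * r**2 + 1.0 + 2.0 * a * d1
) * (2.0 * np.pi / lam)
H_LL = phi_LL / (n_L - n_0) * lam / (2.0 * np.pi)     # Eq. (5): phase -> height

# Only the relative height (sag) matters; constant offsets just shift the base plane.
print(f"spherical sag   |H(0) - H(D/2)| = {abs(H_SL[0] - H_SL[-1]) * 1e6:.2f} um")
print(f"logarithmic sag |H(0) - H(D/2)| = {abs(H_LL[0] - H_LL[-1]) * 1e6:.2f} um")
```

The printed values depend on the assumed indices; the snippet is only meant to show how the closed-form profiles are evaluated, not to reproduce the fabricated geometry exactly.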


Fig. 3 | Measurement of FOV and angular sensitivity function (ASF) of the logarithmic μ-CE. a Schematic illustration (top) and the focused light field images (bottom) of the logarithmic μ-CE; the incident angles are 0°, 30°, and 45°, respectively. b Comparison of the intensity distributions along the X-axis and Y-axis at normal incidence; the inset is an image of a focal point. c, d Normalized intensity distribution along the X-axis (c) and Y-axis (d) under different incident angles (0°, 30°, and 45°); the wavelength of the continuous light is 633 nm. e The normalized intensity distribution under normal incidence extracted from the focusing image (inset) along the dotted line. f ASF of the logarithmic μ-CE (FWHM = 12.1°).

Here, we employed the bright letter “F” as an object. By tuning the position of the spherical CE, the ommatidia in the focused region can form a clear image, whereas the images from the rest of the ommatidia are out of focus, as shown in the insets of Fig. 2g. For the logarithmic CE, clear images can be collected from all the ommatidia simultaneously (Fig. 2h). Based on the comprehensive comparison between these two CEs, we can conclude that the spherical CE suffers from the defocusing problem, whereas the logarithmic CE is capable of integration with a planar CCD/CMOS detector, although the image brightness is reduced due to the energy dispersion.

To gain deeper insight into the optical properties of the logarithmic CE, we also investigated its FOV and angular sensitivity function (ASF). According to the numerical derivation, the FOV of a CE can be calculated by the following formula:

FOV = 2\arcsin\!\left(\frac{H_{m}D_{m}}{H_{m}^{2} + (D_{m}/2)^{2}}\right)    (6)

where Hm and Dm are the height and the diameter of the main lens dome. Obviously, the FOV of a CE is governed by the geometry of the main lens dome on which the ommatidia are distributed. According to the parameters of our CE model, its theoretical FOV is 96.9°. To evaluate the FOV experimentally, optical micrographs at different incident angles (0°, 30°, and 45°) were collected (Fig. 3a), in which the deflection of the focused region can be detected; the best-focused area always follows the incident angle of the light. Besides, to investigate the focusing performance, the point spread functions (PSF) of a focused spot along the X-axis and Y-axis were measured under normal incidence (Fig. 3b). The focused spot appears as a standard circle, and its intensity distribution is Gaussian, indicating that the ommatidium has a high processing quality that meets the design. Normalized intensity distributions along the X-axis and Y-axis under different incident angles are plotted in Fig. 3c, d, respectively. The full width at half maximum (FWHM) remains almost constant (X-axis: FWHM = 4.3 ± 0.2, Y-axis: FWHM = 4.1 ± 0.2), indicating low aberration over a wide FOV. The slight difference in FWHM along the two directions can be attributed to the incidence not being perfectly normal. In addition to the FOV, the ASF, which represents the sensitivity to moving objects, is another key parameter of CEs. Under normal incidence, normalized intensity distributions of the ommatidia along the dotted line were extracted from the optical image (Fig. 3e), and the intensity distribution was plotted as a function of the incidence angle and fitted with a Gaussian (Fig. 3f). The measured FWHM value of the ASF is 12.1°. As a control experiment, the FOV and ASF of the spherical CE are 90° and 19.3°, respectively (Supplementary Fig. 5). The two CEs show similar FOV due to the same dome parameters. In addition, the logarithmic CE demonstrates higher angular sensitivity due to the faster attenuation of the focal intensity at oblique incidence.
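Eq. (6) is easy to check numerically. The snippet below (not the authors' code) reproduces the theoretical FOV of 96.9° for the dome used here (Dm = 400 μm, Hm = 90 μm) and illustrates the dome-geometry tuning discussed in the next paragraph.

```python
import math

def fov_deg(H_m: float, D_m: float) -> float:
    """Eq. (6): FOV of a CE whose ommatidia are distributed on a dome
    of height H_m and diameter D_m (any consistent length unit)."""
    s = (H_m * D_m) / (H_m ** 2 + (D_m / 2) ** 2)
    return 2.0 * math.degrees(math.asin(s))

print(f"dome 400 um x 90 um -> FOV = {fov_deg(90.0, 400.0):.1f} deg")  # ~96.9 deg

# Tuning the FOV by the dome height at a fixed 400 um diameter; a full
# hemisphere (H_m = D_m / 2) gives the limiting 180 deg FOV.
for h in (50.0, 90.0, 150.0, 200.0):
    print(f"H_m = {h:5.1f} um -> FOV = {fov_deg(h, 400.0):6.1f} deg")
```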


It is worth pointing out that, with its designable 3D processing ability, FL-TPP enables the direct fabrication of CEs with arbitrary 3D geometries. Consequently, the FOV can be tuned by varying the size of the dome. Nevertheless, to integrate a single-lens CE with a planar CMOS detector, a suitable FOV is ~90°. From the viewpoint of TPP fabrication, it is undoubtedly possible to fabricate compound eyes with a much larger FOV. Nevertheless, considering the imaging quality and the utilization of the ommatidia at the margin, it is not necessary to pursue a larger FOV. Through the comprehensive comparison, it can be concluded that the logarithmic CE shows optical performance similar to the spherical one, while the former can well address the defocusing problem.

Optoelectronic integrated CE camera
The use of the logarithmic CE makes it possible to integrate the CE with commercial CMOS detectors. In this work, to improve imaging quality, we optimized the parameters of the ommatidia (D = 110 μm, d1 = 500 μm, d2 = 1000 μm) and kept the dome unchanged (400 μm in diameter and 90 μm in height). A designed 19-ommatidia CE was fabricated as the optical component and combined with a CMOS photosensitive chip (OmniVision 9734, 2.5 × 1.7 mm) as the electrical component, achieving optoelectronic integration (Supplementary Figs. 6 and 7). The CE, with a feature size of ~400 μm, can cover more than 80,000 pixels, which guarantees a reasonable imaging resolution. The optoelectronic CE camera enables directly capturing images of different types of target objects. As shown in Fig. 4a, clear images, including bright-field/dark-field “F” patterns and various insects, can be formed by every ommatidium (magnified images are shown in Supplementary Fig. 8; the high background noise may originate from light spillage under oblique incidence and from light incident on the blank area outside the lens. This problem could be solved by adding baffles, which is, however, challenging for TPP).

As a μ-CE camera, the cooperation of the ommatidia enables sensitive trajectory monitoring of moving targets. It is well known that insect CE vision is very sensitive to moving objects. When a target moves, the ommatidia of the CE image it in turn, producing a flicker effect. In this way, insects can perceive the moving trajectory and speed of their predators or prey in real time and respond immediately. Inspired by insect CEs, this optoelectronic μ-CE camera is expected to achieve similar functionality. Briefly, the relative direction and distance of the target can be determined from the image definition of different ommatidia and from the image size, respectively. To establish the relationship between image size and object distance, we first calibrated the object-image relationship of the μ-CE camera (Supplementary Figs. 9 and 10). To verify its position identification ability, a triangular object of known size (side length: 20 mm) was placed at three spatial positions, and the corresponding images were captured by the μ-CE camera (Fig. 4b). The real spatial positions of the triangle (distance and azimuth) in the three cases are quantified as 220 mm/0°, 233 mm/19.3°, and 282 mm/38.8°, respectively (Fig. 4c). For comparison, the reconstructed distance and azimuth for the three cases are 226 mm/0°, 233 mm/19.4°, and 262 mm/38.8°, respectively (Fig. 4d), in good agreement with the real values.

To assess its application in moving-trajectory reconstruction, we recorded the motion of a living beetle using this μ-CE camera. The schematic diagram of the experiment is shown in Fig. 4e (see Supplementary Fig. 11 for details). Time-lapse images of the beetle at different moments were captured by a traditional digital camera (Fig. 4f) and by our optoelectronic CE camera (Fig. 4g), respectively. From the video recorded by the μ-CE camera (Supplementary Movie 1), the image definition of different ommatidia at different times and positions can be calculated (Fig. 4h), from which the distance and azimuth angle of the beetle can be determined simultaneously. In this way, the spatial positions of the beetle at different times can be reconstructed (Fig. 4i). The moving-trajectory reconstruction ability makes the optoelectronic μ-CE a preferred vision system for miniature robots.

On-chip camera for living microorganisms
Unlike a monocular camera, which can only determine an object's distance when its true size is known, the μ-CE camera enables 3D detection of the object trajectory based on the principle of multi-eye vision. When we observe target objects using the μ-CE camera, ommatidia with different orientations image the same target from different view angles. Notably, the images of the as-obtained image array may differ slightly from each other in size and position. By processing the instantaneous 3D imaging information of the target objects, the real-time spatial location can be directly reconstructed (Fig. 5a). Here, a machine learning method based on a back-propagation (BP) neural network36 was utilized to calibrate the μ-CE camera. In the calibration of the μ-CE camera, the imaging of target objects with known parameters is first implemented (Supplementary Fig. 12; owing to the small focal length, the μ-CE camera has a large depth of field, and experiments have proved that clear imaging can be achieved for objects at distances greater than 1.4 mm). After the calibration, micro squares and triangles of different sizes were placed at different spatial positions. The images captured by the μ-CE camera and the reconstructed spatial positions are presented in Fig. 5b and c, in which the dashed and solid lines represent the real and reconstructed spatial positions, respectively. The reconstructed parameters are in good agreement with the real values (unlabeled original images, magnified 3D reconstruction results, and detailed data statistics are shown in Supplementary Fig. 13).

In addition, to demonstrate a proof-of-concept application, we propose an advanced, miniature, and portable on-chip camera system built by integrating the μ-CE camera with a microfluidic chip (Fig. 5d), which was further employed in monitoring the real-time motion trajectory of a living microorganism, Paramecium. Photographs of the on-chip camera and its core component (the μ-CE camera) are shown in Fig. 5d. In the observation, a green living Paramecium with symbiotic algae was employed as a dynamic target; it was trapped in a reservoir on the microfluidic chip underneath the μ-CE camera. As a representative demonstration, a video of the Paramecium's motion was recorded by the μ-CE camera, and the real-time trajectory reconstruction was performed at a rate of 24 frames per second (Fig. 5e, Supplementary Movie 2). The time-lapse images at different moments and the corresponding instantaneous positions are shown in Fig. 5f. The μ-CE camera thus demonstrates the capability for microscopic 3D trajectory reconstruction, which is very promising for microscopic stereo imaging, microscopic flow-field measurement, and real-time tracking.

Discussion
After millions of years of evolution, natural arthropods have developed advanced visual systems, which provide intriguing inspiration for developing compact and miniature cameras. Generally, CEs consist of thousands of omnidirectionally distributed ommatidia that point in different directions and image independently, which enables wide-FOV detection. Nevertheless, to achieve similar imaging capability, we have to develop artificial CE cameras that work in a completely different way from natural CEs, in which a swarm of microlenses with a complex 3D arrangement has to be integrated with available photodetectors. The main challenge is to overcome the mismatch between the nonplanar imaging of the omnidirectionally distributed ommatidia and the planar detection of commercial CCD/CMOS detectors when the feature size of the optoelectronic system decreases to the microscale. To address the defocusing problem of 3D CEs, typical solutions, including an optical relay system37, multi-layer lens assembly22,38, curved multi-focus design24, or the use of curved photodetectors3,17, have been successfully reported, which is of great significance to developing optoelectronic CE cameras (Supplementary Table 1). Nevertheless, from a practical point of view, these strategies are more or less limited by enormous difficulties with respect to fabrication, assembly, and integration. At present, the development of optoelectronic CE cameras is still at an early stage, and there is much room for innovation on this cutting-edge topic.


Fig. 4 | The imaging capability of the optoelectronic μ-CE camera. a Photographs of different targets (top) and the images collected by the μ-CE camera (bottom). b Schematic illustrations of the different spatial positions of a triangle model (top) and relative images captured by the μ-CE camera (bottom). c The true spatial position of the triangle relative to the μ-CE camera. d The calculated spatial position of the triangle according to the images; the side length of the triangle is known (20 mm). The radius of the imaging FOV is 80 mm. e Schematic diagram of the experimental setup for monitoring beetle motion using the μ-CE camera. f Time-lapse photography of a free-crawling beetle at different moments. The photograph is generated by combining five photographs together. The inset is the photograph of the beetle. g Images captured by the μ-CE camera at different times. h The definition statistics of five ommatidia (marked in circles) extracted from the images captured by the μ-CE camera at different times. The definition of ommatidia in different positions reaches the maximum at different moments. i The calculated spatial positions of the beetle at different moments and the as-generated movement trajectory.
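Fig. 4h tracks the beetle through the "definition" (sharpness) of each ommatidium sub-image, while the object distance follows from the calibrated object-image relationship. The paper does not state which definition metric or calibration model it uses, so the Python sketch below substitutes two common stand-ins: the variance of a discrete Laplacian as a sharpness score, and a similar-triangles distance estimate whose scale factor k is hypothetical.

```python
import numpy as np

def definition_score(patch: np.ndarray) -> float:
    """Sharpness ('definition') of one ommatidium sub-image: variance of a
    discrete Laplacian. A common focus measure, used here only as an
    illustrative stand-in for the paper's unspecified metric."""
    p = patch.astype(float)
    lap = (-4.0 * p[1:-1, 1:-1]
           + p[:-2, 1:-1] + p[2:, 1:-1]
           + p[1:-1, :-2] + p[1:-1, 2:])
    return float(lap.var())

def estimate_distance(image_size_px: float, true_size_mm: float, k: float) -> float:
    """Similar-triangles estimate: distance ~ k * true_size / image_size, where k
    lumps focal length and pixel pitch and would come from the object-image
    calibration (Supplementary Figs. 9 and 10). The value used below is hypothetical."""
    return k * true_size_mm / image_size_px

# Demo: a sharp checkerboard scores higher than a smoothed (defocused-like) copy.
y, x = np.mgrid[0:80, 0:80]
sharp = ((x // 8 + y // 8) % 2).astype(float)
blurred = sharp.copy()
for _ in range(5):                         # crude blur by repeated 4-neighbour averaging
    blurred[1:-1, 1:-1] = (blurred[:-2, 1:-1] + blurred[2:, 1:-1]
                           + blurred[1:-1, :-2] + blurred[1:-1, 2:]) / 4.0
print(f"definition(sharp)   = {definition_score(sharp):.4f}")
print(f"definition(blurred) = {definition_score(blurred):.4f}")
print(f"distance to 20 mm triangle: {estimate_distance(18.0, 20.0, k=200.0):.0f} mm  (hypothetical k)")
```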


Fig. 5 | On-chip camera for living microorganisms. a Schematic diagram of the basic principle for microscopic 3D reconstruction based on a μ-CE camera. b, c Images captured by the μ-CE camera (top) and the 3D reconstruction results of the target objects (a micro-square and a triangle) at two different spatial positions. Dashed and solid lines represent real and reconstructed spatial positions, respectively. d Photograph of the on-chip camera system (left), the μ-CE camera (top-middle), and the SEM image of the logarithmic CE (top-right). The insets are schematic illustrations of its working mechanism (bottom-middle) and the microscopic image of a Paramecium. e The reconstructed 3D trajectory of the Paramecium. f Images of the Paramecium captured by the μ-CE camera (top) and the reconstructed spatial positions (bottom) at different moments. The radius of the imaging FOV is 150 μm.
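The BP-network calibration described in the text learns a mapping from the multi-view information recorded by the 19 ommatidia to a 3D position. The sketch below illustrates the idea with scikit-learn's MLPRegressor (a multilayer perceptron trained by back-propagation); the feature encoding (per-ommatidium centroid and image size), the network size, and the synthetic training data are all assumptions for illustration — the paper does not disclose its architecture or inputs.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

# Synthetic calibration set. Each sample: a feature vector extracted from the
# ommatidium image array (assumed encoding: 19 ommatidia x (u, v, size) = 57
# features), labelled with the known 3D position of a calibration target.
n_samples, n_ommatidia = 500, 19
positions = rng.uniform(low=[-1.0, -1.0, 1.4],            # x, y, z in mm; z >= 1.4 mm,
                        high=[1.0, 1.0, 5.0],             # the clear-imaging range noted in the text
                        size=(n_samples, 3))
proj = rng.normal(size=(3, n_ommatidia * 3))              # placeholder forward model
features = positions @ proj + 0.01 * rng.normal(size=(n_samples, n_ommatidia * 3))

# Small MLP trained by back-propagation: image features -> (x, y, z).
net = MLPRegressor(hidden_layer_sizes=(64, 64), activation="relu",
                   max_iter=5000, random_state=0)
net.fit(features[:400], positions[:400])

pred = net.predict(features[400:])
rmse = np.sqrt(np.mean((pred - positions[400:]) ** 2, axis=0))
print("hold-out RMSE per axis (mm):", np.round(rmse, 3))

# At run time the trained network converts each frame's feature vector into a 3D
# point; consecutive frames (e.g., 24 fps) then trace out a trajectory as in Fig. 5e.
```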

In this paper, we addressed this issue by designing and producing artificial μ-CEs with logarithmic-profile ommatidia via TPP fabrication. To confirm this idea, spherical and logarithmic CEs of the same size were fabricated in the same way, and their imaging performances were compared in detail. Compared with CEs with spherical ommatidia, the defocusing problem can be effectively avoided in the case of logarithmic CEs, because the depth-of-field and focus range of all the logarithmic ommatidia are significantly increased. Inevitably, this scheme results in a loss of energy, which darkens the image slightly. In this way, the as-obtained μ-CEs can be well integrated with a commercial CMOS detector (OV9734, OmniVision), forming an optoelectronic μ-CE camera. Importantly, the feature size of the μ-CE is only ~400 μm, similar to the CEs of a mosquito, and the total weight of the μ-CE camera (including the CMOS chip) is only 230 mg. With 19–160 logarithmic ommatidia that can image independently and simultaneously, our optoelectronic μ-CE camera enables large-FOV imaging (90°), spatial position identification, and sensitive trajectory monitoring of moving targets. In a typical demonstration, the moving trajectory of a living beetle was reconstructed based on the real-time video recorded by the μ-CE camera.


Furthermore, taking advantage of its small size and unique imaging ability, the miniature μ-CE camera can be integrated with microfluidic devices, serving as an on-chip camera for real-time monitoring of living microorganisms. A machine learning method based on a back-propagation (BP) neural network was employed to calibrate the μ-CE camera, in which the imaging of target objects with known parameters is first implemented. Based on the calibration, the 3D motion trajectory of a Paramecium over 5 s has been reconstructed from the real-time video.

In short, the development of miniature μ-CE cameras with integrated optoelectronic systems is very important, as it makes it possible to see the world from the perspective of insects. Featuring small size, light weight, portability, and multi-ommatidia omnidirectional imaging ability, the μ-CE camera holds great promise for cutting-edge applications in robotic vision, medical endoscopy, miniature navigation, moving-target tracking, and many other micro-vision fields.

Methods
Photoresist preparation
An organic-inorganic hybrid photoresist, SZ2080 (IESL-FORTH)39,40, with 1% of photosensitizer (4,4-bis(diethylamino)benzophenone) was dropped onto a precleaned cover glass. Then, the sample was heated on a hot plate at 100 °C for 1 h to remove the organic solvent. After solidification, the sample was cooled to room temperature for use.

Fabrication of CEs
In this experiment, a commercial galvanometer-based FL-TPP processing system (Maleon Nano system, Jicheng ultrafast equipment co. LTD) was employed for the fabrication. First, the near-infrared laser (ErFemto-780MP: central wavelength of 780 nm, pulse length of 100 fs, pulse repetition frequency of 80 MHz) is tightly focused into the resin by a high-numerical-aperture objective lens (×60, NA = 1.35, Olympus). Then, 3D scanning of the light spot is realized with the help of a 2D galvanometer and a 1D piezoelectric platform. The laser power measured in front of the objective lens was 18–20 mW. The processing data were converted into a 3D point cloud with a spacing of 200 nm and an exposure time at a single point of 300 μs. By optimizing the laser processing parameters, a surface roughness as low as 6 nm can be achieved, which is much lower than λ/20 (λ, the working wavelength; Supplementary Fig. 14). After processing, the sample was soaked in n-propanol solution for 40 min to remove the unexposed photoresist. Besides, a DRS-TPP processing technology was used to shorten the processing time (details can be found in Supplementary Fig. 15)41. Consequently, it only takes ~1.5 h to fabricate a 400 μm CE lens. All of these processes are implemented in a yellow-light environment to avoid overall exposure of the sample. After development, the CE sample was quickly dried in air and irradiated under a high-power ultraviolet lamp (wavelength: 365 nm, power: about 2 W) for 12 h to increase the optical transmittance in the visible band (photobleaching). Long-time ultraviolet photobleaching can effectively improve the optical transmittance of the device. Finally, at a wavelength of 633 nm, more than 93% transmittance can be achieved (Supplementary Fig. 16).

Optoelectronic integration of the CE camera
To facilitate the packaging process, the substrate (cover glass) of the as-obtained CE was cut into a size of 1.5 × 2.0 mm2 using a diamond wire cutter (STX-202AQ), keeping the CE in the center. Then, the CE, together with the glass substrate, was attached to a commercial miniature CMOS chip (total active array size of 1280 × 720 pixels, with a pixel size of 1.4 × 1.4 μm2). A UV-curable adhesive, NOA61 (Norland), was used to fix them (Supplementary Fig. 17).

Structure characterization
The surface morphology of the structure was characterized by a field emission scanning electron microscope (SEM, JSM-7500F, JEOL). The 3D profile of the sample was characterized by a laser scanning confocal microscope (LSCM, OLS4100, Olympus). Optical images of the sample were obtained using a transmission optical microscope (CX41, Olympus). The transmission spectrum of the structure was measured by a homemade micro-area transmission test system with a high-sensitivity spectrometer (ULS2048x64-EVO, Avantes).

Data availability
The data that support the findings of this study are available from the corresponding author upon request. Source data are provided with this paper.

Code availability
All the relevant code used to generate the results in this paper and the Supplementary Information is available upon request.

References
1. Brady, D. J. et al. Multiscale gigapixel photography. Nature 486, 386–389 (2012).
2. Ko, H. C. et al. A hemispherical electronic eye camera based on compressible silicon optoelectronics. Nature 454, 748–753 (2008).
3. Song, Y. M. et al. Digital cameras with designs inspired by the arthropod eye. Nature 497, 95–99 (2013).
4. Ma, Z. C. et al. Smart compound eyes enable tunable imaging. Adv. Funct. Mater. 29, 1903340 (2019).
5. Lee, G. J., Choi, C., Kim, D. H. & Song, Y. M. Bioinspired artificial eyes: optic components, digital cameras, and visual prostheses. Adv. Funct. Mater. 28, 1705202 (2018).
6. Iyer, V., Najafi, A., James, J., Fuller, S. & Gollakota, S. Wireless steerable vision for live insects and insect-scale robots. Sci. Robot. 5, eabb0839 (2020).
7. Li, J. et al. Ultrathin monolithic 3D printed optical coherence tomography endoscopy for preclinical and clinical use. Light Sci. Appl. 9, 1–10 (2020).
8. Yanny, K. et al. Miniscope3D: optimized single-shot miniature 3D fluorescence microscopy. Light Sci. Appl. 9, 1–13 (2020).
9. Pahlevaninezhad, H. et al. Nano-optic endoscope for high-resolution optical coherence tomography in vivo. Nat. Photonics 12, 540–547 (2018).
10. Lin, R. J. et al. Achromatic metalens array for full-colour light-field imaging. Nat. Nanotechnol. 14, 227–231 (2019).
11. Luo, Y. et al. Varifocal metalens for optical sectioning fluorescence microscopy. Nano Lett. 21, 5133–5142 (2021).
12. Tanida, J. et al. Thin observation module by bound optics (TOMBO): concept and experimental verification. Appl. Opt. 40, 1806–1813 (2001).
13. Lee, L. P. & Szema, R. Inspirations from biological optics for advanced photonic systems. Science 310, 1148–1150 (2005).
14. Jeong, K.-H., Kim, J. & Lee, L. P. Biologically inspired artificial compound eyes. Science 312, 557–561 (2006).
15. Deng, Z. et al. Dragonfly-eye-inspired artificial compound eyes with sophisticated imaging. Adv. Funct. Mater. 26, 1995–2001 (2016).
16. Liu, X. Q. et al. Rapid engraving of artificial compound eyes from curved sapphire substrate. Adv. Funct. Mater. 29, 1900037 (2019).
17. Floreano, D. et al. Miniature curved artificial compound eyes. PNAS 110, 9267–9272 (2013).
18. Thiele, S., Arzenbacher, K., Gissibl, T., Giessen, H. & Herkommer, A. M. 3D-printed eagle eye: compound microlens system for foveated imaging. Sci. Adv. 3, e1602655 (2017).
19. Gissibl, T., Thiele, S., Herkommer, A. & Giessen, H. Two-photon direct laser writing of ultracompact multi-lens objectives. Nat. Photonics 10, 554–560 (2016).
20. Hao, C. et al. Single-layer aberration-compensated flat lens for robust wide-angle imaging. Laser Photonics Rev. 14, 2000017 (2020).


21. Juodkazis, S. 3D printed micro-optics. Nat. Photonics 10, 499–501 (2016).
22. Toulouse, A. et al. Ultra-compact 3D-printed wide-angle cameras realized by multi-aperture freeform optical design. Opt. Express 30, 707–720 (2022).
23. Golub, I., Chebbi, B., Shaw, D. & Nowacki, D. Characterization of a refractive logarithmic axicon. Opt. Lett. 35, 2828–2830 (2010).
24. Chen, J. et al. Hybrid imprinting process to fabricate a multi-layer compound eye for multispectral imaging. Opt. Express 25, 4180–4189 (2017).
25. Kim, K., Jang, K. W., Ryu, J. K. & Jeong, K. H. Biologically inspired ultrathin arrayed camera for high-contrast and high-resolution imaging. Light Sci. Appl. 9, 28 (2020).
26. Luo, Y. et al. Direct fabrication of microlens arrays with high numerical aperture by ink-jetting on nanotextured surface. Appl. Surf. Sci. 279, 36–40 (2013).
27. Li, R. et al. Stimuli-responsive actuator fabricated by dynamic asymmetric femtosecond Bessel beam for in situ particle and cell manipulation. ACS Nano 14, 5233–5242 (2020).
28. Ma, Z.-C. et al. Femtosecond laser programmed artificial musculoskeletal systems. Nat. Commun. 11, 1–10 (2020).
29. Juodkazis, S. Laser polymerized photonic wire bonds approach 1 Tbit/s data rates. Light Sci. Appl. 9, 72 (2020).
30. Liu, Y. et al. Structural color three-dimensional printing by shrinking photonic crystals. Nat. Commun. 10, 1–8 (2019).
31. Ni, J. et al. Gigantic vortical differential scattering as a monochromatic probe for multiscale chiral structures. PNAS 118, e2020055118 (2021).
32. Ni, J. et al. Giant helical dichroism of single chiral nanostructures with photonic orbital angular momentum. ACS Nano 15, 2893–2900 (2021).
33. Park, S. H., Yang, D. Y. & Lee, K. S. Two-photon stereolithography for realizing ultraprecise three-dimensional nano/microdevices. Laser Photonics Rev. 3, 1–11 (2009).
34. Xin, C. et al. Environmentally adaptive shape-morphing microrobots for localized cancer cell treatment. ACS Nano 15, 18048–18059 (2021).
35. Wu, D. et al. Bioinspired fabrication of high-quality 3D artificial compound eyes by voxel-modulation femtosecond laser writing for distortion-free wide-field-of-view imaging. Adv. Opt. Mater. 2, 751–758 (2014).
36. Goi, E. et al. Nanoprinted high-neuron-density optical linear perceptrons performing near-infrared inference on a CMOS chip. Light Sci. Appl. 10, 1–11 (2021).
37. Shi, C. et al. SCECam: a spherical compound eye camera for fast location and recognition of objects at a large field of view. Opt. Express 25, 32333–32345 (2017).
38. Zhang, H. et al. Development of a low cost high precision three-layer 3D artificial compound eye. Opt. Express 21, 22232–22245 (2013).
39. Ovsianikov, A. et al. Ultra-low shrinkage hybrid photosensitive material for two-photon polymerization microfabrication. ACS Nano 2, 2257–2262 (2008).
40. Gailevičius, D. et al. Additive-manufacturing of 3D glass-ceramics down to nanoscale resolution. Nanoscale Horiz. 4, 647–651 (2019).
41. Hu, Z.-Y. et al. Two-photon polymerization nanomanufacturing based on the definition–reinforcement–solidification (DRS) strategy. J. Light. Technol. 39, 2091–2098 (2021).

Acknowledgements
Y.-L.Z. was supported by the National Natural Science Foundation of China (NSFC) under Grant Nos. #61935008 and #61775078. Q.-D.C. was supported by the National Natural Science Foundation of China (NSFC) under Grant Nos. #61827826 and #61825502.

Author contributions
Z.-Y.H. and Y.-L.Z. conceived the idea; Z.-Y.H., Y.-L.Z., Q.-D.C., and H.-B.S. designed the entire CE system. Z.-Z.L. helped with the simulation. Z.-Y.H., Z.-N.T., and J.-W.M. undertook the experiments and the characterization. C.P. and J.-Y.D. helped with the calibration and the moving-trajectory reconstruction; Q.-D.C. helped with setting up the laser fabrication system. Z.-Y.H. and Y.-L.Z. wrote the paper.

Competing interests
The authors declare no competing interests.

Additional information
Supplementary information The online version contains supplementary material available at https://doi.org/10.1038/s41467-022-33072-8.

Correspondence and requests for materials should be addressed to Yong-Lai Zhang.

Peer review information Nature Communications thanks Koji Sugioka, Alois Herkommer and the other anonymous reviewer(s) for their contribution to the peer review of this work.

Reprints and permission information is available at http://www.nature.com/reprints

Publisher's note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

© The Author(s) 2022
