Spatial Monitoring of Insects
Abstract
Insects are the most important global pollinators of crops and play a key role in maintaining the sustain-
ability of natural ecosystems. Insect pollination monitoring and management are therefore essential
for improving crop production and food security. Computer vision facilitated pollinator monitoring
can intensify data collection over what is feasible using manual approaches. The new data it gener-
ates may provide a detailed understanding of insect distributions and facilitate fine-grained analysis
sufficient to predict their pollination efficacy and underpin precision pollination. Current computer
vision facilitated insect tracking in complex outdoor environments is restricted in spatial coverage
and often constrained to a single insect species. This limits its relevance to agriculture. Therefore,
in this article we introduce a novel system to facilitate markerless data capture for insect counting,
insect motion tracking, behaviour analysis and pollination prediction across large agricultural areas.
Our system comprises edge computing-based multi-point video recording and offline automated multi-
species insect counting, tracking and behavioural analysis. We implement and test our system on a
commercial berry farm to demonstrate its capabilities. Our system successfully tracked four insect
varieties at nine monitoring stations within a polytunnel, obtaining an F-score above 0.8 for each
variety. The system enabled calculation of key metrics to assess the relative pollination impact of
each insect variety. With this technological advancement, detailed, ongoing data collection for preci-
sion pollination becomes achievable. This is important to inform growers and apiarists managing crop
pollination, as it allows data-driven decisions to be made to improve food production and food security.
Keywords: deep learning, camera trapping, honeybees, pollination, food security, insect tracking
The remainder of the paper is organised as follows. In Section 2 we present a brief overview of related work concerning computer vision for insect tracking in the wild. Section 3 presents our new methods and their implementation. In Section 4 we describe experiments to evaluate the performance of our approach and present the results of a pollination analysis to demonstrate our methods' application. In Section 5 we discuss the strengths and limitations of our approach and suggest future work. Section 6 concludes the paper.

2 Related Work

Recently there has been an increase in the use of computer vision and deep learning in agriculture (Kamilaris & Prenafeta-Boldú, 2018; Odemer, 2022). This has been prominent in land cover classification (Lu et al., 2017), fruit counting (Afonso et al., 2020), yield estimation (Koirala, Walsh, Wang, & McCarthy, 2019), weed detection (Su, Kong, Qiao, & Sukkarieh, 2021), beneficial and pest insect monitoring (Amarathunga, Grundy, Parry, & Dorin, 2021), and insect tracking and behavioural analysis (Høye et al., 2021). Applications of insect tracking and behavioural analysis algorithms are usually confined to controlled environments such as laboratories (Branson, Robie, Bender, Perona, & Dickinson, 2009; Haalck, Mangan, Webb, & Risse, 2020; Pérez-Escudero, Vicente-Page, Hinz, Arganda, & De Polavieja, 2014; Walter & Couzin, 2021), and semi-controlled environments such as beehive entrances (Campbell, Mummert, & Sukthankar, 2008; Magnier et al., 2019; Yang, Collins, & Beckerleg, 2018). In these situations, image backgrounds and illumination under which insects are tracked vary only a little, simplifying automated detection and tracking tasks. Pollination monitoring of crops, however, may require tracking unmarked insects outdoors in uncontrolled environments subject to vegetation movement caused by the wind, frequent illumination shifts, and movements of tracked and non-target animals. These environmental changes, combined with the complexity of insect movement under such variable conditions, increase the difficulty of the tracking problem. Recent studies attempted to address these issues through in-situ insect monitoring algorithms (Bjerge, Mann, & Høye, 2021; Bjerge, Nielsen, Sepstrup, Helsing-Nielsen, & Høye, 2021), but were limited in the spatiotemporal resolution required for efficient pollination monitoring.

To overcome the difficulties listed above, we previously presented a Hybrid Detection and Tracking (HyDaT) algorithm (Ratnayake, Dyer, & Dorin, 2021b) and a Polytrack algorithm (Ratnayake et al., 2021a) to track multiple unmarked insects in uncontrolled conditions. The HyDaT and Polytrack algorithms use a hybrid detection model consisting of a deep learning-based detection model (Bochkovskiy, Wang, & Liao, 2020; Redmon & Farhadi, 2017) and a foreground/background segmentation-based detection model (Zivkovic & Van Der Heijden, 2006). This enables tracking of unmarked and free-flying insects amidst changes in the environment. However, these earlier algorithms are limited to one species and one study location at a time. To gain a sophisticated understanding of agricultural pollination, these constraints are limiting, since analysis of the behaviour of multiple insect species that contribute simultaneously, in multiple locations, to overall pollination levels or deficiencies is important (Garibaldi et al., 2020; Rader et al., 2016). Currently there is no computer vision facilitated system, or any other practical system, capable of achieving this goal. In addition, no previous method can identify and classify insect pollination behaviour across large-scale industrial agricultural areas at a level of detail that permits sub-site-specific interventions to increase farm yield via improved pollination.

3 Methods and Implementation

In this section, we explain the methods and implementation of our insect and pollination monitoring system. An overview of the proposed methodology is shown in Fig. 1.

3.1 Multi-point remote video capture

Video footage of freely foraging, unmarked insects required for insect tracking and behavioural analysis was collected using edge computing-based remote camera trap devices built on the Raspberry Pi single board computer. We used a Raspberry Pi 4 and Raspberry Pi camera v2 (Sony IMX219
8-megapixel sensor) because it is widely available and customisable, offers a wide range of plug-in sensors, and is sufficiently low-cost for replication across a large area (Jolles, 2021). Videos are recorded at 1920 × 1080 resolution at 30 fps. The system is powered using a 20,000 mAh battery bank. However, we do not process videos to track pollinators in situ, since the Raspberry Pi is currently incapable of processing high quality videos in real-time, and our key goals required detection of insects. Reducing the video resolution or the capture frame-rate to compensate for the lack of speed of the device is not currently feasible within the limitations imposed by pollinator insect speed and size. Video recording units were distributed across nine data collection points in an experimental site (Section 3.4 below) and were programmed to continuously record sets of footage clips of 10 minutes duration. The caption of each video clip contained metadata on camera location, recording date and recording time. (Refer to the code availability statement for the software used in the video recording unit.)
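To make the recording unit concrete, below is a minimal sketch of its capture loop, assuming the Python picamera library on the Raspberry Pi. The station label, clip-naming scheme and annotation format are illustrative assumptions; the software actually deployed in the field is published separately (see the code availability statement).

import time
from picamera import PiCamera

CAMERA_ID = "unit03"      # hypothetical station label (one of the nine points)
CLIP_SECONDS = 10 * 60    # continuous sets of 10-minute clips

camera = PiCamera(resolution=(1920, 1080), framerate=30)
while True:
    stamp = time.strftime("%Y-%m-%d_%H-%M-%S")
    # Caption each clip with capture metadata: camera location, date and time.
    camera.annotate_text = f"{CAMERA_ID} {stamp}"
    camera.start_recording(f"{CAMERA_ID}_{stamp}.h264")
    camera.wait_recording(CLIP_SECONDS)
    camera.stop_recording()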
3.2 Automated multi-species insect tracking

We processed the videos captured remotely using an offline automated video processing algorithm. Since food crops are usually grown in uncontrolled or semi-controlled environments subject to changes in illumination and foliage movement caused by wind and/or insect and human activity, robust tracking of insects and flowers is essential for accurate pollination and insect behavioural analysis. Here, we extended methods proposed in Ratnayake et al. (2021a, 2021b) to track multiple insect varieties simultaneously and to detail their interactions with flowers. In the following sections we present the technical details of our methods.

At the start of processing each video sequence, our algorithm extracts the time and location at which the video was captured from the sequence's embedded metadata. Next, the video is processed to track movement of insects and their interactions with flowers. Pilot research revealed that the position of each respective flower being recorded varies throughout a day due to wind and farm management activities, and flowers may physically move (termed heliotropism) in some cases to track sunlight (Kevan, 1975; van der Kooi, Kevan, & Koski, 2019). Therefore, it is essential to track flower position within the frame to reliably identify insect-flower interactions. The positions of all visible flowers are first recorded at the start of a video sequence and updated at predefined user-specified intervals (parameter values are provided with the source code). A "predict and detect" approach is used to track flower movement. The predicted next position of each flower is initially identical to its current position, since the magnitude of flower movement within a short interval (e.g., ≈100 seconds) is assumed to be small. We then used the Hungarian algorithm (Kuhn, 1955) to associate the predicted position of each flower with a flower detection in order to form a continuous flower movement track.
If a flower being tracked is undetected in a given frame, the last detected position is carried forward. If a detected flower cannot be assigned to any predictions it is considered to be a new flower. At the end of a video sequence, the final positions of flowers and their respective tracks of interacting insects are saved for later pollination analysis and visualisation.
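The assignment step described above can be sketched with the Hungarian algorithm as implemented in SciPy. This is a minimal illustration, not the released Polytrack code; the function name and the distance gate max_dist are assumptions (recommended parameter values accompany the source code).

import numpy as np
from scipy.optimize import linear_sum_assignment

def associate_flowers(predicted, detected, max_dist=50.0):
    """Match predicted flower positions (N x 2) to detections (M x 2).

    Unmatched predictions carry their last position forward; unmatched
    detections are treated as new flowers, as described in the text."""
    # Pairwise Euclidean distances between predictions and detections.
    cost = np.linalg.norm(predicted[:, None, :] - detected[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)  # minimise total displacement
    matches = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]
    new_flowers = sorted(set(range(len(detected))) - {c for _, c in matches})
    return matches, new_flowers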
When an insect is first detected inside a video frame, the automated video processing algorithm identifies its species using the Polytrack deep learning model (Ratnayake et al., 2021a). In addition, it saves a snapshot of the insect for (optional human) visual verification. After detection and identification of an insect, the Polytrack algorithm tracks it through subsequent frames. In each frame after the first detection of an insect, its position is compared with the positions of recorded flowers to identify flower visits. If an insect is detected inside the radius of a flower for more than 5 consecutive frames (at 30 fps this ensures it is not flying over the flower at typical foraging flight speeds (Spaethe, Tautz, & Chittka, 2001)), the spatial overlap is stored as a flower visit. The radius of a flower is computed to include its dorsal area and an external boundary threshold. This threshold is incorporated because some insects station themselves outside of a flower while accessing nectar or pollen. Repeat visits to a flower that occur after an intermediate visit to another flower are recorded as flower re-visits. When an insect exits the video frame, a file with data on camera location, time of capture and insect trajectories with flower visitation information is saved for behavioural analysis. The software and recommended tracking parameter values are available with the source code.
3.3 Insect behaviour analysis

Let S and F denote the set of insect species and the set of flowers recorded in the experimental environment respectively. Here, $s^i = \{s^i_1, s^i_2, \ldots, s^i_{|s^i|}\}$ denotes the subset of insects in S that belong to the $i$th species type, and $s^i_j$ is the $j$th insect in $s^i$. $|\cdot|$ is the cardinality of a given set – e.g., $|S|$ is the number of species types, $|s^i|$ is the number of insects belonging to the $i$th species.

• Number of flowers visited by an insect species: The number of flowers visited by an insect species $s^i$ is defined as $FV(s^i)$, where $n_{f,s^i_j}$ is the number of times insect $s^i_j$ of species $s^i$ visited flower $f \in F$:

  $FV(s^i) = \sum_{j=1}^{|s^i|} \sum_{f \in F} n_{f,s^i_j}$   (1)

• Total number of visits to a flower f from species $s^i$: The total number of visits to a flower $f$ from species $s^i$ is defined as $VF(f, s^i)$:

  $VF(f, s^i) = \sum_{j=1}^{|s^i|} n_{f,s^i_j}$   (2)

• Total number of visits to a flower f: The total number of visits to a flower $f$ is defined as $V(f)$:

  $V(f) = \sum_{i=1}^{|S|} \sum_{j=1}^{|s^i|} n_{f,s^i_j}$   (3)

• Number of flowers fertilised with visits from species $s^i$: The number of flowers fertilised with visits from species $s^i$ is defined as $N_{pol}(s^i)$, where $\hat{V}$ is the number of visits required for full fertilisation of a flower.
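Given per-insect visit counts extracted from the saved trajectories, metrics (1)–(3) reduce to simple sums. The sketch below assumes an illustrative nested-dictionary layout, counts[i][j][f] = n_{f,s^i_j}, rather than the project's actual output format.

def FV(counts, i):
    """Eq. (1): visits summed over all insects j of species i and all flowers f."""
    return sum(sum(per_flower.values()) for per_flower in counts[i].values())

def VF(counts, i, f):
    """Eq. (2): visits to flower f from all insects of species i."""
    return sum(per_flower.get(f, 0) for per_flower in counts[i].values())

def V(counts, f):
    """Eq. (3): visits to flower f from all species."""
    return sum(VF(counts, i, f) for i in counts)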
Fig. 2: Implementation of the pollination monitoring system. a, A map of the Sunny Ridge berry farm (the implementation site, near the city of Melbourne, Victoria, Australia) and surrounding areas. Locations of managed honeybee hives are indicated with yellow circles. b, Nine data collection points in strawberry polytunnels. c, Edge computing-based remote video capture units placed over strawberry vegetation. d, A sample image indicating the field of view captured by a monitoring unit. (The white ruler measures 31 cm end-to-end.)
Table 1: Results of the experimental evaluations for the test video dataset. "Detections made" shows the number of insects/flowers detected by the algorithm compared against human observations. "Tracklets generated" shows the total number of tracks generated for each insect variety. "Visible frames" indicates the number of frames in which the insects/flowers were fully visible in the frame. "Evaluation metrics" presents the average precision, recall and F-score values for tracked insects. "Flower visits" compares the total number of insect visits to flowers counted through human observations (Obs.) against those automatically identified by the software for tracked insects. TP = true positive, FP = false positive, FN = false negative.

Insect/      Detections  Tracklets  Visible  Evaluation metrics          Flower visits
flower       made        generated  frames   Precision  Recall  F-score  Obs.  TP   FP   FN
Honeybee     20/20       23*        16846    0.99       0.92    0.95     67    65   0    2‡
Syrphidae    5/6         6*†        3436     1.00       0.71    0.81     5     4    1    1
Lepidoptera  3/4         5*†        3158     0.99       0.71    0.81     6     6    1    0
Vespidae     10/10       10         589      1.00       0.73    0.83     0     0    0    0
Flower       68/72       68         179306   1.00       0.94    0.97     N/A   N/A  N/A  N/A

* Multiple tracks generated by a single insect.
† Insect not detected for tracking.
‡ Resulted from undetected flower(s).
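For reference, the precision, recall and F-score reported in Table 1 follow the standard definitions; a minimal sketch with TP, FP and FN accumulated over the test videos:

def precision(tp, fp):
    return tp / (tp + fp) if tp + fp else 0.0

def recall(tp, fn):
    return tp / (tp + fn) if tp + fn else 0.0

def f_score(tp, fp, fn):
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r) if p + r else 0.0  # harmonic mean of p and r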
Fig. 4: Trajectories of insects and flower positions recorded in test videos. Track colour indicates insect variety. The number of tracks recorded for each insect type is shown in brackets beside the insect type in the legend. Flower locations are circled in yellow.
Strawberry flowers typically require a minimum of four insect visits for full fertilisation (Chagnon et al., 1989; Garibaldi et al., 2020). Therefore, the number of insect visits to a flower can be used to predict its pollination level. We used the collected spatial monitoring data to identify flowers that received at least four insect visits during the biologically relevant data collection period (5 hours) over which our system operated. Analysis results are shown in Fig. 5.
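This query is straightforward given per-flower visit totals; a minimal sketch, assuming a dict mapping flower identifiers to V(f) as in the Section 3.3 sketch:

FULL_FERTILISATION_VISITS = 4  # Chagnon et al. (1989); Garibaldi et al. (2020)

def adequately_visited(total_visits):
    """Return flowers whose visit total met the fertilisation threshold."""
    return {f for f, v in total_visits.items() if v >= FULL_FERTILISATION_VISITS}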
Fig. 5: Results of the spatial monitoring and insect behavioural analysis for precision pollination. Bar charts above the plots indicate the number of tracks, total number of flower visits, and number of flowers recorded at each location. Bar colour for tracks and flower visits indicates the proportion of tracks recorded for each insect type. Strawberry flowers typically require four visits for full fertilisation (Chagnon et al., 1989; Garibaldi et al., 2020). The dark grey portion of the flowers' bar graph shows the number of flowers with four or more insect visits. "T" and "F" in the title blocks are the total number of tracks and flowers recorded at each location. Trajectory plots show all insect tracks recorded at each location throughout the data collection period. Track colours represent different insect varieties. Flower locations are circled in yellow.
Flower-visitation behaviour reflects insects' crop pollination contributions. We quantified this on the strawberry flowers by calculating the percentage of flowers that received visits from each insect type. We further analysed insect-flower visits to evaluate the pollination efficacy of insect types by calculating the proportion of flowers that received the minimum of four insect visits required for fertilisation. Results of this analysis are shown in Fig. 6.

At all data collection points, we recorded a higher number of honeybees than other insects (Fig. 5). These insects contributed the most towards achieving the flower-visitation targets required for fertilisation (Fig. 6). The next most frequently recorded insects were Vespids (341 tracks) (Fig. 5). However, Vespids were rarely observed to be visiting flowers – at location 1 we did identify Vespidae flower visits (see Fig. 6). This suggests that Vespids do not contribute much to strawberry pollination. Indeed, Vespids may be predators of other insects (Spencer, Barton, Ripple, & Newsome, 2020) and can act to inhibit pollination. We recorded relatively low Lepidopteran and Syrphidae counts in most areas of the farm (Fig. 5). The contribution of these species towards achieving the flower-visit targets required for pollination was observed to be much lower than that of honeybees (Fig. 6). This effect is evident in the low relative frequency with which these insects made successive visits to flowers to meet the four required for optimal fertilisation (Fig. 6). For example, the highest frequency of a non-honeybee pollinator meeting four visits was Lepidoptera at location 9, where less than 15% of flowers achieved this level of pollination; whilst at all locations honeybees significantly exceeded this level of pollination performance (Fig. 6). When pollination across all locations is considered, over 68% of the recorded strawberry flowers received the minimum of four insect visits required for fertilisation, and 67% of flowers attained this threshold through honeybee visits alone. These data thus reconfirm which insects seem, at least as far as the number of visits is concerned, to contribute the most towards pollination at the site.

5 Discussion and Future Work

Insect pollination monitoring can improve our understanding of the behaviour of insects on crops. It could therefore potentially boost crop yield on farms were it not currently heavily constrained by the labour required for manual data collection. In this study, a novel multi-point computer vision-based system is presented to facilitate digital spatial monitoring and insect behavioural analysis on large-scale farms. Our system operates in real-world commercial agricultural environments (Fig. 2) to capture videos of insects, identify them (Fig. 3), and count the number of different varieties over large areas (Fig. 5). Analysis of the insect behavioural data allows comparison of the contributions of different insect varieties to crop pollination (Fig. 5 and 6). Here, we discuss the implications of our research for precision pollination.

5.1 Computer vision for insect tracking and behavioural analysis

Our methods remove the major constraints imposed by the limitations of human observers on horticultural pollination monitoring and the collection of high-resolution spatiotemporal data on insect behaviour (Fig. 5). The approach therefore also paves the way for computer vision and edge computing devices to identify insect species for other entomological and ethological applications.

The use of relatively inexpensive Raspberry Pi edge computing devices (Fig. 2) for remote recording provides a high degree of scalability and customisability (Aslanpour et al., 2021; O'Grady, Langton, & O'Hare, 2019) for insect monitoring. However, the limited capabilities of these devices confine the size of recorded study areas (Fig. 2d) and offer only low frame rates and low quality video. This reduced the system's ability to detect small Syrphidae, and resulted in issues with the detection and tracking of fast-moving Vespids (Table 1). In addition, the current implementation continuously recorded videos on the Raspberry Pi even when there was no insect in the camera frame. This wastes the limited storage and power capacities available on edge computing devices. We aim to address this drawback in future work by implementing an in-situ algorithm on the edge computing device for real-time event processing. It is likely that with the rapid improvement of camera technology, video quality and resolution will overcome current limitations and enhance the accuracy and efficiency of our methods.
Fig. 6: Contribution of different insect varieties towards strawberry pollination. The bar chart shows the percentage of flowers visited by each insect type. The dark grey portion shows the percentage of flowers with four or more visits (the number of visits required for strawberry flower fertilisation (Chagnon et al., 1989; Garibaldi et al., 2020)) from each insect type. The red dashed line in the plots shows the total percentage of flowers with four or more visits in a location.
We applied our new methods to monitor insect pollination behaviour in strawberry crops. Strawberry flowers bloom within a narrow vertical spatial range and are usually visible from above (Fig. 2d). By contrast, other crops, such as tomatoes or raspberries, grow within complex three-dimensional structures of vines or canes, making overhead camera tracking of insects problematic. Monitoring their behaviour in such three-dimensional crops will require camera placements at oblique angles.

Insect detection is an essential precursor to tracking and monitoring. Our algorithm accurately detected honeybees and Vespidae but performed relatively poorly on Syrphidae (Table 1). This is because of the relatively small pixel area covered by the insect with our setup (Syrphidae cover ≈40 ± 10 pixels compared to ≈1001 ± 475 pixels for a honeybee) (Fig. 3). Future improvements in cameras and object detection technologies (Stojnić et al., 2021) will help here.

Our algorithm uses deep learning to detect and classify insects. The results of experimental evaluation showed limitations in Lepidopteran detection and in detection of visually similar insects (i.e. honeybees, Syrphidae and Vespidae; Fig. 3 and Table 1). Detection of Lepidopterans was challenging because they sometimes appear similar in shape to foliage and shadows in the environment. Also, they rested stationary on flowers for extended periods, prompting the algorithm to classify them as part of the background. Detection and classification of visually similar insects requires a deep learning model trained with large annotated datasets. For the current study, we built a dataset from scratch in the absence of suitable open annotated datasets for entomology (Høye et al., 2021). However, our dataset was unbalanced, since the number of instances in each class was influenced by the relative abundance of insects recorded at the site (Wang et al., 2016). We propose that future research should use characteristics of insect behaviour, such as spatial signatures of insect movement, to improve species classification tasks (Kirkeby et al., 2021). This will help overcome limitations associated with camera quality and deep learning datasets. The video data we publish with this article offers a starting point for such solutions.
5.2 Spatial monitoring for precision pollination

Spatial monitoring and insect behavioural analysis can help growers understand the distribution of pollinators across a farm and their impact on pollination. We quantified pollination by counting insect numbers and insect-flower interactions (Fig. 5). Farm areas with many flowers and insects will likely yield the most crop if there are a suitable number of insect-flower interactions. Strawberry flowers require at least four insect visits for full fertilisation (Chagnon et al., 1989; Garibaldi et al., 2020). However, it is important to note that crop yield and visitation rates have been observed to have a non-linear relationship (Garibaldi et al., 2020), where higher flower visitation rates can result in lower crop yield (Garibaldi et al., 2020; Rollin & Garibaldi, 2019). Therefore, it is beneficial to maintain insect flower visits at an optimum value that depends on the crop type, pollinator species, and environmental conditions (Garibaldi et al., 2020).
Although different behaviours and morphologies make some insect species more effective pollinators of some flowers than others, we compared the contribution of different insect varieties to strawberry pollination using the number of insect flower visits as a proxy (Fig. 6). The analysis suggests that strawberries can obtain sufficient pollination solely from honeybees (Fig. 6), even without the presence of other insects. However, an agricultural system driven by a single pollinator type may not be desirable. Pollinator diversity and associated high flower-visitor richness have been shown to affect pollination and crop yield (Garibaldi et al., 2016). Often the high abundance of a single pollinator species cannot be used as a substitute for species richness (Fijen et al., 2018; Garibaldi et al., 2016), as variations in behaviour and foraging inherent to different insect species may be important.

Compared to manual pollination monitoring, our methods provide high-resolution behavioural data classified by insect type. Our spatial monitoring results (Fig. 5) can assist farm managers to identify farm areas that require immediate attention in order to maximise fruit set. Furthermore, the behavioural pollination contribution analysis (Fig. 6) can provide tools and data to identify efficient pollinator species for a particular crop, enabling data-driven pollination management.

Pollination monitoring also helps us understand the impact of climate change and other anthropogenic activities on insect populations (Settele, Bishop, & Potts, 2016). Recently, climate change and other anthropogenic pressures, including intensive agriculture, have caused a decline in some pollinator populations (Hallmann et al., 2017; Outhwaite, McCann, & Newbold, 2022; Schweiger et al., 2010; Vanbergen & Initiative, 2013), threatening global food security and terrestrial ecosystem health. The most impacted pollinator populations are native and wild insects that must compete for food with managed pollinators while coping with disease, pollution and habitat loss (Wood et al., 2020). Digital pollination monitoring systems like that described here provide much-needed data for understanding the impacts of climate change on insect biodiversity, and can ultimately provide a sound basis for conservation.

6 Conclusions

In this paper, we presented a computer vision facilitated system for spatial monitoring and insect behavioural analysis to underpin agricultural precision pollination. Our system comprised edge computing-based remote video capture; offline, automated, unmarked multi-species insect tracking; and insect behavioural analysis. The system tracked four insect types with F-scores above 0.8 when implemented on a commercial strawberry farm. Analysis of the spatial distribution of flower-visiting behaviour of different insect varieties across the farm allowed for the inference of flower fertilisation and the comparison of insects' pollination contributions. We determined that 67% of flowers met or exceeded the specified criteria for reliable pollination through honeybee visits; however, alternative pollinators were less effective at our study site. This advancement of computer vision, spatial monitoring and insect behavioural analysis provides pollinator data to growers much more rapidly, broadly and deeply than manual observation. Such rich sources of insect-flower interaction data potentially enable precision pollination and pollinator management for large-scale commercial agriculture.
Acknowledgments. The authors would like to thank Sunny Ridge Australia for the opportunity to conduct research at their farm.

Declarations

• Funding: Authors were supported by the Australian Research Council Discovery Projects grant DP160100161 and a Monash-Bosch AgTech Launchpad primer grant. This study was funded by AgriFutures grant PRJ-012993. Amarathunga is supported by ARC Research Hub IH180100002.
• Competing interests: The authors have no competing interests to declare that are relevant to the content of this article.
• Ethics approval: Not applicable.
• Consent to participate: Not applicable.
• Consent for publication: Not applicable.
• Availability of data and materials: The datasets generated and/or analysed during the current study are available here. Data will be uploaded to a permanent repository and a DOI will be provided upon the acceptance of the manuscript.
• Code availability: Code is available through github.com/malikaratnayake/Polytrack2.0
• Authors' contributions: Conceptualization: Malika Nisal Ratnayake, Adrian G. Dyer, Alan Dorin; Data curation: Malika Nisal Ratnayake; Formal analysis: Malika Nisal Ratnayake; Funding acquisition: Adrian G. Dyer, Alan Dorin; Investigation: Malika Nisal Ratnayake, Don Chathurika Amarathunga, Asaduz Zaman; Methodology: Malika Nisal Ratnayake, Adrian G. Dyer, Alan Dorin; Project administration: Adrian G. Dyer, Alan Dorin; Resources: Adrian G. Dyer, Alan Dorin; Software: Malika Nisal Ratnayake; Supervision: Adrian G. Dyer, Alan Dorin; Validation: Malika Nisal Ratnayake, Don Chathurika Amarathunga; Writing – original draft: Malika Nisal Ratnayake; Writing – review & editing: Malika Nisal Ratnayake, Don Chathurika Amarathunga, Asaduz Zaman, Adrian G. Dyer, Alan Dorin.

References

Abadi, M., Barham, P., Chen, J., Chen, Z., Davis, A., Dean, J., ... Zheng, X. (2016). TensorFlow: A system for large-scale machine learning. Proceedings of the 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 2016) (pp. 265–283).

Abdel-Raziq, H.M., Palmer, D.M., Koenig, P.A., Molnar, A.C., Petersen, K.H. (2021). System design for inferring colony-level pollination activity through miniature bee-mounted sensors. Scientific Reports, 11(1), 1–12.

Afonso, M., Fonteijn, H., Fiorentin, F.S., Lensink, D., Mooij, M., Faber, N., ... Wehrens, R. (2020). Tomato fruit detection and counting in greenhouses using deep learning. Frontiers in Plant Science, 11, 1759.

Aizen, M.A., Garibaldi, L.A., Cunningham, S.A., Klein, A.M. (2009). How much does agriculture depend on pollinators? Lessons from long-term trends in crop production. Annals of Botany, 103(9), 1579–1588.

Amarathunga, D.C.K., Grundy, J., Parry, H., Dorin, A. (2021). Methods of insect image capture and classification: A systematic literature review. Smart Agricultural Technology, 100023.

Aslanpour, M.S., Toosi, A.N., Cicconetti, C., Javadi, B., Sbarski, P., Taibi, D., ... Dustdar, S. (2021). Serverless edge computing: Vision and challenges. 2021 Australasian Computer Science Week Multiconference (pp. 1–10).
Garibaldi, L.A., Sáez, A., Aizen, M.A., Fijen, T., Bartomeus, I. (2020). Crop pollination management needs flower-visitor monitoring and target values. Journal of Applied Ecology, 57(4), 664–670.

Goscinski, W.J., McIntosh, P., Felzmann, U.C., Maksimenko, A., Hall, C.J., Gureyev, T., ... others (2014). The multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) high performance computing infrastructure: Applications in neuroscience and neuroinformatics research. Frontiers in Neuroinformatics, 8, 30.

Haalck, L., Mangan, M., Webb, B., Risse, B. (2020). Towards image-based animal tracking in natural environments using a freely moving camera. Journal of Neuroscience Methods, 330, 108455.

Hall, M.A., Jones, J., Rocchetti, M., Wright, D., Rader, R. (2020). Bee visitation and fruit quality in berries under protected cropping vary along the length of polytunnels. Journal of Economic Entomology, 113(3), 1337–1346.

Hallmann, C.A., Sorg, M., Jongejans, E., Siepel, H., Hofland, N., Schwan, H., ... others (2017). More than 75 percent decline over 27 years in total flying insect biomass in protected areas. PLoS ONE, 12(10), e0185809.

Howard, S.R., Nisal Ratnayake, M., Dyer, A.G., Garcia, J.E., Dorin, A. (2021). Towards precision apiculture: Traditional and technological insect monitoring methods in strawberry and raspberry crop polytunnels tell different pollination stories. PLoS ONE, 16(5), e0251572.

Høye, T.T., Ärje, J., Bjerge, K., Hansen, O.L.P., Iosifidis, A., Leese, F., ... Raitoharju, J. (2021). Deep learning and computer vision will transform entomology. Proceedings of the National Academy of Sciences, 118(2).

Jolles, J.W. (2021). Broad-scale applications of the Raspberry Pi: A review and guide for biologists. Methods in Ecology and Evolution, 12(9), 1562–1579.

Kamilaris, A., & Prenafeta-Boldú, F.X. (2018). Deep learning in agriculture: A survey. Computers and Electronics in Agriculture, 147, 70–90.

Kevan, P.G. (1975). Sun-tracking solar furnaces in high arctic flowers: Significance for pollination and insects. Science, 189(4204), 723–726.

Kirkeby, C., Rydhmer, K., Cook, S.M., Strand, A., Torrance, M.T., Swain, J.L., ... others (2021). Advances in automatic identification of flying insects using optical sensors and machine learning. Scientific Reports, 11(1), 1–8.

Koirala, A., Walsh, K.B., Wang, Z., McCarthy, C. (2019). Deep learning – method overview and review of use for fruit detection and yield estimation. Computers and Electronics in Agriculture, 162, 219–234.

Kuhn, H.W. (1955). The Hungarian method for the assignment problem. Naval Research Logistics Quarterly, 2(1-2), 83–97.

Lu, H., Fu, X., Liu, C., Li, L.-g., He, Y.-x., Li, N.-w. (2017). Cultivated land information extraction in UAV imagery based on deep convolutional neural network and transfer learning. Journal of Mountain Science, 14(4), 731–741.

Magnier, B., Gabbay, E., Bougamale, F., Moradi, B., Pfister, F., Slangen, P. (2019). Multiple honey bees tracking and trajectory modeling. Multimodal Sensing: Technologies and Applications (Vol. 11059, p. 110590Z).

Ratnayake, M.N., Dyer, A.G., Dorin, A. (2021a). Towards computer vision and deep learning facilitated pollination monitoring for agriculture. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops.

Ratnayake, M.N., Dyer, A.G., Dorin, A. (2021b). Tracking individual honeybees among wildflower clusters with computer vision-facilitated pollinator monitoring. PLoS ONE, 16(2), e0239504.

Spaethe, J., Tautz, J., Chittka, L. (2001). Visual constraints in foraging bumblebees: Flower size and color affect search time and flight behavior. Proceedings of the National Academy of Sciences, 98(7), 3898–3903.
Spencer, E.E., Barton, P.S., Ripple, W.J., Newsome, T.M. (2020). Invasive European wasps alter scavenging dynamics around carrion. Food Webs, 24, e00144.

Stojnić, V., Risojević, V., Muštra, M., Jovanović, V., Filipi, J., Kezić, N., Babić, Z. (2021). A method for detection of small moving objects in UAV videos. Remote Sensing, 13(4), 653.

Wang, S., Liu, W., Wu, J., Cao, L., Meng, Q., Kennedy, P.J. (2016). Training deep neural networks on imbalanced data sets. 2016 International Joint Conference on Neural Networks (IJCNN) (pp. 4368–4374).

Yang, C., Collins, J., Beckerleg, M. (2018). A model for pollen measurement using video monitoring of honey bees. Sensing and Imaging, 19(1), 1–29.

Zivkovic, Z., & Van Der Heijden, F. (2006). Efficient adaptive density estimation per image pixel for the task of background subtraction. Pattern Recognition Letters, 27(7), 773–780.