A R T I C L E   I N F O

Keywords:
Poultry production
Animal welfare
Computer vision
Deep learning

A B S T R A C T

As global demands on poultry production and welfare both intensify, precision poultry farming technologies such as computer vision-based cybernetics systems are becoming important in addressing current issues related to animal welfare and production efficiency. The integration of computer vision technology has become a catalyst for transformative change in precision farming, particularly concerning productivity and welfare. This review delineates the central role of computer vision in precision poultry farming, focusing on its applications in non-contact monitoring methods that employ advanced sensors and cameras to enhance farm biosecurity and bird observation without disturbance. We delve into multifaceted advancements such as the use of convolutional neural networks (CNNs) for behavior analysis and health monitoring, evidenced by high-accuracy egg sorting and the identification of health concerns within target-dense farm environments. The review underscores advancements in precision agriculture, including accurate egg weight estimation and egg classification within cage-free systems, paralleling the poultry sector's evolution towards more ethical farming practices. Moreover, it addresses progress in poultry growth monitoring and examines case studies of commercial farms, showcasing how these innovations are being applied in practice to enhance productivity and animal welfare. Challenges remain, particularly in terms of environmental variability and data annotation for deep learning models. Nevertheless, the review emphasizes the scope for future innovations like voice-controlled robotics and virtual reality applications, which have the potential to lift poultry farming to new levels of efficiency, humanity, and sustainability. These insights assert that continued exploration and development of computer vision technologies are not only instrumental for the poultry sector but also offer a blueprint for agricultural enhancement at large.
* Corresponding author.
E-mail address: [email protected] (L. Chai).
https://doi.org/10.1016/j.compag.2024.109339
Received 3 June 2024; Received in revised form 2 August 2024; Accepted 8 August 2024
Available online 17 August 2024
0168-1699/© 2024 Elsevier B.V. All rights are reserved, including those for text and data mining, AI training, and similar technologies.
X. Yang et al. Computers and Electronics in Agriculture 225 (2024) 109339
varying flock density, complicate the capture of clear visual data, which is essential for detailed behavior analysis. The advent of deep learning, particularly the use of Convolutional Neural Networks (CNNs), represents a considerable leap forward in overcoming these issues (Bist et al., 2023b, 2024; Guo et al., 2023). These models have equipped farmers with the tools to delve into the nuances of animal behavior and health, evidenced by high-accuracy applications ranging from sorting eggs to identifying sick birds in crowded environments using algorithms like You Only Look Once (YOLO) (Jocher, 2020; Ma et al., 2020). Additionally, the field of poultry farming has seen technological advancements in the evaluation of egg weight, a key quality and value determinant, through automated measurement systems that employ a range of techniques from Artificial Neural Networks (ANN) to Support Vector Machines (SVM) (Amraei et al., 2017; Pacure Angelia et al., 2022). This innovation offers a substantial upgrade over manual weighing methods, streamlining the process with improved efficiency and precision (Yang et al., 2023c). The integration of deep learning with machine learning regression techniques has been particularly significant for the comprehensive classification and weighing of eggs, including those from cage-free systems, a change aligned with the sector's move from traditional caging to more ethical farming practices. The shift toward cage-free egg production has necessitated adaptable computer vision systems capable of handling a wide array of egg types, from floor eggs to those destined for commercial distribution (Bist et al., 2022, 2023a). Such advancements underscore the necessity for computer vision systems that can accurately classify and weigh eggs, ensuring uniform quality for both producers and consumers (Mertens et al., 2005).

Significant progress has been made in other aspects of poultry farming as well, such as growth monitoring, health disorder detection, and body weight prediction (Bist et al., 2023a; Yang et al., 2024). Neethirajan (2022) proposed a novel methodology centering on the locomotive behaviors of poultry. By integrating sophisticated tracking algorithms, notably the Kalman filter, their system was capable of projecting growth trajectories from the birds' activity levels (Neethirajan, 2022). Angelia et al. (2021) explored egg classification techniques. Employing a region-based convolutional neural network, they succeeded in classifying eggs with a remarkable 93.3 % accuracy, demonstrating the efficacy of image processing technologies in determining egg grades such as Grade A, B, C, and Inedible (Pacure Angelia et al., 2022). Lamping et al. (2022) developed ChickenNet, an innovative framework designed to evaluate the plumage condition of laying hens. This system, an extension of the Mask Region-Based Convolutional Neural Network (Mask R-CNN) model, underwent testing at various image resolutions, resulting in a mean average precision (mAP) of 98.02 % in identifying hens and a 91.83 % accuracy rate in predicting the status of their plumage (Lamping et al., 2022). Advanced computer vision techniques have shown promising results in improving precision, reducing labor-intensive processes, and enhancing overall farm efficiency. The use of sophisticated algorithms and multimodal systems, incorporating different sensors and data types, further amplifies the potential of these technologies (Astill et al., 2020; Li et al., 2023a). However, the journey is not without its hurdles. Real farm applications still face challenges such as environmental variability and the need for vast, labeled datasets for deep learning models (Andriyanov et al., 2021; Joffe and Usher, 2017). Future directions in this field appear promising, with potential advancements like voice-controlled robotics and virtual reality integration (Zang et al., 2011; Kanash et al., 2021). The amalgamation of computer vision with these cutting-edge technologies could further revolutionize poultry farming, making it more efficient, welfare-friendly, and sustainable by offering a more intelligent system to manage poultry production. The ongoing research and development in this domain are not only crucial for the poultry sector but also serve as a blueprint for other sectors in agriculture, demonstrating the vast potential of computer vision and artificial intelligence (AI) in enhancing productivity and welfare (Franzo et al., 2023; Zhang et al., 2023e).

An extensive review of literature spanning from January 2013 to October 2023 reveals a significant body of research, 246 papers retrieved by searching the core keywords Computer Vision and Poultry, dedicated to computer vision's transformative impact on poultry management. Using the "AND" function in the search bar of scientific databases helps refine results to papers that use poultry as experimental animals and employ image-based vision as a method. This research collectively emphasizes the potential of computer vision to enhance and streamline poultry management, historically dependent on manual processes. The dedication to precision and efficiency is evident throughout these works, showcasing the dynamic capabilities of computer vision technologies. These research works foretell the emergence of a new era in farming focused on sustainability, efficiency, and improved welfare, led by advances in computer vision.

2. Computer vision in poultry farming: A decade of publication trends

Within the poultry sector, advancements in technology have ushered in a new era of research possibilities. Computer vision, powered by deep learning and neural networks, is fast becoming a game-changer. To gauge the depth of this integration, established databases such as PubHub and Web of Science (aligned with search results from Scopus and PubMed) were scoured, focusing on terms synonymous with both computer vision ("deep learning", "neural networks", "image processing", "image recognition") and poultry studies ("chickens", "avian", "layers", "broilers").

2.1. Yearly publication analysis

When searching with the core keywords Computer Vision and Poultry, there were 246 papers published from January 2013 to October 2023, averaging 23 papers annually. As shown in Fig. 1, 2022 had the peak annual publication rate with 64 papers, and 2019 had the fastest growth rate at 127.27 %. This suggests that research in this field is undergoing rapid development and is in a swift ascending phase. The surge in publications indicates a growing interest and significant advancements in merging computational technologies such as machine learning within avian studies. This upward trajectory may be attributed to the realization of the potential impacts of applying AI techniques to poultry research, such as improved poultry health monitoring, better disease detection, and enhanced production efficiency. The trend also highlights a collaborative effort between the tech sector and poultry science, bringing forth interdisciplinary solutions. For researchers and stakeholders, this trend underscores the importance of investing in this intersection of technologies, as it promises to redefine the future of poultry management and production.

2.2. Journals at the forefront

The top 30 journals by publication volume are shown in Fig. 2. The journal with the most publications on this topic is "Animals" with 20 articles; "Poultry Science" ranks second with 14 articles, and "Sensors" is third with 10 articles. Navigating the evolving landscape of computer vision as it intersects with poultry research can be significantly enhanced by closely analyzing leading journals in the field. By pinpointing and routinely consulting authoritative journals such as "Animals", "Poultry Science", and "Sensors", researchers can stay abreast of the most recent and impactful findings. These publications often serve as reservoirs of quality information, given their rigorous peer-review processes. Beyond the immediate academic content, these journals can spotlight emerging technological trends, novel methodologies, and innovative applications specific to the realm of poultry. This is particularly vital for those who aim to integrate advanced computer vision techniques into poultry research and management.
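To make the retrieval step concrete, the "AND"-style database query described at the start of this section can be sketched as a simple predicate over bibliographic records. The records and field names below are illustrative stand-ins, not entries from the actual 246-paper corpus:

```python
def search_and(records, *terms):
    """Keep only records whose title or keyword list contains every term,
    mimicking an "AND" query in a bibliographic database search bar."""
    needles = [t.lower() for t in terms]
    hits = []
    for rec in records:
        haystack = " ".join([rec["title"]] + rec["keywords"]).lower()
        if all(n in haystack for n in needles):
            hits.append(rec["title"])
    return hits

# Toy stand-in records; the actual corpus query combined
# "computer vision" AND "poultry" across full database indices.
records = [
    {"title": "Computer vision for broiler welfare", "keywords": ["poultry"]},
    {"title": "Soil moisture sensing with UAVs", "keywords": ["agriculture"]},
    {"title": "Poultry vaccination schedules", "keywords": ["health"]},
]
hits = search_and(records, "vision", "poultry")
```

Only the first record satisfies both terms, which is exactly the narrowing effect the "AND" operator has on database results.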
Fig. 1. Annual publication trend of literature related to computer vision and poultry from January 2013 to October 2023.
Fig. 2. Journal publication analysis from January 2013 to October 2023 on computer vision and poultry.
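The year-over-year growth statistic behind Fig. 1 is straightforward to reproduce. The annual counts below are assumed for illustration (the review does not list them), except that a rise from 11 to 25 papers is one pair consistent with the 127.27 % growth reported for 2019, and 64 papers is the stated 2022 peak:

```python
def yoy_growth(annual_counts):
    """Year-over-year percentage growth from a mapping of year -> paper count."""
    years = sorted(annual_counts)
    return {
        year: round(100 * (annual_counts[year] - annual_counts[prev])
                    / annual_counts[prev], 2)
        for prev, year in zip(years, years[1:])
        if annual_counts[prev]  # guard against division by an empty year
    }

# Hypothetical counts; only the 2022 peak (64) and the 127.27 % figure
# for 2019 are anchored in the review's bibliometrics.
counts = {2018: 11, 2019: 25, 2020: 33, 2021: 40, 2022: 64}
growth = yoy_growth(counts)
```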
2.3. Global contributions: Country-wise analysis

The top 54 countries in terms of publication volume in the research fields of computer vision and poultry are shown in Fig. 3. The country with the highest publication volume in this area is the United States of America with 75 papers (30.49 %), followed by China with 58 papers (23.58 %), and the United Kingdom ranking third with 22 papers (8.94 %). This distribution provides a roadmap for researchers and institutions. Engaging with the leading countries can open doors for international collaborations, knowledge exchange, and access to more extensive datasets and resources. It is crucial for individuals and institutions to understand these global research dynamics to effectively position themselves in this evolving landscape, seek partnerships, and stay updated with the latest methodologies and findings from these leading nations.

2.4. Keyword trends: Evolution of focus topics

Keywords in a paper are a concise summary and encapsulation of the research objectives, subjects, and methods. Analysis based on keywords can reflect the evolution of themes and research hotspots in a specific field over a certain period. Using computer vision and poultry as search keywords for the timeframe from January 2013 to October 2023, as shown in Fig. 4, the top five keywords in terms of frequency are:
Fig. 3. Analysis of research countries from January 2013 to October 2023 on computer vision and poultry.
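The country shares shown in Fig. 3 and quoted in Section 2.3 follow directly from the per-country counts and the 246-paper total; a minimal check:

```python
def shares(counts, total):
    """Percentage share of the corpus held by each country, to two decimals."""
    return {country: round(100 * n / total, 2) for country, n in counts.items()}

# Per-country counts stated in the review for the 246-paper corpus.
pct = shares({"USA": 75, "China": 58, "UK": 22}, total=246)
```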
Fig. 4. Keyword frequency analysis of computer vision and poultry from January 2013 to October 2023 (This word cloud was generated through bibliometric
analysis to count the frequency of each word in the dataset. Words that appear more frequently are displayed in larger fonts).
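Word clouds like Fig. 4 reduce to a frequency count over author keywords. The toy records below are invented for illustration; only the leading terms echo the top five reported in the review:

```python
from collections import Counter

def top_keywords(papers, n=5):
    """Rank author keywords by raw frequency across a list of keyword lists."""
    freq = Counter(kw.lower() for kws in papers for kw in kws)
    return freq.most_common(n)

# Invented toy records standing in for the 246-paper corpus.
papers = [
    ["machine learning", "poultry", "avian influenza"],
    ["machine learning", "random forest"],
    ["poultry", "machine learning", "salmonella"],
]
ranking = top_keywords(papers, n=3)
```

In a word-cloud rendering, the count attached to each term would set its font size.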
machine learning, poultry, avian influenza, random forest, and salmonella. Dividing the timeframe from January 2013 to October 2023 into four periods, as shown in Fig. 5, reveals the popularity rankings and ranking changes of keyword frequency related to computer vision and poultry. Over the span of a decade, from 2013 to 2023, the persistent prominence of machine learning stands out, as it emerges as a pivotal topic across all four distinct time periods. This underscores its enduring relevance and crucial role across domains. Delving into the evolution of computer vision topics, the early phase from 2013 to 2015 was marked by the presence of concepts such as "hyperspectral imaging" and "information gain". Progressing to 2016–2018, there was a notable introduction of terms like "random forest" and "2D, 3-dimensional". As we transitioned into the 2019–2021 period, the field exhibited a pronounced tilt towards advanced methodologies, prominently featuring "convolutional neural networks". This momentum carried forward into 2022–2023, where "convolutional neural networks" remained at the forefront, complemented by the emergence of "big data". In poultry science, the evolution of these keywords reflects the sector's intricate response to emerging challenges and opportunities. The prominence of terms like "avian influenza" underscores the ongoing efforts in disease management and prevention, while the emergence of "animal welfare" indicates a growing emphasis on ethological research and ensuring optimal living conditions for poultry. Moreover, the keyword "Salmonella" brings to the forefront issues of food safety, highlighting the critical need to monitor and control bacterial pathogens that can affect both animal and human health. This progression mirrors the sector's dedication to leveraging technology for holistic advancements in poultry health, production, and welfare.

3. Delving into the core applications and implications of computer vision in poultry farming

3.1. Fundamentals of computer vision in poultry management

Computer vision, rooted in interpreting visual data similarly to human vision, has seen a surge in applications, notably in animal farming. With the growing demand for poultry products due to a rising global population, the sector is pushed to maintain quality care for increasing numbers of animals (Li et al., 2021a). Traditional methods sometimes fail to detect early signs of abnormalities in animals, which may affect their health and productivity. To address this, computer vision technologies, particularly CNNs, provide objective and real-time monitoring tools (Fernandes et al., 2020). Emerging digital image acquisition technologies have enabled areas like digital image processing and image analysis, which play crucial roles in interpreting visual data. Modern sensors, such as infrared cameras and hyperspectral
Fig. 5. Analysis of popularity rankings and ranking changes for computer vision and poultry across different time periods from January 2013 to October 2023.
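The period-over-period ranking shifts visualized in Fig. 5 can be computed by ranking keywords within each period and differencing the ranks. The per-period counts below are hypothetical; only the rise of "convolutional neural networks" between 2016–2018 and 2019–2021 mirrors the trend described in the text:

```python
def rank_changes(freq_a, freq_b):
    """Rank keywords by frequency in two periods and report rank shifts;
    a positive value means the keyword rose between period A and period B."""
    def ranks(freq):
        ordered = sorted(freq, key=freq.get, reverse=True)
        return {kw: pos + 1 for pos, kw in enumerate(ordered)}
    ra, rb = ranks(freq_a), ranks(freq_b)
    return {kw: ra[kw] - rb[kw] for kw in ra if kw in rb}

# Hypothetical per-period keyword counts for illustration only.
period_2016_2018 = {"machine learning": 12, "random forest": 9,
                    "convolutional neural networks": 3}
period_2019_2021 = {"machine learning": 30,
                    "convolutional neural networks": 22, "random forest": 8}
shift = rank_changes(period_2016_2018, period_2019_2021)
```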
imaging tools, enable diverse applications in poultry farming, including behavior monitoring and body weight measurement (Li et al., 2020; Olejnik et al., 2022). Cameras and sensors, the foundational components of computer vision, offer a holistic view of an environment when their data are fused. Red, green, and blue wavelength (RGB) cameras, known for high-resolution images, capture the visual spectrum (Brenner et al., 2023). Thermal infrared cameras provide insights into heat patterns, and depth sensors offer spatial information by combining with RGB data (Feng et al., 2021). However, challenges like the photogrammetric co-processing of thermal infrared and RGB images make calibrated systems and advanced algorithms indispensable for accurate data interpretation (Dlesk et al., 2022). In poultry farming, obtaining clear visual data poses a challenge. Factors like dust, varying light conditions, and bird movement introduce noise and imperfections into raw images (Zhang and Zhou, 2023; Zhang et al., 2023e). Adaptive image noise removal tools, equipped with classification capabilities, ensure data remain free from visual degradation (Chen et al., 2020a). Deep learning techniques further address challenges such as blur, shadows, and poor lighting (Anvari and Athitsos, 2022). Therefore, specialized image processing techniques that cater to the unique challenges in poultry environments, like bird movement and dust, are essential for preparing data for further analysis. Feature extraction in poultry imaging involves extracting relevant features, like bird size or health indicators. Techniques like vision transformers and the segment anything model have facilitated bird detection (Dosovitskiy et al., 2021; Jamil et al., 2022; Yang et al., 2023c). Moreover, similarity search concepts have been
valuable in high-resolution imaging, making multi-modal imaging frameworks crucial for efficient feature extraction (Somnath et al., 2018). Computer vision's integration with broader management systems has been paramount in sectors like livestock and transportation. These systems enable real-time and accurate data acquisition, leading to predictive modeling for precise decisions. In poultry management, computer vision offers myriad benefits. It improves efficiency in monitoring livestock, facilitating real-time egg quality monitoring, disease detection, and growth pattern evaluations (Dorea et al., 2020; Kumar et al., 2023). Applications even extend to monitoring chicken behavior, showcasing the potential of recognizing and classifying behaviors for livestock well-being. Moreover, decision-making is enhanced; mobile health apps equipped with computer vision offer non-invasive assessments of superficial wounds (e.g., inflammation or tissue damage can be used to determine the severity of feather damage), improving poultry health care (Zhang et al., 2023d). In sum, computer vision's integration in poultry management promotes better health monitoring, minimizes manual labor, and encourages data-driven decisions, thus enhancing overall poultry farming efficiency (Zheng et al., 2021; Abraham et al., 2021). Fig. 6 below provides a succinct flowchart that illustrates the end-to-end integration of computer vision into poultry management, capturing each pivotal step from image acquisition to data-driven decision making.

3.2. Egg quality assessment

In the modern poultry and egg sector, ensuring the quality of eggs is not just a matter of meeting consumer expectations but also a testament to the advancements in technology and research. As the global demand for eggs continues to rise, the sector faces the challenge of maintaining quality while scaling up production. Traditional methods of quality assurance, often manual and time-consuming, are increasingly being replaced by automated systems. Among these, computer vision, when synergized with machine learning, has emerged as a frontrunner in revolutionizing egg quality assurance. Okinda et al. (2020) delved into the realm of volume estimation, introducing a depth image-based system specifically for chicken eggs. Their approach ingeniously tackled challenges like varying ambient light conditions and the potential occlusion of eggs, achieving remarkable accuracy (R2 of 0.984) in volume estimation (Okinda et al., 2020b). Another study took on the task of recognizing cracks on eggshells, a task made challenging by natural dark spots on the egg surface. Their method used a negative Laplacian of Gaussian (LoG) operator to enhance crack visibility, achieving an impressive 92.5 % recognition rate, which could significantly reduce the risk of selling damaged eggs (Guanjun et al., 2019). The weight of an egg can be a direct indicator of its quality (Schwagele, 2011). Recognizing this, Cen et al. (2006) embarked on developing a machine vision system specifically for this purpose. Their system, which employed image segmentation based on RGB intensity, showed a strong correlation between predicted and actual weights, indicating its reliability (Cen et al., 2006). Angelia et al. (2021) ventured into the domain of egg classification. Using a region-based convolutional neural network, they achieved a 93.3 % accuracy rate, showcasing the potential of image processing in egg grade (Grade A, B, C, Inedible) determination (Pacure Angelia et al., 2022). Sex determination in breeder eggs has always been a topic of interest. A comprehensive review in this area highlighted the potential of non-invasive methods, discussing cutting-edge techniques like Raman spectroscopy and computer colorimetric setups, which could revolutionize hatchery practices (Aleynikov, 2022). Another noteworthy development was an automated system for egg grading. This system, capable of identifying, counting, and classifying eggs, achieved a staggering 98 % accuracy for individual classifications based on a two-stage model (real-time multitask detection (RTMDet) and random forest networks) (Yang et al., 2023a). Other pioneering studies in the egg quality field focused on diverse areas such as eggshell quality assessment (Pan et al., 2011), egg freshness determination (Qi et al., 2020), yolk color analysis (Ma et al., 2017), contamination detection, size and shape analysis (Nasir et al., 2018), surface defect detection (Mota-Grajales et al., 2019), and internal quality assessment (like blood spots) (Arivazhagan et al., 2013). While the aforementioned studies have made significant strides in egg quality assurance using computer vision, there is still room for growth. One avenue for exploration is the integration of real-time
Fig. 6. A flowchart that illustrates the end-to-end process of computer vision-based cybernetics system in poultry management.
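The crack-enhancement idea attributed to Guanjun et al. (2019) in Section 3.2 can be illustrated with the Laplacian core of the LoG operator. This sketch is a simplification: it omits the Gaussian pre-smoothing (the "G" in LoG) that a real pipeline would apply to suppress sensor noise, and the eggshell image is synthetic:

```python
import numpy as np

def neg_laplacian(img):
    """3x3 negative Laplacian (4 * center minus the four direct neighbors).
    Thin dark lines such as hairline cracks yield a strong-magnitude
    response that a simple threshold can localize."""
    h, w = img.shape
    p = np.pad(img, 1, mode="edge")  # replicate borders to keep the shape
    return (4 * p[1:h + 1, 1:w + 1]
            - p[0:h, 1:w + 1] - p[2:h + 2, 1:w + 1]
            - p[1:h + 1, 0:w] - p[1:h + 1, 2:w + 2])

# Synthetic shell: uniform bright surface with a one-pixel dark "crack"
# running down column 3; the response magnitude peaks on that column.
shell = np.full((7, 7), 200.0)
shell[:, 3] = 80.0
response = neg_laplacian(shell)
```

In practice, Gaussian smoothing before the Laplacian is what makes the operator robust to the speckle and dark spots that plague raw eggshell images.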
monitoring systems. These systems, capable of instantly processing and providing feedback on the data they receive, could revolutionize poultry farms by allowing instantaneous quality checks. As eggs are produced, any issues such as cracks and stains could be immediately identified and addressed, ensuring optimal quality. However, the challenges of poultry environments, characterized by varying levels of light, temperature, and humidity, necessitate the development of systems robust enough to operate under these varying environmental conditions (Bist et al., 2022, 2023a, 2023c). Lighting is pivotal for computer vision; different lighting conditions can affect image quality and analysis accuracy. As we advance, reducing false positives (instances where a system mistakenly flags a good-quality egg as subpar) becomes paramount. This ensures that quality eggs are not wrongfully discarded, preventing unnecessary wastage. The future also beckons the exploration of multi-modal systems, which amalgamate traditional computer vision techniques with other types of data input. For instance, combining computer vision with other sensors like infrared (Zhang et al., 2023c), which can detect temperature variations indicating egg freshness, or ultrasonic sensors (Mocanu et al., 2016), adept at identifying minuscule, otherwise invisible eggshell cracks, can offer a more comprehensive egg quality assessment. Such holistic evaluations ensure that only the best eggs make their way to consumers (Fig. 7).

3.3. Broiler growth monitoring through computer vision

In the rapidly evolving poultry sector, monitoring the growth of meat chickens (broilers) has become paramount. As the global poultry market expands, optimizing growth rate becomes imperative while ensuring the health and well-being of the broilers. Traditional methods, which often rely on manual measurements and visual checks, are gradually being overshadowed by the precision and efficiency of computer vision, especially when augmented by machine learning techniques. A groundbreaking study delved deep into the potential of computer vision for monitoring poultry growth. By addressing challenges like bird movement, the authors introduced a depth image-based system. Their method, which adeptly navigated these challenges, achieved real-time weight estimation, ensuring consistent growth rates and early detection of growth anomalies (Mortensen et al., 2016). Furthering the discourse, Aydin et al. (2010) concentrated on poultry posture and activity levels as indicators of health and growth. Using sophisticated image processing techniques like the CNN, they discerned between postures of healthy and potentially ailing birds. Their system, which melded edge detection with pattern recognition, demonstrated the promise of computer vision in the early detection of gait abnormalities and their association with body weight and growth rate (Aydin et al., 2010).

Nakarmi et al. (2014)'s research, on the other hand, was centered on poultry feeding patterns. Recognizing the direct correlation between feeding patterns and poultry health and growth, they devised a computer vision system to monitor individual bird feeding activity. By employing image segmentation techniques combined with deep learning algorithms, their system demonstrated a robust correlation between feeding behaviors and growth trajectories, with an efficacy rate of 95 % (Nakarmi et al., 2014). Neethirajan (2022) took a different approach, focusing on the movement patterns of poultry. Their system, which employed advanced tracking algorithms like the Kalman filter, could predict growth rates based on the activity levels of the birds. This not only ensured optimal tracking of the chickens' temporal and spatial changes but also provided insights into the overall movement of the flock (Neethirajan, 2022).

Jung et al. (2021) investigated keel bone development in poultry using computer vision techniques and 3D imaging. They were able to monitor the keel bone damage of laying hens, providing crucial insights into their overall physical development and health with a precision rate of 86 % (Jung et al., 2021). You et al. (2021) provided a deep dive into the nutritional aspect. Their system, which utilized a random forest classifier, could monitor the food intake of individual birds, correlating it with their growth rates and ensuring that the birds received optimal nutrition for healthy growth (You et al., 2021). Lin et al. (2020) introduced a system using time-lapse imaging and Faster region-based convolutional neural network (Faster R-CNN) deep learning algorithms to monitor chicken movement, drinking habits, and growth patterns. With a detection accuracy of 98.16 % and tracking accuracy of 98.94 %, this method offers a comprehensive insight into chicken behavior and growth, especially in addressing heat stress in tropical regions (Lin et al., 2020). Thompson et al. (2023) and Nakrosis et al. (2023) both focused on different aspects of poultry growth, from dropping classification to feather growth patterns. Their computer vision systems, which employed techniques like YOLOv5 and the K-means algorithm, provided comprehensive insights into each aspect, ensuring that the birds grew healthily and uniformly, with an average accuracy rate of 89 % (Thompson et al., 2023; Nakrosis et al., 2023). While these studies have significantly advanced the field of poultry growth monitoring using computer vision, there is still a vast expanse to explore. Future research could delve into real-time system integration in poultry farms for continuous monitoring (Raj and Jayanthi, 2018). Adaptable systems, such as those with depth cameras for different poultry sizes or those that can recalibrate based on varying light conditions, will be pivotal (Lee, 2012; Lin et al., 2019). As machine learning models evolve, integrating them with computer vision can further refine system accuracy. The exploration of multi-modal systems, merging computer vision with other sensory data like acoustic or thermal sensors, can offer a comprehensive solution, revolutionizing poultry growth monitoring. Table 1 lists the primary computer vision methods for overseeing and tracking poultry growth in monitoring systems.

3.4. Poultry health, welfare and disease detection

The field of poultry health management stands on the cusp of transformation with the potential adoption of computer vision technologies. The precision in monitoring sick birds, a key welfare indicator,
diseases, such as Newcastle Disease and Avian Influenza, showcasing the utility of thermal imaging in preemptive health measures (Sadeghi et al., 2023). This advancement is complemented by the prowess of deep learning in disease diagnostics through the deployment of convolutional neural networks, with models like MobileNetV2 and extreme inception (Xception) achieving high diagnostic accuracies (98 %) in fecal image classification, thus equipping farmers with powerful tools for disease management. Furthermore, the behavioral patterns of laying hens have been decoded using computer vision, enabling continuous and individual behavior (standing, walking, and scratching) monitoring, a significant advancement over traditional human observation. The potential of computer vision systems in agriculture is substantial, inviting the development of sophisticated algorithms capable of adapting to ever-changing farm conditions, such as variable ambient light and the multiple behaviors of poultry (Khairunissa et al., 2021; Ifuchenwuwa et al., 2023). The ongoing refinement and validation of these technologies across different poultry breeds and settings are critical to their success. In addition, pioneering advancements in computer vision technologies hold immense promise for poultry health management. Groundbreaking methods in poultry health management have been showcased using thermographic imaging and the novel technique of converting audio files to images for stress analysis. This process involves analyzing the transformed images with a pretrained CNN, achieving significant accuracy in stress detection (van den Heuvel et al., 2022). Integrating virtual reality (VR) and eye-tracking technology could also represent a future direction for enhancing poultry disease detection, offering precise monitoring and analysis capabilities. One study presents a VR system using eye-tracking to diagnose neurodegenerative diseases, successfully eliciting diagnostic eye movements and enhancing remote, accurate detection of conditions like Parkinson's (Orlosky et al., 2017). The integration of advanced computer vision with multi-modal sensory data underscores a future where adaptable and scalable solutions become the cornerstone of poultry health management and animal welfare.

Table 1
Main computer vision techniques in poultry growth monitoring systems.

Technique | Target | Sensor | Bird type | Reference
Bayesian artificial neural network | Body weight | 3D camera | Broiler | Mortensen et al. (2016)
Linear real-time model | Distribution | Top-view camera | Broiler | Kashiha et al. (2013)
YOLOv3 | Gender | Digital camera | Hens and roosters | Yao et al. (2020)
Commercial software | Growth rate | Upper-view camera | Broiler | De Wet et al. (2003)
K-nearest neighbors algorithm | Bone | Digital camera | Broiler | Castro Júnior et al. (2022)
Faster R-CNN | Heat stress | Web camera | Broiler | Lin et al. (2018)
Matlab | Inactive birds | 3D camera | Broiler | Aydin (2017)
Matlab | Thermal comfort | Top-view camera | Laying hens | Del Valle et al. (2021)
Bot-SORT | Tracking | Top-view camera | Cage-free chickens | Siriani et al. (2023)
Segment anything model | Body weight | Thermal camera | Cage-free chickens | Yang et al. (2023c)
Faster region-based convolutional neural network | Drinking time | Digital camera | Broiler | Lin et al. (2020)
Software Carne 2.2 | Fat content | Digital camera | Broiler and turkey | Chmiel et al. (2011)
Generative adversarial network-masked autoencoders | Chicken face | Digital camera | Chicken | Ma et al. (2022)
Faster R-CNN | Droppings | Top-view camera | Broiler | Zhou et al. (2023)
MobileNetV2 | Health assessment | Digital camera | Broiler | Li et al. (2023)
YOLOv5 | Floor eggs | Top-view camera | Cage-free chickens | Subedi et al. (2023b)
YOLOv5 | Pecking | Top-view camera | Cage-free chickens | Subedi et al. (2023a)
YOLOv5 | Mislaying | Vertical-view camera | Cage-free chickens | Bist et al. (2023d)
YOLOv4 | Preference behavior | Top-view camera | Laying hens | Kodaira et al. (2023)
YOLOX | Counting | Top-view camera | Broiler | Li et al. (2022)

3.5. Integrating robotics and computer vision

The integration of robotics and computer vision in poultry processing reveals a landscape of innovative technologies aimed at enhancing efficiency and animal welfare. Misimi et al. (2016) introduced the GRIBBOT, a 3D vision-guided robot for harvesting chicken fillets, which represents a significant step toward automating the manual processes
currently in place(Misimi et al., 2016). This innovation is paralleled by
has markedly improved by leveraging image analysis to estimate growth developments in ethological research, where robots like PoulBot were
and detect health anomalies. Zhuang et al. (2018) developed a real-time used to study and influence the behavior of domestic chicken chicks,
health monitoring algorithm for broilers using image processing and thereby advancing our understanding of animal-robot interactions(Gri-
Support Vector Machine (SVM), achieving 99.469 % accuracy in bovskiy et al., 2018). Chen and Wang (2018) described a machine vision
detecting H5N2 bird flu(Zhuang et al., 2018). This is particularly evident method for recognizing visceral contours in poultry carcasses, which
in the processing sector, where machine vision systems have been greatly improves the accuracy of processing and highlights the potential
adeptly employed to correlate carcass characteristics with viscera, for automation in tasks that were traditionally challenging to mechanize
thereby ensuring quality control during evisceration (Chen et al., 2023). (Chen and Wang, 2018). Concurrently, PoultryBot demonstrates the
The adaptability of broilers to their rearing environment has been feasibility of using autonomous robots for tasks such as floor egg
quantified using computer-vision-based indices, Massari et al. (2022) collection in commercial poultry houses, despite a need for further
tested cluster and unrest CV-based indexes on twenty broilers to monitor refinement in collection mechanisms and navigation systems(Vroe-
movement, validating their effectiveness in controlled settings, and gindeweij et al., 2018). The advancements in evisceration are showcased
suggesting applicability in precision livestock farming(Massari et al., by six degrees of freedom robot system, which used robotics and ma-
2022). Additionally, the task of pose estimation has been addressed chine vision to achieve high accuracy in poultry incisions for eviscera-
through multi-part detection models Zheng et al. (2022). In addition, tion(Chen et al., 2021a).
exploring automatic poultry pose recognition using deep neural net- In the realm of egg handling, a study has developed a sophisticated
works (DNNs), outperforming algorithms like YOLOV3 with higher method involving an improved three-channel convolutional neural
precision and recall, indicating potential for monitoring poultry health network (T-CNN) and you only look once (YOLOv5) technique for egg
on large-scale farms (Fang et al., 2022). The main computer vision detection and segmentation(Zhang et al., 2023a). The method includes
techniques for monitoring poultry health and welfare are summerized in median filtering, OTSU method (OTSU) for segmentation, and the Kirsch
Table 2. operator for edge extraction, followed by feature extraction via T-CNN
In the realm of disease detection, thermographic and AI methodol- and classification using a support vector machine (SVM). The technique
ogies have been synergized to facilitate the early identification of achieved a 95.65 % accuracy rate in egg recognition, with a low
8
X. Yang et al. Computers and Electronics in Agriculture 225 (2024) 109339
Table 2
Main computer vision techniques for welfare indicators detection.
Technique Target Sensor Bird type Reference
Residual network (ResNet) Sick bird Digital camera Broiler Zhang and Chen
(2020)
CNNs Manure Top view camera Chicken Zhu and Zhou (2021)
Logistic regression Comb Google Search / Bakar et al. (2023)
Dense convolutional Network Chicken Top view camera Broiler Cao et al. (2021)
U2-Net Plumage Side view camera Layer Heo et al. (2023)
Threshold segmentation Muscle Vertical view Broiler Chen et al. (2022)
camera
Visual geometry group network (VGGNet) and Avian pox, Infectious Laryngotracheitis, Newcastle, and Digital camera Chicken Quach et al. (2020)
ResNet Marek
Chan-Vese model Head and body Side view camera Caged chicken Xiao et al. (2017)
Xception Eimeria / / Boufenar et al. (2022)
Decision Tree Slouching, eye foaming, lethargy, feather loss, color Side view camera Caged chicken Quintana et al. (2022)
paling, and raling
U-Net and Pix2pixHD. Chicken Side view camera Caged chicken Yang et al. (2023e)
CNNs Crowdedness Kinect sensor Cage-free Pu et al. (2018)
chicken
CNNs Locomotion, perching, feeding, drinking, and nesting. Top view camera laying-hen Nakarmi et al. (2014)
CNNs Breeder Top view camera Broiler Pereira et al. (2013)
Mask R-CNN Postures Top view camera Broiler Joo et al. (2022)
YOLOv4
YOLOv3 egg breeders Top view camera Hens Wang et al. (2020)
Matlab Flock movement Top view camera Broiler Neves et al. (2015)
Improved Sparrow Search Algorithm and Support Aggressive behaviors High-definition Taihang Li et al. (2023b)
Vector Machine cameras chickens
Matlab Eating behaviors High-speed camera Broiler Mehdizadeh et al.
(2015)
YOLOv5 and deep sort Mobility Top view camera Broiler Jaihuni et al. (2023)
misrecognition rate, demonstrating its efficacy for potential use in for both scientific research and potential improvements in poultry
automated goose egg picking systems(Zhang et al., 2023a). This is welfare. This project utilized video and audio data, along with advanced
complemented by the development of robots designed for the removal of data analysis systems, to build formal models of animal behavior for
broiler mortality, and autonomous egg picking systems that promise to implementation in robots(Gribovskiy et al., 2010; Chen et al., 2019).
reduce manual labor significantly while enhancing production effi- Future studies in poultry farming could revolutionize the sector by
ciency. Livestock robots capable of picking and classifying eggs on farms leveraging voice-controlled robotics and VR for enhanced human-
are equipped with various sensors and virtual instrument devices, –machine interaction. The potential of voice-controlled machinery, as
indicating a shift towards multifunctional farm automation(Wang et al., demonstrated by the Raspberry Pi project, indicates significant benefits
2019). in terms of automation. Using smartphone devices to control agricul-
Recent advancements highlight the importance of robotics in various tural machinery could greatly improve efficiency and reduce labor needs
sectors. High-throughput robotics help detect antimicrobial-resistant in the poultry sector. The incorporation of Raspberry Pi 3 and its built-in
bacteria, linking robotics with public health(Truswell et al., 2023). Wi-Fi capability could serve as a cornerstone for internet-based auto-
Machine vision, using an improved region-based active contour method, mation, streamlining operations through simple voice commands.
accurately positions viscera, showcasing the potential of image pro- Furthermore, this system necessitates the use of a microSD card loaded
cessing in complex chicken slaughtering tasks(Chen et al., 2021b). with Raspbian OS to boot the Raspberry Pi. The integration of these
Additionally, pick-and-place systems for handling deformable poultry technologies effectively transforms a conventional farm into a ’smart
pieces from cluttered bins highlight the need to evaluate robotic farm’, where tasks are automated, and efficiency is greatly enhanced.
adaptability to meet varying demands in the food industry(Raja et al., The system’s design takes into consideration ease of use, with a focus on
2023). The selective compliance articulated robot arm (SCARA) robot creating a seamless interface for farmers who may not have extensive
with a pneumatic gripper is specifically designed for egg handling in the technical knowledge. The use of voice commands signifies a move to-
poultry sector, showcasing automation’s potential to increase produc- wards more natural forms of human–machine interaction, reducing the
tion speed and reduce manual labor(Prakash et al., 2021). Moreover, learning curve and increasing accessibility(Chavan et al., 2019). In
smart mobile robots for free-range farms and real-time recognition addition, further advancement could integrate VR-based robotics, taking
studies of egg-collecting robots in free-range duck sheds exhibit the advantage of immersive teleoperation systems to bridge the physical and
growing influence of machine learning models, like YOLOv5s, on robotic virtual worlds in poultry environments. By using algorithms for real-
efficiency and environmental adaptability(Chang et al., 2020; Fei et al., time 3D reconstruction of unstructured agricultural scenes, operators
2023). Lastly, the studies explored sophisticated integrations of machine could remotely guide robots through complex tasks within a virtual
vision and robotics tailored to specific needs within the poultry sector. representation of the actual environment(Fadzli et al., 2023). This
Chen et al. (2019) constructed an eviscerating robot system for the immersive approach could facilitate precise control over farming ac-
poultry processing sector, enhancing production efficiency, ensuring tivities, from feeding and health monitoring to egg collection, while
production efficiency and ensuring the health standards of poultry minimizing human presence and disruption to the birds. Such syner-
products and reducing labor intensity using parallel robots and machine gistic application of voice control and VR in robotics could lead to
vision, with a visual system developed on MATLAB. Gribovskiy et al. breakthroughs in operational productivity and animal welfare, ushering
(2010) delved into the realm of ethology and robotics, where a mobile in a new era of precision farming in the poultry sector. With these
robot, PoulBot, was designed to interact with and influence chick technologies, future research could develop sophisticated models that
behavior, showing young chicks accept robot as member as new insights simulate entire poultry operations, allowing for the optimization of
workflows and the exploration of novel farming strategies before their real-world implementation (Chen et al., 2020b). Collectively, these studies signify a transformative period in the poultry sector, marked by rapid technological advancements. The convergence of robotics, computer vision, and ethology is not only enhancing production and efficiency but also contributing to better animal welfare and global health outcomes by enabling early detection of abnormal animal behaviors and timely removal of abnormal birds, ensuring higher standards of food safety. As these technologies evolve, they hold the potential to address some of the most pressing challenges in the sector, including labor shortages, food safety, and disease surveillance. Fig. 8 shows computer vision-based robotics and their roles in the poultry sector.

Fig. 8. Enhancing poultry sector security with computer vision-based robotics. Source: Ren et al. (2020), Gribovskiy et al. (2010), Park et al. (2022), Vroegindeweij et al. (2018).

3.6. Case studies: successful implementations around the globe

To understand the application of computer vision in the poultry sector, we examined cases from commercial farms, including enclosed broiler houses, free-range, and cage-free environments. In enclosed broiler houses, significant advancements have been made using computer vision and deep learning. A study conducted by Mortensen et al. (2016) describes a 3D camera-based system utilizing a Kinect camera for broiler weight prediction, achieving an average error of 7.8 %. This technology shows promise for broader applications such as activity analysis and health monitoring (Mortensen et al., 2016). Additionally, van der Eijk et al. (2022) detailed a study employing computer vision algorithms, including Mask R-CNN and U-Net models, to monitor broiler interactions with resources like feeders and drinkers, enhancing farm management and welfare practices (van der Eijk et al., 2022). Furthering these developments, the research of Cakic et al. (2023) introduces the use of high-performance computing (HPC) and deep learning to create predictive models for smart poultry farms. These models, effective in tasks like chicken counting, dead chicken detection, weight assessment, and uneven growth detection, were implemented on edge AI devices. Utilizing Faster R-CNN architectures for chicken detection and Mask R-CNN for segmentation, the study demonstrated high accuracies, paving the way for real-time farm monitoring. This underscores the potential of integrating HPC, deep learning, and edge computing in smart agriculture solutions, especially in poultry farming. Fig. 9 illustrates the practical application observed in these case studies of enclosed broiler houses.

Fig. 9. Case study of broilers: (A) body weight prediction (Mortensen et al., 2016), (B) broiler interaction monitoring (van der Eijk et al., 2022), and (C) broiler detection interface developed using HPC (Cakic et al., 2023).

In free-range chicken houses, advanced computer vision and deep learning have significantly improved farm management and animal welfare. Cao et al. (2021) discussed the development of the locally constrained dense fully convolutional network (LC-DenseFCN) model, a deep learning method for chicken counting with a 97 % accuracy rate. This model utilized densely connected convolutional networks (DenseNet) as the backbone network and a unique LC-Loss function for accurate, real-time counting in dense environments (Cao et al., 2021). Yao et al. (2020) focused on chicken gender classification, achieving a 96.85 % accuracy using YOLOv3 for detection and a VGG-19 based classifier (Yao et al., 2020). Liu et al. (2021) detailed an automated system for detecting and removing dead chickens, integrating a visible light camera and the YOLOv4 algorithm into a robotic system, enhancing biosecurity with a 95.24 % precision rate (Liu et al., 2021). Fig. 10 shows the practical application observed in these case studies of free-range chicken houses. The integration of such systems not only streamlines operations but also ensures high standards of care and well-being for the chickens by providing them with a healthy, well-managed living environment that is conducive to their health and productivity.

In cage-free hen houses, innovative computer vision and deep learning methodologies address specific poultry management challenges. Lamping et al. (2022) introduced ChickenNet, a system for assessing the plumage condition of laying hens. It extends the Mask R-CNN framework and was tested with different image resolutions, achieving a 98.02 % mAP for hen detection and 91.83 % for plumage condition prediction (Lamping et al., 2022). Subedi et al. (2023) described the development of deep learning models (YOLOv5s-egg, YOLOv5x-egg, YOLOv7-egg) for detecting floor eggs in cage-free environments. The YOLOv5x-egg model, in particular, showed a 90 % precision and 92.1 % mAP, indicating its potential utility in varying conditions for automatic floor egg monitoring (Subedi et al., 2023b). These advancements in computer vision and deep learning demonstrate significant potential for enhancing welfare monitoring and operational efficiency in cage-free poultry farming. Fig. 11 depicts the practical application observed in these case studies of cage-free houses.

4. Future technologies on the horizon

Recent techniques from fields such as the chat generative pre-trained transformer (ChatGPT), autonomous vehicles (AVs), and large voice models like the speech audio language music open neural network (SALMONN), the contextual speech model with instruction-following/in-context-learning capabilities (COSMIC), and multi-modal music understanding and generation (M2UGen) provide a wealth of inspiration for the poultry sector, suggesting new avenues for optimizing the environmental conditions and overall management of poultry houses, leading to enhanced growth rates, better health outcomes, and reduced waste. The data processing and natural language capabilities of ChatGPT, as applied in precision agriculture, could offer poultry farmers advanced tools for managing health, nutrition, and environmental controls, allowing for enhanced decision-making through simplified interaction with complex datasets (Biswas, 2023; Genç, 2023; Potamitis, 2023). Similarly, occlusion management techniques from self-driving car technology, utilizing light detection and ranging (LiDAR) and YOLOv2 algorithms, could revolutionize poultry monitoring systems, enabling precise tracking of individual birds and swift correction of visual occlusion errors, even under challenging conditions such as low light or high-density settings (Yahya et al., 2020). The use of training simulators, inspired by the car learning to act (CARLA) simulator for autonomous vehicles, could be developed for the poultry sector to train algorithms in virtual environments that mirror actual farm conditions, improving the predictability and management of flock dynamics. Additionally, the real-time processing power of end-to-end deep learning, akin to CNN approaches in AVs, could be applied to instantly process visual data from farms, ensuring accurate health assessments and headcounts. Techniques for image classification and semantic segmentation, crucial for navigation in AVs, could be adapted to segment and classify different areas of poultry farms, enhancing detection and reducing errors in bird counting (Liang et al., 2020; Tippannavar et al., 2023). Furthermore, the integration of SALMONN's audio processing capabilities could provide insights into the respiratory health indicators of poultry (Tang et al., 2023), while COSMIC's emergent instruction-following capabilities could enable farmers to seamlessly translate sensor data into actionable insights (Pan et al., 2023). M2UGen's prowess in multi-modal generation suggests potential for creating stimulating or calming farm environments, innovating control interfaces for farming equipment, and providing interactive staff training using natural language (Hussain et al., 2023). Collectively, these advanced technologies can significantly enhance poultry farming operations, leading to better animal welfare, characterized by good nutrition, comfortable living conditions, robust health, natural living, and humane handling, provided they are used responsibly in conjunction with human expertise and in adherence to the Animal Welfare Act.

5. Large vision models for poultry science

In the swiftly advancing domain of artificial general intelligence (AGI), the advent of the segment anything model (SAM) stands out as a vanguard development (Kirillov et al., 2023). Unveiled by Meta AI in 2023, SAM revolutionizes the field with a pioneering zero-shot segmentation approach (Ahmadi et al., 2023). As a universal image segmentation model, SAM adeptly addresses the majority of segmentation challenges within new and complex datasets, employing the sophisticated art of prompt engineering. SAM's architecture exemplifies large vision models, engineered to navigate the intricate landscape of segmentation tasks with agility and precision. By initiating the use of prompt-based segmentation in its preparatory phase, SAM not only enhances the pre-training paradigm but also redefines it, laying down a novel standard that underscores the transformative adaptability of vision models. This innovation is particularly pertinent to the poultry sector, where SAM can be further tailored with adaptors and improvements. Its capacity to analyze and interpret diverse visual information holds the promise of revolutionizing the way we monitor and manage poultry health, behavior, and overall welfare. With its robust segmentation capabilities, SAM could offer unprecedented insights into the nuanced environments of poultry farming, enabling more efficient and humane practices.

5.1. SAM for tracking

SAM's foray into tracking applications, particularly within the field of video object segmentation (VOS), has proven to be a game-changer. This innovative tracking method, known as the track anything model (TAM) (Yang et al., 2023d), integrates SAM with the established tracker XMem to segment and follow any object in video footage. In the context of poultry science, this technology holds significant promise; for instance, it can be tailored to track the speed and movement of chickens within a farm setting (Yang et al., 2023c). Users initiate the tracking by selecting an object, prompting SAM to generate a segmentation mask, which XMem then uses to track the object's movement through the video based on temporal and spatial data. This ability to monitor in real time and make immediate adjustments is invaluable for farmers and researchers aiming to understand chicken behavior. However, TAM faces challenges in zero-shot scenarios where it must perform without pre-existing data, a situation often encountered in poultry environments when new or occluded behaviors emerge. Despite these challenges, the integration of SAM into poultry management practices heralds a new era of precision farming, offering insights that could lead to enhanced productivity and improved animal welfare.
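The TAM-style loop described above (a user-selected object yields an initial mask, which a tracker then propagates frame by frame) can be illustrated with a deliberately simplified, self-contained sketch. Here SAM's masks and XMem's memory-based propagation are stood in for by plain bounding boxes and greedy IoU association; all function names and the 0.3 threshold are illustrative assumptions, not the TAM API.

```python
# Simplified stand-in for TAM-style tracking: associate a user-selected
# bird with the best-overlapping detection in each subsequent frame.

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def track(initial_box, frames, min_iou=0.3):
    """Follow one bird across frames of candidate detections.

    frames: list of per-frame detection lists (e.g., segmentation outputs).
    Returns the chosen box per frame, or None where the track is lost."""
    current, path = initial_box, []
    for detections in frames:
        best = max(detections, key=lambda d: iou(current, d), default=None)
        if best is None or iou(current, best) < min_iou:
            path.append(None)   # occluded or lost in this frame
        else:
            current = best      # propagate the track forward
            path.append(best)
    return path
```

A bird starting at (0, 0, 10, 10) and drifting right is followed frame to frame until no detection overlaps it sufficiently, at which point the path records a loss, mirroring the occlusion failures noted for zero-shot tracking.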
Fig. 10. Case study of free-range chickens: (A) chicken counting (Cao et al., 2021), (B) dead chicken removal (Liu et al., 2021), and (C) chicken gender detection (Yao et al., 2020).

Fig. 11. Case study of cage-free chickens: (A) plumage condition assessment (Lamping et al., 2022), and (B) floor egg detection (Subedi et al., 2023b).
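The precision figures quoted in the case studies above (e.g., 95.24 % for dead-bird detection, 90 % for floor eggs) follow from matching predicted boxes against ground truth and counting true and false positives. A minimal, self-contained sketch of that computation; the IoU threshold of 0.5 is the common evaluation convention, not a value taken from the cited studies.

```python
# How detector precision/recall figures are typically computed:
# predictions matched greedily to ground-truth boxes at an IoU threshold,
# then precision = TP / (TP + FP) and recall = TP / (TP + FN).

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def precision_recall(predictions, ground_truth, iou_thresh=0.5):
    """Greedy one-to-one matching of predicted to ground-truth boxes."""
    unmatched = list(ground_truth)
    tp = fp = 0
    for pred in predictions:
        best = max(unmatched, key=lambda g: iou(pred, g), default=None)
        if best is not None and iou(pred, best) >= iou_thresh:
            unmatched.remove(best)  # each ground-truth box matches once
            tp += 1
        else:
            fp += 1
    fn = len(unmatched)             # ground-truth boxes never detected
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall
```

Mean average precision (mAP), the other metric reported above, extends this idea by sweeping the detector's confidence threshold and averaging the resulting precision over recall levels.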
5.2. 3D-SAM-adapter

The SAM has garnered acclaim for its proficiency in general-purpose semantic segmentation, demonstrating a strong ability to generalize across a variety of everyday images. However, challenges arise when SAM is tasked with identifying objects characterized by small size, irregular shape, and low contrast, because its foundational design for 2D imagery does not capture complex 3D spatial information. To address these challenges, the 3D-SAM-adapter modifies the original 2D SAM to interpret volumetric data, effectively enhancing its performance and enabling it to bridge the dimensional divide between 2D and 3D data interpretation (Gong et al., 2023). This adaptation has shown significant performance improvements over traditional methods. Recognizing the requirements of the poultry industry, the application of the adapted 3D SAM model for estimating poultry body weight and volume through 3D imaging represents a novel approach (Pleuss et al., 2019; You et al., 2021). Accurate measurement of these parameters is vital for effective health monitoring and growth management in poultry farming. By employing the 3D SAM model, there is potential to transform body weight estimation practices, offering poultry scientists and farmers a tool that could outperform conventional methods in both precision and efficiency. This advancement paves the way for more informed decision-making in poultry nutrition and welfare, leading to optimized farm operations and enhanced animal health.
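The weight-and-volume idea above reduces to integrating a height map over the bird's segmented footprint and converting the volume to mass. A toy sketch under stated assumptions: a top-down depth camera giving heights in cm, a known per-pixel ground area, and a single calibration density; a real system would fit this volume-to-mass mapping against weighed reference birds rather than use a fixed constant.

```python
# Toy volume-based weight estimate: sum height * pixel area over the
# segmented footprint, then scale by a calibration density. All numeric
# defaults here are illustrative assumptions, not values from the review.

def estimate_weight(height_map, mask, pixel_area_cm2=1.0,
                    density_g_per_cm3=0.95):
    """Estimate body mass (grams) from a top-down height map.

    height_map: 2D list of heights (cm) above the floor.
    mask: 2D list of 0/1 flags from a segmentation model (e.g., SAM)."""
    volume_cm3 = sum(
        h * pixel_area_cm2
        for row_h, row_m in zip(height_map, mask)
        for h, m in zip(row_h, row_m)
        if m
    )
    return volume_cm3 * density_g_per_cm3
```

The segmentation mask is what a SAM-style model would contribute; the rest is plain numerical integration, which is why accurate masks translate directly into accurate weight estimates.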
5.3. MobileSAM meat discrimination) but at the cost of increased development time and
expertise(Arsalane et al., 2016). Such codes are meticulously tailored to
The SAM is known for its comprehensive image segmentation ca- the hardware, ensuring efficient execution of tasks but creating a steep
pabilities, but its performance can be hampered by the considerable learning curve for those not versed in low-level programming.
computational weight of its image encoder. This issue has been inge- Secondly, establishing a reliable connection between hardware and
niously addressed by MobileSAM, which implements a knowledge the computational model on farms is complex, given the need for robust
distillation technique to distill the capabilities of the original SAM’s infrastructure to handle data transmission and processing. For instance,
heavy-duty image encoder, ViT-H, into a more lightweight version. This in a poultry farm, computer vision systems may be deployed for moni-
streamlined encoder retains compatibility with SAM’s mask decoder toring the health and growth of chicken, detecting behavioral patterns,
while offering a significant reduction in size-over 60 times smaller than or automating the counting and sorting processes (Chmiel et al., 2011).
the original-without compromising on performance. Incorporating these Each of these applications generates a vast amount of data that must be
innovations, MobileSAM stands out as especially beneficial for mobile processed in real time to be effective. This data flow demands high-
applications, significantly advancing the practical deployment of SAM in bandwidth, low-latency communication channels to transfer video
various settings, including poultry farming(Zhang et al., 2023b). By feeds from the cameras to the processors without significant delay.
minimizing the need for heavy computational resources, MobileSAM Moreover, the computational models that analyze this visual data must
can be trained in less than a day on a single GPU, making it an ideal be hosted on hardware that can process information rapidly and accu-
candidate for deployment in poultry farms where computational re- rately. Typically, this would involve servers equipped with high-
sources can be limited(Sigut et al., 2020). This adaptation not only en- performance GPUs or FPGAs that can execute complex machine
sures that the technology is accessible and cost-effective for farmers but learning algorithms(Afif et al., 2020). These servers must be connected
also opens new avenues for real-time monitoring and management of to the farm’s network infrastructure in a way that ensures continuous
poultry health, behavior, and productivity directly from mobile devices. operation despite the environmental challenges present in such settings,
like temperature fluctuations, dust, and humidity(El-Medany, 2008).
6. Current technology limitations Additionally, the hardware must be calibrated to handle the peculiar-
ities of the farm’s environment. For example, variations in lighting
Integrating computer vision in poultry farming represents a cutting- conditions throughout the day can affect the accuracy of image recog-
edge approach to agricultural management, leveraging deep learning nition and object detection algorithms. Therefore, the hardware and
and sophisticated hardware to optimize production. However, this software must be adaptable and robust enough to maintain performance
integration is fraught with challenges. Firstly, the cost and complexity of regardless of these variables.
digital signal processors (DSPs), field-programmable gate arrays Lastly, the practical deployment of computer vision technologies in
(FPGAs), and graphics processing units (GPUs) present substantial bar- poultry farming is significantly challenged by the environmental factors
riers(Feng et al., 2019). These barriers are not solely financial but also inherent to such agricultural settings. The presence of dust, for example,
technical, as leveraging these technologies requires a depth of pro- can occlude camera lenses and interfere with the image quality being fed
gramming and hardware knowledge often absent in farm settings. For into vision algorithms, leading to reduced accuracy in detecting or
instance, DSPs, essential for real-time processing tasks like grading classifying birds or behaviors(Guo et al., 2023). Similarly, variable
poultry eggs, are priced between $500-$2,000 but demand familiarity lighting conditions can dramatically affect image capture; the stark
with digital signal processing and embedded system programming contrast between bright daylight, the shadows of an indoor and light
(HajiRassouliha et al., 2018). FPGAs, ranging from $1,000 to $3,000, offer configurability that is crucial for tasks such as sorting eggs; however, they require expertise in hardware description languages and the ability to manage complex logic networks. Their price reflects their versatility and their capacity to perform parallel processing tasks effectively (Monmasson et al., 2011). GPUs, which fall within the $1,500–$15,000 range, are the powerhouse for behavior classification through deep learning. They require a substantial investment not only in the hardware but also in developing and optimizing algorithms, which often involves knowledge of high-level programming languages and machine learning libraries (Rozemberczki et al., 2021). Table 3 summarizes the common price ranges for these critical hardware types, highlighting the associated costs and technical requirements for their application in poultry farming. Beyond the hardware, the implementation of custom code, especially code written in assembly language, further complicates integration. Assembly language coding for DSPs or FPGAs demands a granular level of control and an understanding of the processor architecture, but it can optimize performance for specific tasks.

Table 3
Price Range Overview of Essential Computer Vision Hardware for Poultry Farming.

Hardware type    Price                                         Use case in poultry farming
DSPs             $500–$2,000 (HajiRassouliha et al., 2018)     Grading of poultry eggs (Wang et al., 2010)
FPGAs            $1,000–$3,000                                 Sorting eggs (Akkoyun et al., 2023)
GPUs             $1,500–$15,000                                Behavior classification (Pu et al., 2018)

density setting may require algorithms to have dynamic range capabilities and adjustment mechanisms to maintain consistent performance (Zhou and Lin, 2007). Water lines and other farm equipment can also introduce visual noise that confuses the models; for instance, reflections or refractions from water surfaces can lead to false detections or misclassifications. The movement and presence of equipment such as feeders and drinkers can obstruct the view or be mistakenly identified as part of a chicken by the vision system (Li et al., 2021b), necessitating sophisticated background subtraction techniques and object tracking algorithms that are robust to such changes. Moreover, behavioral analysis of poultry, a core application of computer vision, can be affected by these disturbances: detecting abnormal behaviors indicative of disease or stress requires continuous, clear observation of the birds, which environmental factors can disrupt. To counteract these challenges, computer vision systems in poultry farming must be designed with advanced features: (1) to cope with changes in lighting, cameras must have mechanisms that adjust their settings dynamically for optimal image capture (Kromanis and Kripakaran, 2021); (2) cameras and processing units must be protected and sealed against dust and moisture to ensure longevity and consistent operation (Shajahan et al., 2021); and (3) algorithms must be trained on datasets that cover the range of environmental conditions expected in a poultry farm to improve their robustness (Yang et al., 2023b). These improvements can enhance the reliability and effectiveness of computer vision applications in poultry farming, ensuring that the potential benefits of these technologies are fully realized in such a complex ecosystem.
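The background subtraction mentioned above can be sketched in a few lines. The following is a hypothetical, minimal pure-Python illustration (the function name `make_subtractor` and the parameter values are ours, not from the review; a production system would typically use an adaptive model such as OpenCV's MOG2 together with object tracking): each pixel keeps a slowly updated running-average estimate of the static background (litter, feeders, water lines), and pixels that deviate from it beyond a threshold are flagged as foreground, e.g. a moving bird.

```python
# Minimal running-average background subtraction (illustrative sketch).
# Frames are flat lists of grayscale pixel intensities (0-255).

def make_subtractor(alpha=0.05, threshold=25):
    """Return an update function that maintains a per-pixel background
    model and yields a binary foreground mask for each new frame."""
    background = []  # running-average background estimate

    def apply(frame):
        nonlocal background
        if not background:               # first frame initializes the model
            background = [float(p) for p in frame]
            return [0] * len(frame)      # nothing is foreground yet
        mask = []
        for i, p in enumerate(frame):
            diff = abs(p - background[i])
            mask.append(1 if diff > threshold else 0)
            # absorb slow scene changes (lighting drift) into the background
            background[i] += alpha * (p - background[i])
        return mask

    return apply

# Static pen background, then a bright "bird" blob appearing in frame 2.
subtract = make_subtractor()
static = [50] * 8
moving = [50, 50, 200, 210, 50, 50, 50, 50]

subtract(static)            # initialize the background model
mask = subtract(moving)     # bird pixels flagged as foreground
print(mask)                 # -> [0, 0, 1, 1, 0, 0, 0, 0]
```

Because the background is updated slowly (controlled by `alpha`), gradual lighting drift is absorbed into the model while fast changes, such as a bird entering the frame, remain foreground; tuning `alpha` and `threshold` trades sensitivity against false detections from reflections off water lines.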
X. Yang et al. Computers and Electronics in Agriculture 225 (2024) 109339

7. Conclusions

This review highlights the transformative role of computer vision in
poultry management, emphasizing a shift towards technology-driven approaches. By integrating classical machine learning and advanced deep learning frameworks like CNNs, significant improvements in animal welfare and farm operations have been achieved. Challenges such as complex backgrounds and bird occlusions are addressed using non-visible light and depth-based sensors, enhancing monitoring accuracy and efficiency.

Advancements span from egg quality assessment to health monitoring, reducing manual labor and boosting productivity. The use of multimodal systems integrating diverse sensors further extends these capabilities. However, adapting to real farm conditions and the need for extensive annotated datasets remain challenges.

Future innovations, including voice-activated robotics and virtual reality, promise greater efficiency and sustainability in poultry farming. Continued research and development in flexible, scalable, and ethical solutions merging human expertise with automation will fundamentally transform the sector, contributing to global food security and animal welfare.

CRediT authorship contribution statement

Xiao Yang: Writing – original draft, Methodology, Investigation, Conceptualization. Ramesh Bahadur Bist: Writing – original draft, Investigation. Bidur Paneru: Writing – original draft, Investigation. Tianming Liu: Writing – original draft, Investigation. Todd Applegate: Writing – original draft, Investigation. Casey Ritz: Writing – original draft, Investigation. Woo Kim: Writing – original draft, Investigation. Prafulla Regmi: Writing – original draft, Investigation. Lilong Chai: Writing – original draft, Supervision, Project administration, Methodology, Investigation, Funding acquisition, Conceptualization.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Data availability

Data will be made available on request.

Acknowledgments

The study was sponsored by the USDA-NIFA AFRI (2023-68008-39853), Georgia Research Alliance (Venture Fund), Oracle America (Oracle for Research Grant, CPQ-2060433), and a UGA IIPA Equipment grant.

References

Abraham, G., R., R., Nithya, M., 2021. In: Smart Agriculture Based on IoT and Machine Learning, pp. 414–419.
Afif, M., Said, Y., Atri, M., 2020. Computer vision algorithms acceleration using graphic processors NVIDIA CUDA. Cluster Comput 23, 3335–3347.
Ahmadi, M., Lonbar, A.G., Sharifi, A., Beris, A.T., Nouri, M., Javidi, A.S., 2023. Application of Segment Anything Model for Civil Infrastructure Defect Assessment. Available at http://arxiv.org/abs/2304.12600 (verified 2 July 2023).
Akkoyun, F., Ozcelik, A., Arpaci, I., Erçetin, A., Gucluer, S., 2023. A Multi-Flow Production Line for Sorting of Eggs Using Image Processing. Sensors 23, 117.
Aleynikov, A.F., 2022. Application of computer vision in food industry to predict sexual dimorphism in poultry eggs during incubation. IOP Conf. Ser.: Earth Environ. Sci. 1112, 012057.
Amraei, S., Abdanan Mehdizadeh, S., Sallary, S., 2017. Application of computer vision and support vector regression for weight prediction of live broiler chicken. Engineering in Agriculture, Environment and Food 10, 266–271.
Andriyanov, N.A., 2021. Application of computer vision systems for monitoring the condition of drivers based on facial image analysis. Pattern Recognition and Image Analysis 31, 489–495.
Anvari, Z., Athitsos, V., 2022. A Survey on Deep Learning Based Document Image Enhancement. Available at http://arxiv.org/abs/2112.02719 (verified 25 October 2023).
Arivazhagan, S., Shebiah, R.N., Sudharsan, H., Kannan, R.R., Ramesh, R., 2013. External and Internal Defect Detection of Egg using Machine Vision. 4.
Arsalane, A., El Barbri, N., Rhofir, K., Tabyaoui, A., Klilou, A., 2016. Building a portable device based on DSP for meat discrimination. Pages 1–6 in 2016 International Conference on Engineering & MIS (ICEMIS).
Astill, J., Dara, R.A., Fraser, E.D.G., Roberts, B., Sharif, S., 2020. Smart poultry management: Smart sensors, big data, and the internet of things. Computers and Electronics in Agriculture 170, 105291.
Aydin, A., 2017. Using 3D vision camera system to automatically assess the level of inactivity in broiler chickens. Computers and Electronics in Agriculture 135, 4–10.
Aydin, A., Cangar, O., Ozcan, S.E., Bahr, C., Berckmans, D., 2010. Application of a fully automatic analysis tool to assess the activity of broiler chickens with different gait scores. Computers and Electronics in Agriculture 73, 194–199.
Bakar, M.A.A., Ker, P.J., Tang, S.G., Baharuddin, M.Z., Lee, H.J., Omar, A.R., 2023. Translating conventional wisdom on chicken comb color into automated monitoring of disease-infected chicken using chromaticity-based machine learning models. Frontiers in Veterinary Science 10, 1174700.
Bist, R.B., Chai, L., Yang, X., Subedi, S., Guo, Y., 2022. Air Quality in Cage-free Houses during Pullets Production. Page 1 in 2022 ASABE Annual International Meeting. American Society of Agricultural and Biological Engineers.
Bist, R.B., Subedi, S., Chai, L., Regmi, P., Ritz, C.W., Kim, W.K., Yang, X., 2023a. Effects of Perching on Poultry Welfare and Production: A Review. Poultry 2, 134–157.
Bist, R.B., Subedi, S., Yang, X., Chai, L., 2023b. A Novel YOLOv6 Object Detector for Monitoring Piling Behavior of Cage-Free Laying Hens. AgriEngineering 5, 905–923.
Bist, R.B., Subedi, S., Yang, X., Chai, L., 2023c. Automatic Detection of Cage-Free Dead Hens with Deep Learning Methods. AgriEngineering 5, 1020–1038.
Bist, R.B., Yang, X., Subedi, S., Chai, L., 2023d. Mislaying behavior detection in cage-free hens with deep learning technologies. Poultry Science 102, 102729.
Bist, R.B., Yang, X., Subedi, S., Chai, L., 2024. Automatic detection of bumblefoot in cage-free hens using computer vision technologies. Poultry Science 103 (7), 103780.
Biswas, S.S., 2023. Role of Chat GPT in Public Health. Ann Biomed Eng 51, 868–869.
Boufenar, C., Djamai, S., Mezghiche, D., Soudani, L., 2022. Identification of Chicken Eimeria Species using Deep Learning Approaches. In: 2022 First International Conference on Computer Communications and Intelligent Systems (I3CIS). IEEE, pp. 105–110.
Brenner, M., Reyes, N.H., Susnjak, T., Barczak, A.L.C., 2023. RGB-D and Thermal Sensor Fusion: A Systematic Literature Review. IEEE Access 11, 82410–82442.
Cakic, S., Popovic, T., Krco, S., Nedic, D., Babic, D., Jovovic, I., 2023. Developing Edge AI Computer Vision for Smart Poultry Farms Using Deep Learning and HPC. Sensors 23, 3002.
Cao, L., Xiao, Z., Liao, X., Yao, Y., Wu, K., Mu, J., Li, J., Pu, H., 2021. Automated Chicken Counting in Surveillance Camera Environments Based on the Point Supervision Algorithm: LC-DenseFCN. Agriculture 11, 493.
Cen, Y., Ying, Y., Rao, X., 2006. Egg weight detection on machine vision system. Pages 337–346 in Optics for Natural Resources, Agriculture, and Foods. SPIE.
Chang, C.-L., Xie, B.-X., Wang, C.-H., 2020. Visual Guidance and Egg Collection Scheme for a Smart Poultry Robot for Free-Range Farms. Sensors 20, 6624.
Chavan, B., Jadhav, D., Atar, S., Kadam, S., 2019. Voice controlled machineries in agricultural field using Raspberry Pi. Available at https://www.academia.edu/download/60504461/IRJET-V6I379220190906-74686-1n3uk3.pdf (verified 17 November 2023).
Chen, M., Sun, J., Saga, K., Tanjo, T., Aida, K., 2020a. An adaptive noise removal tool for IoT image processing under influence of weather conditions: poster abstract. Pages 655–656 in Proceedings of the 18th Conference on Embedded Networked Sensor Systems. SenSys ’20. Association for Computing Machinery, New York, NY, USA.
Chen, Y., Feng, K., Jiang, Y., Hu, Z., 2021a. Design and research on six degrees of freedom robot evisceration system for poultry. Pages 382–386 in Proceedings of the 2020 3rd International Conference on E-Business, Information Management and Computer Science. EBIMCS ’20. Association for Computing Machinery, New York, NY, USA.
Chen, Y., Lu, J., Feng, K., Wan, L., Ai, H., 2023. Nutritional metabolism evaluation and image segmentation of the chicken muscle and internal organs for automatic evisceration. Journal of Animal Physiology and Animal Nutrition 107 (1), 228–237.
Chen, Y., Wan, L., Liu, Z., 2019. The study on recognition and location of intelligent robot system for eviscerating poultry. Pages 499–503 in 2019 34th Youth Academic Annual Conference of Chinese Association of Automation (YAC).
Chen, Y., Feng, K., Lu, J., Hu, Z., 2021b. Machine vision on the positioning accuracy evaluation of poultry viscera in the automatic evisceration robot system. International Journal of Food Properties 24, 933–943.
Chen, Y., Wang, S.C., 2018. Poultry carcass visceral contour recognition method using image processing. Journal of Applied Poultry Research 27, 316–324.
Chen, Y., Zhang, B., Zhou, J., Wang, K., 2020b. Real-time 3D unstructured environment reconstruction utilizing VR and Kinect-based immersive teleoperation for agricultural field robots. Computers and Electronics in Agriculture 175, 105579.
Chmiel, M., Słowiński, M., Dasiewicz, K., 2011. Application of computer vision systems for estimation of fat content in poultry meat. Food Control 22, 1424–1427.
de Castro Júnior, S.L., da Silva, I.J.O., Nazareno, A.C., de O. Mota, M., 2022. Computer vision for morphometric evaluation of broiler chicken bones. Eng. Agríc. 42, e20210150.
De Wet, L., Vranken, E., Chedad, A., Aerts, J.-M., Ceunen, J., Berckmans, D., 2003. Computer-assisted image analysis to quantify daily growth rates of broiler chickens. British Poultry Science 44, 524–532.
Del Valle, J.E., Pereira, D.F., Mollo Neto, M., Gabriel Filho, L.R.A., Salgado, D.D., 2021. Unrest index for estimating thermal comfort of poultry birds (Gallus gallus domesticus) using computer vision techniques. Biosystems Engineering 206, 123–134.
Dlesk, A., Vach, K., Pavelka, K., 2022. Photogrammetric Co-Processing of Thermal Infrared Images and RGB Images. Sensors 22, 1655.
Dorea, J.R., Bresolin, T., Ferreira, R.E.P., Pereira, L.G.R., 2020. 383 Harnessing the Power of Computer Vision System to Improve Management Decisions in Livestock Operations. J Anim Sci 98, 138–139.
Dosovitskiy, A., Beyer, L., Kolesnikov, A., Weissenborn, D., Zhai, X., Unterthiner, T., Dehghani, M., Minderer, M., Heigold, G., Gelly, S., Uszkoreit, J., Houlsby, N., 2021. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale. Available at http://arxiv.org/abs/2010.11929 (verified 25 October 2023).
El-Medany, W.M., 2008. FPGA implementation for humidity and temperature remote sensing system. Pages 1–4 in 2008 IEEE 14th International Mixed-Signals, Sensors, and Systems Test Workshop.
Fang, C., Zheng, H., Yang, J., Deng, H., Zhang, T., 2022. Study on Poultry Pose Estimation Based on Multi-Parts Detection. Animals 12, 1322.
Fei, J.D., Hao, W., Jun, W., Wei, X., 2023. Real-Time Recognition Study of Egg-Collecting Robot in Free-Range Duck Sheds. Available at https://papers.ssrn.com/abstract=4396479 (verified 15 November 2023).
Feng, X., Jiang, Y., Yang, X., Du, M., Li, X., 2019. Computer vision algorithms and hardware implementations: A survey. Integration 69, 309–320.
Feng, Z., Song, L., Duan, J., He, L., Zhang, Y., Wei, Y., Feng, W., 2021. Monitoring Wheat Powdery Mildew Based on Hyperspectral, Thermal Infrared, and RGB Image Data Fusion. Sensors (Basel) 22, 31.
Fernandes, A.F.A., Dórea, J.R.R., de M. Rosa, G.J., 2020. Image Analysis and Computer Vision Applications in Animal Sciences: An Overview. Front Vet Sci 7, 551269.
Franzo, G., Legnardi, M., Faustini, G., Tucciarone, C.M., Cecchinato, M., 2023. When Everything Becomes Bigger: Big Data for Big Poultry Production. Animals 13, 1804.
Genç, N., 2023. Artificial Intelligence in Physical Education and Sports: New Horizons with ChatGPT. MJSS 6, 17–32.
Gong, S., Zhong, Y., Ma, W., Li, J., Wang, Z., Zhang, J., Heng, P.-A., Dou, Q., 2023. 3DSAM-adapter: Holistic Adaptation of SAM from 2D to 3D for Promptable Medical Image Segmentation. Available at http://arxiv.org/abs/2306.13465 (verified 20 January 2024).
Gribovskiy, A., Halloy, J., Deneubourg, J.-L., Bleuler, H., Mondada, F., 2010. Towards mixed societies of chickens and robots. Pages 4722–4728 in 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems.
Gribovskiy, A., Halloy, J., Deneubourg, J.L., Mondada, F., 2018. Designing a socially integrated mobile robot for ethological research. Robotics and Autonomous Systems 103, 42–55.
Guanjun, B., Mimi, J., Yi, X., Shibo, C., Qinghua, Y., 2019. Cracked egg recognition based on machine vision. Computers and Electronics in Agriculture 158, 159–166.
Guo, Y., Aggrey, S.E., Yang, X., Oladeinde, A., Qiao, Y., Chai, L., 2023. Detecting broiler chickens on litter floor with the YOLOv5-CBAM deep learning model. Artificial Intelligence in Agriculture 9, 36–45.
HajiRassouliha, A., Taberner, A.J., Nash, M.P., Nielsen, P.M.F., 2018. Suitability of recent hardware accelerators (DSPs, FPGAs, and GPUs) for computer vision and image processing algorithms. Signal Processing: Image Communication 68, 101–119.
Heo, S., Cho, S., Dinh, P.T.N., Park, J., Jin, D.-H., Cha, J., Kim, Y.-K., Koh, Y.J., Lee, S.H., Lee, J.H., 2023. A genome-wide association study for eumelanin pigmentation in chicken plumage using a computer vision approach. Animal Genetics 54, 355–362.
Hussain, A.S., Liu, S., Sun, C., Shan, Y., 2023. M2UGen: Multi-modal Music Understanding and Generation with the Power of Large Language Models. Available at http://arxiv.org/abs/2311.11255 (verified 23 November 2023).
Ifuchenwuwa, A.E., Osaghae, E.O., Basaky, F.D., 2023. Deep Learning Implementation for Poultry Disease Detection and Control. 8.
Jaihuni, M., Gan, H., Tabler, T., Prado, M., Qi, H., Zhao, Y., 2023. Broiler Mobility Assessment via a Semi-Supervised Deep Learning Model and Neo-Deep Sort Algorithm. Animals 13, 2719.
Jamil, S., Piran, M.J., Kwon, O.-J., 2022. A Comprehensive Survey of Transformers for Computer Vision. Available at http://arxiv.org/abs/2211.06004 (verified 25 October 2023).
Jocher, G., 2020. YOLOv5 by Ultralytics. Available at https://github.com/ultralytics/yolov5 (verified 26 March 2023).
Joffe, B.P., Usher, C.T., 2017. Autonomous robotic system for picking up floor eggs in poultry houses. In 2017 ASABE Annual International Meeting (p. 1). American Society of Agricultural and Biological Engineers.
Joo, K.H., Duan, S., Weimer, S.L., Teli, M.N., 2022. Birds’ Eye View: Measuring Behavior and Posture of Chickens as a Metric for Their Well-Being. Available at http://arxiv.org/abs/2205.00069 (verified 4 November 2023).
Jung, L., Nasirahmadi, A., Schulte-Landwehr, J., Knierim, U., 2021. Automatic Assessment of Keel Bone Damage in Laying Hens at the Slaughter Line. Animals (Basel) 11, 163.
Kanash, R.S., Alavi, S.E., Abed, A.A., 2021. Design and Implementation of Voice Controlled Robotic ARM. Pages 284–289 in 2021 International Conference on Communication & Information Technology (ICICT).
Kashiha, M., Pluk, A., Bahr, C., Vranken, E., Berckmans, D., 2013. Development of an early warning system for a broiler house using computer vision. Biosystems Engineering 116, 36–45.
Khairunissa, J., Wahjuni, S., Soesanto, I.R.H., Wulandari, W., 2021. Detecting Poultry Movement for Poultry Behavioral Analysis using The Multi-Object Tracking (MOT) Algorithm. Pages 265–268 in 2021 8th International Conference on Computer and Communication Engineering (ICCCE).
Kirillov, A., Mintun, E., Ravi, N., Mao, H., Rolland, C., Gustafson, L., Xiao, T., Whitehead, S., Berg, A.C., Lo, W.-Y., Dollár, P., Girshick, R., 2023. Segment Anything. Available at http://arxiv.org/abs/2304.02643 (verified 25 June 2023).
Kodaira, V., Siriani, A.L.R., Medeiros, H.P., De Moura, D.J., Pereira, D.F., 2023. Assessment of Preference Behavior of Layer Hens under Different Light Colors and Temperature Environments in Long-Time Footage Using a Computer Vision System. Animals 13, 2426.
Kromanis, R., Kripakaran, P., 2021. A multiple camera position approach for accurate displacement measurement using computer vision. J Civil Struct Health Monit 11, 661–678.
Kumar, S., Sharma, S.C., Kumar, R., 2023. Wireless Sensor Network Based Real-Time Pedestrian Detection and Classification for Intelligent Transportation System. International Journal of Mathematical, Engineering and Management Sciences 8, 194–212.
Lamping, C., Derks, M., Groot Koerkamp, P., Kootstra, G., 2022. ChickenNet – an end-to-end approach for plumage condition assessment of laying hens in commercial farms using computer vision. Computers and Electronics in Agriculture 194, 106695.
Lee, S., 2012. Depth camera image processing and applications. Pages 545–548 in 2012 19th IEEE International Conference on Image Processing.
Li, L., Wang, Z., Hou, W., Zhou, Z., Di, M., Xue, H., Yu, Y., 2023b. Recognition of Aggressive Chicken Behavior Based on Machine Learning. Available at https://papers.ssrn.com/abstract=4442722 (verified 5 November 2023).
Li, G., Huang, Y., Chen, Z., Chesser, G.D., Purswell, J.L., Linhoss, J., Zhao, Y., 2021a. Practices and Applications of Convolutional Neural Network-Based Computer Vision Systems in Animal Farming: A Review. Sensors 21, 1492.
Li, G., Hui, X., Chen, Z., Chesser, G.D., Zhao, Y., 2021b. Development and evaluation of a method to detect broilers continuously walking around feeder as an indication of restricted feeding behaviors. Computers and Electronics in Agriculture 181, 105982.
Li, G., Gates, R.S., Ramirez, B.C., 2023. An On-Site Feces Image Classifier System for Chicken Health Assessment: A Proof of Concept. Applied Engineering in Agriculture 39 (4), 417–426.
Li, N., Ren, Z., Li, D., Zeng, L., 2020. Review: Automated techniques for monitoring the behaviour and welfare of broilers and laying hens: towards the goal of precision livestock farming. Animal 14, 617–625.
Li, X., Zhao, Z., Wu, J., Huang, Y., Wen, J., Sun, S., Xie, H., Sun, J., Gao, Y., 2022. Y-BGD: Broiler counting based on multi-object tracking. Computers and Electronics in Agriculture 202, 107347.
Liang, M., Yang, B., Zeng, W., Chen, Y., Hu, R., Casas, S., Urtasun, R., 2020. PnPNet: End-to-End Perception and Prediction with Tracking in the Loop. Available at http://arxiv.org/abs/2005.14711 (verified 23 November 2023).
Lin, C.Y., Hsieh, K.W., Tsai, Y.C., Kuo, Y.F., 2018. Monitoring chicken heat stress using deep convolutional neural networks. In 2018 ASABE Annual International Meeting (p. 1). American Society of Agricultural and Biological Engineers.
Lin, C.-Y., Hsieh, K.-W., Tsai, Y.-C., Kuo, Y.-F., 2020. Automatic Monitoring of Chicken Movement and Drinking Time Using Convolutional Neural Networks. Trans. ASABE 63, 2029–2038.
Lin, C., Yeh, F., Wu, B., Yang, C., 2019. The effects of reflected glare and visual field lighting on computer vision syndrome. Clinical and Experimental Optometry 102, 513–520.
Liu, H.-W., Chen, C.-H., Tsai, Y.-C., Hsieh, K.-W., Lin, H.-T., 2021. Identifying Images of Dead Chickens with a Chicken Removal System Integrated with a Deep Learning Algorithm. Sensors 21, 3579.
Ma, X., Karimpour, A., Wu, Y.-J., 2020. Statistical evaluation of data requirement for ramp metering performance assessment. Transportation Research Part A: Policy and Practice 141, 248–261.
Ma, X., Lu, X., Huang, Y., Yang, X., Xu, Z., Mo, G., Ren, Y., Li, L., 2022. An Advanced Chicken Face Detection Network Based on GAN and MAE. Animals 12, 3055.
Ma, L., Sun, K., Tu, K., Pan, L., Zhang, W., 2017. Identification of double-yolked duck egg using computer vision. PLOS ONE 12, e0190054.
Massari, J.M., de Moura, D.J., de Alencar Nääs, I., Pereira, D.F., Branco, T., 2022. Computer-Vision-Based Indexes for Analyzing Broiler Response to Rearing Environment: A Proof of Concept. Animals 12, 846.
Mehdizadeh, S.A., Neves, D.P., Tscharke, M., Nääs, I.A., Banhazi, T.M., 2015. Image analysis method to evaluate beak and head motion of broiler chickens during feeding. Computers and Electronics in Agriculture 114, 88–95.
Mertens, K., De Ketelaere, B., Kamers, B., Bamelis, F.R., Kemps, B.J., Verhoelst, E.M., De Baerdemaeker, J.G., Decuypere, E.M., 2005. Dirt detection on brown eggs by means of color computer vision. Poultry Science 84, 1653–1659.
Misimi, E., Øye, E.R., Eilertsen, A., Mathiassen, J.R., Åsebø, O.B., Gjerstad, T., Buljo, J., Skotheim, Ø., 2016. GRIBBOT – Robotic 3D vision-guided harvesting of chicken fillets. Computers and Electronics in Agriculture 121, 84–100.
Mocanu, B., Tapu, R., Zaharia, T., 2016. When Ultrasonic Sensors and Computer Vision Join Forces for Efficient Obstacle Detection and Recognition. Sensors 16, 1807.
Monmasson, E., Idkhajine, L., Cirstea, M.N., Bahri, I., Tisan, A., Naouar, M.W., 2011. FPGAs in Industrial Control Applications. IEEE Transactions on Industrial Informatics 7, 224–243.
Mortensen, A.K., Lisouski, P., Ahrendt, P., 2016. Weight prediction of broiler chickens using 3D computer vision. Computers and Electronics in Agriculture 123, 319–326.
Mota-Grajales, R., Torres-Peña, J.C., Camas-Anzueto, J.L., Pérez-Patricio, M., Grajales Coutiño, R., López-Estrada, F.R., Escobar-Gómez, E.N., Guerra-Crespo, H., 2019. Defect detection in eggshell using a vision system to ensure the incubation in poultry production. Measurement 135, 39–46.
Nakarmi, A., Tang, L., Xin, H., 2014. Automated Tracking and Behavior Quantification of Laying Hens Using 3D Computer Vision and Radio Frequency Identification Technologies. Transactions of the ASABE 57, 1455–1472.
Nakrosis, A., Paulauskaite-Taraseviciene, A., Raudonis, V., Narusis, I., Gruzauskas, V., Gruzauskas, R., Lagzdinyte-Budnike, I., 2023. Towards Early Poultry Health Prediction through Non-Invasive and Computer Vision-Based Dropping Classification. Animals 13, 3041.
Nasir, A.F.A., Sabarudin, S.S., Majeed, A.P.P.A., Ghani, A.S.A., 2018. Automated egg grading system using computer vision: Investigation on weight measure versus shape parameters. IOP Conf. Ser.: Mater. Sci. Eng. 342, 012003.
Neethirajan, S., 2022. ChickTrack – A quantitative tracking tool for measuring chicken activity. Measurement 191, 110819.
Neves, D.P., Mehdizadeh, S.A., Tscharke, M., de A. Nääs, I., Banhazi, T.M., 2015. Detection of flock movement and behaviour of broiler chickens at different feeders using image analysis. Information Processing in Agriculture 2, 177–182.
Okinda, C., Nyalala, I., Korohou, T., Okinda, C., Wang, J., Achieng, T., Wamalwa, P., Mang, T., Shen, M., 2020a. A review on computer vision systems in monitoring of poultry: A welfare perspective. Artificial Intelligence in Agriculture 4, 184–208.
Okinda, C., Sun, Y., Nyalala, I., Korohou, T., Opiyo, S., Wang, J., Shen, M., 2020b. Egg volume estimation based on image processing and computer vision. Journal of Food Engineering 283, 110041.
Olejnik, K., Popiela, E., Opaliński, S., 2022. Emerging Precision Management Methods in Poultry Sector. Agriculture 12, 718.
Orlosky, J., Itoh, Y., Ranchet, M., Kiyokawa, K., Morgan, J., Devos, H., 2017. Emulation of physician tasks in eye-tracked virtual reality for remote diagnosis of neurodegenerative disease. IEEE Transactions on Visualization and Computer Graphics 23 (4), 1302–1311.
Pacure Angelia, H.L., Bolo, J.M.U., Eliot, C.J.I., Gelicania, G.Y., 2022. Grade Classification of Chicken Eggs through Computer Vision. Association for Computing Machinery, New York, NY, USA, pp. 149–156.
Pan, J., Wu, J., Gaur, Y., Sivasankaran, S., Chen, Z., Liu, S., Li, J., 2023. COSMIC: Data Efficient Instruction-tuning For Speech In-Context Learning. Available at http://arxiv.org/abs/2311.02248 (verified 23 November 2023).
Pan, L., Zhan, G., Tu, K., Tu, S., Liu, P., 2011. Eggshell crack detection based on computer vision and acoustic response by means of back-propagation artificial neural network. Eur Food Res Technol 233, 457–463.
Park, M., Britton, D., Daley, W., McMurray, G., Navaei, M., Samoylov, A., Usher, C., Xu, J., 2022. Artificial intelligence, sensors, robots, and transportation systems drive an innovative future for poultry broiler and breeder management. Anim Front 12, 40–48.
Pereira, D.F., Miyamoto, B.C.B., Maia, G.D.N., Tatiana Sales, G., Magalhães, M.M., Gates, R.S., 2013. Machine vision to identify broiler breeder behavior. Computers and Electronics in Agriculture 99, 194–199.
Pleuss, J.D., Talty, K., Morse, S., Kuiper, P., Scioletti, M., Heymsfield, S.B., Thomas, D.M., 2019. A machine learning approach relating 3D body scans to body composition in humans. Eur J Clin Nutr 73, 200–208.
Potamitis, I., 2023. ChatGPT in the context of precision agriculture data analytics. Available at http://arxiv.org/abs/2311.06390 (verified 23 November 2023).
Prakash, A., Rajendran, A., Pranav, V.K., Sreedharan, P., 2021. Design, Analysis, Manufacturing and Testing of a SCARA Robot with Pneumatic Gripper for the Poultry Industry. IOP Conf. Ser.: Mater. Sci. Eng. 1132, 012010.
Pu, H., Lian, J., Fan, M., 2018. Automatic Recognition of Flock Behavior of Chickens with Convolutional Neural Network and Kinect Sensor. Int. J. Patt. Recogn. Artif. Intell. 32, 1850023.
Qi, L., Zhao, M., Li, Z., Shen, D., Lu, J., 2020. Non-destructive testing technology for raw eggs freshness: a review. SN Appl. Sci. 2, 1113.
Quach, L.-D., Pham-Quoc, N., Tran, D.C., Hassan, Mohd.F., 2020. Identification of Chicken Diseases Using VGGNet and ResNet Models. Pages 259–269 in Industrial Networks and Intelligent Systems. Vo, N.-S., Hoang, V.-P., eds. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering. Springer International Publishing, Cham.
Quintana, M.M.D., Infante, R.R.D., Torrano, J.C.S., Pacis, M.C., 2022. A Hybrid Solar Powered Chicken Disease Monitoring System using Decision Tree Models with Visual and Acoustic Imagery. In: 2022 14th International Conference on Computer and Automation Engineering (ICCAE), pp. 65–69.
Raj, A.A.G., Jayanthi, J.G., 2018. IoT-based real-time poultry monitoring and health status identification. Pages 1–7 in 2018 11th International Symposium on Mechatronics and Its Applications (ISMA).
Raja, R., Burusa, A.K., Kootstra, G., van Henten, E., 2023. Advanced Robotic System for Efficient Pick-and-Place of Deformable Poultry in Cluttered Bin: A Comprehensive Evaluation Approach. Available at https://www.techrxiv.org/articles/preprint/Advanced_Robotic_System_for_Efficient_Pick-and-Place_of_Deformable_Poultry_in_Cluttered_Bin_A_Comprehensive_Evaluation_Approach/23823117/1 (verified 15 November 2023).
Ren, G., Lin, T., Ying, Y., Chowdhary, G., Ting, K.C., 2020. Agricultural robotics research applicable to poultry production: A review. Computers and Electronics in Agriculture 169, 105216.
Rozemberczki, B., Scherer, P., He, Y., Panagopoulos, G., Riedel, A., Astefanoaei, M., Kiss, O., Beres, F., López, G., Collignon, N., Sarkar, R., 2021. PyTorch Geometric Temporal: Spatiotemporal Signal Processing with Neural Machine Learning Models.
Somnath, S., Smith, C.R., Kalinin, S.V., Chi, M., Borisevich, A., Cross, N., Duscher, G., Jesse, S., 2018. Feature extraction via similarity search: application to atom finding and denoising in electron and scanning probe microscopy imaging. Adv Struct Chem Imag 4, 3.
Subedi, S., Bist, R., Yang, X., Chai, L., 2023a. Tracking pecking behaviors and damages of cage-free laying hens with machine vision technologies. Computers and Electronics in Agriculture 204, 107545.
Subedi, S., Bist, R., Yang, X., Chai, L., 2023b. Tracking Floor Eggs with Machine Vision in Cage-free Hen Houses. Poultry Science, 102637.
Tang, C., Yu, W., Sun, G., Chen, X., Tan, T., Li, W., Lu, L., Ma, Z., Zhang, C., 2023. SALMONN: Towards Generic Hearing Abilities for Large Language Models. Available at http://arxiv.org/abs/2310.13289 (verified 23 November 2023).
Thompson, T.N., Vickrey, A., Shapiro, M.D., Hsu, E., 2023. A computer vision framework for quantification of feather growth patterns. Frontiers in Bioinformatics 3 (verified 31 October 2023).
Tippannavar, S.S., S D, Y., K M, P., 2023. SDR – Self Driving Car Implemented using Reinforcement Learning & Behavioural Cloning. In: 2023 International Conference on Recent Trends in Electronics and Communication (ICRTEC), pp. 1–7.
Truswell, A., Lee, Z.Z., Stegger, M., Blinco, J., Abraham, R., Jordan, D., Milotic, M., Hewson, K., Pang, S., Abraham, S., 2023. Augmented surveillance of antimicrobial resistance with high-throughput robotics detects transnational flow of fluoroquinolone-resistant Escherichia coli strain into poultry. Journal of Antimicrobial Chemotherapy, dkad323.
van den Heuvel, H., Youssef, A., Grat, L.M., Neethirajan, S., 2022. Quantifying the Effect of an Acute Stressor in Laying Hens using Thermographic Imaging and Vocalisations. Bioengineering.
van der Eijk, J.A.J., Guzhva, O., Voss, A., Möller, M., Giersberg, M.F., Jacobs, L., de Jong, I.C., 2022. Seeing is caring – automated assessment of resource use of broilers with computer vision techniques. Frontiers in Animal Science 3 (verified 19 November 2023).
Vroegindeweij, B.A., Blaauw, S.K., IJsselmuiden, J.M.M., van Henten, E.J., 2018. Evaluation of the performance of PoultryBot, an autonomous mobile robotic platform for poultry houses. Biosystems Engineering 174, 295–315.
Wang, S., Cheng, J., Wen, Y., 2010. Research on Non-destructive Comprehensive Detection and Grading of Poultry Eggs Based on Intelligent Robot. Pages 487–498 in Computer and Computing Technologies in Agriculture III. Li, D., Zhao, C., eds. IFIP Advances in Information and Communication Technology. Springer, Berlin, Heidelberg.
Wang, J., Wang, N., Li, L., Ren, Z., 2020. Real-time behavior detection and judgment of egg breeders based on YOLO v3. Neural Comput. Appl. 32, 5471–5481.
Wang, C.-H., Xie, B.-X., Chang, C.-L., 2019. Design and Implementation of Livestock Robot for Egg Picking and Classification in the Farm. In: 2019 International Symposium on Electrical and Electronics Engineering (ISEE), pp. 161–165.
Xiao, L., Song, C., Rao, X., 2017. Head and body motion tracking of caged chicken in video. Available at https://doi.org/10.13031/aim.201700464 (verified 4 November 2023).
Yahya, M.A., Abdul-Rahman, S., Mutalib, S., 2020. Object Detection for Autonomous Vehicle with LiDAR Using Deep Learning. In: 2020 IEEE 10th International Conference on System Engineering and Technology (ICSET), pp. 207–212.
Yang, X., Bist, R.B., Subedi, S., Chai, L., 2023a. A Computer Vision-Based Automatic System for Egg Grading and Defect Detection. Animals 13, 2354.
Yang, X., Bist, R., Subedi, S., Chai, L., 2023b. A deep learning method for monitoring spatial distribution of cage-free hens. Artificial Intelligence in Agriculture 8, 20–29.
Yang, X., Dai, H., Wu, Z., Bist, R.B., Subedi, S., Sun, J., Lu, G., Li, C., Liu, T., Chai, L., 2024. An innovative segment anything model for precision poultry monitoring. Computers and Electronics in Agriculture 222, 109045.
Yang, X., Dai, H., Wu, Z., Bist, R., Subedi, S., Sun, J., Lu, G., Li, C., Liu, T., Chai, L., 2023c. SAM for Poultry Science. Available at http://arxiv.org/abs/2305.10254 (verified 30 June 2023).
Yang, J., Gao, M., Li, Z., Gao, S., Wang, F., Zheng, F., 2023d. Track Anything: Segment Anything Meets Videos. Available at http://arxiv.org/abs/2304.11968 (verified 22 September 2023).
Yang, J., Zhang, T., Fang, C., Zheng, H., 2023e. A defencing algorithm based on deep learning improves the detection accuracy of caged chickens. Computers and Electronics in Agriculture 204, 107501.
Yao, Y., Yu, H., Mu, J., Li, J., Pu, H., 2020. Estimation of the Gender Ratio of Chickens Based on Computer Vision: Dataset and Exploration. Entropy 22, 719.
You, J., Lou, E., Afrouziyeh, M., Zukiwsky, N.M., Zuidhof, M.J., 2021. A supervised machine learning method to detect anomalous real-time broiler breeder body weight data recorded by a precision feeding system. Computers and Electronics in Agriculture 185, 106171.
Zang, Y., Zhu, Z., Song, Z., Mao, E., 2011. Virtual Reality and the Application in Virtual Experiment for Agricultural Equipment. Pages 257–268 in Computer and Computing
Association for Computing Machinery, New York, NY, USA, pp. 4564–4573. Technologies in Agriculture IV. Li, D., Liu, Y., Chen, Y., eds. IFIP Advances in
Sadeghi, M., Banakar, A., Minaei, S., Orooji, M., Shoushtari, A., Li, G., 2023. Early Information and Communication Technology. Springer, Berlin, Heidelberg.
Detection of Avian Diseases Based on Thermography and Artificial Intelligence. Zhang, H., Chen, C., 2020. Design of Sick Chicken Automatic Detection System Based on
Animals 13, 2348. Improved Residual Network. In: In 2020 IEEE 4th Information Technology,
Shajahan, J.M.A., Reyes, S.M., Xiao, J., 2021, December. Camera Lens Dust Detection Networking, Electronic and Automation Control Conference (ITNEC),
and Dust Removal for Mobile Robots in Dusty Fields. IEEE, pp. 687–691. pp. 2480–2485.
Sigut, J., Castro, M., Arnay, R., Sigut, M., 2020. OpenCV Basics: A Mobile Application to Zhang, C., D. Han, S. Zheng, J. Choi, T.-H. Kim, and C. S. Hong. 2023b. MobileSAMv2:
Support the Teaching of Computer Vision Concepts. IEEE Transactions on Education Faster Segment Anything to Everything. Available at http://arxiv.org/abs/
63, 328–335. 2312.09579 (verified 20 January 2024).
Siriani, A.L.R., I. B. de C. Miranda, S. A. Mehdizadeh, and D. F. Pereira, 2023. Chicken Zhang, Y., Ge, Y., Guo, Y., Miao, H., Zhang, S., 2023a. An approach for goose egg
Tracking and Individual Bird Activity Monitoring Using the BoT-SORT Algorithm. recognition for robot picking based on deep learning. British Poultry Science 64,
AgriEngineering 5, 1677–1693. 343–356.
X. Yang et al. Computers and Electronics in Agriculture 225 (2024) 109339
Zhang, J., Lu, W., Jian, X., Hu, Q., Dai, D., 2023c. Nondestructive Detection of Egg Freshness Based on Infrared Thermal Imaging. Sensors 23, 5530.
Zhang, X., Zhang, Y., Geng, J., Pan, J., Huang, X., Rao, X., 2023d. Feather Damage Monitoring System Using RGB-Depth-Thermal Model for Chickens. Animals 13, 126.
Zhang, D., Zhou, F., 2023. Self-Supervised Image Denoising for Real-World Images With Context-Aware Transformer. IEEE Access 11, 14340–14349.
Zhang, D., Zhou, F., Yang, X., Gu, Y., 2023e. Unleashing the Power of Self-Supervised Image Denoising: A Comprehensive Review. Available at http://arxiv.org/abs/2308.00247 (verified 7 October 2023).
Zheng, H., Zhang, T., Fang, C., Zeng, J., Yang, X., 2021. Design and Implementation of Poultry Farming Information Management System Based on Cloud Database. Animals 11, 900.
Zhou, C., Lin, S., 2007. Removal of Image Artifacts Due to Sensor Dust. In: 2007 IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–8.
Zhou, M., Zhu, J., Cui, Z., Wang, H., Sun, X., 2023. Detection of abnormal chicken droppings based on improved Faster R-CNN. International Journal of Agricultural and Biological Engineering 16, 243–249.
Zhuang, X., Bi, M., Guo, J., Wu, S., Zhang, T., 2018. Development of an early warning algorithm to detect sick broilers. Computers and Electronics in Agriculture 144, 102–113.
Zhu, J., Zhou, M., 2021. Online detection of abnormal chicken manure based on machine vision. In: 2021 ASABE Annual International Virtual Meeting, p. 1. American Society of Agricultural and Biological Engineers.