Using Machine Learning to Detect Dustbathing Behavior of Cage-free Laying Hens Automatically
DOI: https://doi.org/10.13031/aim.202400195
Paper Number: 2400195
The authors are solely responsible for the content of this meeting presentation. The presentation does not necessarily reflect the official position of the
American Society of Agricultural and Biological Engineers (ASABE), and its printing and distribution does not constitute an endorsement of views
which may be expressed. Meeting presentations are not subject to the formal peer review process by ASABE editorial committees; therefore, they are
not to be presented as refereed publications. Publish your paper in our journal after successfully completing the peer review process. See
www.asabe.org/JournalSubmission for details. Citation of this work should state that it is from an ASABE meeting paper. EXAMPLE: Author’s Last
Name, Initials. 2024. Title of presentation. ASABE Paper No. ---. St. Joseph, MI.: ASABE. For information about securing permission to reprint or
reproduce a meeting presentation, please contact ASABE at www.asabe.org/copyright (2950 Niles Road, St. Joseph, MI 49085-9659 USA).
Introduction
The U.S. laying hen industry is in a period of transition from conventional caged (CC) systems to cage-free (CF) housing,
largely due to increasing concerns for animal welfare and public demand (Chai et al., 2017, 2018, 2019). Cage-free housing
offers laying hens a more favorable environment with increased space and opportunities for natural behaviors, such as
dustbathing (DB), which is crucial for maintaining plumage and regulating feather lipids (UEP, 2017; Bist et al., 2023,
2024a, 2024b). DB, consisting of 15 elements, serves as a vital maintenance behavior for laying hens (Kruijt,
1964; Vestergaard, 1994). While the motivation behind DB remains debated among scientists, it is widely accepted that laying
hens engage in DB to clean their plumage and keep feathers in good condition (Van Liere and Bokma, 1987). The absence
of suitable DB materials can lead to stress and health issues in laying hens (Vestergaard et al., 1997).
Early exposure to DB materials has been shown to positively impact hen health and behavior (Nicol et al., 2001).
However, manual detection of DB behavior from video recordings is labor-intensive and prone to errors. Therefore, there
is a need for more robust and precise detection technologies. Precision poultry farming, utilizing image analysis and machine
learning (ML) algorithms, offers a promising solution for accurate and efficient detection of poultry behaviors (Li, 2018; Gu
et al., 2022).
Machine learning and deep learning methods such as the You Only Look Once (YOLO) model, particularly the YOLOv5
variant, have emerged as leading approaches for object detection in poultry behavior analysis (Guo et al., 2020, 2021;
Neethirajan, 2022; Bist et al., 2024b). Studies have demonstrated the effectiveness of YOLO models in detecting various
behaviors and activities in CF housing, including pecking, floor eggs, piling, mislaying behavior, dead hens, egg grading
and defect detection, and tracking individual birds (Subedi et al., 2023a, 2023b; Bist et al., 2023a, 2023b, 2023c; Yang et al.,
2023, 2024). Recent advances have also been made in tracking the locomotion of individual chickens, for example with the
Track Anything Model (TAM) (Yang et al., 2024). Newer YOLO models, such as YOLOv6, YOLOv7, and YOLOv8, have
further improved accuracy and applicability for poultry behavior monitoring (Jocher et al., 2023).
Despite the widespread adoption of YOLO models in poultry research, there has been limited exploration into using these
models to detect DB behavior in laying hens within CF housing. This study aims to fill this gap by developing and optimizing
a deep learning-based detector for monitoring DB behavior. The objectives include developing and testing deep learning
methods for DB behavior detection, identifying the optimal model, and assessing performance across different growing
phases of laying hens. Through this research, we seek to enhance our understanding of laying hen behavior in CF housing
and contribute to the development of effective monitoring systems for improving animal welfare in the poultry industry.
Figure 1. Cage-free facility for raising Hy-line W-36 laying hens/pullets.
Table 1. Data pre-processing for YOLOv7 and YOLOv8 models, where each image contains more than one bird performing DB behavior.

Class^a           Original data set    Train (70%)    Validation (20%)    Test (10%)
Starter-DB        1000                 700            200                 100
Grower-DB         1000                 700            200                 100
Developer-DB      1000                 700            200                 100
Pre-lay-DB        1000                 700            200                 100
Pre-peak-DB       1000                 700            200                 100
Layers-DB         1000                 700            200                 100

^a Each class or experimental setting was run for 200 epochs with a batch size of 8.
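For readers who want to reproduce the 70/20/10 split summarized in Table 1, the following minimal Python sketch illustrates one way to partition the labeled images by class. The folder layout, image extension, and random seed are assumptions for illustration rather than the exact procedure used in this study.

```python
# Minimal sketch of the per-class 70/20/10 train/validation/test split in Table 1.
# The folder layout, image extension, and seed are illustrative assumptions.
import random
import shutil
from pathlib import Path

CLASSES = ["Starter-DB", "Grower-DB", "Developer-DB",
           "Pre-lay-DB", "Pre-peak-DB", "Layers-DB"]

def split_class(src_dir: Path, dst_root: Path, seed: int = 42) -> None:
    """Shuffle one class's images and copy them into train/val/test subfolders."""
    images = sorted(src_dir.glob("*.jpg"))
    random.Random(seed).shuffle(images)
    n = len(images)
    cuts = {"train": (0, int(0.7 * n)),             # 70%
            "val":   (int(0.7 * n), int(0.9 * n)),  # 20%
            "test":  (int(0.9 * n), n)}             # 10%
    for split, (lo, hi) in cuts.items():
        out = dst_root / split / src_dir.name
        out.mkdir(parents=True, exist_ok=True)
        for img in images[lo:hi]:
            shutil.copy2(img, out / img.name)

if __name__ == "__main__":
    raw_root = Path("raw_images")      # hypothetical: one subfolder per DB class
    dataset_root = Path("db_dataset")  # hypothetical output root for YOLO training
    for cls in CLASSES:
        split_class(raw_root / cls, dataset_root)
```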
Figure 2. The process of the dustbathing detection system (i.e., data collection, labeling, training,
validation, testing, and implementation).
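As an illustration of the training and validation steps outlined in Figure 2, the sketch below uses the open-source ultralytics package with the settings noted in Table 1 (200 epochs, batch size of 8). The dataset YAML path and pretrained weight file are assumptions; this is not the exact training script used in the study.

```python
# Hedged sketch of the training/validation step shown in Figure 2, using the
# ultralytics package. File names and paths are assumptions for illustration.
from ultralytics import YOLO

# Start from COCO-pretrained YOLOv8x weights (the largest YOLOv8 variant).
model = YOLO("yolov8x.pt")

# Train with the settings noted in Table 1: 200 epochs, batch size 8.
# "db_dataset.yaml" is a hypothetical dataset description file listing the
# train/val/test image folders and the six DB classes.
model.train(data="db_dataset.yaml", epochs=200, batch=8, imgsz=640)

# Evaluate on the validation split and report the box-detection metrics.
metrics = model.val()
print(metrics.box.map50, metrics.box.map)  # [email protected] and [email protected]:0.95
```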
The head of YOLOv7 is similar to that of YOLOv5, with key distinctions such as the replacement of the CSP module with the
E-ELAN module and the transformation of the downsampling module into the MPConv layer. The entire head layer
encompasses an SPPCSPC layer, multiple Bconv layers, several MPConv layers, numerous Catconv layers, and RepVGG block
layers that generate the three subsequent detection heads, as detailed by Yang et al. (2022c). The SPPCSPC layer is formed through
a pyramid pooling operation and a CSP structure, with the output information concatenated. The Catconv layer functions
similarly to the E-ELAN layer, enabling deeper networks to learn and converge more efficiently (Wang et al., 2023).
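As a rough illustration of the concatenation-and-fuse idea behind the Catconv/E-ELAN layers described above, the PyTorch sketch below stacks a few convolution branches, concatenates their outputs, and fuses them with a 1×1 convolution. It is a simplified conceptual sketch, not the official YOLOv7 implementation; the branch count and channel widths are assumptions.

```python
# Conceptual PyTorch sketch of a concatenation-based aggregation block in the
# spirit of Catconv/E-ELAN. Not the official YOLOv7 code; sizes are assumptions.
import torch
import torch.nn as nn

class ConvBNSiLU(nn.Module):
    """Convolution + batch norm + SiLU activation."""
    def __init__(self, c_in, c_out, k=3, s=1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, k, s, k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.SiLU()

    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

class CatConvBlock(nn.Module):
    """Run stacked conv branches, concatenate all intermediate outputs, fuse with 1x1 conv."""
    def __init__(self, c_in, c_hidden, n_branches=4):
        super().__init__()
        self.branches = nn.ModuleList(
            [ConvBNSiLU(c_in if i == 0 else c_hidden, c_hidden) for i in range(n_branches)]
        )
        self.fuse = ConvBNSiLU(c_in + n_branches * c_hidden, c_in, k=1)

    def forward(self, x):
        outs, y = [x], x
        for branch in self.branches:
            y = branch(y)
            outs.append(y)
        return self.fuse(torch.cat(outs, dim=1))

x = torch.randn(1, 64, 80, 80)        # dummy feature map
print(CatConvBlock(64, 32)(x).shape)  # torch.Size([1, 64, 80, 80])
```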
Precision

$$\mathrm{Precision} = \frac{TP}{TP + FP} \times 100\% = \frac{\text{true dustbathing detections}}{\text{all detected bounding boxes}} \tag{i}$$

where TP, FP, and FN stand for true positive, false positive, and false negative values, respectively.

Recall

$$\mathrm{Recall} = \frac{TP}{TP + FN} \times 100\% = \frac{\text{true dustbathing detections}}{\text{all ground-truth bounding boxes}} \tag{ii}$$

F1 score

$$F1\ \mathrm{score} = \frac{2 \times \mathrm{Recall} \times \mathrm{Precision}}{\mathrm{Recall} + \mathrm{Precision}} \times 100\% \tag{iii}$$

Mean average precision (mAP)

$$\mathrm{mAP} = \frac{\sum_{i=1}^{C} AP_i}{C} \tag{iv}$$

Within this equation, AP_i signifies the average precision of the i-th category, and C represents the total number of categories.
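To make the role of these metrics concrete, the short Python sketch below implements equations (i) through (iv) directly; the detection counts and per-class AP values it uses are hypothetical placeholders, not results from this study.

```python
# Minimal sketch of equations (i)-(iv) at a single confidence/IoU threshold.
# The counts and AP values below are illustrative placeholders only.
def precision(tp: int, fp: int) -> float:
    """Equation (i): share of detections that are true dustbathing events, in %."""
    return tp / (tp + fp) * 100.0

def recall(tp: int, fn: int) -> float:
    """Equation (ii): share of ground-truth dustbathing events detected, in %."""
    return tp / (tp + fn) * 100.0

def f1_score(p: float, r: float) -> float:
    """Equation (iii): harmonic mean of precision and recall (inputs in %)."""
    return 2 * p * r / (p + r)

def mean_average_precision(ap_per_class: list[float]) -> float:
    """Equation (iv): average of per-class AP over C classes."""
    return sum(ap_per_class) / len(ap_per_class)

tp, fp, fn = 90, 10, 8                      # hypothetical detection counts
p, r = precision(tp, fp), recall(tp, fn)
print(f"P={p:.1f}%  R={r:.1f}%  F1={f1_score(p, r):.1f}%")
print(f"mAP={mean_average_precision([0.91, 0.88, 0.93]):.3f}")  # hypothetical APs
```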
Table 3. Performance metrics of the different YOLOv7-DB and YOLOv8-DB models for detecting DB behavior.

Models    Precision (%)    Recall (%)    [email protected] (%)    [email protected]:0.95 (%)    F1-score (%)
The [email protected] of the YOLOv8x-DB model for detecting DB behavior in our study was 93.70%, which exceeded the results
of a previous study by Lou et al. (2023) that applied the YOLOv8 model to small-object detection and reported a highest
[email protected] of 83% and a lowest of 18.1%. In another study, an improved YOLOv8n model (E-YOLO) for detecting estrus in
cows achieved an average precision of 93.90% for estrus and 95.70% for mounting (Wang et al., 2024). The slightly higher
precision in Wang et al. (2024) compared with our study may be attributable to object size: cows are much larger than laying
hens, and larger objects are generally easier to detect than small ones, which likely made the target behavior easier to identify
in cows. However, another study that utilized the YOLOv8 model achieved a lowest [email protected] of 47%, which was 46.7%
lower than our result (Wang et al., 2023a). The lower [email protected] in Wang et al. (2023a) may be attributed to the greater
distance between the camera and the targeted objects. Previous studies have highlighted that factors such as camera height
and image quality can significantly affect detection accuracy (Corregidor-Castro et al., 2021; Gadhwal et al., 2023). In
addition, our study achieved high performance levels for DB detection: these individual metrics combine to yield an
overall F1 score of at least 88% across all models and 92% for the optimal model (YOLOv8x-DB).
Conclusions
The YOLOv8x-DB model achieved the highest precision, recall, mAP, and F1 scores for detecting DB behavior under CF
housing conditions, demonstrating its greater capability and reliability compared with the other models evaluated in this study.
However, all other models also achieved a precision of at least 90% in detecting DB behavior. Across all growth phases of
laying hens, the optimal model (YOLOv8x-DB) achieved a precision of at least 89.30%, a recall ranging from 71.50% to
97.10%, an [email protected] of at least 83.50%, and an [email protected]:0.95 ranging from 66.90% to 80.00%. DB detection precision
was highest during the grower phase, followed by the pre-lay, layers, developer, and pre-peak phases. This study provides a
reference for CF producers showing that DB behavior can be detected automatically with a precision of at least 90% using
any of the four YOLO models evaluated in this study. However, accuracy can be further increased with frequent camera
cleaning. The study highlighted the benefits of the newest YOLO model used here, YOLOv8x, for detecting DB behavior
with higher precision, providing CF layer producers with a valuable tool for monitoring DB behavior and improving laying
hen welfare in CF housing.
Acknowledgements
The study was sponsored by USDA-NIFA AFRI (2023-68008-39853), Georgia Research Alliance, USDA-Hatch projects:
Future Challenges in Animal Production Systems: Seeking Solutions through Focused Facilitation (GEO00895; Accession
Number: 1021519) and Enhancing Poultry Production Systems through Emerging Technologies and Husbandry Practices
(GEO00894; Accession Number: 1021518).
References
Appleby, M. C., J. A. Mench, and B. O. Hughes. 2004. Poultry Behaviour and Welfare. CABI.
Bist, R. B., S. Subedi, L. Chai, P. Regmi, C. W. Ritz, W. K. Kim, and X. Yang. 2023. Effects of Perching on Poultry
Welfare and Production: A Review. Poultry 2:134–157.
Bist, R. B., Yang, X., Subedi, S., & Chai, L. 2024a. Automatic detection of bumblefoot in cage-free hens using computer
vision technologies. Poultry Science, 103780.
Bist, R. B., Yang, X., Subedi, S., Ritz, C. W., Kim, W. K., & Chai, L. 2024b. Electrostatic particle ionization for
suppressing air pollutants in cage-free layer facilities. Poultry Science, 103(4), 103494.
Bist, R. B., S. Subedi, X. Yang, and L. Chai. 2023a. A Novel YOLOv6 Object Detector for Monitoring Piling Behavior
of Cage-Free Laying Hens. AgriEngineering 5:905–923.
Bist, R. B., S. Subedi, X. Yang, and L. Chai. 2023b. Automatic Detection of Cage-Free Dead Hens with Deep Learning
Methods. AgriEngineering 5:1020–1038.
Bist, R. B., X. Yang, S. Subedi, and L. Chai. 2023c. Mislaying behavior detection in cage-free hens with deep learning
technologies. Poult. Sci.:102729.
Chai, L., Zhao, Y., Xin, H., Wang, T., Atilgan, A., Soupir, M., & Liu, K. 2017. Reduction of particulate matter and
ammonia by spraying acidic electrolyzed water onto litter of aviary hen houses: a lab-scale study. Transactions of the
ASABE, 60(2), 497-506.
Chai, L., Xin, H., Zhao, Y., Wang, T., Soupir, M., & Liu, K. 2018. Mitigating ammonia and PM generation of cage-free
henhouse litter with solid additive and liquid spray. Transactions of the ASABE, 61(1), 287-294.
Chai, L., Xin, H., Wang, Y., Oliveira, J., Wang, K., & Zhao, Y. (2019). Mitigating particulate matter generation in a
commercial cage-free hen house. Transactions of the ASABE, 62(4), 877-886.
Corregidor-Castro, A., T. E. Holm, and T. Bregnballe. 2021. Counting breeding gulls with unmanned aerial vehicles:
camera quality and flying height affects precision of a semi-automatic counting method. Ornis Fenn. 98:33–45.
Gadhwal, M., A. Sharda, H. S. Sangha, and D. Van der Merwe. 2023. Spatial corn canopy temperature extraction: How
focal length and sUAS flying altitude influence thermal infrared sensing accuracy. Comput. Electron. Agric. 209:107812.
Gu, Y., S. Wang, Y. Yan, S. Tang, and S. Zhao. 2022. Identification and Analysis of Emergency Behavior of Cage-Reared
Laying Ducks Based on YoloV5. Agriculture 12:485 Available at https://www.mdpi.com/2077-0472/12/4/485 (verified 23
January 2024).
Guo, Y., Chai, L., Aggrey, S. E., Oladeinde, A., Johnson, J., & Zock, G. 2020. A machine vision-based method for
monitoring broiler chicken floor distribution. Sensors, 20(11), 3179.
Guo, Y., Aggrey, S. E., Oladeinde, A., Johnson, J., Zock, G., & Chai, L. 2021. A machine vision-based method optimized
for restoring broiler chicken images occluded by feeding and drinking equipment. Animals, 11(1), 123.
Guo, Y., P. Regmi, Y. Ding, R. B. Bist, and L. Chai. 2023a. Automatic detection of brown hens in cage-free houses with
deep learning methods. Poult. Sci. 102:102784.
Jocher, G., A. Chaurasia, and J. Qiu. 2023. Ultralytics YOLO. Available at https://github.com/ultralytics/ultralytics
(verified 6 February 2024).
Li, Y. 2018. Performance Evaluation of Machine Learning Methods for Breast Cancer Prediction. Appl. Comput. Math.
7:212.
Lou, H., X. Duan, J. Guo, H. Liu, J. Gu, L. Bi, and H. Chen. 2023. DC-YOLOv8: Small-Size Object Detection Algorithm
Based on Camera Sensor. Electronics 12:2323.
Neethirajan, S. 2022. ChickTrack – A quantitative tracking tool for measuring chicken activity. Measurement
191:110819.
Nicol, C. J., A. C. Lindberg, A. J. Phillips, S. J. Pope, L. J. Wilkins, and L. E. Green. 2001. Influence of prior exposure
to wood shavings on feather pecking, dustbathing and foraging in adult laying hens. Appl. Anim. Behav. Sci. 73:141–155.
Subedi, S., R. Bist, X. Yang, and L. Chai. 2023a. Tracking pecking behaviors and damages of cage-free laying hens with
machine vision technologies. Comput. Electron. Agric. 204:107545.
Subedi, S., R. Bist, X. Yang, and L. Chai. 2023b. Tracking floor eggs with machine vision in cage-free hen houses. Poult.
Sci. 102:102637.
UEP (United Egg Producers). 2017. Animal Husbandry Guidelines for U.S. Egg-Laying Flocks: Guidelines for Cage-Free
Housing. Accessed February 2024. https://uepcertified.com/wp-content/uploads/2019/09/CF-UEP-Guidelines_17-
3.pdf
Van Liere, D. W., and S. Bokma. 1987. Short-term feather maintenance as a function of dust-bathing in laying hens. Appl.
Anim. Behav. Sci. 18:197–204.
Vestergaard, K., E. Skadhauge, and L. Lawson. 1997. The Stress of Not Being Able to Perform Dustbathing in Laying
Hens. Physiol. Behav. 62:413–419.
Wang, C. Y., Bochkovskiy, A., & Liao, H. Y. M. (2023). YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for
real-time object detectors. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp.
7464-7475).
Wang, G., Y. Chen, P. An, H. Hong, J. Hu, and T. Huang. 2023a. UAV-YOLOv8: A Small-Object-Detection Model Based
on Improved YOLOv8 for UAV Aerial Photography Scenarios. Sensors 23:7190.
Wang, Z., Z. Hua, Y. Wen, S. Zhang, X. Xu, and H. Song. 2024. E-YOLO: Recognition of estrus cow based on improved
YOLOv8n model. Expert Syst. Appl. 238:122212.
Yang, X., R. Bist, S. Subedi, and L. Chai. 2023a. A deep learning method for monitoring spatial distribution of cage-free
hens. Artif. Intell. Agric. 8:20–29.
Yang, X., R. B. Bist, S. Subedi, and L. Chai. 2023. A Computer Vision-Based Automatic System for Egg Grading and
Defect Detection. Animals 13:2354.
Yang, X., R. B. Bist, B. Paneru, and L. Chai. 2024. Deep Learning Methods for Tracking the Locomotion of Individual
Chickens. Animals, 14(6), 911.
Yang, Z., C. Ni, L. Li, W. Luo, and Y. Qin. 2022c. Three-Stage Pavement Crack Localization and Segmentation Algorithm
Based on Digital Image Processing and Deep Learning Techniques. Sensors 22:8459.