Human Detection in Flood Using Drone
Abstract:- Approximately 38,000 people drown in India every year. Many of these deaths occur because water-rescue resources are insufficient, emergency response by search and rescue teams is delayed, and rescuers lack information about where the drowning person is located. A difference of a few seconds can decide whether a person survives, so how quickly information reaches the rescue team is a key factor behind the high drowning fatality rate. In this work, we first build a dataset containing many human targets at sea. We then improve the detection algorithm: in the feature extraction network, we use a residual module with a channel attention mechanism. Finally, on a Raspberry Pi Pico fitted with GPS and GSM modules, we apply a linear transformation to the outputs generated by the clustering algorithm. The improved algorithm achieves higher detection accuracy for human targets at sea and shows a good detection effect. The drone detects drowning people and alerts the rescue team at the remote end with a voice message containing all the required details, enabling faster rescue with the highest accuracy. The range of the rescue drone's active camera and the speed of the Wi-Fi video link to the control room were also adequate for the detection to work properly.

Keywords:- ESP8266, ANN, Arduino Uno, Python Software, GSM/GPS Module.

I. INTRODUCTION

Disasters, both natural and man-made, pose significant challenges to emergency response teams due to their unpredictable nature and the often hazardous conditions they create. Traditional methods of assessing disaster-affected areas and locating survivors can be time-consuming and risky for rescue personnel. Unmanned aerial vehicles (UAVs), or drones, have emerged as valuable tools in disaster response, offering a safe and efficient means of accessing remote or dangerous locations.

This project aims to enhance disaster response efforts by developing a drone-based human detection system using a Keras model. The system utilizes a drone connected to a PC running BlueStacks, which hosts the Lfun Pro app for displaying real-time images captured by the drone. The Keras model is trained to detect humans in the drone's camera feed, enabling rescue teams to quickly identify and locate survivors in disaster scenarios. The integration of the Keras model with the drone and PC setup offers several advantages over traditional methods. It provides a cost-effective and efficient solution for rapid deployment, allowing rescue teams to assess disaster-affected areas more quickly and accurately.

Additionally, the system reduces the risk to rescue personnel by minimizing their exposure to hazardous environments. Overall, this project has the potential to significantly improve the effectiveness of disaster response efforts, ultimately saving lives and reducing the impact of disasters on affected communities.

II. LITERATURE REVIEW

Drone-Based Human Detection and Localization for Flood Disaster Management: This paper proposes a drone-based system for human detection and localization during flood disasters. It utilizes deep learning techniques for object detection and employs thermal imaging to improve detection accuracy.

A Review of UAV-Based Human Detection Techniques for Search and Rescue Operations: This review paper discusses various UAV-based human detection techniques applicable to search and rescue operations, including those in floods.
III. METHODOLOGY

Raspberry Pi is a small, versatile, and widely used single-board computer developed by the Raspberry Pi Foundation. It is designed for educational purposes, DIY projects, and as a platform for learning programming and electronics. With various models available, it offers different levels of performance and features, making it suitable for a wide range of applications, from simple tasks like web browsing and word processing to complex projects like home automation and robotics.

Hardware Setup
Connect the drone to a PC using BlueStacks to run the Lfun Pro app for displaying drone images. Ensure the drone is equipped with a high-resolution camera capable of capturing clear images of the disaster-affected area.
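The paper does not state how the frames shown in the Lfun Pro window are handed to the detection software. A minimal sketch of one possible approach follows, assuming the feed is only visible on screen inside the BlueStacks window: the window region is screen-captured with the mss library and converted to OpenCV frames. The capture coordinates (CAPTURE_REGION) are hypothetical placeholders and would have to be matched to the actual window layout.

```python
# Sketch: pull the drone feed as frames by screen-capturing the BlueStacks/Lfun Pro window.
# CAPTURE_REGION is a hypothetical placeholder; adjust it to the real window position/size.
import cv2
import numpy as np
import mss

CAPTURE_REGION = {"top": 80, "left": 0, "width": 960, "height": 540}

def frames(region=CAPTURE_REGION):
    """Yield BGR frames captured from the given screen region."""
    with mss.mss() as sct:
        while True:
            shot = sct.grab(region)                           # raw BGRA screenshot of the region
            yield cv2.cvtColor(np.array(shot), cv2.COLOR_BGRA2BGR)

if __name__ == "__main__":
    for frame in frames():
        cv2.imshow("drone feed", frame)                       # preview what the detector will see
        if cv2.waitKey(1) & 0xFF == ord("q"):                 # press 'q' to stop
            break
    cv2.destroyAllWindows()
```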
Software Development
Develop software to interface with the drone's camera feed and capture images for processing. Implement the Keras model for human detection, using a convolutional neural network (CNN) architecture for optimal performance. Integrate the Keras model with the software to enable real-time human detection on the drone's camera feed.
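The paper specifies a Keras CNN but does not publish its architecture or training details. The sketch below is therefore only an illustrative stand-in, assuming a simple binary classifier (human present / no human): a small Keras CNN plus a helper that scores a single captured frame. The input size, layer widths, and decision threshold are assumptions, not the authors' model.

```python
# Illustrative Keras human-detection classifier for drone frames (not the paper's exact model).
import cv2
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

IMG_SIZE = (128, 128)  # assumed input resolution

def build_model():
    """Small binary CNN: outputs the probability that a human is present in a frame."""
    model = keras.Sequential([
        layers.Input(shape=(*IMG_SIZE, 3)),
        layers.Rescaling(1.0 / 255),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

def detect_human(model, frame_bgr, threshold=0.5):
    """Return (is_human, probability) for one BGR frame from the capture loop."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    resized = cv2.resize(rgb, IMG_SIZE)
    prob = float(model.predict(resized[np.newaxis, ...], verbose=0)[0][0])
    return prob >= threshold, prob
```

In a deployment loop, each frame yielded by the capture sketch above would be passed to detect_human once the model has been trained (model.fit on labelled flood imagery) or loaded from saved weights, with positive detections forwarded to the rescue team.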
Evaluation And Iteration
Evaluate the performance of the system in detecting humans in various disaster scenarios. Gather feedback from rescue teams and stakeholders to identify areas for improvement. Iterate on the system design and model architecture to enhance performance and reliability in future deployments.
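To make the evaluation step concrete, the minimal sketch below scores the classifier on a labelled hold-out set of drone frames with standard precision, recall, and F1 metrics from scikit-learn. It illustrates the evaluation step only, not the paper's reported procedure; y_true and y_prob are placeholders for ground-truth labels and model outputs.

```python
# Sketch: summarizing detection performance on a labelled hold-out set (illustrative only).
import numpy as np
from sklearn.metrics import precision_score, recall_score, f1_score

def summarize(y_true, y_prob, threshold=0.5):
    """Compute precision, recall, and F1 for binary human / no-human predictions."""
    y_pred = (np.asarray(y_prob) >= threshold).astype(int)
    return {
        "precision": precision_score(y_true, y_pred),
        "recall": recall_score(y_true, y_pred),
        "f1": f1_score(y_true, y_pred),
    }

# Dummy values only, to show the call shape:
print(summarize(y_true=[1, 0, 1, 1, 0], y_prob=[0.9, 0.2, 0.4, 0.8, 0.6]))
```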
IV. RESULTS AND DISCUSSION

During testing, the system was able to detect humans with a high degree of accuracy, even in challenging conditions such as low light or obscured visibility. The real-time nature of the system allowed for rapid assessment of disaster-affected areas, enabling rescue teams to prioritize their efforts and locate survivors more efficiently.

The integration of the Keras model with the drone and PC setup proved to be a cost-effective and scalable solution for disaster response. The system's ability to minimize the risk to rescue personnel by reducing their exposure to hazardous environments was a significant advantage, highlighting its potential for widespread adoption in disaster response efforts.

Overall, the results demonstrate the effectiveness of the proposed methodology in enhancing disaster response capabilities. Future improvements could include optimizing the model for faster processing speeds and integrating additional sensors or technologies for enhanced situational awareness.

V. CONCLUSION

One of the key strengths of the system is its ability to minimize the risk to rescue personnel by reducing their exposure to hazardous environments. By providing a cost-effective and scalable solution for disaster response, the system has the potential to significantly enhance the capabilities of emergency management agencies worldwide.

Moving forward, further research and development could focus on optimizing the system for use in specific disaster scenarios, such as earthquakes or floods. Additionally, integrating the system with other technologies, such as thermal imaging or unmanned ground vehicles, could further enhance its capabilities and utility in disaster response efforts.

Overall, the drone-based human detection system represents a valuable tool for improving the effectiveness of disaster response, ultimately saving lives and reducing the impact of disasters on affected communities.

ACKNOWLEDGMENT

We would like to express our gratitude to all those who contributed to the success of this research on human detection in flood scenarios using drones. Special thanks to [Insert Names of Contributors/Team Members/Researchers] for their dedication and hard work. We are also thankful to [Insert Funding Agencies/Institutions/Supporting Organizations] for their financial support and resources. Additionally, we extend our appreciation to the participants who helped in data collection and testing. Without their cooperation, this study would not have been possible.

REFERENCES
[1]. Gonçalves, J.; Henriques, R. UAV photogrammetry for topographic monitoring of coastal areas. ISPRS J. Photogramm. Remote Sens. 2015, 104, 101–111.
[2]. Rokhmana, C.A. The potential of UAV-based remote sensing for supporting precision agriculture in Indonesia. Procedia Environ. Sci. 2015, 24, 245–253.
[3]. Torresan, C.; Berton, A.; Carotenuto, F.; Di Gennaro, S.F.; Gioli, B.; Matese, A.; Miglietta, F.; Vagnoli, C.; Zaldei, A.; Wallace, L. Forestry applications of UAVs in Europe: A review. Int. J. Remote Sens. 2016, 38, 2427–2447.
[4]. Min, B.C.; Cho, C.H.; Choi, K.M.; Kim, D.H. Development of a micro quad-rotor UAV for monitoring an indoor environment. Adv. Robot. 2009, 6, 262–271.
[5]. Samiappan, S.; Turnage, G.; Hathcock, L.; Casagrande, L.; Stinson, P.; Moorhead, R. Using unmanned aerial vehicles for high-resolution remote sensing to map invasive Phragmites australis in coastal wetlands. Int. J. Remote Sens. 2017, 38, 2199–2217.
[6]. Erdelj, M.; Natalizio, E.; Chowdhury, K.R.; Akyildiz, I.F. Help from the sky: Leveraging UAVs for disaster management. IEEE Pervasive Comput. 2016, 16, 24–32.
[7]. Al-Kaff, A.; Gómez-Silva, M.J.; Moreno, F.M.; De La Escalera, A.; Armingol, J.M. An appearance-based tracking algorithm for aerial search and rescue purposes. Sensors 2019, 19, 652.
[8]. Lu, Y.; Xue, Z.; Xia, G.-S.; Zhang, L. A survey on vision-based UAV navigation. Geospat. Inf. Sci. 2018, 21, 21–32.
[9]. Liu, Y.; Noguchi, N.; Liang, L. Development of a positioning system using UAV-based computer vision for an airboat navigation in paddy field. Comput. Electron. Agric. 2019, 162, 126–133.
[10]. Apolo-Apolo, O.; Martínez-Guanter, J.; Egea, G.; Raja, P.; Pérez-Ruiz, M. Deep learning techniques for estimation of the yield and size of citrus fruits using a UAV. Eur. J. Agron. 2020, 115, 126030.
[11]. Aguilar, W.G.; Luna, M.A.; Moya, J.F.; Abad, V.; Parra, H.; Ruiz, H. Pedestrian detection for UAVs using cascade classifiers with meanshift. In Proceedings of the 2017 IEEE 11th International Conference on Semantic Computing (ICSC), San Diego, CA, USA, 30 January–1 February 2017; pp. 509–514.
[12]. Hu, B.; Wang, J. Deep learning based hand gesture recognition and UAV flight controls. Int. J. Autom. Comput. 2019, 17, 17–29.
[13]. Alotaibi, E.T.; Alqefari, S.S.; Koubaa, A. LSAR: Multi-UAV collaboration for search and rescue missions. IEEE Access 2019, 7, 55817–55832.
[14]. Lygouras, E.; Santavas, N.; Taitzoglou, A.; Tarchanidis, K.; Mitropoulos, A.; Gasteratos, A. Unsupervised human detection with an embedded vision system on a fully autonomous UAV for search and rescue operations. Sensors 2019, 19, 3542.
[15]. Sudhakar, S.; Vijayakumar, V.; Kumar, C.S.; Priya, V.; Ravi, L.; Subramaniyaswamy, V. Unmanned aerial vehicle (UAV) based forest fire detection and monitoring for reducing false alarms in forest-fires. Comput. Commun. 2020, 149, 1–16.