Driver Drowsiness Synopsis
1. Introduction
2. Problem Statement
2.1. Features
2.2. Need for the Project
2.3. Literature Survey
2.4. Work Done
2.5. Future Work
3. Design Phase
3.1. Use Case Diagram
3.2. Level-0 DFD
3.3. Level-1 DFD
3.4. Flowchart
4. References
CHAPTER 1: INTRODUCTION
Long hours of driving cause driver fatigue and, consequently, reduce a person’s response time.
Road safety has long been a serious concern: many lives are lost in accidents caused by drowsy or
inattentive driving. Driver drowsiness detection is a car safety technology that helps prevent
accidents when the driver is getting drowsy. Various studies have suggested that around 20% of all
road accidents are fatigue-related, and up to 50% on certain roads. Unlike distraction, driver
drowsiness involves no triggering event; instead, it is characterized by a progressive withdrawal
of attention from the road and traffic demands. Both drowsiness and distraction, however, have the
same effects: decreased driving performance, longer reaction time, and an increased risk of crash
involvement.
The system acquires video from a camera placed in front of the driver and performs real-time
processing of the incoming video stream in order to infer the driver’s level of fatigue. If
drowsiness is detected, the output is sent to the alarm system and the alarm is activated.
A phone camera is mounted on the dashboard behind the steering wheel of the car, with a slightly
up-tilted viewing angle towards the driver. The acquired video frames are processed sequentially by
the software modules. The face acquisition module detects and tracks the driver’s face in real time.
Classifiers, trained offline on the extracted features, are applied to determine the presence of
facial expressions. Several approaches are applied to extract discriminative features and learn the
patterns of different facial expressions; we investigate approaches based on holistic affine
warping and on local descriptors.
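As an illustration of the acquisition-and-detection loop described above, the following minimal sketch captures frames and keeps the largest detected face as the driver’s. It assumes OpenCV’s bundled Haar cascade and camera index 0; the actual project relies on its own offline-trained classifiers, so this is only an approximation of the first pipeline stage.

```python
# Minimal sketch of the frame-acquisition and face-detection loop (assumed detector: Haar cascade).
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # front camera of the docked phone (index assumed)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces):
        # Take the largest detection as the driver's face (the nearest one to the camera).
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("driver", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In the full system, the face region found here would be passed on to the feature-extraction and classification modules rather than simply drawn on screen.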
CHAPTER 2: PROBLEM STATEMENT
Features:
● Facial Detection: The application detects the face of the person driving the car. Since the
cell phone is docked on the dashboard with its front camera facing the driver’s seat, the
nearest (largest) detected face is assumed to be the driver’s.
● Eye Detection: Once the driver’s face is detected, the application tracks the driver’s eyes
and checks whether they are open and attentive, or are closing because the driver is tired
and falling asleep.
● Real-time Yawn Detection: As with eye detection, the driver’s mouth is also tracked. How
wide the mouth opens and how long it stays in a “yawn” state are both measured.
● Contrast Correction: The car interior is not always well lit, and smartphone front cameras
may not correct contrast and lighting effectively on their own. An artificial contrast-
improvement step is therefore applied so that the driver’s face can be detected reliably and
the captured video frames can be processed (a minimal sketch follows this list).
● On-screen Alert: If the application detects that the driver is in a drowsy or sleepy state,
the smartphone’s screen displays an alert message notifying that drowsiness has been
detected.
● Sound Alert: A distinct, loud alarm sound is played on the smartphone alongside the
on-screen alert, to notify or startle the driver and prevent a mishap.
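The contrast-correction feature only states that an artificial improvement technique is applied. As one plausible choice, the sketch below equalizes the luminance channel with OpenCV’s CLAHE; the function name `improve_contrast` and the parameter values are illustrative assumptions, not necessarily what the project implements.

```python
# Hedged sketch of the contrast-correction step: CLAHE on the luminance channel.
import cv2

def improve_contrast(frame_bgr):
    """Equalize luminance so the face detector sees a better-lit frame (assumed approach)."""
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))  # parameters assumed
    l_eq = clahe.apply(l)
    return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)
```

Working in the LAB colour space keeps the colour channels untouched while boosting local contrast, which helps detection in a dim car interior without over-amplifying noise across the whole frame.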
Need for the project:
Many of us think we can keep our minds alert even when we feel the tug of sleepiness on our brains
and bodies. But the truth is that sleep is a powerful biological drive, one that can overtake even
the best driver. It is important to recognize and handle drowsiness in time to protect the safety
of everyone on the road.
Well before a person actually falls asleep at the wheel, lapses in attention and slowed reaction
times make drowsy driving very dangerous. Driving is a complex activity that involves many small
but important split-second decisions. Even if a sleep-deprived driver stays awake, their brain is
not functioning well enough to handle these decisions: studies show that excessive sleepiness
impairs judgment and increases risk-taking.
In one study, 82% of drowsy-driving crashes involved someone driving alone. Driving by yourself
means you must do all the driving, and a lone driver has no one to talk to who can help keep them
alert; other people in a car will often notice when the driver is getting sleepy.
According to experts, drivers who do not take regular breaks when driving long distances run a high
risk of becoming drowsy, a state they often fail to recognize early enough. Studies show that
around one quarter of all serious motorway accidents are attributable to sleepy drivers in need of
a rest, meaning that drowsiness causes more road accidents than drink-driving. An attention-assist
system can warn of inattentiveness and drowsiness across an extended speed range, inform drivers of
their current state of fatigue and of the driving time since the last break, offer adjustable
sensitivity, and sound an alarm when drowsiness is detected.
Work Done:
● Image Contrast and Lighting Improvement: Used OpenCV’s built-in functions to improve the
lighting and contrast and to reduce noise in each captured video frame. The corrected frame
is then used by the eye- and mouth-tracking methods for further processing.
● Working Eye Drowsiness Detection: The driver’s Eye Aspect Ratio (EAR) is calculated in
each frame of the video stream. If the EAR stays within the range of values corresponding
to closed eyes, the driver is alerted (see the sketch after this list).
● Working Mouth Yawning Detection: The driver’s Mouth Aspect Ratio (MAR) is calculated in
each video frame. If this ratio is high enough to fall in the “yawning” range, the driver is
considered to be yawning, and is alerted if the drowsy condition persists.
● On-screen Alert: When the detection methods above indicate that the driver is sleepy, the
screen displays an alert notifying that drowsiness was observed on the driver’s face.
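To make the EAR/MAR checks above concrete, here is a hedged sketch of how the two ratios and the alert condition could be computed from 68-point facial landmarks. The landmark ordering follows the common dlib convention, and the thresholds (0.25 for closed eyes, 0.60 for a yawn) and the 20-frame persistence count are illustrative assumptions rather than the project’s tuned values.

```python
# Hedged sketch of the EAR/MAR computations and the alert condition (thresholds assumed).
from scipy.spatial import distance as dist

def eye_aspect_ratio(eye):
    # eye: six (x, y) landmarks of one eye, ordered as in dlib points 36-41 / 42-47
    vertical = dist.euclidean(eye[1], eye[5]) + dist.euclidean(eye[2], eye[4])
    horizontal = dist.euclidean(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

def mouth_aspect_ratio(mouth):
    # mouth: eight inner-lip landmarks, dlib points 60-67
    vertical = dist.euclidean(mouth[2], mouth[6])    # top middle vs. bottom middle
    horizontal = dist.euclidean(mouth[0], mouth[4])  # left corner vs. right corner
    return vertical / horizontal

class DrowsinessMonitor:
    EAR_THRESHOLD = 0.25   # below this the eyes are treated as closed (assumed value)
    EAR_FRAMES = 20        # consecutive closed-eye frames before alerting (assumed value)
    MAR_THRESHOLD = 0.60   # above this the mouth is treated as yawning (assumed value)

    def __init__(self):
        self.closed_frames = 0

    def update(self, ear, mar):
        """Return True when an on-screen/sound alert should be raised for this frame."""
        self.closed_frames = self.closed_frames + 1 if ear < self.EAR_THRESHOLD else 0
        return self.closed_frames >= self.EAR_FRAMES or mar > self.MAR_THRESHOLD
```

In a full pipeline, the landmarks would come from a facial-landmark predictor applied to the detected face region, and `update` would be called once per processed frame to drive the on-screen and sound alerts.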
Future Work:
CHAPTER 3: DESIGN PHASE
Use Case Diagram:
Level-0 DFD:
Level-1 DFD:
Flowchart Diagram:
CHAPTER 4: REFERENCES
1. Vandna Saini et al., “Driver Drowsiness Detection System and Techniques: A Review”,
International Journal of Computer Science and Information Technologies (IJCSIT), Vol. 5 (3),
2014, pp. 4245-4249; https://pdfs.semanticscholar.org/71bc/acba6bbd44ef330432ce1603c8874ca35d03.pdf
2. Reza Ghoddoosian, Marnim Galib and Vassilis Athitsos, “A Realistic Dataset and Baseline
Temporal Model for Early Drowsiness Detection”, arXiv preprint, Cornell University;
https://arxiv.org/pdf/1904.07312.pdf
3. Hua Gao et al., “Detecting Emotional Stress from Facial Expressions for Driving Safety”;
https://infoscience.epfl.ch/record/200407/files/icip1024-cam-ready.pdf
4. GeeksforGeeks project idea, “Driver Distraction and Drowsiness Detection System (Dcube)”;
https://www.geeksforgeeks.org/project-idea-driver-distraction-and-drowsiness-detection-system-dcube/
5. “Driver Fatigue and Road Accidents: A Literature Review and Position Paper”, Royal Society
for the Prevention of Accidents, February 2001.
6. E. Rogado, J. L. García, R. Barea, L. M. Bergasa and E. López, “Driver Fatigue Detection
System”, Proceedings of the IEEE International Conference on Robotics and Biomimetics,
Bangkok, Thailand, February 2013.
7. Abhinash Dash and Birendra Nath Tripathy, “Prototype Drowsiness Detection System”, B.Tech.
thesis, Electronics and Instrumentation Engineering;
http://ethesis.nitrkl.ac.in/3373/1/thesis-108EI038-026.pdf
8. OpenCV documentation, image filtering tutorial;
https://docs.opencv.org/3.4/d4/d13/tutorial_py_filtering.html
9. T. Danisman, I. M. Bilasco, C. Djeraba and N. Ihaddadene, “Drowsy Driver Detection System
Using Eye Blink Patterns”, IEEE International Conference on Machine and Web Intelligence
(ICMWI), 2010, pp. 230-233.