
A Project Report

On
IoT based Smart Vehicle Automation and Control with Enhanced
Safety, Security and Tracking System using Wireless Sensors
Submitted to

JAWAHARLAL NEHRU TECHNOLOGICAL UNIVERSITY- HYDERABAD

In partial fulfilment of the requirements for the award of the degree of

BACHELOR OF TECHNOLOGY
In

COMPUTER SCIENCE AND ENGINEERING


Submitted by
MOHD AMAN UDDIN (15RT1A0520)
SYED MUJTABA ALI (15RT1A0547)
SHAIK SHAZEB AHMED (15RT1A0539)

Under the guidance of

Mr. Mohammad Khaleel Ahmed


Associate Professor

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

NAWAB SHAH ALAM KHAN COLLEGE OF ENGINEERING & TECHNOLOGY

(Approved By AICTE, Permitted By Government of Telangana, Affiliated to JNTUH)

New Malakpet, Hyderabad - 500024.

Year - 2019
DECLARATION

We hereby declare that the project work entitled “IoT based Smart Vehicle Automation and Control with Enhanced Safety, Security and Tracking System using Wireless Sensor Networks”, submitted to the Department of Computer Science and Engineering of Nawab Shah Alam Khan College of Engineering and Technology, affiliated to JNTU, Hyderabad, in partial fulfilment of the requirements for the award of the degree of BACHELOR OF TECHNOLOGY, is a bonafide work done by us.

MOHD AMAN UDDIN (15RT1A0520)


SYED MUJTABA ALI (15RT1A0547)
SHAIK SHAZEB AHMED (15RT1A0539)
ACKNOWLEDGEMENT

The satisfaction that accompanies the successful completion of any task would be
incomplete without the mention of the people who made it possible and whose
encouragement and guidance have been a source of inspiration throughout the course of the
project.

I express my profound sense of gratitude to Dr. Syed Abdul Sattar,
Principal of Nawab Shah Alam Khan College of Engineering and
Technology, Dr. Zaher Hassan, Director of the R & D cell, and Mr. Mohammed
Khaleel Ahmed, HOD CSE, for inspiring me.

I would like to express my sincere thanks and gratitude to our Supervisor,
Mr. Mohammad Khaleel Ahmed, Associate Professor in the CSE Department, for
his earnest effort and timely suggestions that motivated me to come out with
an excellent output.

I would also like to express my sincere thanks to our Project Coordinator,
Mr. M. A. Rawoof, Assistant Professor in the CSE Department, for giving me the
opportunity to work within this environment.

I thank my parents, who gave me immense support in building my
career, and also all the staff members of the CSE department and my colleagues, who
played a vital role in finishing my project work with less difficulty.

MOHD AMAN UDDIN (15RT1A0520)


SYED MUJTABA ALI (15RT1A0547)
SHAIK SHAZEB AHMED (15RT1A0539)
CONTENTS

Chapter name: Page no:

Abstract I

List of figures II

List of Tables III

CHAPTER 1. INTRODUCTION 2

1.1 Introduction
1.2 General
1.3 Objectives
1.4 Overview
1.4.1 Key Features
1.4.2 Advantages
1.4.3 Disadvantages

CHAPTER 2. LITERATURE SURVEY

2.1 Embedded System

2.2 Sharing the Data of E.H.R System

2.3 High Privacy-Maintaining Mechanism

2.4 Privacy-Maintaining Queries


2.5 Securing Singular Individual E.H.R
2.6 Tools
2.7 Resources
2.8 Real Time Issues
CHAPTER 3. SYSTEM 15

3.1 System information

3.2 System Architecture

3.3 Applications

3.3.1 IOsL (Internet of smart living)

3.3.2 IOsC ( Internet of smart cities)


3.3.3 IOsE (Internet of smart environment)
3.3.4 IOsI (Internet of smart industry)
3.3.5 IOsH (Internet of smart health)

3.3.6 IOsE (Internet of smart energy)

3.3.7 IOsA (Internet of smart agriculture)

CHAPTER 4. SYSTEM INTEGRATION


4.1 Hardware Components
4.1.1 Power Supply
4.1.2 Node MCU
4.1.3 Gas Sensor
4.1.4 Ultrasonic Sensor
4.1.5 Vibration Sensor
4.1.6 Global Positioning System Sensor
4.1.7 Stepper Motor

4.1.8 Buzzer

4.2 Software Components


4.2.1 Arduino IDE
4.2.2 Embedded C
4.2.3 Android App(Blynk)

CHAPTER 5. SYSTEM DESIGN 57

5.1 SYSTEM Architecture 57

5.2 UML Diagrams 58

5.2.1 Class Diagram 58

5.2.2 Use Case Diagram 59

5.2.3 Sequence Diagram 60

5.2.4 Activity Diagram 61

5.2.5 State chart Diagram 62

CHAPTER 6. IMPLEMENTATION 63

6.1 PIC 16F73 Schematic Diagram 63

6.2 PIC Micro Controller Programming Procedure 64

6.2.1 Circuit Design 64


6.2.2 Circuit Diagram 65

6.2.3 Program the PIC Microcontroller 65

6.2.4 Simulating the Circuit 65

6.3 Program Code 66

CHAPTER 7. SYSTEM TESTING 70

7.1 Types Of Test 70

7.1.1 Unit Testing 70

7.1.2 Integration Testing 70

7.1.3 Functional Testing 71

7.1.4 System Testing 71

7.1.5 White Box Testing 72

7.1.6 Black Box Testing 72

7.1.7 Acceptance Testing 72

CHAPTER 8. SCREENSHOTS 73

CHAPTER 9. FUTURE SCOPE 83

CHAPTER 10. CONCLUSION 84

CHAPTER 11. BIBLIOGRAPHY 85

REFERENCES 86
LIST OF FIGURES

S.No. Figure Name Page No.

2.1 Modern example of embedded system 4
2.2 Network communication embedded system 11
2.3 Automatic coffee making machine 12
2.4 Fax machine 13
2.5 Printing machine 13
2.6 Robot 13
2.7 Computer networking 14
2.8 Cell phone 14
2.9 Web camera 14
3.1 Block diagram of IoT based smart home appliances by using Tetris switch 15
3.2 Harvard and Von Neumann architecture 19
3.3 Clock cycle 21
3.4 Instruction pipeline flow 22
3.5 Pin diagram of PIC16F76/73 23
3.6 PORT B and TRIS B 25
3.7 PORT A and TRIS A 27
3.8 Regulated power supply 28
3.9 Circuit diagram for regulated power supply with LED connection 29
3.10 Step down transformer 30
3.11 High-watt 9V battery 31
3.12 Bridge rectifier 33
3.13 Construction of a capacitor 35
3.14 Electrolytic capacitor 35
3.15 Voltage regulator 36
3.16 Resistor 37
3.17 Inside a LED 38
3.18 Parts of LED 39
3.19 Electrical symbol and polarities of LED 39
3.20 ESP8266 WiFi module 41
3.21 WiFi ESP8266MOD 41
3.22 Relay circuit diagram 44
3.23 Circuit symbol of relays 48
3.24 DPDT AC coil relay 50
3.25 Relay driver 52
3.26 Toolbar necessary for the interface 53
4.1 System Architecture 57
4.2.1 Class Diagram 58
4.2.2 Use Case Diagram 59
4.2.3 Sequence Diagram 60
4.2.4 Activity Diagram 61
4.2.5 State Chart Diagram 62
5.1 PIC 16F73 Schematic Diagram 63
5.2.3 Circuit Diagram 65
7.1-7.11 Screenshots 73-82

LIST OF TABLES
S.No. Table Name Page No.
1. ESP 8266 Description 42
Abstract

In the modern era, transportation has become one of the most important human needs. Although it serves numerous needs, we face many problems with it that cost human lives. This project deals with the problems that cause accidents and aims to ensure safety. A vibration sensor is used to detect an accident, and on detection an alert message containing the GPS location is sent to the concerned authorities. A mechanism is included to confirm that the seat belt is locked, an alcohol sensor ensures that the driver is not drunk, and a proximity sensor is deployed to avoid collisions. Through this automotive mechanism, driver safety is ensured.
CHAPTER 1

INTRODUCTION

1.1 Introduction
Vehicle tracking systems are popular among people as a retrieval device and a theft-prevention measure. The main benefit of vehicle tracking systems is security: by monitoring the vehicle's location, a stolen vehicle can be protected by sending its position coordinates to the police centre as a theft alert. When a police centre receives an alert about a stolen vehicle, it can act to prevent the theft. Nowadays, such systems are used either as a replacement for or an addition to car alarms, or as a monitoring system to keep track of the vehicle in real time. Many applications can then be used to block the car's engine or doors as an action to protect the vehicle. Due to advances in technology, vehicle tracking systems can even identify and detect a vehicle's illegal movements and then alert the owner about these movements. This gives them an advantage over other applications and pieces of technology that serve the same purpose. Nowadays, vehicle tracking is one of the most important applications. For example, the maps given to vehicle drivers play a large role in vehicle tracking and monitoring. The major difficulty is that vehicle owners may not be able to distinguish the vehicle in a place because of overlapping maps, which adversely affects the process of tracking and monitoring [1]. Some type of system is required to identify and detect where objects were at a given time, or what distance a vehicle travelled during a trip. This can also help the police in preventing thefts and locating vehicles by relying on reports from these approved systems and by studying and analysing them to detect the locations of stolen vehicles. Such a system is a necessary device for tracking vehicles whenever the owner wants to observe or monitor them, and today it is really popular among people with costly cars, used for theft avoidance and recovery of stolen cars. The collected data can be observed on digital maps by using the internet and software.

There is tremendous demand for object-tracking applications in business processes. Real-time tracking information on valuable things and assets could solve many problems in the world. GPS, the Global Positioning System, provides the location, both off-line and on-line, in any atmospheric condition. Several types of GPS tracking systems are available in the market.

1.2 General

An automated guided vehicle or automatic guided vehicle (AGV) is a


portable robot that follows along marked lines or wires on the floor, or uses radio
waves, vision cameras, magnets, or lasers for navigation. They are most often used
in industrial applications to transport heavy materials around a large industrial
building, such as a factory or warehouse. Application of the automatic guided
vehicle broadened during the late 20th century. An automated driving system is a
complex combination of various components. It can be defined as a system in which
perception, decision making, and operation of the automobile are performed by
electronics and machinery instead of a human driver, i.e., the introduction of
automation into road traffic. This includes handling of the vehicle and the destination, as
well as awareness of surroundings. While the automated system has control over the
vehicle, it allows the human operator to leave all responsibilities to the system.
The automated driving system is generally an integrated package of
individual automated systems operating in concert. Automated driving implies that
you as the driver have given up the ability to drive (i.e., all appropriate monitoring,
agency, and action functions) to the vehicle automation system. Even though you as
the driver may be alert and ready to take action at any moment, you are still giving
up the ability to the automation system.
Automated driving systems are often conditional, which implies that the automation
system is capable of automated driving, but not for all conditions encountered in the
course of normal operation. Therefore, a human driver is functionally required to
initiate the automated driving system, and may or may not do so when driving
conditions are within the capability of the system. When the vehicle automation
system has assumed all driving functions, the human is no longer driving the vehicle
but continues to assume responsibility for the vehicle's performance as the vehicle
operator. The automated vehicle operator is not functionally required to actively
monitor the vehicle's performance while the automation system is engaged, but the
operator must be available to resume driving within several seconds of being
prompted to do so, as the system has limited conditions of automation. While the
automated driving system is engaged, certain conditions may prevent real-time
human input, but for no more than a few seconds. The operator is able to resume
driving at any time subject to this short delay. When the operator has resumed all
driving functions, he or she reassumes the status of the vehicle's driver.

1.3 Objectives

According to data from the Federal Reserve Bank of St. Louis, Americans drove
more than 3 trillion miles last year. At an average speed of 40mph, that’s roughly
23,000 years of human life spent each day doing little more than sitting and focusing
on pavement. What’s worse, car accidents also kill more than 30,000 people in the
United States every year (Insurance Institute for Highway Safety). Nothing has
shaped American construction and consumerism quite like the car. It’s encouraged
spacious design in everything from suburbs to shopping malls and has created
hundreds of secondary industries. America has been so accommodative to
automobiles that we are now dependent on them. In 2013, 85.8 percent of Americans
used automobiles to get to work (U.S. Census Bureau). Even if someone can avoid
owning a car, they’ll still need to rent one or use a taxi from time to time.

However, just because America needs cars, it doesn’t necessarily need drivers.
Various companies are designing self-driving vehicles, believing that a fully
automated car could improve the quality of the American commute and drastically
reduce vehicle accidents
1.4 Overview

Internet of things systems allow users to achieve deeper automation, analysis, and
integration within a system. They improve the reach of these areas and their
accuracy. Internet of things utilizes existing and emerging technology for sensing,
networking, and robotics. Internet of things exploits recent advances in software,
falling hardware prices, and modern attitudes towards technology. Its new and
advanced elements bring major changes in the delivery of products, goods, and
services; and the social, economic, and political impact of those changes.

1.4.1 Key Features


The most important features of IoT include artificial intelligence, connectivity,
sensors, active engagement, and small device use. A brief review of these features
is given below:

 AI – IoT essentially makes virtually anything “smart”, meaning it enhances every


aspect of life with the power of data collection, artificial intelligence algorithms, and
networks. This can mean something as simple as enhancing your refrigerator and
cabinets to detect when milk and your favorite cereal run low, and to then place an
order with your preferred grocer.

 Connectivity – New enabling technologies for networking, and specifically IoT


networking, mean networks are no longer exclusively tied to major providers.
Networks can exist on a much smaller and cheaper scale while still being practical.
IoT creates these small networks between its system devices.

 Sensors – IoT loses its distinction without sensors. They act as defining instruments
which transform IoT from a standard passive network of devices into an active
system capable of real-world integration.
 Active Engagement – Much of today's interaction with connected technology
happens through passive engagement. IoT introduces a new paradigm for active
content, product, or service engagement.

 Small Devices – Devices, as predicted, have become smaller, cheaper, and more
powerful over time. IoT exploits purpose-built small devices to deliver its precision,
scalability, and versatility.

1.4.2 Advantages

The advantages of IoT span across every area of lifestyle and business. Here is a list
of some of the advantages that IoT has to offer:

 Improved Customer Engagement – Current analytics suffer from blind-spots and


significant flaws in accuracy; and as noted, engagement remains passive. IoT
completely transforms this to achieve richer and more effective engagement with
audiences.

 Technology Optimization – The same technologies and data which improve the
customer experience also improve device use, and aid in more potent improvements
to technology. IoT unlocks a world of critical functional and field data.

 Reduced Waste – IoT makes areas of improvement clear. Current analytics give
us superficial insight, but IoT provides real-world information leading to more
effective management of resources.

 Enhanced Data Collection – Modern data collection suffers from its limitations
and its design for passive use. IoT breaks it out of those spaces, and places it exactly
where humans really want to go to analyze our world. It allows an accurate picture
of everything.
1.4.3 Disadvantages

Though IoT delivers an impressive set of benefits, it also presents a significant set
of challenges. Here is a list of some of its major issues:

 Security – IoT creates an ecosystem of constantly connected devices


communicating over networks. The system offers little control despite any security
measures. This leaves users exposed to various kinds of attackers.

 Privacy – The sophistication of IoT provides substantial personal data in extreme


detail without the user's active participation.

 Complexity – Some find IoT systems complicated in terms of design, deployment,


and maintenance given their use of multiple technologies and a large set of new
enabling technologies.

 Flexibility – Many are concerned about the flexibility of an IoT system to integrate
easily with another. They worry about finding themselves with several conflicting
or locked systems.

 Compliance – IoT, like any other technology in the realm of business, must comply
with regulations. Its complexity makes the issue of compliance seem incredibly
challenging when many consider standard software compliance a battle.
CHAPTER 2

LITERATURE SURVEY

2.1 Embedded Systems

An embedded system is a computer system designed to perform one or a few


dedicated functions often with real-time computing constraints. It is embedded as
part of a complete device often including hardware and mechanical parts. By
contrast, a general-purpose computer, such as a personal computer (PC), is
designed to be flexible and to meet a wide range of end-user needs. Embedded
systems control many devices in common use today.

Embedded systems are controlled by one or more main processing cores that
are typically either microcontrollers or digital signal processors (DSP). The key
characteristic, however, is being dedicated to handle a particular task, which may
require very powerful processors. For example, air traffic control systems may
usefully be viewed as embedded, even though they involve mainframe computers
and dedicated regional and national networks between airports and radar sites.
(Each radar probably includes one or more embedded systems of its own.)
2.2 Sharing the Data of E.H.R System

Owing to its cost-efficiency and immense popularity, GPU has emerged


as the most dominant chip architecture for self-driving technology in the
recent past. The increasing complexities of computing hardware and the
requirements for testing autonomous cars on real roads warrant
superior AI-based operating platforms that would anticipate potential
hazards while driving. In this regard, world's foremost GPU maker
Nvidia has been scoring big wins in terms of developing GPU-powered
AI platforms and teaming up with well-known automotive giants.

Equipped with these Level 5-empowering GPUs, the driverless cars would most
likely be deployed in a ride-hailing capacity in restricted settings like airports or
college campuses. Moreover, it has also been reported that German engineering and
electronics company Robert Bosch GmbH and leading automaker Daimler AG have
partnered up with Nvidia to utilize its Pegasus system as the platform for their self-
driving vehicle designs beginning in 2020. A few other automotive firms such as
Zenrin, ZF and Audi have committed to use the AI-based computers of Nvidia.
Considering the instances of GPU makers building new products, particularly
Nvidia, it can certainly be claimed that the criticality of GPU-powered AI platforms
in the effective implementation of autonomous vehicles programs is of much
significance. Many more such developments are in the works and would speed up
the creation of AI-driven big data systems, in which GPUs would play a pivotal role
in the upcoming years.
2.3 High Privacy-Maintaining Mechanism

As the development and testing of self-driving car technology has progressed, the
prospect of privately-owned autonomous vehicles operating on public roads is
nearing. Several states have passed laws related to autonomous vehicles, including
Nevada, California, Florida, Michigan, and Tennessee. Other states
have ordered that government agencies support testing and operations of these
vehicles. Industry experts predict that autonomous vehicles will be commercially
available within the next five to ten years. A 2016 federal budget proposal, slated to
provide nearly $4 billion in funding for testing connected vehicle systems, could
accelerate this time frame. In addition, the National Highway Traffic Safety
Administration (NHTSA) set a goal to work with stakeholders to “accelerate the
deployment” of autonomous technologies.

Autonomous vehicles may collect and maintain identifying information about the
owner or passenger of the vehicle for a variety of purposes, such as to authenticate
authorized use, or to customize comfort, safety, and entertainment settings. This
information likely will be able to identify owners and passengers and their activities
with a high degree of certainty.

Existing U.S. federal privacy legislation is largely inapplicable to autonomous


vehicles:

 The federal Drivers’ Privacy Protection Act protects motor vehicle records from
disclosure by state departments of motor vehicles.
 Although the Electronic Communications Privacy Act (“ECPA”) may protect
against the interception of the vehicle’s electronic communications or access to
stored communications by unauthorized third parties, the service provider (or its
vendor) providing the communications or storage functionality may be able to capture
and use these communications without violating the law.
 Although the Federal Communications Act (“FCA”) requires
“telecommunications carriers” to protect the confidentiality of “proprietary
information” of customers, it is possible that autonomous vehicle manufacturers
or their service providers would not be a “telecommunications carrier” – a
classification more typically applied to operators of landline telephone or
cellular phone networks.
 State law also may not provide much protection. For example, state data
breach notification laws typically require notification of a data breach, but do
not impose substantive privacy or security protections. Data security laws,
such as those in effect in Massachusetts and California, may not currently
apply to the types of data collected or used by autonomous vehicles.

2.4 Privacy-Maintaining Queries

Our methodology involved two steps. Firstly, we identified AV-related implications


by preliminary review and exploration of the key factors that were highlighted as the
most prominent in the current literature. We searched for possible risks associated
with AVs using the keywords “autonomous vehicle(s)”, “driverless” or “driverless
vehicle(s)” in combination with one of the following keywords representing an AV-
related implication (Table 1). Boolean operators such as “AND”, “OR” and “NOT”
were also used. To identify the lesser-known risks of AVs, we searched AVs in
conjunction with “risk(s)” and its synonyms, such as “effect(s)”, “impact(s)” and
“consequence(s)”. Secondly, existing government efforts to manage AV-related
risks were identified. We searched for words relating to government regulation, such
as “regulation(s)”, “legislation(s)”, “rule(s)”, “bill(s)” and “law(s)”, together with
AVs and the names of the countries and regions of study. These include Australia,
China, the EU, Germany, Japan, South Korea, Singapore, the US, and the UK, as
most of AV-related developments have occurred in these regions and countries.
2.5 Securing Singular Individual E.H.R

Specifically, the evolutionary path to the much-hyped “fully autonomous” car with each
stage providing exponential value.
Increasing levels of intelligent automation will also provide exponential benefits. If we
compare the levels in the auto industry and apply them to the world of cybersecurity,
level zero has very little automation while level five is most autonomous.

Level 0:

Cars: Complete driver control of the vehicle, i.e., very little automation.

Cybersecurity: This is equivalent to using manual cybersecurity


techniques for all threat detection, security data analysis, and incident
response.

Level 1:

Cars: Some driver assistance with specific functions carried out


automatically, such as steering or accelerating, but not both
simultaneously. Adaptive cruise control or automatic emergency
braking, for example.
Cybersecurity: This is equivalent to automatic log aggregation with
SIEMs and creating rules for alerts. It is not particularly “intelligent,”
but serves an important foundational role for the future of intelligent
security automation.
Level 2:

Cars: At least one driver assistance system for both steering and
acceleration/ deceleration, which responds to the environment and
allows the driver to physically disengage from the steering wheel.
Examples include Tesla AutoPilot and self-parking.
Cybersecurity: This is where we see a lot of hype in the security industry.

On one hand, you have solutions such as User Behavior Analytics and
Network Traffic Analysis that profess to automatically analyze
”normal” behavior and alert anything abnormal. The drawback is the
inability to understand the full context of an environment or situation,
which results in a tendency to generate too many false positives and
requires significant analyst involvement to triage.

On the other hand, you have early orchestration solutions that can
partially automate some of the easier and repeatable actions during an
incident response process. While this solution is adequate to collect
relevant information for an investigation process, the actual decision
making is delegated to the analyst.

In essence, Level 2 automates actions and repeatable tasks, but not the
decision making and judgments that require “intelligence.”

Level 3:
Cars: Drivers can be fully disengaged, but are still required to pay close
attention and be “on standby” to take over should the system fail.
Cybersecurity: There are key areas where this is becoming a reality in
security automation today.

The first is full, end-to-end alert triage automation. This is where the
system has the intelligence, based on context and awareness of an alert’s
severity, to make decisions and accept feedback from human analysts.
Though more advanced systems are able to provide a full explanation of
their scoring, analysts still need to review the system’s results. However,
95 percent of the overhead work they used to have to do is effectively
eliminated.
Level 4:

Cars: This is positioned as "fully autonomous,” yet still doesn’t cover


every situation. No driver interaction is needed and the car will deal with
system failures by stopping itself.
Cybersecurity: A “fully autonomous” security solution is where threat
hunting is automated with the system itself to create logic for 99 percent
of known and unknown threats, while continuously adapting to
changing threat landscapes. It can not only identify the threats, but can
also automatically remediate and respond. Generally no human
interaction is necessary, except for in extreme situations like the less
than 1 percent of threats the system cannot detect.

Such a solution does not exist today, but is often what CISOs hope for
when they hear “security automation.” Achieving this nirvana will
require significant advancements in machine learning and computing
power.

2.6 Tools

Embedded development makes up a small fraction of total programming.


There's also a large number of embedded architectures, unlike the PC world where
1 instruction set rules, and the UNIX world where there's only 3 or 4 major ones.
This means that the tools are more expensive. It also means that they are lower
featured and less developed. On a major embedded project, at some point you
will almost always find a compiler bug of some sort.

Debugging tools are another issue. Since you can't always run general
programs on your embedded processor, you can't always run a debugger on it.
This makes fixing your program difficult. Special hardware such as JTAG ports
can overcome this issue in part. However, if you stop on a breakpoint when your
system is controlling real world hardware (such as a motor), permanent equipment
damage can occur. As a result, people doing embedded programming quickly
become masters at using serial IO channels and error message style debugging.

2.7 Resources

To save costs, embedded systems frequently have the cheapest processors that
can do the job. This means your programs need to be written as efficiently as
possible. When dealing with large data sets, issues like memory cache misses that
never matter in PC programming can hurt you. Luckily, this won't happen too
often- use reasonably efficient algorithms to start, and optimize only when
necessary.

Memory is also an issue. For the same cost savings reasons, embedded
systems usually have the least memory they can get away with. That means their
algorithms must be memory efficient (unlike in PC programs, you will frequently
sacrifice processor time for memory, rather than the reverse). It also means you
can't afford to leak memory. Embedded applications generally use deterministic
memory techniques and avoid the default "new" and "malloc" functions, so that
leaks can be found and eliminated more easily. Other resources programmers
expect may not even exist. For example, most embedded processors do not have
hardware FPUs (Floating-Point Processing Unit). These resources either need to
be emulated in software, or avoided altogether.
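To illustrate the deterministic-memory idea described above, the following minimal C++ sketch replaces run-time dynamic allocation with a fixed, statically allocated pool; the pool size and the record layout are only illustrative assumptions, not part of this project.

// Minimal sketch of deterministic memory use: a fixed, statically
// allocated pool of records instead of new/malloc at run time.
#include <cstddef>

struct SensorRecord {
    unsigned long timestampMs;
    int value;
};

static const std::size_t POOL_SIZE = 16;      // fixed at compile time
static SensorRecord recordPool[POOL_SIZE];    // no heap allocation
static std::size_t nextFree = 0;

// Returns a record from the pool, or nullptr when the pool is exhausted.
// Failure is explicit and bounded, so memory use never grows at run time.
SensorRecord* allocateRecord() {
    if (nextFree >= POOL_SIZE) {
        return nullptr;
    }
    return &recordPool[nextFree++];
}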
2.8 Real Time Issues

Embedded systems frequently control hardware, and must be able to


respond to them in real time. Failure to do so could cause inaccuracy in
measurements, or even damage hardware such as motors. This is made even more
difficult by the lack of resources available. Almost all embedded systems need to
be able to prioritize some tasks over others, and to be able to put off/skip low
priority tasks such as UI in favor of high priority tasks like hardware control.
CHAPTER 3

PROPOSED SYSTEM
The proposed system is an IoT-connected vehicle with a voice-based virtual personal
assistant; it is made up of a vehicle agent and a home agent, as shown in Figure
1. The virtual personal assistant always stays with a driver as a personal IoT partner
and performs several kinds of activities to be able to do things like turn on the A/C,
lock/unlock the doors, and turn on lights, as well as to support functions like playing
music, making a phone call, and navigating while driving, at home and at the office.
A user can communicate with it through the homogenous voice-based natural
language interface, both in the vehicle and while at home. One interesting feature is
to be able to manage their smart home directly and access all kinds of content
provided by the smartphone, anywhere and at any time. Another is to provide cloud-
based personalized services within the home-to-vehicle connected environment
using a unified speech interface based on natural language that is supported in
different environments such as the home, vehicle, and office. This makes it possible
for users to have the same connectivity in their vehicles as they have at home and at
work.
3.1 System Information

In the existing IVI system, a user (the vehicle owner) controls and uses numerous IVI
resources through one-to-one communication. The user exchanges requests
and responses directly with the devices or contents. Thus, it is not easy for a user to
control and manage many different types of IVI resources, since the distinctive
features of each resource must be considered by the user. In addition, only the owner
is considered for IVI services, and the types of users (e. g., family, friends, and public
users) are not supported. Depending on the user type, different services and
permission levels for IVI resources may be provided. Furthermore, each IVI device
may use a different communication technology, such as Bluetooth, ZigBee, WLAN,
etc.

Figure 1: Architecture of the existing in-vehicle


In the meantime, Figure 2 shows the proposed IoT-based IVI system architecture.
The IVI-Master is newly introduced for overall control and management of various
IVI resources, such as sensors, devices, and contents. The IVI users are classified as
vehicle owner or other users. The vehicle owner will manage the IVI-Master and all
IVI resources with the associated database (DB). The other users will use and control
the IVI resources with their authority and permission level with the help of the IVI-
Master. The communications between the IVI-Master and users will be done by
using the HTTP, whereas LWM2M is used for communications between the IVI-
Master and IVI resources, in which CoAP (Constrained Application Protocol) [28]
and/or Message Queuing Telemetry Transport (MQTT) [29] may be used.

Figure 2. Architecture of proposed IVI system.

A possible configuration of the IVI system components, based on Figure 2, includes the IVI-Master, the users, and many IVI resources.
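As a rough illustration of how an IVI resource might report data to the IVI-Master over MQTT, the sketch below uses the common Arduino PubSubClient library on an ESP8266; the broker address, topic name, client ID and Wi-Fi credentials are placeholder assumptions, not values defined by this project.

// Hedged sketch: an ESP8266-based IVI resource publishing a reading to an
// MQTT broker (standing in for the IVI-Master). WIFI_SSID, WIFI_PASS,
// BROKER_IP and the topic "ivi/vehicle/speed" are placeholders.
#include <ESP8266WiFi.h>
#include <PubSubClient.h>

const char* WIFI_SSID = "your-ssid";
const char* WIFI_PASS = "your-password";
const char* BROKER_IP = "192.168.1.10";

WiFiClient espClient;
PubSubClient mqtt(espClient);

void setup() {
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) {
    delay(500);                        // wait for Wi-Fi association
  }
  mqtt.setServer(BROKER_IP, 1883);     // standard MQTT port
}

void loop() {
  if (!mqtt.connected()) {
    mqtt.connect("ivi-speed-sensor");  // client ID is arbitrary
  }
  mqtt.publish("ivi/vehicle/speed", "42");  // example payload
  mqtt.loop();                         // service the MQTT connection
  delay(1000);
}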
3.3 APPLICATIONS

Potential applications of the IoT are numerous and diverse, permeating into
practically all areas of every-day life of individuals, enterprises, and society as a
whole. The IoT application covers “smart” environments/spaces in domains such as:
Transportation, Building, City, Lifestyle, Retail, Agriculture, Factory, Supply chain,
Emergency, Healthcare, User interaction, Culture and tourism, Environment and
Energy. Below are some of the IOT applications

A. IOsL (Internet of smart living):

 Remote Control Appliances: switching appliances on and off remotely to avoid accidents and save energy.
 Weather: displays outdoor weather conditions such as humidity, temperature, pressure, wind speed and rain levels, with the ability to transmit data over long distances.
 Smart Home Appliances: refrigerators with an LCD screen telling what's inside, which food is about to expire and which ingredients you need to buy, with all the information available on a smartphone app; washing machines allowing you to monitor the laundry remotely; and kitchen ranges with an interface to a smartphone app allowing remotely adjustable temperature control and monitoring of the oven's self-cleaning feature.
 Safety Monitoring: cameras and home alarm systems making people feel safe in their daily life at home.
 Intrusion Detection Systems: detection of window and door openings and violations to prevent intruders.
 Energy and Water Use: energy and water supply consumption monitoring to obtain advice on how to save cost and resources, and many more.
B. IOsC (Internet of smart cities):

 Structural Health: monitoring of vibrations and material conditions in buildings, bridges and historical monuments.
 Lighting: intelligent and weather-adaptive lighting in street lights.
 Safety: digital video monitoring, fire control management and public announcement systems.
 Transportation: smart roads and intelligent highways with warning messages and diversions according to climate conditions and unexpected events like accidents or traffic jams.
 Smart Parking: real-time monitoring of parking space availability in the city, enabling residents to identify and reserve the closest available spaces.
 Waste Management: detection of rubbish levels in containers to optimize the trash collection routes; garbage cans and recycle bins with RFID tags allow the sanitation staff to see when garbage has been put out.

C. IOsE (Internet of smart environment):

 Air Pollution Monitoring: control of CO2 emissions of factories, pollution emitted by cars and toxic gases generated in farms.
 Forest Fire Detection: monitoring of combustion gases and pre-emptive fire conditions to define alert zones.
 Weather Monitoring: monitoring of weather conditions such as humidity, temperature, pressure, wind speed and rain.
 Earthquake Early Detection.
 Water Quality: study of water suitability in rivers and the sea for eligibility for drinking use.
 River Floods: monitoring of water level variations in rivers, dams and reservoirs during rainy days.
 Protecting Wildlife: tracking collars utilizing GPS/GSM modules to locate and track wild animals and communicate their coordinates via SMS.
D. IOsI (Internet of smart industry):

 Explosive and Hazardous Gases: detection of gas levels and leakages in industrial environments, in the surroundings of chemical factories and inside mines.
 Monitoring of toxic gas and oxygen levels inside chemical plants to ensure worker and goods safety.
 Monitoring of water, oil and gas levels in storage tanks and cisterns.
 Maintenance and Repair: early prediction of equipment malfunctions; service maintenance can be automatically scheduled ahead of an actual part failure by installing sensors inside equipment to monitor and send reports.

E. IOsH (Internet of smart health):

 Patient Surveillance: monitoring of the conditions of patients inside hospitals and in old people's homes.
 Medical Fridges: control of conditions inside freezers storing vaccines, medicines and organic elements.
 Fall Detection: assistance for elderly or disabled people living independently.
 Dental: Bluetooth-connected toothbrush with a smartphone app that analyses brushing use and gives information on brushing habits on the smartphone, for private information or for showing statistics to the dentist.
 Physical Activity Monitoring: wireless sensors placed across the mattress sensing small motions, like breathing and heart rate, and large motions caused by tossing and turning during sleep, providing data available through an app on the smartphone.
F. IOsE (Internet of smart energy):

 Smart Grid: energy consumption monitoring and management.
 Wind Turbines / Power House: monitoring and analysing the flow of energy from wind turbines and power houses, and two-way communication with consumers' smart meters to analyse consumption patterns.
 Power Supply Controllers: controllers for AC-DC power supplies that determine the required energy and improve energy efficiency with less energy waste for power supplies related to computers, telecommunications, and consumer electronics applications.
 Photovoltaic Installations: monitoring and optimization of performance in solar energy plants.

G. IOsA (Internet of smart agriculture):

 Green Houses: control of micro-climate conditions to maximize the production of fruits and vegetables and their quality.
 Compost: control of humidity and temperature levels in alfalfa, hay, straw, etc. to prevent fungus and other microbial contaminants.
 Animal Farming/Tracking: location and identification of animals grazing in open pastures or located in big stables; study of ventilation and air quality in farms and detection of harmful gases from excrement.
 Offspring Care: control of the growing conditions of offspring in animal farms to ensure their survival and health.
 Field Monitoring: reducing spoilage and crop waste with better monitoring, accurate ongoing data gathering, and management of the agriculture fields, including better control of fertilizing, electricity and watering.
Figure 4: IoT applications
The IoT application area is very diverse and IoT applications serve different users.
Different user categories have different driving needs. From the IoT perspective
there are three important user categories: individual citizens, communities of
citizens (citizens of a city, a region, a country or society as a whole), and enterprises.
CHAPTER 4

COMPONENTS

4.1 HARDWARE COMPONENTS

4.1.1 Power supply


Power supply is a supply of electrical power. A device or system that
supplies electrical or other types of energy to an output load or group of loads is
called a power supply unit or PSU. The term is most commonly applied to
electrical energy supplies, less often to mechanical ones, and rarely to others.

A power supply may include a power distribution system as well as primary


or secondary sources of energy such as

• Conversion of one form of electrical power to another desired form and voltage,
typically involving converting AC line voltage to a well-regulated lower-
voltage DC for electronic devices. Low voltage, low power DC power supply
units are commonly integrated with the devices they supply, such
as computers and household electronics.

• Batteries.

• Chemical fuel cells and other forms of energy storage systems.

• Solar power.

• Generators or alternators.
Fig 3.8: Regulated Power Supply

The basic circuit diagram of a regulated power supply (DC output) with an LED connected
as a load is shown in Fig. 3.9.

Fig 3.9: Circuit diagram of Regulated Power Supply with Led connection

The components mainly used in above figure are

• 230V AC MAINS
• TRANSFORMER
• BRIDGE RECTIFIER(DIODES)
• CAPACITOR
• VOLTAGE REGULATOR(IC 7805)
• RESISTOR
• LED(LIGHT EMITTING DIODE)
4.1.2 NodeMCU

NodeMCU is an open source IoT platform. It includes firmware which runs on


the ESP8266 Wi-Fi SoC from Espressif Systems, and hardware which is based on
the ESP-12 module. The term "NodeMCU" by default refers to the firmware rather
than the development kits. The firmware uses the Lua scripting language. It is based
on the eLua project, and built on the Espressif Non-OS SDK for ESP8266. It uses
many open source projects, such as lua-cjson and SPIFFS.
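Although the NodeMCU firmware itself is Lua-based, in this project the board is programmed through the Arduino IDE (Section 4.2.1); a minimal connection sketch, with placeholder network credentials, might look like the following.

// Minimal NodeMCU (ESP8266) sketch: join a Wi-Fi network and print the IP.
// The SSID and password below are placeholders.
#include <ESP8266WiFi.h>

const char* ssid = "your-ssid";
const char* password = "your-password";

void setup() {
  Serial.begin(115200);
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) {
    delay(500);                 // keep waiting until the router accepts us
    Serial.print(".");
  }
  Serial.println();
  Serial.print("Connected, IP address: ");
  Serial.println(WiFi.localIP());
}

void loop() {
  // Sensor reading and reporting code would go here.
}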
4.1.3 Gas Sensor

It is used to monitor changes in air quality and to detect the presence of various
gases. This type of sensor is mostly used in manufacturing industries, space stations, and
chemical industries. Alternative gas sensors are available, but the MQ2 is the one most
commonly used in IoT projects.
Different types of Gas Sensors:

 Catalytic bead sensor


 Hydrogen sensor
 Air pollution sensor
 Nitrogen oxide sensor
 Oxygen sensor
 Ozone monitor
 Electrochemical gas sensor
 Gas detector
 Hygrometer

MQ2 Sensor
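As a simple, hedged example of how the MQ2 sensor might be read, the sketch below polls the analog output and raises an alert above a threshold; the pin assignment and the threshold value are assumptions and would need calibration for a real installation.

// Hedged MQ2 sketch: read the sensor's analog output and sound a buzzer
// when the reading crosses a threshold. A0, D5 and the threshold of 400
// are illustrative assumptions (D5 uses NodeMCU pin naming).
const int GAS_PIN = A0;       // MQ2 analog output
const int BUZZER_PIN = D5;    // active buzzer
const int GAS_THRESHOLD = 400;

void setup() {
  Serial.begin(115200);
  pinMode(BUZZER_PIN, OUTPUT);
}

void loop() {
  int level = analogRead(GAS_PIN);        // 0-1023 raw reading
  Serial.println(level);
  digitalWrite(BUZZER_PIN, level > GAS_THRESHOLD ? HIGH : LOW);
  delay(500);
}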
4.1.4 Ultrasonic Sensor

The ultrasonic sensor or ultrasonic transducer is one of the most popular sensors
used in applications of IoT.
Working Principle: The transmitter sends ultrasonic waves through the air in the forward direction; when an object is present, the waves are reflected back towards the receiver, and the receiving transducer picks up the reflected waves.

Distance = Speed x Time

Knowing the speed of sound and the measured time, we can calculate the distance of the object (the measured time covers the round trip, so it is halved).

Pin Definition

VCC: 5 V power supply
Trig: trigger input pin; it must be kept high for 10 µs to start a measurement
Echo: echo output pin
GND: ground pin
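A short sketch of the measurement described above, assuming an HC-SR04-style module: the trigger pin is pulsed for 10 µs, the echo pulse width is measured with pulseIn(), and the round-trip time is converted to centimetres. The pin numbers are placeholder assumptions.

// Hedged HC-SR04 sketch implementing Distance = Speed x Time.
// Pin numbers are assumptions; the echo time is halved because the
// sound travels to the object and back.
const int TRIG_PIN = 12;
const int ECHO_PIN = 14;

void setup() {
  Serial.begin(115200);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);       // 10 us trigger pulse
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  long durationUs = pulseIn(ECHO_PIN, HIGH);     // echo pulse width
  float distanceCm = durationUs * 0.034f / 2.0f; // speed of sound ~0.034 cm/us
  Serial.print("Distance: ");
  Serial.print(distanceCm);
  Serial.println(" cm");
  delay(200);
}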


4.1.5 Vibration Sensor

Vibration sensors provide an easy, cost-effective means of monitoring and
protecting critical machinery around the clock, helping to protect critical equipment
and avoid costly downtime.

Applications
 Critical pumps and motors
 Cooling towers and fans
 Slow speed rolls
 Rotary and screw compressors

Features
 Monitors and protects 24/7
 Installs quickly and easily
 Provides critical machine information
 Avoids costly catastrophic failures
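For accident detection as described in the abstract, a digital-output vibration module (for example an SW-420-type board, which is an assumption here) can be polled as shown in this sketch; the pin numbers and the HIGH-on-vibration polarity are placeholders.

// Hedged vibration-detection sketch: when the module's digital output
// goes HIGH, treat it as a possible impact and raise a local alert.
// Pin numbers (NodeMCU naming) and polarity are assumptions.
const int VIBRATION_PIN = D6;
const int BUZZER_PIN = D5;

void setup() {
  Serial.begin(115200);
  pinMode(VIBRATION_PIN, INPUT);
  pinMode(BUZZER_PIN, OUTPUT);
}

void loop() {
  if (digitalRead(VIBRATION_PIN) == HIGH) {
    Serial.println("Vibration detected - possible accident");
    digitalWrite(BUZZER_PIN, HIGH);   // here a GPS alert could also be sent
    delay(2000);
    digitalWrite(BUZZER_PIN, LOW);
  }
  delay(50);
}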
4.1.6 GPS Sensor

GPS receivers are generally used in smartphones, fleet management systems, the military, etc., for tracking or finding a location.
Global Positioning System (GPS) is a satellite-based system that uses satellites and ground stations to measure and compute its position on Earth.
GPS is also known as Navigation System with Time and Ranging (NAVSTAR) GPS.

A GPS receiver needs to receive data from at least 4 satellites for accuracy. The GPS receiver does not transmit any information to the satellites. GPS receivers are used in many applications such as smartphones, cabs, fleet management, etc.
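A hedged sketch of reading such a GPS receiver on the NodeMCU, assuming a serial NMEA module parsed with the widely used TinyGPS++ library; the wiring pins and the 9600 baud rate are assumptions about the module.

// Hedged GPS sketch: parse NMEA sentences with TinyGPS++ and print the fix.
// The SoftwareSerial pins and 9600 baud are wiring assumptions.
#include <TinyGPS++.h>
#include <SoftwareSerial.h>

TinyGPSPlus gps;
SoftwareSerial gpsSerial(13, 15);   // RX, TX

void setup() {
  Serial.begin(115200);
  gpsSerial.begin(9600);
}

void loop() {
  while (gpsSerial.available() > 0) {
    gps.encode(gpsSerial.read());   // feed each NMEA character to the parser
  }
  if (gps.location.isUpdated()) {
    Serial.print("Lat: ");
    Serial.print(gps.location.lat(), 6);
    Serial.print("  Lng: ");
    Serial.println(gps.location.lng(), 6);
  }
}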
4.1.7 Stepper Motor Control

A stepper motor is a brushless DC-electric motor that divides a full rotation into a
number of equal steps. The position of the motor can be commanded to move and
hold at one of these steps without feedback. The stepper motor is used in a wide
range of applications involving precision motion control.

Figure: the Stepper Motor with Nodemcu
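The standard Arduino Stepper library can command this kind of open-loop motion; the sketch below is only an illustration, and the steps-per-revolution value and driver pins are assumptions.

// Hedged stepper sketch using the standard Arduino Stepper library.
// 200 steps per revolution and pins 8-11 are illustrative assumptions.
#include <Stepper.h>

const int STEPS_PER_REV = 200;
Stepper motor(STEPS_PER_REV, 8, 9, 10, 11);

void setup() {
  motor.setSpeed(60);              // 60 RPM
}

void loop() {
  motor.step(STEPS_PER_REV);       // one full revolution forward
  delay(500);
  motor.step(-STEPS_PER_REV);      // one full revolution backward
  delay(500);
}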


4.1.8 Buzzer

A buzzer or beeper is an audio signalling device, which may be mechanical, electromechanical, or piezoelectric. Typical uses of buzzers and beepers include alarm devices, timers and confirmation of user input such as a mouse click or keystroke. A buzzer is an integrated electronic transducer with a DC power supply, widely used in computers, printers, copiers, alarms, electronic toys, automotive electronic equipment, telephones, timers and other electronic products as a sound device.

An active buzzer rated at 5 V can be connected directly to produce a continuous sound. Combined with a dedicated sensor expansion module and the board, it allows a simple "plug and play" circuit design.
4.2 SOFTWARE COMPONENTS

The set of programs that enables data collection, storage, processing, manipulation and instruction to and from the IoT hardware components is called IoT software. Operating systems, middleware or firmware, apps, etc., are a few examples.
Software components

4.2.1 Arduino (IDE)

The Arduino integrated development environment (IDE) is a cross-platform
application that is written in the Java programming language. It is used to
write and upload programs to Arduino compatible boards, but also, with the help of
3rd party cores, other vendor development boards.
The source code for the IDE is released under the GNU General Public License, version
2. The Arduino IDE supports the languages C and C++ using special rules of code
structuring. It supplies a software library from the Wiring project, which
provides many common input and output procedures. User-written code only
requires two basic functions, for starting the sketch and the main program loop, that
are compiled and linked with a program stub main() into an executable cyclic
executive program with the GNU toolchain, also included with the IDE distribution.
The Arduino IDE employs the program avrdude to convert the executable code into
a text file in hexadecimal encoding that is loaded into the Arduino board by a loader
program in the board's firmware.
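The two required functions mentioned above are setup() and loop(); the classic blink sketch below shows the minimal structure the IDE compiles and links for any board.

// Minimal Arduino sketch: only setup() and loop() are required;
// the IDE links them with its own main() stub.
void setup() {
  pinMode(LED_BUILTIN, OUTPUT);    // runs once at power-up
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH); // runs repeatedly
  delay(1000);
  digitalWrite(LED_BUILTIN, LOW);
  delay(1000);
}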
4.2.2 Embedded C

Embedded C is a set of language extensions for the C programming language by


the C Standards Committee to address commonality issues that exist between C
extensions for different embedded systems.
Historically, embedded C programming has required nonstandard extensions to the C
language in order to support exotic features such as fixed-point arithmetic, multiple
distinct memory banks, and basic I/O operations. In 2008, the C Standards
Committee extended the C language to address these issues by providing a common
standard for all implementations to adhere to. It includes a number of features not
available in normal C, such as fixed-point arithmetic, named address spaces and
basic I/O hardware addressing. Embedded C uses most of the syntax and semantics
of standard C, e.g., main() function, variable definition, datatype declaration,
conditional statements (if, switch case), loops (while, for), functions, arrays and
strings, structures and union, bit operations, macros, etc.
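Since not every compiler supports the fixed-point types added by the standard extension, the idea is often approximated in plain C/C++ with scaled integers, as in this small illustrative sketch; the Q8.8 scaling factor is an assumption chosen for the example.

// Illustrative fixed-point arithmetic using scaled integers (Q8.8 format):
// the low 8 bits hold the fraction, so 1.0 is represented as 256.
typedef int fixed_t;
#define FIXED_ONE 256   // 1.0 in Q8.8

fixed_t toFixed(int whole)             { return whole * FIXED_ONE; }
fixed_t fixedMul(fixed_t a, fixed_t b) { return (fixed_t)(((long)a * b) / FIXED_ONE); }
int     toWhole(fixed_t f)             { return f / FIXED_ONE; }

// Example: 2.5 * 4 = 10 without any floating-point hardware.
int demo() {
    fixed_t twoPointFive = toFixed(2) + FIXED_ONE / 2;   // 2.5 in Q8.8
    return toWhole(fixedMul(twoPointFive, toFixed(4)));  // returns 10
}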
4.2.3 Android application Blynk

Blynk is a new platform that allows you to quickly build interfaces for controlling
and monitoring your hardware projects from your iOS and Android device. After
downloading the Blynk app, you can create a project dashboard and arrange buttons,
sliders, graphs, and other widgets onto the screen. Using the widgets, you can turn
pins on and off or display data from sensors.
Whatever your project is, there are likely hundreds of tutorials that make the
hardware part pretty easy, but building the software interface is still difficult. With
Blynk, though, the software side is even easier than the hardware. Blynk is perfect
for interfacing with simple projects like monitoring the temperature of your fish tank
or turning lights on and off remotely. Personally, I’m using it to control RGB LED
strips in my living room.
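A hedged example of the Blynk usage described above, based on the classic Blynk Arduino library for the ESP8266: a virtual-pin button toggles a GPIO and a sensor value is pushed to a gauge widget. The auth token, Wi-Fi credentials, virtual pin numbers and D5 output are placeholders.

// Hedged Blynk sketch (classic Blynk library for ESP8266).
// The auth token, SSID, password, V0/V1 pins and D5 output are placeholders.
#define BLYNK_PRINT Serial
#include <ESP8266WiFi.h>
#include <BlynkSimpleEsp8266.h>

char auth[] = "your-auth-token";
char ssid[] = "your-ssid";
char pass[] = "your-password";

// Called when the app writes to virtual pin V0 (e.g. a button widget).
BLYNK_WRITE(V0) {
  int buttonState = param.asInt();
  digitalWrite(D5, buttonState ? HIGH : LOW);
}

void setup() {
  Serial.begin(115200);
  pinMode(D5, OUTPUT);
  Blynk.begin(auth, ssid, pass);
}

void loop() {
  Blynk.run();                              // keep the Blynk connection alive
  Blynk.virtualWrite(V1, analogRead(A0));   // push a sensor reading to V1
  delay(1000);
}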
CHAPTER-5

UML DIAGRAMS

5.1 Sequence Diagram:


5.2 Component Diagram
5.3 Use Case Diagram
5.4 System Architecture

IoT architecture consists of different layers of technologies supporting IoT. It serves to illustrate how the various technologies relate to each other and to communicate the scalability, modularity and configuration of IoT deployments in different scenarios. The figure shows the detailed architecture of IoT. The functionality of each layer is described in a sequential manner.
FIGURE 3: Detailed IoT architecture
CHAPTER 6

SYSTEM IMPLEMENTATION

Arduino code
int s2, s7;   // line sensors
int s1, s8;
int s4;

void setup() {
  // put your setup code here, to run once:
  pinMode(9, OUTPUT);    // left motor
  pinMode(10, OUTPUT);
  pinMode(11, OUTPUT);   // right motor
  pinMode(12, OUTPUT);
  pinMode(2, INPUT);     // s1
  pinMode(3, INPUT);     // s2
  pinMode(A1, INPUT);    // s4
  pinMode(A4, INPUT);    // s7
  pinMode(A5, INPUT);    // s8
}

// 1 = white
// 0 = black
void loop() {
  // put your main code here, to run repeatedly:
  s2 = digitalRead(3);
  s7 = digitalRead(A4);
  s1 = digitalRead(2);
  s8 = digitalRead(A5);
  s4 = digitalRead(A1);

  if (s2 == 1 && s1 == 1) {        // turn right
    digitalWrite(9, LOW);
    digitalWrite(10, LOW);
    digitalWrite(11, HIGH);
    digitalWrite(12, LOW);
  }
  else if (s7 == 1 && s8 == 1) {   // turn left
    digitalWrite(9, HIGH);
    digitalWrite(10, LOW);
    digitalWrite(11, LOW);
    digitalWrite(12, LOW);
  }
  else if ((s2 == 0) && (s4 == 1) && (s7 == 0)) {  // go forward
    digitalWrite(9, HIGH);
    digitalWrite(10, LOW);
    digitalWrite(11, HIGH);
    digitalWrite(12, LOW);
  }
}

Traffic signal

const int r = 9;        // connect red LED at pin 9
const int y = 10;       // connect yellow LED at pin 10
const int g = 11;       // connect green LED at pin 11
const int sec = 1000;   // one second in milliseconds

void setup()
{
  pinMode(r, OUTPUT);
  pinMode(y, OUTPUT);
  pinMode(g, OUTPUT);
  delay(sec);
}

void loop()
{
  digitalWrite(r, HIGH);
  delay(sec * 5);
  digitalWrite(r, LOW);
  digitalWrite(y, HIGH);
  delay(sec * 5);
  digitalWrite(y, LOW);
  digitalWrite(g, HIGH);
  delay(sec * 5);
  digitalWrite(g, LOW);
}

rc_keyboard_control.ino

// assign pin num


int right_pin = 6;
int left_pin = 7;
int forward_pin = 10;
int reverse_pin = 9;

// duration for output


int time = 50;
// initial command
int command = 0;

void setup() {
pinMode(right_pin, OUTPUT);
pinMode(left_pin, OUTPUT);
pinMode(forward_pin, OUTPUT);
pinMode(reverse_pin, OUTPUT);
Serial.begin(115200);
}

void loop() {
//receive command
if (Serial.available() > 0){
command = Serial.read();
}
else{
reset();
}
send_command(command,time);
}

void right(int time){


digitalWrite(right_pin, LOW);
delay(time);
}

void left(int time){


digitalWrite(left_pin, LOW);
delay(time);
}

void forward(int time){


digitalWrite(forward_pin, LOW);
delay(time);
}

void reverse(int time){


digitalWrite(reverse_pin, LOW);
delay(time);
}

void forward_right(int time){


digitalWrite(forward_pin, LOW);
digitalWrite(right_pin, LOW);
delay(time);
}

void reverse_right(int time){


digitalWrite(reverse_pin, LOW);
digitalWrite(right_pin, LOW);
delay(time);
}

void forward_left(int time){


digitalWrite(forward_pin, LOW);
digitalWrite(left_pin, LOW);
delay(time);
}

void reverse_left(int time){


digitalWrite(reverse_pin, LOW);
digitalWrite(left_pin, LOW);
delay(time);
}

void reset(){
digitalWrite(right_pin, HIGH);
digitalWrite(left_pin, HIGH);
digitalWrite(forward_pin, HIGH);
digitalWrite(reverse_pin, HIGH);
}

void send_command(int command, int time){


switch (command){

//reset command
case 0: reset(); break;

// single command
case 1: forward(time); break;
case 2: reverse(time); break;
case 3: right(time); break;
case 4: left(time); break;

//combination command
case 6: forward_right(time); break;
case 7: forward_left(time); break;
case 8: reverse_right(time); break;
case 9: reverse_left(time); break;

default: Serial.print("Invalid Command\n");


}
}

Test

__author__ = 'zhengwang'

import socket
import time


class SensorStreamingTest(object):
    def __init__(self, host, port):
        self.server_socket = socket.socket()
        self.server_socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.server_socket.bind((host, port))
        self.server_socket.listen(0)
        self.connection, self.client_address = self.server_socket.accept()
        self.host_name = socket.gethostname()
        self.host_ip = socket.gethostbyname(self.host_name)
        self.streaming()

    def streaming(self):
        try:
            print("Host: ", self.host_name + ' ' + self.host_ip)
            print("Connection from: ", self.client_address)
            start = time.time()

            while True:
                sensor_data = float(self.connection.recv(1024))
                print("Distance: %0.1f cm" % sensor_data)

                # test for 10 seconds
                if time.time() - start > 10:
                    break
        finally:
            self.connection.close()
            self.server_socket.close()


if __name__ == '__main__':
    h, p = "192.168.1.100", 8002
    SensorStreamingTest(h, p)
Setting up environment with Anaconda
Install Miniconda (Python 3) on your computer.
Create the auto-rccar environment with all necessary libraries for this project:
conda env create -f environment.yml

Activate auto-rccar environment


source activate auto-rccar

To exit, simply close the terminal window. For more information about managing Anaconda
environments, please see the Anaconda documentation.

About the files

test/
rc_control_test.py: RC car control with keyboard
stream_server_test.py: video streaming from Pi to computer
ultrasonic_server_test.py: sensor data streaming from Pi to computer

model_train_test/
data_test.npz: sample data
train_predict_test.ipynb: a Jupyter notebook that goes through the neural network model in OpenCV3

raspberry_pi/
stream_client.py: stream video frames in JPEG format to the host computer
ultrasonic_client.py: send distance data measured by the sensor to the host computer

arduino/
rc_keyboard_control.ino: control the RC car controller

computer/
cascade_xml/: trained cascade classifiers
chess_board/: images for calibration, captured by the Pi camera
picam_calibration.py: Pi camera calibration
collect_training_data.py: collect images in grayscale, data saved as *.npz
model.py: neural network model
model_training.py: model training and validation
rc_driver_helper.py: helper classes/functions for rc_driver.py
rc_driver.py: receive data from the Raspberry Pi and drive the RC car based on the model prediction
rc_driver_nn_only.py: simplified rc_driver.py without object detection

Traffic signal
Traffic signal sketch contributed by @geek111

How to drive
Testing: Flash rc_keyboard_control.ino to the Arduino and run rc_control_test.py to
drive the RC car with the keyboard. Run stream_server_test.py on the computer and then
run stream_client.py on the Raspberry Pi to test video streaming. Similarly,
ultrasonic_server_test.py and ultrasonic_client.py can be used for sensor data
streaming tests.

Pi camera calibration (optional): Take multiple chessboard images using the Pi
camera module at various angles and put them into the chess_board folder, then run
picam_calibration.py; the returned parameters from the camera matrix will be used
in rc_driver.py.

Collect training/validation data: First run collect_training_data.py and then run
stream_client.py on the Raspberry Pi. Press the arrow keys to drive the RC car and press q to
exit. Frames are saved only when there is a key-press action. On exit, the data will
be saved into a newly created training data folder.

Neural network training: Run model_training.py to train a neural network model.
Please feel free to tune the model architecture/parameters to achieve a better result.
After training, the model will be saved into a newly created saved_model folder.
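A minimal training sketch, assuming an OpenCV ANN_MLP model (as mentioned for the notebook above) and a non-empty data file from the previous step; the layer sizes, file names and stopping criteria are illustrative only.

# Hypothetical MLP training sketch with OpenCV's machine-learning module.
import os
import cv2
import numpy as np

data = np.load('training_data/test_run.npz')             # hypothetical data file from data collection
X = data['train'].astype(np.float32)
y = data['train_labels'].astype(np.float32)

model = cv2.ml.ANN_MLP_create()
model.setLayerSizes(np.int32([X.shape[1], 32, y.shape[1]]))   # input, hidden, output layers (assumed sizes)
model.setActivationFunction(cv2.ml.ANN_MLP_SIGMOID_SYM)
model.setTrainMethod(cv2.ml.ANN_MLP_BACKPROP)
model.setTermCriteria((cv2.TERM_CRITERIA_COUNT | cv2.TERM_CRITERIA_EPS, 500, 1e-4))

model.train(X, cv2.ml.ROW_SAMPLE, y)

os.makedirs('saved_model', exist_ok=True)
model.save('saved_model/mlp.xml')                         # reloaded later by the driver script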

Cascade classifiers training (optional): Trained stop sign and traffic light classifiers
are included in the cascade_xml folder. If you are interested in training your own
classifiers, please refer to the OpenCV documentation and this great tutorial.
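For illustration, a minimal detection sketch (an assumption, not the project's detection code) that loads one of the trained cascades and runs it on a single frame; the XML file name, image path and detection parameters are hypothetical.

# Hypothetical cascade-detection sketch.
import cv2

stop_cascade = cv2.CascadeClassifier('cascade_xml/stop_sign.xml')   # hypothetical file name

frame = cv2.imread('frame.jpg', cv2.IMREAD_GRAYSCALE)               # hypothetical test frame
detections = stop_cascade.detectMultiScale(frame, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in detections:
    print("Stop sign candidate at", x, y, "size", w, h)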

Self-driving in action: First run rc_driver.py to start the server on the computer (for
the simplified version without object detection, run rc_driver_nn_only.py instead), and then
run stream_client.py and ultrasonic_client.py on the Raspberry Pi.
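The core of the driver loop can be sketched as follows, assuming the model was saved with the OpenCV ANN_MLP API shown earlier; the file path, preprocessing and class-to-command mapping are assumptions, not the actual rc_driver.py logic.

# Hypothetical prediction-loop sketch: reload the trained MLP and map its
# output for an incoming grayscale frame to a drive command code.
import cv2
import numpy as np

model = cv2.ml.ANN_MLP_load('saved_model/mlp.xml')        # hypothetical path from the training sketch

def predict_command(gray_frame):
    # flatten the frame exactly as during data collection (assumed preprocessing)
    sample = gray_frame.reshape(1, -1).astype(np.float32)
    _, output = model.predict(sample)
    return int(np.argmax(output))                         # index of the strongest steering class

# the returned code would then be sent to the Arduino over serial,
# as in the earlier host-side sender sketch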
CHAPTER 7

SYSTEM TESTING
The purpose of testing is to discover errors. Testing is the process of trying to discover every
conceivable fault or weakness in a work product. It provides a way to check the functionality
of components, sub-assemblies, assemblies and/or a finished product. It is the process of
exercising software with the intent of ensuring that the software system meets its requirements
and user expectations and does not fail in an unacceptable manner. There are various types of
tests. Each test type addresses a specific testing requirement.

7.1 TYPES OF TESTS


7.1.1 Unit testing
Unit testing involves the design of test cases that validate that the internal program logic is
functioning properly and that program inputs produce valid outputs. All decision branches and
internal code flow should be validated. It is the testing of individual software units of the
application and is done after the completion of an individual unit, before integration. This is
structural testing that relies on knowledge of the unit's construction and is invasive. Unit tests perform
basic tests at the component level and test a specific business process, application, and/or system
configuration. Unit tests ensure that each unique path of a business process performs accurately
to the documented specifications and contains clearly defined inputs and expected results.

Unit testing is usually conducted as part of a combined code and unit test phase of the
software lifecycle, although it is not uncommon for coding and unit testing to be conducted as
two distinct phases.
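As a simple illustration only (not taken from the project code), a unit test for a hypothetical helper that maps keyboard keys to the Arduino command codes could look like this:

# Illustrative unit-test sketch; command_for_key and its mapping are hypothetical.
import unittest

def command_for_key(key):
    mapping = {'up': 1, 'down': 2, 'left': 3, 'right': 4}
    return mapping.get(key, 0)          # 0 is treated here as the "stop/invalid" code

class CommandForKeyTest(unittest.TestCase):
    def test_known_keys_map_to_codes(self):
        self.assertEqual(command_for_key('up'), 1)
        self.assertEqual(command_for_key('right'), 4)

    def test_unknown_key_maps_to_default(self):
        self.assertEqual(command_for_key('space'), 0)

if __name__ == '__main__':
    unittest.main()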

7.1.2 Integration testing


Integration tests are designed to test integrated software components to determine if
they actually run as one program. Testing is event driven and is more concerned with the basic
outcome of screens or fields. Integration tests demonstrate that although the components were
individually satisfactory, as shown by successful unit testing, the combination of components
is correct and consistent. Integration testing is specifically aimed at exposing the problems
that arise from the combination of components.

Software integration testing is the incremental integration testing of two or more integrated
software components on a single platform to produce failures caused by interface defects.

The task of the integration test is to check that components or software applications, e.g.
components in a software system or – one step up – software applications at the company level
– interact without error.

7.1.3 Functional test


Functional tests provide systematic demonstrations that functions tested are available as
specified by the business and technical requirements, system documentation, and user manuals.

Functional testing is centered on the following items:

Valid Input: identified classes of valid input must be accepted.

Invalid Input: identified classes of invalid input must be rejected.

Functions: identified functions must be exercised.

Output: identified classes of application outputs must be exercised.

Systems/Procedures: interfacing systems or procedures must be invoked.

7.1.4 System Test

System testing ensures that the entire integrated software system meets requirements. It
tests a configuration to ensure known and predictable results. An example of system testing is
the configuration-oriented system integration test. System testing is based on process
descriptions and flows, emphasizing pre-driven process links and integration points.
7.1.5 White Box Testing
White Box Testing is testing in which the software tester has knowledge of the inner
workings, structure and language of the software, or at least its purpose. It is used to test
areas that cannot be reached from a black box level.

7.1.6 Black Box Testing


Black Box Testing is testing the software without any knowledge of the inner workings,
structure or language of the module being tested. Black box tests, like most other kinds of tests,
must be written from a definitive source document, such as a specification or requirements
document. It is testing in which the software under test is treated as a black box: you cannot
"see" into it. The test provides inputs and responds to outputs without considering how the
software works.

7.1.7 Acceptance Testing


User Acceptance Testing is a critical phase of any project and requires significant
participation by the end user. It also ensures that the system meets the functional requirements.
CHAPTER 8

SCREENSHOTS
CHAPTER 9
FUTURE SCOPE
Our project "IOT based Smart Vehicle Automation and Control with
Enhanced Safety, Security and Tracking System using Wireless Sensors" is
mainly intended to operate devices using an Android mobile phone through Wi-Fi.
CHAPTER 10

CONCLUSION

Projections for the impact of IoT on the Internet and economy are
impressive, with some anticipating as many as 100 billion connected IoT
devices and a global economic impact of more than $11 trillion by 2025.
The potential economic impact of IoT is huge, but the journey to IoT
adoption is not a seamless one. There are many challenges that face
companies looking to implement IoT solutions. However, the risks and
disadvantages associated with IoT can be overcome.
CHAPTER 11

BIBLIOGRAPHY

The sites which were used while doing this project:

1. www.wikipedia.com

2. www.allaboutcircuits.com

3. www.microchip.com

4. www.howstuffworks.com

Books referred:

1. Raj Kamal – Microcontrollers: Architecture, Programming, Interfacing and System Design.

2. Mazidi and Mazidi – Embedded Systems.

3. PCB Design Tutorial – David L. Jones.

4. PIC Microcontroller Manual – Microchip.

5. Pyroelectric Sensor Module – Murata.

6. Embedded C – Michael J. Pont.
