Major Project Report: Submitted By: Submitted To
CHAPTER 1: INTRODUCTION
1.1 Project objective
1.2 Scope of work
CHAPTER 2: TECHNICAL DESCRIPTION
CHAPTER 3: WORKING
3.1 Circuit Explanation
3.2 Block Diagram
CHAPTER 4: CONCLUSION
4.1 Picture of the completed project
REFERENCES
CERTIFICATE
This is to certify that the work titled “APPLICATIONS BASED ON AUGMENTED REALITY
AND INTERNET OF THINGS”, submitted by “RISHABH GOSWAMI” and “RAGHAV
RANJAN” in partial fulfillment for the award of the degree in ELECTRONICS AND
COMMUNICATION of Jaypee Institute of Information Technology, Noida, has been carried out
under my supervision. This work has not been submitted, partially or wholly, to any other University
or Institute for the award of this or any other degree or diploma.
ACKNOWLEDGEMENT
We want to express our sincerest gratitude to Dr. GAURAV VERMA (Project Supervisor),
whose exceptional knowledge and guidance have been a constant source of inspiration.
We could never adequately thank all those whose assistance, guidance, cooperation and
criticism contributed to the improvement of this report; we are deeply indebted to all of them.
We would also like to thank Mr. MANDEEP SINGH NARULA and Dr. RICHA GUPTA, who
provided us the opportunity to present our project for evaluation, and the teaching and
non-teaching staff of the Department of Electronics and Communication Engineering for their
invaluable help and support. Since performance feedback is essential for effective communication,
any mistakes and constructive feedback on the report may be unhesitatingly communicated to us;
they will be duly acknowledged and are most welcome. In this report, whatever is beneficial comes
from the Almighty, and whatever is faulty is ours. We will always be thankful to these people.
Signature of the student ………………………………
Signature of the student ………………………………
Date ……………………………….
PREFACE
Project work is a part of our curriculum that gives us knowledge about a particular kind of work
in an organization and helps us understand how an organization functions. It also helps us
understand and correlate the theoretical concepts that remain uncovered in the classroom. We have
prepared this project as a part of our “MAJOR PROJECT FOR SEMESTER VII”. The topic we
have chosen for the project is “APPLICATIONS BASED ON AUGMENTED REALITY AND
INTERNET OF THINGS”.
CHAPTER 1
INTRODUCTION
For the IoT-AR based thirsty plant, a soil moisture sensor is placed in the soil of the plant. When
the reading of the sensor is below the pre-set value, the animated water level goes down; when the
reading is above the pre-set value, the water level goes up. To update the user on the condition of
the system, an ESP8266 (NodeMCU) is attached. This device reads the moisture level of the soil
and sends the data to the cloud. The cloud then sends the data to Unity, and the water level in the
animation is adjusted according to that data. The user is also kept informed about the status of the
system via emails and graphs of the moisture level. The project is thus an integration of the offline
plant system with the online IoT and AR system.
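To make this data flow concrete, the following is a minimal C# sketch of the Unity side, assuming a cloud endpoint that returns the latest moisture reading as a plain ADC value (0-1023); the URL, polling interval and scaling below are illustrative placeholders, not the actual project code.

// Illustrative Unity component (not the actual project script): it polls a cloud
// endpoint for the latest soil-moisture reading and scales a "water level" object
// in the AR scene. The URL and value range below are placeholder assumptions.
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class MoistureWaterLevel : MonoBehaviour
{
    public Transform waterLevel;          // object representing the animated water level
    public string cloudUrl = "https://example-cloud/api/latest-moisture";  // placeholder endpoint
    public float pollIntervalSeconds = 15f;
    public float maxHeight = 1.0f;        // scene-space height for fully wet soil

    void Start()
    {
        StartCoroutine(PollMoisture());
    }

    IEnumerator PollMoisture()
    {
        while (true)
        {
            using (UnityWebRequest req = UnityWebRequest.Get(cloudUrl))
            {
                yield return req.SendWebRequest();
                if (!req.isNetworkError && !req.isHttpError)
                {
                    // Assumed response: a plain number in ADC counts (0 = wet, 1023 = dry).
                    float raw;
                    if (float.TryParse(req.downloadHandler.text, out raw))
                    {
                        float wetness = Mathf.Clamp01(1f - raw / 1023f);
                        Vector3 scale = waterLevel.localScale;
                        scale.y = wetness * maxHeight;   // raise or lower the animated level
                        waterLevel.localScale = scale;
                    }
                }
            }
            yield return new WaitForSeconds(pollIntervalSeconds);
        }
    }
}

On the hardware side, the NodeMCU would simply post the analog reading of the soil-moisture sensor to the same cloud channel at a similar interval.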
For the Weather-AR project, a dashboard was made that consists of boxes with buttons for time,
temperature, UV index, humidity, etc. It takes information from the internet and displays it on the
dashboard.
1.1 Project objective
The objective is to design a thirsty plant system using the Internet of Things and Augmented Reality.
IoT helps us retrieve the data, i.e. the moisture level of the plant, and AR shows us a 3D water-level
visualisation. For the IoT implementation we used the NodeMCU, which is based on the ESP8266.
For the AR implementation we used the Unity 3D software integrated with the Vuforia platform.
1.2 Scope of work
The scope of these IoT-AR based projects is large. In the thirsty plant implementation, we can save
a substantial amount of water by pouring only the appropriate amount of water on the plant after
seeing the 3D visualisation of the water level. We can also limit the use of fertilizers by watching
the moisture level, because excess water combined with fertilizers can damage the plant.
The scope of Weather-AR is also large. It can help us whenever we want to go out, for example in
an emergency. Later, traffic information can also be integrated, which will tell us in advance whether
the traffic outside is heavy or not.
CHAPTER 2
TECHNICAL DESCRIPTION
Unity is a cross-platform game engine developed by Unity Technologies, first announced and released
in June 2005 at Apple Inc.'s Worldwide Developers Conference as an OS X-exclusive game engine. As
of 2018, the engine has been extended to support 27 platforms. The engine can be used to create
both three-dimensional and two-dimensional games as well as simulations for its many platforms.
Several major versions of Unity have been released since its launch, with the latest stable version at
the time of writing being Unity 2018.2.17, released on November 22, 2018. Unity gives users the
ability to create games in both 2D and 3D, and the engine offers a primary scripting API in C#, both
for the Unity editor in the form of plugins and for the games themselves, as well as drag-and-drop
functionality. Prior to C# becoming the primary programming language used for the engine, Unity
also supported scripting in Boo and UnityScript (a JavaScript-like language).
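As a minimal, generic illustration of this C# scripting API (not part of this project), a Unity script is a class derived from MonoBehaviour and attached to a GameObject, and the engine calls its methods such as Update() once per frame:

// Generic example of a Unity C# script: the engine calls Update() once per frame
// on every enabled MonoBehaviour attached to an active GameObject.
using UnityEngine;

public class Spinner : MonoBehaviour
{
    public float degreesPerSecond = 45f;   // public fields are editable in the Inspector

    void Update()
    {
        // Rotate the object this component is attached to, frame-rate independently.
        transform.Rotate(0f, degreesPerSecond * Time.deltaTime, 0f);
    }
}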
The engine has support for the following graphics APIs: Direct3D on Windows and Xbox
One; OpenGL on Linux, macOS, and Windows; OpenGL ES on Android and iOS; WebGL on the
web; and proprietary APIs on the video game consoles. Additionally, Unity supports the low-level
APIs Metal on iOS and macOS and Vulkan on Android, Linux, and Windows, as well as Direct3D
12 on Windows and Xbox One.
Since NodeMCU is an open-source platform, its hardware design is open to edit, modify and build
upon. The NodeMCU Dev Kit/board is built around the ESP8266, a low-cost Wi-Fi-enabled chip
developed by Espressif Systems with a TCP/IP protocol stack. For more information about the
ESP8266, refer to the ESP8266 Wi-Fi Module documentation. The NodeMCU Dev Kit has
Arduino-like analog (A0) and digital (D0-D8) pins on its board. It supports serial communication
protocols such as UART, SPI and I2C. Using these serial protocols, we can connect it to serial
devices such as an I2C-enabled LCD display, the HMC5883 magnetometer, the MPU-6050
gyroscope + accelerometer, RTC chips, GPS modules, touch-screen displays, SD cards, etc.
The NodeMCU development board features Wi-Fi capability, an analog pin, digital pins and
serial communication protocols. To get started with using NodeMCU for IoT applications, we
first need to know how to write/download the NodeMCU firmware to the development board,
and before that, where to obtain a NodeMCU firmware that matches our requirements. An online
NodeMCU custom build service is available, with which we can easily generate custom NodeMCU
firmware as per our requirements.
CHAPTER 3
WORKING
3.1 WeatherAR
We created a dashboard that displays data such as temperature, humidity, motion detection, time,
etc. This involves linking our IoT board (NodeMCU) with Vuforia. The sensor data, together with
the data fetched from AccuWeather, is then displayed on the dashboard. The dashboard is motion
enabled, i.e. it shows all the data only when it is gently shaken or moved. The NodeMCU fetches
data from the AccuWeather website, which provides an API for time and weather data for every
location. We use this API in our C# script and display the values on our dashboard. The data so
provided is real time and is updated at a fixed interval. The received time is then shown on a digital
clock, which gives it a better look.
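A simplified sketch of how such a script might look is given below; the endpoint, API key parameter, JSON field name and shake threshold are placeholders for illustration and do not reproduce the exact AccuWeather API format.

// Illustrative dashboard script (placeholders, not the real AccuWeather format):
// refreshes the weather when the device is shaken and keeps a digital clock ticking.
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.UI;

public class WeatherDashboard : MonoBehaviour
{
    public Text temperatureText;
    public Text timeText;
    public string weatherUrl = "https://example-weather/api/current?apikey=YOUR_KEY"; // placeholder
    public float shakeThreshold = 2.0f;   // acceleration magnitude (in g) treated as a shake

    private bool refreshing = false;

    void Update()
    {
        // "Motion enabled": only refresh the data when the device is gently shaken.
        if (!refreshing && Input.acceleration.sqrMagnitude > shakeThreshold * shakeThreshold)
        {
            StartCoroutine(RefreshWeather());
        }

        // Drive the digital clock locally.
        timeText.text = System.DateTime.Now.ToString("HH:mm:ss");
    }

    IEnumerator RefreshWeather()
    {
        refreshing = true;
        using (UnityWebRequest req = UnityWebRequest.Get(weatherUrl))
        {
            yield return req.SendWebRequest();
            if (!req.isNetworkError && !req.isHttpError)
            {
                // Assumed response shape: {"temperature": 28.5}
                Readings r = JsonUtility.FromJson<Readings>(req.downloadHandler.text);
                temperatureText.text = r.temperature.ToString("0.0") + " °C";
            }
        }
        refreshing = false;
    }

    [System.Serializable]
    private class Readings
    {
        public float temperature;
    }
}

Humidity, UV index and the other dashboard boxes would be handled in the same way, each driven by its own field of the parsed response.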
3.2 Block Diagram
[Block diagram figure: the power supply and the other system blocks.]
CHAPTER 4
CONCLUSION
We have created two different applications in Augmented Reality integrated with IoT. The RulAR
has a wide range of applications in small and big projects in the industrial area. The WeatherAR
can be used anywhere, from home to office, or in any other project.
REFERENCES
1. J. G. Wang, X. Xiao, and H. Hua, “Augmented reality 3-D displays with micro integral
imaging,” J. Display Technol., vol. 11, no. 11, pp. 889-893, Nov. 2015.
2. H. Ling, “Augmented Reality in Reality,” IEEE Multimedia, vol. 24, no. 3, pp. 10-15, 2017.
3. T. Lang, B. MacIntyre, and I. J. Zugaza, “Massively multiplayer online worlds as a platform
for augmented reality experiences,” in Proc. IEEE VR ’08, pp. 5-10.