Edge Computing
Seminar on Edge Computing
Definition
Edge computing is a distributed computing
paradigm that brings computation and data
storage closer to the sources of data.
Introduction
Facts About Edge Computing
• For a time, personal computing was the
dominant computing model.
• Applications ran and data was stored
locally on a user's device, or sometimes
within an on-premise data center.
• Cloud computing, a more recent
development, offered a number of
advantages over this locally based, on-
premise computing.
Facts About Edge Computing
• Cloud services are centralized in a vendor-
managed "cloud" (or collection of data
centers) and can be accessed from any
device over the Internet.
• However, cloud computing can introduce
latency because of the distance between
users and the data centers where cloud
services are hosted.
Facts About Edge Computing
• Early computing: Centralized applications
only running on one isolated computer
• Personal computing: Decentralized
applications running locally
• Cloud computing: Centralized applications
running in data centers
• Edge computing: Centralized applications
running close to users, either on the
device itself or on the network edge.
Example of Edge Computing
• Consider a building secured with dozens of
high-definition IoT video cameras. These are
"dumb" cameras that simply output a raw
video signal and continuously stream that
signal to a cloud server.
• On the cloud server, the video output from all
the cameras is put through a motion-detection
application to ensure that only clips featuring
activity are saved to the server’s database.
Example of Edge Computing
• This means there is a constant and significant
strain on the building’s Internet infrastructure,
as significant bandwidth gets consumed by the
high volume of video footage being
transferred.
• Additionally, there is very heavy load on the
cloud server that has to process the video
footage from all the cameras simultaneously.
Example of Edge Computing
• Now imagine that the motion-detection computation is moved to the network edge.
What if each camera used its own internal
computer to run the motion-detecting
application and then sent footage to the cloud
server as needed?
• This would result in a significant reduction in bandwidth use, because much of the camera footage would never have to travel to the cloud server.
Example of Edge Computing
• Additionally, the cloud server would now
only be responsible for storing the
important footage, meaning that the server
could communicate with a higher number of
cameras without getting overloaded.
• This is what edge computing looks like.
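A minimal Python sketch of the edge-side logic in this example is below. The read_frame and upload_clip helpers are hypothetical stand-ins for a real camera SDK and a real cloud upload API; the point is simply that the motion check runs on the camera itself, so routine footage never leaves the device.

import numpy as np

MOTION_THRESHOLD = 12.0  # mean per-pixel frame difference (0-255) treated as "activity"

def read_frame(camera_id: int) -> np.ndarray:
    # Hypothetical stand-in for the camera's capture API; returns one grayscale frame.
    return np.random.randint(0, 256, size=(720, 1280), dtype=np.uint8)

def upload_clip(camera_id: int, frame: np.ndarray) -> None:
    # Hypothetical stand-in for sending footage to the cloud server (e.g. an HTTPS upload).
    print(f"camera {camera_id}: motion detected, uploading footage")

def monitor(camera_id: int, num_frames: int = 50) -> None:
    # The motion check runs on the camera; only frames showing activity leave the device.
    previous = read_frame(camera_id).astype(np.int16)
    for _ in range(num_frames):
        current = read_frame(camera_id).astype(np.int16)
        score = float(np.abs(current - previous).mean())  # simple frame-difference motion score
        if score > MOTION_THRESHOLD:
            upload_clip(camera_id, current.astype(np.uint8))
        previous = current

monitor(camera_id=1)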
Use Cases of Edge Computing
• Security system monitoring: As described
above.
• IoT devices: Smart devices that connect to the
Internet can benefit from running code on the
device itself, rather than in the cloud, for more
efficient user interactions.
• Self-driving cars: Autonomous vehicles need to
react in real time, without waiting for
instructions from a server.
Use Cases of Edge Computing
• More efficient caching: By running code on a CDN edge network, an application can customize how content is cached and serve it to users more efficiently (see the sketch after this list).
• Medical monitoring devices: It is crucial for
medical devices to respond in real time
without waiting to hear from a cloud server.
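To make the caching case concrete, here is a minimal, framework-agnostic Python sketch. Real CDN edge platforms expose their own APIs, so handle_request, fetch_from_origin, and cache_ttl_for are hypothetical names; the idea is that the application's own caching policy runs on the edge node instead of at the origin.

import time

CACHE: dict[str, tuple[float, bytes]] = {}  # url -> (expiry timestamp, cached body)

def fetch_from_origin(url: str) -> bytes:
    # Hypothetical stand-in for the request an edge node makes back to the origin server.
    return f"content for {url}".encode()

def cache_ttl_for(url: str) -> float:
    # Application-specific caching policy, running at the edge.
    if url.endswith((".jpg", ".css", ".js")):
        return 3600.0  # static assets: cache for an hour
    return 30.0        # everything else: cache briefly

def handle_request(url: str) -> bytes:
    # Serve from the edge cache when possible; otherwise fetch from origin and cache per policy.
    now = time.time()
    cached = CACHE.get(url)
    if cached and cached[0] > now:
        return cached[1]  # cache hit: no trip back to the origin
    body = fetch_from_origin(url)
    CACHE[url] = (now + cache_ttl_for(url), body)
    return body

print(handle_request("https://example.com/logo.jpg"))  # first request goes to the origin
print(handle_request("https://example.com/logo.jpg"))  # repeat request is served from the edge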
Benefits of Edge Computing
Cost savings
• As the example above shows, edge computing helps minimize bandwidth use and the server resources required. Bandwidth and cloud resources are finite and cost money.
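A back-of-envelope illustration of that saving, using the camera example; the camera count, stream bitrate, and share of footage containing motion are assumed numbers, not figures from the example above.

CAMERAS = 30              # assumed number of cameras in the building
STREAM_MBPS = 4.0         # assumed bitrate of one HD video stream
ACTIVITY_FRACTION = 0.05  # assumed share of footage that actually contains motion

cloud_processing = CAMERAS * STREAM_MBPS                     # every camera streams constantly
edge_processing = CAMERAS * STREAM_MBPS * ACTIVITY_FRACTION  # only motion clips are uploaded

print(f"cloud-side motion detection: ~{cloud_processing:.0f} Mbps of uplink bandwidth")
print(f"edge-side motion detection:  ~{edge_processing:.0f} Mbps of uplink bandwidth")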
Performance
• Another significant benefit of moving processing to the edge is reduced latency: every time a device has to communicate with a distant server, the round trip adds delay.
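A rough lower-bound comparison makes the latency point concrete; the distances are assumed for illustration, and real latency also includes routing, queuing, and processing time.

FIBER_KM_PER_MS = 200  # light travels roughly 200 km per millisecond in optical fiber

def round_trip_ms(distance_km: float) -> float:
    # Propagation delay only, there and back.
    return 2 * distance_km / FIBER_KM_PER_MS

print(f"distant cloud region (~2000 km away): at least {round_trip_ms(2000):.0f} ms per round trip")
print(f"nearby edge node (~50 km away): at least {round_trip_ms(50):.1f} ms per round trip")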
Drawbacks of Edge Computing
Conclusion
Thanks