Understanding Industry 4 V007a

The document discusses digital environments and digital transformation. It covers topics like megatrends, digital transformation frameworks, the hype cycle for digital transformation, and how to start a digital transformation journey. It also discusses concepts like industry 4.0, the technologies that enable it, and how digital transformation relates to industry 4.0.

Digital Environments and Digital Transformation
Dr. Gustavo A. Santana Torrellas
Gustavo Santana
A mathematician with broad experience in cybersecurity and in the development of mathematical models for administrative and technological systems. His research spans many related disciplines, and he has published more than 120 research articles.
Gustavo's background is as multidimensional as the visions he helps create.
Gustavo Santana has participated in projects on IT strategy and on the development of models and solutions for information risk and technology risk management, with an emphasis on standards compliance.
Over the last 15 years he has participated in projects for the analysis, design and implementation of innovation models in commercial and development banking in Mexico.
Gustavo Santana was Executive Director of the Financial Sector consulting practice at EY (2017-2019); he was a Director at PwC (2012-2017); he also carried out technology consulting work at Accenture (2008-2012).
He is a guest lecturer at institutions such as Tecnológico de Monterrey, UDLAP Jenkins Graduate Schools, Instituto Tecnológico Autónomo de México (ITAM), Instituto Politécnico Nacional (IPN) and CINVESTAV, among others.
Digital
Environments
and Digital
Transformation
Megatrends 2030+

The concept of megatrend was first introduced by John Naisbitt in the book Megatrends (Naisbitt, 1982), describing a long-term, transformational process with global reach, broad scope, and a fundamental and dramatic impact.
Three dimensions of
Megatrends
The 5 Megatrends
Design and futures thinking: scenario transitions along the "future…"
What could disrupt our future
understanding of the world?
5 Megatrends shaping the future
Technology
imperative shift
translated into
Digital
Transformation
What is Digital Transformation?
Digital transformation is the integration of digital technology into all areas of a business, fundamentally changing how you operate and deliver value to customers.
Digital Transformation = Business Transformation

It's also a cultural change that requires organizations to continually challenge the status quo, experiment, and get comfortable with failure.
Digital
Transformation
Taxonomy
Procedural
pattern map for
Digital
Transformation
Value Generation
The two dimensions of Digital Transformation
Digital
Transformation
as a combination
of Digital
Governance +
Digital
Infrastructure +
Digital Economy
The four Tiers of
Digital
Transformation
Digital Transformation Pillars
Market forces
that drive
digitalization

Digital can reshape every


aspect of the modern
enterprise
Market forces that drive digitalization
Emergence of Ecosystems: New ecosystems accessible through digital channels reduce switching costs
Reduced Ownership of Assets & Infrastructure: Growth of data is accelerating, and is forcing issues
around ownership, privacy, security, transparency, and trust
Reduced Barriers to Digital Entry: Low barriers to digital entry blur industry lines
Decoupled Value Chains: Increased speed, velocity, transparency and access disaggregate value
chains
New Entrants: Businesses are reaching farther and disaggregating business offerings invading new
spaces
The Internet of me: Through the personalization of apps and services, the third wave of the Internet is
placing users at the center of every digital experience
Emergence of
Ecosystems |
Decoupled
Value Chains |
The Internet of
me
Emergence of Ecosystems |
Decoupled Value Chains |
The Internet of me | IoB
Market forces
that drive
digitalization
Outcome economy: As sensors and connectivity
become ubiquitous in a growing number of
environments, enterprises have an increasing ability
to measure the outcomes of the services they
deliver
Business models that sell results appeal to customers
far more than those that just sell products
Market forces that drive digitalization
The intelligent enterprise: Advances in data science, cognitive technology and processing
power have combined to open up the possibility of ‘intelligent enterprises’, built around smart
machines and software intelligence
By turning big data into smart data, firms can achieve higher levels of operational efficiency
and innovation
Workforce reimagined: The digital economy is creating ever-greater demand for machines and
humans to work together effectively
Advances in wearable devices, natural interfaces and smart machines are opening up new
opportunities to empower human talent through technology
The intelligent enterprise
data science, cognitive technology and
processing power have combined to open up
the possibility of ‘intelligent enterprises’
Market forces
that drive
digitalization

The Platform (r)evolution: Rapid advances in


cloud and mobile connectivity are dismantling
the technological barriers and reducing the costs
associated with establishing global platforms
These platforms offer huge potential for
innovation and the delivery of next-generation
services
The Platform
(r)evolution:
Digital Capabilities
Sense and interpret disruption: Look beyond your own industry. Be prepared to blur the lines between the physical and digital worlds.
Experiment to develop and launch ideas faster: Stop innovating and look to solve customer problems instead.
Develop platforms for fast and cheap experiments: Find or fund one venture that could most disrupt you.
Understand and leverage data: Organize data hackathons. Think beyond big data to consider different types of data. Find new ways to monetize data. Create an analytics team.
Build and maintain a high-quotient digital team: Be honest about how digitally savvy you and your workforce are. Create digital boot camps to reskill employees.
Partner and invest for all noncore activities: One of the characteristics of effective digital leaders is their intuitive understanding that the journey is not one to be undertaken alone.
Organize for speed: Ensure CEO support and the presence of a dedicated central team to drive the new digital growth, supported by a team of digitally savvy executers.
Design a delightful user experience: User experience drives IT architectures, and not vice versa.
9-steps to developing a capability-centric Digital Transformation
Roadmap
Digital Transformation = Data-Driven
Organization

In a data-driven approach, decisions


are made based on data instead of
intuition
Following a data-driven approach offers
measurable advantages
That's because a data-driven strategy
uses facts and hard information rather
than gut instinct
Mapping the
journey to
becoming a
Data-Driven
Organization

Using a data-driven approach makes it easier to be objective


about decisions
Digital
Transformation
Framework

is the blueprint for


how an Organization
moves through a
period of significant
change because of
the current evolving
business conditions ‍

The framework is a tool, used across an organization, that guides all levels
of the organization through the journey
What’s all the
hype about?
The Gartner Hype Cycle
Have you wondered what it’s all about and whether
it can help you?
It is published annually, so it is not something you need
to look at very often, but each year there are new and
interesting entrants on the Cycle while others make
progress along it.
It is worth mentioning that there are different versions
for different audiences,…
Hype Cycle Digital
Transformation
How to start your
Digital
Transformation
Journey
Digital
Disruption

Many different paths lead to digital transformation and each


organization’s journey will be unique
In every case, though, starting a digital transformation journey
requires a new mindset
It is a chance to reimagine how companies do things, often from the
ground up
Understanding
Industry 4.0

A Conceptual Framework for Industry 4.0


What is
Industry 4.0?
Core concept
map of Industry
4.0 and
technologies that
enable its
implementation
Concept
Relationship
Map
Representing
Pillar
Connections
in Industry 4.0
Digital Transformation and Industry 4.0
Industry 4.0
Technologies

”Industry 4.0”
Different
technologies
for different
types of
innovations
Data is the key
concept
... impact of
digitalization in all
areas and sectors
Pyramid of
Digital
Transformation
and
Industry 4.0
Industry 4.0:
Cybersecurity
and Smart Data
Current state of Industry 4.0
Industry 4.0
Industry 5.0
Industry 6.0

…and beyond!!!
From Industry 4.0 to Industry 6.0
…and beyond
How do we get to Digital Transformation?
Future
computing
scenarios
Vision and
key features of
Computing
2030
Internet evolution: Historiography
Services and principal milestones
Internet
evolution:
Services
Internet
Protocols
Suite
Internet
evolution: IPv6
What is IPv6?
IPv6 is an upgrade of the IPv4 protocol, designed to solve the problem of address exhaustion.
Its development began in December 1998, when Steve Deering and Robert Hinden, employees of Cisco and Nokia, published a formal specification of the protocol.
Its initial goal was to eventually replace IPv4, whose limit on the number of admissible network addresses was restricting the growth of the Internet and its use, for IoT situations among others.
IPv4 allows 4,294,967,296 (2^32) different device addresses, a number smaller than the world population, and smaller than the total number of devices.
By early 2010, less than 10% of IPv4 addresses remained unassigned.
What is IPv6?
The development and introduction of IPv6 has not been free of controversy.
In many respects, IPv6 is a conservative extension of IPv4: the most widely used functions are kept, others that are less important or rarely used have been removed or made optional, and new features have been added.
Most transport and application protocols need little or no change to operate over IPv6; the exceptions are application protocols that embed network-layer addresses, such as FTP or NTP.
What is IPv6?
IPv6 supports 340,282,366,920,938,463,463,374,607,431,768,211,456 addresses (2^128, about 340 undecillion), which is roughly 6.7 × 10^17 (670 quadrillion) addresses per square millimetre of the Earth's surface.
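As a quick sanity check, these figures can be reproduced with a few lines of Python; the Earth's surface area (about 510.1 million km²) is the only assumption beyond the 2^128 address space.

```python
# Back-of-the-envelope check of the IPv6 address-space figures above.
# Earth's surface area (~510.1 million km^2) is the only external assumption.

ipv6_addresses = 2 ** 128          # 340,282,366,920,938,463,463,374,607,431,768,211,456
ipv4_addresses = 2 ** 32           # 4,294,967,296

earth_surface_km2 = 510_100_000                           # approximate total surface
earth_surface_mm2 = earth_surface_km2 * (10 ** 6) ** 2    # 1 km^2 = 10^12 mm^2

per_mm2 = ipv6_addresses / earth_surface_mm2
print(f"IPv6 addresses: {ipv6_addresses:,}")
print(f"Addresses per mm^2 of Earth's surface: {per_mm2:.2e}")   # ~6.7e17
```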
IoT: IPv4 &
IPv6
Opportunities and Challenges towards Internetworking 2030
What Is Cloud
Computing/Security?

Cloud Computing
Evolution of
IT towards
Cloud
Computing
What Is Cloud Computing?

Cloud computing is the on-demand availability


of computer system resources, especially data
storage and computing power, without direct active
management by the user
The term is generally used to describe data
centers available to many users over the Internet
Clouds may be limited to a single organization (enterprise
clouds), or be available to many organizations (public
cloud)
Types of
Cloud
deployment
What Is
Cloud
Computing?

Cloud computing
relies on sharing of
resources to
achieve coherence
and economies of
scale
(Business Model)
Proponents also claim that cloud computing allows
enterprises to get their applications
up and running faster,
with improved manageability and
less maintenance,
and that it enables IT teams to more rapidly adjust
resources to meet fluctuating and unpredictable
demand, providing the burst computing capability: high
computing power at certain periods of peak demand

What Is Cloud Computing?

Large clouds, predominant today, often


have functions distributed over multiple
locations from central servers
If the connection to the user is relatively close,
it may be designated an edge server
Different
approaches to
the cloud have
different
scopes of
adoption
Service models
Though service-oriented
architecture advocates "Everything as a
service" (with the acronyms EaaS or xaaS),
cloud-computing providers offer their
"services" according to different models,
of which the three standard models
per NIST are:
Infrastructure as a Service (IaaS),
Platform as a Service (PaaS),
Software as a Service (SaaS).
Infrastructure as
a service (IaaS)

The NIST's definition of cloud computing describes IaaS as


"where the consumer is able to deploy and run arbitrary software,
which can include operating systems and applications. The consumer
does not manage or control the underlying cloud infrastructure but has
control over operating systems, storage, and deployed applications;
and possibly limited control of select networking components (e.g.,
host firewalls)."
Platform as a
service (PaaS)
The NIST's definition of cloud computing defines Platform as a Service as:
The capability provided to the consumer is to deploy onto the cloud
infrastructure consumer-created or acquired applications created using
programming languages, libraries, services, and tools supported by the
provider. The consumer does not manage or control the underlying
cloud infrastructure including network, servers, operating systems, or
storage, but has control over the deployed applications and possibly
configuration settings for the application-hosting environment
Software as a
service (SaaS)

The NIST's definition of cloud computing defines Software as a Service as:


The capability provided to the consumer is to use the provider's applications
running on a cloud infrastructure. The applications are accessible from
various client devices through either a thin client interface, such as a web
browser (e.g., web-based email), or a program interface. The consumer does
not manage or control the underlying cloud infrastructure including network,
servers, operating systems, storage, or even individual application
capabilities, with the possible exception of limited user-specific application
configuration settings.
What Is Cloud Computing?

Advocates of public and hybrid


clouds note that cloud computing
allows companies to avoid or
minimize up-front IT
infrastructure costs
Cloud value chain
Why Cloud Computing?
Essential
aspects of
Cloud
Computing
Cloud computing exhibits the following key characteristics:
Agility for organizations may be improved, as cloud computing may increase users' flexibility with re-provisioning, adding, or expanding technological infrastructure resources.
Cost reductions are claimed by cloud providers. A public-cloud delivery model converts capital expenditures (e.g., buying servers) to operational expenditure. This purportedly lowers barriers to entry, as infrastructure is typically provided by a third party and need not be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is "fine-grained", with usage-based billing options. As well, fewer in-house IT skills are required for implementation of projects that use cloud computing.
Cloud computing exhibits the following key characteristics:

Device and location independence enable


users to access systems using a web browser
regardless of their location or what device they
use (e.g., PC, mobile phone)
As infrastructure is off-site (typically provided
by a third-party) and accessed via the Internet,
users can connect to it from anywhere
Cloud computing
exhibits the following
key characteristics:

Maintenance of cloud computing applications


is easier, because they do not need to be
installed on each user's computer and can be
accessed from different places (e.g., different
work locations, while travelling, etc.)
Cloud computing
exhibits the following
key characteristics:

Multitenancy enables sharing of resources and costs across a large pool of users thus allowing for:
centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
peak-load capacity increases (users need not engineer and pay for the resources and equipment to
meet their highest possible load-levels)
utilisation and efficiency improvements for systems that are often only 10–20% utilised.
key design
considerations
for a multi-tenant
cloud
Cloud computing
exhibits the following
key characteristics:

Performance is monitored by IT experts from


the service provider, and consistent and loosely
coupled architectures are constructed
using web services as the system interface.
Cloud computing
exhibits the following
key characteristics:

Productivity may be increased when multiple users can work on the same data
simultaneously, rather than waiting for it to be saved and emailed. Time may be saved as
information does not need to be re-entered when fields are matched, nor do users need to
install application software upgrades to their computer.
Cloud computing exhibits the following key characteristics:
Availability improves with the use of multiple redundant sites, which makes well-designed cloud computing suitable for business continuity and disaster recovery.
Cloud computing
exhibits the following
key characteristics:
Scalability and elasticity via dynamic ("on-
demand") provisioning of resources on a fine-grained, self-
service basis in near real-time (Note, the VM startup time varies by
VM type, location, OS and cloud providers), without users having
to engineer for peak loads.
This gives the ability to scale up when the usage need increases
or down if resources are not being used.
Emerging approaches for managing elasticity include the use of
machine learning techniques to propose efficient elasticity
models.
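As a minimal sketch of this idea, the threshold-based loop below scales a pool of instances up or down from observed utilisation. All names here (the Autoscaler class, the thresholds) are illustrative assumptions, not any cloud provider's API; real autoscalers are far more sophisticated.

```python
# Minimal sketch of threshold-based elasticity: scale the number of
# instances up or down based on observed utilisation. All names are
# illustrative, not a real cloud API.

class Autoscaler:
    def __init__(self, min_instances=1, max_instances=10,
                 scale_up_at=0.80, scale_down_at=0.30):
        self.instances = min_instances
        self.min, self.max = min_instances, max_instances
        self.up, self.down = scale_up_at, scale_down_at

    def step(self, utilisation: float) -> int:
        """Adjust capacity for one monitoring interval and return it."""
        if utilisation > self.up and self.instances < self.max:
            self.instances += 1          # scale up under peak load
        elif utilisation < self.down and self.instances > self.min:
            self.instances -= 1          # release idle resources
        return self.instances

scaler = Autoscaler()
for load in [0.15, 0.55, 0.90, 0.95, 0.85, 0.40, 0.10]:
    print(f"load={load:.2f} -> instances={scaler.step(load)}")
```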
Cloud computing exhibits the following key characteristics:
Security can improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data and the lack of security for stored kernels.
Security is often as good as or better than other traditional systems, in part because service providers are able to devote resources to solving security issues that many customers cannot afford to tackle or which they lack the technical skills to address.
Cloud computing exhibits the following key characteristics:
However, the complexity of security is greatly increased when data is distributed over a wider area or over a greater number of devices, as well as in multi-tenant systems shared by unrelated users.
In addition, user access to security audit logs may be difficult or impossible.
Private cloud installations are in part motivated by users' desire to retain control over the infrastructure and avoid losing control of information security.
Recap of previous ideas
Cloud: Types,
Service and main
Characteristics
Cloud Computing Management
Approach
Evolution of
Cloud
Computing
Applications
Cloud vs
Distributed
Computing
Enhanced Cloud Computing
Definition

The availability of high-capacity networks, low-cost


computers and storage devices as well as the
widespread adoption of hardware
virtualization, service-oriented architecture and
autonomic and utility computing has led to growth
in cloud computing.
Reason for
Virtualization
Reasons for virtualization
A virtual machine (VM) can be more easily controlled and inspected from a remote site than a physical
machine, and the configuration of a VM is more flexible.
This is very useful in kernel development and for teaching operating system courses, including running
legacy operating systems that do not support modern hardware.
A new virtual machine can be provisioned as required without the need for an up-front hardware
purchase.
Reasons for virtualization
A virtual machine can easily be relocated from one physical machine to another as needed.
For example, a salesperson going to a customer can copy a virtual machine with the demonstration software to
their laptop, without the need to transport the physical computer.

An error inside a virtual machine does not harm the host system, so there is no risk of the OS crashing
on the laptop.
Because of this ease of relocation, virtual machines can be readily used in disaster recovery scenarios without concern about the impact of refurbished or faulty energy sources.
Service-oriented
architecture -
SOA

…is a style of software design where services


are provided to the other components
by application components, through
a communication protocol over a network
Service-oriented
architecture -
SOA

A SOA element is a discrete unit of


functionality that can be accessed remotely
and acted upon and updated independently,
• such as retrieving a credit card
statement online
SOA is also intended to be independent of vendors,
products and technologies.
Service-oriented architecture - SOA
A service has four properties according to one of many definitions of SOA:
It logically represents a business activity with a specified outcome
It is self-contained
It is a black box for its consumers, meaning the consumer does not have to be aware of the service's inner workings
It may consist of other underlying services
SOA Defining concepts
A manifesto was published for service-oriented architecture in October 2009.
This came up with six core values, which are listed as follows:
Business value is given more importance than technical strategy.
Strategic goals are given more importance than project-specific benefits.
Intrinsic interoperability is given more importance than custom integration.
Shared services are given more importance than specific-purpose implementations.
Flexibility is given more importance than optimization.
Evolutionary refinement is given more importance than pursuit of initial perfection.
Service-oriented
architecture -
SOA
Different services can be used in
conjunction to provide the functionality
of a large software application, a
principle SOA shares with modular
programming.
Service-oriented architecture
integrates distributed, separately
maintained and deployed software
components.
It is enabled by technologies and
standards that facilitate components'
communication and cooperation over a
network, especially over an IP network.
Service-oriented
architecture -
SOA

SOA is related to the idea of an application


programming interface (API), an interface or
communication protocol between different parts of a
computer program intended to simplify the
implementation and maintenance of software.
An API can be thought of as the service, and the SOA
the architecture that allows the service to operate.
SOA Defining concepts
The related buzzword service-orientation promotes loose coupling between services.
SOA separates functions into distinct units, or services, which developers make accessible over a
network in order to allow users to combine and reuse them in the production of applications.
These services and their corresponding consumers communicate with each other by passing data in a
well-defined, shared format, or by coordinating an activity between two or more services.
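To make the "well-defined, shared format" idea concrete, here is a minimal sketch of a SOA-style service using only Python's standard library: one discrete business function (a hypothetical credit-statement lookup, echoing the earlier example) exposed over HTTP as JSON. The endpoint and field names are invented for illustration, not taken from any real system.

```python
# Minimal sketch of a SOA-style service: one discrete business function
# (a hypothetical credit-statement lookup) exposed over HTTP with a
# well-defined JSON format. Standard library only; names are illustrative.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

STATEMENTS = {"42": {"customer": "42", "balance": 1250.75, "currency": "USD"}}

class StatementService(BaseHTTPRequestHandler):
    def do_GET(self):
        # Consumers only see the contract /statement/<id> -> JSON;
        # the implementation behind it stays a black box.
        if self.path.startswith("/statement/"):
            body = STATEMENTS.get(self.path.rsplit("/", 1)[-1])
            status = 200 if body else 404
            payload = json.dumps(body or {"error": "not found"}).encode()
            self.send_response(status)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(payload)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), StatementService).serve_forever()
```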
Autonomic
computing

Autonomic computing (AC) refers to the self-


managing characteristics of distributed
computing resources, adapting to unpredictable
changes while hiding intrinsic complexity to
operators and users.
Autonomic computing
Initiated by IBM in 2001, this
initiative ultimately aimed to
develop computer systems capable
of self-management, to overcome
the rapidly growing complexity of
computing systems management,
and to reduce the barrier that
complexity poses to further growth
Autonomic computing

Autonomy-oriented computation is a
paradigm proposed by Jiming Liu in 2001 that
uses artificial systems imitating social animals'
collective behaviours to solve difficult
computational problems.
For example, ant colony optimization could be
studied in this paradigm.
Autonomic computing
The AC system concept is designed to make adaptive decisions, using high-level policies.
It will constantly check and optimize its status and automatically adapt itself to changing conditions.
Autonomic
computing
An AC can be modeled in terms of:
• two main control schemes (local and global)
with sensors (for self-monitoring),
• effectors (for self-adjustment),
• knowledge and planner/adapter for exploiting
policies based on self- and environment awareness
This architecture is sometimes referred to as Monitor-Analyze-Plan-Execute (MAPE)
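A minimal sketch of the MAPE loop in Python may help fix the idea; the managed element, the sensor, and the effector here are hypothetical stand-ins for real instrumentation.

```python
# Sketch of the Monitor-Analyze-Plan-Execute (MAPE) control loop
# described above. The managed resource and the policy are stand-ins;
# a real autonomic manager would read sensors and drive effectors.

import random, time

TARGET_TEMP = 70.0          # high-level policy: keep temperature near 70

def monitor():              # sensors: observe the managed element
    return 70 + random.uniform(-10, 10)

def analyze(temp):          # compare observation against the policy
    return temp - TARGET_TEMP

def plan(deviation):        # choose a corrective action
    if deviation > 2:  return "cool"
    if deviation < -2: return "heat"
    return "hold"

def execute(action):        # effectors: apply the adjustment
    print(f"effector -> {action}")

for _ in range(5):          # the loop runs continuously in practice
    execute(plan(analyze(monitor())))
    time.sleep(0.1)
```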
Autonomic Computing-Layered
Approach
An autonomic
computing
framework is
composed of
autonomic comp
onents (AC)
interacting with
each other
Autonomic computing
Driven by such a vision, a variety of architectural frameworks based on "self-regulating" autonomic components has recently been proposed.
A very similar trend has characterized significant research in the area of multi-agent systems.
However, most of these approaches are typically conceived with centralized or cluster-based server architectures in mind, and mostly address the need to reduce management costs rather than the need to enable complex software systems or provide innovative services.
Some autonomic systems involve mobile agents interacting via loosely coupled communication mechanisms.
Utility computing

…is a service provisioning model in which a service


provider makes computing resources and
infrastructure management available to the
customer as needed, and charges them for specific
usage rather than a flat rate
Like other types of on-demand computing (such as
grid computing), the utility model seeks to
maximize the efficient use of resources and/or
minimize associated costs.

Utility computing
Utility is the packaging of system resources, such as computation, storage and services, as a metered service.

This model has the advantage of a low or no initial


cost to acquire computer resources; instead,
resources are essentially rented.
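A toy calculation illustrates the metered model against a flat rate; all rates and usage figures below are hypothetical.

```python
# Tiny illustration of utility (metered) pricing versus a flat rate.
# All prices and usage figures are hypothetical.

RATE_PER_CPU_HOUR = 0.05      # assumed metered price
RATE_PER_GB_MONTH = 0.02
FLAT_MONTHLY_FEE = 500.00     # assumed cost of owning equivalent capacity

usage = {"cpu_hours": 1200, "storage_gb_months": 800}

metered = (usage["cpu_hours"] * RATE_PER_CPU_HOUR
           + usage["storage_gb_months"] * RATE_PER_GB_MONTH)

print(f"Metered bill: ${metered:.2f}")      # $76.00: pay only for actual usage
print(f"Flat-rate bill: ${FLAT_MONTHLY_FEE:.2f}")
```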
Cloud
Computing
Challenges:
Technical,
Managerial,
Relational
Cloud
Computing
Security
Dimensions
and Categories
Security and
privacy issues
in Cloud
Computing

Cloud computing poses privacy concerns because the service


provider can access the data that is in the cloud at any time.
Security and privacy issues in Cloud computing
According to the Cloud Security Alliance, the top three threats in the cloud, and the share of all cloud security outages each accounted for, are:
• Insecure Interfaces and APIs (29%)
• Data Loss & Leakage (25%)
• Hardware Failure (10%)
There is the problem of legal ownership of
the data (If a user stores some data in the
cloud, can the cloud provider profit from it?).
Many Terms of Service agreements are
silent on the question of ownership.
Physical control of the computer equipment
(private cloud) is more secure than having
the equipment off site and under someone
else's control (public cloud).
Cloud
infrastructure
and Platform
Protection
Hierarchy
Hype Cycle
for Cloud
Security
A Winning Cloud Migration
How to Migrate to Cloud:
Strategy
Key issues of cloud migration:
• latency, influenced by the number of router hops
• packet delays, introduced by virtualization
• server placement within a data center
Cloud Migration Philosophy: Underlying Cloud Migration Strategies
To devise an effective strategy for moving your on-premises applications and databases to the cloud, the focus must be on getting the underlying principles of cloud migration right.
The following are the key
components of a cloud
migration philosophy that
must be taken into
consideration:
• End-to-end Automation
• Innovation
• Disaster Recovery
• Cloud Governance
• Remote Access and Control
• Shared Resources and Cost-effectiveness
• System Modernization
key components of a cloud migration
philosophy
Key Outcomes and Metrics
to Track

Analyzing and presenting the benefits of cloud migration in business


terms can be complex and confusing, especially for those with a
technical bent of mind
This process can be simplified by viewing
these benefits from the prism of tangible
outcomes and metrics such as:
What Is Edge
Computing?

Edge computing is essentially the process


of decentralizing computer services and
moving them closer to the source of data
This can have a significant impact on latency, as it can drastically reduce the volume of data moved
and the distance it travels
Proximity, or low latency, is extremely important in business because data loses value as it ages

The Relationship Between Edge and Cloud
The distributed nature of edge computing means that along with reducing latency, it also improves resiliency, reduces networking load, and is easier to scale.
Processing of data starts at its source. Once initial processing is completed, only the data that needs further analysis or requires other services needs to be sent. This reduces networking requirements and the potential for bottlenecks at any centralized services.
With other nearby edge locations, or the potential of caching data on the device, you can mask outages and improve your system's resiliency.
This reduces the need to scale your centralized services since they are handling less traffic. The results can also reduce costs, architecture complexity, and management.
Evolving
Computing
Paradigms:
Cloud, Edge,
and Fog
Technologies
Big data & big data security
Simple to start

What is the maximum file size you have dealt with so far?
Movies/files/streaming video that you have used? What have you observed?
What is the maximum download speed you get?
Simple computation: how much time just to transfer it?
What are we going to understand?
What is Big Data?
CyberSecurity - why we landed up there?
Cybersecurity in Big Data - to whom does it matter?
Big Data Security - are we ready to handle it?
What are the concerns?
Introduction to Big Data
Data sets representing information will only keep increasing in the future

What is Big Data
Big Data refers to data sets with size beyond the ability of the typical BI tools and techniques to capture, store, analyze and manage.
The definition is still evolving and can vary by industry sector. Currently it is intentionally subjective, without ties to a specific storage size.
Today big data in many sectors can range from several terabytes to petabytes of storage.
The underlying enabling technology (Hadoop & MapReduce) relies heavily on distributed processing.

What Problems will it solve
Today organizations capture and store an ever increasing amount of data.
Internet availability, interconnectedness, connection speed and mobility contribute to a torrent of data points being generated daily.
Organizations realize the potential value of these extreme-sized data sets and discard less and less information (customer data or internal).
However, the existing means to process and analyze data cannot scale to extreme sizes economically.
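The distributed-processing idea behind Hadoop & MapReduce can be sketched in-process: map emits key-value pairs, a shuffle groups them by key, and reduce aggregates each group. The classic word-count example below runs locally in plain Python; real frameworks distribute the same phases across many machines.

```python
# In-process sketch of the MapReduce pattern that underpins Hadoop:
# map emits (key, value) pairs, shuffle groups them by key, reduce
# aggregates each group. Here everything runs locally.

from collections import defaultdict

def map_phase(document):
    return [(word.lower(), 1) for word in document.split()]

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

docs = ["Big Data refers to data sets", "data sets keep increasing"]
pairs = [p for d in docs for p in map_phase(d)]
print(reduce_phase(shuffle(pairs)))   # {'big': 1, 'data': 3, 'sets': 2, ...}
```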

What is Big Data?


Big data security

…is the collective term for all the measures and


tools used to guard both the data and analytics
processes from attacks, theft, or other malicious
activities that could harm or negatively affect them
Much like other forms of cyber-security, the big data variant is concerned with attacks that originate either from the online or offline spheres.
Big Data security
Big Data security challenge
Big Data
security
CYBER
SECURITY
TRENDS

Big Data
security
Big Data
security
analytics
Big Data
security
analytics
Big Data
security
analytics
Challenges
To capture the full potential of Big Data, several key challenges have to be overcome moving forward:

Governance, Regulation and Security
• Ownership of, and access to, data.
• Traditional compliance and security tools might not fit.
• Extreme-size data sets will have to reside in cloud storage to leverage its flexibility and distributedness.

Organizational Change and Talent
• Big Data still in early stages; there might be organizational chart changes required.
• Shortage of specialized analytical skills.

IT Delivery and Industry Structure
• Traditional SDLC models might not work.
• Industry sectors not yet ready infrastructure-wise.
• Politics and industry leaders' buy-in.
• New business models might be required.

Supporting Technology
• Technology still evolving.
• Analytical theory to support big data not mature.
Security Intelligence
Cyber-Data
• Logs, events, network flows, user id & activity, etc.
Big Data Analytics
• Models, baselining
• Feature extraction
• Anomaly detection
• Context (external sources of information)
Dashboard
• Security analyst (human) looks at indicators
• Correlates with external sources of info to detect attacks
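As a minimal sketch of the "baselining + anomaly detection" step of such a pipeline, the snippet below learns a per-user baseline from historical log-in counts and flags strong deviations with a z-score. The data and the threshold are synthetic; production analytics use far richer features and models.

```python
# Minimal sketch of baselining + anomaly detection in a security-
# analytics pipeline: learn a baseline from past activity counts,
# then flag new observations that deviate strongly. Synthetic data.

from statistics import mean, stdev

baseline_logins = [12, 15, 11, 14, 13, 12, 16, 14]   # daily log-ins, past weeks
mu, sigma = mean(baseline_logins), stdev(baseline_logins)

def is_anomalous(count, threshold=3.0):
    """Flag counts more than `threshold` standard deviations from baseline."""
    z = (count - mu) / sigma
    return abs(z) > threshold, z

for today in [13, 47]:
    flagged, z = is_anomalous(today)
    print(f"logins={today} z={z:+.1f} -> {'ALERT' if flagged else 'ok'}")
```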
PET
Privacy Enhancing Technologies:
PET is a term for a set of computer
tools and applications which when
integrated with online services allow
online users to protect the privacy of
their personally identifiable
information
Taxonomy of Privacy Enhancing Technologies
Ubiquitous
computing

"ubicomp" is a concept in software


engineering and computer science where
computing is made to appear anytime and
everywhere.
The underlying technologies to support ubiquitous computing include:
• Internet
• advanced middleware
• operating systems
• mobile code
• sensors
• microprocessors
• new I/O and user interfaces
• computer networks
• mobile protocols
• location and positioning
• new materials
Ubiquitous computing
This paradigm is also described as:
• pervasive computing
• ambient intelligence
• "everyware"
Each term emphasizes slightly different aspects.
When primarily concerning the objects involved, it is also known as:
• physical computing
• the Internet of Things
• haptic computing
• "things that think"
IOT - INTERNET OF THINGS
…is a system of interrelated computing devices, mechanical and digital machines provided with unique identifiers (UIDs) and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.
IoT Security Issues
Security is the biggest concern in adopting Internet of Things technology, with concerns that rapid development is happening without appropriate consideration of the profound security challenges involved and the regulatory changes that might be necessary.
Protecting security and privacy has become the primary concern in these new telecommunication networks, as risks can have high consequences.
Four phases of Network
Transformation towards
Network Softwarization
in 5G.

It illustrates how the above technologies have enabled the deployment of a softwarized 5G network, spanning from an inflexible fixed-mobile architecture to a dynamic and agile software-based network architecture.
5G Evolved Security Model

Wireless communication systems are not only limited to typical


phone audio and video calls.
5G Evolved
Security
Model for
upcoming
technology’s
threat
awareness.
In the 5G security model, data confidentiality is one of the main security requirements; the
property that can protect data transmission from disclosure to unauthorized entities and from
passive attacks (i.e., eavesdropping).

Confidentiality:
Considering the 4G-LTE and 5G architectures, any user-plane data must be confidential and protected from unauthorized users [73].
Standard data encryption algorithms have been widely adopted to realize data confidentiality in 5G network applications (e.g., vehicle networks [74], health monitoring [75], etc.).
A symmetric-key encryption algorithm can be utilized to encrypt and decrypt 5G data with one private key, which is shared between the communicating entities (e.g., a sender and a receiver).
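The snippet below sketches this shared-key principle with AES-GCM from the third-party Python `cryptography` package. It illustrates the concept only; 5G standardises its own encryption algorithms (the NEA family), not this exact API.

```python
# Sketch of symmetric-key encryption with a single shared key, the
# principle described above. Uses AES-GCM from the `cryptography`
# package (pip install cryptography). Conceptual illustration only;
# not the actual 5G cipher suite.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # shared between sender and receiver
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # must never repeat for the same key

plaintext = b"user-plane data"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)   # encrypts and appends a tag

assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext
print("round-trip ok, ciphertext length:", len(ciphertext))
```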
Integrity:
This is to prevent tampering with and loss of information during transmission from one point to another.
Integrity of 5G New Radio (NR) traffic is protected similarly to 4G.
In 5G NR, the integrity of wireless data traffic is protected at the Packet Data Convergence Protocol (PDCP) layer.
In 4G LTE, integrity protection is provided only for the Non-Access Stratum (NAS) and Access Stratum (AS) [82].
However, one main key advancement in 5G integrity protection is that 5G NR offers integrity protection of the user plane as well.
This is significant because 4G did not support integrity protection of the user plane.
This new feature is useful for small data transmissions, particularly for constrained IoT devices.
Moreover, the 5G authentication mechanism 5G-AKA uses integrity-protected signaling.
This ensures that no unauthorized party can modify or access the information that is communicated over the air [83].
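The keyed-tag idea behind integrity protection can be sketched with Python's standard-library `hmac` module; again, 5G defines its own integrity algorithms (the NIA family), so this is conceptual only.

```python
# Sketch of integrity protection with a message authentication code.
# A keyed tag lets the receiver detect tampering in transit; this only
# illustrates the idea, not 5G's standardised NIA algorithms.

import hmac, hashlib

key = b"shared-integrity-key"
message = b"small IoT data transmission"

tag = hmac.new(key, message, hashlib.sha256).digest()   # sender attaches tag

# Receiver recomputes the tag; compare_digest avoids timing leaks.
ok = hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())
print(ok)                                               # True: message intact

tampered = message.replace(b"small", b"large")
bad = hmac.compare_digest(tag, hmac.new(key, tampered, hashlib.sha256).digest())
print(bad)                                              # False: tampering detected
```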
Availability:
In the 5G domain, network availability is about ensuring that network resources are accessible whenever they are needed by legitimate users, since availability affects the reputation of the service provider.
In other words, availability ensures the high-probability effectiveness of the network infrastructure. It also measures the sustainability of a network against active attacks, e.g., a DoS attack.

A DoS attack can degrade network performance.
However, in [84], the authors suggested that via extreme Mobile Broadband (eMBB) and ultra-reliable Machine-Type Communication (uMTC), network availability of at least 95% and 99.99%, respectively, can be achieved for 5G applications.
Centralized security policy:
In 5G networks, the current 3GPP 4G security architectures cannot be directly applied to the new 5G use-cases, as they are dedicated to the traditional operator-subscriber trust model.
Therefore, to support new innovations (such as NFV and SDN), there is a need for a centralized security-policy management system that provides convenience for users to access applications and resources.
In [85], Thanh et al. proposed a policy-based security management framework (VISECO) to support centralized security management for 5G.
The authors claimed that with the help of VISECO, mobile operators can secure their network infrastructure.
In addition, operators can enable Security-as-a-Service (SecaaS) as a potential solution for several customers such as IoT vendors.
Visibility:
Visibility enables E2E awareness of mobile networks to the control plane.
This can efficiently tackle basic network issues to ensure a secure environment.
5G networks need to utilize comprehensive end-to-end security strategies, which should cover all layers of the network, including the application, signaling and data planes.
To implement such a comprehensive security mechanism, 5G operators should have complete visibility, inspection and control over all layers in the network.
Here, 5G technologies should be integrated with open APIs to manage the security policies.
In such a way, a 5G network can have consistent security policies for both software and hardware in the network.
The enhanced visibility across the network and security policies will help to implement contextual security mechanisms suitable for new 5G services.
Moreover, enhanced visibility enables data-driven threat prevention to find and isolate infected devices before attacks can potentially take place.
Artificial
Intelligence &
Industry 4.0
As more companies focus on major AI-driven digital transformations,… cyber risks and threats increase.
Why
Machine
Learning
AI for Cyber Security:
How does AI prevent cyberattacks?
This section takes a look at how advances in cyber analytics and artificial intelligence allow cybersecurity teams to assess and analyze in advance the next move of an evasive attack/adversary.
CYBERTHREATS
First, let’s talk about
some common cyber
safety threats and the
problems they can
cause.
5 Digital Security threats that machine
learning can protect against

Unfortunately, machine learning will never be a silver bullet for Digital Security, unlike image recognition or natural language processing, two areas where machine learning is thriving.
5 Digital Security threats that machine learning can protect against:
Machine learning for Network Protection
Machine learning for Endpoint Protection
Machine learning for Application Security
Machine learning for User Behavior
Machine learning for Process Behavior
Artificial-Intelligence
capabilities mapped
to their potential uses
in domains where
they may be of
societal benefit
Blockchain
A technology?
A currency?
The new internet?
Blockchain has the potential to transform
the functioning of a wide range of
industries
Its features can increase the
transparency and traceability of goods,
data and financial assets, facilitate market
access and improve the efficiency of
transactions
Governments will play a significant role in shaping policy and regulatory frameworks
that help address challenges presented by the technology, and foster transparent, fair
and stable markets as blockchain develops

Fulfilling blockchain’s potential,


however, depends on a policy
environment that allows innovation and
experimentation, while balancing the
risks of misuse
A technology?
Fundamentally, blockchain is a combination of already existing technologies that together can create networks that secure trust between people or parties who otherwise have no reason to trust one another.
Specifically, it utilises distributed ledger technology (DLT) to store information verified by cryptography among a group of users, which is agreed through a pre-defined network protocol, often without the control of a central authority.
The marriage of these technologies gives blockchain networks key characteristics that can remove the need for trust, and therefore enable a secure transfer of value and data directly between parties.
A currency?
Although mostly known for its digital financial asset applications (like Bitcoin), blockchain technology is poised to have an impact on a wide range of sectors.
Due to this unique ability, blockchain technology can diminish the role of intermediaries, who can command market power, collect significant fees, slow economic activity, and are not necessarily trustworthy or altruistic keepers of personal information.
A blockchain is a shared ledger of transactions between parties in a network, not
controlled by a single central authority

What is
blockchain?
You can think of a ledger like a record book: it
records and stores all transactions between users
in chronological order.

Instead of one authority controlling


this ledger (like a bank), an identical
copy of the ledger is held by all
users on the network, called nodes
Blockchain
Along with its own hash, each block stores the hash of the block before it.
A hash is a unique string of letters and numbers created from text using a mathematical formula.
Blockchain
Blocks are therefore "chained" together, making the ledger (almost) immutable, or unable to be changed.
To add a block, it may first need to be mined and then approved by a number of nodes through a consensus mechanism, as in the sketch below.
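A minimal sketch in Python ties these mechanics together: each block records the previous block's hash, and "mining" searches for a nonce whose hash meets a difficulty target. It is purely illustrative; real networks add transactions, timestamps, and a full consensus protocol on top.

```python
# Minimal sketch of hash-chained blocks with naive proof-of-work mining.
# Purely illustrative; real blockchains are far more elaborate.

import hashlib

def block_hash(prev_hash: str, data: str, nonce: int) -> str:
    return hashlib.sha256(f"{prev_hash}|{data}|{nonce}".encode()).hexdigest()

def mine(prev_hash: str, data: str, difficulty: int = 4):
    """Find a nonce so the block hash starts with `difficulty` zeros."""
    nonce = 0
    while True:
        h = block_hash(prev_hash, data, nonce)
        if h.startswith("0" * difficulty):
            return nonce, h
        nonce += 1

chain = [("genesis", "0" * 64)]
for data in ["alice pays bob 5", "bob pays carol 2"]:
    nonce, h = mine(chain[-1][1], data)
    chain.append((data, h))          # the new block links to the previous hash
    print(f"mined {data!r} nonce={nonce} hash={h[:16]}...")
```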
Before going further, it is important to note that not every blockchain is made the same
While there are a number of variable features, two of the most important are:
the “openness” of the platform (public or private) and
the level of permissions required to add information to the blockchain (permissioned or permissionless)

Different types of
blockchain
Public blockchains (like Bitcoin) are open for
anyone to read and view,

Private blockchains can only be viewed by a


chosen group of people.

Different types of
Permissioned blockchains permit just a select
blockchain group of users to write (i.e. generate transactions
for the ledger to record) and commit (i.e. verify
new blocks for addition to the chain).

Permissionless blockchains allow anyone to


contribute and add data to the ledger.
The main types of blockchain segmented by permission
model
Blockchain layers
Blockchain's key characteristics: Distributed | Immutable | Agreed by consensus

Distributed
One of the core aspects of a blockchain is that it is a distributed ledger, meaning that the database is maintained and held by all nodes in the network.
No central authority holds or updates the ledger; rather, each node independently constructs its own record by processing every block (group of transactions), deciding if it is valid, then voting via the consensus mechanism on its conclusions.
Once a change in the record is agreed, each node updates its own ledger. In contrast, traditional databases are stored and maintained centrally, which can make them high-value targets for hackers and criminals.
Immutable
In general, once a transaction is added to a blockchain ledger, it cannot be undone.
This immutability is one of the principal aspects that contribute to the trustworthiness of blockchain transactions.
A blockchain's immutability is secured through its use of cryptography (see below for an explanation of hashing).
In a traditional, centralised database, an authorised user can connect to the server to add or modify the data without the approval or detection of other users.
Because all the data is held in one place, if the security of the server or the authority that runs the server is compromised, data can be modified or permanently deleted.
This may sometimes be irreversible and occur without anyone else realising it.
Agreed by consensus
No block can be added to the ledger without approval from specified nodes in the network.
Rules regarding how this consent is collected are called consensus mechanisms.
Consensus protocols are crucial in ensuring that every block is valid and that all participants agree on and maintain the same version of the ledger.
They heavily affect the incentives for nodes to act honestly and are therefore the most important variables when designing a blockchain.
Final consideration
CYBER 2.0 & 3.0
Extra material
theme of understanding future risk and complex uncertainty
…the idea of bearing risk and managing
uncertainty has been considered in
framing questions that “set the scene”
WHAT IS THE
OPTIMAL WAY
TO BEAR RISK
With this in mind,…
What are the
potential changes
in risks and the
emerging risks?
What changes are there in the desire to bear risks?

Risk appetite vs
Risk Tolerance
WHAT IS THE OPTIMAL WAY TO BEAR RISK
Consequential questions:
How can we expect risks to be identified and measured in the future?
How could we deal with extreme events?
How can risks be priced?
What is the optimal differentiation in pricing risk (depooling)?
How are the above affected by the availability of data?
What changes will there be in forms of risk-bearing? For example, insurance markets, hedging, investment markets or government as the risk-bearer.
Global Risk Management Survey Risk
Ranking
