Lecture 4-1

Connecting Your Organization to the Remote Cloud Data Center

What Are the Tools Used for Remote Management?

• VPN Access
• A virtual private network (VPN) allows for secure, and usually encrypted, connections over a public network, as shown in the figure.
• A VPN connection is usually set up between the network management location, or the customer managing their cloud deployment, and the cloud services being monitored and managed.
• Alternatively, a cloud provider may offer network connections for management or monitoring over a direct dedicated connection rather than a VPN.
Is My Data Safe? (Replication and Synchronization)
• Synchronous replication is the process of replicating data in real time from the primary storage system to a remote facility.
• Replication is the transfer and synchronization of data between multiple data centers. For disaster recovery and data security purposes, your data must be transferred, or replicated, between data centers. Remote copies of data have traditionally been created with storage backup applications.
• However, with the virtualization of servers in the cloud, you can now replicate complete VM instances, copying entire server instances, with all of their applications, service packs, and content, to a remote facility.
• Asynchronous replication works on a store-and-forward model and is a cost-effective protection and backup solution. With asynchronous replication, the data is first written to the primary storage system in the primary storage facility or cloud location. After the data is stored, it is then copied to remote replicas on a scheduled basis or in near real time.
• Asynchronous replication is much more cost-effective than implementing a synchronous replication offering. Since asynchronous replication is not in real time, it works well over slower wide area network links, where a certain amount of network delay is to be expected.
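The store-and-forward model can be sketched in a few lines: the write is acknowledged as soon as it reaches primary storage, and a background worker copies it to the replica later. This is a minimal illustration, not any provider's replication API; all names and the in-memory "storage" are stand-ins.

```python
import queue
import threading
import time

# Minimal sketch of asynchronous (store-and-forward) replication.
# Writes are acknowledged once they hit primary storage; a background
# worker copies them to the remote replica afterward.

class AsyncReplicator:
    def __init__(self):
        self.primary = {}              # primary storage (in-memory stand-in)
        self.replica = {}              # remote replica (in-memory stand-in)
        self._pending = queue.Queue()  # store-and-forward buffer
        worker = threading.Thread(target=self._drain, daemon=True)
        worker.start()

    def write(self, key, value):
        # 1. Write to primary storage and acknowledge immediately.
        self.primary[key] = value
        # 2. Queue the change for later copying to the replica.
        self._pending.put((key, value))

    def _drain(self):
        while True:
            key, value = self._pending.get()
            time.sleep(0.01)           # stands in for WAN latency
            self.replica[key] = value
            self._pending.task_done()

    def flush(self):
        # Wait until the replica has caught up (for demonstration only).
        self._pending.join()

repl = AsyncReplicator()
repl.write("invoice-42", b"...")
# write() returns before the replica is updated; flush() waits for it.
repl.flush()
print(repl.replica["invoice-42"] == repl.primary["invoice-42"])  # True
```

With synchronous replication, by contrast, the write would not be acknowledged until the replica confirmed it, which is why synchronous offerings demand low-latency links.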
Understanding Load Balancers
• Load balancing addresses the issues found when cloud workloads
and connections increase to the point where a single server can no
longer handle the workload or performance requirements of web,
DNS, or FTP servers; firewalls; and other network services.
• With load balancing, you can configure the cloud for many servers
working together and sharing the load. Therefore, redundancy and
scalability can be achieved.
• A load balancer is commonly found in front of web servers. The website’s IP address is advertised on the network via DNS. This IP address is not that of a real web server but is instead an interface on the load balancer. The load balancer allocates the traffic by distributing the connections to one of the many servers connected to it.
• A load balancer can check the health of each server and remove a
server from the network if there is a hardware, network, or
application issue.
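The core logic described above, distributing connections across a pool and skipping unhealthy servers, can be sketched as follows. The health checks are stubbed out here; a real balancer would probe TCP ports or HTTP endpoints, and the addresses are placeholders.

```python
import itertools

# Sketch of a load balancer's core logic: the balancer owns the
# advertised IP, keeps a pool of back-end servers, and hands each
# new connection to the next healthy server (round-robin).

class LoadBalancer:
    def __init__(self, servers):
        self.servers = list(servers)
        self.healthy = set(self.servers)
        self._ring = itertools.cycle(self.servers)

    def mark_down(self, server):
        # Remove a server with a hardware, network, or application issue.
        self.healthy.discard(server)

    def mark_up(self, server):
        self.healthy.add(server)

    def pick(self):
        # Round-robin over healthy servers only.
        for _ in range(len(self.servers)):
            server = next(self._ring)
            if server in self.healthy:
                return server
        raise RuntimeError("no healthy back-end servers")

lb = LoadBalancer(["10.0.0.11", "10.0.0.12", "10.0.0.13"])
lb.mark_down("10.0.0.12")                 # health check failed
picks = [lb.pick() for _ in range(4)]
print(picks)                              # only .11 and .13 appear
```

Round-robin is only one policy; production balancers also offer least-connections, weighted, and session-affinity distribution.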
Vulnerability Scanning
• Vulnerability scanning is used to find objects in your cloud deployment that can be exploited or are potential security threats. A vulnerability scanner is an application that has a database of known exploits and runs them against your deployment to see whether it may be susceptible or have security holes that need to be remediated. The scanner will detect and report on weaknesses in your cloud deployment.
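A heavily simplified version of the scanner's core idea is matching what is deployed against a database of known issues. Real scanners also probe the network and attempt the exploits themselves; the service names and advisory IDs below are made up for illustration.

```python
# Toy vulnerability scan: compare the software inventory of a
# deployment against a database of known-vulnerable versions and
# report anything that needs remediation. All entries are fictional.

KNOWN_VULNERABLE = {
    ("exampled", "1.0"): "EXAMPLE-0001: remote code execution",
    ("exampled", "1.1"): "EXAMPLE-0002: privilege escalation",
}

def scan(inventory):
    """inventory: list of (service_name, version) found in the deployment."""
    findings = []
    for service, version in inventory:
        issue = KNOWN_VULNERABLE.get((service, version))
        if issue:
            findings.append((service, version, issue))
    return findings

report = scan([("exampled", "1.0"), ("otherd", "2.4")])
for service, version, issue in report:
    print(f"{service} {version}: {issue}")
```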
Penetration Testing
• Penetration testing is the process of testing your cloud access to determine whether there is any vulnerability that an attacker could exploit. Pen testing is usually performed from outside your cloud deployment to assess whether systems inside your cloud can be reached from, for example, the Internet.
Load Testing
• Load testing puts load on your application or compute system and
measures the response. By performing load testing, you can
determine how your applications and cloud deployment can be
expected to perform in times of heavy production usage. Load
testing is performed to determine a system’s behavior under both
normal and expected peak load conditions.
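The idea can be sketched with a thread pool firing concurrent requests and summarizing response times. Here `handle_request` is a local stand-in that sleeps briefly; in practice you would call your application's real endpoint, and the request counts for "normal" and "peak" are illustrative.

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

# Minimal load-test sketch: issue many concurrent requests and
# summarize the observed latencies.

def handle_request(i):
    start = time.perf_counter()
    time.sleep(0.005)                 # simulated service work
    return time.perf_counter() - start

def load_test(num_requests, concurrency):
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(handle_request, range(num_requests)))
    return {
        "requests": num_requests,
        "mean_s": statistics.mean(latencies),
        "max_s": max(latencies),
    }

normal = load_test(num_requests=20, concurrency=5)    # expected normal load
peak = load_test(num_requests=100, concurrency=50)    # expected peak load
print(normal, peak)
```

Comparing the "normal" and "peak" summaries shows how latency degrades as concurrency rises, which is exactly the behavior load testing is meant to reveal before production traffic does.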
Verifying System Requirements
• After you have defined your requirements and determined which cloud service and deployment models best meet them, the next step is to select a trial application to migrate to the cloud from your existing data center.
• Prior to performing the migration, the engineering team should sit
down and review the complete design, from the application,
configuration, hardware, and networking to the storage and security.
Lecture 5
Correct Scaling for Your Requirements
• Elasticity
• On-demand
• Pay as you grow
Making Sure the Cloud Is Always Available
• Regions: a region is a geographical area of presence.
• Large cloud operations partition their operations into regions for fault tolerance and to offer localized performance advantages.
• All of the regions are interconnected to each other and to the Internet with high-speed optical networks but are isolated from each other, so an outage in one region should not affect the operations of other regions.
• When you deploy your cloud operations, you will be given a choice of which region you want to use. Also, for a global presence and to reduce network delays, a cloud customer can choose to replicate operations in multiple regions around the world.
• Availability Zones (AZs):
• Each region will usually have two or more availability zones for fault tolerance. The AZs are isolated locations within the cloud data center regions that the public cloud service providers build and operate.
• Each availability zone is a physically separate data center with its own redundant power and telecommunication connections.
Understanding Direct and Virtual Cloud
Connections
• Remote Hypervisor Access:
• Almost every hypervisor product on the market has a management application that can be installed on a workstation and used to fully configure and manage the hypervisor.
• It is important to ensure that all firewalls between your location and the cloud data center allow the management application to communicate.
• Remote Desktop Protocol (RDP):
• RDP is a proprietary protocol developed by Microsoft to allow remote access to Windows devices, letting you manage remote Windows virtual machines as if you were locally connected to the server.
• The remote desktop application comes preinstalled on most versions of Windows. You need to open TCP and UDP port 3389 in your firewalls, and you will be ready to connect.
SSH
• The Secure Shell (SSH) protocol has largely replaced Telnet as a
remote access method. SSH supports encryption.
• To use SSH, the SSH service must be supported on the server in the
cloud data center and enabled.
• The SSH client connects over the network using TCP port 22
via an encrypted connection
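Before troubleshooting an SSH session itself, it is worth confirming that TCP port 22 is actually reachable through the firewalls in the path. A minimal reachability probe might look like this; the hostname in the comment is a placeholder, and this checks only that the port accepts connections, not that SSH authentication will succeed.

```python
import socket

# Probe whether a TCP port (e.g. 22 for SSH, 3389 for RDP, 443 for
# HTTPS) is reachable from this machine.

def port_open(host, port, timeout=3.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder host):
#   port_open("vm.example.com", 22)
```

The same probe works for the other management ports mentioned in this lecture, such as RDP's 3389 and HTTPS's 443.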
• Console Port:
• Console ports are common in the networking environment for configuring switches and routers from a command-line interface.
• In a data center, devices called terminal servers are deployed that have many serial ports, each cabled to a console port on a device being managed.
• This allows you to make an SSH or Telnet connection to the terminal server and then use its serial interfaces to access the console ports on the devices you want to manage. A VM can use port redirection to connect to a physical serial port on the server for console connections.
• HTTP:
• Probably the most common and easiest way of managing remote devices is to use a standard browser to access the remote device’s web interface. Most devices now support web access over HTTPS, the secure version of HTTP, which is preferred for security reasons. When connected and authenticated, the web-based application provides a graphical interface that can be used to monitor and configure the device.
• HTTPS uses TCP port 443 and is the suggested remote access protocol for web-based access because it is secure.
Data Integrity
• Data can be encrypted at rest and in transit using strong encryption
algorithms.
• The encryption keys can be managed by the cloud providers, or you
can elect to manage your own keys if that is your company’s security
policy.
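One standard integrity technique that pairs with encryption at rest is a keyed digest: compute an HMAC when data is written, recompute it on read, and any tampering or corruption shows up as a mismatch. This is an integrity sketch only; the actual payload encryption and the key management would be handled separately, by the provider's key service or your own, per your security policy. The key below is a placeholder.

```python
import hashlib
import hmac

# Sketch of data integrity verification with HMAC-SHA-256: seal data
# when it is stored or sent, verify when it is read or received.

KEY = b"demo-key-managed-elsewhere"   # placeholder; never hard-code real keys

def seal(data: bytes) -> bytes:
    return hmac.new(KEY, data, hashlib.sha256).digest()

def verify(data: bytes, tag: bytes) -> bool:
    # compare_digest avoids timing side channels when comparing tags.
    return hmac.compare_digest(seal(data), tag)

blob = b"customer records"
tag = seal(blob)
print(verify(blob, tag))                  # True
print(verify(b"tampered records", tag))   # False
```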
Responsibility of the Cloud Service Provider
• Cloud operations are the responsibility of both your organization and the cloud service provider.
• If you are using Infrastructure as a Service, the cloud company usually takes responsibility for all cloud connections and networking up to the hypervisor level, and the cloud consumer has responsibility for the operating system and applications.
• Platform as a Service is defined as the cloud provider taking responsibility up to the operating system level and you, the customer, being responsible for the applications you choose to run on that operating system.
• With the Software as a Service model, the cloud provider assumes full responsibility up to the operating system and the application running on it.
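The shared-responsibility split described above can be expressed as a simple lookup table. The layer names are an illustrative generalization, not any provider's official matrix; each provider publishes its own exact boundaries.

```python
# Shared-responsibility split for the three service models, as a
# lookup table. Layer names are illustrative generalizations.

LAYERS = ["facility", "network", "hypervisor", "operating_system", "application"]

PROVIDER_MANAGES = {
    "IaaS": {"facility", "network", "hypervisor"},
    "PaaS": {"facility", "network", "hypervisor", "operating_system"},
    "SaaS": {"facility", "network", "hypervisor", "operating_system", "application"},
}

def customer_manages(model):
    # Whatever the provider does not manage falls to the customer.
    return [layer for layer in LAYERS if layer not in PROVIDER_MANAGES[model]]

print(customer_manages("IaaS"))  # ['operating_system', 'application']
print(customer_manages("PaaS"))  # ['application']
print(customer_manages("SaaS"))  # []
```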
