Unit 6 Cloud Platforms in Industry

Cloud Platforms in Industry :-

AWS :-
Amazon Web Services (AWS), a subsidiary of Amazon.com, has invested billions of dollars in IT
resources distributed across the globe. These resources are shared among all AWS account holders
worldwide, yet the accounts themselves are entirely isolated from each other. AWS provides on-
demand IT resources to its account holders on a pay-as-you-go pricing model with no upfront cost.
Enterprises use AWS to reduce the capital expenditure of building their own private IT infrastructure
(which can be expensive, depending on the enterprise’s size and nature). All maintenance costs
are also borne by AWS, which saves enterprises a fortune.

AWS Global Infrastructure

The AWS global infrastructure is massive and is divided into geographical regions. The geographical
regions are in turn divided into separate availability zones. Three factors come into play when
selecting a geographical region:
 Optimizing latency
 Reducing cost
 Government regulations (some services are not available in some regions)
Each region contains at least two availability zones that are physically isolated from each other,
which provides business continuity for the infrastructure, as in a distributed system. If one zone fails
to function, the infrastructure in the other availability zones remains operational. The largest region,
Northern Virginia (US East), has six availability zones. These availability zones are connected by
high-speed fiber-optic networking.
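The failover behavior described above can be sketched as a simple health-aware zone selector (a toy illustration only; the zone names and the health map are hypothetical, not an AWS API):

```python
def pick_zone(zones, healthy):
    """Return the first operational availability zone, mimicking how
    traffic is redirected when one zone fails to function."""
    for z in zones:
        if healthy.get(z, False):
            return z
    raise RuntimeError("no availability zone is operational")

# us-east-1 style zone names (illustrative only)
zones = ["us-east-1a", "us-east-1b", "us-east-1c"]
print(pick_zone(zones, {"us-east-1a": True, "us-east-1b": True}))    # us-east-1a
print(pick_zone(zones, {"us-east-1a": False, "us-east-1b": True}))   # us-east-1b
```

Because the zones are physically isolated, a failure in one zone leaves the others in the list available to take over.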
There are over 100 edge locations distributed across the globe that serve the CloudFront content
delivery network. CloudFront caches frequently used content, such as images and videos, at edge
locations and distributes it worldwide for high-speed delivery to end users. It
also protects against DDoS attacks.
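The edge-caching idea can be illustrated with a minimal TTL cache in Python (a conceptual sketch of CDN-style caching, not the CloudFront implementation; all names are made up):

```python
import time

class EdgeCache:
    """Toy content cache: serves a cached copy until its TTL expires,
    then fetches from the origin again, as a CDN edge location does."""
    def __init__(self, origin_fetch, ttl_seconds=60):
        self.origin_fetch = origin_fetch   # callable: url -> content
        self.ttl = ttl_seconds
        self.store = {}                    # url -> (content, expiry time)

    def get(self, url, now=None):
        now = time.time() if now is None else now
        hit = self.store.get(url)
        if hit and hit[1] > now:
            return hit[0], "HIT"           # served from the edge cache
        content = self.origin_fetch(url)   # cache miss: go to the origin
        self.store[url] = (content, now + self.ttl)
        return content, "MISS"

cache = EdgeCache(lambda url: f"body-of-{url}", ttl_seconds=60)
print(cache.get("/logo.png", now=0))    # ('body-of-/logo.png', 'MISS')
print(cache.get("/logo.png", now=30))   # ('body-of-/logo.png', 'HIT')
print(cache.get("/logo.png", now=100))  # expired, so 'MISS' again
```

Serving repeated requests from the edge copy instead of the origin is what gives end users high-speed delivery.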

AWS Management Console

The AWS Management Console is a web-based interface for accessing AWS. It requires an AWS account,
and a smartphone application is available for the same purpose. Cost monitoring is also done through the
console.
AWS resources can also be accessed through various Software Development Kits (SDKs), which allow
developers to create applications with AWS as the backend. There are SDKs for all the major
languages (e.g., JavaScript, Python, Node.js, .NET, PHP, Ruby, Go, C++), and there are mobile SDKs for
Android, iOS, React Native, Unity, and Xamarin. AWS can also be accessed by making HTTP calls
directly to the AWS APIs. Finally, AWS provides a Command Line Interface (CLI) for remote access,
which can be scripted to automate many processes.
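Direct HTTP calls to the AWS APIs must be authenticated with Signature Version 4. The signing-key derivation step of that process can be sketched with the standard library alone (the secret key below is a dummy value; in practice an SDK or the CLI handles this for you):

```python
import hashlib
import hmac

def sigv4_signing_key(secret_key, date, region, service):
    """Derive the AWS Signature V4 signing key by chaining HMAC-SHA256
    over the date, region, service name, and the literal 'aws4_request'."""
    def sign(key, msg):
        return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()
    k_date = sign(("AWS4" + secret_key).encode("utf-8"), date)
    k_region = sign(k_date, region)
    k_service = sign(k_region, service)
    return sign(k_service, "aws4_request")

# Dummy credentials -- illustration only, never a real secret key.
key = sigv4_signing_key("EXAMPLEKEY", "20240101", "us-east-1", "s3")
print(len(key))  # 32, i.e. one SHA-256 HMAC digest
```

The derived key is then used to sign each request; scoping it to a date, region, and service is what limits the damage if a signed request leaks.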

AWS Cloud Computing Models

There are three cloud computing models available on AWS.


1. Infrastructure as a Service (IaaS): This is the basic building block of cloud IT. It generally provides
access to data storage space, networking features, and computing hardware (virtual or dedicated).
It is highly flexible and gives the developer management control over the IT resources.
For example, VPC, EC2, EBS.
2. Platform as a Service (PaaS): In this type of service, AWS manages the underlying
infrastructure (usually the operating system and hardware). This lets developers be more efficient,
as they do not have to worry about the undifferentiated heavy lifting required to run applications,
such as capacity planning, software maintenance, resource procurement, and patching, and can focus
instead on deploying and managing their applications. For example, RDS, EMR, Elasticsearch.
3. Software as a Service (SaaS): This is a complete product that usually runs in a browser. It primarily refers
to end-user applications. It is run and managed by the service provider, so the end user only has to worry
about choosing software suited to their needs. For example, Salesforce.com, web-based
email, Office 365.

Amazon has been a leader in providing public cloud services, applying the
IaaS model in particular. EC2 provides the virtualized platforms that host the
VMs on which cloud applications run. S3 (Simple Storage Service) provides an
object storage service for users. EBS (Elastic Block Store) provides a block
storage interface that can support traditional applications. SQS stands for
Simple Queue Service, and its job is to ensure reliable messaging between two
processes. Amazon (like Azure) offers a Relational Database Service (RDS) with a
messaging interface. AWS Import/Export allows one to ship large volumes of data to and
from EC2 on physical disks.
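The reliable-delivery idea behind SQS can be mimicked with a visibility timeout: a received message is hidden rather than deleted, and reappears if the consumer never confirms it. The toy model below illustrates that semantic only (it is not the SQS API):

```python
class ToyQueue:
    """Minimal queue with SQS-like semantics: receive() hides a message
    for `visibility` seconds; delete() removes it for good; if a consumer
    crashes before deleting, the message becomes visible again."""
    def __init__(self, visibility=30):
        self.visibility = visibility
        self.messages = []            # list of [body, invisible_until]

    def send(self, body):
        self.messages.append([body, 0])

    def receive(self, now):
        for m in self.messages:
            if m[1] <= now:           # message currently visible?
                m[1] = now + self.visibility
                return m[0]
        return None

    def delete(self, body):
        self.messages = [m for m in self.messages if m[0] != body]

q = ToyQueue(visibility=30)
q.send("job-1")
print(q.receive(now=0))    # 'job-1' (now hidden for 30 seconds)
print(q.receive(now=10))   # None    (still invisible)
print(q.receive(now=40))   # 'job-1' (consumer never deleted it, so it returns)
```

This is why SQS delivery is reliable: a message is only gone once the consumer explicitly deletes it after successful processing.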
Google App Engine :-
Google App Engine is Google's Platform as a Service offering that allows developers and businesses to build
and run applications using Google's advanced infrastructure. Applications must be written
in one of a few supported languages, namely Java, Python, PHP, and Go. GAE also requires the use of the
Google query language and of Google's BigTable as the database. Applications must abide by these
standards, so applications either must be developed with GAE in mind or else modified to meet the
requirements.

GAE is a platform, so it provides all of the required elements to run and host web applications, whether for
mobile or the web. Without this all-in-one offering, developers would have to source their own servers, database
software, and the APIs that make all of them work properly together, not to mention handle the entire
configuration. GAE takes this burden off developers so they can concentrate on the
app's front end and functionality, driving a better user experience.

Advantages of GAE include:

 Readily available servers with no configuration requirement


 Automatic scaling of resources, all the way down to "free" when resource usage is minimal
 Automated cloud computing tools
GAE Architecture :-

GAE is built on the Google cloud infrastructure that delivers the services highlighted earlier.
GFS is used for storing large amounts of data. MapReduce is used in application program
development. Chubby is used for distributed application lock services. BigTable offers a storage
service for accessing structured data.
Users can interact with Google applications via the web interface provided by each
application. Third-party application providers can use GAE to build cloud applications that
provide services. The applications all run in data centers under tight management by Google
engineers. Inside each data center are thousands of servers forming different clusters.
Functional Modules of GAE

The GAE platform comprises the following five major components.


 The datastore offers object-oriented, distributed, structured data storage services based on
BigTable techniques. The datastore secures data management operations.
 The application runtime environment offers a platform for scalable web programming and
execution. It supports two development languages: Python and Java.
 The software development kit (SDK) is used for local application development. The SDK
allows users to execute test runs of local applications and upload application code.
 The administration console is used for easy management of user application development
cycles, instead of for physical resource management.
 The GAE web service infrastructure provides special interfaces to guarantee flexible use
and management of storage and network resources by GAE.
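The datastore's BigTable-style access pattern — rows kept sorted by key, retrieved by exact key or by key-range scan — can be sketched with a sorted key list (a conceptual toy, not the Datastore or BigTable API; row keys below are invented):

```python
import bisect

class ToyBigtable:
    """Rows kept sorted by row key; supports point reads and prefix scans,
    the two access patterns a BigTable-backed datastore is built on."""
    def __init__(self):
        self.keys = []     # sorted row keys
        self.rows = {}     # key -> value

    def put(self, key, value):
        if key not in self.rows:
            bisect.insort(self.keys, key)   # keep keys sorted on insert
        self.rows[key] = value

    def get(self, key):
        return self.rows.get(key)

    def scan_prefix(self, prefix):
        """Return all (key, value) pairs whose key starts with prefix,
        in key order -- a cheap range scan over the sorted keys."""
        i = bisect.bisect_left(self.keys, prefix)
        out = []
        while i < len(self.keys) and self.keys[i].startswith(prefix):
            out.append((self.keys[i], self.rows[self.keys[i]]))
            i += 1
        return out

t = ToyBigtable()
t.put("user:1:name", "ada")
t.put("user:1:mail", "ada@example.com")
t.put("user:2:name", "bob")
print(t.scan_prefix("user:1:"))   # both user:1 rows, in key order
```

Designing row keys so that related data shares a prefix is what makes such scans efficient in BigTable-style storage.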
Microsoft Windows Azure
Microsoft launched the Windows Azure platform to meet the challenges of cloud computing.
This platform is built over Microsoft data centers. Windows Azure offers a cloud platform built on the
Windows OS and on Microsoft virtualization technology. Applications are installed on VMs
deployed on the data-center servers. Azure manages all servers, storage, and network resources of the
data center. On top of the infrastructure are various services for building different cloud
applications. Cloud-level services provided by the Azure platform are introduced below.

• Live service: Users can access Microsoft Live applications and apply the data involved across multiple
machines concurrently.
• .NET service: This package supports application development on local hosts and execution on cloud
machines.
• SQL Azure: This makes it easy for users to access and use the relational database associated with
the SQL Server instance in the cloud.
• SharePoint service: This provides a scalable and manageable platform for users to develop their special
business applications as upgraded web services.
• Dynamics CRM service: This provides software developers a business platform for managing CRM
applications in financing, marketing, sales, and promotions.

Aneka Cloud Platform


Aneka is used for developing, deploying, and managing cloud applications.
Aneka can be integrated with existing cloud technologies.
Aneka includes an extensible set of APIs associated with programming models such as MapReduce.
These APIs support different types of cloud deployments: private, public, and hybrid clouds.

Aneka framework:
Aneka is a software platform for developing cloud computing applications.
In Aneka, cloud applications are executed.
Aneka is a pure PaaS solution for cloud computing.
Aneka is a cloud middleware product.
Aneka can be deployed on a network of computers, a multicore server, data centers, virtual cloud
infrastructures, or a mixture of these.

Aneka container can be classified into three major categories:


1. Fabric Services
2. Foundation Services
3. Application Services
1. Fabric services:
Fabric Services define the lowest level of the software stack representing the Aneka Container.
They provide access to the resource-provisioning subsystem and to the monitoring facilities
implemented in Aneka.
2. Foundation services:
Foundation Services are related to the logical management of the distributed system built on top of
the infrastructure and provide supporting services for the execution of distributed applications.
3. Application services:
Application Services manage the execution of applications and constitute a layer that
differentiates according to the specific programming model used for developing distributed
applications on top of Aneka.
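The three layers can be sketched as a container that orders its services bottom-up — Fabric first, then Foundation, then Application (a conceptual toy; the class and service names are invented for illustration and are not Aneka's API):

```python
class Service:
    def __init__(self, name, layer):
        self.name, self.layer = name, layer

class ToyContainer:
    """Orders registered services by layer, mirroring how the Aneka
    container stacks Fabric, Foundation, and Application services."""
    LAYERS = ["fabric", "foundation", "application"]

    def __init__(self):
        self.services = []

    def register(self, service):
        self.services.append(service)

    def start(self):
        # Lower layers must come up first: fabric, then foundation,
        # then the programming-model-specific application services.
        ordered = sorted(self.services,
                         key=lambda s: self.LAYERS.index(s.layer))
        return [s.name for s in ordered]

c = ToyContainer()
c.register(Service("TaskScheduling", "application"))
c.register(Service("Membership", "foundation"))
c.register(Service("ResourceProvisioning", "fabric"))
print(c.start())  # ['ResourceProvisioning', 'Membership', 'TaskScheduling']
```

The ordering captures the dependency described above: application services rely on foundation services, which in turn rely on the fabric.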
Protein structure prediction in Cloud Computing :-

Cloud computing is an emerging technology that provides various computing services on demand. It
provides convenient access to a shared pool of higher-level services and other system resources.
Nowadays, cloud computing has great significance in geology, biology, and other
scientific research areas.

Protein structure prediction is a prime example of a research problem that uses cloud applications for
its computation and storage.

A protein is composed of long chains of amino acids joined together by peptide bonds. Protein
structure prediction is the inference of a protein's three-dimensional structure from its amino acid
sequence; the predicted structures help in the design of new drugs.

First the primary structure of a protein is determined, and then the secondary, tertiary, and
quaternary structures are predicted from it. Protein structure prediction also makes use of
technologies such as artificial neural networks, artificial intelligence, machine learning, and
probabilistic techniques, and it holds great importance in fields like theoretical chemistry and
bioinformatics.
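The primary structure mentioned above is simply the linear amino-acid sequence. A minimal example of handling one, using the standard one-letter residue codes (the sequence below is a made-up toy, not a real protein):

```python
from collections import Counter

AMINO_ACIDS = set("ACDEFGHIKLMNPQRSTVWY")  # the 20 standard one-letter codes

def primary_structure_stats(sequence):
    """Validate a primary-structure string and report its length, residue
    composition, and peptide-bond count (n residues form n-1 bonds)."""
    seq = sequence.upper()
    if not set(seq) <= AMINO_ACIDS:
        raise ValueError("unknown residue code in sequence")
    return {
        "residues": len(seq),
        "peptide_bonds": max(len(seq) - 1, 0),
        "composition": Counter(seq),
    }

stats = primary_structure_stats("MKTAYIAK")       # toy sequence
print(stats["residues"], stats["peptide_bonds"])  # 8 7
```

Prediction pipelines start from exactly this kind of parsed sequence before inferring secondary and higher-order structure.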

Various algorithms and tools exist for protein structure prediction. CASP (Critical
Assessment of Protein Structure Prediction) is a well-known community experiment that assesses
prediction methods, including automated web servers, and results are published on cloud-hosted
services such as the CAMEO (Continuous Automated Model Evaluation) server. These servers can be
accessed by anyone, as required, from any place. Some of the tools or servers used in protein
structure prediction are Phobius, FoldX, LOMETS, Prime, PredictProtein, SignalP, BBSP, EVfold,
Biskit, HHpred, Phyre, and ESyPred3D. Using these tools, new structures are predicted and the
results are placed on cloud-based servers.

Data Analysis :-
Although many groups, organizations, and experts approach data analysis differently, most approaches
can be distilled into a one-size-fits-all definition: data analysis is the process of cleaning, transforming,
and processing raw data to extract actionable, relevant information that helps businesses make
informed decisions. The procedure reduces the risks inherent in decision-making by providing useful
insights and statistics, often presented in charts, images, tables, and graphs.
It’s not uncommon to hear the term “big data” brought up in discussions about data analysis. Data analysis
plays a crucial role in processing big data into useful information. Neophyte data analysts who want to dig
deeper by revisiting big data fundamentals should go back to the basic question, “What is data?”

Data Analysis Importance :-


Here is a list of reasons why data analysis is such a crucial part of doing business today.
 Better Customer Targeting: You don’t want to waste your business’s precious time, resources, and
money putting together advertising campaigns targeted at demographic groups that have little to no
interest in the goods and services you offer. Data analysis helps you see where you should be focusing
your advertising efforts.
 You Will Know Your Target Customers Better: Data analysis tracks how well your products and
campaigns are performing within your target demographic. Through data analysis, your business can get
a better idea of your target audience’s spending habits, disposable income, and most likely areas of
interest. This data helps businesses set prices, determine the length of ad campaigns, and even help
project the quantity of goods needed.
 Reduce Operational Costs: Data analysis shows you which areas in your business need more resources
and money, and which areas are not producing and thus should be scaled back or eliminated outright.
 Better Problem-Solving Methods: Informed decisions are more likely to be successful decisions. Data
provides businesses with information. You can see where this progression is leading. Data analysis helps
businesses make the right choices and avoid costly pitfalls.
 You Get More Accurate Data: If you want to make informed decisions, you need data, but there’s more
to it. The data in question must be accurate. Data analysis helps businesses acquire relevant, accurate
information, suitable for developing future marketing strategies, business plans, and realigning the
company’s vision or mission.

Data Analysis Process :-


The data analysis process, or alternately, data analysis steps, involves gathering all the information,
processing it, exploring the data, and using it to find patterns and other insights.
The process consists of:

 Data Requirement Gathering: Ask yourself why you’re doing this analysis, what type of data analysis
you want to use, and what data you are planning on analyzing.
 Data Collection: Guided by the requirements you’ve identified, it’s time to collect the data from your
sources. Sources include case studies, surveys, interviews, questionnaires, direct observation, and focus
groups. Make sure to organize the collected data for analysis.
 Data Cleaning: Not all of the data you collect will be useful, so it’s time to clean it up. This process is
where you remove white spaces, duplicate records, and basic errors. Data cleaning is mandatory before
sending the information on for analysis.
 Data Analysis: Here is where you use data analysis software and other tools to help you interpret and
understand the data and arrive at conclusions. Data analysis tools include Excel, Python, R, Looker,
Rapid Miner, Chartio, Metabase, Redash, and Microsoft Power BI.
 Data Interpretation: Now that you have your results, you need to interpret them and come up with the
best courses of action, based on your findings.
 Data Visualization: Data visualization is a fancy way of saying, “graphically show your information in a
way that people can read and understand it.” You can use charts, graphs, maps, bullet points, or a host of
other methods. Visualization helps you derive valuable insights by helping you compare datasets and
observe relationships.
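The cleaning and analysis steps above can be sketched end to end with the standard library (the survey values below are made up for illustration):

```python
from statistics import mean

raw = [" 120 ", "95", "95", "110", "  ", "abc", "130"]   # messy collected values

def clean(records):
    """Strip whitespace, drop blank or non-numeric entries, and remove
    duplicate records -- the basic cleaning pass described above."""
    seen, out = set(), []
    for r in records:
        r = r.strip()                      # remove white spaces
        if not r or not r.isdigit():       # drop blank / invalid rows
            continue
        if r in seen:                      # drop duplicate records
            continue
        seen.add(r)
        out.append(int(r))
    return out

values = clean(raw)                        # Data Cleaning
print(values)                              # [120, 95, 110, 130]
print(mean(values))                        # Data Analysis: 113.75
```

Interpretation and visualization would then operate on `values`, e.g. plotting the distribution or comparing the mean across groups.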

CRM in Cloud Computing

Cloud computing is a pooled network of computer systems sharing various resources and higher-
level services hosted on internet servers. Various cloud applications, such as Google Drive,
Google Mail, Dropbox, Microsoft SkyDrive, Prime Desk, SOS Online Backup, and many more
like these, are available free of cost to users and can be accessed anytime from any corner
of the world. Hence, cloud computing has great significance in sharing and storing large
amounts of data, and it also saves users time and energy.

What is CRM?

CRM stands for Customer Relationship Management; it is software hosted in the
cloud so that users can access their information over the internet. CRM software provides
a high level of security and scalability to its users and can easily be used on mobile phones
to access data.
Nowadays, many business vendors and service providers use CRM software
to manage resources so that users can access them via the internet. Moving business
computation from the desktop to the cloud is proving a beneficial step in both IT and non-
IT fields. Some of the major CRM vendors include Oracle Siebel, Mothernode CRM,
Microsoft Dynamics CRM, Infor CRM, Sage CRM, and NetSuite CRM.

Advantages:
Few advantages of using CRM are as follows:

 High reliability and scalability


 Easy to use
 Highly secure
 Provides flexibility to users and service providers
 Easily accessible
