A Technical Seminar Report On
CLOUD COMPUTING
Submitted in the partial fulfillment of the requirement for the award of the Degree
BACHELOR OF ENGINEERING
in
INFORMATION SCIENCE AND ENGINEERING
of
VISVESVARAYA TECHNOLOGICAL UNIVERSITY
BELGAUM
SUBMITTED BY
[1DS05IS058]
2009
1. Introduction
Imagine a world in which computer users no longer have to run, install, or store their applications or data on their own machines; a world in which every piece of your information or data resides on the Cloud (the Internet).
As a metaphor for the Internet, "the cloud" is a familiar cliché, but
when combined with "computing", the meaning gets bigger and fuzzier.
Some analysts and vendors define cloud computing narrowly as an updated
version of utility computing: basically virtual servers available over the
Internet. Others go very broad, arguing anything you consume outside the
firewall is "in the cloud", including conventional outsourcing.
Cloud computing comes into focus only when you think about what we
always need: a way to increase capacity or add capabilities on the fly
without investing in new infrastructure, training new personnel, or licensing
new software. Cloud computing encompasses any subscription-based or
pay-per-use service that, in real time over the Internet, extends ICT's existing
capabilities.
Cloud computing is at an early stage, with a motley crew of providers
large and small delivering a slew of cloud-based services, from full-blown
applications to storage services to spam filtering. Yes, utility-style
infrastructure providers are part of the mix, but so are SaaS (software as a
service) providers such as Salesforce.com. Today, for the most part, IT must
plug into cloud-based services individually, but cloud computing aggregators
and integrators are already emerging.
2.1 Comparison:
Cloud computing is often confused with grid computing ("a form of distributed
computing whereby a 'super and virtual computer' is composed of a cluster of networked,
loosely-coupled computers, acting in concert to perform very large tasks"), utility
computing (the "packaging of computing resources, such as computation and storage, as
a metered service similar to a traditional public utility such as electricity") and autonomic
computing ("computer systems capable of self-management").
Indeed, many cloud computing deployments as of 2009 depend on grids, have autonomic
characteristics, and bill like utilities, but cloud computing can be seen as a natural next
step from the grid-utility model. Some successful cloud architectures have little or no
centralized infrastructure or billing systems whatsoever, including peer-to-peer networks
like BitTorrent and Skype and volunteer computing projects like SETI@home.
2.2 Implementation:
The majority of cloud computing infrastructure as of 2009 consists of reliable services
delivered through data centers and built on servers with different levels of virtualization
technologies. The services are accessible anywhere that has access to networking
infrastructure. The Cloud appears as a single point of access for all the computing needs
of consumers. Commercial offerings need to meet the quality of service requirements of
customers and typically offer service level agreements. Open standards are critical to the
growth of cloud computing and open source software has provided the foundation for
many cloud computing implementations.
2.3 Characteristics:
As customers generally do not own the infrastructure but merely access or rent it, they
can avoid capital expenditure and consume resources as a service, paying only for
what they use. Many cloud-computing offerings have adopted the utility computing
model, which is analogous to how traditional utilities like electricity are consumed, while
others are billed on a subscription basis. Sharing "perishable and intangible" computing
power among multiple tenants can improve utilization rates, as servers are not left idle,
which can reduce costs significantly while increasing the speed of application
development. A side effect of this approach is that "computer capacity rises dramatically"
as customers do not have to engineer for peak loads. Adoption has been enabled by
"increased high-speed bandwidth" which makes it possible to receive the same response
times from centralized infrastructure at other sites.
2.4 Economics:
Cloud computing users can avoid capital expenditure (CapEx) on hardware, software and
services, rather paying a provider only for what they use. Consumption is billed on a
utility (e.g. resources consumed, like electricity) or subscription (e.g. time based, like a
newspaper) basis with little or no upfront cost. Other benefits of this time-sharing-style
approach are low barriers to entry, shared infrastructure and costs, low management
overhead, and immediate access to a broad range of applications. Users can generally
terminate the contract at any time (thereby avoiding return on investment risk and
uncertainty) and the services are often covered by service level agreements with financial
penalties.
According to Nicholas Carr the strategic importance of information technology is
diminishing as it becomes standardized and cheaper. He argues that the cloud computing
paradigm shift is similar to the displacement of electricity generators by electricity grids
early in the 20th century.
2.5 Companies:
Providers including Amazon, Microsoft, Google, Sun and Yahoo exemplify the use of
cloud computing. It is being adopted by everyone from individual users to large
enterprises, including General Electric, L'Oréal, and Procter & Gamble.
3. History
The Cloud is a term with a long history in telephony which has, in the past decade, been
adopted as a metaphor for Internet-based services, commonly depicted in network
diagrams as a cloud outline.
The underlying concept dates back to 1960 when John McCarthy opined that
"computation may someday be organized as a public utility"; indeed it shares
characteristics with service bureaus which date back to the 1960s. The term cloud had
already come into commercial use in the early 1990s to refer to large ATM networks. By
the turn of the 21st century, the term "cloud computing" had started to appear, although
most of the focus at this time was on Software as a service (SaaS).
In 1999, Salesforce.com was established by Marc Benioff, Parker Harris, and their
colleagues. They applied many technologies of consumer web sites like Google and
Yahoo! to business applications, and pioneered the concepts of "on demand" and "SaaS"
with a real business and successful customers. A key feature of SaaS is that customers
can customize the application themselves, alone or with a small amount of help; business
users have warmly welcomed the flexibility and speed this brings to application development.
IBM extended these concepts in 2001, as detailed in the Autonomic Computing
Manifesto -- which described advanced automation techniques such as self-monitoring,
self-healing, self-configuring, and self-optimizing in the management of complex IT
systems with heterogeneous storage, servers, applications, networks, security
mechanisms, and other system elements that can be virtualized across an enterprise.
Amazon.com played a key role in the development of cloud computing by modernizing
their data centers after the dot-com bubble and, having found that the new cloud
architecture resulted in significant internal efficiency improvements, providing access to
their systems by way of Amazon Web Services on a utility computing basis in 2006.
2007 saw increased activity, with Google, IBM, and a number of universities embarking
on a large scale cloud computing research project, around the time the term started
gaining popularity in the mainstream press. It was a hot topic by mid-2008 and numerous
cloud computing events had been scheduled.
In August 2008, Gartner Research observed that "organizations are switching from
company-owned hardware and software assets to per-use service-based models" and that
the "projected shift to cloud computing will result in dramatic growth in IT products in
some areas and in significant reductions in other areas."
4. Political Issues
The Cloud spans many borders and "may be the ultimate form of globalization." As such
it becomes subject to complex geopolitical issues: providers must satisfy myriad
regulatory environments in order to deliver service to a global market. This dates back to
the early days of the Internet, where libertarian thinkers felt that "cyberspace was a
distinct place calling for laws and legal institutions of its own"; author Neal Stephenson
envisaged this as a tiny island data haven called Kinakuta in his classic science-fiction
novel Cryptonomicon.
Despite efforts (such as US-EU Safe Harbor) to harmonize the legal environment, as of
2009 providers such as Amazon Web Services cater to the major markets (typically the
United States and the European Union) by deploying local infrastructure and allowing
customers to select "availability zones." Nonetheless, there are still concerns about
security and privacy at every level, from the individual to the governmental: consider,
e.g., the USA PATRIOT Act, the use of national security letters, and the Electronic
Communications Privacy Act's Stored Communications Act.
5. Legal Issues
In March 2007, Dell applied to trademark the term "cloud computing" (U.S. Trademark
77,139,082) in the United States. The "Notice of Allowance" it received in July 2008 was
canceled on August 6, resulting in a formal rejection of the trademark application less
than a week later.
On 30 September 2008, USPTO issued a "Notice of Allowance" to CGactive LLC (U.S.
Trademark 77,355,287) for "CloudOS". A cloud operating system is a generic operating
system that "manage[s] the relationship between software inside the computer and on the
Web", such as Microsoft Azure. Good OS LLC also announced their "Cloud" operating
system on 1 December 2008.
Richard Stallman, founder of the Free Software Foundation, believes that cloud
computing endangers liberties because users sacrifice their privacy and personal data to a
third party. In November 2007, the Free Software Foundation released the Affero General
Public License, a version of GPLv3 designed to close a perceived legal loophole
associated with free software designed to be run over a network, particularly software as
a service. An application service provider is required to release any changes they make
to code licensed under the Affero GPL.
6. Risk Mitigation
Corporations or end-users wishing to avoid losing access to their data, or losing the data
outright, should research vendors' policies on data security before using their
services. One technology analyst and consulting firm, Gartner, lists seven security issues
which one should discuss with a cloud-computing vendor:
1. Privileged user access: inquire about who has specialized access to data and about the hiring and management of such administrators.
2. Regulatory compliance: make sure the vendor is willing to undergo external audits and/or security certifications.
3. Data location: ask whether the provider allows any control over the location of data.
4. Data segregation: make sure encryption is available at all stages, and that these "encryption schemes were designed and tested by experienced professionals".
5. Recovery: find out what will happen to data in the case of a disaster; do they offer complete restoration and, if so, how long it would take.
6. Investigative support: inquire whether the vendor has the ability to investigate any inappropriate or illegal activity.
7. Long-term viability: ask what will happen to data if the company goes out of business; how will data be returned, and in what format.
7. Key characteristics
8. Components
Cloud computing comprises a number of component layers, each described below:
applications, clients, infrastructure (with offerings such as BitTorrent, EC2, GoGrid,
Sun Grid, and 3tera), platforms, services, storage, and standards.
8.1 Application
A cloud application leverages the Cloud in software architecture, often eliminating the
need to install and run the application on the customer's own computer, thus alleviating
the burden of software maintenance, ongoing operation, and support. Salesforce.com's CRM application (section 13.2) is a well-known example.
8.2 Client
A cloud client consists of computer hardware and/or computer software which relies on
cloud computing for application delivery, or which is specifically designed for delivery
of cloud services and which, in either case, is essentially useless without it. Web browsers are the most common example.
8.3 Infrastructure
Cloud infrastructure, such as Infrastructure as a service, is the delivery of computer
infrastructure, typically a platform virtualization environment, as a service. Amazon EC2 (section 13.1) is a prominent example; a brief sketch of how such infrastructure is consumed programmatically follows.
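
As a concrete (if simplified) sketch of what consuming infrastructure as a service looks like, the Python fragment below starts a small EC2 instance using boto, an open source library for Amazon Web Services; the credentials and the AMI identifier are placeholders, not real values.

    # Sketch: renting infrastructure programmatically. boto is an open source
    # Python library for Amazon Web Services; the credentials and the AMI id
    # below are placeholders.
    from boto.ec2.connection import EC2Connection

    conn = EC2Connection('MY-ACCESS-KEY', 'MY-SECRET-KEY')

    # Ask EC2 for one small virtual machine; it is provisioned in minutes
    # and billed by the hour (see the Pricing discussion in section 13.1).
    reservation = conn.run_instances('ami-12345678',
                                     min_count=1, max_count=1,
                                     instance_type='m1.small')
    instance = reservation.instances[0]
    print(instance.id)

The point of the sketch is that acquiring a server becomes an API call rather than a hardware purchase.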
8.4 Platform
A cloud platform, such as Platform as a service (the delivery of a computing platform
and/or solution stack as a service), facilitates deployment of applications without the cost
and complexity of buying and managing the underlying hardware and software layers.
The Force.com platform (section 13.2) is one example.
8.5 Service
A cloud service includes "products, services and solutions that are delivered and
consumed in real-time over the Internet". For example, Web Services ("software
system[s] designed to support interoperable machine-to-machine interaction over a
network") which may be accessed by other cloud computing components, software, e.g.,
Software plus service, or end users directly. Specific examples include:
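
To make the request/response pattern concrete, the minimal sketch below consumes a hypothetical RESTful web service using only the Python standard library; the URL and the shape of the reply are illustrative assumptions, not a real endpoint.

    # Sketch: consuming a cloud service over HTTP. The endpoint is
    # hypothetical; real services follow the same pattern, typically
    # replying in JSON or XML.
    import json
    import urllib2

    response = urllib2.urlopen('http://api.example.com/v1/status')
    data = json.loads(response.read())
    print(data)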
8.6 Storage
Cloud storage involves the delivery of data storage as a service, including database-like
services, often billed on a utility computing basis, e.g., per gigabyte per month.
Amazon's Simple Storage Service (S3) and Elastic Block Storage are examples.
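
As an illustrative sketch (again using the boto library, with placeholder credentials and bucket name), storing and retrieving an object in Amazon S3 takes only a few lines; billing is then per gigabyte-month of data held, as described above.

    # Sketch: cloud storage with Amazon S3 via boto. Credentials and the
    # bucket name are placeholders.
    from boto.s3.connection import S3Connection
    from boto.s3.key import Key

    conn = S3Connection('MY-ACCESS-KEY', 'MY-SECRET-KEY')
    bucket = conn.create_bucket('my-example-bucket')

    obj = Key(bucket)
    obj.key = 'hello.txt'
    obj.set_contents_from_string('Hello from the cloud')  # upload
    print(obj.get_contents_as_string())                   # download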
9. Architecture
Cloud architecture, the systems architecture of the software systems involved in the
delivery of cloud computing, comprises hardware and software designed by a cloud
architect who typically works for a cloud integrator. It typically involves multiple cloud
components communicating with each other over application programming interfaces,
usually web services.
This closely resembles the UNIX philosophy of having multiple programs doing one
thing well and working together over universal interfaces. Complexity is controlled and
the resulting systems are more manageable than their monolithic counterparts.
Cloud architecture extends to the client, where web browsers and/or software
applications access cloud applications.
Cloud storage architecture is loosely coupled, with centralized metadata operations
enabling the data nodes to scale into the hundreds, each independently delivering data to
applications or users.
10. Types
11. Roles
11.1 Provider
A cloud computing provider or cloud computing service provider owns and operates live
cloud computing systems to deliver service to third parties. The barrier to entry is
significantly higher: capital expenditure is required, and billing and management create
some overhead. Nonetheless, significant operational efficiency and agility advantages can
be realized, even by small organizations, and server consolidation and virtualization
rollouts are already well underway. Amazon.com was the first such provider, modernizing
its data centers which, like most computer networks, were using as little as 10% of their
capacity at any one time, just to leave room for occasional spikes. This allowed small,
fast-moving groups to add new features faster and more easily, and Amazon went on to
open its systems up to outsiders as Amazon Web Services in 2002 on a utility computing basis.
11.2 User
A user is a consumer of cloud computing. The privacy of users in cloud computing has
become of increasing concern. The rights of users are also an issue, which is being
addressed via a community effort to create a bill of rights.
11.3 Vendor
A vendor sells products and services that facilitate the delivery, adoption and use of cloud
computing.
12. Standards
Cloud standards, a number of existing, typically lightweight, open standards, have
facilitated the growth of cloud computing, including:
Application
  o Communications (HTTP, XMPP)
  o Security (OAuth, OpenID, SSL/TLS)
  o Syndication (Atom)
Client
  o Browsers (AJAX)
  o Offline (HTML 5)
Implementations
  o Virtualization (OVF)
Platform
  o Solution stacks (LAMP)
Service
  o Data (XML, JSON; illustrated in the sketch after this list)
  o Web services (REST)
Storage
  o Database (Amazon SimpleDB, Google App Engine BigTable Datastore)
  o Network attached storage (MobileMe iDisk, Nirvanix CloudNAS)
  o Synchronization (Live Mesh Live Desktop component, MobileMe push functions)
  o Web service (Amazon Simple Storage Service, Nirvanix SDN)
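
As a small illustration of the data standards above, the sketch below serializes the same record as JSON, with the equivalent XML shown in a comment; the field names are invented for the example.

    # Sketch: one record expressed in the data standards listed above.
    # Field names are illustrative.
    import json

    record = {'service': 'storage', 'used_gb': 42}
    print(json.dumps(record))  # {"service": "storage", "used_gb": 42}

    # The same record as XML might read:
    # <record><service>storage</service><used_gb>42</used_gb></record>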
13.1 Amazon EC2 (Elastic Compute Cloud)

History
Amazon announced a limited public beta of EC2 on August 25, 2006. Access to EC2
was granted on a first-come, first-served basis. EC2 became generally available on
October 23, 2008, along with support for Microsoft Windows Server.
Virtual machines
EC2 uses Xen virtualization. Each virtual machine, called an "instance", functions as
a virtual private server in one of three sizes: small, large, or extra large. Amazon.com
sizes instances in terms of "EC2 Compute Units", the equivalent CPU capacity of
physical hardware: one EC2 Compute Unit equals a 1.0-1.2 GHz 2007 Opteron or
2007 Xeon processor. The system offers the following instance types:
Small Instance
The small instance (default) equates to "a system with 1.7 GB of memory, 1
EC2 Compute Unit (1 virtual core with 1 EC2 Compute Unit), 160 GB of instance
storage, 32-bit platform"
Large Instance
The large instance represents "a system with 7.5 GB of memory, 4 EC2
Compute Units (2 virtual cores with 2 EC2 Compute Units each), 850 GB of
instance storage, 64-bit platform".
Extra Large Instance
The extra large instance offers the "equivalent of a system with 15 GB of
memory, 8 EC2 Compute Units (4 virtual cores with 2 EC2 Compute Units each),
1690 GB of instance storage, 64-bit platform."
High-CPU Instance
Instances of this family have proportionally more CPU resources than memory
(RAM) and address compute-intensive applications.
High-CPU Medium Instance
Instances of this family have the following configuration:
1.7 GB of memory
5 EC2 Compute Units (2 virtual cores with 2.5 EC2 Compute Units each)
350 GB of instance storage
32-bit platform
I/O Performance: Moderate
High-CPU Extra Large Instance
Instances of this family have the following configuration:
7 GB of memory
20 EC2 Compute Units (8 virtual cores with 2.5 EC2 Compute Units each)
1690 GB of instance storage
64-bit platform
I/O Performance: High
Pricing
Amazon charges customers in two primary ways:
The hourly virtual machine rate is fixed, based on the capacity and features of the virtual
machine. Amazon advertising describes the pricing scheme as "you pay for resources you
consume," but defines resources such that an idle virtual machine is consuming
resources, as opposed to other pricing schemes where one would pay for basic resources
such as CPU time.
Customers can easily start and stop virtual machines to control charges, with Amazon
measuring with one hour granularity. Some are thus able to keep each virtual machine
running near capacity and effectively pay only for CPU time actually used.
As of March 2009, Amazon's time charge is about $73/month for the smallest virtual
machine without Windows and twelve times that for the largest one running Windows.
The data transfer charge ranges from $.10 to $.17 per gigabyte, depending on the
direction and monthly volume.
Amazon does not have monthly minimums or account maintenance charges.
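
The monthly figure quoted above follows directly from the hourly rate; the sketch below assumes the March 2009 rate of roughly $0.10 per instance-hour for the smallest Linux instance, an assumption consistent with the $73/month figure.

    # Sketch: deriving the ~$73/month charge from an assumed hourly rate.
    hourly_rate = 0.10           # dollars per instance-hour (assumed)
    hours_per_month = 24 * 30.5  # hours in an average month
    print(hourly_rate * hours_per_month)  # ~73.2 dollars per month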
Operating systems
When it launched in August 2006, the EC2 service offered Linux; it later added Sun
Microsystems' OpenSolaris and Solaris Express Community Edition. In October 2008,
EC2 added the Windows Server 2003 operating system to the list of available operating
systems.
Plans are in place for the Eucalyptus interface for the Amazon API to be packaged into
the standard Ubuntu distribution.
Persistent Storage
Amazon.com provides persistent storage in the form of Elastic Block Storage (EBS).
Users can set up and manage volumes ranging in size from 1 GB to 1 TB. An EBS volume
can be attached to one instance at a time, providing data storage that persists
independently of the instance itself.
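
A brief sketch of the EBS workflow with boto follows: create a volume of the desired size, then attach it to a single running instance. The zone, instance id, and device name are placeholders.

    # Sketch: persistent storage with Elastic Block Storage via boto.
    # Zone, instance id, and device name are placeholders.
    from boto.ec2.connection import EC2Connection

    conn = EC2Connection('MY-ACCESS-KEY', 'MY-SECRET-KEY')

    volume = conn.create_volume(10, 'us-east-1a')            # 10 GB volume in one zone
    conn.attach_volume(volume.id, 'i-12345678', '/dev/sdh')  # one instance at a time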
13.2 Salesforce.com
Origins
Salesforce.com was founded in 1999 by former Oracle executive Marc Benioff. In June
2004, the company went public on the New York Stock Exchange under the stock symbol
CRM. Initial investors in salesforce.com were Marc Benioff, Larry Ellison, Halsey
Minor, Magdalena Yesil, and Igor Sill of Geneva Venture Partners.
Current status
Salesforce.com is headquartered in San Francisco, California, with regional headquarters
in Dublin (covering Europe, Middle East, and Africa), Singapore (covering Asia Pacific
less Japan), and Tokyo (covering Japan). Other major offices are in Toronto, New York,
London, Sydney, and San Mateo, California. Salesforce.com has its services translated
into 15 different languages and currently has 43,600 customers and over 1,000,000
subscribers. In 2008, Salesforce.com ranked 43rd on the list of largest software
companies in the world.
Following the Federal takeover of Freddie Mac and Fannie Mae in September 2008, the
S&P 500 removed the two mortgage giants after Wednesday, September 10, 2008, and
added Fastenal and Salesforce.com to the index, effective after Friday, September 12,
2008.
Force.com Platform
Salesforce.com's Platform-as-a-Service product is known as the Force.com Platform. The
platform allows external developers to create add-on applications that integrate into the
main Salesforce application and are hosted on salesforce.com's infrastructure.
These applications are built using Apex (a proprietary Java-like programming language
for the Force.com Platform) and Visualforce (an XML-like syntax for building user
interfaces in HTML, AJAX or Flex).
AppExchange
Launched in 2005, AppExchange is a directory of applications built for Salesforce by
third-party developers which users can purchase and add to their Salesforce environment.
As of September 2008, there are over 800 applications available from over 450 ISVs.
Customization
Salesforce users can customize their CRM application. In the system, there are tabs such
as "Contacts", "Reports", and "Accounts". Each tab contains associated information. For
example, "Contacts" has fields like First Name, Last Name, Email, etc.
Web Services
In addition to the web interface, Salesforce offers a Web Services API that enables
integration with other systems.
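
As a hedged sketch of such an integration, the fragment below calls the login operation of the SOAP-based partner API that Salesforce exposed around 2009, using only the Python standard library; the credentials are placeholders, and production code would normally use a WSDL-generated client rather than hand-built XML.

    # Sketch: invoking the Salesforce Web Services (SOAP) API from Python.
    # Username and password are placeholders; the envelope follows the
    # partner API's login operation.
    import urllib2

    envelope = """<?xml version="1.0" encoding="utf-8"?>
    <soapenv:Envelope
        xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
        xmlns:urn="urn:partner.soap.sforce.com">
      <soapenv:Body>
        <urn:login>
          <urn:username>user@example.com</urn:username>
          <urn:password>secret-password</urn:password>
        </urn:login>
      </soapenv:Body>
    </soapenv:Envelope>"""

    request = urllib2.Request(
        'https://login.salesforce.com/services/Soap/u/16.0', envelope,
        {'Content-Type': 'text/xml; charset=utf-8', 'SOAPAction': 'login'})
    print(urllib2.urlopen(request).read())  # response carries a session id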
Reception

Early reviews of Good OS's "Cloud" operating system (see section 5) compared its user
interface to Mac OS X and noted the similarity of its browser to Google Chrome,
although the browser is actually based on a modified Mozilla Firefox.
14. Conclusion

Cloud Computing is a vast topic, and the above report gives no more than a high-level
introduction to it; it is certainly not possible in the limited space of a report to do justice
to these technologies. What is in store for this technology in the near future? Cloud
Computing is leading the industry's endeavor to bank on this revolutionary technology.
Cloud Computing brings possibilities: among other things, it simplifies IT management.