Cloud Computing
Cloud computing refers to the provision of computational resources on demand via a computer network.
Cloud computing can be compared to the supply of electricity and gas, or the provision of telephone, television
and postal services. All of these services are presented to the users in a simple way that is easy to understand
without the users needing to know how the services are provided. This simplified view is called an abstraction.
Similarly, cloud computing offers computer application developers and users an abstract view of services that
simplifies and ignores much of the details and inner workings. A provider's offering of abstracted Internet
services is often called "The Cloud".
Contents
• 1 How it works
• 2 Technical Description
• 3 Overview
o 3.1 Comparisons
o 3.2 Characteristics
o 3.3 Architecture
• 4 History
• 5 Key characteristics
• 6 Layers
o 6.1 Client
o 6.2 Application
o 6.3 Platform
o 6.4 Infrastructure
o 6.5 Server
• 7 Deployment models
• 8 Cloud engineering
• 9 Cloud storage
• 10 The Intercloud
• 11 Issues
o 11.1 Privacy
o 11.2 Compliance
o 11.3 Legal
o 11.4 Open source
o 11.5 Open standards
o 11.6 Security
• 12 Research
• 13 See also
• 14 References
• 15 External links
How it works
When a user accesses the cloud for a popular website, many things can happen. The user's IP address, for
example, can be used to establish where the user is located (geolocation). DNS services can then direct the
user to a cluster of servers close to the user, so the site can be accessed rapidly and in the local language. The
user does not log in to a server, but to the service they are using, by obtaining a session ID and/or
a cookie which is stored in their browser.
What the user sees in the browser usually comes from a cluster of web servers. The web servers run
software that presents the user with an interface used to collect commands or instructions from the
user (clicks, typing, uploads, etc.). These commands are then interpreted by web servers or processed by
application servers. Information is stored on or retrieved from the database servers or file servers, and the
user is presented with an updated page. The data across the multiple servers is synchronised around the
world for rapid global access.
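The flow described above, logging in once to obtain a session ID, then having later commands interpreted by web or application servers against a database tier, can be sketched in miniature. All names and the in-memory "tiers" below are hypothetical stand-ins, not any real provider's API:

```python
import uuid

# In-memory stand-ins for the server tiers described above (hypothetical).
SESSIONS = {}                                  # session store on the web tier
DATABASE = {"alice": {"page": "home"}}         # database-server tier

def login(username):
    """Web tier: authenticate the user and hand back a session ID,
    which the browser would keep as a cookie."""
    session_id = str(uuid.uuid4())
    SESSIONS[session_id] = username
    return session_id

def handle_request(session_id, command):
    """Application tier: interpret a user command (a click, some typing)
    and read or write the database tier on the user's behalf."""
    user = SESSIONS.get(session_id)
    if user is None:
        return "401: please log in"            # no valid session cookie
    if command.startswith("visit "):
        DATABASE[user]["page"] = command.split(" ", 1)[1]
    return f"200: {user} is on page {DATABASE[user]['page']}"
```

The point of the sketch is that the user holds a session ID, not a connection to any particular machine; any server in the cluster that can see the session store can answer the next request.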
Technical Description
Cloud computing is computation, software, data access, and storage services that do not require end-user
knowledge of the physical location and configuration of the system that delivers the services. Parallels to this
concept can be drawn with the electricity grid where end-users consume power resources without any
necessary understanding of the component devices in the grid required to provide the service.
Cloud computing describes a new supplement, consumption, and delivery model for IT services based on
Internet protocols, and it typically involves provisioning of dynamically scalable and often virtualized resources.[1][2]
It is a byproduct and consequence of the ease-of-access to remote computing sites provided by the
Internet.[3] This frequently takes the form of web-based tools or applications that users can access and use
through a web browser as if they were programs installed locally on their own computers.[4]
The National Institute of Standards and Technology (NIST) provides a somewhat more objective and specific
definition:
Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of
configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be
rapidly provisioned and released with minimal management effort or service provider interaction.[5]
Typical cloud computing providers deliver common business applications online that are accessed from
another Web service or software like a Web browser, while the software and data are stored on servers.
Most cloud computing infrastructures consist of services delivered through common centers and built on
servers. Clouds often appear as single points of access for consumers' computing needs. Commercial offerings
are generally expected to meet quality of service (QoS) requirements of customers, and typically
include service level agreements (SLAs).[6]
Overview
Comparisons
Cloud computing derives characteristics from earlier models such as utility computing, but should not be confused with them.
Characteristics
The key characteristic of cloud computing is that the computing is "in the cloud"; i.e., the processing (and the
related data) is not in a specified, known or static place. This is in contrast to a model in which the
processing takes place in one or more specific, known servers. All the other concepts mentioned are
supplementary or complementary to this one.
Architecture
(Figure: Cloud computing sample architecture)
Cloud architecture,[12] the systems architecture of the software systems involved in the delivery of cloud
computing, typically involves multiple cloud components communicating with each other over application
programming interfaces, usually web services and 3-tier architecture. This resembles the Unix philosophy of
having multiple programs, each doing one thing well, working together over universal interfaces. Complexity
is controlled, and the resulting systems are more manageable than their monolithic counterparts.
The two most significant components of cloud computing architecture are known as the front end and the back
end. The front end is the part seen by the client, i.e. the computer user. This includes the client’s network (or
computer) and the applications used to access the cloud via a user interface such as a web browser. The back
end of the cloud computing architecture is the ‘cloud’ itself, comprising various computers, servers and data
storage devices.
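The front-end/back-end split above can be illustrated with a toy JSON API boundary (the action name and file list are invented for illustration): the front end never touches the back end's storage directly, only its serialized interface, just as a browser talks to web services.

```python
import json

def back_end(request_json):
    """The 'cloud' side: parse an API request and answer with JSON.
    Its internal storage is invisible to the client."""
    files = ["report.txt", "photo.jpg"]        # hypothetical stored data
    request = json.loads(request_json)
    if request.get("action") == "list_files":
        return json.dumps({"status": "ok", "files": files})
    return json.dumps({"status": "error", "reason": "unknown action"})

def front_end(action):
    """The client side: everything the user sees. It communicates with
    the back end only through the serialized JSON interface."""
    response_json = back_end(json.dumps({"action": action}))
    return json.loads(response_json)
```

Because the only contract between the two halves is the JSON schema, either side can be replaced (or moved across the network) without the other noticing, which is what makes the front end/back end division useful.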
History
The term "cloud" is used as a metaphor for the Internet, based on the cloud drawing used in the past to
represent the telephone network,[13] and later to depict the Internet in computer network diagrams as
an abstraction of the underlying infrastructure it represents.[14]
The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that
"computation may someday be organized as a public utility." Almost all the modern-day characteristics of cloud
computing (elastic provision, provided as a utility, online, illusion of infinite supply), the comparison to the
electricity industry and the use of public, private, government and community forms, were thoroughly explored
in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility.
The actual term "cloud" borrows from telephony in that telecommunications companies, who until the 1990s
primarily offered dedicated point-to-point data circuits, began offering Virtual Private Network (VPN) services
with comparable quality of service but at a much lower cost. By switching traffic to balance utilization as they
saw fit, they were able to utilize their overall network bandwidth more effectively. The cloud symbol was used to
denote the demarcation point between that which was the responsibility of the provider from that of the user.
Cloud computing extends this boundary to cover servers as well as the network infrastructure.[16] The first
scholarly use of the term “cloud computing” was in a 1997 lecture by Ramnath Chellappa.
Amazon played a key role in the development of cloud computing by modernizing its data centers after
the dot-com bubble. Like most computer networks, they were using as little as 10% of their capacity at any
one time, merely to leave room for occasional spikes. Having found that the new cloud architecture resulted in
significant internal efficiency improvements, whereby small, fast-moving "two-pizza teams" could add new
features faster and more easily, Amazon initiated a new product development effort to provide cloud computing
to external customers, and launched Amazon Web Services (AWS) on a utility computing basis in 2006.[17][18]
In 2007, Google, IBM and a number of universities embarked on a large scale cloud computing research
project.[19] In early 2008, Eucalyptus became the first open source AWS API compatible platform for deploying
private clouds. In early 2008, OpenNebula, enhanced in the RESERVOIR European Commission funded
project, became the first open source software for deploying private and hybrid clouds and for the federation of
clouds.[20] In the same year, efforts were focused on providing QoS guarantees (as required by real-time
interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission
funded project.[21] By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship
among consumers of IT services, those who use IT services and those who sell them"[22] and observed that
"[o]rganisations are switching from company-owned hardware and software assets to per-use service-based
models" so that the "projected shift to cloud computing ... will result in dramatic growth in IT products in some
areas and significant reductions in other areas."[23]
Key characteristics
Layers
Once an Internet Protocol connection is established among several computers, it is possible to share services
within any one of the following layers.
Client
A cloud client consists of computer hardware and/or computer software that relies on cloud computing for
application delivery, or that is specifically designed for delivery of cloud services and that, in either case, is
essentially useless without it. Examples include some computers, phones and other devices, operating
systems and browsers.[34][35][36][37][38]
Application
Cloud application services or "Software as a Service (SaaS)" deliver software as a service over the Internet,
eliminating the need to install and run the application on the customer's own computers and simplifying
maintenance and support. People tend to use the terms "SaaS" and "cloud" interchangeably, when in fact they
are two different things.[citation needed] Key characteristics include:[39][clarification needed]
Network-based access to, and management of, commercially available (i.e.,
not custom) software
Activities that are managed from central locations rather than at each
customer's site, enabling customers to access applications remotely via the
Web
Platform
Cloud platform services or "Platform as a Service (PaaS)" deliver a computing platform and/or solution stack as
a service, often consuming cloud infrastructure and sustaining cloud applications.[40] It facilitates deployment of
applications without the cost and complexity of buying and managing the underlying hardware and software
layers.[41][42]
Infrastructure
Cloud infrastructure often takes the form of a tier 3 data center with many tier 4 attributes, assembled from
hundreds of virtual machines.
Server
The server layer consists of computer hardware and/or computer software products that are specifically
designed for the delivery of cloud services, including multi-core processors, cloud-specific operating systems
and combined offerings.[34][44][45][46]
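The five layers above form a stack: a consumer at one layer delegates everything below it to the provider. A minimal sketch of that relationship, with illustrative one-line descriptions drawn from this section:

```python
# The cloud service stack described in this section, top to bottom.
STACK = [
    ("Client",         "browser or device accessing the cloud"),
    ("Application",    "SaaS: software delivered over the Internet"),
    ("Platform",       "PaaS: computing platform / solution stack"),
    ("Infrastructure", "IaaS: virtual machines in a data center"),
    ("Server",         "hardware and cloud-specific operating systems"),
]

def provider_layers(consumer_layer):
    """Everything below the consumer's own layer is the provider's concern."""
    names = [name for name, _ in STACK]
    return names[names.index(consumer_layer) + 1:]
```

For example, a PaaS customer operating at the Platform layer leaves Infrastructure and Server to the provider, which is exactly the "without the cost and complexity of buying and managing the underlying hardware and software layers" property described above.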
Deployment models
(Figure: Cloud computing types)
Public cloud
Public cloud or external cloud describes cloud computing in the traditional mainstream sense, whereby
resources are dynamically provisioned on a fine-grained, self-service basis over the Internet, via web
applications/web services, from an off-site third-party provider who bills on a fine-grained utility
computing basis.[26]
Community cloud
A community cloud may be established where several organizations have similar requirements and seek to
share infrastructure so as to realize some of the benefits of cloud computing. With the costs spread over fewer
users than a public cloud (but more than a single tenant) this option is more expensive but may offer a higher
level of privacy, security and/or policy compliance. Examples of community clouds include Google's "Gov
Cloud".[47]
Hybrid cloud
A hybrid storage cloud uses a combination of public and private storage clouds. Hybrid storage clouds are
often useful for archiving and backup functions, allowing local data to be replicated to a public cloud.[50]
Another perspective on deploying a web application in the cloud is using Hybrid Web Hosting, where the
hosting infrastructure is a mix between cloud hosting and managed dedicated servers – this is most commonly
achieved as part of a web cluster in which some of the nodes are running on real physical hardware and some
are running on cloud server instances.[citation needed]
Combined cloud
Two clouds that have been joined together are more correctly called a "combined cloud". A combined cloud
environment consisting of multiple internal and/or external providers[51] "will be typical for most enterprises".[52]
By integrating multiple cloud services users may be able to ease the transition to public cloud services while
avoiding issues such as PCI compliance.[53]
Private cloud
Douglas Parkhill first described the concept of a "private computer utility" in his 1966 book The Challenge of
the Computer Utility. The idea was based upon direct comparison with other industries (e.g. the electricity
industry) and the extensive use of hybrid supply models to balance and mitigate risks.
"Private cloud" and "internal cloud" have been described as neologisms, but the concepts themselves pre-date
the term cloud by 40 years. Even within modern utility industries, hybrid models still exist despite the formation
of reasonably well-functioning markets and the ability to combine multiple providers.
Some vendors have used the terms to describe offerings that emulate cloud computing on private networks.
These (typically virtualization automation) products offer the ability to host applications or virtual machines in a
company's own set of hosts. These provide the benefits of utility computing – shared hardware costs, the ability
to recover from failure, and the ability to scale up or down depending upon demand.
Private clouds have attracted criticism because users "still have to buy, build, and manage them" and thus do
not benefit from lower up-front capital costs and less hands-on management,[52] essentially "[lacking] the
economic model that makes cloud computing such an intriguing concept".[54][55] Enterprise IT organizations use
their own private cloud(s) for mission-critical and other operational systems to protect critical infrastructures.[56]
Cloud engineering
Main article: Cloud engineering
Cloud engineering is the application of a systematic, disciplined, quantifiable, and interdisciplinary approach to
the ideation, conceptualization, development, operation, and maintenance of cloud computing, as well as the
study and applied research of the approach; i.e., the application of engineering to the cloud. It is a maturing and
evolving discipline that facilitates the adoption, operation, standardization, and governance of cloud solutions,
leading towards a cloud ecosystem. Cloud engineering is also known as cloud service engineering.
Cloud storage
Main article: Cloud storage
Cloud storage is a model of networked computer data storage where data is stored on multiple virtual servers,
generally hosted by third parties, rather than being hosted on dedicated servers. Hosting companies operate
large data centers, and people who require their data to be hosted buy or lease storage capacity from them and
use it for their storage needs. The data center operators, in the background, virtualize the resources according
to the requirements of the customer and expose them as virtual servers, which the customers can themselves
manage. Physically, the resource may span multiple servers.
The Intercloud
Main article: Intercloud
The Intercloud[57] is an interconnected global "cloud of clouds"[58][59] and an extension of the Internet "network of
networks" on which it is based.[60] The term was first used in the context of cloud computing in 2007 when Kevin
Kelly stated that "eventually we'll have the intercloud, the cloud of clouds. This Intercloud will have the
dimensions of one machine comprising all servers and attendant cloudbooks on the planet."[58] The term became
popular in 2009[61] and has also been used to describe the datacenter of the future.[62]
The Intercloud scenario is based on the key observation that no single cloud has infinite physical
resources. If a cloud saturates the computational and storage resources of its virtualization infrastructure, it
might not be able to satisfy further requests for service allocations from its clients. The Intercloud scenario
aims to address this situation: in theory, each cloud can use the computational and storage resources of
the virtualization infrastructures of other clouds. Such a form of pay-for-use may introduce new business
opportunities among cloud providers, if they manage to move beyond the theoretical framework. Nevertheless, the
Intercloud raises many more challenges than solutions concerning cloud federation, security, interoperability,
quality of service, vendor lock-in, trust, legal issues, monitoring and billing.[citation needed]
The concept of a competitive utility computing market which combined many computer utilities together was
originally described by Douglas Parkhill in his 1966 book, The Challenge of the Computer Utility. This concept
has been used many times over the last 40 years and is identical to the Intercloud.
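The saturation-and-overflow mechanism the Intercloud scenario describes can be sketched as a toy federation: a cloud that cannot satisfy an allocation forwards it to a peer instead of rejecting it. Cloud names, capacities, and the `allocate` interface are all hypothetical:

```python
class Cloud:
    """A toy cloud with finite capacity and a list of federated peers."""

    def __init__(self, name, capacity):
        self.name, self.capacity, self.used = name, capacity, 0
        self.peers = []

    def allocate(self, units):
        """Serve the request locally if capacity remains; otherwise try
        federated peers (the Intercloud idea); return None if no cloud
        anywhere can satisfy it."""
        if self.used + units <= self.capacity:
            self.used += units
            return self.name                   # served locally
        for peer in self.peers:                # overflow to other clouds
            result = peer.allocate(units)
            if result is not None:
                return result
        return None                            # no capacity anywhere
```

In this sketch the client never learns which cloud actually served it, which is where the billing, trust, and monitoring challenges listed above come from.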
Issues
Privacy
The cloud model has been criticized by privacy advocates for the greater ease with which the companies hosting
the cloud services can control, and thus monitor at will, lawfully or unlawfully, the communication and data
stored between the user and the host company. Instances such as the secret NSA program that, working
with AT&T and Verizon, recorded over 10 million phone calls between American citizens cause
uncertainty among privacy advocates about the greater powers cloud computing gives telecommunication companies to
monitor user activity.[63] While there have been efforts (such as US-EU Safe Harbor) to "harmonize" the legal
environment, providers such as Amazon still cater to major markets (typically the United States and
the European Union) by deploying local infrastructure and allowing customers to select "availability zones."[64]
Compliance
In order to comply with regulations including FISMA, HIPAA and SOX in the United States, the Data
Protection Directive in the EU and the credit card industry's PCI DSS, users may have to
adopt community or hybrid deployment modes which are typically more expensive and may offer restricted
benefits. This is how Google is able to "manage and meet additional government policy requirements beyond
FISMA"[65][66] and Rackspace Cloud are able to claim PCI compliance.[67] Customers in the EU contracting with
cloud providers established outside the EU/EEA have to adhere to the EU regulations on export of personal
data.[68]
Many providers also obtain SAS 70 Type II certification (e.g. Amazon,[69] Salesforce.com,[70] Google[71] and
Microsoft[72]), but this has been criticised on the grounds that the hand-picked set of goals and standards
determined by the auditor and the auditee are often not disclosed and can vary widely.[73] Providers typically
make this information available on request, under non-disclosure agreement.[74]
Legal
In March 2007, Dell applied to trademark the term "cloud computing" (U.S. Trademark 77,139,082) in the
United States. The "Notice of Allowance" the company received in July 2008 was canceled in August, resulting
in a formal rejection of the trademark application less than a week later. Since 2007, the number of trademark
filings covering cloud computing brands, goods and services has increased at an almost exponential rate. As
companies sought to better position themselves for cloud computing branding and marketing efforts, cloud
computing trademark filings increased by 483% between 2008 and 2009. In 2009, 116 cloud computing
trademarks were filed, and trademark analysts predict that over 500 such marks could be filed during 2010.[75]
Other legal cases may shape the use of cloud computing by the public sector. On October 29, 2010, Google
filed a lawsuit against the U.S. Department of Interior, which opened up a bid for software that required that
bidders use Microsoft's Business Productivity Online Suite. Google sued, calling the requirement "unduly
restrictive of competition."[76] Scholars have pointed out that, beginning in 2005, the prevalence of open
standards and open source may have an impact on the way that public entities choose to select vendors.[77]
Open source
Open source software has provided the foundation for many cloud computing implementations.[78] In November
2007, the Free Software Foundation released the Affero General Public License, a version of GPLv3 intended
to close a perceived legal loophole associated with free software designed to be run over a network.[79]
Open standards
See also: Category:Cloud standards
Most cloud providers expose APIs which are typically well-documented (often under a Creative
Commons license[80]) but also unique to their implementation and thus not interoperable. Some vendors have
adopted others' APIs[81] and there are a number of open standards under development, including
the OGF's Open Cloud Computing Interface. The Open Cloud Consortium (OCC)[82] is working to develop
consensus on early cloud computing standards and practices.
Security
The relative security of cloud computing services is a contentious issue that may be delaying its adoption.[83]
Issues barring the adoption of cloud computing are due in large part to the private and public sectors' unease
surrounding the external management of security-based services. It is the very nature of cloud computing
services, private or public, that they promote external management of the services provided. This gives cloud
computing service providers a strong incentive to prioritize building and maintaining strong management of
secure services.[84]
Organizations have been formed to provide standards for cloud computing services. One in particular, the
Cloud Security Alliance, is a non-profit organization formed to promote the use of best practices for providing
security assurance within cloud computing.[85]
There are also concerns about a cloud provider shutting down for financial or legal reasons, which has
happened in a number of cases.[87]
Research
A number of universities, vendors and government organizations are investing in research around the topic of
cloud computing.[92] Academic institutions include University of Melbourne (Australia), Georgia Tech, Yale,
Wayne State, Virginia Tech, University of Wisconsin–Madison, Carnegie Mellon, MIT, Indiana University,
University of Massachusetts, University of Maryland, IIT Bombay, North Carolina State University, Purdue
University, University of California, University of Washington, University of Virginia, University of Utah,
University of Minnesota, among others.[93]
Joint government, academic and vendor collaborative research projects include the IBM/Google Academic
Cloud Computing Initiative (ACCI). In October 2007, IBM and Google announced the multi-university project
designed to enhance students' technical knowledge to address the challenges of cloud computing.[94] In April
2009, the National Science Foundation joined the ACCI and awarded approximately $5 million in grants to 14
academic institutions.[95]
In July 2008, HP, Intel Corporation and Yahoo! announced the creation of a global, multi-data center, open
source test bed, called Open Cirrus,[96] designed to encourage research into all aspects of cloud computing,
service and data center management.[97] Open Cirrus partners include the NSF, the University of Illinois (UIUC),
Karlsruhe Institute of Technology, the Infocomm Development Authority (IDA) of Singapore, the Electronics and
Telecommunications Research Institute (ETRI) in Korea, the Malaysian Institute for Microelectronic
Systems (MIMOS), and the Institute for System Programming at the Russian Academy of Sciences (ISPRAS).
[98]
In September 2010, more researchers joined the HP/Intel/Yahoo Open Cirrus project for cloud computing
research. The new researchers are China Mobile Research Institute (CMRI), Spain's Supercomputing Center
of Galicia (CESGA by its Spanish acronym), Georgia Tech's Center for Experimental Research in Computer
Systems (CERCS) and China Telecom.[99][100]
In July 2010, HP Labs India announced a new cloud-based technology designed to simplify taking content and
making it mobile-enabled, even from low-end devices.[101] Called SiteonMobile, the new technology is designed
for emerging markets where people are more likely to access the internet via mobile phones rather than
computers.[102] In November 2010, HP formally opened its Government Cloud Theatre, located at the HP Labs site in
Bristol, England.[103] The demonstration facility highlights high-security, highly flexible cloud computing based on
intellectual property developed at HP Labs. The aim of the facility is to lessen fears about the security of the
cloud. HP Labs Bristol is HP’s second-largest central research location and currently is responsible for
researching cloud computing and security.[104]
The IEEE Technical Committee on Services Computing[105] in IEEE Computer Society sponsors the IEEE
International Conference on Cloud Computing (CLOUD).[106] CLOUD 2010 was held on July 5–10, 2010 in
Miami, Florida.
On March 23, 2011, Google, Microsoft, HP, Yahoo, Verizon, Deutsche Telekom and 17 other companies
formed a nonprofit organization called Open Networking Foundation, focused on providing support for a new
cloud initiative called Software-Defined Networking.[107] The initiative is meant to speed innovation through
simple software changes in telecommunications networks, wireless networks, data centers and other
networking areas.[108]
See also
• Cloud backup
• Cloud engineering
• Cloud gaming
• Data center
• Green computing
• High-performance computing
• List of cloud computing providers
• Open Data Center Alliance
• Hosted desktop
• Cloud Computing Modeling Notation (CCMN)
References
10. ^ "It's probable that you've misunderstood 'Cloud Computing' until now". TechPluto. Retrieved 2010-09-14.
16. ^ "July, 1993 meeting report from the IP over ATM working group of the IETF". Retrieved 2010-08-22.
22. ^ Amy Schurr, "Keep an eye on cloud computing", Network World, 2008-07-08, citing the Gartner report "Cloud Computing Confusion Leads to …".
27. ^ Farber, Dan (2008-06-25). "The new geek chic: Data centers". News.cnet.com. Retrieved 2010-08-22.
29. ^ "Google Apps Admins Jittery About Gmail, Hopeful About Future". Pcworld.com. 2008-08-15. Retrieved 2010-08-22.
41. ^ Jack Schofield (2008-04-17). "Google angles for business users with 'platform as a service'". London: Guardian. Retrieved 2010-08-22.
45. ^ Duffy, Jim (2009-05-12). "Cisco unveils cloud computing platform for service providers". Infoworld.com. Retrieved 2010-08-22.
49. ^ David Needle, "HP Details Hybrid Approach to Cloud Services", Internet News, 2010-11-02.
50. ^ "Managing Private and Hybrid Clouds for Data Storage", SNIA, January 2010.
51. ^ Eric Krangel (2009-02-10). "IBM Embraces Juniper For Its Smart 'Hybrid Cloud', Disses Cisco (IBM)". Businessinsider.com. Retrieved 2010-08-22.
52. ^ a b Foley, John. "Private Clouds Take Shape". Informationweek.com. Retrieved 2010-08-22.
53. ^ "Forecast for 2010: The Rise of Hybrid Clouds". Gigaom.com. 2010-01-01. Retrieved 2010-08-22.
54. ^ Haff, Gordon (2009-01-27). "Just don't call them private clouds". News.cnet.com. Retrieved 2010-08-22.
58. ^ a b "Kevin Kelly: A Cloudbook for the Cloud". Kk.org. Retrieved 2010-08-22.
60. ^ "Vint Cerf: Despite Its Age, The Internet is Still Filled with Problems". Readwriteweb.com. Retrieved 2010-08-22.
73. ^ "Amazon gets SAS 70 Type II audit stamp, but analysts not satisfied". Searchcloudcomputing.techtarget.com. 2009-11-17. Retrieved 2010-08-22.
89. ^ "Finland – First Choice for Siting Your Cloud Computing Data Center". Retrieved 2010-08-04.
94. ^ Rich Miller, "IBM, Google Team on an Enterprise Cloud". DataCenterKnowledge.com. 2008-05-02. Retrieved 2010-08-22.
96. ^ "HP, Intel and Yahoo! Create Global Cloud Computing Research Test Bed". HP news release. 2008-07-29. Retrieved 2010-08-22.
97. ^ "HP, Intel and Yahoo! Attract Leading Research Organizations to Collaborative Cloud Computing Test Bed". HP news release. June 2009.
External links