Cloud computing is Internet ("cloud") based development and use of computer technology ("computing"). It is a style of computing in which resources are provided "as a service"[1] over the Internet[2] to users who need not have knowledge of, expertise in, or control over the technology infrastructure ("in the cloud") that supports them.[3]
It is a general concept that incorporates software as a service (SaaS), Web 2.0 and other recent, well-known technology trends, in which the common theme is reliance on the Internet for satisfying the computing needs of the users. An often-quoted example is Google Apps, which provides common business applications online that are accessed from a web browser, while the software and data are stored on Google servers.
The cloud is a metaphor for the Internet, based on how it is depicted in computer network diagrams, and is an abstraction for the complex infrastructure it conceals.[4]
Brief
Comparisons
Cloud computing is often confused with grid computing ("a form of distributed computing whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers, acting in concert to perform very large tasks"), utility computing (the "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility such as electricity")[5] and autonomic computing ("computer systems capable of self-management").[6]
Indeed, many cloud computing deployments as of 2009[update] depend on grids, have autonomic characteristics and bill like utilities, but cloud computing can be seen as a natural next step from the grid-utility model.[7] Some successful cloud architectures have little or no centralised infrastructure or billing systems whatsoever, including peer-to-peer networks like BitTorrent and Skype and volunteer computing like SETI@home.[8]
Architecture
The majority of cloud computing infrastructure as of 2009[update] consists of reliable services delivered through data centers and built on servers with different levels of virtualization technology. The services are accessible anywhere in the world, with the Cloud appearing as a single point of access for all the computing needs of consumers. Commercial offerings need to meet the quality-of-service requirements of customers and typically offer service level agreements.[9] Open standards and open source software are also critical to the growth of cloud computing.[10]
Characteristics
As customers generally do not own the infrastructure (they merely access or rent it), they can avoid capital expenditure and consume resources as a service, paying only for what they use. Many cloud-computing offerings have adopted the utility computing model, which is analogous to how traditional utilities like electricity are consumed, while others are billed on a subscription basis. Sharing "perishable and intangible" computing power among multiple tenants can improve utilization rates, as servers are not left idle, which can reduce costs significantly while increasing the speed of application development. A side effect of this approach is that "computer capacity rises dramatically", as customers do not have to engineer for peak loads.[11] Adoption has been enabled by "increased high-speed bandwidth", which makes it possible to receive the same response times from centralized infrastructure at other sites.
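The difference between the two billing models can be sketched in a few lines. The rates below are hypothetical, chosen purely for illustration; real providers publish their own price schedules:

```python
# Hypothetical prices, for illustration only.
METERED_RATE = 0.10       # dollars per CPU-hour consumed (utility model)
SUBSCRIPTION_FEE = 500.0  # flat dollars per month (subscription model)

def metered_bill(cpu_hours: float) -> float:
    """Utility-style billing: pay only for what is actually used."""
    return cpu_hours * METERED_RATE

def subscription_bill(cpu_hours: float) -> float:
    """Subscription billing: a flat fee regardless of usage."""
    return SUBSCRIPTION_FEE

# A lightly used workload favors metering; a heavily used one favors the flat fee.
print(metered_bill(200))      # 200 CPU-hours under the utility model
print(subscription_bill(200)) # the same workload under a subscription
```

Under these made-up rates, the break-even point is 5,000 CPU-hours a month; below it the utility model is cheaper, above it the subscription wins, which is why providers offer both.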
Companies
Providers including Amazon, Google and Microsoft exemplify the use of cloud computing.[12] It is being adopted by everyone from individual users to large enterprises, including General Electric, L'Oréal, and Procter & Gamble.[13][14]
History
The Cloud[15] has served as a metaphor for the Internet,[16] deriving from its common depiction in network diagrams as a cloud outline.[4]
The underlying concept dates back to 1960, when John McCarthy opined that "computation may someday be organized as a public utility"; indeed, it shares characteristics with the service bureaus that date back to the 1960s. The term "cloud" had already come into commercial use in the early 1990s to refer to large ATM networks.[17] By the turn of the 21st century, the term "cloud computing" had started to appear,[18] although most of the focus at this time was on software as a service.
Amazon.com played a key role in the development of cloud computing by modernizing its data centers after the dot-com bubble and, having found that the new cloud architecture resulted in significant internal efficiency improvements, providing access to its systems by way of Amazon Web Services in 2002 on a utility computing basis.[19]
2007 saw increased activity, with Google, IBM, and a number of universities embarking on a large-scale cloud computing research project,[20] around the time the term started gaining popularity in the mainstream press. It was a hot topic by mid-2008, and numerous cloud computing events had been scheduled.[21]
In August 2008, Gartner observed that "organisations are switching from company-owned hardware and software assets to per-use service-based models" and that the "projected shift to cloud computing will result in dramatic growth in IT products in some areas and in significant reductions in other areas".[22]
Political issues
The Cloud spans many borders and "may be the ultimate form of globalization".[23] As such, it becomes subject to complex geopolitical issues: providers must satisfy a myriad of regulatory environments in order to deliver service to a global market. This dates back to the early days of the Internet, when libertarian thinkers felt that "cyberspace was a distinct place calling for laws and legal institutions of its own"; author Neal Stephenson envisaged this as a tiny island data haven called Kinakuta in his classic science-fiction novel Cryptonomicon.[23]
Despite efforts (such as the US-EU Safe Harbor) to harmonise the legal environment, providers such as Amazon Web Services cater as of 2009[update] to the major markets (typically the United States and the European Union) by deploying local infrastructure and allowing customers to select "availability zones".[24] Nonetheless, there are still concerns about security and privacy at every level from the individual to the governmental, e.g., over the USA PATRIOT Act, the use of national security letters, and the Electronic Communications Privacy Act's Stored Communications Act.
Legal issues
In March 2007, Dell applied to trademark the term "cloud computing" (U.S. Trademark 77,139,082) in the United States. The "Notice of Allowance" it received in July 2008 was canceled on August 6, resulting in a formal rejection of the trademark application less than a week later.
Richard Stallman, founder of the Free Software Foundation, believes that cloud computing endangers liberties because users sacrifice their privacy and personal data to a third party.[25] In November 2007, the Free Software Foundation released the Affero General Public License, a version of GPLv3 designed to close a perceived legal loophole associated with free software designed to be run over a network, particularly software as a service. An application service provider is required to release any changes it makes to Affero GPL open source code.
Risk mitigation
Corporations or end-users wishing to avoid losing their data, or being unable to access it, should research vendors' policies on data security before using their services. The technology analyst and consulting firm Gartner lists seven security issues which one should discuss with a cloud-computing vendor:
1. Privileged user access: inquire about who has specialized access to data and about the hiring and management of such administrators.
2. Regulatory compliance: make sure the vendor is willing to undergo external audits and/or security certifications.
3. Data location: ask if the provider allows for any control over the location of data.
4. Data segregation: make sure that encryption is available at all stages and that these "encryption schemes were designed and tested by experienced professionals".
5. Recovery: find out what will happen to data in the case of a disaster; do they offer complete restoration and, if so, how long that would take.
6. Investigative support: inquire whether the vendor has the ability to investigate any inappropriate or illegal activity.
7. Long-term viability: ask what will happen to data if the company goes out of business; how will data be returned, and in what format.[26]
In practice, one can best determine data-recovery capabilities by experiment: ask to get back old data, see how long it takes, and verify that the checksums match the original data. Determining data security is harder. A tactic not covered by Gartner is to encrypt the data yourself. If you encrypt the data using a trusted algorithm, then regardless of the service provider's security and encryption policies, the data will only be accessible with the decryption keys. This leads to a follow-on problem: managing private keys in a pay-on-demand computing infrastructure.
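The checksum half of that experiment can be sketched in a few lines. The sample data and names below are hypothetical, and the encryption step itself would use a vetted cryptographic library rather than anything hand-rolled; this only shows the fingerprint-and-verify pattern:

```python
import hashlib

def checksum(data: bytes) -> str:
    """Return the SHA-256 hex digest of a blob, used to fingerprint it."""
    return hashlib.sha256(data).hexdigest()

# Before handing data to the provider, record its checksum locally.
original = b"quarterly-report-contents"  # stand-in for the real payload
stored_digest = checksum(original)

# Later, after asking the vendor to return the data, verify integrity
# by recomputing the digest and comparing it to the stored value.
retrieved = b"quarterly-report-contents"  # what the provider sent back
assert checksum(retrieved) == stored_digest, "retrieved data does not match original"
```

Keeping the digests outside the provider's infrastructure matters: if the fingerprints live on the same service as the data, a corruption or compromise there can silently invalidate both.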
Key characteristics
Customers minimize capital expenditure; this lowers barriers to entry, as infrastructure is owned by the provider and does not need to be purchased for one-time or infrequent intensive computing tasks.
Services are typically available to, or specifically targeted at, retail consumers and small businesses.
Device and location independence[27] enable users to access systems regardless of their location or what device they are using, e.g., PC, mobile.
Multi-tenancy enables sharing of resources and costs among a large pool of users, allowing for: centralization of infrastructure in areas with lower costs (such as real estate, electricity, etc.); peak-load capacity increases (users need not engineer for the highest possible load levels); utilisation and efficiency improvements for systems that are often only 10-20% utilised;[19] and on-demand allocation and de-allocation of CPU, storage and network bandwidth.
Performance is monitored and consistent, but can suffer from insufficient bandwidth or high network load.
Reliability improves through the use of multiple redundant sites, which makes cloud computing suitable for business continuity and disaster recovery.[28] Nonetheless, most major cloud computing services have suffered outages, and IT and business managers are able to do little when they are affected.[29][30]
Scalability meets changing user demands quickly, without users having to engineer for peak loads.
Security typically improves due to centralization of data,[31] increased security-focused resources, etc., but raises concerns about loss of control over certain sensitive data. Providers typically log accesses, but accessing the audit logs themselves can be difficult or impossible.
Sustainability comes about through improved resource utilisation, more efficient systems, and carbon neutrality.[32][33] Nonetheless, computers and associated infrastructure are major consumers of energy.