
Cloud computing refers to the delivery of computing and storage capacity as a service to a heterogeneous community of end-recipients.

The name comes from the use of a cloud as an abstraction for the complex infrastructure it contains in system diagrams. Cloud computing entrusts services with a user's data, software and computation over a network. It has considerable overlap with software as a service (SaaS). End users access cloud-based applications through a web browser or a lightweight desktop or mobile app, while the business software and data are stored on servers at a remote location. Proponents claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and enables IT to more rapidly adjust resources to meet fluctuating and unpredictable business demand.[1][2] Cloud computing relies on sharing of resources to achieve coherence and economies of scale, similar to a utility (like the electricity grid), over a network (typically the Internet).[3] At the foundation of cloud computing is the broader concept of converged infrastructure and shared services.[4]

The term cloud is used as a metaphor for the Internet, based on the cloud drawing used in the past to represent the telephone network,[5] and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents.[6] In the 1990s, telecommunications companies, who previously offered primarily dedicated point-to-point data circuits, began offering virtual private network (VPN) services with comparable quality of service but at a much lower cost. By switching traffic to balance utilisation as they saw fit, they were able to utilise their overall network bandwidth more effectively. The cloud symbol was used to denote the demarcation point between that which was the responsibility of the provider and that which was the responsibility of the user. Cloud computing extends this boundary to cover servers as well as the network infrastructure.[7]

The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that "computation may someday be organised as a public utility." Almost all the modern-day characteristics of cloud computing (elastic provision, provided as a utility, online, illusion of infinite supply), the comparison to the electricity industry, and the use of public, private, government, and community forms were thoroughly explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility. Other scholars have shown that cloud computing's roots go all the way back to the 1950s, when scientist Herb Grosch (the author of Grosch's law) postulated that the entire world would operate on dumb terminals powered by about 15 large data centers.[8] An early but surprisingly complete implementation of cloud computing was developed and patented (in Germany and England) by Hardy Schloer, who termed it the "one-page web",[9] with multiple user applications, multiple identification providers, cloud storage, back-end servers with plug-in applications, a multi-tiered server architecture able to handle different user devices over the internet, and built-in security features. The ubiquitous availability of high-capacity networks, low-cost computers and storage devices, as well as the widespread adoption of hardware virtualization, service-oriented architecture, autonomic computing, and utility computing, have led to tremendous growth in cloud computing.[10][11][12]
After the dot-com bubble, Amazon played a key role in the development of cloud computing by modernising their data centers, which, like most computer networks, were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud architecture resulted in significant internal efficiency improvements, whereby small, fast-moving "two-pizza teams" could add new features faster and more easily, Amazon initiated a new product development effort to provide cloud computing to external customers, and launched Amazon Web Services (AWS) on a utility computing basis in 2006.[13][14]

In early 2008, Eucalyptus became the first open-source, AWS API-compatible platform for deploying private clouds. In the same period, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds.[15] In the same year, efforts were focused on providing quality-of-service guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission-funded project, resulting in a real-time cloud environment.[16] By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them"[17] and observed that "[o]rganisations are switching from company-owned hardware and software assets to per-use service-based models" so that the "projected shift to computing... will result in dramatic growth in IT products in some areas and significant reductions in other areas."[18] In 2012, Dr. Biju John and Dr. Souheil Khaddaj offered a semantically oriented definition of the cloud: "Cloud computing is a universal collection of data which extends over the internet in the form of resources (such as information hardware, various platforms, services etc.) and forms individual units within the virtualization environment. Held together by infrastructure providers, service providers and the consumer, it is semantically accessed by various users." (CLUSE 2012, Bangalore, April 2012)[19]

Similar systems and concepts

Cloud computing shares characteristics with:

Autonomic computing: computer systems capable of self-management.[20]
Client-server model: client-server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients).[21]
Grid computing: "a form of distributed and parallel computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks."
Mainframe computer: powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, police and secret intelligence services, enterprise resource planning, and financial transaction processing.[22]
Utility computing: the "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity."[23][24]
Peer-to-peer: distributed architecture without the need for central coordination, with participants being at the same time both suppliers and consumers of resources (in contrast to the traditional client-server model).

Characteristics

Cloud computing exhibits the following key characteristics:

Agility improves with users' ability to re-provision technological infrastructure resources.
Application programming interface (API) accessibility to software that enables machines to interact with cloud software in the same way the user interface facilitates interaction between humans and computers. Cloud computing systems typically use REST-based APIs; a brief sketch follows this list of characteristics.

Cost is claimed to be reduced, and in a public cloud delivery model capital expenditure is converted to operational expenditure.[25] This is purported to lower barriers to entry, as infrastructure is typically provided by a third party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained, with usage-based options, and fewer IT skills are required for implementation (in-house).[26] The e-FISCAL project's state-of-the-art repository[27] contains several articles looking into cost aspects in more detail, most of them concluding that cost savings depend on the type of activities supported and the type of infrastructure available in-house.
Device and location independence[28] enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third party) and accessed via the Internet, users can connect from anywhere.[26]
Virtualization technology allows servers and storage devices to be shared and utilization to be increased. Applications can be easily migrated from one physical server to another.
Multitenancy enables sharing of resources and costs across a large pool of users, thus allowing for:

Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
Peak-load capacity increases (users need not engineer for the highest possible load levels)
Utilisation and efficiency improvements for systems that are often only 10–20% utilised.[13]

Reliability is improved if multiple redundant sites are used, which makes well-designed cloud computing suitable for business continuity and disaster recovery.[29]
Scalability and elasticity via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis in near real time, without users having to engineer for peak loads.[30][31]
Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface.[26]
Security could improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data and the lack of security for stored kernels.[32] Security is often as good as or better than in traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford.[33] However, the complexity of security is greatly increased when data is distributed over a wider area or a greater number of devices, and in multi-tenant systems shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users' desire to retain control over the infrastructure and to avoid losing control of information security.
Maintenance of cloud computing applications is easier, because they do not need to be installed on each user's computer and can be accessed from different places.
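
To make the API characteristic above concrete, the following minimal Python sketch shows a REST-style interaction with a cloud provider: listing virtual machines and provisioning a new one over HTTP. The base URL, endpoint paths, token, and payload fields are hypothetical placeholders, not any real provider's API.

```python
# Minimal sketch of a REST-style interaction with a hypothetical cloud API.
# The base URL, paths, token, and fields are placeholders, not a real provider's API.
import json
import urllib.request

BASE_URL = "https://cloud.example.com/api/v1"  # hypothetical endpoint
TOKEN = "replace-with-a-real-token"            # hypothetical credential


def api_request(method, path, payload=None):
    """Send an authenticated JSON request and return the decoded response."""
    data = json.dumps(payload).encode() if payload is not None else None
    request = urllib.request.Request(
        BASE_URL + path,
        data=data,
        method=method,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode())


if __name__ == "__main__":
    # List existing virtual machines, then re-provision by requesting a new one,
    # using the same interface a human would drive through a web console.
    print(api_request("GET", "/instances"))
    print(api_request("POST", "/instances", {"size": "small", "image": "linux"}))
```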

Service models

Cloud computing providers offer their services according to three fundamental models:[3][34] infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS), where IaaS is the most basic and each higher model abstracts away the details of the models below it.

Infrastructure as a service (IaaS)

See also: Category:Cloud infrastructure
In this most basic cloud service model, cloud providers offer computers, as physical or (more often) virtual machines, along with raw (block) storage, firewalls, load balancers, and networks. IaaS providers supply these resources on demand from their large pools installed in data centers. Local area networks, including IP addresses, are part of the offer. For wide-area connectivity, the Internet can be used or, in carrier clouds, dedicated virtual private networks can be configured.

To deploy their applications, cloud users then install operating system images on the machines as well as their application software. In this model, it is the cloud user who is responsible for patching and maintaining the operating systems and application software. Cloud providers typically bill IaaS services on a utility computing basis: cost reflects the amount of resources allocated and consumed, as in the billing sketch below. Infrastructure as a Service, or an IaaS cloud, is a platform through which businesses can obtain equipment in the form of hardware, servers, storage space, etc., on a pay-per-use basis. IaaS has gathered attention among entrepreneurs largely because it helps make their business environments more organized and in sync with organizations' ongoing operational activities. When IaaS is in use, it is not a single machine that does all the work; it is simply a facility offered to business enterprises that gives users the leverage of extra storage space in servers and data centers.
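
As a rough illustration of the utility-style billing just described, the sketch below itemizes a monthly bill from metered resource consumption. The resource names and rates are invented for illustration and do not reflect any provider's actual price list.

```python
# Sketch of utility-style IaaS billing: cost reflects resources allocated and consumed.
# Resource names and rates are invented for illustration only.
RATES = {
    "vm_hours": 0.05,           # price per virtual-machine hour
    "storage_gb_months": 0.10,  # price per GB-month of block storage
    "egress_gb": 0.09,          # price per GB of outbound traffic
}


def monthly_bill(usage):
    """Return (itemized charges, total) for a dict of metered consumption."""
    items = {resource: amount * RATES[resource] for resource, amount in usage.items()}
    return items, sum(items.values())


if __name__ == "__main__":
    items, total = monthly_bill(
        {"vm_hours": 720, "storage_gb_months": 200, "egress_gb": 50}
    )
    for resource, cost in items.items():
        print(f"{resource:>20}: ${cost:.2f}")
    print(f"{'total':>20}: ${total:.2f}")
```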

Platform as a service (PaaS)

Main article: Platform as a service
See also: Category:Cloud platforms
In the PaaS model, cloud providers deliver a computing platform and/or solution stack, typically including an operating system, programming language execution environment, database, and web server. Application developers can develop and run their software solutions on a cloud platform without the cost and complexity of buying and managing the underlying hardware and software layers. With some PaaS offers, the underlying compute and storage resources scale automatically to match application demand, so that the cloud user does not have to allocate resources manually; a simple scaling policy is sketched below.
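
The automatic scaling that some PaaS offers provide can be pictured as a simple threshold policy: add an instance when average load is high, remove one when it is low. The metric, thresholds, and bounds in this sketch are assumptions chosen for illustration, not a description of any particular platform.

```python
# Sketch of a threshold-based autoscaling policy, of the kind a PaaS platform might
# apply on the user's behalf. The metric, thresholds, and bounds are illustrative.
def desired_instances(current, avg_cpu_percent, scale_up_at=70, scale_down_at=30,
                      minimum=1, maximum=20):
    """Return the instance count a simple policy would target next."""
    if avg_cpu_percent > scale_up_at and current < maximum:
        return current + 1   # add capacity before users notice slowdowns
    if avg_cpu_percent < scale_down_at and current > minimum:
        return current - 1   # release idle capacity to cut cost
    return current           # load is within the comfortable band


if __name__ == "__main__":
    instances = 2
    for cpu in [40, 75, 85, 90, 60, 25, 20]:  # simulated load readings
        instances = desired_instances(instances, cpu)
        print(f"avg CPU {cpu:3d}% -> {instances} instance(s)")
```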

Software as a service (SaaS)

Main article: Software as a service
In this model, cloud providers install and operate application software in the cloud, and cloud users access the software from cloud clients. The cloud users do not manage the cloud infrastructure and platform on which the application is running. This eliminates the need to install and run the application on the cloud user's own computers, simplifying maintenance and support. What makes a cloud application different from other applications is its elasticity. This can be achieved by cloning tasks onto multiple virtual machines at run time to meet changing work demand.[35] Load balancers distribute the work over the set of virtual machines. This process is inconspicuous to the cloud user, who sees only a single access point; the sketch below illustrates the idea. To accommodate a large number of cloud users, cloud applications can be multitenant, that is, any machine serves more than one cloud-user organization. It is common to refer to special types of cloud-based application software with a similar naming convention: desktop as a service, business process as a service, test environment as a service, communication as a service. The pricing model for SaaS applications is typically a monthly or yearly flat fee per user.[36]
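
The elasticity mechanism described above can be sketched as follows: requests arrive at a single access point, a load balancer spreads them round-robin across a pool of "virtual machine" clones, and the pool grows when the backlog per clone exceeds a limit. The class names and the scaling rule are simplified stand-ins for illustration, not a real provider's implementation.

```python
# Sketch of SaaS-style elasticity: one access point, a round-robin load balancer,
# and a pool of "virtual machine" clones that grows with demand. All components
# here are simplified stand-ins for illustration.
import itertools


class VirtualMachine:
    def __init__(self, name):
        self.name = name

    def handle(self, request):
        return f"{self.name} served {request}"


class LoadBalancer:
    def __init__(self):
        self.pool = [VirtualMachine("vm-1")]
        self._ids = itertools.count(2)

    def maybe_scale(self, pending_requests, per_vm_limit=5):
        # Clone another VM whenever the backlog per machine exceeds the limit.
        while pending_requests / len(self.pool) > per_vm_limit:
            self.pool.append(VirtualMachine(f"vm-{next(self._ids)}"))

    def dispatch(self, requests):
        # Single access point: callers hand requests to the balancer only.
        self.maybe_scale(len(requests))
        for vm, request in zip(itertools.cycle(self.pool), requests):
            yield vm.handle(request)


if __name__ == "__main__":
    balancer = LoadBalancer()
    for line in balancer.dispatch([f"req-{i}" for i in range(12)]):
        print(line)
    print("pool size:", len(balancer.pool))
```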

Cloud clients

See also: Category:Cloud clients
Users access cloud computing using networked client devices, such as desktop computers, laptops, tablets and smartphones. Some of these devices (cloud clients) rely on cloud computing for all or a majority of their applications, so as to be essentially useless without it. Examples are thin clients and the browser-based Chromebook. Many cloud applications do not require specific software on the client and instead use a web browser to interact with the cloud application. With Ajax and HTML5, these web user interfaces can achieve a similar, or even better, look and feel than native applications. Some cloud applications, however, support specific client software dedicated to these applications (e.g., virtual desktop clients and most email clients). Some legacy applications (line-of-business applications that until now have been prevalent in thin-client Windows computing) are delivered via a screen-sharing technology.

Deployment models

Figure: Cloud computing types

Public cloud

Public cloud applications, storage, and other resources are made available to the general public by a service provider. These services are free or offered on a pay-per-use model. Generally, public cloud service providers like Microsoft and Google own and operate the infrastructure and offer access only via the Internet (direct connectivity is not offered).[26]

Community cloud

Community cloud shares infrastructure between several organizations from a specific community with common concerns (security, compliance, jurisdiction, etc.), whether managed internally or by a third party and hosted internally or externally. The costs are spread over fewer users than a public cloud (but more than a private cloud), so only some of the cost savings potential of cloud computing is realized.[3]

Hybrid cloud

Hybrid cloud is a composition of two or more clouds (private, community or public) that remain unique entities but are bound together, offering the benefits of multiple deployment models.[3] By utilizing "hybrid cloud" architecture, companies and individuals are able to obtain degrees of fault tolerance combined with locally immediate usability, without dependency on internet connectivity. Hybrid cloud architecture requires both on-premises resources and off-site (remote) server-based cloud infrastructure. Hybrid clouds have been said to lack the flexibility, security and certainty of in-house applications,[37] while also being credited with providing the flexibility of in-house applications together with the fault tolerance and scalability of cloud-based services.

Private cloud

Private cloud is cloud infrastructure operated solely for a single organization, whether managed internally or by a third party and hosted internally or externally.[3] Private clouds have attracted criticism because users "still have to buy, build, and manage them" and thus do not benefit from less hands-on management,[38] essentially "[lacking] the economic model that makes cloud computing such an intriguing concept".[39][40]

Architecture

Figure: Cloud computing sample architecture

Cloud architecture,[41] the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over a loose coupling mechanism such as a messaging queue, as sketched below. Elastic provision implies intelligence in the use of tight or loose coupling as applied to mechanisms such as these and others.
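
A minimal sketch of that loose coupling: two components exchange work only through a message queue, so neither depends on the other's location or timing. Python's in-process queue.Queue stands in here for a real message broker or hosted messaging service.

```python
# Sketch of loosely coupled components communicating through a message queue.
# An in-process queue.Queue stands in for a real message broker; the two
# components share only the queue, not each other.
import queue
import threading

SHUTDOWN = None  # sentinel telling the worker to stop


def frontend(work_queue, jobs):
    """Producer component: enqueues work and moves on without waiting."""
    for job in jobs:
        work_queue.put(job)
    work_queue.put(SHUTDOWN)


def worker(work_queue, results):
    """Consumer component: processes whatever arrives, whenever it arrives."""
    while True:
        job = work_queue.get()
        if job is SHUTDOWN:
            break
        results.append(f"processed {job}")


if __name__ == "__main__":
    work_queue = queue.Queue()
    results = []
    consumer = threading.Thread(target=worker, args=(work_queue, results))
    consumer.start()
    frontend(work_queue, ["resize-image-1", "resize-image-2", "send-welcome-email"])
    consumer.join()
    print(results)
```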

The Intercloud

Main article: Intercloud
The Intercloud[42] is an interconnected global "cloud of clouds"[43][44] and an extension of the Internet "network of networks" on which it is based.[45][46][47]

Cloud engineering

Cloud engineering is the application of engineering disciplines to cloud computing. It brings a systematic approach to the high-level concerns of commercialisation, standardisation, and governance in conceiving, developing, operating and maintaining cloud computing systems. It is a multidisciplinary method encompassing contributions from diverse areas such as systems, software, web, performance, information, security, platform, risk, and quality engineering.

Issues

Privacy

The cloud model has been criticised by privacy advocates for the greater ease with which the companies hosting the cloud services can control, and thus monitor at will, lawfully or unlawfully, the communication and data stored between the user and the host company. Instances such as the secret NSA program working with AT&T and Verizon, which recorded over 10 million phone calls between American citizens, cause uncertainty among privacy advocates about the greater powers such arrangements give telecommunications companies to monitor user activity.[48] Using a cloud service provider (CSP) can complicate the privacy of data because of the extent to which virtualization for cloud processing (virtual machines) and cloud storage are used to implement the cloud service.[49] In CSP operations, customer or tenant data may not remain on the same system, in the same data center, or even within the same provider's cloud; this can lead to legal concerns over jurisdiction. While there have been efforts (such as US-EU Safe Harbor) to "harmonise" the legal environment, providers such as Amazon still cater to major markets (typically the United States and the European Union) by deploying local infrastructure and allowing customers to select "availability zones."[50] Cloud computing also poses privacy concerns because the service provider may access the data that is on the cloud at any point in time, and could accidentally or deliberately alter or even delete some of it.[51]

Compliance

In order to obtain compliance with regulations including FISMA, HIPAA, and SOX in the United States, the Data Protection Directive in the EU and the credit card industry's PCI DSS, users may have to adopt community or hybrid deployment modes that are typically more expensive and may offer restricted benefits. This is how Google is able to "manage and meet additional government policy requirements beyond FISMA"[52][53] and Rackspace Cloud or QubeSpace are able to claim PCI compliance.[54] Many providers also obtain a SAS 70 Type II audit, but this has been criticised on the grounds that the hand-picked set of goals and standards determined by the auditor and the auditee are often not disclosed and can vary widely.[55] Providers typically make this information available on request, under a non-disclosure agreement.[56][57] Customers in the EU contracting with cloud providers outside the EU/EEA have to adhere to the EU regulations on the export of personal data.[58]

U.S. federal agencies have been directed by the Office of Management and Budget to use a process called FedRAMP (Federal Risk and Authorization Management Program) to assess and authorize cloud products and services. Federal CIO Steven VanRoekel issued a memorandum to federal agency Chief Information Officers on December 8, 2011 defining how federal agencies should use FedRAMP. FedRAMP consists of a subset of NIST Special Publication 800-53 security controls specifically selected to provide protection in cloud environments. A subset has been defined for the FIPS 199 low categorization and the FIPS 199 moderate categorization. The FedRAMP program has also established a Joint Accreditation Board (JAB) consisting of Chief Information Officers from DoD, DHS and GSA. The JAB is responsible for establishing accreditation standards for third-party organizations that will perform the assessments of cloud solutions. The JAB will also review authorization packages and may grant provisional authorization (to operate). The federal agency consuming the service still has final responsibility for the authority to operate. More information is available from GSA at http
