
Institute of Science and Technology

B.Sc. (Honors) in Electronics & Communication Engineering


Part: 4
Subject Code: 412

Project Name: Cloud Computing

Prepared By

Minhaz Mohammad Kowsar


Roll: 108042, Registration: 707067

Supervised By
Mehedi Hasan

Date: April 2012

Cloud Computing

ECE 4th Year Project

Page |2

IST, Dhaka


Acknowledgement


Contents

1. Abstract
2. Introduction
3. The Concept
4. Comparison with Other Systems
5. Brief History
6. Service Model of Cloud Computing
7. Deployment Model of Cloud Computing
8. Architecture of Cloud Computing
9. Cloud Consumption Type
10. Obstacles
11. Advantages
12. Example
13. Some Issues
14. Research
15. Future


Abstract
Imagine yourself in a world where users of today's Internet don't have to run,
install, or store their applications or data on their own computers; imagine a
world where every piece of your information or data resides in the Cloud (the
Internet).
As a metaphor for the Internet, "the cloud" is a familiar cliché, but when
combined with "computing", the meaning gets bigger and fuzzier. Some analysts
and vendors define cloud computing narrowly as an updated version of utility
computing: basically virtual servers available over the Internet. Others go very
broad, arguing anything you consume outside the firewall is "in the cloud",
including conventional outsourcing.
Cloud computing comes into focus only when you think about what we
always need: a way to increase capacity or add capabilities on the fly without
investing in new infrastructure, training new personnel, or licensing new
software. Cloud computing encompasses any subscription-based or pay-per-use
service that, in real time over the Internet, extends ICT's existing capabilities.
Cloud computing is at an early stage, with a motley crew of providers large
and small delivering a slew of cloud-based services, from full-blown applications
to storage services to spam filtering. Yes, utility-style infrastructure providers are
part of the mix, but so are SaaS (software as a service) providers such as
Salesforce.com. Today, for the most part, IT must plug into cloud-based services
individually, but cloud computing aggregators and integrators are already
emerging.


Introduction

This project is all about cloud computing. It covers:

What is the cloud
Characteristics
Brief history
Types of cloud services
Its clients
How to deploy a cloud system
The architecture of a cloud system
Software and applications regarding the cloud
Some issues, such as privacy, legal, security, open source and sustainability
Research in cloud computing
References


Cloud computing- The Concept

Cloud computing is Internet ("cloud") based development and use of


computer technology ("computing"). It is a style of computing in which
dynamically scalable and often virtualized resources are provided as a service
over the Internet. Users need not have knowledge of, expertise in, or control over
the technology infrastructure "in
the cloud" that supports them.
Cloud computing entrusts services, typically centralized, with your data,
software, and computation, accessed via a published application programming
interface (API) over a network. It has a lot of overlap with software as a
service (SaaS).
End users access cloud-based applications through a web browser or a
lightweight desktop or mobile application, while the business software and
data are stored on servers at a remote location. Cloud application providers
strive to give the same or better service and performance than if the software
programs were installed locally on end-user computers.
At the foundation of cloud computing is the broader concept of infrastructure
convergence (or Converged Infrastructure) and shared services. This type of data
centre environment allows enterprises to get their applications up and running
faster, with easier manageability and less maintenance, and enables IT to more
rapidly adjust IT resources (such as servers, storage, and networking) to meet
fluctuating and unpredictable business demand.


The two words in the phrase Cloud Computing have the following interpretations:

Cloud: As a noun, this is a metaphor for the Internet, and as an adjective it


means "pertaining to the Internet". This usage derives from the cloud symbols
that represent the Internet on diagrams.
Computing: Any IT activity carried out by an entity:

When using "a local server or a personal computer", which implies


that the IT resources are under the exclusive control of the entity.

To "store, manage, and process data", which implies that the data
is private to the entity, in the sense that it is determined by them, even if it
is accessible by others.

This means that Cloud Computing is a type of Internet-based computing, and it


consists of every situation where the use of IT resources by an entity has all of the
following characteristics:

Access to the resources is:


Controlled by the entity, and restricted by them to their authorized
users.
Delivered via the Internet to all of these users.
The resources are:
Hosted by a service provider on behalf of the entity.
Dedicated to the exclusive use of the entity.
Data processed by the resources is:
Private to the entity and its associates.
Entered or collected by them, or automatically produced for them.

Depending on the context, Cloud Computing can mean:

Access to and use of the resources.


The hosting and delivery service that provides this access.
A model for enabling such access and delivery.
The hosted resources or services themselves.
The computing execution carried out by the services.
Technology used for the provision of the services.


Comparison with other systems

Cloud computing shares characteristics with:

Autonomic computing: computer systems capable of self-management.

Client-server model: client-server computing refers broadly to
any distributed application that distinguishes between service providers
(servers) and service requesters (clients).

Grid computing: "a form of distributed and parallel computing, whereby
a 'super and virtual computer' is composed of a cluster of networked, loosely
coupled computers acting in concert to perform very large tasks."

Mainframe computer: powerful computers used mainly by large
organizations for critical applications, typically bulk data processing such
as census, industry and consumer statistics, police and secret intelligence
services, enterprise resource planning, and financial transaction processing.

Utility computing: the "packaging of computing resources, such as
computation and storage, as a metered service similar to a traditional public
utility, such as electricity."

Peer-to-peer: distributed architecture without the need for central
coordination, with participants being at the same time both suppliers and
consumers of resources (in contrast to the traditional client-server model).


Characteristics
Cloud computing exhibits the following key characteristics:
Empowerment of end-users of computing resources by putting the
provisioning of those resources in their own control, as opposed to the
control of a centralized IT service (for example)
Agility improves with users' ability to re-provision technological
infrastructure resources.
Application programming interface (API) accessibility to software that
enables machines to interact with cloud software in the same way the user
interface facilitates interaction between humans and computers. Cloud
computing systems typically use REST-based APIs.
Cost is claimed to be reduced, and in a public cloud delivery model
capital expenditure is converted to operational expenditure. This is
purported to lower barriers to entry, as infrastructure is typically
provided by a third party and does not need to be purchased for one-time
or infrequent intensive computing tasks. Pricing on a utility computing
basis is fine-grained, with usage-based options, and fewer IT skills are
required for in-house implementation.
Device and location independence enable users to access systems using a
web browser regardless of their location or what device they are using
(e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a
third-party) and accessed via the Internet, users can connect from
anywhere.
Virtualization technology allows servers and storage devices to be shared
and utilization to be increased. Applications can be easily migrated from one
physical server to another.

Multi-tenancy enables sharing of resources and costs across a large pool of


users thus allowing for:
1. Centralization of infrastructure in locations with lower costs (such as
real estate, electricity, etc.)
2. Peak-load capacity increases (users need not engineer for highest
possible load-levels)
3. Utilization and efficiency improvements for systems that are often
only 10-20% utilized.
Reliability is improved if multiple redundant sites are used, which makes
well-designed cloud computing suitable for business continuity and disaster
recovery.
Scalability and Elasticity via dynamic ("on-demand") provisioning of
resources on a fine-grained, self-service basis near real-time, without users
having to engineer for peak loads.
Performance is monitored and consistent and loosely coupled architectures
are constructed using web services as the system interface.
Security could improve due to centralization of data, increased
security-focused resources, etc., but concerns can persist about loss of
control over certain sensitive data, and the lack of security for stored
kernels.[17]
Security is often as good as or better than other traditional systems, in part
because providers are able to devote resources to solving security issues
that many customers cannot afford.[18] However, the complexity of
security is greatly increased when data is distributed over a wider area or
greater number of devices and in multi-tenant systems that are being
shared by unrelated users. In addition, user access to security audit logs
may be difficult or impossible. Private cloud installations are in part
motivated by users' desire to retain control over the infrastructure and
avoid losing control of information security.


Maintenance of cloud computing applications is easier, because they do


not need to be installed on each user's computer and can be accessed from
different places.
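Cloud APIs are typically REST-based, as the API characteristic above notes. As a sketch of what such an authenticated call looks like in practice, the snippet below builds (but does not send) a request to list a tenant's servers. The endpoint, resource path, and token are hypothetical, not any real provider's API:

```python
import urllib.request

# Hypothetical endpoint and token -- placeholders, not a real provider API.
BASE_URL = "https://api.example-cloud.com/v1"
API_TOKEN = "dummy-token"

def build_list_servers_request(base_url=BASE_URL, token=API_TOKEN):
    """Build (but do not send) a REST request listing a tenant's servers."""
    return urllib.request.Request(
        url=base_url + "/servers",
        method="GET",  # REST maps resource reads onto HTTP GET
        headers={
            "Authorization": "Bearer " + token,  # token-based auth, typical of cloud APIs
            "Accept": "application/json",        # resources are usually exchanged as JSON
        },
    )

req = build_list_servers_request()
print(req.method, req.full_url)
```

The same resource-oriented shape (URL per resource, standard HTTP verbs, JSON bodies) recurs across cloud provider APIs, which is what lets machines interact with cloud software programmatically.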

In summary, the six essential characteristics of cloud computing are:
1. On demand self services: computer services such as email, applications,
network or server service can be provided without requiring human
interaction with each service provider. Cloud service providers providing on
demand self services include Amazon Web Services (AWS), Microsoft,
Google, IBM and Salesforce.com. New York Times and NASDAQ are
examples of companies using AWS (NIST). Gartner describes this
characteristic as service based.
2. Broad network access: Cloud Capabilities are available over the network
and accessed through standard mechanisms that promote use by
heterogeneous thin or thick client platforms such as mobile phones,
laptops and PDAs.
3. Resource pooling: The provider's computing resources are pooled together
to serve multiple consumers using a multi-tenant model, with different
physical and virtual resources dynamically assigned and reassigned
according to consumer demand. The resources include among others
storage, processing, memory, network bandwidth, virtual machines and
email services. The pooling together of the resource builds economies of
scale (Gartner).
4. Rapid elasticity: Cloud services can be rapidly and elastically provisioned, in
some cases automatically, to quickly scale out and rapidly released to
quickly scale in. To the consumer, the capabilities available for provisioning
often appear to be unlimited and can be purchased in any quantity at any
time.

ECE 4th Year Project

IST, Dhaka

Cloud Computing

P a g e | 13

5. Measured service: Cloud computing resource usage can be measured,


controlled, and reported providing transparency for both the provider and
consumer of the utilized service. Cloud computing services use a metering
capability which enables resource use to be controlled and optimized. This
implies that, just like air time, electricity or municipal water, IT services
are charged per usage metrics: pay per use. The more you utilize, the higher
the bill. Just as utility companies sell power to subscribers, and telephone
companies sell voice and data services, IT services such as network security
management, data center hosting or even departmental billing can now be
easily delivered as a contractual service.
6. Multi-tenancy: the sixth characteristic of cloud computing, advocated by
the Cloud Security Alliance. It refers to the need for policy-driven
enforcement, segmentation, isolation, governance, service levels, and
chargeback/billing models for different consumer constituencies.
Consumers might utilize a public cloud provider's service offerings or
actually be from the same organization, such as different business units
rather than distinct organizational entities, but would still share
infrastructure.
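The pay-per-use metering described in point 5 can be sketched as a simple bill calculator. The metrics and rates below are invented for illustration, not any provider's actual price list:

```python
# Illustrative metered rates -- invented figures, not a real price list.
RATES = {
    "cpu_hours": 0.10,         # $ per CPU hour
    "storage_gb_hours": 0.002, # $ per GB stored per hour
    "bandwidth_gb": 0.05,      # $ per GB transferred
}

def monthly_bill(usage):
    """Compute a pay-per-use bill from metered usage: the more you utilize,
    the higher the bill, just like electricity or water."""
    return round(sum(usage.get(metric, 0) * rate
                     for metric, rate in RATES.items()), 2)

# One server-month of illustrative usage:
print(monthly_bill({"cpu_hours": 720, "storage_gb_hours": 36000,
                    "bandwidth_gb": 40}))
# 720*0.10 + 36000*0.002 + 40*0.05 = 146.0
```

The metering itself gives both provider and consumer the transparency the text describes: the bill is simply the metered quantities times the published unit rates.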


Brief History
The term "cloud" is used as a metaphor for the Internet, based on the cloud
drawing used in the past to represent the telephone network,[19] and later to
depict the Internet in computer network diagrams as an abstraction of the
underlying infrastructure it represents.
The ubiquitous availability of high-capacity networks, low-cost computers and
storage devices, as well as the widespread adoption of virtualization,
service-oriented architecture, and autonomic and utility computing, have led
to tremendous growth in cloud computing. Details are abstracted from end
users, who no longer need expertise in, or control over, the technology
infrastructure "in the cloud" that supports them.
The underlying concept of cloud computing dates back to the 1960s, when John
McCarthy opined that "computation may someday be organized as a public
utility." Almost all the modern-day characteristics of cloud computing
(elastic provision, provided as a utility, online, illusion of infinite
supply), the comparison to the electricity industry, and the use of public,
private, government, and community forms were thoroughly explored in Douglas
Parkhill's 1966 book, The Challenge of the Computer Utility. Other scholars
have shown that cloud computing's roots go all the way back to the 1950s,
when scientist Herb Grosch (the author of Grosch's law) postulated that the
entire world would operate on dumb terminals powered by about 15 large data
centers.


The actual term "cloud" borrows from telephony, in that telecommunications
companies, who until the 1990s offered primarily dedicated point-to-point
data circuits, began offering Virtual Private Network (VPN) services with
comparable quality of service but at a much lower cost. By switching traffic
to balance utilization as they saw fit, they were able to utilize their
overall network bandwidth more effectively. The cloud symbol was used to
denote the demarcation point between that which was the responsibility of
the provider and that which was the responsibility of the user. Cloud
computing extends this boundary to cover servers as well as the network
infrastructure.
After the dot-com bubble, Amazon played a key role in the development of
cloud computing by modernizing their data centers, which, like most computer
networks, were using as little as 10% of their capacity at any one time,
just to leave room for occasional spikes. Having found that the new cloud
architecture resulted in significant internal efficiency improvements
whereby small, fast-moving "two-pizza teams" could add new features faster
and more easily, Amazon initiated a new product development effort to
provide cloud computing to external customers, and launched Amazon Web
Services (AWS) on a utility computing basis in 2006.


In early 2008, Eucalyptus became the first open-source, AWS API-compatible
platform for deploying private clouds. Also in early 2008, OpenNebula,
enhanced in the RESERVOIR European Commission-funded project, became the
first open-source software for deploying private and hybrid clouds, and for
the federation of clouds. In the same year, efforts were focused on providing
QoS guarantees (as required by real-time interactive applications) to
cloud-based infrastructures, in the framework of the IRMOS European
Commission-funded project, resulting in a real-time cloud environment. By
mid-2008, Gartner saw an opportunity for cloud computing "to shape the
relationship among consumers of IT services, those who use IT services and
those who sell them", and observed that "organizations are switching from
company-owned hardware and software assets to per-use service-based models",
so that the "projected shift to cloud computing ... will result in dramatic
growth in IT products in some areas and significant reductions in other
areas."


Service Model of Cloud Computing

Infrastructure as a Service (IaaS)


In this most basic cloud service model, cloud providers offer computers (as
physical or, more often, virtual machines), raw (block) storage, firewalls,
load balancers, and networks. IaaS providers supply these resources on
demand from their large pools installed in data centers. Local area networks,
including IP addresses, are part of the offer. For wide-area connectivity,
the Internet can be used or, in carrier clouds, dedicated virtual private
networks can be configured.
To deploy their applications, cloud users then install operating system images
on the machines as well as their application software. In this model, it is the
cloud user who is responsible for patching and maintaining the operating
systems and application software. Cloud providers typically bill IaaS services
on a utility computing basis, that is, cost will reflect the amount of resources
allocated and consumed.


Some More Detail:


System administrators are the subscribers of this service. Usage
fees are calculated per CPU hour, per GB of data stored per hour, network
bandwidth consumed, network infrastructure used per hour, and value-added
services used, e.g., monitoring, auto-scaling, etc.
The following figure shows IaaS Component Stack and Scope of Control as
defined by NIST:

Who are IaaS Subscribers?


Zynga.com, the creator of the most popular Facebook games, has
more than 230 million monthly users and runs more than 12,000
servers on Amazon AWS. When they launch a new game, they start
with a few servers and then ramp up their capacity in real time.
To protect against DDoS attacks on its servers, the controversial
WikiLeaks was hosted on Amazon AWS. It has since moved back to a
Swedish host.
Most important among the lot are SaaS and PaaS players who are
hosted with IaaS providers.
When/why should you opt for IaaS?
Very useful for startup companies that don't know how successful
their newly launched application/website will be.
You have the choice of multiple operating systems, platforms,
databases and Content Delivery Networks (CDN), all in one place.
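The ramp-up pattern described above (start with a few servers, then scale with demand in real time) can be sketched as a simple capacity rule. All figures here are illustrative assumptions, not Zynga's or AWS's actual numbers:

```python
import math

def servers_needed(concurrent_users, users_per_server=500, minimum=2):
    """Compute how many IaaS instances to provision for current demand.

    Capacity is adjusted in near real time instead of being engineered
    for peak load up front. The per-server capacity and the minimum
    fleet size are invented for illustration.
    """
    return max(minimum, math.ceil(concurrent_users / users_per_server))

# Launch day vs. a hit game: the fleet grows with observed demand.
for users in (100, 10_000, 250_000):
    print(users, "users ->", servers_needed(users), "servers")
```

An auto-scaler on a real IaaS would evaluate a rule like this periodically and start or terminate instances to close the gap, which is exactly why a startup need not guess its eventual scale up front.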


Platform as a Service (PaaS)


In the PaaS model, cloud providers deliver a computing platform and/or
solution stack typically including operating system, programming language
execution environment, database, and web server. Application developers
can develop and run their software solutions on a cloud platform without
the cost and complexity of buying and managing the underlying hardware
and software layers. With some PaaS offers, the underlying compute and
storage resources scale automatically to match application demand such
that the cloud user does not have to allocate resources manually.
Some More Detail:
In plain English, PaaS is a platform where software can be developed,
tested and deployed, meaning the entire life cycle of software can be
operated on a PaaS. This service model is dedicated to application
developers, testers, deployers and administrators. This service provides
everything you need to develop a cloud SaaS application.
The following figure shows PaaS Component Stack and Scope of Control as
defined by NIST:

A PaaS typically includes the development environment,


programming languages, compilers, testing tools and deployment
mechanism. In some cases, like Google App Engine (GAE), the
developers may download development environments and use them
locally in the developer's infrastructure, or the developer may access
tools in the provider's infrastructure through a browser.
Who are PaaS Subscribers?
ISVs (Independent Software Vendors), IT service providers, or even
individual developers who want to develop SaaS.
When/Why should you opt for a PaaS?
You focus only on developing the application; everything else is
taken care of by the platform.
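As a minimal sketch of what "developing for a platform" means, here is a tiny WSGI application of the sort one might push to a PaaS. The platform (not shown) would supply the web server, runtime, scaling, and deployment; only the application code below is the developer's concern. No real provider is assumed:

```python
# A minimal WSGI application: the developer writes only this handler,
# and a PaaS would run it behind its own servers and load balancers.
def application(environ, start_response):
    body = b"Hello from the platform\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Exercise the app locally with the standard library's WSGI helpers;
# on a real PaaS, the provider's infrastructure would call it instead.
from wsgiref.util import setup_testing_defaults

environ = {}
setup_testing_defaults(environ)
captured = {}

def start_response(status, headers):
    captured["status"] = status

print(b"".join(application(environ, start_response)).decode().strip(),
      "|", captured["status"])
```

The point is the division of responsibility: the handler above is the whole deliverable, while everything beneath it in the PaaS component stack belongs to the provider.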

Software as a Service (SaaS)


In this model, cloud providers install and operate application software in
the cloud and cloud users access the software from cloud clients. The cloud
users do not manage the cloud infrastructure and platform on which the
application is running. This eliminates the need to install and run the
application on the cloud user's own computers simplifying maintenance
and support. What makes a cloud application different from other
applications is its elasticity. This can be achieved by cloning tasks onto
multiple virtual machines at run-time to meet the changing work demand.
Load balancers distribute the work over the set of virtual machines. This
process is transparent to the cloud user who sees only a single access point.
To accommodate a large number of cloud users, cloud applications can be
multitenant, that is, any machine serves more than one cloud user
organization. It is common to refer to special types of cloud-based
application software with a similar naming convention: desktop as a
service, business process as a service, test environment as a service,
communication as a service.
The pricing model for SaaS applications is typically a monthly or yearly flat
fee per user.
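The elasticity mechanism described above (cloning tasks onto multiple VMs behind a load balancer, while the user sees a single access point) can be sketched in a few lines. The VM names and the round-robin policy are illustrative choices, not any provider's implementation:

```python
import itertools

class LoadBalancer:
    """Single access point that spreads requests over a pool of VM clones."""

    def __init__(self, vms):
        self._vms = list(vms)
        self._cycle = itertools.cycle(self._vms)

    def scale_out(self, vm):
        """Clone the application onto another VM to meet rising demand."""
        self._vms.append(vm)
        self._cycle = itertools.cycle(self._vms)  # restart rotation over the new pool

    def route(self, request):
        """Round-robin: the next VM in the pool handles this request."""
        return (next(self._cycle), request)

lb = LoadBalancer(["vm-1", "vm-2"])
print([lb.route(r)[0] for r in range(4)])  # alternates vm-1, vm-2
```

The cloud user never addresses an individual VM; scaling out simply enlarges the pool behind the same access point, which is what makes the process transparent.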


Some More Detail:


Here the consumer is free of any worries and hassles related to the service.
The Service Provider has very high administrative control on the application
and is responsible for update, deployment, maintenance and security. The
provider exercises final authority over the application. For example, Gmail
is a SaaS where Google is the provider and we are consumers. We have
very limited administrative and user-level control over it, although there is
a limited range of actions, such as enabling priority inbox, signatures, and
undo send, that the consumer can initiate through settings.
The following figure illustrates the relative levels of control between the
provider and the subscriber (SaaS Component Stack and Scope of Control),
borrowed from the NIST document.

Who are SaaS Subscribers?


Apart from organizations and enterprises, SaaS subscribers/users can also
be individuals like you and me. In most of the cases the usage fee is
calculated based on the number of users. For example, Google Apps is free
for up to 10 email accounts, but charges $5 per user per month for Google
Apps for Business (more than 10 users).
When/Why should you opt for a SaaS?
When you want to focus on your business rather than spending your time
replacing broken hardware, managing IT infrastructure, and, most critical of
all, hiring and retaining IT staff.

Service Model of Cloud Computing in a Graphical Way


Deployment Model of Cloud computing

There are several cloud-computing deployment models, and these represent


different types of exclusive and non-exclusive clouds provided to consumers or
groups of consumers.


Public cloud
Public clouds are cloud systems that are made available to any entity in a
non-exclusive group, such as the general public, or all organisations in a
specific industry. Because there are many consumers, these are multi-tenanted
clouds. They are owned by cloud providers, and are off-premise for all
consumers.
The cloud is public only in the sense that, potentially, any entity that
requires the provided services can become a consumer; a public cloud may not
necessarily be of interest to every entity. For example, a SaaS public cloud
might provide an accounting system that is of interest only to certain types
of small business.

The public cloud is the most ubiquitous deployment model, and almost a
synonym for cloud computing. The cloud infrastructure is made available to
the general public or a large industry group and is owned by an organization
selling cloud services.
Examples of Public Cloud:

Google App Engine


Microsoft Windows Azure
IBM Smart Cloud
Amazon EC2

The main benefits of using a public cloud service are:


a. Easy and inexpensive set-up because hardware, application and
bandwidth costs are covered by the provider.


b. Scalability to meet needs.


c. No wasted resources because you pay for what you use.

Community cloud
Community cloud shares infrastructure between several organizations from
a specific community with common concerns (security, compliance,
jurisdiction, etc.), whether managed internally or by a third-party and
hosted internally or externally. The costs are spread over fewer users than
a public cloud (but more than a private cloud), so only some of the cost
savings potential of cloud computing are realized.
Community clouds are supplied to a group of related entities that share a
common purpose, such as mission, security requirements, policy or compliance
considerations, and that therefore need the same type of hosting. These are
multi-tenanted clouds that may be managed by the community or by a third
party, and they may be off-premise for all the consumers, or on-premise for
one of the consumers.
Unlike a private cloud, the community of consumers isn't narrowly exclusive.
However, they may not be truly public clouds, because not every entity that
could use the service may be able to become a consumer.


A Community cloud is for a specific community of consumers from groups


or organizations with shared goals or concerns. Any organization or third
party can also own this type of cloud.

Hybrid Cloud
A hybrid cloud is a composition of two or more public, private and community
clouds that are used on a day-to-day basis or for cloud bursting.
Becoming part of such a cloud can be attractive to providers, because it
results in a larger pool of resources that can be made available to their
consumers, so that variations in demand can be managed more flexibly. Also,
for consumers, it may be that some of their data must be in a private cloud,
for security and privacy reasons, but it may be more economical to keep
other, perhaps less sensitive, data in a public cloud, because the cost of
these is generally lower.
A hybrid cloud is also a cloud of clouds, but it differs from a plain cloud
of clouds in that the latter can contain only one type of cloud deployment,
rather than a mixture of public and private clouds, as with a hybrid.

A hybrid cloud is the intermixing of private and public cloud
infrastructures, with private clouds typically confined to an enterprise's
data centers and public clouds being offered by service providers.


As a combination of these two models, a hybrid cloud enables IT


organizations that have already built a private cloud to use resources from
public clouds as a way to expand the capabilities of their own IT
infrastructure. The critical part of the hybrid model is having a robust,
secure, and high-performance network between the private and public cloud
so that you can effectively move applications and storage back and forth.

Benefits of Hybrid Cloud


a. It allows you to leverage your own assets as well as assets from public
cloud providers to create a more cost-effective and flexible IT
environment.
b. It gives you a way to augment your existing facilities, either in terms
of handling bursts of traffic or having specific workloads hosted. You
can do that in a very agile manner by using resources from the public
cloud. Or you can utilize those external resources to build out your own
data center.
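The trade-off above (sensitive data stays private, other workloads burst to the public cloud under load) can be sketched as a simple placement rule. The threshold and the rules themselves are invented for illustration, not any real scheduler's policy:

```python
def place_workload(sensitive, private_utilization, burst_threshold=0.8):
    """Decide where a workload runs in a hybrid cloud (illustrative rules).

    Sensitive data stays in the private cloud for security and privacy;
    other workloads burst to the cheaper public cloud once private
    capacity passes the threshold.
    """
    if sensitive:
        return "private"
    return "public" if private_utilization >= burst_threshold else "private"

print(place_workload(sensitive=True, private_utilization=0.95))   # private
print(place_workload(sensitive=False, private_utilization=0.95))  # public
print(place_workload(sensitive=False, private_utilization=0.40))  # private
```

This is why the text stresses a robust, high-performance network between the two clouds: a rule like this only works if workloads can actually move back and forth quickly.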

Private cloud
Private clouds (also called internal clouds or corporate clouds) are cloud
systems that are accessible only by a single entity, or by an exclusive group
of related entities that share the same purpose and requirements, such as all
the business units in a single organisation. They are generally
single-tenanted, but they can be multi-tenanted if the individual group
members act as separate consumers. They may be owned by a cloud provider and
be located off-premise, or they may be owned and operated by the consumer
and be located on-premise. These clouds are usually private because of the


need for system and data security, and, for this reason, they will usually be
behind a firewall that restricts access to a limited set of client devices.
There are some IT systems that have some of the same characteristics and
advantages as cloud computing, but that are accessible only through a
private LAN or WAN, rather than the Internet. These have been described as
cloud computing-like but, because of the shared features, they are
sometimes included as part of cloud computing itself.

Sometimes, Private cloud is a marketing term for a proprietary computing


architecture that provides hosted services to a limited number of people
behind a firewall. Advances in virtualization and distributed computing have
allowed corporate network and datacenter administrators to effectively

become service providers that meet the needs of their "customers" within the
corporation. Marketing media that uses the words "private cloud" is designed
to appeal to an organization that needs or wants more control over their data
than they can get by using a third-party hosted service such as Amazon's
Elastic Compute Cloud (EC2) or Simple Storage Service (S3).

Some other types of cloud deployment models:


Virtual private cloud
When a service provider uses a public-cloud system to create a private
cloud, the result is known as a virtual private cloud.

Vertical cloud
A vertical cloud is a public cloud optimized for a specific, vertical industry.

A Graphical Summary of Cloud Deployment Models


The Architecture of a Cloud System


Cloud architecture, the systems architecture of the software systems involved in
the delivery of cloud computing, typically involves multiple cloud components
communicating with each other over a loose-coupling mechanism such as a
message queue. Elastic provisioning implies intelligence in the use of tight or loose
coupling as applied to mechanisms such as these and others.
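The loose coupling described here can be illustrated with a minimal sketch in Python. This is not any cloud vendor's API: an in-process `queue.Queue` stands in for a hosted message-queue service, and all component names are invented for the example.

```python
import queue
import threading

# Sketch: a front-end component and a worker component that know nothing
# about each other except the queue contract between them.
task_queue = queue.Queue()
results = []

def worker():
    """Consume messages until a None sentinel signals shutdown."""
    while True:
        msg = task_queue.get()
        if msg is None:
            break
        results.append(msg.upper())  # stand-in for real processing
        task_queue.task_done()

t = threading.Thread(target=worker)
t.start()

# The producer only enqueues messages; it never calls the worker directly.
for job in ["resize image", "send email"]:
    task_queue.put(job)

task_queue.put(None)  # shutdown signal
t.join()
print(results)
```

Because the two components interact only through messages, either side can be replaced, or scaled out by adding more workers reading the same queue, without changing the other, which is the property this architecture relies on.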

The Intercloud
The Intercloud is an interconnected global "cloud of clouds" and an
extension of the Internet "network of networks" on which it is
based. The term was first used in the context of cloud computing in 2007
when Kevin Kelly opined that "eventually we'll have the Intercloud, the cloud of
clouds". It became popular in 2009 and has also been used to describe the
datacenter of the future.
The Intercloud scenario is based on the key concept that each single cloud does
not have infinite physical resources. If a cloud saturates the computational and
storage resources of its infrastructure, it would not be able to satisfy further
requests for service allocations sent from its clients. The Intercloud scenario aims
to address such a situation, and in theory, each cloud can use the computational
and storage resources of the infrastructures of other clouds. Such forms of pay-


for-use may introduce new business opportunities among cloud providers if they
manage to go beyond the theoretical framework.
Nevertheless, the Intercloud raises many more challenges than solutions
concerning federation, security, interoperability, vendors' lock-ins, trust, legal
issues, QoS, monitoring, and billing.
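The saturation scenario can be sketched in a few lines of Python. This is a toy model, not a real federation protocol; the class and cloud names are invented for illustration.

```python
# Toy model of the Intercloud idea: a saturated cloud delegates an
# allocation request to a peer cloud instead of rejecting it outright.
class Cloud:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity
        self.used = 0
        self.peers = []          # other clouds this cloud can call on

    def allocate(self, units):
        """Serve locally if possible, otherwise try each peer in turn."""
        if self.used + units <= self.capacity:
            self.used += units
            return self.name
        for peer in self.peers:
            served_by = peer.allocate(units)
            if served_by:
                return served_by
        return None              # every reachable cloud is saturated

a = Cloud("cloud-a", capacity=10)
b = Cloud("cloud-b", capacity=10)
a.peers.append(b)

print(a.allocate(8))    # served by cloud-a
print(a.allocate(5))    # cloud-a would overflow, so cloud-b serves it
print(a.allocate(20))   # no cloud has room: the request fails
```

A real implementation would also have to guard against cycles in the peer graph and handle the federation, trust, and billing issues listed above; the sketch shows only the core delegation idea.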

Cloud engineering
Cloud engineering is the application of engineering disciplines to cloud
computing. It brings a systematic approach to the high level concerns of
commercialization, standardization, and governance of cloud computing
applications. At a practical level it adopts the tools of engineering in
conceiving, developing, operating and maintaining cloud computing
systems.


The notion of cloud engineering in the context of cloud computing was
sparsely used in discussions, presentations and talks on various
occasions in the mid-2000s. The term cloud engineering was
formally coined around 2007, and the concept of cloud engineering was
officially introduced in April 2009. Various aspects and topics of this subject
have been extensively covered in a number of industry events. Extensive
research has been conducted on specific areas in cloud engineering, such as
development support for cloud patterns, and cloud business continuity
services.
Cloud engineering is a field of engineering that focuses on cloud services,
such as "software as a service", "platform as a service", and "infrastructure
as a service". It is a multidisciplinary method encompassing contributions
from diverse areas such as systems engineering, software engineering, web
engineering, performance engineering, information engineering, security
engineering, platform engineering, service engineering, risk engineering,
and quality engineering. The nature of commodity-like capabilities
delivered by cloud services and the inherent challenges in this business
model drive the need for cloud engineering as the process of "designing the
systems necessary to leverage the power and economics of cloud resources
to solve business problems."

Cloud Consumption Types


There are several features of cloud computing that affect consumers in terms of


their day-to-day use of the services, or the way they contract for the services, or
their reasons for choosing one service over another, and these have been called
the consumption model.
Some of them have been described as essential, but it has also been observed
that no single feature is proposed by all definitions, and they have been
discussed using terms such as "alternatives", "options", "generally", "recurrent
ideas" or "typically", to indicate that they don't necessarily apply to all cloud services.
We will describe five common features of the consumption model of a cloud
computing system:

Payment and pricing


a. Free services
There are cloud services that are entirely free, and some that are offered
on a freemium basis.
b. Commercial services
Where payment is made, typically it is on the basis of consumption in a
given time period, such as per concurrent user per month for SaaS, or
per unit of storage per month for IaaS. Charging by usage can be by any of
the following methods:
i. Utility: Consumers pay only for what they use - so called because
it is similar to the pricing of services from electricity utilities.
ii. Subscription: Consumers pay for a fixed amount of resource
whether they use it or not, which is similar to some contracts for
cable TV or mobile telecommunications.
iii. A combination of these, where consumers pay a subscription to
consume up to a certain amount, and then as a utility for
resources consumed above that amount.
c. Ownership
In some cases, cloud systems can be wholly owned by the consumer.
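The three charging methods can be made concrete with a small sketch. The figures are hypothetical (a storage service billed per GB-month), not any provider's actual tariff.

```python
def utility_charge(units_used, price_per_unit):
    """Utility: pay only for what you use."""
    return units_used * price_per_unit

def subscription_charge(flat_fee):
    """Subscription: pay a fixed fee whether the resource is used or not."""
    return flat_fee

def combined_charge(units_used, included_units, flat_fee, price_per_unit):
    """Combined: subscription up to a cap, then utility for the overflow."""
    overflow = max(0, units_used - included_units)
    return flat_fee + overflow * price_per_unit

# Hypothetical month: 120 GB-months used at $0.10/GB-month, versus a
# $10 plan that includes the first 100 GB-months.
print(round(utility_charge(120, 0.10), 2))              # 12.0
print(round(subscription_charge(10.0), 2))              # 10.0
print(round(combined_charge(120, 100, 10.0, 0.10), 2))  # 12.0
print(round(combined_charge(80, 100, 10.0, 0.10), 2))   # 10.0
```

Note how the combined plan behaves like a subscription below the cap and like utility pricing above it.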


Measured service
Payment on the basis of consumption requires a measured service and
a metering capability, but even free services may need to be metered if
they apply only up to some level of resource.

Resource pooling
Multi-tenant hosting involves pooled resources that are shared among
several tenants. This can be a consumer advantage, because sharing the
resources may lead to lower costs for each tenant.
However, resource pooling doesn't apply to a private cloud with only one
tenant, because this type of resource sharing applies only between
different tenants, rather than among a tenant's individual users. Sharing
resources among users applies to any server or datacenter, whether it is
part of cloud computing or not.

Scaling and provisioning


Scaling means reconfiguring resources to change their size. Scaling
in means to release resources, and scaling out means to acquire more
resources. Systems that can easily scale in or out are said to
be elastic. Provisioning refers to the mechanisms used to provide and
release resources, and hence to manage scaling. Agile provisioning allows
the size of resources to be changed very easily, for example without the
lengthy decision-making and budgetary process required when purchasing
IT equipment for delivery on-premise. Elastic resources and agile
provisioning are important for flexible and cost-effective management of
variations in user demand.
The following terms are used to describe the various scaling and
provisioning features that are available with some cloud services:
On-demand self-service: Scaling that can be performed by the consumer,
rather than by the host.
i. Dynamic scaling: Scaling that can be done via software, so
that it can happen automatically and possibly in a way that is
invisible to the consumer.
ii. Infinite scaling: There is no effective limit to the amount of
resource that the consumer can have, although it is always actually
finite at any one time.
iii. Rapid provisioning: Provisioning that can be immediate,
rather than waiting for the cloud provider to respond to a request
for resources.
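Dynamic scaling as defined here can be sketched as a simple control rule. This is an illustrative toy, not a real autoscaling API: it computes how many instances would be needed to keep per-instance utilization near a target, with the thresholds chosen arbitrarily for the example.

```python
import math

def scale(current_instances, load_per_instance, target_load=0.7,
          min_instances=1, max_instances=100):
    """Return the instance count that brings per-instance utilization
    close to target_load, clamped to the allowed range."""
    total_load = current_instances * load_per_instance
    needed = math.ceil(total_load / target_load)
    return max(min_instances, min(max_instances, needed))

print(scale(4, 0.9))   # heavily loaded -> scale out to 6 instances
print(scale(4, 0.3))   # lightly loaded -> scale in to 2 instances
print(scale(1, 0.0))   # idle, but never drops below min_instances
```

A platform running such a rule on monitoring data, and provisioning the computed instance count itself, is what makes the scaling both dynamic and on-demand from the consumer's point of view.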

There is some disagreement as to whether agile provisioning is a defining
characteristic of cloud computing. This is partly because the history of IT
shows that flexible scaling and provisioning were available prior to the cloud
era, and so they are more a feature of IT in general rather than just of cloud
systems in particular. However, because large public clouds can have very
many tenants, their datacenters can be much more massive than previously
known, and so they can give the appearance of unlimited scalability.

Access
a. Broad
Access to cloud services is via the Internet, and this leads to the possibility
of consumers having broad access, which means the ability to use the
services from multiple types of cloud client, including desktop, portable and
hand-held devices, or from many different locations.
To achieve access from many different clients, it is necessary for the
websites to be made compatible with hand-held devices as well as PCs, for
example because of the different screen sizes and the different mechanics
of scrolling within large web pages.
Access can be from any location where an Internet connection is available,
either from a fixed PC, for example in an office or Internet café, or from
anywhere that mobile telephone access is available, for example using
a USB modem attached to a notebook.
However, a private cloud may only allow access from certain sources, for
example if it is behind a firewall.
b. Transparent
In IT, something is transparent to users if they do not need to understand
or be aware of it. For example, with cloud computing, consumers can


have transparent access, which means that the users of a service need not
be aware of who provides the service or where the host is located.
However, for legal and regulatory requirements regarding the security of
data and the laws that might apply to breaches of service levels, a
consumer may need to have their hosting provided by a known
organization in a specific location.


Top 9 Obstacles for Cloud Computing


Availability of a Service
Organizations worry about whether Utility Computing services will have
adequate availability, and this makes some wary of Cloud Computing.
Ironically, existing SaaS products have set a high standard in this regard.
Google Search is effectively the dial tone of the Internet: if people went to
Google for search and it wasn't available, they would think the Internet was
down. Users expect similar availability from new services, which is hard to
do. Table 7 shows recorded outages for Amazon Simple Storage Service
(S3), AppEngine and Gmail in 2008, and explanations for the outages. Note
that despite the negative publicity due to these outages, few enterprise IT
infrastructures are as good.


Just as large Internet service providers use multiple network providers so


that failure by a single company will not take them off the air, we believe
the only plausible solution to very high availability is multiple Cloud
Computing providers. The high-availability computing community has long
followed the mantra of "no single source of failure", yet the management of a
Cloud Computing service by a single company is in fact a single point of
failure. Even if the company has multiple datacenters in different
geographic regions using different network providers, it may have common
software infrastructure and accounting systems, or the company may even
go out of business. Large customers will be reluctant to migrate to Cloud
Computing without a business-continuity strategy for such situations.
We believe the best chance for independent software stacks is for them to
be provided by different companies, as it has been difficult for one
company to justify creating and maintaining two stacks in the name of
software dependability.
Another availability obstacle is Distributed Denial of Service (DDoS) attacks.
Criminals threaten to cut off the incomes of SaaS providers by making their
service unavailable, extorting $10,000 to $50,000 payments to prevent the
launch of a DDoS attack. Such attacks typically use large botnets that rent
bots on the black market for $0.03 per bot (simulated bogus user) per
week [36]. Utility Computing offers SaaS providers the opportunity to
defend against DDoS attacks by using quick scale-up. Suppose an EC2
instance can handle 500 bots, and an attack is launched that generates an
extra 1 GB/second of bogus network bandwidth and 500,000 bots. At $0.03
per bot, such an attack would cost the attacker $15,000 invested up front.
At AWSs current prices, the attack would cost the victim an extra $360 per
hour in network bandwidth and an extra $100 per hour (1,000 instances) of
computation. The attack would therefore have to last 32 hours in order to
cost the potential victim more than it would the blackmailer. A botnet
attack this long may be difficult to sustain, since the longer an attack lasts
the easier it is to uncover and defend against, and
the attacking bots could not be immediately re-used for other attacks on
the same provider. As with elasticity, Cloud Computing shifts the attack

target from the SaaS provider to the Utility Computing provider, who can
more readily absorb it and (as we argued in Section 3) is also likely to have
already DDoS protection as a core competency.
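The break-even arithmetic in this example can be checked directly. The prices are the 2008-era figures quoted in the text, not current rates.

```python
# Attacker's one-time outlay: renting the botnet.
bots = 500_000
attacker_cost = bots * 0.03               # $0.03 per bot per week

# Victim's extra hourly cost while under attack.
bandwidth_cost_per_hour = 360.0           # extra bogus network bandwidth
instances = bots // 500                   # one EC2 instance per 500 bots
compute_cost_per_hour = instances * 0.10  # $100/hour for 1,000 instances
victim_cost_per_hour = bandwidth_cost_per_hour + compute_cost_per_hour

break_even_hours = attacker_cost / victim_cost_per_hour
print(round(attacker_cost))         # 15000
print(round(victim_cost_per_hour))  # 460
print(round(break_even_hours, 1))   # about 32.6, i.e. roughly 32 hours
```

Only after this many hours does the victim's cumulative extra cost exceed the attacker's up-front investment, which is why a sustained attack is needed to make the extortion credible.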

Data Lock-In
Software stacks have improved interoperability among platforms, but the
APIs for Cloud Computing itself are still essentially proprietary, or at least
have not been the subject of active standardization. Thus, customers
cannot easily extract their data and programs from one site to run on
another. Concern about the difficulty of extracting data from the cloud is
preventing some organizations from adopting Cloud Computing. Customer
lock-in may be attractive to Cloud Computing providers, but Cloud
Computing users are vulnerable to price increases (as Stallman warned), to
reliability problems, or even to providers going out of business.
For example, an online storage service called The Linkup shut down on
August 8, 2008 after losing access to as much as 45% of customer data [12].
The Linkup, in turn, had relied on the online storage service Nirvanix to
store customer data, and now there is finger-pointing between the two
organizations as to why customer data was lost. Meanwhile, The Linkup's
20,000 users were told the service was no longer available and were urged
to try out another storage site.
The obvious solution is to standardize the APIs so that a SaaS developer
could deploy services and data across multiple Cloud Computing providers,
so that the failure of a single company would not take all copies of
customer data with it. The obvious fear is that this would lead to a race
to the bottom of cloud pricing and flatten the profits of Cloud Computing
providers. We offer two arguments to allay this fear.
First, the quality of a service matters as well as the price, so customers will
not necessarily jump to the lowest cost service. Some Internet Service
Providers today cost a factor of ten more than others because they are
more dependable and offer extra services to improve usability.


Second, in addition to mitigating data lock-in concerns, standardization of


APIs enables a new usage model in which the same software infrastructure
can be used in a Private Cloud and in a Public Cloud.
Such an option could enable Surge Computing, in which the public Cloud
is used to capture the extra tasks that cannot be easily run in the
datacenter (or private cloud) due to temporarily heavy workloads.

Data Condentiality and Auditability


"My sensitive corporate data will never be in the cloud." Anecdotally, we
have heard this repeated multiple times. Current cloud offerings are
essentially public (rather than private) networks, exposing the system to
more attacks. There are also requirements for auditability, in the sense of
Sarbanes-Oxley and the Health and Human Services' Health
Insurance Portability and Accountability Act (HIPAA) regulations, that must
be provided for corporate data to be moved to the cloud.
We believe that there are no fundamental obstacles to making a
cloud-computing environment as secure as the vast majority of in-house IT
environments, and that many of the obstacles can be overcome
immediately with well-understood technologies such as encrypted storage,
Virtual Local Area Networks, and network middle boxes (e.g. firewalls,
packet filters). For example, encrypting data before placing it in a Cloud
may be even more secure than unencrypted data in a local data center;
this approach was successfully used by TC3, a healthcare company with
access to sensitive patient records and healthcare claims, when moving
their HIPAA-compliant application to AWS.
Similarly, auditability could be added as an additional layer beyond the
reach of the virtualized guest OS (or virtualized application environment),
providing facilities arguably more secure than those built into the
applications themselves and centralizing the software responsibilities
related to confidentiality and auditability into a single logical layer. Such a
new feature reinforces the Cloud Computing perspective of changing our
focus from specific hardware to the virtualized capabilities being provided.


A related concern is that many nations have laws requiring SaaS providers
to keep customer data and copyrighted material within national
boundaries. Similarly, some businesses may not like the ability of a country
to get access to their data via the court system; for example, a European
customer might be concerned about using SaaS in the United States given
the USA PATRIOT Act.
Cloud Computing gives SaaS providers and SaaS users greater freedom to
place their storage. For example, Amazon provides S3 services located
physically in the United States and in Europe, allowing providers to keep
data in whichever they choose. With AWS regions, a simple configuration
change avoids the need to find and negotiate with a hosting provider
overseas.

Data Transfer Bottlenecks


Applications continue to become more data-intensive. If we assume
applications may be pulled apart across the
boundaries of clouds, this may complicate data placement and transport.
At $100 to $150 per terabyte transferred,
these costs can quickly add up, making data transfer costs an important
issue. Cloud users and cloud providers have to
think about the implications of placement and traffic at every level of the
system if they want to minimize costs. This
kind of reasoning can be seen in Amazon's development of their new
CloudFront service.
One opportunity to overcome the high cost of Internet transfers is to ship
disks. Jim Gray found that the cheapest
way to send a lot of data is to physically send disks or even whole
computers via overnight delivery services [22].
Although there are no guarantees from the manufacturers of disks or
computers that you can reliably ship data that
way, he experienced only one failure in about 400 attempts (and even this
could be mitigated by shipping extra disks


with redundant data in a RAID-like manner).
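A back-of-envelope comparison shows why shipping can win. The sustained throughput figure is an assumption chosen for illustration; the $100/TB fee is the low end of the range cited above.

```python
terabytes = 10
transfer_fee_per_tb = 100.0      # low end of the $100-$150/TB cited above
network_mbps = 20                # assumed sustained WAN throughput

network_cost = terabytes * transfer_fee_per_tb
seconds = terabytes * 8e6 / network_mbps   # 1 TB = 8e6 megabits
network_days = seconds / 86400

print(network_cost)              # 1000.0 in transfer fees alone
print(round(network_days, 1))    # about 46 days at 20 Mbit/s
```

At these rates an overnight courier delivers the same 10 TB in a day for roughly the cost of the disks and postage, which is Gray's point.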


A second opportunity is to find other reasons to make it attractive to keep
data in the cloud, for once data is in the
cloud for any reason it may no longer be a bottleneck and may enable new
services that could drive the purchase of
Cloud Computing cycles. Amazon recently began hosting large public
datasets (e.g. US Census data) for free on S3;
since there is no charge to transfer data between S3 and EC2, these
datasets might attract EC2 cycles. As another
example, consider off-site archival and backup services. Since companies
like Amazon, Google, and Microsoft likely
send much more data than they receive, the cost of ingress bandwidth
could be much less. Therefore, for example, if
weekly full backups are moved by shipping physical disks and compressed
daily incremental backups are sent over
the network, Cloud Computing might be able to offer an affordable
off-premise backup service. Once archived data is
in the cloud, new services become possible that could result in selling more
Cloud Computing cycles, such as creating
searchable indices of all your archival data or performing image recognition
on all your archived photos to group them
according to who appears in each photo.
A third, more radical opportunity is to try to reduce the cost of WAN
bandwidth more quickly. One estimate is
that two-thirds of the cost of WAN bandwidth is the cost of the high-end
routers, whereas only one-third is the fiber
cost [27]. Researchers are exploring simpler routers built from commodity
components with centralized control as a
low-cost alternative to the high-end distributed routers [33]. If such
technology were deployed by WAN providers, we
could see WAN costs dropping more quickly than they have historically.
Performance Unpredictability

Our experience is that multiple Virtual Machines can share CPUs and main
memory surprisingly well in Cloud Computing, but that I/O sharing is more
problematic. Figure 3(a) shows the average memory bandwidth for 75 EC2
instances running the STREAM memory benchmark [32]. The mean
bandwidth is 1355 MBytes per second, with a
standard deviation of just 52 MBytes/sec, less than 4% of the mean. Figure
3(b) shows the average disk bandwidth
for 75 EC2 instances each writing 1 GB files to local disk. The mean disk
write bandwidth is nearly 55 MBytes per
second with a standard deviation of a little over 9 MBytes/sec, more than
16% of the mean. This demonstrates the
problem of I/O interference between virtual machines.
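The 4% and 16% figures are simply the coefficient of variation (standard deviation over mean) of the two measurements, which is easy to verify:

```python
# Figures quoted above from the 75-instance EC2 experiments.
mem_mean, mem_std = 1355.0, 52.0    # MB/s, STREAM memory benchmark
disk_mean, disk_std = 55.0, 9.0     # MB/s, writing 1 GB files to local disk

mem_cv = mem_std / mem_mean * 100
disk_cv = disk_std / disk_mean * 100
print(round(mem_cv, 1))    # 3.8  -> "less than 4% of the mean"
print(round(disk_cv, 1))   # 16.4 -> "more than 16% of the mean"
```

The disk measurements vary about four times as much relative to their mean, which is the I/O-interference problem in one number.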
One opportunity is to improve architectures and operating systems to
efficiently virtualize interrupts and I/O channels. Technologies such as
PCI Express are difficult to virtualize, but they are critical to the cloud. One
reason to be
hopeful is that IBM mainframes and operating systems largely overcame
these problems in the 1980s, so we have
successful examples from which to learn.
Another possibility is that flash memory will decrease I/O interference.
Flash is semiconductor memory that
preserves information when powered off like mechanical hard disks, but
since it has no moving parts, it is much faster
to access (microseconds vs. milliseconds) and uses less energy. Flash
memory can sustain many more I/Os per second
per gigabyte of storage than disks, so multiple virtual machines with
conflicting random I/O workloads could coexist
better on the same physical computer without the interference we see with
mechanical disks. The lack of interference
that we see with semiconductor main memory in Figure 3(a) might extend
to semiconductor storage as well, thereby


increasing the number of applications that can run well on VMs and thus
share a single computer. This advance could
lower costs to Cloud Computing providers, and eventually to Cloud
Computing consumers.
Another unpredictability obstacle concerns the scheduling of virtual
machines for some classes of batch processing
programs, specifically for high-performance computing. Given that
high-performance computing is used to justify
government purchases of $100M supercomputer centers with 10,000 to
1,000,000 processors, there certainly are
many tasks with parallelism that can benefit from elastic computing. Cost
associativity means that there is no cost
penalty for using 20 times as much computing for 1/20th
the time. Potential applications that could benefit include
those with very high potential financial returns (financial analysis,
petroleum exploration, movie animation) and
could easily justify paying a modest premium for a 20x speedup. One
estimate is that a third of today's server market
is high-performance computing [10].
The obstacle to attracting HPC is not the use of clusters; most parallel
computing today is done in large clusters
using the message-passing interface MPI. The problem is that many HPC
applications need to ensure that all the
threads of a program are running simultaneously, and today's virtual
machines and operating systems do not provide a programmer-visible way
to ensure this. Thus, the opportunity to overcome this obstacle is to offer
something like
gang scheduling for Cloud Computing.

Bugs in Large-Scale Distributed Systems


One of the difficult challenges in Cloud Computing is removing errors in
these very large scale distributed systems. A

common occurrence is that these bugs cannot be reproduced in smaller


configurations, so the debugging must occur at
scale in the production datacenters.
One opportunity may be the reliance on virtual machines in Cloud
Computing. Many traditional SaaS providers
developed their infrastructure without using VMs, either because they
preceded the recent popularity of VMs or
because they felt they could not afford the performance hit of VMs. Since
VMs are de rigueur in Utility Computing,
that level of virtualization may make it possible to capture valuable
information in ways that are implausible without
VMs.

Scaling Quickly
Pay-as-you-go certainly applies to storage and to network bandwidth, both
of which count bytes used. Computation
is slightly different, depending on the virtualization level. Google AppEngine
automatically scales in response to
load increases and decreases, and users are charged by the cycles used.
AWS charges by the hour for the number of
instances you occupy, even if your machine is idle.
The opportunity is then to automatically scale quickly up and down in
response to load in order to save money,
but without violating service level agreements. Indeed, one RAD Lab focus
is the pervasive and aggressive use of
statistical machine learning as a diagnostic and predictive tool that would
allow dynamic scaling, automatic reaction
to performance and correctness problems, and generally automatic
management of many aspects of these systems.
Another reason for scaling is to conserve resources as well as money. Since
an idle computer uses about two-thirds
of the power of a busy computer, careful use of resources could reduce the
impact of datacenters on the environment,


which is currently receiving a great deal of negative attention. Cloud


Computing providers already perform careful
and low-overhead accounting of resource consumption. By imposing
per-hour and per-byte costs, utility computing
encourages programmers to pay attention to efficiency (i.e., releasing and
acquiring resources only when necessary),
and allows more direct measurement of operational and development
inefficiencies.
Being aware of costs is the first step to conservation, but the hassles of
configuration make it tempting to leave
machines idle overnight so that nothing has to be done to get started when
developers return to work the next day. A
fast and easy-to-use snapshot/restart tool might further encourage
conservation of computing resources.

Reputation Fate Sharing


Reputations do not virtualize well. One customer's bad behavior can affect
the reputation of the cloud as a whole. For
instance, blacklisting of EC2 IP addresses [31] by spam-prevention services
may limit which applications can be effectively hosted. An opportunity
would be to create reputation-guarding services similar to the trusted
email services
currently offered (for a fee) to services hosted on smaller ISPs, which
experience a microcosm of this problem.
Another legal issue is the question of transfer of legal liability: Cloud
Computing providers would want legal
liability to remain with the customer and not be transferred to them (i.e.,
the company sending the spam should be
held liable, not Amazon).


Software Licensing
Current software licenses commonly restrict the computers on which the
software can run. Users pay for the software
and then pay an annual maintenance fee. Indeed, SAP announced that it
would increase its annual maintenance fee to at least 22% of the purchase
price of the software, which is comparable to Oracle's pricing [38]. Hence,
many cloud computing providers originally relied on open source software
in part because the licensing model for commercial software is not a good
match to Utility Computing.
The primary opportunity is either for open source to remain popular or
simply for commercial software companies to change their licensing
structure to better fit Cloud Computing. For example, Microsoft and
Amazon now offer pay-as-you-go software licensing for Windows Server
and Windows SQL Server on EC2. An EC2 instance running Microsoft
Windows costs $0.15 per hour instead of the traditional $0.10 per hour of
the open source version.
A related obstacle is encouraging the sales forces of software companies to sell
products into Cloud Computing. Pay-as-you-go seems incompatible with the
quarterly sales tracking used to measure effectiveness, which is based on
one-time purchases. The opportunity for cloud providers is simply to offer
prepaid plans for bulk use that can be sold at a discount. For example,
Oracle salespeople might sell 100,000 instance-hours using Oracle that
can be used over the next two years at a cost less than if the customer
were to purchase 100,000 hours on their own. They could then
meet their quarterly quotas and make their commissions from cloud sales
as well as from traditional software sales, potentially converting this
customer-facing part of a company from naysayers into advocates of cloud
computing.


Advantages
There are many possible advantages of cloud computing, but they may not
apply to all consumers.

Reduced Costs:
Cloud services paid for on a usage basis can be financially advantageous for
a consumer when compared to the outright purchase, or long-term rental,
of what would be a big-budget item. Also, there are reduced operating
costs, because a cloud consumer does not need to house, staff and
maintain their own equipment.

Up to date software:
SaaS consumers can always have the most up-to-date software, because
versioning is controlled centrally by the cloud provider, and when they
make a new release it is automatically available to every user.
This is particularly advantageous for cloud desktops, because deployment
of new software versions can be very costly and time consuming for a large
organisation with many PCs, and because it can therefore be difficult to
ensure that everyone has the same version of the organisation's PC
software applications at any one time.

Improved access:
Cloud computing involves using the Internet, and this can provide access
from multiple locations and many different types of user device.


Sharing and co-operation


Cloud services are advantageous, when compared to PCs and local servers,
for activities that require co-operation among distributed groups.

Flexible and infinite scaling


Flexible and infinite scaling can be an advantageous feature of cloud-computing
services, for example to allow for a sudden increase in demand
by the users. This has traditionally been a difficulty for fully owned and
self-managed IT resources, where there can be, for example, one server with a
given, fixed size, and where some of its capacity may be wasted when
demand is low, but where it may be overloaded, resulting in slow response
times, when demand is high.
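The difference from a fixed-size server can be sketched as a toy autoscaler that adds or removes instances as demand changes (the per-instance capacity and the demand figures are invented for illustration):

```python
import math

CAPACITY_PER_INSTANCE = 100  # requests/sec one instance can serve (assumed)

def instances_needed(demand):
    """Scale out or in so that capacity always covers current demand."""
    return max(1, math.ceil(demand / CAPACITY_PER_INSTANCE))

for demand in [40, 250, 1200, 90]:  # includes a sudden spike, then a lull
    n = instances_needed(demand)
    print(f"demand={demand:4d} req/s -> {n} instance(s)")
```

A self-managed deployment would have to provision for the 1200 req/s peak permanently; the cloud consumer pays for twelve instances only while the spike lasts.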

Simpler capacity planning


Cloud computing moves the IT capacity-planning role from the consumer to
the cloud provider, who can be in a better position to optimize the
cloud resources used by their consumers than the consumers themselves
would be for their own resources.


For example, the provider may be able to supply better demand smoothing,
because they can perform capacity planning over a much larger pool of
resources, and for a large group of consumers, whose peak loads will
probably not occur all at the same time.
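The pooling effect can be shown numerically: because consumers' peaks rarely coincide, the peak of the aggregate load is lower than the sum of the individual peaks, so the provider needs less total capacity than the consumers would need separately (the hourly load profiles below are made up for illustration):

```python
# Hourly load profiles for three consumers whose peaks fall at different times.
loads = {
    "retailer":  [10, 10, 80, 10, 10, 10],
    "payroll":   [10, 70, 10, 10, 10, 10],
    "analytics": [10, 10, 10, 10, 75, 10],
}

# Self-managed: each consumer must own capacity for their own peak.
sum_of_peaks = sum(max(profile) for profile in loads.values())

# Pooled: the provider only needs capacity for the peak of the combined load.
aggregate = [sum(hour) for hour in zip(*loads.values())]
pooled_peak = max(aggregate)

print(f"Capacity if self-managed: {sum_of_peaks}")
print(f"Capacity if pooled:       {pooled_peak}")
```

Here pooling needs 100 units of capacity instead of 225, which is the demand-smoothing advantage described above.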

Examples
All cloud services have a dedicated-resource aspect, with consumer-controlled access
to these resources by authorised users, via a secure-access method, such as a login
ID. Also, the resources process data that is private to the consumer and their
associates, which means that it is entered or created by them, although it may be
accessible by others, including the general public.
There are many organisations that supply cloud services, and there is a very wide range
of such services.

Application Software
A range of cloud applications is available, including common small-business
applications, such as accounting, and medium-to-large organisation
line-of-business or mission-critical applications, such as CRM and ERP.
These applications are:

Part of Software as a Service, which is one of the cloud-computing service
models.

Located in a private cloud if the software is supplied to just one consumer, or in
a public cloud if any entity needing the software can become a consumer. These are
two of the cloud-computing deployment models.

Hosted by an Application Service Provider (ASP).

The authorized users of the hosted software include the organization's staff, and
possibly the staff of their associates, such as customers and suppliers. The
private data includes confidential information entered by the users, such as


financial-transaction amounts, but there could also be a publicly accessible
aspect, such as a shopping-cart feature that can be used by visitors to a retail
organization's website.

Third-party application providers

ASPs may host software that they have developed themselves, or
software developed by others. Some market-leading software systems,
such as SAP and SharePoint, are available from third-party ASPs.

Database
Database as a Service (DBaaS) hosts cloud databases, and virtually all
major database platforms are available in the cloud, including Amazon
SimpleDB and Amazon Relational Database Service.
The private data for these services consists of the records stored in the
database.

Email
Email computing involves two aspects:

Composing, reading and organising emails.

Sending and receiving them via the Internet.

The first of these can be done on a user device, such as a PC, in which
case it is not part of cloud computing, or it can be done at the website of
an ISP, in which case it is part of cloud computing. The second
aspect is part of cloud computing in both cases.
For PC-based email, composing and reading emails, and
organizing them in folders, is done with software running on a PC, and all
of the permanent storage, such as for the inbox, sent and other folders,
and address books, is allocated on the PC. This is not cloud computing,
because the email software and storage are not accessed via the Internet,
but directly on the PC, even though the send and receive software is
hosted by an ISP. The latter software is part of cloud computing, and the


private data for this computing consists of the received and sent emails
stored, perhaps temporarily, by the ISP prior to being retrieved for reading
on the PC, or sent via the Internet.

For web-based email, or webmail, such as Gmail, the data is stored for the
consumer in disk space allocated by the service provider, and emails are
composed, read and organised using software hosted by the provider at their
site. The private data for this computing includes all emails, folders and address
books. Webmail is part of Software as a Service.
In both cases, a PC, or other user device, is a cloud client used to access the
services.

Office-productivity software
Office-productivity software, such as Google Docs, is available as a cloud
service. The private data for this software consists of the user's created
artefacts, such as word-processor documents and spreadsheet models, which
are stored and managed on the provider's infrastructure. This is part
of Software as a Service.

Software production
Development environments
There are cloud services, such as the Azure Services Platform, that
provide software-development environments. These are part
of Platform as a Service, which is one of the cloud-computing service
models.
The private data for these services consists of all development
information, which would be accessible only by the consumers and
their authorized users, including any third-party developers.

Customer support
There are cloud services, such as Get Satisfaction, that provide self-help
and developer support for the customers of a software company. This


support is obtained and entered at a website that hosts and sells the
company's products.
This is a service for the software company, and the private data includes
the domain names of the websites for which support is required.

Storage Service
Disk storage space can be rented from some cloud providers, and consumers of
these services can upload software or data, for example by using the service for
backup of client-device information. The private data would consist of the
uploaded material.
The infrastructure is:

Known as Storage as a Service, and it is part of Infrastructure as a Service,
which is one of the cloud-computing service models.

Hosted by a Storage Service Provider.

Co-operation and community services

There are cloud services that use the remote connectivity features of the Internet
to support distributed co-operative activities, such as systems support, project
work or voice and video communication.
Examples of such services include:

Screen-sharing systems, such as LogMeIn and Mikogo, that can be used for remote
support or co-operation on projects among geographically distributed participants.

Teleconferencing systems, such as Skype.

For these systems, client software needs to be installed on a user device, and
this isn't part of cloud computing, but there is also central storage of a user's
identity, so that they can connect with others. This storage and the associated
connectivity software are part of cloud computing.
The private data for this service includes the consumer's identity.

Websites
Creation and hosting
There are cloud services that provide website creation and hosting.


The private data for the consumers of these services includes the
website's content, and the dedicated resources would include a CMS, so
that they can manage the website, and possibly a CRM, so that they can
manage customers and purchases.

Feedback
There are cloud services for website operators, that allow visitors to a
website to provide feedback to these operators, and that allow the
operators to analyse this feedback.
The private data includes the consumer's domain names for which
feedback is required.

Visitor statistics
There are cloud services, such as Google Analytics, that provide website
visitor statistics to the operators, and that provide analysis of these
statistics.
These are consumed by website operators, and the private data includes
the consumer's domain names for which statistics are required.

Payment
There are cloud services, such as PayPal, that allow website visitors to
pay for anything purchased at the site.
These are consumed by website operators, and the private data includes
information on the consumer's connected bank accounts, so that transfers
of accumulated payment amounts can be made.

Personal uses
Cloud storage
Anyone with Internet access can rent cloud storage and upload their
personal data, for backup or sharing purposes. This is known as
a personal cloud, and it is part of Infrastructure as a Service.
For example, with photos in the cloud, a family can share them with
members and friends that are in distributed locations, in a way that
couldn't happen with data on their PCs. The private data consists of the


uploaded information, and the authorised users consist of all those given
access to this information.
Potentially, this type of service could have the largest group of tenants
across all providers, because any member of the general public with a
device that can access the Internet can become a consumer, if only to
backup data.

Internet TV
Internet TV, also known as cloud TV, is a cloud service.
The private data for the consumers of these services includes
their multicast address.

Online banking
Online banking is an example of SaaS for the bank's customers, and the
private data for each consumer includes their bank account transaction
information.
Besides desktop and portable computers, the cloud clients used to access
this service include ATMs, online or mobile wallets, and point-of-sale
terminals.

Social media and networking


Media and networking sites, such as Facebook and LinkedIn, are part of
cloud computing.
The private data for the consumers includes uploaded information, and
this is accessible by the authorized network consisting of their friends or
colleagues.

Synced data
Different client devices owned by the same consumer can have their data
synchronized, or synced, via the cloud, so that each device can access
the data produced by all the other devices. This is done by automatically
backing up the data of each client using cloud infrastructure.
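A minimal model of this mechanism: every device pushes its changes to a central cloud copy and pulls back the merged state (the class and method names are invented for this sketch):

```python
class CloudSync:
    """Toy central store: each device pushes changes, then pulls merged state."""
    def __init__(self):
        self.store = {}  # the 'cloud' copy, keyed by item name

    def push(self, changes):
        """A device uploads its new or changed items."""
        self.store.update(changes)

    def pull(self):
        """Any device downloads the merged state of all devices."""
        return dict(self.store)

cloud = CloudSync()
cloud.push({"song1.mp3": "bought on phone"})   # device A uploads
cloud.push({"song2.mp3": "bought on laptop"})  # device B uploads
# Either device now sees both items after a pull.
print(sorted(cloud.pull()))
```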
One example of this for personal use is the iCloud, which provides cloud
storage for an individual's music downloads from the iTunes Store in such
a way that they can be accessed from any of their client devices. In this


case, the private data consists of the tunes that are automatically gathered
for the consumer onto their cloud storage, regardless of how they are
purchased.

Online retailing
There are cloud services, such as eBay, that allow individuals to sell items
on the Internet.
The private data for these services includes details of the consumer's sale
items.

Blogging
There are cloud services, such as WordPress, that enable individuals to
create and maintain a weblog.
The consumers of these services control access by allowing only
themselves to contribute blog topics and to respond to visitor comments,
or by authorising others to do so.
The private data includes the:
1. Topics that are entered from time to time.
2. Comments entered by blog visitors, which can be published or
suppressed by the consumer.
3. Responses of the consumer.
4. Details of who can read or contribute to the blog, which can be the
general public, or an exclusive group.

Peer-to-peer file sharing


Cloud computing involves using a network of remote servers hosted on
the Internet. These servers can be kept in a datacentre operated by a
single cloud provider, but they can also be part of a distributed P2P
network that shares resources via the Internet. In such a network, all
participating systems are peers, which means that they are both clients
and servers, and so their users are both service consumers and service
providers.
For example, P2P file-sharing is part of cloud computing. At any one time,
the group participating in this service consists of the users of all devices


with the same file-sharing software, such as BitTorrent, that are online.
For these participants, the service is the mutual sharing of
files, and this sharing is:
1. Consumed by each participant when they download files from, or upload
files to, other participants.
2. Provided by each participant by making available some of the files on
their own device, for downloading or uploading.

For the participants:

A. As consumers, the:
1. Private data consists of the files on their own device that they allow to be shared.
2. Dedicated resources include their file-sharing software, which is used for the
uploading and downloading of files, and to identify them as part of the network.
3. Controlled access consists of allowing the use of their client device for uploading
to, or downloading from, the other participants.

B. As providers, they host on their own behalf.
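The dual consumer/provider role of each peer can be sketched as a toy in-process model (this is not a real BitTorrent client, just an illustration of the idea):

```python
class Peer:
    """Each participant both provides files and consumes files from others."""
    def __init__(self, name, shared_files):
        self.name = name
        self.files = dict(shared_files)  # private data the peer allows to be shared

    def serve(self, filename):
        """Provider role: hand out a shared file, if present."""
        return self.files.get(filename)

    def download(self, other, filename):
        """Consumer role: fetch a file from another peer."""
        content = other.serve(filename)
        if content is not None:
            self.files[filename] = content  # now shareable onward, too
        return content

alice = Peer("alice", {"talk.ogg": b"audio-bytes"})
bob = Peer("bob", {})
bob.download(alice, "talk.ogg")  # bob consumes; alice provides
print("talk.ogg" in bob.files)
```

Note how a downloaded file immediately becomes shareable from the downloader's device, which is what makes every participant both a client and a server.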

Collaborative distributed computing

The Internet services that control collaborative, distributed computing,
such as GIMPS and SETI@home, are part of cloud computing. This is also
known as volunteer computing.
These services divide up the computation into small parts that are then
distributed to the participating user devices over the Internet. After
carrying out its part of the task, a device sends the results back to the
cloud-based control as a contribution to the whole process.
The consumers of these services are the participants whose user devices
carry out parts of the computation. The private data for each consumer
consists of their registration information, including the Internet address of
their device.
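The divide-distribute-collect pattern can be sketched with a trivially parallel task standing in for the real scientific workload (the splitting and counting functions are illustrative, not taken from GIMPS or SETI@home):

```python
def split_range(start, stop, chunks):
    """Coordinator: divide the search range into small work units."""
    step = (stop - start) // chunks
    bounds = [start + i * step for i in range(chunks)] + [stop]
    return list(zip(bounds[:-1], bounds[1:]))

def work_unit(lo, hi):
    """One volunteer device: count the primes in its assigned sub-range."""
    def is_prime(n):
        return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

# Each 'device' computes its part; the cloud-based control sums the results.
units = split_range(2, 100, 4)
total = sum(work_unit(lo, hi) for lo, hi in units)
print(f"{total} primes below 100")
```

In a real volunteer-computing system the work units would be dispatched over the Internet and the partial results returned to the central control, but the decomposition is the same.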


Some Issues About Cloud Computing



Privacy
The cloud model has been criticized by privacy advocates for the greater
ease with which the companies hosting the cloud services can control, and thus
monitor at will, lawfully or unlawfully, the communication and data stored
between the user and the host company. Instances such as the secret NSA
program, working with AT&T and Verizon, which recorded over 10 million
phone calls between American citizens, cause uncertainty among privacy
advocates about the greater powers such arrangements give telecommunication
companies to monitor user activity. Using a cloud service provider (CSP) can
complicate privacy of data because of the extent to which virtualization for
cloud processing (virtual machines) and cloud storage are used to
implement cloud services. The point is that because of CSP operations,
customer or tenant data may not remain on the same system, or in the
same data center, or even within the same provider's cloud. This can lead to
legal concerns over jurisdiction. While there have been efforts (such as the
US-EU Safe Harbor) to "harmonize" the legal environment, providers such
as Amazon still cater to major markets (typically the United States and the
European Union) by deploying local infrastructure and allowing customers
to select "availability zones". Cloud computing poses privacy concerns
because the service provider may, at any point in time, access the data that
is on the cloud. They could accidentally or deliberately alter or even delete
some information.
Compliance
In order to obtain compliance with regulations including FISMA, HIPAA,
and SOX in the United States, the Data Protection Directive in the EU and


the credit card industry's PCI DSS, users may have to
adopt community or hybrid deployment modes that are typically more
expensive and may offer restricted benefits. This is how Google is able to
"manage and meet additional government policy requirements beyond
FISMA" and Rackspace Cloud or QubeSpace are able to claim PCI
compliance.
Many providers also obtain SAS 70 Type II certification, but this has been
criticised on the grounds that the hand-picked set of goals and standards
determined by the auditor and the auditee are often not disclosed and can
vary widely. Providers typically make this information available on request,
under non-disclosure agreement.
Customers in the EU contracting with cloud providers established outside
the EU/EEA have to adhere to the EU regulations on export of personal
data.
U.S. Federal Agencies have been directed by the Office of Management and
Budget to use a process called FedRAMP (Federal Risk and Authorization
Management Program) to assess and authorize cloud products and
services. Federal CIO Steven VanRoekel issued a memorandum directing federal
agencies to use FedRAMP. FedRAMP consists of a subset of NIST
Special Publication 800-53 security controls specifically selected to provide
protection in cloud environments. A subset has been defined for the FIPS
199 low categorization and the FIPS 199 moderate categorization. The
FedRAMP program has also established a Joint Accreditation Board (JAB)
consisting of Chief Information Officers from DoD, DHS and GSA. The JAB is
responsible for establishing accreditation standards for third-party
organizations that will perform the assessments of cloud solutions. The JAB
will also review authorization packages and may grant provisional
authorization (to operate). The federal agency consuming the service will
still retain final responsibility and authority to operate.
Legal
As can be expected with any revolutionary change in the landscape of
global computing, certain legal issues arise, ranging from trademark
infringement and security concerns to the sharing of proprietary data resources.


Open source
Open-source software has provided the foundation for many cloud
computing implementations, one prominent example being the Hadoop
framework. In November 2007, the Free Software Foundation released
the Affero General Public License, a version of GPLv3 intended to close a
perceived legal loophole associated with free software designed to be run
over a network.
Open standards
Most cloud providers expose APIs that are typically well-documented (often
under a Creative Commons license) but also unique to their
implementation and thus not interoperable. Some vendors have adopted
others' APIs and there are a number of open standards under development,
with a view to delivering interoperability and portability.
Security
As cloud computing is achieving increased popularity, concerns are being
voiced about the security issues introduced through adoption of this new
model. The effectiveness and efficiency of traditional protection
mechanisms are being reconsidered as the characteristics of this innovative
deployment model can differ widely from those of traditional
architectures. An alternative perspective on the topic of cloud security is
that this is but another, although quite broad, case of "applied security"
and that similar security principles that apply in shared multi-user
mainframe security models apply with cloud security.
The relative security of cloud computing services is a contentious issue that
may be delaying its adoption. Physical control of the Private Cloud
equipment is more secure than having the equipment off site and under
someone else's control. Physical control and the ability to visually inspect
the data links and access ports is required in order to ensure data links are
not compromised. Issues barring the adoption of cloud computing are due
in large part to the private and public sectors' unease surrounding the
external management of security-based services. It is the very nature of
cloud computing-based services, private or public, that promote external
management of provided services. This delivers great incentive to cloud
computing service providers to prioritize building and maintaining strong


management of secure services. Security issues have been categorized into
sensitive data access, data segregation, privacy, bug exploitation, recovery,
accountability, malicious insiders, management console security, account
control, and multi-tenancy issues. Solutions to various cloud security issues
vary, from cryptography, particularly public key infrastructure (PKI), to use
of multiple cloud providers, standardization of APIs, and improving virtual
machine support and legal support.
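One of the cryptographic defences listed above, detecting accidental or deliberate alteration of data held by a provider, can be sketched with a keyed hash from Python's standard library: the consumer keeps the key, tags data before upload, and re-checks the tag after download (the provider's storage is simulated here with a dict):

```python
import hmac, hashlib

KEY = b"consumer-held secret"  # never shared with the cloud provider

def tag(data: bytes) -> str:
    """Compute an integrity tag before uploading."""
    return hmac.new(KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, expected_tag: str) -> bool:
    """Check, after download, that the provider did not alter the data."""
    return hmac.compare_digest(tag(data), expected_tag)

cloud = {}  # stands in for the provider's storage
record = b"balance=1000"
cloud["acct"] = (record, tag(record))

data, t = cloud["acct"]
print(verify(data, t))             # data intact
print(verify(b"balance=9999", t))  # tampering detected
```

This only detects alteration; hiding the content from the provider would additionally require client-side encryption.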
Sustainability
Although cloud computing is often assumed to be a form of "green
computing", there is no published study to substantiate this assumption.
Siting the servers affects the environmental effects of cloud computing. In
areas where climate favors natural cooling and renewable electricity is
readily available, the environmental effects will be more moderate. (The
same holds true for "traditional" data centers.) Thus countries with
favorable conditions, such as Finland, Sweden and Switzerland are trying to
attract cloud computing data centers. Energy efficiency in cloud computing
can result from energy-aware scheduling and server
consolidation. However, in the case of distributed clouds over data centers
with different sources of energy, including renewable sources, a
small compromise on energy-consumption reduction could result in a large
reduction in carbon footprint.
Abuse
As with privately purchased hardware, crackers posing as legitimate
customers can purchase the services of cloud computing for nefarious
purposes. This includes password cracking and launching attacks using the
purchased services. In 2009, a banking trojan illegally used the popular
Amazon service as a command and control channel that issued software
updates and malicious instructions to PCs that were infected by the
malware.


Research

Many universities, vendors and government organizations are investing in
research around the topic of cloud computing:
In October 2007, the Academic Cloud Computing Initiative (ACCI) was
announced as a multi-university project designed to enhance students'
technical knowledge to address the challenges of cloud computing.
In April 2009, UC Santa Barbara released the first open-source
platform-as-a-service, AppScale, which is capable of running Google App Engine
applications at scale on a multitude of infrastructures.


In April 2009, the St Andrews Cloud Computing Co-laboratory (StACC) was
launched, focusing on research in the important new area of cloud computing.
Unique in the UK, StACC aims to become an international centre of excellence
for research and teaching in cloud computing, and will provide advice and
information to businesses interested in using cloud-based services.
In October 2010, the TClouds (Trustworthy Clouds) project was started,
funded by the European Commission's 7th Framework Programme. The
project's goal is to research and inspect the legal foundation and architectural
design for building a resilient and trustworthy cloud-of-clouds infrastructure.
The project also develops a prototype to demonstrate its results.
In December 2010, the TrustCloud research project [74][75] was started by HP
Labs Singapore to address transparency and accountability of cloud computing
via detective, data-centric approaches[76] encapsulated in a five-layer
TrustCloud Framework. The team identified the need for monitoring data life
cycles and transfers in the cloud,[74] leading to the tackling of key cloud
computing security issues such as cloud data leakages, cloud accountability
and cross-national data transfers in transnational clouds.
In July 2011, the High Performance Computing Cloud (HPCCLoud) project
was kicked off, aiming to find out the possibilities of enhancing performance
of scientific applications running in cloud environments, and to develop the
HPCCLoud Performance Analysis Toolkit. It was funded by the CIM-Returning
Experts Programme, under the coordination of Prof. Dr. Shajulin Benedict.
In June 2011, the Telecommunications Industry Association developed a
Cloud Computing White Paper, to analyze the integration challenges and
opportunities between cloud services and traditional U.S. telecommunications
standards.


Future of Cloud Computing


Five key trends for the future of cloud computing
First, the buzzwords "cloud computing" are enmeshed in computing. I'm
not sure I ever liked the term, though I've built my career around it for the
last 10 years. The concept predated the rise of the phrase, and the concept
will outlive the buzzwords. "Cloud computing" will become just
"computing" at some point, but it will still be around as an approach to
computing.

Second, we're beginning to focus on fit and function, and not the hype.
However, I still see many square cloud pegs going into round enterprise
holes. Why? The hype drives the movement to cloud computing, but there
is little thought as to the actual fit of the technology. Thus, there is
diminished business value and even a failed project or two. We'll find the
right fit for this stuff in a few years. We just need to learn from our failures
and become better at using clouds.

Third, security will move to "centralized trust." This means we'll learn to
manage identities within enterprises -- and within clouds. From there we'll
create places on the Internet where we'll be able to validate identities, like
the DMV validates your license. There will be so many clouds that we'll
have to deal with the need for a single sign-on, and identity-based security
will become a requirement.
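Identity-based security of this kind is typically built on signed tokens: one identity provider authenticates the user once and issues a token that many clouds can verify without re-authenticating. A minimal HMAC-signed token sketch follows (real single sign-on systems use standards such as SAML or OAuth, and manage keys far more carefully):

```python
import hmac, hashlib

IDP_KEY = b"identity-provider secret"  # shared with relying clouds (simplified)

def issue_token(user: str) -> str:
    """Identity provider: sign the user's identity once, at login."""
    sig = hmac.new(IDP_KEY, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}:{sig}"

def validate_token(token: str) -> bool:
    """Any relying cloud: verify the signature without re-authenticating."""
    user, _, sig = token.partition(":")
    expected = hmac.new(IDP_KEY, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

t = issue_token("alice")
print(validate_token(t))                   # accepted by every relying cloud
print(validate_token("mallory:deadbeef"))  # forged token rejected
```

This captures the single sign-on idea in the text: identity is validated centrally, like the DMV validating a licence, and every cloud trusts the validation.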


Fourth, centralized data will become a key strategic advantage. We'll get
good at creating huge databases in the sky that aggregate valuable
information that anybody can use through a publicly accessible API, such as
stock market behavior over decades or clinical outcome data to provide
better patient care. These databases will use big data technology such as
Hadoop, and they will reach sizes once unheard of.
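Hadoop's MapReduce model processes such datasets by mapping over records in parallel and reducing the partial results; scaled down to a single process, the idea looks like this (a toy word count, the canonical MapReduce example):

```python
from collections import Counter
from functools import reduce

records = ["cloud data cloud", "big data", "cloud"]

def map_phase(record):
    """Emit word counts for one record; in Hadoop this runs across nodes."""
    return Counter(record.split())

def reduce_phase(a, b):
    """Merge partial counts coming back from the mappers."""
    a.update(b)
    return a

totals = reduce(reduce_phase, map(map_phase, records), Counter())
print(totals.most_common(1))
```

The same map/reduce decomposition is what lets the "databases in the sky" above aggregate decades of records across thousands of machines.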

Fifth, mobile devices will become more powerful and thinner. That's a
no-brainer. With the continued rise of mobile computing and the reliance on
clouds to support mobile applications, mobile devices will have more
capabilities, but the data will live in the cloud. Apple's iCloud is just one
example.


References:

