
A Seminar Report

Role of Network layer on Cloud

Submitted in partial fulfillment of the requirements for the degree of
Bachelor of Technology
in Computer Science Engineering

Submitted by: Anuj Kr. Srivastava
Guided by:

Sagar Institute of Technology & Management
Barabanki-225001, Uttar Pradesh, INDIA
Affiliated to U.P. Technical University


Acknowledgement

I take this opportunity to express my gratitude for being granted permission to undertake this seminar on the Role of the Network Layer in Cloud Computing. Without that willingness, the work on this topic would not have succeeded.

First of all, I would like to thank everyone who helped me, directly or indirectly, whenever I ran into problems while completing this project. All of our faculty encouraged me, and their kindness and helpfulness gave me the confidence to complete it.


Cloud computing is an umbrella term for Internet-based development and services; the cloud is a metaphor for the Internet. A number of characteristics define cloud data, applications, services and infrastructure:

Remotely hosted: Services or data are hosted on someone else's infrastructure.

Ubiquitous: Services or data are available from anywhere.

Commodified: The result is a utility computing model similar to that of traditional utilities, like gas and electricity: you pay for what you use.


Contents

1. Introduction
2. Cloud Deployment Models
3. Networking Considerations for Public Clouds
4. Why Companies Are Moving Services to the Cloud
5. Case Study: AWS
6. Opportunities and Challenges
7. The Future
8. Conclusion


Introduction

In recent years, no single advance in information technology has
commanded the attention that cloud computing has, and for good
reason. Because clouds allocate resources when and where they're
required, while also leveraging automation more heavily than any
previous computing technology, they are remarkably efficient and
cost-effective service delivery platforms. By increasing IT speed
and agility, cloud computing can help organizations respond more
quickly to competitive challenges and opportunities, thereby
aligning IT more closely with business goals. Chief executive
officers and other IT executives also look to cloud computing to
solve ubiquitous challenges in the data center environment. These
challenges include low server utilization rates; significant
operational inefficiencies; arduous processes to procure, build
and maintain server environments; and long application
deployment times.

Cloud deployment models and their
networking ramifications

Organizations have a variety of cloud deployment options to choose
from, depending on business objectives, security needs, performance
goals and manageability requirements. (See Figure 1.) Network
requirements will vary from deployment model to deployment model.

In most cases, enterprises will choose among the following cloud models:

Public clouds: Public cloud infrastructures are typically hosted in a
third-party cloud service provider's data center, on a platform that is
shared by other organizations or individuals.

Private clouds: Private cloud infrastructures can be located on or
off premises, are operated solely for the organization, and are
managed by the organization itself or a third party.

Hybrid clouds: Hybrid clouds utilize the capabilities of both public
and private cloud infrastructures, merged with traditional IT, to meet
business requirements.

Depending on the cloud deployment model chosen, the organization
will be responsible for different networking requirements, such as
security, performance, availability and manageability. What follows
is a look at how network elements may vary from cloud model to
cloud model. It is important to remember that, regardless of the cloud
model employed, organizations must make sure that the network fully
supports business needs and meets security requirements and
target service levels.

Networking considerations for public clouds

Public cloud delivery models will commonly deliver services to
support the interests of a broad population. For cost efficiency, the
Internet is used as the basic networking platform for users to connect
to the cloud. When employees use public cloud delivery models,
organizations are expanding the company's security boundary to the
Internet and beyond. Cloud service providers may offer a broad range
of access methods and connectivity technologies (including
broadband, wireless and mobile technologies) that allow the cloud
provider's services to be accessed anywhere, anytime. Enterprises
must see to it that the network design and the enforcement of security
and privacy policies encompass the public domain, including public
access technologies and methodologies.

SaaS is a model of software deployment in which an application is
hosted as a service provided to customers across the Internet. SaaS
generally refers to business software rather than consumer software,
which falls under Web 2.0. By removing the need to install and run
an application on a user's own computer, it is seen as a way for
businesses to get the benefits of commercial software with a smaller
cost outlay. SaaS also alleviates the burden of software maintenance
and support, but users relinquish control over software versions and
requirements. Other terms used in this sphere include Platform as a
Service (PaaS) and Infrastructure as a Service (IaaS).


Why are companies moving services to the
cloud, and why would you want to consider
moving your critical business applications from
your data centers to the cloud?

There are three primary drivers of this shift: cost, speedy introduction
of new applications and services, and availability. The large cloud
providers realize economies of scale in operating virtualized platforms
that few enterprises can replicate.
These providers offer GUIs and APIs that allow users to spin up new
virtual machines (VMs) and services in minutes. Even if you already
use virtualization in your data center, you'd have to invest a lot of
money in systems and tools to match the flexibility of the cloud.
The last driver is availability. You probably have redundancy in your
data center design to keep applications available to the user base.
Imagine if you could host applications from many different
geographic locations and have the infrastructure managed by the
world's leading experts of high availability services.

Case Study: AWS
AWS offers two cloud environments: Amazon Elastic Compute
Cloud (EC2) and Virtual Private Cloud (VPC). EC2 is intended for
delivering services to Internet users without data center integration;
web servers offering content are one example. VPC is better suited for
integrating with your corporate network and users. By default, the
VPC has no connectivity to the Internet unless explicitly configured.
Amazon makes data center integration possible by letting IT:
Create subnets using private addresses in the RFC1918 space
Establish custom route tables
Deploy network access control lists (ACLs) that provide protection at
the subnet level
Pass configuration information to VMs using DHCP option sets
Connect securely to your data center using IPsec over the
Internet or dedicated connections from AWS data centers to
your data center
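The first two capabilities can be sketched with Python's standard-library `ipaddress` module. All CIDR blocks and route targets below are invented for illustration; they are not taken from any real AWS configuration.

```python
import ipaddress

# The three RFC1918 private ranges (the only space usable inside a VPC)
RFC1918 = [ipaddress.ip_network(n)
           for n in ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]

def is_rfc1918(cidr: str) -> bool:
    """True if the whole CIDR block lies inside RFC1918 space."""
    net = ipaddress.ip_network(cidr)
    return any(net.subnet_of(r) for r in RFC1918)

# Hypothetical VPC block carved into two /24 subnets
vpc = ipaddress.ip_network("10.20.0.0/16")
web_subnet, db_subnet = list(vpc.subnets(new_prefix=24))[:2]
assert is_rfc1918(str(vpc))

# A minimal custom route table: destination prefix -> next hop
route_table = {
    str(vpc): "local",           # intra-VPC traffic stays local
    "0.0.0.0/0": "vpn-gateway",  # everything else crosses the IPsec tunnel
}
print(web_subnet, db_subnet)  # 10.20.0.0/24 10.20.1.0/24
```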

VPC has limitations that administrators should understand. The VPC
supports only RFC1918 space within the VPC. If this presents
problems for your network, you can use NAT in your data center to
make the VPC appear to be numbered from another address space.
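That NAT workaround can be illustrated as a one-to-one (static) mapping between the VPC's range and an alternative range; both /28 blocks below are invented examples.

```python
import ipaddress

def build_static_nat(inside_net: str, outside_net: str) -> dict:
    """Map each host address in inside_net one-to-one onto outside_net."""
    inside = ipaddress.ip_network(inside_net)
    outside = ipaddress.ip_network(outside_net)
    return {str(i): str(o) for i, o in zip(inside.hosts(), outside.hosts())}

# Hypothetical: the VPC's 10.20.1.0/28 clashes with the data center plan,
# so the data center sees the VPC numbered out of 172.31.1.0/28 instead.
nat = build_static_nat("10.20.1.0/28", "172.31.1.0/28")
print(nat["10.20.1.1"])  # 172.31.1.1
```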
AWS built the VPC to scale to massive size. To accomplish this feat,
the engineers chose a Layer 3 (that is, IP) foundation for networking.
A ramification of this decision is that VPC does not support
broadcast and VLANs. Traffic separation must be done at the subnet
level. Since enterprise networks rely heavily on VLANs for
separation, this is a significant problem if you expect to port
VLAN-centric designs to the cloud.
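In practice this means a VLAN-per-role design becomes a subnet-per-role plan enforced by subnet-level ACLs. A toy sketch, with invented role names, addresses and rules:

```python
import ipaddress

# Subnet-per-role plan standing in for a VLAN-per-role design
roles = {
    "web": ipaddress.ip_network("10.20.0.0/24"),
    "db":  ipaddress.ip_network("10.20.1.0/24"),
}

# Subnet-level ACL: the (src role, dst role, dst port) tuples permitted
allowed = {("web", "db", 3306)}

def role_of(addr: str) -> str:
    """Return the role of the subnet an address falls into."""
    ip = ipaddress.ip_address(addr)
    return next(r for r, net in roles.items() if ip in net)

def permitted(src: str, dst: str, port: int) -> bool:
    return (role_of(src), role_of(dst), port) in allowed

print(permitted("10.20.0.5", "10.20.1.9", 3306))  # True
print(permitted("10.20.1.9", "10.20.0.5", 22))    # False
```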
Let's turn to an example. You want to provision a set of VMs in the
cloud to run your Web-based expense reporting system. Only users
on your corporate network need to access the application. You need
two Web servers and one database server. Data between your data
center and your VPC will be encrypted using the IPSec protocol.
AWS's VPC Creation Wizard makes the configuration of this set-up
simple. You'll need one unused subnet from the address space that
you use on your corporate network; since much of the private address
space is often already in use, pick a block that is free everywhere on
your network. The wizard will automatically create the VPC router,
which is a virtual router. While you can't log in to this router as you
would a physical router, you can make routing table changes that
affect how this router does its work. In this example, no routing table
changes are needed; the wizard sets up routing automatically.
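Finding that unused block can be automated with `ipaddress.address_exclude`; the corporate ranges below are invented for illustration.

```python
import ipaddress

def free_subnets(parent: str, used: list) -> list:
    """Carve `parent` around the `used` CIDR blocks; return what's free."""
    free = [ipaddress.ip_network(parent)]
    for u in map(ipaddress.ip_network, used):
        carved = []
        for blk in free:
            if u.subnet_of(blk):
                carved.extend(blk.address_exclude(u))  # split around u
            elif not blk.overlaps(u):
                carved.append(blk)                     # untouched block
            # blocks fully inside u are dropped entirely
        free = carved
    return sorted(free)

# Hypothetical corporate plan: 10.0.0.0/8 with two /16s already in use
avail = free_subnets("10.0.0.0/8", ["10.1.0.0/16", "10.2.0.0/16"])
print(avail[0])  # 10.0.0.0/16, a free block the VPC could use
```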
Next, you'll create the VPC gateway and IPsec tunnel to your data
center. In your data center, you must have a router that supports
IPsec and the Border Gateway Protocol (BGP). AWS has tested
Cisco and Juniper routers. While other routers will probably work, I
recommend using one of these vendors for at least the initial turn-up;
I've seen several organizations spend days trying to get IPsec to the
VPC working with other vendors' equipment. You probably have a
Cisco or Juniper router lying around somewhere. Use it. AWS
provides the IPsec and BGP configuration for these routers. You can
attempt to use another router once you've confirmed that the tunnel is
working.

Now your data center has been extended to the cloud. Your users
will access the expense reporting application no differently than they
would applications hosted in your data center. You can use the VPC
for much more involved setups that include multiple subnets for
public and private use. The VPC can be configured to allow users on
the Internet to reach the subnets you specify. This is useful for
deploying e-commerce and other customer-facing services.
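Conceptually, which subnets the Internet can reach is a routing decision: a subnet whose default route points at the Internet gateway is public, while one whose default route points at the VPN is private. A toy model with invented subnet and gateway names:

```python
# Per-subnet route tables: a 0.0.0.0/0 route via "igw" marks a subnet public
route_tables = {
    "web-subnet": {"10.20.0.0/16": "local", "0.0.0.0/0": "igw"},
    "db-subnet":  {"10.20.0.0/16": "local", "0.0.0.0/0": "vpn-gateway"},
}

def is_public(subnet: str) -> bool:
    """Public means the default route goes to the Internet gateway."""
    return route_tables[subnet].get("0.0.0.0/0") == "igw"

print([s for s in route_tables if is_public(s)])  # ['web-subnet']
```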

Opportunities and Challenges

The use of the cloud provides a number of opportunities:

It enables services to be used without any understanding of their
underlying infrastructure.

Cloud computing works using economies of scale. It lowers the
outlay expense for startup companies, as they no longer need to buy
their own software or servers; costs are based on on-demand pricing,
and vendors and service providers recover them by establishing an
ongoing revenue stream.

Data and services are stored remotely but are accessible from
anywhere.

In parallel, there has been a backlash against cloud computing:

Use of cloud computing means dependence on others, and that
could possibly limit flexibility and innovation. The others are
likely to be the bigger Internet companies, like Google and IBM,
which may monopolise the market. Some argue that this use of
supercomputers is a return to the time of mainframe computing
that the PC was a reaction against.

Security could prove to be a big issue. It is still unclear how safe
outsourced data is, and when using these services the ownership of
data is not always clear.

There are also issues relating to policy and access. If your data is
stored abroad, whose FOI policy do you adhere to? What happens
if the remote server goes down? How will you then access your
files? There have been cases of users being locked out of accounts
and losing access to data.

The Future

Many of the activities loosely grouped together under cloud
computing have already been happening, and centralised computing
activity is not a new phenomenon: grid computing was the last
research-led centralised approach. However, there are concerns that
the mainstream adoption of cloud computing could cause many
problems for users. Whether these worries are well grounded or not
has yet to be seen.


Conclusion

Cloud computing offers benefits for organizations and individuals,
but there are also privacy and security concerns. If you are
considering a cloud service, think about how your personal
information, and that of your customers, can best be protected.
Carefully review the terms of service or contracts, and challenge the
provider to meet your needs.

