
ASSIGNMENT

2016-17
CLOUD DB SYSTEM
(MCA-6012)

SIKKIM MANIPAL UNIVERSITY


DIRECTORATE OF DISTANCE EDUCATION

Submitted by:
Anshul Singh
MCA- IInd year
6th semester

Q1. Differentiate between cloud and distributed computing.

Ans: Cloud computing is a new term for the long-held dream of utility computing. It originated with the idea of grid computing, in which remote supercomputers or clusters of mainframes are used temporarily to solve huge, complex problems that the available infrastructure could not handle. This progressed towards a service-oriented business model that offers physical and virtual resources on a pay-as-you-go basis. While cloud computing takes distributed computing to a utility stage, ubiquitous and unmetered access to broadband Internet is the key to its success. In addition, better standardization, portability and interoperability of its distributed components will help move cloud computing to its full potential.
Cloud computing technology provides various resources as a service, delivered mainly over the internet. It is essentially a sales and distribution model for various types of resources, and cloud consumers can be considered subscribers, since they pay for what they consume. Distributed computing, by contrast, works with a group of self-governing nodes to solve huge, complex problems. It can be identified as a type of computing in which a group of machines works as a single unit to solve a large-scale problem, achieved by breaking the problem into simpler tasks and assigning those tasks to individual nodes. In other words, distributed computing is a group of networked computers, each able to work on a smaller portion of the large task in parallel. This group may consist of a variety of computers, including cloud virtual servers. In a distributed computing environment the computers are scalable and open, and they are allowed to join and leave the network at any time; the distributing server assigns tasks and collects output as nodes arrive and leave. Cloud computing, on the other hand, can be considered a refined or specialized form of distributed computing in which resources such as storage, memory and processors are completely abstracted from the consumer. In turn, the cloud service provider takes responsibility for the performance, security, scalability and reliability of the service. From the developer's point of view this is a big advantage; the disadvantages of the setup are reduced control over failures, latency, resources and so on.
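To make the contrast concrete, here is a minimal sketch of the distributed idea of breaking a large problem into simpler tasks for individual workers. Python and its multiprocessing module are assumptions of this illustration (the source prescribes no language), and local processes stand in for networked nodes; a real cluster or cloud service would dispatch the tasks over a network.

```python
# Minimal sketch: split a large problem (summing a big range) into
# simpler tasks and assign them to individual workers. Local processes
# stand in for the self-governing nodes of a distributed system.
from multiprocessing import Pool

def partial_sum(bounds):
    """One 'node' computes the sum of its assigned sub-range."""
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    tasks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        partials = pool.map(partial_sum, tasks)  # scatter tasks, gather results
    print(sum(partials))  # combine the partial results into the final answer
```

In a cloud setting the same scatter/gather pattern applies, but the provider abstracts the machines away and the consumer pays only for the compute actually used.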

Q2. Discuss the following methods for enforcing serialisability by locks:
A) Locking scheduler
B) Two phase locking
Ans: Serialisability describes the concurrent execution of several transactions. It must be guaranteed in order to prevent inconsistencies arising from transactions interfering with one another, and the order of read and write operations is important to it. We use the following methods for enforcing serialisability by locks:
Locks
Locking scheduler
Two phase locking
These are discussed below.

Locks: Acquiring locks on a resource is one of the methods used to serialise transactions. Any data that is going to be used in support of the transaction is locked by the process. A lock is used by a transaction to deny data access to other transactions and thus avoid inaccurate updates. Locks can be of different types, such as Read (shared) or Write (exclusive) locks. A data item's write lock stops other transactions from reading that data item, while read locks merely prevent other transactions from editing (writing to) it. You will study these types later in the unit. So, we can say that a lock can be in one of
three states:

Exclusive (write) lock
Shared (read) lock
Unlocked

You can use locks in the following manner:


1. A transaction that needs to access a data item must first lock the item, requesting a shared lock for read-only access or an exclusive lock for read-and-write access.
2. If no other transaction holds a lock on the item, the lock is granted.
3. If the item is currently locked, the database management system (DBMS) determines whether the request is compatible with the current lock. If an item already holding a shared lock receives another shared-lock request, the request is granted; otherwise the transaction must wait until the current lock is released.
4. A transaction continues to hold a lock until the lock is explicitly released, either during execution or when the transaction finishes (that is, aborts or commits). The effects of a write operation become visible to other transactions only once the exclusive lock has been released. (A minimal sketch of these rules follows this list.)
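Below is a minimal sketch of these rules in Python (an assumption; the source shows no code). It implements the shared/exclusive compatibility check: shared locks coexist, an exclusive lock excludes everything else, and an incompatible requester waits until the conflicting lock is released.

```python
# Sketch of a shared/exclusive (read/write) lock for one data item.
# Shared + shared is compatible; any combination involving an
# exclusive lock is not, so the requester waits, as in step 3 above.
import threading

class ReadWriteLock:
    def __init__(self):
        self._cond = threading.Condition()
        self._readers = 0      # count of shared (read) locks currently held
        self._writer = False   # True while an exclusive (write) lock is held

    def lock_shared(self):
        with self._cond:
            while self._writer:          # incompatible with an exclusive lock
                self._cond.wait()
            self._readers += 1           # compatible with other shared locks

    def unlock_shared(self):
        with self._cond:
            self._readers -= 1
            if self._readers == 0:
                self._cond.notify_all()  # wake any waiting writer

    def lock_exclusive(self):
        with self._cond:
            while self._writer or self._readers:  # item must be unlocked
                self._cond.wait()
            self._writer = True

    def unlock_exclusive(self):
        with self._cond:
            self._writer = False
            self._cond.notify_all()      # writes become visible on release
```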
Locking scheduler: The scheduler creates the order in which the operations inside simultaneous transactions are executed. To guarantee serialisability, the scheduler interleaves the execution of database operations. To identify the proper order, the scheduler bases its decisions on concurrency-control algorithms, such as timestamping or locking methods, which are discussed later in the unit. It also ensures that the computer's central processing unit (CPU) is used efficiently. The locking scheduler grants a lock request only if doing so keeps the schedule legal, and a lock table stores the information about existing locks on the data elements.
Two phase locking: Two-phase locking is another method used to ensure serialisability. A transaction follows the two-phase locking protocol if all of its lock requests occur before its first unlock operation. Two-phase locking defines the process of acquiring and giving up locks. Note that it does not prevent deadlocks, which occur when two or more transactions are each waiting for locks held by the other to be released. There are two main phases within the transaction:
Growing phase: the transaction acquires all the locks it needs without releasing any.
Shrinking phase: the transaction releases its locks and cannot acquire any new lock.

The rules followed in the two-phase locking protocol are given below:

Two transactions cannot hold conflicting locks on the same item.

No unlock operation can be performed before a lock operation in the same transaction.

No data is affected until all the required locks have been acquired.

If, during the initial stage, a transaction is not able to obtain all of its locks, it must release all of them, wait, and begin once more. Note that if two-phase locking is used by all transactions, then every schedule created by interleaving them is serialisable. To make sure that data is not accessed by a transaction until another transaction operating on the data has either committed or aborted, locks may be held until the transaction commits or terminates. This is called strict two-phase locking. It is comparable to two-phase locking except that the shrinking phase occurs during the commit or abort. One consequence of strict two-phase locking is that, by placing the second phase at the end of the transaction, all lock acquisitions and releases can be managed by the system without the transaction's involvement. A sketch of the two-phase rule follows.
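The following minimal sketch (again Python, as an assumption) shows the two-phase discipline on top of ReadWriteLock-style item locks from the earlier sketch: a transaction may acquire locks only while it has released none, and the strict variant releases everything at commit.

```python
# Sketch of the two-phase locking discipline for a single transaction.
# Growing phase: acquire only. Shrinking phase: once the first lock is
# released, no further lock may be acquired.
class TwoPhaseTransaction:
    def __init__(self):
        self._held = set()
        self._shrinking = False   # becomes True at the first unlock

    def lock(self, item):
        if self._shrinking:
            raise RuntimeError("2PL violated: cannot lock after first unlock")
        item.lock_exclusive()     # growing phase: acquire, never release
        self._held.add(item)

    def unlock(self, item):
        self._shrinking = True    # shrinking phase begins here
        item.unlock_exclusive()
        self._held.discard(item)

    def commit(self):
        # Strict 2PL: hold every lock until commit, then release them all,
        # so other transactions never see uncommitted effects.
        for item in list(self._held):
            self.unlock(item)
```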

Q3. Discuss the different services under cloud computing technology.


Ans: Several supporting services are offered under cloud computing technology. Which of these services matters to you will depend on how your cloud infrastructure is deployed.
Identity: The user needs to know where an application is running, whether on a cloud platform or on an individual in-house setup. To provide such a service, the system asks for the digital identity of the individual user; this stored information helps identify each user and their role when accessing the application. Local or in-house applications typically rely on Active Directory for these operations, whereas a cloud application identifies the user through the provider's own identity service. For example, if you sign in to Amazon cloud services you must provide Amazon-defined credentials; similarly, Google App Engine looks for Google account credentials, and a Windows Live ID gives access to Microsoft cloud applications. These credentials identify users and grant each user the services they are allowed to access. OpenID takes a different approach: it is a decentralised single-sign-on system in which one credential, controlled by the user, gives access to multiple services. Since an OpenID is in the form of a URL, it does not rely on a central authority to authenticate a user's identity, and because no specific type of authentication is required, nonstandard forms of authentication may be used, including passwords, biometrics or smart cards. Many organizations use the OpenID authentication system, including:

Microsoft
IBM
Yahoo!
Google

Integration: Applications talking among themselves has become highly common, and vendors offer all sorts of on-premises infrastructure services to accomplish it, from integration servers for complex programs to message queues. Integration is also moving to the cloud, and technologies are being developed for that use as well. For example, messages can be exchanged between applications through Amazon's Simple Queue Service (SQS). SQS replicates messages across multiple servers, so when an application reads from a queue it may not receive all pending messages in a single request, and in-order delivery is not guaranteed. These sound like shortcomings, but in fact it is these simplifications that make SQS more scalable; it also means that developers must use SQS differently from on-premises messaging. Another example of cloud-based integration is BizTalk Services. Instead of using queuing, BizTalk Services utilizes a relay service in the cloud, allowing applications to communicate through firewalls; since cloud-based integration requires communicating across different organizations, the ability to tunnel through firewalls is an important problem to solve. BizTalk Services also provides simplified workflow support, with a way for applications to register the services they expose and then let those services be invoked by other applications. Integration services in the cloud are likely to grow in prominence, especially given how important integration already is in-house. A sketch of SQS-style messaging follows.
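As a hedged illustration, the sketch below exchanges messages through SQS using Python's boto3 library (the language, region and queue name are assumptions, not prescribed by the source); it shows why consumers must poll repeatedly and tolerate reordering.

```python
# Sketch: one application sends a message through SQS, another polls
# for it. Assumes AWS credentials are already configured locally.
import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = sqs.create_queue(QueueName="demo-integration-queue")["QueueUrl"]

# Producer side: enqueue a message for another application.
sqs.send_message(QueueUrl=queue_url, MessageBody="order-created:42")

# Consumer side: a single receive may return fewer messages than exist,
# and order is not guaranteed, so real consumers poll in a loop.
resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=10,
                           WaitTimeSeconds=5)
for msg in resp.get("Messages", []):
    print(msg["Body"])
    # Deleting the message acknowledges that it was processed.
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```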
Mapping: Maps have become very popular web applications, and many organizations are keen to promote themselves by tagging their location in a map application. For example, a restaurant or hotel can show routes on its web site, and a customer can personalise the page by providing an address to get directions from their own location. Many organizations want to adopt this technology, and it is now also offered through cloud platforms: services such as Google Maps and Microsoft's Virtual Earth provide this cloud-based function, allowing developers to embed maps in web pages. These services are really just additions to existing web sites.
Payments: Another application that individuals and organizations are eager to obtain through a cloud platform is payments. You may or may not want to accept online payments from customers, depending on your organization's policy; luckily, there is no lack of ways to get paid online. You can simply sign up with a service to accept credit cards, or you can go the route of PayPal. With an online payment service, customers can send money directly to your organization.
Search: The ability to embed search options in a web site is certainly nothing new, but it is a rich feature that you might want to employ in your own web or application development. Through Microsoft's Live Search, for instance, a customer can submit a query to an on-site cloud application and get the answer back, and an organization can restrict the search to its own content. For example, suppose a company wants to develop an application that does both: the company has a database of movie information, and by typing in the name of a movie you can search the company's own database as well as the Internet, getting two types of results, one from the information stored in the company database and one from the Web. A sketch of such a combined search appears below. If you were to use a single computer to access the cloud, the requirements are pretty minimal: a user needs only the computer and an Internet connection. However, when you start planning cloud solutions for your organization, you need to spend more time identifying the hardware and infrastructure support that suits your organization with the maximum benefit.
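Here is a minimal sketch of that combined search, assuming Python with a local SQLite database; the movies table and the web_search() stub are hypothetical placeholders, since the source names no concrete API, and a real application would call a search provider such as Live Search.

```python
# Sketch: answer one query from two sources -- the company's own movie
# database and an external web search service.
import sqlite3

def search_local(db_path, title):
    """Search the company's own movie database (hypothetical schema)."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute(
            "SELECT title, year FROM movies WHERE title LIKE ?",
            (f"%{title}%",),
        ).fetchall()

def web_search(title):
    """Stub for an external search-service call (hypothetical).
    A real implementation would invoke the provider's API here."""
    return []

def combined_search(db_path, title):
    # Two types of results: company data plus results from the Web.
    return {"database": search_local(db_path, title),
            "web": web_search(title)}
```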

Q4. Discuss the merits and demerits of cloud storage.

Ans: Merits and demerits of cloud storage: There are several advantages of the cloud storage model compared to networked online storage and individual storage. Data stored on an end-user device is not easy to access remotely, whereas cloud storage data is handled by a third party; because the data resides with a third party, however, it can be more prone to theft or loss. Network storage devices, for their part, require resources and expertise that the organisation may not possess in order to address these problems. For example, creating a redundant storage array and backing it up is outside the skill set of most people, while building and maintaining a data center is beyond the core capabilities of most organizations. Choosing cloud storage is often the wiser decision, since it is more efficient and scalable than calculating the storage required and purchasing hardware for the purpose; the cloud user pays only for the quantity they use. Cloud storage is a feasible solution that can grow dynamically to meet increased demand, and economies of scale apply when many individuals and organizations store their data on the same systems managed by a single provider.
Merits:

Data can be accessed from any location that has internet connectivity.
Implementation can be simple.
It is easy and efficient to expand the service based on the organization's needs.
Various parties can be permitted to access your data.
There's no need to physically manage storage (fix bad disks, back up data) if the storage is bundled as an outsourced managed service.

Demerits:

Having difficulty accessing your data is frustrating for an individual or an organization, and losing it altogether is the worst case. The platform can also leave individuals and organisations unaware of, or unskilled in, how their data is stored and maintained.
Performance can be affected by internet traffic; the access speed of remote storage will never match local access.
Security is a very challenging concern. When a company's sensitive data is transported over an insecure network such as the internet, there's a possible risk of exposure, and even a corporate network has its own risks. Rigorous security practices may reduce this problem.
There is a chance of a dramatic price hike if usage is not monitored carefully. Out of sight should not be out of mind; cloud storage needs to be proactively supervised.

Signal Creek, a cloud storage support provider in the market, shares its experience as follows: "Based on our experience at Signal Creek with a range of our partners, the state of Cloud Storage today seems most advantageous to smaller businesses, as opposed to the larger organizations we typically work with. With the heavy demands of large enterprise storage and archiving, Cloud Storage isn't a clear choice for large organizations with enormous amounts of data, or heavy data security and compliance requirements."

Q5. Discuss the need for privacy in cloud computing.


Ans: The need for privacy in cloud computing: Television commercials advertise security products, and news programs frequently describe the latest data breach. Public perception aside, any organization has a legal obligation to ensure that the privacy of its employees and clients is protected. Laws prohibit some data from being used for secondary reasons other than the purpose for which it was originally collected. In the world of cloud computing, this becomes much more difficult, as you now have a third party operating and managing your infrastructure; by its very nature, that provider will have access to your data. Privacy notices often specify that individuals can access their data and have it deleted or modified. If the data is in a cloud provider's environment, privacy requirements still apply, and the enterprise must ensure this is allowed within a similar timeframe as if the data were stored on site. If data access can only be accomplished by personnel within the cloud provider's enterprise, you must be satisfied that they can fulfill the task as needed.
There
are a number of cloud providers that specialize in distinct markets and tailor their services
to those markets. This is likely to become more prevalent in the upcoming years. Niche
cloud providers will also likely emerge. We would expect them to charge for the special
handling and controls that are needed.
Data Location: Any business with a Web presence, and any individual who posts on social-networking sites, is recording data on one or more servers that could be located anywhere. Whether you're posting personal information to Facebook or updating your business links on LinkedIn, this data will be stored somewhere. As businesses move toward using and embracing cloud providers, the location of this data will become more important due to data privacy, legal or regulatory demands. The primary location of the data and any backup locations must be known to ensure these laws and regulations are followed; often, the backup locations still need to be determined. Amazon.com Inc., for instance, has large datacenters in both the United States and Ireland, which could cause problems if they were used as backup centers for certain types of data. The data protection
laws of the European Union (EU) member states, as well as other regions, are extremely
complex and have a number of definitive requirements. The transfer of personal data
outside these regions needs to be handled in very specific ways. For instance, the EU
requires that the collector of the data, or data controller, must inform individuals that the
data will be sent and processed in a region outside of the EU. The data controller and end
processor must also have contracts approved by the Data Protection Authority in advance.
This will have different levels of difficulty depending on the region that's processing the
data. The United States and EU have a reciprocal agreement, and the U.S. recipient only
has to self-certify its data procedures by registering with the U.S. Department of
Commerce. You need to ensure that any cloud providers you use that are outside your
jurisdiction have adequate security measures in place. This includes their primary and
backup locations, as well as any intermediate locations if data is being transferred
between jurisdictions. The cloud provider market is expanding, but there are still only a

limited number of players that can offer large-scale application and data hosting. This may
lead companies to subcontract some or all of the hosting to another company, possibly in
another region. Before entering into any agreement, be aware of any subcontracts and
perform appropriate security checks on these as well.
Some cloud providers will inevitably go bankrupt or cease operating. Access to your data
instantly becomes an issue. Depending on where the server resides, this may require you
to go through another region's jurisdiction to get the data back, and the data may be
subject to completely different access rules.
Secondary Use of Data: Depending on the type of cloud provider with whom you contract, you'll have to consider whether your data is going to be mined by the supplier or others. The use of your data may occur unbeknownst to you, or by virtue of a configuration error on the provider's part. Based on the sensitivity of your data, you may wish to ensure your contract prohibits, or at least limits, the cloud provider's ability to use this data.

This can be especially hard when you enter into a click-wrap agreement. As we all know, very few of us read the fine print at all; we just click the Agree box when it appears. In 2009, when Facebook changed its terms around data security, many people complained. However, the majority of users carried on using the service because they found it useful. It's likely your users will react in the same way, which may well give you security issues. The data you're storing in the cloud may be confidential or hold personal data for which you must ensure security. The cloud provider is likely to have full access to this data to maintain and manage your servers, and you'll need to ensure this access is not abused in any way. Although a contract may protect you legally, you'll also need to be confident that the security in place at the provider will detect any unauthorized access to your data.
Disaster Recovery: In terms of disaster recovery, you need to consider some possible scenarios: a provider might go out of business, or its datacenter could become inoperable. The main issues with the first scenario are getting your data back and relocating your cloud applications to another supplier. Set out some form of plan when you move to the cloud and revisit that plan on a regular basis, since market factors and other circumstances change quite rapidly. Consider some past incidents:

A fire in a data center in Green Bay, Wis., in 2009 led to outages for some hosted
Web sites for up to 10 days.
An outage in Fisher Plaza (Seattle) in July 2009 affected many sites, including Bing
Travel.
An explosion in The Plant data center in Houston in 2008 took nearly 9,000
customers offline for as much as a few days.
Rackspace US Inc. had an outage in its Dallas center in 2009, which lasted just
under an hour.
The 365 Main data center had outages in 2007 that affected Craigslist, Yelp and
others.

Google suffered a rolling data center blackout due to a software upgrade error in February 2009, causing the loss of mail service for many customers.
Depending on your level of preparedness, any of these events could be a mere inconvenience or an imminent threat to your business. Smaller companies are likely to be hit harder, as they may have less expertise and fewer resources. As you can see from that list of incidents, it's not just physical issues due to power or cooling failures that can take a datacenter down, but also software errors. Hackers' denial-of-service attacks against specific Web sites could also affect your site, by virtue of bandwidth issues, if the attacked site is hosted in the same datacenter.

Q6. Explain the role of IT governance. Discuss the benefits of IT governance.


Ans: IT Governance: Governance is all about applying policies relating to the use of services. It's about defining the organizing principles and rules that determine how an organization should behave. The word governance derives from the Latin word for steering; it is important to have a steering process because, well, it helps to make sure that you stay on the road. Before diving in, take a step back and look at the IT governance process in general, because many of the same principles are relevant to the cloud environment. IT manages a complex infrastructure of hardware, data, storage, and software environments. The data center is designed to use all assets efficiently while guaranteeing a certain service level to the customer, and it has teams of people responsible for managing everything from the overall facility to workloads, hardware, data, software, and network infrastructure. In addition to the data center itself, your organization may have remote facilities with technology that depends on the data center. IT management has long-established processes for managing and monitoring individual IT components, which is good. IT governance does the following:

Ensures that IT assets are implemented and used according to agreed-upon policies and procedures.
Ensures that these assets are appropriately controlled and maintained.
Ensures that these assets are providing value to the organization.

IT governance, therefore, has to include the techniques and policies that measure and control how systems are managed. However, IT doesn't stand alone in the governance process. For governance to be effective, it needs to be holistic: it is as much about organizational issues and how people work together to achieve business goals as it is about any technology. Therefore, the best kind of governance occurs when IT and the business work together. Governance defines who is responsible for what and who is allowed to take action to fix whatever needs fixing. Governance also sets down what policies people are responsible for, and it puts in place means to determine whether the responsible person or group has, in fact, acted responsibly and done the right thing. A critical part of governance is establishing organizational relationships between business and IT, as well as defining how people will work together across organizational boundaries.
IT governance usually involves establishing a board made up of business and IT

representatives. The board creates rules and processes that the organization must follow
to ensure that policies are being met. This might include

Understanding business issues such as regulatory requirements or funding for development

Establishing best practices and monitoring these processes

Responsibility for things like programming standards, proper design, reviewing, certifying, and monitoring applications from a technical perspective, and so on

A simple example of IT governance in action is making sure that IT is meeting its obligations in terms of computing uptime. This uptime obligation is negotiated between the business and IT, based on the criticality of the application to the business. IT governance is thus defined as the structure around how organizations align IT strategy with business strategy, ensuring that companies stay on track to achieve their strategies and goals, and implementing good ways to measure IT's performance. The design of this structure needs to make sure that all stakeholders' ideas and interests are considered, and that the processes support measurable output. IT governance is responsible for answering the following important questions for the organization:

How is the IT department as a whole performing and functioning?
What are the key metrics that the organization needs to have?
What return does the business get back at the end of the process?
What is the proportion of investment to the gain from the process?

Benefits of IT governance:
Transparency and Accountability:

There is transparency in the IT process and portfolio and in the costs incurred for them; this accounting covers the various services and projects under IT governance.
Provides clarity on decision-making accountabilities and on the relationship between the service provider and the user.

Return on Investment/Stakeholder Value:

Clarity and clear understanding about the overall IT costs.
Enables focused cost-cutting, with the ability to justify investment.
IT risks and returns can be viewed by the stakeholders of the system.
Better contribution to stakeholder returns.

Opportunities and Partnerships:

Gives insight into processes that might not otherwise get sponsorship and attention; essentially, visibility into non-priority processes.

Positioning of IT as a business:

Enables doing business with other companies.
Enables more professional business relationships with various partners, including suppliers and vendors.
Supports a consistent and robust approach to risk-taking.
Facilitates strategic decisions about IT's participation in the business, which may be reflected in IT strategy and vice versa.
Enables the organization to face business opportunities and market challenges in a better way.

Performance Improvement:

Clarity about whether an IT service or project supports business as usual or is intended to provide future added value.
Increased transparency will raise the bar for performance, and make clear that the bar should be continuously raised.
A focus on performance improvement will lead to the adoption of best practices.
Avoids unnecessary expenditure; expenditures are demonstrably matched to business goals.
Increased ability to benchmark.
