Mrs. GOKILAVANI. A
Assistant Professor, Jay Sriram Group of Institution
gokisubbu@gmail.com

Dr. RAJALAKSHMI
HOD (CSE), Jay Sriram Group of Institution
mrajislm@gmail.com

M.E (CSE), Jay Sriram Group of Institution
shobijayam@gmail.com
ABSTRACT Cloud computing is an emerging technology that has recently drawn significant attention from both industry and academia. With the large volume of data stored in the cloud by many clients, deduplication is a useful and efficient technique for making data management more scalable. Data deduplication is a specialized data compression method that removes duplicate copies of repeating data in cloud storage. To protect the confidentiality of sensitive data, the convergent encryption method has been proposed. This is the first formal attempt to address the problem of secure authorized data deduplication. Our work proposes a new data deduplication construction that supports authorized duplicate checking in a public multi-cloud architecture. The results show that the proposed authorized duplicate check scheme incurs minimal overhead in a public multi-cloud architecture compared to normal operations.
KEYWORDS Proof of Ownership, Storage Cloud Service Provider, Data Sharing Scheme, Oblivious Pseudo-Random Function, Management Console Security.
I. INTRODUCTION
Cloud computing, or something "being in the cloud," is an expression used to describe a variety of computing concepts that involve a large number of computers connected through a real-time communication network such as the Internet. In science, cloud computing is a synonym for distributed computing over a network and refers to the ability to run a program on many connected computers at the same time. The phrase is also commonly used for network-based services that appear to be provided by real server hardware but are in fact served by virtual hardware, simulated by software running on one or more real machines. Such virtual servers do not physically exist and can therefore be moved around and scaled up (or down) on the fly without affecting the end user, rather like a cloud.
To make data management scalable in cloud computing, deduplication has become a well-known technique and has attracted increasing attention recently. Data deduplication is a specialized data compression technique for eliminating duplicate copies of repeating data in storage. The technique is used to improve storage utilization and can also be applied to network data transfers to reduce the number of bytes that must be sent. Instead of keeping multiple copies of data with the same content, deduplication eliminates redundant data by keeping only one physical copy and referring other redundant data to that copy. Deduplication can take place at either the file level or the block level. File-level deduplication eliminates duplicate copies of the same file; block-level deduplication eliminates duplicate blocks of data that occur in non-identical files. Traditional deduplication systems based on convergent encryption, although providing confidentiality to some extent, do not support duplicate checks with differential privileges. In other words, no differential privileges have been considered in deduplication based on the convergent encryption technique. It seems contradictory to realize both deduplication and differential-authorization duplicate checks at the same time.
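As an illustration of the ideas above, file-level deduplication with convergent encryption can be sketched as a toy in-memory model. The store, the duplicate-check tag, and the hash-derived keystream cipher below are illustrative assumptions, not the paper's implementation; a real system would use a standard cipher such as AES.

```python
import hashlib

# Toy in-memory cloud store: tag -> single physical ciphertext copy.
store = {}

def convergent_key(data: bytes) -> bytes:
    # Convergent encryption derives the key from the content itself, so
    # identical plaintexts always produce identical ciphertexts (and tags).
    return hashlib.sha256(data).digest()

def keystream_encrypt(data: bytes, key: bytes) -> bytes:
    # Illustrative hash-based XOR keystream; applying it twice decrypts.
    stream = b""
    ctr = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def upload(data: bytes) -> bytes:
    key = convergent_key(data)
    tag = hashlib.sha256(key).hexdigest()  # tag reveals only equality of content
    if tag not in store:                   # file-level duplicate check
        store[tag] = keystream_encrypt(data, key)
    return key                             # the user keeps the key locally

k1 = upload(b"quarterly report contents")
k2 = upload(b"quarterly report contents")  # duplicate: nothing new is stored
assert k1 == k2 and len(store) == 1
```

Because the key depends only on the content, two users uploading the same file derive the same tag, and the cloud keeps a single ciphertext copy.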
www.ijstre.com
Page 1
Multi Cloud Data Storage System With High Level Authorized Deduplication
Because convergent encryption is deterministic, an offline brute-force attacker can recover data falling into a known set. To address this, an architecture was proposed that provides secure deduplicated storage resisting brute-force attacks, realized in a system called DupLESS. In DupLESS, users encrypt under message-based keys obtained from a key server via an oblivious PRF protocol. This enables users to store encrypted content with an existing service, have the service perform deduplication on their behalf, and still achieve strong confidentiality guarantees.
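The key-server idea can be approximated in a few lines. Note the deliberate simplification: real DupLESS runs an oblivious PRF, so the key server never learns the message or its hash; the sketch below sends the hash directly and only illustrates the resulting key structure. The server secret and all names are assumptions.

```python
import hashlib
import hmac

# Hypothetical key server holding a long-term secret (assumption for
# illustration). In DupLESS the evaluation is oblivious; here it is not.
SERVER_SECRET = b"key-server-long-term-secret"

def key_server_eval(message_hash: bytes) -> bytes:
    # PRF under the server secret: deterministic per message, so
    # deduplication still works, but an attacker without SERVER_SECRET
    # cannot mount an offline brute-force attack on predictable plaintexts.
    return hmac.new(SERVER_SECRET, message_hash, hashlib.sha256).digest()

def derive_message_key(message: bytes) -> bytes:
    return key_server_eval(hashlib.sha256(message).digest())

# Identical messages map to identical keys, so duplicate copies still collide.
assert derive_message_key(b"payroll.xlsx") == derive_message_key(b"payroll.xlsx")
assert derive_message_key(b"payroll.xlsx") != derive_message_key(b"other.doc")
```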
However, previous deduplication systems cannot support differential-authorization duplicate checks, which are important in many applications.
- In existing data deduplication systems, the private cloud is involved as a proxy to allow data owners/users to securely perform duplicate checks with differential privileges.
- The overhead of the existing prototype is high in file upload operations.
C. Cloud Deduplication
ClouDedup is a secure and efficient storage service that ensures block-level data deduplication and data confidentiality at the same time. Although based on convergent encryption, ClouDedup remains secure thanks to the definition of a component that implements an additional encryption operation and an access control mechanism. Furthermore, since the requirement for block-level deduplication raises a problem with respect to key management, a new component is included to manage the keys of each block together with the actual deduplication operation.
The design achieves confidentiality and enables block-level data deduplication at the same time, and is built on top of convergent encryption. It was shown that block-level deduplication is worth performing instead of file-level deduplication, since the gains in storage space are not offset by the overhead of metadata management, which is minimal. Additional layers of encryption are added by the server and the optional HSM. Thanks to these components, secret keys can be generated in a hardware-dependent way by the device itself and do not need to be shared with anyone else. Because the additional encryption is symmetric, the impact on performance is negligible. The design, in which no component is completely trusted, prevents any single component from compromising the security of the whole system. The solution also prevents curious cloud storage providers from inferring the original content of stored data by observing access patterns or accessing metadata.
Furthermore, it was shown that this solution can be easily implemented with existing and widespread technologies.
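A minimal sketch of block-level deduplication with per-block convergent keys, in the spirit of the approach described above. The block size, the XOR keystream, and the per-file "recipe" metadata are illustrative assumptions; ClouDedup's additional server-side encryption layer and HSM are omitted.

```python
import hashlib

BLOCK_SIZE = 4096
block_store = {}  # tag -> one physical ciphertext copy per unique block

def keystream_encrypt(data: bytes, key: bytes) -> bytes:
    # Illustrative hash-based XOR keystream; a real system would use AES.
    stream = b""
    ctr = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def store_file(data: bytes):
    recipe = []  # per-block (tag, key) metadata the user keeps to rebuild the file
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        key = hashlib.sha256(block).digest()   # convergent per-block key
        tag = hashlib.sha256(key).hexdigest()
        if tag not in block_store:             # block-level duplicate check
            block_store[tag] = keystream_encrypt(block, key)
        recipe.append((tag, key))
    return recipe

# Two files sharing a 4 KiB block: the shared block is stored only once.
shared = b"A" * BLOCK_SIZE
store_file(shared + b"file one tail")
store_file(shared + b"file two tail")
assert len(block_store) == 3  # shared block + two distinct tail blocks
```

The point of the recipe is that the per-block key metadata is small relative to the blocks themselves, which is why the metadata overhead does not erase the storage savings.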
C. MULTI PUBLIC CLOUD SETUP
Authorized Deduplication (Token verification)
The user can perform the authorized duplicate check for a file with the public cloud system before uploading it. Based on the outcome of the duplicate check, the user either uploads the file or runs PoW. After receiving the token from the private cloud, the user sends a file duplicate-check request with the token to the S-CSP. If a file duplicate is found, the user needs to run the PoW (proof of ownership) protocol with the S-CSP (public cloud) to prove file ownership. Otherwise, if no duplicate is found, a proof from the S-CSP will be returned.
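The privilege-bound token step can be sketched as follows. The privilege keys and the HMAC token construction are assumptions for illustration; the point is only that tokens for the same file differ across privileges, so a duplicate check succeeds only within the matching privilege.

```python
import hashlib
import hmac

# Hypothetical privilege keys held by the private cloud (assumption):
# each privilege level gets its own key, so tokens are privilege-bound.
PRIVILEGE_KEYS = {"admin": b"k-admin", "staff": b"k-staff"}

def issue_token(file_hash: bytes, privilege: str) -> str:
    # Private cloud issues a duplicate-check token tied to the privilege.
    return hmac.new(PRIVILEGE_KEYS[privilege], file_hash,
                    hashlib.sha256).hexdigest()

tag_index = set()  # S-CSP side: tokens of files already stored

def duplicate_check(token: str) -> bool:
    return token in tag_index

h = hashlib.sha256(b"design.doc contents").digest()
t_staff = issue_token(h, "staff")
tag_index.add(t_staff)                               # file stored under "staff"
assert duplicate_check(t_staff)                      # same privilege: duplicate found
assert not duplicate_check(issue_token(h, "admin"))  # other privilege: no match
```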
File Upload Request/Response
After a no-duplicate response from the public cloud, the user encrypts the file to be uploaded with the convergent key using a symmetric encryption algorithm, then uploads the encrypted file to the cloud server. The convergent key is stored locally by the user.
File Download Request/Response
Suppose a user wants to download a file. The user first sends a request with the file name to the S-CSP. Upon receiving the request and the file name, the S-CSP checks whether the user is eligible to download. If the check fails, the S-CSP sends back a signal to indicate the download failure; otherwise, it returns the corresponding ciphertext. Upon receiving the encrypted data from the S-CSP, the user recovers the original file with the convergent key stored locally.
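The upload and download flows above combine into a small round-trip sketch. The in-memory store, the hash-based keystream, and the reduction of the eligibility check to a simple ownership set are all illustrative assumptions.

```python
import hashlib

cloud = {}      # S-CSP: file name -> ciphertext
owners = set()  # S-CSP: (user, name) pairs eligible to download

def keystream(data: bytes, key: bytes) -> bytes:
    # XOR keystream derived from the key; applying it twice decrypts.
    stream = b""
    ctr = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def upload(user: str, name: str, data: bytes) -> bytes:
    key = hashlib.sha256(data).digest()  # convergent key, kept locally by the user
    cloud[name] = keystream(data, key)
    owners.add((user, name))
    return key

def download(user: str, name: str, key: bytes):
    if (user, name) not in owners:       # eligibility check by the S-CSP
        return None                      # signal download failure
    return keystream(cloud[name], key)   # user decrypts with the local key

k = upload("alice", "notes.txt", b"meeting notes")
assert download("alice", "notes.txt", k) == b"meeting notes"
assert download("bob", "notes.txt", k) is None
```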
IV. CONCLUSION
In this project, the notion of authorized data deduplication was proposed to protect data security by including differential privileges of users in the duplicate check. Several new deduplication constructions supporting authorized duplicate checks in a hybrid cloud architecture were also presented, in which the duplicate-check tokens of files are generated by the private cloud server with private keys. Security analysis demonstrates that the approaches are secure against the insider and outsider attackers specified in the proposed security architecture. As a proof of concept, a prototype of the proposed authorized duplicate check scheme was implemented and experiments were conducted on it. The results show that the authorized duplicate check scheme incurs low overhead compared to convergent encryption and network transfer.