
A Secured Cost-effective Multi-Cloud Storage in Cloud Computing
Introduction to the area
The end of this decade is marked by a paradigm shift of industrial information technology towards a pay-per-use service business model known as cloud computing. Cloud data storage redefines the security issues targeted at a customer's outsourced data (data that is not stored on or retrieved from the customer's own servers). In this work we observed that, from a customer's point of view, relying upon a single SP for his outsourced data is not very promising. In addition, better privacy as well as data availability can be achieved by dividing the user's data block into data pieces and distributing them among the available SPs in such a way that no fewer than a threshold number of SPs can take part in successful retrieval of the whole data block. In this paper, we propose a secured cost-effective multi-cloud storage (SCMCS) model in cloud computing which provides an economical distribution of data among the available SPs in the market, offering customers data availability as well as secure storage. Our results show that our proposed model provides a better decision for customers according to their available budgets.
Literature survey
Existing System
The end of this decade is marked by a paradigm shift of industrial information technology towards a subscription-based or pay-per-use service business model known as cloud computing. This paradigm provides users with a long list of advantages, such as provisioned computing capabilities; broad, heterogeneous network access; resource pooling; and rapid elasticity with measured services. Huge amounts of data being retrieved from geographically distributed data sources, and non-localized data-handling requirements, create such a change in the technological as well as the business model. One of the prominent services offered in cloud computing is cloud data storage, in which subscribers do not have to store their data on their own servers; instead their data is stored on the cloud service provider's servers. In cloud computing, subscribers have to pay the service providers for this storage service. This service not only provides flexibility and scalability for data storage, it also provides customers with the benefit of paying only for the amount of data they need to store for a particular period of time, without any concerns for efficient storage mechanisms and maintainability issues with large amounts of data storage. In addition to these benefits, customers can easily access their data from any geographical region where the cloud service provider's network or the Internet can be accessed. Cloud data storage also redefines the security issues targeted at a customer's outsourced data (data that is not stored on or retrieved from the customer's own servers).
Problem in the existing system
In the existing system we observed that, from a customer's point of view, relying upon a single SP (service provider) for his outsourced data is not very promising. In addition, better privacy as well as data availability can be achieved by dividing the user's data block into data pieces and distributing them among the available SPs in such a way that no fewer than a threshold number of SPs can take part in successful retrieval of the whole data block.
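The division described above can be illustrated with a minimal sketch. The text does not specify a splitting algorithm, so the Java fragment below (class and method names are our own) shows only the simplest special case: an XOR-based (n, n) split, where the threshold equals the number of providers and all n pieces are required to reconstruct the block.

```java
import java.security.SecureRandom;

public class XorSplit {
    // Split a data block into n pieces; XOR-ing all n pieces recovers the block.
    // Pieces 0..n-2 are uniformly random, so any subset of fewer than n pieces
    // is statistically independent of the original data.
    static byte[][] split(byte[] data, int n) {
        SecureRandom rnd = new SecureRandom();
        byte[][] pieces = new byte[n][];
        byte[] last = data.clone();
        for (int i = 0; i < n - 1; i++) {
            pieces[i] = new byte[data.length];
            rnd.nextBytes(pieces[i]);
            for (int j = 0; j < data.length; j++) last[j] ^= pieces[i][j];
        }
        pieces[n - 1] = last;
        return pieces;
    }

    // Reconstruct the block by XOR-ing all n pieces together.
    static byte[] combine(byte[][] pieces) {
        byte[] out = new byte[pieces[0].length];
        for (byte[] p : pieces)
            for (int j = 0; j < out.length; j++) out[j] ^= p[j];
        return out;
    }
}
```

A true (k, n) threshold with k < n, as the text describes, would instead use a scheme such as Shamir's secret sharing; the XOR variant is only the degenerate case k = n.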
Available solution and their features
Since the inception of Information Technology, it has played an important part in ensuring that companies and businesses run smoothly. Information Technology has provided various types of services which are secure, reliable, and available at all times. In order to obtain the highest quality of cloud computing, business owners have turned to its characteristics and features in order to acquire this service. Cloud computing has become attractive to end users and customers because of these salient characteristics.

A key characteristic of cloud computing is its quick scalability. Upgrades and changes to the services are done instantaneously and easily, enabling the cloud computing service to be resilient. A business owner can easily request additional bandwidth, data storage, processing speed, and additional users or licenses. There is no need to do project implementation, procurement, and project costing, because the system just needs the business owner to place an order with the cloud computing vendor.
With the cloud computing service, everything is measurable. The business owner can obtain a specific number of user licenses per software, and a fixed network bandwidth and data space which fit the business's demands. This feature makes the cost of cloud computing predictable. It also defines accurately the inclusions in the service. If the business owner avails himself of such a service, his employees can experience different services online, with large data spaces, various new software, multi-value-added services, various processing techniques, and ease of access to a capable and rich network.
An important feature of cloud computing is its ability to let the business owner decide on his current and future needs. If he expands his business, he can easily request additional services which match his needs. Cloud computing also makes available various hardware and software resources; a business owner can access such resources on demand. Cloud hosting is also more reliable, because the provider manages the whole cloud, thereby allowing a business owner's website more data space, bandwidth, and other resources depending on the site's needs. Resources of websites which are not currently accessed are freed and moved to sites which are in dire need of additional bandwidth, data space, and other resources.

Backup is also sophisticated in cloud computing. A business owner need not worry about backup responsibilities, because the supplier has taken steps to put up a great system for backup. Disk failure or a server crash won't create much of a problem, because the supplier can easily restore the latest backup. Loss of data is also avoided, because the supplier must ensure that every hardware and software resource is high end, since there are a lot of clients relying on the service. Data is shared within a server, therefore the provider must ensure that each account is secured so that only authorized users in one account can access it.

In the conventional paradigm, organizations had physical possession of their data and hence had an ease of implementing better data security policies. In cloud computing, by contrast, the data is stored with an autonomous business party that provides data storage as a subscription service, and the users have to trust the cloud service provider (SP) with the security of their data. Following the pattern of the paradigm shift, the security policies have also evolved from the conventional cryptographic schemes applied in centralized and distributed data storage. In prior work, the author discussed the criticality of the privacy issues in cloud computing, and pointed out that obtaining information from a third party is much easier than from the creator himself.

Problem definition
Privacy preservation and data integrity are two of the most critical security issues related to user data.
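As a sketch of how a user might check the integrity of outsourced data, the fragment below keeps a SHA-256 digest of a block before upload and recompares it on retrieval. This is only an illustration of the integrity issue raised here, not the mechanism of the proposed model; the class and method names are our own.

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class IntegrityCheck {
    // Compute the SHA-256 digest of a data block before outsourcing it.
    static byte[] digest(byte[] data) {
        try {
            return MessageDigest.getInstance("SHA-256").digest(data);
        } catch (NoSuchAlgorithmException e) {
            // SHA-256 is mandated by the Java platform spec, so this is unreachable.
            throw new IllegalStateException("SHA-256 unavailable", e);
        }
    }

    // On retrieval, recomputing the digest detects any modification of the block.
    static boolean verify(byte[] retrieved, byte[] expected) {
        return MessageDigest.isEqual(digest(retrieved), expected);
    }
}
```

The owner stores only the small digest locally; a mismatch on retrieval signals that a provider returned altered data.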

Proposed solution
In this project, we propose an economical distribution of data among the available SPs in the market, to provide customers with data availability as well as secure storage. In our model, the customer divides his data among several SPs available in the market, based on his available budget. We also provide a decision for the customer as to which SPs he should choose to access data, with respect to the data access quality of service offered by the SPs at the location of data retrieval. Our proposed approach provides cloud computing users a decision model that offers better security by distributing the data over multiple cloud service providers in such a way that none of the SPs can successfully retrieve meaningful information from the data pieces allocated at their servers. This not only rules out the possibility of an SP misusing the customers' data and breaching the privacy of the data, but also ensures data availability with a better quality of service. In addition, by maintaining redundancy in the data distribution, we provide the user with a better assurance of availability: if a service provider suffers a service outage or goes bankrupt, the user can still access his data by retrieving it from the other service providers.

Development process

A 'waterfall' model is a means of making the development process more visible. Because of the cascade from one phase to another, this model is known as the 'waterfall model'. [Figure: the waterfall model, cascading through Requirements, System and Software Design, Implementation, Integration and System Testing, and Operation and Maintenance.] There are numerous variations of this process model. The principal stages of the model map onto the fundamental development activities:

1. Requirements analysis and definition. The system's services, constraints and goals are established by consultation with system users. They are then defined, in consultation between users and development staff, in a manner which is understandable to both.
2. System and software design. The systems design process partitions the requirements to either hardware or software systems. It establishes an overall system architecture. Software design involves representing the software system functions in a form that may be transformed into one or more executable programs.
3. Implementation and unit testing. During this stage, the software design is realized as a set of programs or program units. Unit testing involves verifying that each unit meets its specification.
4. Integration and system testing. The individual program units or programs are integrated and tested as a complete system to ensure that the software requirements have been met. After testing, the software system is delivered to the customer.
5. Operation and maintenance. The system is installed and put into practical use. Normally this is the longest life cycle phase. Maintenance involves correcting errors which were not discovered in earlier stages of the life cycle, improving the implementation of system units, and enhancing the system's services as new requirements are discovered.

Advantages of proposed solution

 The cloud user can store his data on more than one SP, according to the required level of security and his affordable budget.
 Cloud data storage also redefines the security issues targeted at the customer's outsourced data, without any concerns for efficient storage mechanisms and maintainability issues with large amounts of data storage.

Software requirement specification
Purpose and scope
We consider the storage services for cloud data storage between two entities: cloud users (U) and cloud service providers (SP). The cloud storage service is generally priced on two factors: how much data is to be stored on the cloud servers, and for how long the data is to be stored. In our model, we assume that all the data is to be stored for the same period of time. Every SP has a different level of quality of service (QoS) offered, as well as a different cost associated with it. Hence, the cloud user can store his data on more than one SP according to the required level of security and his affordable budget.

Product overview
We consider p number of cloud service providers (SP). Each available cloud service provider is associated with a QoS factor, along with its cost of providing storage service per unit of stored data (C). In our model, the cloud user can store his data on more than one SP according to the required level of security and his affordable budget.
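A budget-constrained choice among providers, each with a per-unit storage cost (C) and a QoS factor, can be sketched as follows. This Java fragment is only a hypothetical greedy heuristic, not the optimization model of the paper; the arrays, names, and the QoS-per-cost ratio rule are our assumptions.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class SpSelector {
    // Greedily pick providers with the best QoS-per-cost ratio until the
    // budget is exhausted. cost[i] and qos[i] describe provider i.
    static List<Integer> select(double[] cost, double[] qos, double budget) {
        Integer[] order = new Integer[cost.length];
        for (int i = 0; i < order.length; i++) order[i] = i;
        // Sort provider indices by descending qos/cost ratio.
        Arrays.sort(order, (a, b) -> Double.compare(qos[b] / cost[b], qos[a] / cost[a]));
        List<Integer> chosen = new ArrayList<>();
        double spent = 0;
        for (int i : order) {
            if (spent + cost[i] <= budget) {
                chosen.add(i);
                spent += cost[i];
            }
        }
        return chosen;
    }
}
```

With three providers costing 1, 2 and 3 per unit and QoS scores 3, 2 and 1, a budget of 3 selects the first two providers; the greedy rule is a stand-in for whatever decision procedure the full model prescribes.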

Functional requirement
"Cloud computing" is one of the most overused buzzwords in IT. Similar to many other new and emerging technologies, vendors are trying to cash in on the hype by over-promising and under-delivering. Developing requirements for cloud computing is similar to other projects, but also differs in many ways. Not defining a clear and complete set of requirements for cloud computing is a recipe for disappointment. Requirements are needed to ensure alignment with your business processes and compatibility with your system architecture. In addition to functional requirements, it is necessary to define many types of supplemental requirements unique to cloud computing, such as:

 Governance
 Who will own the application?
 What governance structure is needed?
 Who pays for the solution?
 What are the responsibilities of IT?
 What are the responsibilities of the Business Unit?
 What operational mechanisms are needed to support the solution?
 Accessibility
 Who is responsible for setting up new users?
 Who will have access to the system?

 Architectural integration
 How do we integrate this into our existing infrastructure?
 How will we monitor performance?
 Deployment and test responsibilities
 Who is responsible for designing and testing the solution?
 Who is responsible for training the users?
 Data integration
 How do we extract data and import it to our data warehouse?
 How do we integrate with our existing ERP systems?
 How do we integrate with our external suppliers?
 Security
 What are the security implications for our organization?
 Do we have any auditing requirements?
 Will the solution integrate with our single user log-in?

Performance requirement
Guaranteed performance
Guaranteed performance means that, in the face of an abrupt increase in traffic to an unexpected level which could degrade the cloud system's performance due to overload, the cloud system autonomously selects a provider that offers an SLA satisfying the consumer's demands and distributes its load to other cloud systems, thereby continuing to provide guaranteed service performance for consumers. It also means guaranteeing the performance of higher-priority processing by temporarily delegating the workload of low-priority processing tasks to other cloud systems.

Guaranteed availability

Guaranteed availability means that, when damaged by a disaster that threatens the continuity of the services provided by a cloud system, the cloud system recovers the services (disaster recovery) by interworking with cloud systems located in areas unaffected by the disaster, thereby continuing to provide the guaranteed services as before the disaster. If it is difficult to recover services in such a way as to provide guaranteed quality for all the services, it is important to recover services according to the priority of each service: for example, continuing to provide the guaranteed quality for high-priority services while attempting to satisfy a part of the quality requirements only on a best-effort basis.

Convenience of service cooperation
Convenience of service cooperation means improving convenience when several related procedures need to be completed, such as when applying for a passport: a cloud system cooperates with the applying service and all the related procedural services in such a way that the consumer can see all the services involved as a one-stop service.

Exception handling
Missing or defective exception handling provisions have caused many failures in critical software-intensive systems, even though these systems had undergone extensive review and test. The failures occurred under conditions that had not been covered in the reviews and tests because of incomplete or imprecise

system requirements. To curb this cause of failures, the paper addresses the generation of system requirements for exception handling. The systems most in need of precise exception handling requirements are real-time control systems, because in these there is usually no opportunity to roll back and try a second time. They are found in aerospace, process control, and increasingly in automotive applications. In keeping with the EWICS TC7 convention [1], such systems are in the following called critical systems. Software for critical systems is expected to protect against a wide range of anomalies that can include
• Unusual environmental conditions
• Erroneous inputs from operators
• Faults in the computer(s), the software and communication lines
The portions of the programs that are charged with providing this protection are called exception handling provisions or exception handlers. Their purpose is (a) to detect that an anomalous condition has been encountered and (b) to provide a recovery path that permits continued system operation, sometimes with reduced capabilities. In critical systems a large part of the software can be devoted to exception handling, and in some cases a substantial part of the failures in these systems have been traced to deficiencies in the exception handlers. Thus, exception handling is an important part of software development and of the verification and validation activities. The programmer views exception handling as a task that requires detecting an abnormal condition, stopping the normal execution, saving the current program state, and locating the resources required for continuing the execution. An example of the issues

Acceptance criteria Acceptance criteria define the boundaries of a user story. the acceptance criteria are written in simple language. Protection against spam is working An acknowledgment email is sent to the user after submitting the form. 4. A user cannot submit a form without completing all the mandatory fields 2. Information from the form is stored in the registrations database 3. the acceptance criteria could include: 1. and are used to confirm when a story is completed and working as intended. When the development team has finished working on the user story they demonstrate the functionality to the Product Owner. showing how each criterion is satisfied. As you can see. For the above example. How can a blank method throw exceptions? Java does not stop you from doing this”. it does not have any code in it. . just like the user story.dealt with at that level is the following program construct and the comment that follows it: public void someMethod() throws Exception{ } “This method is a blank one.

Including acceptance criteria as part of your user stories has several benefits:
 they get the team to think through how a feature or piece of functionality will work from the user's perspective
 they remove ambiguity from requirements
 they form the tests that will confirm that a feature or piece of functionality is working and complete

Glossary of terms
Backup: Refers to making copies of data so that these additional copies may be used to restore the original after a data loss event. These additional copies are typically called "backups." Backups are useful primarily for two purposes: (1) to restore a state following a disaster (called disaster recovery), and (2) to restore small numbers of files after they have been accidentally deleted or corrupted.
Bandwidth: The amount of data that can be transmitted in a fixed amount of time; the amount of data that can travel through a circuit. For digital devices, the bandwidth is usually expressed in bits per second (bps) or bytes per second. For analog devices, the bandwidth is expressed in cycles per second, or Hertz (Hz). The larger the bandwidth, the more data you can get through in a shorter period of time. Think of this as the difference between a small diameter hose and a larger one: you'll have the advantage in a water fight with the larger hose.
Bit: Short for binary digit, the smallest unit of information on a machine. A single bit can hold only one of two values: 0 or 1. More meaningful information is obtained by combining consecutive bits into larger units. For example, a byte is composed of 8 consecutive bits.

Blog: Short for Web log, a blog is a Web page that serves as a publicly accessible personal journal for an individual. Typically updated daily, blogs often reflect the personality of the author.
Browser: Short for Web browser, a software application used to locate and display Web pages. The most popular browser is Microsoft Internet Explorer, a graphical browser, which means that it can display graphics as well as text. In addition, most modern browsers can present multimedia information, including sound and video, though they require plug-ins for some formats.
Cloud Computing: Cloud computing is the use of computing resources (hardware and software) that are delivered as a service over a network (typically the Internet). The name comes from the use of a cloud-shaped symbol as an abstraction for the complex infrastructure it contains in system diagrams. Cloud computing entrusts remote services with a user's data, software and computation.
Convergence: The condition or process of combining complementary technologies such as telecommunications, networking and multimedia.
Cookies: The main purpose of cookies is to identify users and possibly prepare customized Web pages for them. When you enter a website using cookies, you may be asked to fill out a form providing such information as your name and interests. This information is packaged into a cookie and sent to your Web browser, which stores it for later use. The next time you go to the same website, your browser will send the cookie to the Web server. The server can use this information to present you with custom Web pages. So, for example, instead of seeing just a generic welcome page you might see a welcome page with your name on it.
Customer Relationship Management (CRM): A database that stores all customer information for easy retrieval.

Cyber attack: The leveraging of a target's computers and information technology, particularly via the Internet, to cause physical, real-world harm or severe disruption.
DHCP: Short for Dynamic Host Configuration Protocol, DHCP is software that automatically assigns temporary IP addresses to client stations logging onto an IP network. It eliminates having to manually assign permanent "static" IP addresses. DHCP software runs in servers and routers.
Digitizing: The process of converting data, such as e-mail, instant messaging, images, audio, video, etc., into a digital (binary) form.
DNS - Domain Name System: Computers on the Internet are kept separate by the use of names and addresses. These addresses are usually expressed as a sequence of four sets of numbers separated by a decimal (for example 172.18.1.0). Because this would be difficult to remember and also hard to type in without making a mistake, we use the www.address and shopping-style names for us humans; they're translated into the numbering system.
DSL: Short for Digital Subscriber Lines. DSL technologies use sophisticated modulation schemes to pack data onto copper wires. They are sometimes referred to as last-mile technologies because they are used only for connections from a telephone switching station to a home or office, not between switching stations.
Dynamic Sites: Through the use of database programming, this type of website offers more than a static site, since it can constantly be updated from anywhere there's access to the Internet. This type of site is also necessary if e-commerce is to be considered, search functions are required, and secure transactions of any type are to be conducted.
Electronic Commerce: Often referred to as simply e-commerce; business that is conducted over the Internet using any of the applications that rely on the Internet, such as e-mail and instant messaging. Electronic commerce can be

between two businesses transmitting funds, goods, services and/or data, or between a business and a customer.
Electronic discovery: Or e-discovery, a type of cyber forensics describing the process by which law enforcement can obtain, secure, search and process any electronic data for use as evidence in a legal proceeding or investigation. Electronic discovery may be limited to a single computer or a network-wide search.
Electronic Funds Transfer: Often abbreviated as EFT, it is the paperless act of transmitting money through a secure computer network. Popular EFT providers are VeriSign and PayPal.
Encryption: The translation of data into a secret code. Encryption is the most effective way to achieve data security; it is a way of coding the information in a file or e-mail message so that if it is intercepted by a third party as it travels over a network it cannot be read. Only the persons sending and receiving the information have the key, and this makes it unreadable to anyone except the intended persons.
Ethernet: The format that all computers use to talk to each other. Used in conjunction with Internet Protocol.
File Transfer Protocol (FTP): This is the system that allows you to copy files from computers around the world onto your computer. Also see Unlimited FTP.
Firewall: A system designed to prevent unauthorized access to or from a private network. Firewalls can be implemented in hardware, software, or a combination of both. Firewalls are frequently used to prevent unauthorized Internet users from accessing private networks connected to the Internet, especially intranets. All messages entering or leaving the intranet pass through the firewall, which examines each message and blocks those that do not meet the specified security criteria.
Hacker: A person who enjoys exploring the details of computers and how to stretch their capabilities. A malicious or inquisitive

meddler who tries to discover information by poking around, as opposed to most users who prefer to learn the minimum necessary; a person who enjoys learning the details of programming systems and how to stretch their capabilities.
HTML (Hypertext Mark Up Language): The programming language that's universally accepted for Internet programming.
HTTP (HyperText Transfer Protocol): The underlying protocol used by the World Wide Web. HTTP defines how messages are formatted and transmitted, and what actions Web servers and browsers should take in response to various commands. For example, when you enter a URL in your browser, this actually sends an HTTP command to the Web server directing it to fetch and transmit the requested Web page.
HTTPS: Same as above, using rules from a secure web server. These are the rules by which a web user's browser accesses files from a web server.
Hub: A common connection point for devices in a network. Hubs are commonly used to connect segments of a LAN. A hub contains multiple ports. When a packet arrives at one port, it is copied to the other ports so that all segments of the LAN can see all packets.
Identity theft: Identity theft occurs when somebody steals your name and other personal information for fraudulent purposes. Identity theft is a form of identity crime (where somebody uses a false identity to commit a crime).
Internet Protocol (IP): The format that all computers use to talk over the Internet.
IP Address: Short for Internet Protocol address, an IP address is the address of a device attached to an IP network (TCP/IP network). Every client, server, and network device must have a unique IP address for each network connection (network interface). Every IP packet contains a source IP address and a destination IP address. An IP network is somewhat similar to the

telephone network, in that you have to have the phone number to reach a destination. The big difference is that IP addresses are often temporary. Each device in an IP network is either assigned a permanent address (static IP) by the network administrator or is assigned a temporary address (dynamic IP) via DHCP software.
ISP (Internet Service Provider): A company that provides access to the Internet. For a monthly fee, the service provider gives you a software package, username, password, and access phone number. Equipped with a high-speed device, you can then log on to the Internet, browse the World Wide Web, and send and receive e-mail.
Key Words: See Meta Tags.
Local Area Network (LAN): An Ethernet switch and the cables that go to computers that are geographically close together.
Meta Tags: A list of approximately thirty (30) words that are 'Key' to helping search engines find your website. This list should be made in the order of importance, and include common misspellings, since it's a human entering the search words into a search engine. For example: Harbor Freight: Harbor, Harbour, Freight, Frieght.
Network: A group of two or more computer systems linked together. There are many types of computer networks, including local-area networks (LANs), where the computers are geographically close together (that is, in the same building), and wide-area networks (WANs), where the computers are farther apart and are connected by telephone lines or wireless radio waves.
ODBC (Open Database Connectivity): A standard database access method developed with the goal of making it possible to access any data from any application, regardless of which database management system (DBMS) is handling the data.
Phishing: A form of Internet fraud that aims to steal valuable information such as credit cards, SSNs, user IDs, and passwords. A

fake website is created that is similar to that of a legitimate organization, typically a financial institution such as a bank or insurance company. An email is sent requesting that the recipient access the fake website (which will usually be a replica of a trusted site) and enter their personal details, including security access codes.
Private Cloud: Private cloud is custom cloud infrastructure for an individual organization that can be managed internally or by an IT service company such as ITX. Unlike public cloud options, private clouds do not share server resources with other customers; they use resources dedicated to only one business, and can be hosted internally or collocated depending on the business's security and risk tolerance. A private cloud lets you capitalize on existing IT investments while making IT more dynamic. Undertaking a private cloud project requires a degree of engagement between the organization and the IT company to virtualize the business environment. Working together with a trusted company like ITX will allow each step of the engineered design to be addressed, to avoid possible vulnerabilities. At ITX we are confident, knowledgeable and can build trust with a business to ensure the best possible private cloud environment. When it is done right, it can have a positive impact on a business.
Public Cloud: Public cloud applications, storage, and other resources are made available to the general public by a service provider, such as what ITX offers in Microsoft Office 365. Generally, public cloud service providers like Microsoft own and operate the infrastructure and offer access only via the Internet. Public clouds share server resources and other standard resources such as CPU, RAM, back-up and recovery, security, and drive space. If your business is comfortable with sharing resources, then a public cloud could be the computing model for your business. Public clouds are beneficial for newer companies and start-ups not wanting the heavy expenses of IT gear, for established companies with aging infrastructure, and for businesses that see the value in not having to rethink their IT from the ground up.
Pure IP: Digital phone system that digitizes analog speech into bits to transmit them, along with data bits, over a unified network.
QoS (Quality of Service): A term used when describing IP phone systems. QoS is a guaranteed or predictable level of bandwidth, delay, jitter, transmission speed, error, and freedom from dropped packets that is necessary to ensure adequate performance of particular well understood and widely used applications.
Security: In the computer industry, refers to techniques for ensuring that data stored in a computer cannot be read or compromised by any individuals without authorization. Most security measures involve data encryption and passwords. Data encryption is the translation of data into a form that is unintelligible without a deciphering mechanism. A password is a secret word or phrase that gives a user access to a particular program or system.
Server: This is where your website programming actually resides. Think of this as one very, very huge hard drive! A computer or device on a network that manages network resources. For example, a file server is a computer and storage device dedicated to storing files; any user on the network can store files on the server. A print server is a computer that manages one or more printers, and a network server is a computer that manages network traffic. A database server is a computer system that processes database queries.
URL (Uniform Resource Locator): A URL is the global address of documents and other resources on the World Wide Web. It's your www.yourname address; this is your street address that no one else can have.
Virtual Private Network (VPN): A secure connection created over a public network by using tunneling-mode encryption.

Wide Area Network (WAN): Two or more Ethernet LANs connected with long-distance data lines.

Web App: Short for Web Application, an application that is accessed via web browser over a network such as the Internet or an intranet. It is also a computer software application that is coded in a browser-supported language (such as HTML, JavaScript, Java, etc.) and reliant on a common web browser to render the application executable.

Web-based Interface: Using any common browser such as Microsoft Internet Explorer.

Technology requirement
Hardware requirements:
 1 GB RAM
 2 GB of free hard disk space
 Intel P4 or higher
Software requirements:
 Java Development Kit 1.6.0 (jdk1.6.0) or above
 MS SQL Server 2008
Languages:
 Java 2 Enterprise Edition (J2EE)
o Java Server Pages
o Java Swing
o RMI
o JDBC
o SQL

System design
Use Case Model

Detailed design

High level design

Data Flow Diagram

Low level design

Relational model, Flowchart and pseudo code


Implementation
The goal of the coding phase is to translate the design into code in the given programming language. The coding step translates the detailed design of the system into the programming language. The translation process continues when the compiler accepts source code as input and produces machine-dependent object code as output; linking of the object files is then done to produce the machine code.

Code review and walk-through
Both reviews and walk-throughs are used to deliver correct code. The code review is done as soon as the source code is ready to be executed; this is to reduce syntax errors and also to check the coding standards. Internal documentation is another important factor, as it helps others to understand the code and the logic.

Module Specifications
The modules specified in the design are implemented using various ".jsp", ".htm" and ".class" files. These files in the source code share common routines and data structures, which establishes the hierarchical relationship.

Compilation and building the executables
The source code for the system, organized in various files, is compiled using the "javac" utility provided with Java. The application is made to run in Internet Explorer using the address "http://localhost:8080/LIC", present in the ROOT directory of the Apache Tomcat server.

Testing & Result
Testing is a process which reveals errors in a program. It is the major quality measure employed during software development. During testing, the program is executed with a set of conditions known as test cases, and the output is evaluated to determine whether the program is performing as expected. If errors are found, then the software must be debugged to locate these errors in the various programs, and corrections are then made. The program/system must be tested once again after corrections have been implemented, this time with the additional objective of finding out whether or not corrections in one part of the system have introduced any new errors elsewhere in the system. Once all errors are fixed, another objective must be accomplished: checking whether or not the system is doing what it is supposed to do. The primary and larger objective of testing is to deliver quality software. Quality software is one that is devoid of errors and meets the customer's stated requirements.
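As a minimal sketch of this test-case idea, the method below (a hypothetical stand-in, not code from the actual LIC application) is executed against a set of input conditions, and each actual output is compared with the expected output:

```java
// A test case pairs an input condition with the expected output; the program
// is executed on each case and the actual output is compared with it.
public class TestCaseDemo {

    // Hypothetical module under test: classify a policy term in years.
    static String termCategory(int years) {
        if (years <= 0) throw new IllegalArgumentException("invalid term: " + years);
        return years < 10 ? "short" : "long";
    }

    public static void main(String[] args) {
        int[]    inputs   = {1, 9, 10, 25};
        String[] expected = {"short", "short", "long", "long"};

        for (int i = 0; i < inputs.length; i++) {
            String actual  = termCategory(inputs[i]);
            String verdict = actual.equals(expected[i]) ? "PASS" : "FAIL";
            System.out.println("input=" + inputs[i] + " expected=" + expected[i]
                    + " actual=" + actual + " -> " + verdict);
        }
    }
}
```

A failing verdict on any case would trigger the debug-correct-retest cycle described above.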

So another aspect of testing is that it must also ensure that the system meets the user requirements.

Techniques of testing
 Black Box Testing
 White Box Testing
 Equivalence Partitioning
 Boundary Value Analysis
 Ad-hoc Testing

Specialized testing done for this project:
Volume Testing: This was done to determine whether or not the system is able to handle a large volume of data. The volume was representative of the real-life volume, with some provision for future growth.
Performance Testing: This is a corollary to volume testing. This testing was done to focus on the performance of the system under large volumes, and not just on the ability to handle them.
Security Testing: This attempts to verify that the protection mechanisms built into the system actually protect the system from unauthorized access.
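Boundary value analysis, one of the techniques listed above, probes inputs at, just below, and just above the edges of a valid range, where off-by-one errors tend to cluster. The range [18, 60] below (a valid policy-holder age) is an assumed constraint for illustration only:

```java
// Boundary value analysis: test the boundaries of a valid range and their
// immediate neighbours, rather than arbitrary mid-range values.
public class BoundaryValueDemo {

    // Hypothetical validation rule: age must lie in [18, 60].
    static boolean isValidAge(int age) {
        return age >= 18 && age <= 60;
    }

    public static void main(String[] args) {
        int[] probes = {17, 18, 19, 59, 60, 61};  // boundaries and neighbours
        for (int p : probes) {
            System.out.println("age=" + p + " valid=" + isValidAge(p));
        }
    }
}
```

Equivalence partitioning would complement this by picking one representative value from each partition (below range, in range, above range) instead of testing every input.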

Regression Testing: This was basically done to see, if any changes are made to one part of a program, whether they affect another part of the system, and also to check for deviations in the behavior of the unchanged parts of the system.

Unit testing
Unit testing is normally considered an adjunct to the coding step. Unit testing is responsible for testing each module in the software structure independently. After source-level code has been developed, it is reviewed and verified for correspondence to the component-level design. A review of design information provides guidance for establishing test cases that are likely to uncover errors in each of the categories.

Integration testing
Integration testing is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with interfacing. The objective is to take unit-tested components and build the program structure that has been dictated by the design. Tested modules are put together and tested in their integrity.

Testing strategies
A testing strategy is a general approach to the testing process rather than a method of devising particular system or component tests. Different strategies may be adopted depending on the type of system to be tested and the development process used.

A number of software testing strategies have been proposed. The testing strategies discussed here are:

Top-down testing, where testing starts with the most abstract component and works downwards.
Bottom-up testing, where testing starts with the fundamental components and works upwards.
Thread testing, which is used for systems with multiple processes, where the processing of a transaction threads its way through these processes.
Stress testing, which relies on stressing the system by going beyond its specified limits, and hence tests how well the system can cope with overload situations.
Back-to-back testing, which is used when several versions of a system are available; the systems are tested together and their outputs are compared.

Different testing techniques are appropriate at different points in time, and different strategies may be needed for different parts of the system and at different stages in the testing process. Large systems are usually tested using a mixture of these strategies rather than any single approach. Whatever testing strategy is adopted, it is always sensible to adopt an incremental approach to sub-system and system testing. Testing begins at the module level and works "outward" towards the integration of the entire computer-based system.
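In top-down testing, the lower-level components that a high-level module calls are replaced by stubs returning canned results, so the high-level logic can be tested before integration. The provider interface and names below are illustrative only, not the project's actual classes:

```java
// Top-down testing: exercise the high-level module first, with its
// lower-level dependencies replaced by stubs.
interface StorageProvider {
    boolean store(String block);
}

// Stub standing in for a not-yet-integrated cloud provider client.
class StubProvider implements StorageProvider {
    public boolean store(String block) {
        return true;  // canned success response
    }
}

public class TopDownDemo {

    // High-level module under test: push every block to a provider
    // and count how many were accepted.
    static int distribute(String[] blocks, StorageProvider sp) {
        int stored = 0;
        for (String b : blocks) {
            if (sp.store(b)) stored++;
        }
        return stored;
    }

    public static void main(String[] args) {
        int ok = distribute(new String[]{"b1", "b2", "b3"}, new StubProvider());
        System.out.println("blocks stored: " + ok);  // prints "blocks stored: 3"
    }
}
```

Bottom-up testing inverts this: the real low-level components are tested first, and a throwaway driver plays the role of the missing high-level module.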

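The model's central mechanism, dividing a customer's data block into pieces distributed among the available service providers, can be sketched as an (n,n) XOR split, in which all n pieces are needed to rebuild the block. This is a deliberate simplification: the SCMCS model calls for a threshold scheme (in the spirit of Shamir's secret sharing) in which any threshold number t of the n pieces suffices.

```java
import java.security.SecureRandom;
import java.util.Arrays;

// (n,n) XOR splitting: n-1 random pieces plus one piece chosen so that the
// XOR of all n pieces equals the original data. No single provider's piece
// reveals anything about the data on its own.
public class SplitDemo {

    static byte[][] split(byte[] data, int n) {
        SecureRandom rnd = new SecureRandom();
        byte[][] pieces = new byte[n][];
        byte[] last = data.clone();
        for (int i = 0; i < n - 1; i++) {
            pieces[i] = new byte[data.length];
            rnd.nextBytes(pieces[i]);             // random piece for provider i
            for (int j = 0; j < data.length; j++) {
                last[j] ^= pieces[i][j];          // fold it into the last piece
            }
        }
        pieces[n - 1] = last;                     // XOR of all pieces = data
        return pieces;
    }

    static byte[] combine(byte[][] pieces) {
        byte[] out = new byte[pieces[0].length];
        for (byte[] p : pieces) {
            for (int j = 0; j < out.length; j++) out[j] ^= p[j];
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] block = "outsourced customer data".getBytes();
        byte[][] pieces = split(block, 4);        // one piece per provider
        System.out.println("recovered: " + Arrays.equals(block, combine(pieces)));
    }
}
```

A threshold scheme improves on this sketch for availability, since the block survives the failure of up to n - t providers, which is exactly the availability property the model seeks.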
Testing and debugging must be accommodated in whatever testing strategy is adopted.

Conclusion
In this project, we proposed a secured cost-effective multi-cloud storage (SCMCS) model in cloud computing, which seeks to provide each customer with a better cloud data storage decision, taking into consideration the user's budget as well as providing him with the best quality of service (security and availability of data) offered by the available cloud service providers. By dividing and distributing the customer's data, our model has shown its ability to provide a customer with secured storage under his affordable budget.

Future Enhancement
For future work, this research should be extended by adding a data-availability mechanism to this project, so that data can still be retrieved in case of a failure while retrieving it. Even the backup data server can fail, and no remedy is given for this; this drawback can be covered in the future work of this project.

Reference books, sites and other resources

Appendices
IEEE Reference Papers
[1] Amazon, "Amazon S3 availability event: July 20, 2008", 2008. Online.
[2] "A Modern Language for Mathematical Programming".
[3] M. Arrington, "Gmail Disaster: Reports of mass email deletions", December 2006. Online.

[4] P. S. Browne, "Data privacy and integrity: an overview", in Proceedings of SIGFIDET '71, the ACM SIGFIDET (now SIGMOD) workshop, 1971.
[5] A. Cavoukian, "Privacy in clouds", Identity in the Information Society, Dec 2008.
[6] J. Du, W. Wei, X. Gu, "RunTest: assuring integrity of dataflow processing in cloud computing infrastructures", in Proceedings of the 5th ACM Symposium on Information, Computer and Communications Security (ASIACCS '10), ACM, New York, NY, USA, 293-304, 2010.
[7] R. Gellman, "Privacy in the clouds: Risks to privacy and confidentiality from cloud computing", prepared for the World Privacy Forum, online at http://www.worldprivacyforum.org, Feb 2009.
[8] The Official Google Blog, "A new approach to China: an update", online at http://googleblog.blogspot.com, March 2010.
[9] N. Gruschka, M. Jensen, "Attack surfaces: A taxonomy for attacks on cloud services", Cloud Computing (CLOUD), 2010 IEEE 3rd International Conference on, 5-10 July 2010.
[10] W. Itani, A. Kayssi, A. Chehab, "Privacy as a Service: Privacy-Aware Data Storage and Processing in Cloud Computing Architectures", Eighth IEEE International Conference on Dependable, Autonomic and Secure Computing, Dec 2009.
[11] M. Jensen, J. Schwenk, N. Gruschka, L. Lo Iacono, "On Technical Security Issues in Cloud Computing", IEEE International Conference on Cloud Computing (CLOUD II 2009), Bangalore, India, September 2009, 109-116.

[12] J. Kincaid, "MediaMax/TheLinkup Closes Its Doors", online, July 2008.
[13] B. Krebs, "Payment Processor Breach May Be Largest Ever", online at http://voices.washingtonpost.com, Jan 2009.
[14] M. Van Dijk, A. Juels, "On the Impossibility of Cryptography Alone for Privacy-Preserving Cloud Computing", HotSec 2010.
[15] P. Mell, "Draft NIST working definition of cloud computing", online at http://csrc.nist.gov, 2009.
[16] P. F. Oliveira, L. Lima, T. Vinhoza, M. Médard, "Trusted storage over untrusted networks", IEEE GLOBECOM 2010, Miami, FL, USA, Dec 2010.
[17] A. Shamir, "How to share a secret", Commun. ACM 22, 11 (November 1979).
[18] S. Shin, K. Kobara, "Towards secure cloud storage", Demo for CloudCom2010, Dec 2010.
[19] C. Wang, Sherman S.-M. Chow, K. Ren, W. Lou, "Privacy-preserving public auditing for secure cloud storage", in InfoCom2010, March 2010.