
A Secured Cost-effective Multi-Cloud Storage in Cloud Computing
Introduction to the area
The end of this decade is marked by a paradigm shift of industrial information
technology towards a pay-per-use service business model known as cloud
computing. Cloud data storage redefines the security issues targeted at the
customer's outsourced data (data that is not stored on or retrieved from the
customer's own servers). In this work we observe that, from a customer's point
of view, relying upon a single service provider (SP) for his outsourced data is
not very promising. In addition, better privacy as well as data availability
can be achieved by dividing the user's data block into data pieces and
distributing them among the available SPs in such a way that no fewer than a
threshold number of SPs can take part in successful retrieval of the whole data
block. In this paper, we propose a secured cost-effective multi-cloud storage
(SCMCS) model in cloud computing which provides an economical distribution of
data among the available SPs in the market, offering customers data
availability as well as secure storage. Our results show that our proposed
model provides a better decision for customers according to their available
budgets.
Literature survey
Existing System
The end of this decade is marked by a paradigm shift of industrial information
technology towards a subscription-based or pay-per-use service business model
known as cloud computing. This paradigm provides users with a long list of
advantages, such as on-demand provisioning of computing capabilities; broad,
heterogeneous network access; resource pooling; and rapid elasticity with
measured services. Huge amounts of data being retrieved from geographically
distributed data sources, and non-localized data-handling requirements, create
such a change in the technological as well as the business model. One of the
prominent services offered in cloud computing is cloud data storage, in which
subscribers do not have to store their data on their own servers; instead their
data is stored on the cloud service provider's servers. In cloud computing,
subscribers have to pay the service providers for this storage service. This
service not only provides flexibility and scalability for data storage, it also
provides customers with the benefit of paying only for the amount of data they
need to store for a particular period of time, without any concern for
efficient storage mechanisms and maintainability issues associated with large
amounts of data. In addition to these benefits, customers can easily access
their data from any geographical region where the cloud service provider's
network or the Internet can be accessed. Cloud data storage also redefines the
security issues targeted at the customer's outsourced data (data that is not
stored on or retrieved from the customer's own servers).
Problem in the existing system
In the existing system we observed that, from a customer's point of view,
relying upon a single SP (service provider) for his outsourced data is not very
promising. In addition, better privacy as well as data availability can be
achieved by dividing the user's data block into data pieces and distributing
them among the available SPs in such a way that no fewer than a threshold
number of SPs can take part in successful retrieval of the whole data block.
Available solutions and their features
Since the inception of Information Technology, it has played an important part
in ensuring that companies and businesses run smoothly. Information Technology
has provided various types of services which are secure, reliable, and
available at all times. In order to obtain the highest quality of cloud
computing, business owners have examined its characteristics and features
before acquiring the service. Cloud computing has become attractive to end
users and customers because of the following salient characteristics.

A key characteristic of cloud computing is its quick scalability. Upgrades and
changes to the services are done instantaneously and easily, enabling the cloud
computing service to be resilient. A business owner can easily request
additional bandwidth, data storage, processing speed, and additional users or
licenses. There is no need for project implementation, procurement, and project
costing, because the business owner just needs to place an order with the cloud
computing vendor.
With the cloud computing service, everything is measurable. The business owner
can obtain a specific number of user licenses per software, and a fixed network
bandwidth and data space which fit the business's demands. This feature makes
the cost of cloud computing predictable. It also defines accurately what is
included in the service. If the business owner avails of such a service, his
employees can experience different services online, with large data spaces,
various new software, multi-value added services, various processing
techniques, and easy access to a capable and rich network.
An important feature of cloud computing is its ability to let the business
owner decide on his current and future needs. If he expands his business, he
can easily request additional services which match his needs. Cloud computing
also makes various hardware and software resources available; a business owner
can access such resources on demand. Cloud hosting is also more reliable
because the provider manages the whole cloud, thereby allowing a business
owner's website more data space, bandwidth, and other resources depending on
the site's needs. Resources of websites which are not currently accessed are
freed and moved to sites which are in dire need of additional bandwidth, data
space, and other resources.

Backup is also sophisticated in cloud computing. A business owner need not
worry about backup responsibilities because the supplier has taken steps to put
up a great system for backup. Disk failure or a server crash won't create much
of a problem because the supplier can easily restore the latest backup. Loss of
data is also avoided because the supplier must ensure that every hardware and
software resource is high end, since there are a lot of clients relying on the
service. Data is shared within a server, therefore the provider must ensure
that each account is secured and that only authorized users of an account can
access it.

Problem definition
Privacy preservation and data integrity are two of the most critical security
issues related to user data. In the conventional paradigm, organizations had
physical possession of their data and hence had an ease of implementing better
data security policies. But in the case of cloud computing, the data is stored
with an autonomous business party that provides data storage as a subscription
service, and the users have to trust the cloud service provider (SP) with the
security of their data. In the literature, the author discussed the criticality
of the privacy issues in cloud computing, and pointed out that obtaining
information from a third party is much easier than from the creator himself.
Following the pattern of the paradigm shift, the security policies also evolved
from the conventional cryptographic schemes applied in centralized and
distributed data storage.

Proposed solution
In this project, we proposed an economical distribution of data among the
available SPs in the market, to provide customers with data availability as
well as secure storage. Our proposed approach will provide the cloud computing
users with a decision model that provides better security by distributing the
data over multiple cloud service providers in such a way that none of the SPs
can successfully retrieve meaningful information from the data pieces allocated
at their servers. This not only rules out the possibility of an SP misusing the
customers' data and breaching the privacy of data, but can also ensure data
availability with a better quality of service. In our model, if a service
provider suffers a service outage or goes bankrupt, the user can still access
his data by retrieving it from other service providers; in this case, by
maintaining redundancy in the data distribution, we provide the user with
better assurance of availability of data. Also, for enabling data privacy, we
provide a decision for the customer, based on his available budget, as to which
SPs he must choose to access data, with respect to the data access quality of
service offered by the SPs at the location of data retrieval.
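As a rough illustration of the splitting idea described above (not the exact
scheme used in this model), the following Java sketch divides a data block into
n pieces such that all n pieces are needed to rebuild it; a true k-of-n
threshold, as described in the introduction, would instead use a scheme such as
Shamir's secret sharing [17]. The class and method names (DataSplitter, split,
combine) are hypothetical.

    import java.security.SecureRandom;

    /**
     * Minimal illustration: split a block into n pieces so that all n are
     * required to rebuild it (an n-of-n split). Purely a sketch.
     */
    public class DataSplitter {

        // Split 'block' into n pieces: n-1 random pieces plus one XOR piece.
        public static byte[][] split(byte[] block, int n) {
            SecureRandom rnd = new SecureRandom();
            byte[][] pieces = new byte[n][block.length];
            for (int i = 0; i < n - 1; i++) {
                rnd.nextBytes(pieces[i]);            // random "shares"
            }
            for (int j = 0; j < block.length; j++) { // last share = block XOR all others
                byte b = block[j];
                for (int i = 0; i < n - 1; i++) {
                    b ^= pieces[i][j];
                }
                pieces[n - 1][j] = b;
            }
            return pieces;
        }

        // Rebuild the block by XOR-ing every piece retrieved from the SPs.
        public static byte[] combine(byte[][] pieces) {
            byte[] block = new byte[pieces[0].length];
            for (byte[] piece : pieces) {
                for (int j = 0; j < block.length; j++) {
                    block[j] ^= piece[j];
                }
            }
            return block;
        }
    }

No single piece reveals the original block, which mirrors the privacy argument
made above, while retrieval requires cooperation of the storing providers.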

Development process
A 'waterfall' model is a means of making the development process more visible.
Because of the cascade from one phase to another, this model is known as the
'waterfall model'. The principal stages of the model map onto the fundamental
development activities: requirements, system and software design,
implementation, integration and system testing, and operation and maintenance.
There are numerous variations of this process model.

1. Requirements analysis and definition. The system's services, constraints and
goals are established by consultation with system users. Both users and
development staff then define them in a manner which is understandable to both.
2. System and software design. The systems design process partitions the
requirements to either hardware or software systems. It establishes an overall
system architecture. Software design involves representing the software system
functions in a form that may be transformed into one or more executable
programs.
3. Implementation and unit testing. During this stage, the software design is
realized as a set of programs or program units. Unit testing involves verifying
that each unit meets its specification.
4. Integration and system testing. The individual program units or programs are
integrated and tested as a complete system to ensure that the software
requirements have been met. After testing, the software system is delivered to
the customer.
5. Operation and maintenance. Normally this is the longest life cycle phase.
The system is installed and put into practical use. Maintenance involves
correcting errors which were not discovered in earlier stages of the life
cycle, improving the implementation of system units and enhancing the system's
services as new requirements are discovered.

Advantages of proposed solution

Software requirement specification
Purpose and scope
We consider the storage services for cloud data storage between two entities:
cloud users (U) and cloud service providers (SP). The cloud storage service is
generally priced on two factors: how much data is to be stored on the cloud
servers, and for how long the data is to be stored. In our model, we assume
that all the data is to be stored for the same period of time.
Product overview
We consider p cloud service providers. Each available cloud service provider is
associated with a quality of service (QoS) factor, along with its cost of
providing storage service per unit of stored data (C). Every SP has a different
level of quality of service offered as well as a different cost associated with
it. Hence, the cloud user can store his data on more than one SP according to
the required level of security and his affordable budget, without any concerns
for efficient storage mechanisms and maintainability issues with large amounts
of data storage. Cloud data storage also redefines the security issues targeted
at the customer's outsourced data.
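To make the storage decision just described more concrete, here is a hedged
sketch of one possible allocation rule: assign one SP per data piece, preferring
higher QoS, while the total cost stays within the user's budget. The Provider
class and the greedy rule are illustrative assumptions only, not the actual
optimization used in this work.

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;

    public class AllocationSketch {

        static class Provider {
            final String name;
            final double qos;          // quality-of-service factor of this SP
            final double costPerUnit;  // cost C per unit of stored data
            Provider(String name, double qos, double costPerUnit) {
                this.name = name;
                this.qos = qos;
                this.costPerUnit = costPerUnit;
            }
        }

        /** Pick one SP per data piece, best QoS first, while the budget allows. */
        static List<Provider> choose(List<Provider> sps, int pieces, double budget) {
            List<Provider> sorted = new ArrayList<Provider>(sps);
            Collections.sort(sorted, new Comparator<Provider>() {
                public int compare(Provider a, Provider b) {
                    return Double.compare(b.qos, a.qos);   // highest QoS first
                }
            });
            List<Provider> chosen = new ArrayList<Provider>();
            double spent = 0.0;
            for (Provider p : sorted) {
                if (chosen.size() == pieces) {
                    break;                                 // every piece has a home
                }
                if (spent + p.costPerUnit <= budget) {     // can we still afford this SP?
                    chosen.add(p);
                    spent += p.costPerUnit;
                }
            }
            return chosen; // fewer entries than 'pieces' means the budget was too small
        }
    }

A real decision model would also weigh the retrieval QoS at the user's
location, as described above; this sketch only shows the budget-bounded choice.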

Functional requirement
"Cloud computing" is one of the most overused buzzwords in IT. Similar to many
other new and emerging technologies, vendors are trying to cash in on the hype
by over-promising and under-delivering. Developing requirements for cloud
computing is similar to other projects, but also differs in many ways.
Requirements are needed to ensure alignment with your business processes and
compatibility with your system architecture. Not defining a clear and complete
set of requirements for cloud computing is a recipe for disappointment. In
addition to functional requirements, it is necessary to define many types of
supplemental requirements unique to cloud computing, such as:
 Governance
 Who will own the application?
 What governance structure is needed?
 Who pays for the solution?
 What are the responsibilities of IT?
 What are the responsibilities of the Business Unit?
 What operational mechanisms are needed to support the solution?
 Accessibility
 Who is responsible for setting up new users?
 Who will have access to the system?

 Architectural integration
 How do we integrate this into our existing infrastructure?
 How will we monitor performance?
 Deployment and test responsibilities
 Who is responsible for designing and testing the solution?
 Who is responsible for training the users?
 Data integration
 How do we extract data and import it into our data warehouse?
 How do we integrate with our existing ERP systems?
 How do we integrate with our external suppliers?
 Security
 What are the security implications for our organization?
 Do we have any auditing requirements?
 Will the solution integrate with our single user log-in?

Performance requirement
Guaranteed performance
Guaranteed performance means that, in the face of an abrupt increase in traffic
to an unexpected level which could degrade performance by overloading the cloud
system, a cloud system autonomously selects a provider that offers an SLA that
satisfies the consumer's demands and distributes its load to other cloud
systems, thereby continuing to provide guaranteed service performance for
consumers. It also means guaranteeing the performance of higher-priority
processing by temporarily delegating the workload of low-priority processing
tasks to other cloud systems.
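A minimal sketch of the SLA-matching step mentioned above (picking a provider
whose offer satisfies the consumer's demands before delegating load) might look
as follows; the SlaOffer fields and the selection rule are assumptions for
illustration, not part of any specific cloud provider's API.

    import java.util.List;

    public class SlaSelector {

        static class SlaOffer {
            final String provider;
            final double maxLatencyMs;  // latency the provider promises not to exceed
            final double availability;  // e.g. 0.999
            SlaOffer(String provider, double maxLatencyMs, double availability) {
                this.provider = provider;
                this.maxLatencyMs = maxLatencyMs;
                this.availability = availability;
            }
        }

        /** Returns the first offer that satisfies both demands, or null if none does. */
        static SlaOffer select(List<SlaOffer> offers,
                               double demandedLatencyMs,
                               double demandedAvailability) {
            for (SlaOffer o : offers) {
                if (o.maxLatencyMs <= demandedLatencyMs
                        && o.availability >= demandedAvailability) {
                    return o;  // provider whose SLA meets the consumer's demands
                }
            }
            return null;       // no provider qualifies; the load cannot be delegated
        }
    }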

Guaranteed availability
Guaranteed availability means that, when damaged by a disaster that threatens
the continuity of the services provided by a cloud system, the cloud system
recovers the services (disaster recovery) by interworking with cloud systems
located in areas unaffected by the disaster, thereby continuing to provide the
guaranteed services as before the disaster. If it is difficult to recover
services in such a way as to provide guaranteed quality for all the services,
it is important to recover services according to the priority of each service,
for example continuing to provide the guaranteed quality for high-priority
services while attempting to satisfy part of the quality requirements only on a
best-effort basis.

Convenience of service cooperation
Convenience of service cooperation means improving convenience so that, when
several related procedures need to be completed, such as when applying for a
passport, a cloud system coordinates the applying service and all the related
procedural services in such a way that the consumer sees all the services
involved as a one-stop service.

Exception handling
Missing or defective exception handling provisions have caused many failures in
critical software-intensive systems even though they had undergone extensive
review and test. The failures occurred under conditions that had not been
covered in the reviews and tests because of incomplete or imprecise system
requirements.

To curb this cause of failures, the paper addresses the generation of system
requirements for exception handling. The systems most in need of precise
exception handling requirements are real-time control systems, because in these
there is usually no opportunity to roll back and try a second time. They are
found in aerospace, process control, and increasingly in automotive
applications. In keeping with the EWICS TC7 convention [1], such systems are in
the following called critical systems. Software for critical systems is
expected to protect against a wide range of anomalies that can include
• Unusual environmental conditions
• Erroneous inputs from operators
• Faults in the computer(s), the software and communication lines
The portions of the programs that are charged with providing this protection
are called exception handling provisions or exception handlers. Their purpose
is (a) to detect that an anomalous condition has been encountered and (b) to
provide a recovery path that permits continued system operation, sometimes with
reduced capabilities. In critical systems a large part of the software can be
devoted to exception handling, and in some cases a substantial part of the
failures in these systems have been traced to deficiencies in the exception
handlers. Thus, exception handling is an important part of software development
and of the verification and validation activities. The programmer views
exception handling as a task that requires detecting an abnormal condition,
saving the current program state, stopping the normal execution, and locating
the resources required for continuing the execution.

An example of the issues dealt with at that level is the following program
construct and the comment that follows it:

    public void someMethod() throws Exception {
    }

"This method is a blank one; it does not have any code in it. How can a blank
method throw exceptions? Java does not stop you from doing this."

Acceptance criteria
Acceptance criteria define the boundaries of a user story, and are used to
confirm when a story is completed and working as intended. Just like the user
story, the acceptance criteria are written in simple language. For the above
example, the acceptance criteria could include:
1. A user cannot submit a form without completing all the mandatory fields
2. Information from the form is stored in the registrations database
3. Protection against spam is working
4. An acknowledgment email is sent to the user after submitting the form.
When the development team has finished working on the user story, they
demonstrate the functionality to the Product Owner, showing how each criterion
is satisfied.

Including acceptance criteria as part of your user stories has several
benefits:
 they get the team to think through how a feature or piece of functionality
will work from the user's perspective
 they remove ambiguity from requirements
 they form the tests that will confirm that a feature or piece of
functionality is working and complete.

Glossary of terms
Backup: Refers to making copies of data so that these additional copies may be
used to restore the original after a data loss event. These additional copies
are typically called "backups." Backups are useful primarily for two purposes:
(1) to restore a state following a disaster (called disaster recovery), and
(2) to restore small numbers of files after they have been accidentally deleted
or corrupted.
Bandwidth: The amount of data that can be transmitted in a fixed amount of
time; the amount of data that can travel through a circuit. For digital
devices, the bandwidth is usually expressed in bits per second (bps) or bytes
per second. For analog devices, the bandwidth is expressed in cycles per
second, or Hertz (Hz). Think of this as the difference between a small diameter
hose and a larger one: the larger the bandwidth, the more data you can get
through in a shorter period of time. You'll have the advantage in a water fight
with the larger hose.
Bit: Short for binary digit, the smallest unit of information on a machine. A
single bit can hold only one of two values: 0 or 1. More meaningful information
is obtained by combining consecutive bits into larger units. For example, a
byte is composed of 8 consecutive bits.

Blog: Short for Web log, a blog is a Web page that serves as a publicly
accessible personal journal for an individual. Typically updated daily, blogs
often reflect the personality of the author.
Browser: Short for Web browser, a software application used to locate and
display Web pages. The most popular browser is Microsoft Internet Explorer, a
graphical browser, which means that it can display graphics as well as text. In
addition, most modern browsers can present multimedia information, including
sound and video, though they require plug-ins for some formats.
Cloud Computing: Cloud computing is the use of computing resources (hardware
and software) that are delivered as a service over a network (typically the
Internet). Cloud computing entrusts remote services with a user's data,
software and computation. The name comes from the use of a cloud-shaped symbol
as an abstraction for the complex infrastructure it contains in system
diagrams.
Convergence: The condition or process of combining complementary technologies
such as telecommunications, networking and multimedia.
Cookies: The main purpose of cookies is to identify users and possibly prepare
customized Web pages for them. When you enter a website using cookies, you may
be asked to fill out a form providing such information as your name and
interests. This information is packaged into a cookie and sent to your Web
browser, which stores it for later use. The next time you go to the same
website, your browser will send the cookie to the Web server. The server can
use this information to present you with custom Web pages. So, for example,
instead of seeing just a generic welcome page you might see a welcome page with
your name on it.
Customer Relationship Management (CRM): A database that stores all customer
information for easy retrieval.

Cyber attack: The leveraging of a target's computers and information
technology, particularly via the Internet, to cause physical, real-world harm
or severe disruption.
DHCP: Short for Dynamic Host Configuration Protocol, software that
automatically assigns temporary IP addresses to client stations logging onto an
IP network. It eliminates having to manually assign permanent "static" IP
addresses. DHCP software runs in servers and routers.
Digitizing: The process of converting data, such as images, audio and video,
into a digital (binary) form.
DNS - Domain Name System: Computers on the Internet are kept separate by the
use of names and addresses. These addresses are usually expressed as a sequence
of four sets of numbers separated by a decimal (for example 172.18.1.0).
Because this would be difficult to remember and also hard to type in without
making a mistake, we use the www.address.com style names for us humans. They
are translated into the numbering system.
DSL: Short for Digital Subscriber Lines. DSL technologies use sophisticated
modulation schemes to pack data onto copper wires. They are sometimes referred
to as last-mile technologies because they are used only for connections from a
telephone switching station to a home or office, not between switching
stations.
Dynamic Sites: Through the use of database programming, this type of website
offers more than a static site, since it can constantly be updated from
anywhere there is access to the Internet. This type of site is also necessary
if e-commerce is to be considered, search functions are required, and secure
transactions of any type are to be conducted.
Electronic Commerce: Often referred to as simply e-commerce, business that is
conducted over the Internet using any of the applications that rely on the
Internet, such as e-mail, instant messaging, and shopping carts. Electronic
commerce can be between two businesses transmitting funds, goods, services
and/or data, or between a business and a customer.

Electronic discovery: Or e-discovery, a type of cyber forensics that describes
the process by which law enforcement can obtain, search and process any
electronic data for use as evidence in a legal proceeding or investigation.
Electronic discovery may be limited to a single computer or a network-wide
search.
Electronic Funds Transfer: Often abbreviated as EFT, it is the paperless act of
transmitting money through a secure computer network. Popular EFT providers are
VeriSign and PayPal.
Encryption: The translation of data into a secret code; a way of coding the
information in a file or e-mail message so that if it is intercepted by a third
party as it travels over a network it cannot be read. Only the persons sending
and receiving the information have the key, and this makes it unreadable to
anyone except the intended persons. Encryption is the most effective way to
achieve data security.
Ethernet: The format that all computers use to talk to each other. Used in
conjunction with Internet Protocol.
File Transfer Protocol (FTP): This is the system that allows you to copy files
from computers around the world onto your computer. Also see Unlimited FTP.
Firewall: A system designed to prevent unauthorized access to or from a private
network. Firewalls can be implemented in both hardware and software, or a
combination of both. Firewalls are frequently used to prevent unauthorized
Internet users from accessing private networks connected to the Internet,
especially intranets. All messages entering or leaving the intranet pass
through the firewall, which examines each message and blocks those that do not
meet the specified security criteria.
Hacker: A person who enjoys exploring the details of computers and how to
stretch their capabilities; a person who enjoys learning the details of
programming systems, as opposed to most users who prefer to learn the minimum
necessary. Also, a malicious or inquisitive meddler who tries to discover
information by poking around.

HTML (Hypertext Markup Language): The programming language that is universally
accepted for Internet programming.
HTTP (HyperText Transfer Protocol): The underlying protocol used by the World
Wide Web. HTTP defines how messages are formatted and transmitted, and what
actions Web servers and browsers should take in response to various commands.
These are the rules by which a web user's browser accesses files from a web
server. For example, when you enter a URL in your browser, this actually sends
an HTTP command to the Web server directing it to fetch and transmit the
requested Web page.
HTTPS: Same as above, using rules from a secure web server.
Hub: A common connection point for devices in a network. Hubs are commonly used
to connect segments of a LAN. A hub contains multiple ports. When a packet
arrives at one port, it is copied to the other ports so that all segments of
the LAN can see all packets.
Identity theft: Identity theft occurs when somebody steals your name and other
personal information for fraudulent purposes. Identity theft is a form of
identity crime (where somebody uses a false identity to commit a crime).
Internet Protocol (IP): The format that all computers use to talk over the
Internet.
IP Address: Short for Internet Protocol address, an IP address is the address
of a device attached to an IP network (TCP/IP network). Every client, server,
and network device must have a unique IP address for each network connection
(network interface). Every IP packet contains a source IP address and a
destination IP address. An IP network is somewhat similar to the telephone
network in that you have to have the phone number to reach a destination. The
big difference is that IP addresses are often temporary: each device in an IP
network is either assigned a permanent address (static IP) by the network
administrator or is assigned a temporary address (dynamic IP) via DHCP
software.

ISP (Internet Service Provider): A company that provides access to the
Internet. For a monthly fee, the service provider gives you a software package,
username, password, and access phone number. Equipped with a high-speed device,
you can then log on to the Internet, browse the World Wide Web, and send and
receive e-mail.
Key Words: See Meta Tags.
Local Area Network (LAN): An Ethernet switch and the cables that go to
computers that are geographically close together.
Meta Tags: A list of approximately thirty (30) words that are 'key' to helping
search engines find your website. This list should be made in the order of
importance, and should include common misspellings, since it is a human
entering the search words into a search engine. For example: Harbor Freight:
Harbor, Harbour, Freight, Frieght.
Network: A group of two or more computer systems linked together. There are
many types of computer networks, including: local-area networks (LANs), where
the computers are geographically close together (that is, in the same
building), and wide-area networks (WANs), where the computers are farther apart
and are connected by telephone lines or wireless radio waves.
ODBC (Open Database Connectivity): A standard database access method developed
with the goal of making it possible to access any data from any application,
regardless of which database management system (DBMS) is handling the data.
Phishing: A form of Internet fraud that aims to steal valuable information such
as credit cards, SSNs, user IDs, and passwords. A fake website is created that
is similar to that of a legitimate organization, typically a financial
institution such as a bank or insurance company. An email is sent requesting
that the recipient access the fake website (which will usually be a replica of
a trusted site) and enter their personal details, including security access
codes.

storage. Or businesses that see the . and other resources are made available to the general public by a service provider such as what ITX offers in Microsoft Office 365. knowledgeable and can build trust with a business to ensure the best possible private cloud environment. and drive space. Unlike public cloud options. typically a financial institution such as a bank or insurance company. public cloud service providers like Microsoft own and operate the infrastructure and offer access only via Internet. then a public cloud could be the computing model for your business. Public clouds share server resources and other standard resources such as CPU. If your business is comfortable with sharing resources. including security access codes.fake website is created that is similar to that of a legitimate organization. risk tolerance. Generally. Public clouds are beneficial for newer companies and start-ups not wanting the heavy expenses of IT gear and established companies with aging infrastructure. security. An email is sent requesting that the recipient access the fake website (which will usually be a replica of a trusted site) and enter their personal details. etc. A private cloud lets you capitalize on existing IT investments while making IT more dynamic. to avoid possible vulnerabilities. back-up and recovery. it can have a positive impact on a business. Undertaking a private cloud project requires a degree of engagement between the organization and the IT company to virtualize the business environment.. At ITX we are confident. private clouds do not share server resources with other customers. When it is done right. RAM. Public Cloud: Public cloud applications. Working together with a trusted company like ITX will allow each step of the engineered design to be addressed. but use resources dedicated to only one business. and can be hosted internally or collocated depending on the businesses security and risk tolerance. Private Cloud: Private cloud is custom cloud infrastructure for an individual organization that can be managed internally or by an IT service company such as ITX.

Pure IP: Digital phone system that digitizes analog speech into bits to
transmit them along with data bits over a unified network.
QoS (Quality of Service): A term used when describing IP phone systems. QoS is
a guaranteed or predictable level of bandwidth, transmission speed, delay,
jitter, error, and freedom from dropped packets that is necessary to ensure
adequate performance of particular applications.
Security: In the computer industry, refers to techniques for ensuring that data
stored in a computer cannot be read or compromised by any individuals without
authorization. Most security measures involve data encryption and passwords.
Data encryption is the translation of data into a form that is unintelligible
without a deciphering mechanism. A password is a secret word or phrase that
gives a user access to a particular program or system.
Server: This is where your website programming actually resides. Think of this
as one very huge hard drive. A computer or device on a network that manages
network resources. For example, a file server is a computer and storage device
dedicated to storing files; any user on the network can store files on the
server. A print server is a computer that manages one or more printers, a
network server is a computer that manages network traffic, and a database
server is a computer system that processes database queries.
URL (Uniform Resource Locator): A URL is the global address of documents and
other resources on the World Wide Web. It's your www.yourname.com. This is your
street address that no one else can have.
Virtual Private Network (VPN): A secure connection created over a public
network by using tunneling-mode encryption; well understood and widely used.

Web App: Short for Web Application. An application that is accessed via a web
browser over a network such as the Internet or an intranet. It is also a
computer software application that is coded in a browser-supported language
(such as HTML, Java, JavaScript, etc.) and reliant on a common web browser to
render the application executable.
Web-based Interface: Using any common browser such as Microsoft Internet
Explorer.
Wide Area Network (WAN): Two or more Ethernet LANs connected with long-distance
data lines.

Technology requirement
Hardware requirements:
 1 GB RAM
 2 GB of free hard disk space
 Intel P4 or higher
Software requirements:
 Java Development Kit 1.6.0 (jdk1.6.0) or above
 MS SQL Server 2008
Languages:
 Java 2 Enterprise Edition (J2EE)
o Java Server Pages
o Java Swing
o RMI
o JDBC
o SQL
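As a small, hedged example of how the listed stack (JDBC against MS SQL Server
2008) could be used by the system, the sketch below reads service-provider
records from the database. The connection URL, credentials, and the
service_provider table with its columns are placeholders, not the project's
actual schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class StorageDao {

        // Placeholder connection string; real host, database and credentials differ.
        private static final String URL =
                "jdbc:sqlserver://localhost:1433;databaseName=scmcs";

        public static void listProviders() throws Exception {
            // Load the Microsoft JDBC driver (needed on older JDBC versions).
            Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
            Connection con = DriverManager.getConnection(URL, "user", "password");
            try {
                Statement st = con.createStatement();
                ResultSet rs = st.executeQuery(
                        "SELECT name, qos, cost_per_unit FROM service_provider");
                while (rs.next()) {
                    System.out.println(rs.getString("name")
                            + " qos=" + rs.getDouble("qos")
                            + " cost=" + rs.getDouble("cost_per_unit"));
                }
                rs.close();
                st.close();
            } finally {
                con.close();   // always release the connection
            }
        }
    }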

System design
Use Case Model

Detailed design
High level design

Data Flow Diagram

Low level design

Relational model, flowchart and pseudocode


Implementation
The goal of the coding phase is to translate the design into code in the given
programming language. The coding step translates the detailed design of the
system into a programming language. The translation process continues when the
compiler accepts the source code as input and produces machine-dependent object
code as output. Linking of the object files is done to produce the machine
code.

Module Specifications
The modules specified in the design are implemented using various ".jsp",
".htm" and ".class" files. These files in the source code share common routines
and data structures, to establish the hierarchical relationship. Internal
documentation is another important factor, to facilitate others to understand
the code and the logic.

Code review and walkthrough
Both reviews and walkthroughs are used to deliver correct code. The code review
is done as soon as the source code is ready to be executed; this is to reduce
syntax errors and also to check the coding standards.

Compilation and building the executables
The source code for the system, organized in various files, is compiled using
the "javac" utility provided with Java. The application is made to run in
Internet Explorer using the address "http://localhost:8080/LIC", present in the
ROOT directory of the Apache Tomcat Server.

Testing & Result
Testing is a process which reveals errors in the program. It is the major
quality measure employed during software development. During testing, the
program is executed with a set of conditions known as test cases, and the
output is evaluated to determine whether the program is performing as expected.
The primary and larger objective of testing is to deliver quality software.
Quality software is one that is devoid of errors and meets a customer's stated
requirements. If errors are found, the software must be debugged to locate
these errors in the various programs. Corrections are then made. The
program/system must be tested once again after corrections have been
implemented, this time with the additional objective of finding out whether or
not corrections in one part of the system have introduced any new errors
elsewhere in the system. Once all errors are corrected, another objective must
be accomplished: to check whether or not the system is doing what it is
supposed to do.

So another aspect of testing is that it must also ensure that the system meets
the user requirements.

Techniques of testing
 Black Box Testing
 White Box Testing
 Equivalence Partitioning
 Boundary Value Analysis
 Ad-hoc Testing

Specialized testing done for this project:
Volume Testing - This was done to determine whether or not the system is able
to handle a large volume of data. The volume was representative of the
real-life volume, with some provision for future growth.
Performance Testing - This is a corollary to volume testing. This testing was
done to focus on the performance of the system under large volumes, and not
just on the ability to handle them.
Security Testing - This attempts to verify that the protection mechanisms built
into the system actually protect the system from unauthorized access.

A review of design information provides guidance for establishing test cases that are likely to uncover errors in each of the categories. After source level code has been developed. Integration testing Tested modules are put together and tested in their integrity. Integration testing is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with interfacing.This was basically done to see if any changes are made to one part of a Program whether it affects another part of System and also to check the deviations in behavior of unchanged parts of system Unit testing Unit testing is normally considered as an adjunct to the coding step. Different strategies a may be adopted depending on the type of system to be tested and the development process used. Testing strategies A testing strategy is general approach to the testing process rather than a method of devising particular system or components tests.Regression Testing . Unit testing is responsible for testing each module in software structure independently. . The objectives are to take unit tested components and build a program structure that has been discarded by design. reviewed and verified for correspondence to component level design.

The testing strategies discussed here are:
Top-down testing, where testing starts with the most abstract component and
works downwards.
Bottom-up testing, where testing starts with the fundamental components and
works upwards.
Thread testing, which is used for systems with multiple processes, where the
processing of a transaction threads its way through these processes.
Stress testing, which relies on stressing the system by going beyond its
specified limits and hence testing how well the system can cope with overload
situations.
Back-to-back testing, which is used when versions of a system are available;
the systems are tested together and their outputs are compared.
A number of software testing strategies have been proposed. Different testing
techniques are appropriate at different points in time, and different
strategies may be needed for different parts of the system and at different
stages in the testing process. Large systems are usually tested using a mixture
of these testing strategies rather than any single approach. Whatever testing
strategy is adopted, it is always sensible to adopt an incremental approach to
sub-system and system testing. Testing begins at the module level and works
"outward" towards the integration of the entire computer-based system. The
developer of the software and an independent test group conduct the testing.
Testing and debugging must be accommodated in any testing strategy.
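As an illustration of the unit-testing step described above, here is a hedged
JUnit 4 sketch that exercises the hypothetical DataSplitter module shown in the
proposed-solution section; it assumes that class and the JUnit library are on
the test classpath.

    import static org.junit.Assert.assertArrayEquals;
    import org.junit.Test;

    public class DataSplitterTest {

        @Test
        public void splitThenCombineRestoresOriginalBlock() {
            byte[] block = "customer data block".getBytes();
            // Split across four (illustrative) service providers, then rebuild.
            byte[][] pieces = DataSplitter.split(block, 4);
            assertArrayEquals(block, DataSplitter.combine(pieces));
        }
    }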

Conclusion
In this project, we proposed a secured cost-effective multi-cloud storage
(SCMCS) model in cloud computing, which seeks to provide each customer with a
better cloud data storage decision, taking into consideration the user's budget
as well as providing him with the best quality of service (security and
availability of data) offered by the available cloud service providers. By
dividing and distributing the customer's data, our model has shown its ability
to provide a customer with secured storage under his affordable budget.

Future Enhancement
For future work, this research should be extended by adding a mechanism that
ensures the availability of data in case the data retrieval process fails. Even
the backup data server can fail, and no remedy for this is provided here; this
drawback can be covered in the future work of this project.

Reference books, sites and other resources
1. https://www.google.co.in/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0CCIQFjAA&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D5928887&ei=ybPIVJWHN4W3mAWvwoCQBw&usg=AFQjCNFxHlaBKCW0UdD5-0TiloSTRi_Ndw&bvm=bv.84607526,d.dGc

2. https://www.google.co.in/url?sa=t&rct=j&q=&esrc=s&source=web&cd=3&cad=rja&uact=8&ved=0CDIQFjAC&url=http%3A%2F%2Fwww.ijser.org%2Fresearchpaper%255CA-Secured-Cost-Effective-Multi-Cloud-Storage-in-Cloud-Computing.pdf&ei=ybPIVJWHN4W3mAWvwoCQBw&usg=AFQjCNGKsJ_qtmH6bq0zuq_cPF156bFZ9g&bvm=bv.84607526,d.dGc
3. https://www.google.co.in/url?sa=t&rct=j&q=&esrc=s&source=web&cd=6&cad=rja&uact=8&ved=0CEcQFjAF&url=http%3A%2F%2Fijarcet.org%2Fwp-content%2Fuploads%2FIJARCET-VOL-2-ISSUE-4-1405-1409.pdf&ei=ybPIVJWHN4W3mAWvwoCQBw&usg=AFQjCNFkDAaAFwg2t4Vq79js9Mx593pCqw&bvm=bv.84607526,d.dGc

Appendices
IEEE Reference Papers
[1] Amazon.com, "Amazon S3 availability event: July 20, 2008", online at
http://status.aws.amazon.com/s3-20080720.html, 2008.
[2] "A Modern Language for Mathematical Programming", online at
http://www.ampl.com.
[3] M. Arrington, "Gmail Disaster: Reports of mass email deletions", online at
http://www.techcrunch.com/2006/12/28/gmail-disasterreportsofmass-email-deletions/,
December 2006.

[4] P. S. Browne, "Data privacy and integrity: an overview", in Proceedings of
SIGFIDET '71, the ACM SIGFIDET (now SIGMOD) Workshop, New York, NY, USA, ACM,
1971.
[5] A. Cavoukian, "Privacy in clouds", Identity in the Information Society,
Dec 2008.
[6] J. Du, W. Wei, X. Gu, T. Yu, "RunTest: assuring integrity of dataflow
processing in cloud computing infrastructures", in Proceedings of the 5th ACM
Symposium on Information, Computer and Communications Security (ASIACCS '10),
293-304.
[7] R. Gellman, "Privacy in the clouds: Risks to privacy and confidentiality
from cloud computing", prepared for the World Privacy Forum, online at
http://www.worldprivacyforum.org/pdf/WPF Cloud Privacy Report.pdf, Feb 2009.
[8] The Official Google Blog, "A new approach to China: an update", online at
http://googleblog.blogspot.com/2010/03/newapproach-to-chinaupdate.html,
March 2010.
[9] N. Gruschka, M. Jensen, "Attack surfaces: A taxonomy for attacks on cloud
services", Cloud Computing (CLOUD), 2010 IEEE 3rd International Conference on,
5-10 July 2010.
[10] W. Itani, A. Kayssi, A. Chehab, "Privacy as a Service: Privacy-Aware Data
Storage and Processing in Cloud Computing Architectures," Eighth IEEE
International Conference on Dependable, Autonomic and Secure Computing,
Dec 2009.
[11] M. Jensen, J. Schwenk, N. Gruschka, L. L. Iacono, "On Technical Security
Issues in Cloud Computing", IEEE International Conference on Cloud Computing
(CLOUD II 2009), Bangalore, India, September 2009, 109-116.

[12] J. Kincaid, "MediaMax/TheLinkup Closes Its Doors", online at
http://www.techcrunch.com/2008/-7/10/mediamaxthelinkupcloses-itsdorrs/,
July 2008.
[13] B. Krebs, "Payment Processor Breach May Be Largest Ever", online at
http://voices.washingtonpost.com/securityfix/2009/01/payment processor breach
may b.html, Jan. 2009.
[14] M. Dijk, A. Juels, "On the Impossibility of Cryptography Alone for
Privacy-Preserving Cloud Computing", HotSec 2010.
[15] P. Mell, T. Grance, "Draft NIST working definition of cloud computing",
online at http://csrc.nist.gov/groups/SNS/cloud-computing/index.html,
referenced on June 3rd, 2009.
[16] P. F. Oliveira, L. Lima, T. T. V. Vinhoza, J. Barros, M. Médard, "Trusted
storage over untrusted networks", IEEE GLOBECOM 2010, Miami, FL, USA, Dec 2010.
[17] A. Shamir, "How to share a secret", Commun. ACM 22, 11 (November 1979).
[18] S. H. Shin, K. Kobara, "Towards secure cloud storage", Demo for
CloudCom 2010.
[19] C. Wang, Sherman S. M. Chow, Q. Wang, K. Ren, W. Lou, "Privacy-preserving
public auditing for secure cloud storage", in InfoCom 2010, IEEE, March 2010.