
CHAPTER 1 INTRODUCTION

1.1 General

The frequent attacks on network infrastructure, using various forms of denial-of-service (DoS) attacks and worms, have led to an increased need for techniques to analyze and monitor network traffic. If efficient analysis tools were available, it could become possible to detect attacks and anomalies, and to take action to suppress them before they have had much time to propagate across the network. In this project, we study the possibilities of traffic-analysis-based mechanisms for attack and anomaly detection.

The motivation for this work came from the need to reduce the likelihood that an attacker may hijack campus machines to stage an attack on a third party. A campus may want to prevent or limit misuse of its machines in staging attacks, and possibly limit its liability from such attacks. In particular, we study the utility of observing packet header data of outgoing traffic, such as destination addresses, port numbers, and the number of flows, in order to detect attacks and anomalies originating from the campus at the campus edge. Detecting anomalies and attacks close to the source allows us to limit the potential damage close to the attacking machines. Traffic monitoring close to the source may enable the network operator to identify potential anomalies more quickly and allow better control of the administrative domain's resources, and attack propagation could be slowed through early detection.

Our approach passively monitors network traffic at regular intervals and analyzes it to find any abnormalities in the aggregated traffic. By observing the current traffic and correlating it with previous states of the traffic, it may be possible to see whether the current traffic is behaving in a similar manner.

Network traffic could look different because of flash crowds, changing access patterns, infrastructure problems such as router failures, and DoS attacks. In the case of bandwidth attacks, network usage may increase and abnormalities may show up in the traffic volume. Flash crowds could be observed as a sudden increase in traffic volume to a single destination, while a sudden increase of traffic on a certain port could signify the onset of an anomaly such as worm propagation. Our approach relies on analyzing packet header data in order to provide indications of possible abnormalities in the traffic.
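As a rough illustration of how a sudden per-port increase could be surfaced from packet header data, the sketch below counts packets per destination port over fixed intervals and flags any port whose count jumps well above its running average. The interval structure, threshold factor, and warm-up period are illustrative assumptions and not part of the report's implementation (which is written in C#).

```python
from collections import Counter

def port_spike_alerts(intervals, factor=5.0, warmup=3):
    """Flag destination ports whose per-interval packet count suddenly
    exceeds `factor` times their running average.

    `intervals` is a list of packet batches; each packet is a
    (dst_addr, dst_port) tuple.  All parameters are illustrative.
    """
    history = Counter()   # cumulative per-port packet counts
    seen = 0              # number of intervals already processed
    alerts = []
    for t, packets in enumerate(intervals):
        counts = Counter(port for _, port in packets)
        if seen >= warmup:
            for port, n in counts.items():
                avg = history[port] / seen
                if n > factor * max(avg, 1.0):
                    alerts.append((t, port, n))
        history.update(counts)
        seen += 1
    return alerts
```

A port that was previously quiet (for example, one used by a propagating worm) immediately exceeds the floor of one packet per interval and is reported.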

CHAPTER 2 LITERATURE REVIEW


2.1 General

General properties of network packet traffic have been studied intensively for many years, and many different analysis techniques, including wavelets, have been employed in these and other studies. The majority of these traffic analysis studies have focused on typical packet-level and end-to-end behavior. Our focus is mainly at the flow level and on identifying frequency characteristics of anomalous network traffic.

There have been many prior studies of network fault detection methods. Feather et al. use statistical deviations from normal traffic behavior to identify faults, and other work identifies faults by applying thresholds in time-series models of network traffic. These methods focus on accurate detection of deviations from normal behavior. Our work is focused on identifying anomalies by first removing the predictable, ambient part of the signal and only then employing statistical methods; wavelets are used for the former task.

Detection of black-hat activity, including denial-of-service (DoS) attacks and port scan attacks, has also been treated widely. Methods for detecting intrusions include clustering, neural networks, and Markov models.

2.2 Literature Survey


Moore et al. show that flow data can be effective for identifying DoS attacks. A number of intrusion detection tools have been developed in recent years in response to the rise in black-hat activity. An example is Bro, which provides an extensible environment for identifying intrusion and attack activity. Our work complements these tools by providing another means for identifying a variety of anomalous behaviors, including attacks.

We identify flash crowds as an important anomaly category. The events of September 11, 2001, and the inability of most online news services to deal with the offered demand, are the most extreme example of this kind of behavior. While infrastructure such as content delivery networks (CDNs) has been developed to mitigate the impact of flash crowds, almost no studies of their characteristics exist. A recent study on flash crowds is by Jung et al.; that work considers flash crowds (and DoS attacks) from the perspective of Web server logs, whereas ours is focused on network traffic. Finally, cooperative pushback has been proposed as a means for detection and control of events such as flash crowds.

Rule-based approaches, such as intrusion detection systems (IDS), try to apply previously established rules to incoming traffic to detect and identify potential DoS attacks close to the victim's network. To cope with novel attacks, however, IDS tools such as Snort need to be updated with the latest rules. This project looks at the problem of designing a generalized, measurement-based, real-time detection mechanism. Measurement-based studies have considered traffic volume and the number of flows as potential signals that can be analyzed in order to detect anomalies in network traffic; we further treat traffic header fields such as addresses and port numbers. Some prior work relies on input data from multiple sources, while our work focuses on a single link at a time.

Some approaches proactively seek methods to suppress the overflow of traffic at the source. Controls based on rate limits have been adopted, either at the source or at the destination, to reduce monopolistic consumption of available bandwidth and to diminish the effects of attacks. The apparent symptoms of a bandwidth attack may be sensed by monitoring bit rates and/or packet counts of the traffic flow, and bandwidth accounting mechanisms have been suggested to identify and contain attacks. Packeteer and others offer commercial products that can account traffic volume along multiple dimensions and allow policy-based rate control of bandwidth. Pushback mechanisms have been proposed to contain detected attacks closer to the source, and traceback has been proposed to trace the source of DDoS attacks even when the source addresses may be spoofed by the attacker. However, sophisticated low-rate attacks, which do not give rise to noticeable variance in traffic volume, could go undetected when only traffic volume is considered.

Recently, statistical analysis of aggregate traffic data has been studied. In general, the generated signal can be analyzed by employing techniques such as the FFT (Fast Fourier Transform) and wavelet transforms. An FFT of traffic arrivals may reveal inherent flow-level information through frequency analysis, and Fourier transforms and wavelets have been applied to network traffic to study its periodicity. Earlier studies of traffic volume as a signal for wavelet analysis considerably motivated our current study; a similar data structure was employed, with significant differences in the processing of collected data, the detection mechanisms, and the resulting traffic anomaly detectors. The structure of addresses at various points in the network was observed to be multi-fractal.
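The rate limiting described above can be sketched with a classic token bucket: traffic is forwarded only while tokens remain, which caps sustained bandwidth at the refill rate while still permitting short bursts. This is a minimal illustrative sketch in Python (the report's implementation is in C#, and the commercial products cited add far richer policy controls); the rate and capacity values are assumptions.

```python
class TokenBucket:
    """Minimal token-bucket rate limiter.

    `rate` is the number of tokens (here, bytes) added per second and
    `capacity` the maximum burst size.
    """
    def __init__(self, rate, capacity):
        self.rate = float(rate)
        self.capacity = float(capacity)
        self.tokens = float(capacity)   # start with a full bucket
        self.last = 0.0                 # timestamp of the last decision

    def allow(self, packet_bytes, now):
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bytes <= self.tokens:
            self.tokens -= packet_bytes
            return True
        return False
```

A bucket with `rate=100` and `capacity=200` admits a 150-byte burst immediately, rejects a further 100 bytes, and admits them again once a second of refill has elapsed.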

2.3 EXISTING SYSTEM

There is no established existing system for detecting such network traffic anomalies, so we develop a statistical analysis for detecting them.

2.4 PROPOSED SYSTEM

In this project we detect anomalies using the following three components:

- Traffic analysis at the source
- General mechanism of the detector
- Traces

Traffic Analysis at the Source

We focus on analyzing the traffic at an egress router. Monitoring traffic at a source network enables early detection of attacks, control over the hijacking of AD (administrative domain, e.g., campus) machines, and limits on the squandering of resources. There are two kinds of filtering based on the traffic control point. Ingress filtering protects the flow of traffic entering an internal network under administrative control. Egress filtering, on the other hand, controls the flow of traffic leaving the administered network; internal machines are typically the origin of this outbound traffic from the view of an egress filter. As a result, the filtering is performed at the campus edge. Outbound filtering has been advocated for limiting the possibility of address spoofing; with such filtering in place, we can focus on the destination addresses and port numbers of the outgoing traffic for analysis purposes.
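The ingress/egress distinction can be made concrete with a small check against the campus prefix: an egress filter passes outbound packets only when the source address belongs to the campus (limiting spoofing) and the destination lies outside it. The prefix below is a documentation (RFC 5737) example, and the function is an illustrative Python sketch rather than the report's actual filter.

```python
import ipaddress

# Example campus prefix (RFC 5737 documentation range) -- an assumption.
CAMPUS_PREFIX = ipaddress.ip_network("192.0.2.0/24")

def egress_allows(src, dst):
    """Egress filter sketch: outbound traffic must originate from a campus
    address (anti-spoofing) and be destined outside the campus network."""
    return (ipaddress.ip_address(src) in CAMPUS_PREFIX
            and ipaddress.ip_address(dst) not in CAMPUS_PREFIX)
```

Packets with a non-campus (possibly spoofed) source, or with a destination still inside the campus, are not treated as outbound traffic by this check.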

General Mechanism of the Detector

The first step is a traffic parser, in which a correlation signal is generated from packet header traces or NetFlow records taken as input. Fields in the packet header, such as destination addresses and port numbers, as well as traffic volume, can be used as the signal depending on the nature of the traffic. The second step is to transform the signal using the discrete wavelet transform (DWT). Analyzing discrete domains such as address spaces and port numbers poses interesting problems for wavelet analysis, so we employ correlation in the different domains to generate a signal suitable for analysis. Finally, we detect attacks and anomalies by setting a threshold, comparing the result against historical data, and flagging anomalies through statistical analysis. We report on our results employing the correlation of destination addresses, port numbers, and the distribution of the number of flows as monitored traffic signals.
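The three detector steps (signal generation, wavelet transform, thresholding) can be sketched end to end. The report's detector is implemented in C# and its exact wavelet and threshold rule are not given here, so the Python sketch below uses a hand-rolled one-level Haar transform and a simple k-sigma rule on the detail coefficients as stand-ins.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficient lists; the signal
    length is assumed to be even."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))
        detail.append((a - b) / math.sqrt(2))
    return approx, detail

def detect(signal, k=3.0):
    """Flag positions whose Haar detail coefficient deviates more than
    `k` standard deviations from the mean detail coefficient."""
    _, detail = haar_dwt(signal)
    mean = sum(detail) / len(detail)
    std = math.sqrt(sum((d - mean) ** 2 for d in detail) / len(detail))
    return [i for i, d in enumerate(detail) if abs(d - mean) > k * std]
```

A library such as PyWavelets provides multi-level versions of this transform; the single level and the 3-sigma threshold here are purely illustrative choices.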

Traces

To verify the validity of our approach, we run our algorithm on four traces of network traffic. First, we examine our method on traces from the University of Southern California that contain real network attacks. Second, to inspect the performance of our detector on backbone links, we examine the mechanism on KREONet2 traces, covering over 230 organizations from July 21, 2003, to July 28, 2003, that contain real worm attacks. In the trace employed, there were three major attacks and a few instantaneous probe attacks, which were identified by various forensic traffic analyses in advance. Third, to compare our method with Snort, we use a live network at Texas A&M University. Fourth, to evaluate the sensitivity of our detector's performance to attacks of various configurations, we employ attack-free traces from NLANR (National Laboratory for Applied Network Research), which are later superimposed with simulated virtual attacks.

2.5 PROBLEM FORMULATION

2.5.1 Main Objectives

In our project we detect traffic anomalies by monitoring packet header data. Attacks such as denial of service have driven the development of techniques for analyzing network traffic; with an efficient analysis tool, anomalies could be detected before an attack propagates through the network. We analyze the network traffic using the correlation of destination IP addresses at the egress router. The address correlation data are transformed using the discrete wavelet transform for detecting traffic anomalies. Results from a trace-driven evaluation suggest that the proposed approach could provide an effective means of detecting anomalies close to the source. We also present a multidimensional indicator using the correlation of port numbers and the number of flows as a means of detecting anomalies.

2.5.2 Modules

- Login
- Client
- Ingress router
- Egress router
- File Sending

Login

In this module users are allowed to sign up as new users. Once a user signs in, a separate log is maintained for that user. An existing user can sign in to perform operations.

Client

The user who wants to send a file is treated as the client. Before selecting a file to send, the client has to provide his details to the server. The client is restricted from choosing a file which creates traffic in the network.

Ingress Router

Ingress filtering protects the flow of traffic entering an internal network under administrative control. Once the user signs in to the application, his details are stored in the server. After a particular file is chosen, the details of the file are gathered in order to prevent traffic.

Egress Router

In this module the activities of the user after choosing a file are checked. A separate log is created for the user, storing all the details, including the size and type of the file he chooses. If he chooses a file which may create traffic, the error count in his account is incremented.

File Sending

The server checks the size and type of the file chosen by the client. If the server finds that it may create traffic, the server requests the client to choose another file. If the client tries to send a file which may create traffic more than 3 times, the server disconnects his connection.
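The File Sending module's reject-and-disconnect behavior can be sketched as follows. The size limit and blocked extension list are invented placeholders, since the report does not specify the exact criteria by which a file is judged to create traffic, and the project itself is implemented in C#.

```python
class FileGate:
    """Server-side sketch of the File Sending module: files judged likely
    to create traffic are rejected, and more than three rejected attempts
    disconnect the client."""
    MAX_SIZE = 10 * 1024 * 1024          # assumed 10 MB size limit
    BLOCKED_TYPES = {".exe", ".bat"}     # assumed traffic-creating types

    def __init__(self):
        self.error_count = {}            # per-client error log

    def submit(self, client, name, size):
        ext = name[name.rfind("."):].lower() if "." in name else ""
        risky = size > self.MAX_SIZE or ext in self.BLOCKED_TYPES
        if not risky:
            return "transferred"
        # Risky file: log the error and decide whether to disconnect.
        self.error_count[client] = self.error_count.get(client, 0) + 1
        if self.error_count[client] > 3:
            return "disconnected"
        return "choose another file"
```

Three rejected attempts yield warnings; the fourth disconnects the client, matching the "more than 3 times" rule above.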


Module Input: The client chooses a file to send, providing his details to the server.

Module Output: If the client chooses a file which creates traffic, the server requests him to choose another file and increments his error count; otherwise the file is transferred.

2.6 SYSTEM REQUIREMENTS


2.6.1 HARDWARE REQUIREMENTS

SYSTEM: Pentium IV 2.4 GHz
HARD DISK: 40 GB
FLOPPY DRIVE: 1.44 MB
MONITOR: 15" VGA colour
MOUSE: Logitech
RAM: 256 MB
KEYBOARD: 110 keys enhanced

2.6.2 SOFTWARE REQUIREMENTS


Operating system: Windows XP Professional
Front-End: Microsoft Visual Studio .NET 2005
Database: SQL Server 2000
Coding Language: C# 2.0


2.7 LANGUAGE SPECIFICATION

XML and ADO.NET

ADO.NET leverages the power of XML to provide disconnected access to data. ADO.NET was designed hand-in-hand with the XML classes in the .NET Framework; both are components of a single architecture. ADO.NET and the XML classes in the .NET Framework converge in the DataSet object. The DataSet can be populated with data from an XML source, whether it is a file or an XML stream, and can be written as World Wide Web Consortium (W3C) compliant XML, including its schema as XML Schema definition language (XSD) schema, regardless of the source of the data in the DataSet. Because the native serialization format of the DataSet is XML, it is an excellent medium for moving data between tiers, making the DataSet an optimal choice for remoting data and schema context to and from an XML Web service. The DataSet can also be synchronized with an XmlDataDocument to provide relational and hierarchical access to data in real time.


ADO.NET Components

The ADO.NET components have been designed to factor data access from data manipulation. There are two central components of ADO.NET that accomplish this: the DataSet, and the .NET data provider, which is a set of components including the Connection, Command, DataReader, and DataAdapter objects.

The DataSet is the core component of the disconnected architecture of ADO.NET. It is explicitly designed for data access independent of any data source. As a result, it can be used with multiple and differing data sources, used with XML data, or used to manage data local to the application. The DataSet contains a collection of one or more DataTable objects made up of rows and columns of data, as well as primary key, foreign key, constraint, and relation information about the data in the DataTable objects.

The other core element of the ADO.NET architecture is the .NET data provider, whose components are explicitly designed for data manipulation and fast, forward-only, read-only access to data. The Connection object provides connectivity to a data source. The Command object enables access to database commands to return data, modify data, run stored procedures, and send or retrieve parameter information. The DataReader provides a high-performance stream of data from the data source. Finally, the DataAdapter provides the bridge between the DataSet object and the data source: it uses Command objects to execute SQL commands at the data source both to load the DataSet with data and to reconcile changes made to the data in the DataSet back to the data source. You can write .NET data providers for any data source; the .NET Framework ships with two, the SQL Server .NET Data Provider and the OLE DB .NET Data Provider. The following diagram illustrates the components of the ADO.NET architecture.

ADO.NET ARCHITECTURE

FIG NO 1:ADO.NET ARCHITECTURE


Remoting or Marshaling Data between Tiers and Clients

The design of the DataSet enables you to easily transport data to clients over the Web using XML Web services, as well as to marshal data between .NET components using .NET Remoting services. You can also remote a strongly typed DataSet in this fashion. Note that DataTable objects can also be used with remoting services, but cannot be transported via an XML Web service.

ADO.NET Platform Requirements

The Microsoft .NET Framework SDK (including ADO.NET) is supported on Microsoft Windows 2000, Microsoft Windows NT 4 with Service Pack 6a, Microsoft Windows Millennium Edition, Microsoft Windows 98, and Microsoft Windows 98 Second Edition. Use of the SQL Server .NET Data Provider or OLE DB .NET Data Provider requires the installation of Microsoft Data Access Components version 2.6 or later. To use ADO.NET, applications include the System.Data namespace.


SQL Server

Microsoft and Sybase teamed up to create SQL Server, which initially ran only on IBM's OS/2 operating system platform. After the release of version 6.5, Microsoft and Sybase separated, and SQL Server made its progress into the competitive market. The earlier SQL Server 6.5 mainly focused on database design and implementation. Version 7 was a significant release of SQL Server, in which two major changes were made:

- a new server architecture
- the inclusion of graphical tools such as the Web Assistant Wizard and Data Transformation Services

However, the new version maintains ANSI standards and 6.x compatibility. SQL Server is an SQL-compliant RDBMS; SQL-compliant means it uses the ANSI version of Structured Query Language (SQL). SQL is a set of commands that allows you to modify or retrieve information from the database. ANSI and the International Standards Organization have defined various standards for SQL, and most modern RDBMS products have their own dialect of SQL; for example, Oracle has PL/SQL. SQL Server supports the SQL-92 standard and uses a version of SQL called Transact-SQL. Client/server means that SQL Server is designed to store data in a central location (the server) and share it on demand with numerous other locations (the clients). SQL Server is also a relational database management system.


Understanding Relational Features of SQL Server

Information Representation

In SQL Server, data is represented in terms of rows and columns of a table. Data stored as a table can be easily visualized, because we encounter data stored in tables in everyday life; for example, a train or plane schedule can be referenced as a table.

Unique Definition of Rows

The relational model requires each row to be uniquely defined by at least one or more columns of a table. The unique-row requirement ensures that each row in the table can be accessed and changed independently of the other rows; a change made to one row does not affect the others. Each row of a table can be made unique by using a feature called a constraint, which is a property that can be placed on a column or set of columns in a table.

Systematic Treatment of Null Values

SQL Server, like most RDBMSs, treats null values, zeros, and blanks differently. While creating a table, one can specify whether a field allows null values or not.

High-Level Update, Insert, and Delete

In SQL Server, if a record is updated or deleted in a master table, the corresponding record in the related table is also updated or dropped. This process of ensuring that corresponding records of related tables are maintained to keep the relationship intact is called referential integrity.


Query Fundamentals

Once data is entered in a SQL Server database, you need a way to retrieve it, change it, add or insert new data, or delete existing data. In SQL Server, data stored across tables in one or more databases can be combined using a query. A query is a request for data stored in a SQL Server database: a specific demand from the user for retrieval, modification, or deletion. The query is issued by the user and processed by SQL Server. There are different ways of accessing the information in SQL Server; Transact-SQL is one of them.


Transact-SQL Programming Language

There are several methods of programming SQL Server applications. Transact-SQL is the SQL database language used by SQL Server; client applications use this language to communicate with the SQL Server. Transact-SQL is used to create and manipulate database objects as well as to insert, select, and update data. It is the standard language for communication between SQL Server and the application. The Transact-SQL language is an enhancement to Structured Query Language (SQL), the ANSI-standard relational database language, and it provides a comprehensive language for defining tables and for inserting, deleting, updating, and accessing the data in those tables.

Elements of Transact-SQL

The elements of Transact-SQL are as follows:

- Data Definition Language (DDL)
- Data Manipulation Language (DML)
- Data Control Language (DCL)


MICROSOFT SQL SERVER TOOLS

- Enterprise Manager
- Query Analyzer

Enterprise Manager

SQL Server Enterprise Manager is a graphical tool that allows easy configuration and management of Microsoft SQL Server and SQL Server objects. Enterprise Manager can be found in the SQL Server 7 program group. SQL Server Enterprise Manager can also be used to:

- manage logins, permissions, and users
- create databases
- back up databases and transaction logs
- manage tables

Query Analyzer

The SQL Server Query Analyzer allows you to create ad hoc queries and run them interactively. The Query Analyzer also has a color-coded editor to assist the user with syntax checking, as well as context-sensitive help.


VBSCRIPT

VBScript is a scripting language used mostly at the client level which has the capability of creating dynamic web pages. The essential need for VBScript arises from the following requirements:

- user interaction
- data validation
- client-side utility

FEATURES

- Interpreted language
- Object-oriented support
- Provides procedural capabilities
- Works on an event-driven model
- Embedded within HTML format
- Works on the DOM (Document Object Model)


ELEMENTS OF VBSCRIPT

- Variables
- Arrays
- Control statements
- Functions
- Objects
- Events

To program with ASP.NET, VBScript is probably the most common language. VBScript has only one data type: the Variant. The Variant is used because of its flexibility with all data types, and it is unique in the sense that it actually changes behavior depending on the type of data it is storing. The Variant uses subtypes to provide some optimization and data-type-specific functionality.


Techniques and Algorithm Used

There are two kinds of filtering based on the traffic control point. Ingress filtering protects the flow of traffic entering an internal network under administrative control, while egress filtering controls traffic leaving it. We focus on analyzing the traffic at an egress router, since monitoring traffic at a source network enables early detection of attacks.

Applications: Detecting anomalies at multiple levels has a number of advantages: (i) by setting a high threshold at each level, anomalies can be detected with high confidence; (ii) depending on the operator's filtering criteria, he or she can adjust the thresholds to trade accuracy against flexibility; and (iii) the attributes of attacks can be identified.
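The multi-level idea in (i) and (ii) can be sketched as a conjunction of per-level threshold tests over wavelet detail coefficients: requiring every level to fire raises confidence at the cost of flexibility. The threshold values are operator-chosen and purely illustrative here, and the function is a Python sketch rather than the report's C# implementation.

```python
def multilevel_alarm(detail_levels, thresholds):
    """Raise an alarm only when the anomaly signal exceeds its threshold
    at every decomposition level.

    `detail_levels` is a list of detail-coefficient lists (one per level)
    and `thresholds` the operator-tuned threshold for each level."""
    return all(
        max(abs(d) for d in level) > thr
        for level, thr in zip(detail_levels, thresholds)
    )
```

Lowering an individual level's threshold makes the detector more sensitive at that timescale; raising all of them trades flexibility for fewer false alarms, as noted above.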


2.8 SYSTEM DESIGN

Module Diagram

Start → Client → Registration and Login → Selecting file → Ingress router → Sending file → Egress router → Stop

Fig No 2. Module Diagram


UML Diagrams

Use Case Diagram

The client interacts with the server through three use cases: Registration, Select File, and Sending File, with the server storing the client details.

Fig No 3 Use Case Diagram

Class Diagram

The class diagram comprises a Client class (prov_id, prov_name, IP address; Choose file()), a Server class (file size, file type; binary reader; socket and Get IP operations; stores client data), an Ingress router class (IP address, user name; Details(), Threat(); maintains the error log), and an Egress router class (error count; Threat(), Repeated threat()).

Fig No 4 Class Diagram

Object Diagram

The object flow proceeds: Client Login → Select file → Ingress router → Sending file → Server → Egress router.

Fig No 5 Object Diagram

State Diagram

The states captured are: Client (file selection, File sending()), Server (getting IP, type and size, Binding Data()), Egress router (user name, File type(), File length()), and Ingress router (error log, Threat(), Repeat threat()).

Fig No 6 State Diagram

Activity Diagram


Fig no 7 Activity Diagram

Sequence Diagram


The sequence diagram involves the participants New User, Login, Client, Egress router, Ingress router, and IErrorlog. The message flow is: User Info, Login Info, Provide Detail, User Details, Sending file, File Details, Check threat, Repeated threat, and Incrementing Error count.

Fig no 8 Sequence Diagram


Collaboration Diagram

The collaboration diagram connects the objects Client Login, User Details, File Selection, Sending File, Checking Details, Receiving File, Error Message, and Error count through the numbered messages: 1: User Info; 2: IP address; 3: Details; 4: File Size, Type of file; 5: File creating Traffic; 6: Completed Job.

Fig no 9 Collaboration Diagram


Component Diagram

The Client component carries the client details and file details; the Server component handles file selection, the error count, and error messages.

Fig no 10 Component Diagram



DATA FLOW DIAGRAM


The flow begins with client login and address verification. After the client selects a file, the system checks whether the file creates traffic: if it is a normal file, it is transferred; if it creates traffic, the client is asked to select another file, the error count is maintained, and an error message is displayed before the flow stops.

Fig no 12 DATA FLOW DIAGRAM


Project Flow Diagram

The project flow is: Client → Choose file → Sending file → Verification → Maintain error count → Receiving file → Server.

Fig No 13 Project Flow Diagram

System Architecture


The client logs in with his IP address and details and sends the file under server guidance; the server checks for errors, receives the file, and maintains the error log.

Fig no 14 System Architecture


2.8.1 SYSTEM DESIGN AND DEVELOPMENT


System Design

The most creative and challenging phase of system development is system design: it is a solution, a "how to" approach to the creation of the proposed system. It provides the understanding and procedural details necessary for implementing the system recommended in the feasibility study. Design goes through logical and physical stages of development. At an early stage in designing a new system, the system analyst must have a clear understanding of the objectives the design is aiming to fulfill. The first step is to determine how the output is to be produced and in what format. Second, the input data and files have to be designed to meet the requirements of the proposed output. The operational (processing) phases are handled through program construction and testing. Finally, details related to justification of the system and an estimate of the impact of the candidate system on the user and the organization are documented and evaluated by management. The final report prior to the implementation phase includes procedural flow charts, record layouts, and a workable plan for implementing the candidate system.


Input Design

Input design is the process of converting user-oriented inputs to a computer-based format. It also includes determining the record media, method of input, speed of capture, and entry into the system. Consideration can be given to: type of input; flexibility of format; speed; accuracy; verification methods; rejection rates; ease of correction; offline facilities; need for specialized documentation; storage and handling requirements; automatic features; hard copy requirements; security; ease of use; portability; compatibility with other systems; and cost. The keyboard may be used as an input medium, with the data displayed on a cathode ray tube screen for verification. Inaccurate input data are the most common cause of errors in data processing, and errors entered by the user can be controlled by input design. In this project, USER NAME, PASSWORD, etc., are given as input.


Output Design

The normal procedure is to design the outputs in detail first and then to work back to the inputs. The outputs can be in the form of operational documents or lengthy reports. The input records have to be validated, edited, organized, and accepted by the system before being processed to produce the outputs. The output may be defined in terms of: type of output; content; format; location; frequency; response; volume; sequence; and action required. The next stage is to determine the most appropriate medium for the output. Considerations include: suitability of the device to the particular application; the need for hard copy; the response time required; the location of the user; and the software/hardware available and the cost.


Code Design

The purpose of a code is to facilitate the identification and retrieval of items of information. The code should be simple and easily understandable. The codes were designed in such a way that features such as optimum human-oriented use and machine efficiency are unaffected. For the code to be designed effectively, the following characteristics were also considered: uniqueness, versatility, stability, simplicity, and conciseness. The code should be adequate for present and anticipated data processing, for both machine and human use. Care was taken to minimize the clerical effort and computer time required to continue operations.


Database Design

All the general activities of the application are identified in the system analysis stage, along with the subject areas the application needs to maintain. From the identified activities and the relationships between the tables, primary keys were identified, and all the data needed by the application were listed; these data describe the tables, and the supporting data become the names of the columns in each table. Normalization is a series of tests applied to the data to eliminate redundancy and make sure each piece of data is associated with the correct table or relationship; normal forms are the tests usually used to normalize data. Data structuring is refined through the process of normalization, which is used to reduce data redundancy. All the tables identified in the system were subjected to the first, second, and third normal forms. The data were analyzed before normalization to set the primary key of each table, and after the normalization process the relationships between the tables were resolved. Relationships, such as one-to-many and one-to-one, must be set carefully.


2.9 TESTING
TESTING TECHNIQUES

Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design, and code generation. Once source code has been generated, software must be tested to uncover (and correct) as many errors as possible before delivery to the customer. Software is tested from two different perspectives: internal program logic is exercised using "white box" test case design techniques, and software requirements are exercised using "black box" test case design techniques. In both cases the intention is to find the maximum number of errors with the minimum amount of effort and time.


Testing Strategies

Designing effective test cases is important, but so is the strategy used to execute them. Testing is a set of activities that can be planned in advance and conducted systematically. For this reason a template for software testing - a set of steps into which specific test case design techniques and testing methods can be placed - should be defined for the software process. A strategy for software testing must accommodate low-level tests, which verify that a small source code segment has been correctly implemented, as well as high-level tests, which validate major system functions against customer requirements.

Verification and Validation

Software testing is one element of a broader topic often referred to as verification and validation. Verification refers to the set of activities that ensure that software correctly implements a specific function. Validation refers to a different set of activities that ensure that the software that has been built is traceable to customer requirements. In short: verification asks, "Are we building the product right?"; validation asks, "Are we building the right product?" Verification and validation encompasses a wide array of SQA (Software Quality Assurance) activities, including formal technical reviews, quality and configuration audits, performance monitoring, simulation, feasibility study, documentation review, database review, algorithm analysis, development testing, qualification testing and installation testing.


Software Testing Strategy

The software engineering process may be viewed as a spiral. Initially, system engineering defines the role of software and leads to software requirements analysis, where the information domain, function, behavior, performance, constraints and validation criteria for the software are established. A strategy for software testing may also be viewed in the context of the spiral. Unit testing begins at the vortex of the spiral and concentrates on each unit of the software as implemented in source code. Testing progresses by moving outward along the spiral to integration testing, where the focus is on design and the construction of the software architecture. Taking another turn outward on the spiral, we encounter validation testing, where the requirements established as part of software requirements analysis are validated against the software that has been constructed. Finally, we arrive at system testing, where the software and other system elements are tested as a whole.

Unit Testing

At the very lowest level is unit testing, where the programmer who writes the code tests it against the detailed specification. Each individual program of the software was tested, and the outputs obtained were found satisfactory.


Integration Testing

Top-down integration testing is used, which is an incremental approach to the construction of the program structure. Modules are integrated by moving downward through the control hierarchy, beginning with the main control module. Modules subordinate to the main control module are incorporated into the structure in either a depth-first or breadth-first manner; in this project the depth-first method is used for testing and executing the modules. First the username and password are checked; if they are valid, control moves to the next module, where the host name is checked. Next, selecting a file and the encryption and decryption operations are tested, and the corresponding tables are checked as well.

Validation Testing

Validation succeeds when the software functions in a manner that can reasonably be expected by the customer. It is done by executing the application with a set of inputs for which the manual result is ready; the application's output is then tallied against the manually computed result. There are two possible outcomes: performance is satisfactory and accepted, or deviations/deficiencies are uncovered, for which a solution is found. For example, the login ID and password of a registered user are entered, the login process is executed and the results are verified; likewise, the host name is entered and checked for validity.
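The depth-first, top-down sequence described above can be sketched with stubs standing in for subordinate modules. Everything below — module names, return values — is hypothetical, loosely mirroring the project's login → host-check → send flow; in real top-down integration each stub is swapped for the real module one branch at a time.

```python
# Record of which subordinate modules the main control module invoked.
calls = []

# Stubs replace subordinate modules not yet integrated.
def login_stub(user, pwd):
    calls.append("login")
    return True

def host_check_stub(host):
    calls.append("host")
    return True

def send_stub(path):
    calls.append("send")
    return "sent"

# The main control module is tested first, against the stubs.
def main_control(user, pwd, host, path,
                 login=login_stub, host_check=host_check_stub, send=send_stub):
    if not login(user, pwd):
        return "bad login"
    if not host_check(host):
        return "bad host"
    return send(path)

result = main_control("admin", "secret", "SERVER1", "report.txt")
```

Because the subordinates are injected as parameters, each stub can later be replaced by the real module without touching the main control module — the mechanics of incremental, depth-first integration.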


2.9.1 SYSTEM TESTING

Recovery Testing

Recovery testing is a system test that forces the software to fail in a variety of ways and verifies that recovery is properly performed. In this application, recovery requires human intervention.

Security Testing

Security testing attempts to verify that the protection mechanisms built into a system will, in fact, protect it from improper penetration. The following measures were used: issuing commands that are not permitted or granted by the DBA in Oracle 9i; verifying that a wrong user name or password does not allow the user to log in to that particular page; and trying to edit entries for which permission is not granted.

Performance Testing

Performance testing is designed to test the run-time performance of the software within the context of an integrated system. In addition, the following test strategies were also carried out during the test period.
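One of the security measures above — verifying that a wrong or crafted user name cannot log in — can be sketched as an automated check. The snippet below is a hypothetical sketch using Python's built-in sqlite3 (the project itself uses ADO.NET against SQL Server/Oracle); the table and column names mirror the appendix listing, but the parameterized query is an assumption about how such a test would be hardened, not the project's own code.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE login1 (username TEXT, pass TEXT)")
conn.execute("INSERT INTO login1 VALUES ('admin', 'secret')")

def authenticate(user: str, pwd: str) -> bool:
    # Parameterized query: the driver treats the values as data, so an
    # input such as "' OR '1'='1" is matched literally instead of
    # rewriting the SQL.
    row = conn.execute(
        "SELECT * FROM login1 WHERE username = ? AND pass = ?", (user, pwd)
    ).fetchone()
    return row is not None

# A security test tries a classic injection payload and expects rejection.
assert authenticate("admin", "secret")
assert not authenticate("' OR '1'='1", "' OR '1'='1")
```

By contrast, a query built by string concatenation (as in the appendix listing) would accept the second pair of inputs, which is exactly the class of penetration a security test should catch.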


Application Function Test

The functionality of the client application is tested using the software testing strategies described above: unit testing, validation testing and integration testing.

Server Test

The coordination and data management functions of the server are tested. Server performance, i.e. overall response time and data throughput, is also considered.

Database Test

The accuracy and integrity of the data stored by the server is tested. Transactions posted by client applications are examined to ensure that data is properly stored, updated and retrieved.

Transaction Testing

A series of tests is made to ensure that each class of transaction is processed according to the requirements.

Acceptance Testing

When the user finds no major problems with its accuracy, the system passes through a final acceptance test. This confirms that the system meets the original goals, objectives and requirements established during analysis. The responsibility for the acceptance test falls on the shoulders of the users and management. If the system fulfills all requirements, it is finally acceptable and ready for operation.


2.10 IMPLEMENTATION

System implementation is the process of converting a new system design into an operational one. It is the key stage in achieving a successful new system, because it usually involves a great deal of upheaval for the users. It involves user training, system testing and successful running of the new system. Apart from planning, the two major tasks of preparing for implementation are educating and training the users and testing the system.

Implementation Procedures

The implementation procedure includes:

Testing
Training
Installation
Changeover

Testing: Software testing is a critical element of software quality assurance and represents the ultimate review of the requirements specification, design and coding. Before the system turns from an abstract concept into a tangible implementation, it must be thoroughly tested and verified. Testing is the phase where errors in study, design and coding are identified, and it uncovers problems left unsolved by the system. It also proves user acceptance and the attitude of users towards the developed software.


CHAPTER 3 CONCLUSION AND FUTURE ENHANCEMENT


In this project, we studied the feasibility of analyzing packet header data through wavelet analysis for detecting traffic anomalies. Specifically, we proposed the use of correlation of destination IP addresses, port numbers and the number of flows in the outgoing traffic at an egress router. Our results show that statistical analysis of aggregate traffic header data may provide an effective mechanism for the detection of anomalies within a campus or edge network.
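As a rough illustration of this kind of statistical check (a simplified stand-in, not the wavelet analysis itself), the sketch below correlates per-port packet counts from consecutive sampling intervals; a low correlation with the previous interval flags a possible anomaly such as a sudden surge on one port. All counts, port choices and the threshold are invented for illustration.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length count vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-port packet counts for sampling intervals.
previous    = [120, 80, 30, 5, 2]    # ports: 80, 443, 25, 1434, 135
current_ok  = [130, 75, 28, 6, 3]    # similar traffic mix to before
current_bad = [10, 5, 2, 900, 1]     # sudden surge on one port (worm-like)

THRESHOLD = 0.8   # tuning parameter, chosen here for illustration only

normal = pearson(previous, current_ok) > THRESHOLD        # traffic looks similar
anomalous = pearson(previous, current_bad) < THRESHOLD    # correlation collapses
```

The same idea extends to destination-address and flow-count distributions: correlating the current interval against previous states of the traffic, as described above, and alerting when the correlation drops.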

FUTURE ENHANCEMENT
We have shown the effectiveness of our approach in postmortem and real-time analysis of network traffic. The results of our analysis are encouraging and point to a number of interesting directions for future research.


APPENDICES

APPENDIX 1: SAMPLE CODING

Server

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using System.Data.SqlClient;

namespace trafficServer
{
    public partial class login : Form
    {
        SqlCommand cmd;
        SqlDataReader dr;
        SqlConnection cn;
        public static String m;

        public login()
        {
            InitializeComponent();
        }

        private void btnLog_Click(object sender, EventArgs e)
        {
            m = "" + textBox1.Text + "";
            Recieve rec = new Recieve();
            rec.Show();
        }

        private void button1_Click(object sender, EventArgs e)
        {
            if ((txtUser.Text != "") && (txtPass.Text != ""))
            {
                cn = new SqlConnection("server=.;initial catalog=Yokesh;uid=sa;pwd=;");
                cn.Open();
                // NOTE: concatenating user input into SQL is vulnerable to
                // injection; a parameterized query would be safer.
                cmd = new SqlCommand("Select * from login1 where username='" + txtUser.Text + "' AND pass='" + txtPass.Text + "'", cn);
                dr = cmd.ExecuteReader();
                dr.Read();
                if (dr.HasRows)
                {
                    //groupBox1.Visible = false;
                    groupBox2.Visible = true;
                    button1.Enabled = false;
                    btnCancel.Enabled = false;
                    txtUser.Text = "";
                    txtPass.Text = "";
                }
                else
                {
                    lblError.Text = "";
                    lblError.Text = "Username or Password Incorrect";
                }
            }
            else
            {
                lblError.Text = "User Name or Password cannot be blank";
            }
        }

        private void btnServerPath_Click(object sender, EventArgs e)
        {
            rdoDefault.Checked = false;
            FolderBrowserDialog fd = new FolderBrowserDialog();
            if (fd.ShowDialog() == DialogResult.OK)
            {
                Server.receivedPath = fd.SelectedPath;
            }
            txtFilePath.Text = Server.receivedPath;
        }
    }
}

Client

public void btnLogin_Click(object sender, EventArgs e)
{
    cn = new SqlConnection("server=.;initial catalog=Yokesh;uid=sa;pwd=;");
    cn.Open();
    string server = "" + txtUname.Text + ".txt";
    Servername = "" + textBox1.Text + "";
    string path3 = "\\\\" + Servername + "/DOTNET/" + server + "";
    if (System.IO.File.Exists(path3))
    {
        lblError1.Text = "";
        lblError1.Text = "Contact Admin To use the tool";
        txtUname.Text = "";
        txtPass.Text = "";
    }
    else
    {
        fileName = "uname.txt";
        Servername = "" + textBox1.Text + "";
        path = "\\\\" + Servername + "/DOTNET/" + fileName + "";
        // path = "\\\\" + Servername + "/DOT NET/" + server + "";
        FileStream fileStr = new FileStream(path, FileMode.Create, FileAccess.Write);
        StreamWriter sw = new StreamWriter(fileStr);
        sw.Write(txtUname.Text);
        sw.Close();
        fileStr.Close();
        //groupBox1.Visible = false;
        sendFile objsendfile = new sendFile();
        objsendfile.Show();
        //this.Hide();
    }
}

private void button1_Click(object sender, EventArgs e)
{
    cn = new SqlConnection("server=.;initial catalog=Yokesh;uid=sa;pwd=;");
    cn.Open();
    if ((txtUname.Text != "") && (txtPass.Text != ""))
    {
        cmd = new SqlCommand("Select * from login where username='" + txtUname.Text + "' AND pass='" + txtPass.Text + "'", cn);
        dr = cmd.ExecuteReader();
        dr.Read();
        if (dr.HasRows)
        {
            groupBox2.Visible = true;
            txtUname.Text = "";
            txtPass.Text = "";
            button1.Enabled = false;
        }
        else
        {
            lblError1.Text = "";
            lblError1.Text = "Username or Password Incorrect";
        }
    }
    else
    {
        lblError1.Text = "User Name or Password cannot be blank";
    }
}

using System;
using System.Collections.Generic;
using System.Windows.Forms;

namespace trafficClient
{
    static class Program
    {
        [STAThread]
        static void Main()
        {
            Application.EnableVisualStyles();
            Application.SetCompatibleTextRenderingDefault(false);
            Application.Run(new login());
        }
    }
}

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using System.Net;
using System.IO;

namespace trafficClient
{
    public partial class CheckServer : Form
    {


        public string Server = "";
        public String Filename = login.fileName;
        public String Servername = login.Servername;
        // The original listing used 'path' without declaring it; it is
        // reconstructed here from the UNC path convention of the login module.
        public string path;

        public CheckServer()
        {
            InitializeComponent();
            path = "\\\\" + Servername + "/DOTNET/" + Filename;
        }

        private void CheckServer_Load(object sender, EventArgs e)
        {
        }

        private void btnServer_Click(object sender, EventArgs e)
        {
            if (System.IO.File.Exists(path))
            {
                StreamReader sr = new StreamReader(path);
                Server = sr.ReadLine();
                MessageBox.Show("Server Connected");
                lblIP.Text = Server;
                sendFile sf = new sendFile();
                sf.Show();
                this.Close();
            }
            else
            {
                MessageBox.Show("Server is not connected Contact Admin");
                sendFile sf = new sendFile();
                sf.btnSend.Enabled = false;
                sf.Show();
                this.Close();

            }
        }
    }
}

namespace trafficClient
{
    partial class sendFile
    {
        private System.ComponentModel.IContainer components = null;

        protected override void Dispose(bool disposing)
        {
            if (disposing && (components != null))
            {
                components.Dispose();
            }
            base.Dispose(disposing);
        }

        #region Windows Form Designer generated code
        private void InitializeComponent()
        {
            this.lblError = new System.Windows.Forms.Label();
            this.lblS = new System.Windows.Forms.Label();
            this.lblf = new System.Windows.Forms.Label();
            this.btnSend = new System.Windows.Forms.Button();
            this.btnOpen = new System.Windows.Forms.Button();
            this.txtFilename = new System.Windows.Forms.TextBox();
            this.lblFilesize = new System.Windows.Forms.Label();
            this.btnAvail = new System.Windows.Forms.Button();
            this.groupBox1 = new System.Windows.Forms.GroupBox();
            this.label1 = new System.Windows.Forms.Label();
            this.groupBox1.SuspendLayout();
            this.SuspendLayout();
            this.lblError.AutoSize = true;
            this.lblError.Font = new System.Drawing.Font("Forte", 14.25F, System.Drawing.FontStyle.Italic, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
            this.lblError.ForeColor = System.Drawing.Color.Maroon;
            this.lblError.Location = new System.Drawing.Point(99, 221);
            this.lblError.Margin = new System.Windows.Forms.Padding(4, 0, 4, 0);
            this.lblError.Name = "lblError";
            this.lblError.Size = new System.Drawing.Size(0, 21);
            this.lblError.TabIndex = 14;
            this.lblS.AutoSize = true;
            this.lblS.Font = new System.Drawing.Font("Forte", 14.25F, System.Drawing.FontStyle.Italic, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
            this.lblS.ForeColor = System.Drawing.Color.Red;
            this.lblS.Location = new System.Drawing.Point(85, 177);
            this.lblS.Margin = new System.Windows.Forms.Padding(4, 0, 4, 0);
            this.lblS.Name = "lblS";
            this.lblS.Size = new System.Drawing.Size(84, 21);
            this.lblS.TabIndex = 12;
            this.lblf.AutoSize = true;
            this.lblf.Font = new System.Drawing.Font("Forte", 14.25F, System.Drawing.FontStyle.Italic, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
            this.lblf.ForeColor = System.Drawing.Color.Red;
            this.lblf.Location = new System.Drawing.Point(88, 91);
            this.lblf.Margin = new System.Windows.Forms.Padding(4, 0, 4, 0);
            this.lblf.Name = "lblf";
            this.lblf.Size = new System.Drawing.Size(101, 21);
            this.lblf.TabIndex = 10;
            this.lblf.Text = "File Name:";
            this.btnSend.Font = new System.Drawing.Font("Forte", 14.25F, System.Drawing.FontStyle.Italic, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
            this.btnSend.ForeColor = System.Drawing.Color.Maroon;
            this.btnSend.Location = new System.Drawing.Point(734, 286);
            this.btnSend.Margin = new System.Windows.Forms.Padding(4);
            this.btnSend.Name = "btnSend";
            this.btnSend.Size = new System.Drawing.Size(124, 32);
            this.btnSend.TabIndex = 3;
            this.btnSend.Text = "Send";
            this.btnSend.UseVisualStyleBackColor = true;
            this.btnSend.Click += new System.EventHandler(this.btnSend_Click);
            this.btnOpen.Font = new System.Drawing.Font("Forte", 14.25F, System.Drawing.FontStyle.Italic, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
            this.btnOpen.ForeColor = System.Drawing.Color.Maroon;
            this.btnOpen.Location = new System.Drawing.Point(759, 80);
            this.btnOpen.Margin = new System.Windows.Forms.Padding(4);
            this.btnOpen.Name = "btnOpen";
            this.btnOpen.Size = new System.Drawing.Size(124, 32);
            this.btnOpen.TabIndex = 1;
            this.btnOpen.Text = "Open File";
            this.btnOpen.UseVisualStyleBackColor = true;
            this.btnOpen.Click += new System.EventHandler(this.btnOpen_Click);
            this.txtFilename.Font = new System.Drawing.Font("Forte", 14.25F, System.Drawing.FontStyle.Italic, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
            this.txtFilename.ForeColor = System.Drawing.Color.Maroon;
            this.txtFilename.Location = new System.Drawing.Point(244, 77);
            this.txtFilename.Margin = new System.Windows.Forms.Padding(4);
            this.txtFilename.Name = "txtFilename";
            this.txtFilename.Size = new System.Drawing.Size(465, 33);
            this.txtFilename.TabIndex = 2;
            this.lblFilesize.AutoSize = true;
            this.lblFilesize.Font = new System.Drawing.Font("Forte", 14.25F, System.Drawing.FontStyle.Italic, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
            this.lblFilesize.ForeColor = System.Drawing.Color.Maroon;
            this.lblFilesize.Location = new System.Drawing.Point(237, 177);
            this.lblFilesize.Margin = new System.Windows.Forms.Padding(4, 0, 4, 0);
            this.btnAvail.ForeColor = System.Drawing.Color.Maroon;
            this.btnAvail.Location = new System.Drawing.Point(164, 286);
            this.btnAvail.Margin = new System.Windows.Forms.Padding(4);
            this.btnAvail.Name = "btnAvail";
            this.btnAvail.Size = new System.Drawing.Size(228, 32);
            this.btnAvail.TabIndex = 0;
            this.btnAvail.Text = "Check Availability";
            this.btnAvail.UseVisualStyleBackColor = true;
            this.btnAvail.Click += new System.EventHandler(this.btnAvail_Click);
            this.groupBox1.Controls.Add(this.txtFilename);
            this.groupBox1.Controls.Add(this.btnAvail);
            this.groupBox1.Controls.Add(this.btnOpen);
            this.groupBox1.Controls.Add(this.btnSend);
            this.groupBox1.Controls.Add(this.lblError);
            this.groupBox1.Controls.Add(this.lblf);
            this.groupBox1.Controls.Add(this.lblFilesize);
            this.groupBox1.Controls.Add(this.lblS);
            this.groupBox1.Location = new System.Drawing.Point(30, 100);
            this.groupBox1.Name = "groupBox1";
            this.groupBox1.Size = new System.Drawing.Size(1027, 448);
            this.groupBox1.TabIndex = 15;
            this.groupBox1.TabStop = false;
            this.label1.AutoSize = true;
            this.label1.Font = new System.Drawing.Font("Forte", 24F, System.Drawing.FontStyle.Italic, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
            this.label1.ForeColor = System.Drawing.Color.Red;
            this.label1.Location = new System.Drawing.Point(230, 32);
            this.label1.Name = "label1";
            this.label1.Size = new System.Drawing.Size(269, 35);
            this.label1.TabIndex = 16;
            this.label1.Text = "Send File To Server";
            this.Controls.Add(this.label1);
            this.Controls.Add(this.groupBox1);
            this.Font = new System.Drawing.Font("Palatino Linotype", 9.75F, System.Drawing.FontStyle.Regular, System.Drawing.GraphicsUnit.Point, ((byte)(0)));
            this.Margin = new System.Windows.Forms.Padding(4);
            this.Name = "sendFile";
            this.Text = "sendFile";
            this.FormClosed += new System.Windows.Forms.FormClosedEventHandler(this.sendFile_FormClosed);
            this.Load += new System.EventHandler(this.sendFile_Load);
            this.groupBox1.ResumeLayout(false);
            this.groupBox1.PerformLayout();
            this.ResumeLayout(false);
            this.PerformLayout();
        }
        #endregion

        private System.Windows.Forms.Label lblError;
        private System.Windows.Forms.Label lblS;
        private System.Windows.Forms.Label lblf;
        private System.Windows.Forms.Button btnOpen;
        private System.Windows.Forms.TextBox txtFilename;
        private System.Windows.Forms.Label lblFilesize;
        private System.Windows.Forms.Button btnAvail;
        private System.Windows.Forms.GroupBox groupBox1;
        private System.Windows.Forms.Label label1;
        public System.Windows.Forms.Button btnSend;
    }
}


APPENDIX 2: SCREEN SHOTS


REFERENCES

1. A. Ramanathan, "WADeS: A tool for distributed denial of service attack detection," M.S. thesis, TAMU-ECE-2002-02, Aug. 2002.
2. NLANR measurement and operations analysis team, "NLANR network traffic packet header traces," Aug. 2002.
3. J. Mirkovic, G. Prier, and P. Reiher, "Attacking DDoS at the source," in Proc. IEEE Int. Conf. Network Protocols, Nov. 2002.
4. "CERT Advisory CA-2003-04 MS-SQL Server Worm," CERT Coordination Center (CERT/CC), Jan. 2003.
5. Packeteer, "PacketShaper Express," white paper, 2003.
6. KREONet2 (Korea Research Environment Open NETwork2).
7. Garg and A. L. N. Reddy, "Mitigation of DoS attacks through QoS regulation," in Proc. IWQoS, May 2002.

Sites Referred:

1. http://www.pma.nlanr.net/Traces/
2. http://www.cert.org/advisories/CA-2003-04.html
3. http://www.packeteer.com/resources/prod-sol/Xpress_Whitepaper.pdf
4. http://www.kreonet2.net

