
ADVERTISEMENT CREATOR

A project Report Submitted to

MOTHER TERESA WOMEN'S UNIVERSITY, KODAIKANAL


in partial fulfillment of the requirements for the award of the Degree of
Master of Science in Computer Science

Submitted by

R. PRIYA
Reg. No.: 113MCS015

Under the Guidance of

Mrs.S.Rajathi M.C.A.,M.Phil.,

DEPARTMENT OF COMPUTER SCIENCE
M.V.M GOVT. ARTS COLLEGE FOR WOMEN
(Affiliated to Mother Teresa Women's University, Kodaikanal)
Dindigul - 624001
March 2013

CERTIFICATE

This is to certify that the project entitled ADVERTISEMENT CREATOR is a bonafide work done by R.PRIYA, Reg. No: 113MCS015, submitted in partial fulfillment of the requirements for the award of the degree of Master of Science in Computer Science, M.V.M Govt. Arts College for Women, Dindigul, during the academic year 2012-2013 under my supervision.

INTERNAL GUIDE

HEAD OF THE DEPARTMENT

INTERNAL EXAMINER

EXTERNAL EXAMINER

DECLARATION
R.PRIYA
M.Sc (Computer Science)
113MCS015
M.V.M Govt. Arts College for Women, Dindigul.

I hereby declare that the project entitled ADVERTISEMENT CREATOR, submitted to M.V.M Govt. Arts College for Women, Dindigul, for the Degree of Master of Science in Computer Science, is my original work and that it has not previously been submitted for the award of any degree, diploma or any other title.

Signature of the Candidate

Place: Date:

ACKNOWLEDGEMENT
I hereby acknowledge my deep sense of gratitude and sincere thanks to Dr. Mrs. C. PadmaLatha M.Sc., Ph.D., Principal, M.V.M Govt. Arts College for Women, Dindigul, for having permitted me to undertake the project work. I am extremely thankful to Dr. Mrs. A. Pethalakshmi M.Sc., M.Phil., Ph.D., Head of the Department of Computer Science, for her encouragement in completing the project successfully.

I sincerely thank my guide Mrs. S. Rajathi M.C.A., M.Phil., for her inspiring guidance and encouragement to complete the project work in a successful manner. I also express my sincere thanks to the other staff members of our department for the support and assistance they provided to complete the project work. I sincerely thank our Lab Programmer Mrs. N. Manimala M.A., M.Phil., for her assistance.

I deeply thank my parents, sister and friends for their encouragement in preparing this work. Above all, I thank the Almighty for the blessings showered throughout the execution of the project work.

(R.PRIYA, II-M.Sc(CS))


SYNOPSIS
Creator Professional combines premium page-layout capabilities with image selection, illustration and image-manipulation tools. If you develop advertising or promotional materials, Creator is made for the way you work. This sophisticated software is ideal for design and production, with powerful tools to boost your efficiency and creativity. Creator Professional will also now allow you to open InDesign Interchange files. The more you use Creator, the easier it becomes to automate your everyday tasks. Creator is one of the few programs on the market that is fully scriptable on Windows. This means you can script frequently used actions and commands, giving you a dynamic set of timesaving tools.

With the latest updates to Creator Professional, designers can now enjoy even more flexibility and freedom in their work. Opening InDesign Interchange files lets you use the features you love about Creator on existing InDesign files. New variable opacity for elements eliminates the need to use Photoshop to get just the effect you want in your design. With global text wrap, you are able to set your desired wrap on an element so it applies to any text box, making you more efficient.

TABLE OF CONTENTS

CHAPTER NO    TITLE

1.   INTRODUCTION
2.   SYSTEM ANALYSIS
     2.1 Existing System
     2.2 Proposed System
3.   SYSTEM SPECIFICATION
     3.1 Hardware Specification
     3.2 Software Specification
     3.3 Software Description
4.   SOFTWARE ARCHITECTURE
     4.1 Modules
     4.2 Module Description
5.   SYSTEM DESIGN
     5.1 System Flow Diagram
     5.2 Data Flow Diagram
6.   SYSTEM TESTING
7.   SYSTEM IMPLEMENTATION
8.   CONCLUSION
9.   BIBLIOGRAPHY
10.  APPENDIX
     10.1 Screen Shots

1. INTRODUCTION

Advertising is a form of communication for marketing, used to encourage an audience (viewers, readers or listeners, sometimes a specific group) to continue or take some new action. Most commonly, the desired result is to drive consumer behavior with respect to a commercial offering, although political and ideological advertising is also common.

Advertising is one of the basic resources for developing a business, whether corporate or individual, and many technologies have been developed in the multimedia field to support it. To create ads, DTP (desktop publishing) software has traditionally been used, and the quality of the result depends largely on the creativity of the person using it.

2. SYSTEM ANALYSIS

2.1 Existing System:

The existing system requires multiple tools to process a single ad creation. Compared to the proposed system, it has fewer features and less support.

Disadvantages:

The DTP software used to create advertisements cannot create or support innovative solutions; it only provides support for developing ads.
The result depends entirely on the user's knowledge and skill in ad creation.
The ads are less attractive compared to multimedia ads.
Image processing needs various tools.
Managing and maintaining becomes difficult and more costly.
The process is slow.
Each step is handled with a different technique.

2.2 Proposed System:


The proposed system is a multimedia tool that supports audio, images and animation for creating ads more effectively. The tool provides an enhanced, easy-to-use environment for the user to create ads. Ad creation normally needs a person with imagination and drawing skill to produce very good ads, but with this tool anyone can produce ads that are better than those produced with other tools.

Advantages:

Not only does Creator layout and design software offer graphic arts professionals the flexibility and power to create with total freedom, but it is also designed to optimize efficiency in high-volume production environments without the need for expensive add-ons. It stands apart from other layout applications in three ways. Out of the box, it includes the most comprehensive set of advanced features and capabilities on the market. It's exceptionally user friendly, and it's equally at home in any operating system.

3. SYSTEM SPECIFICATION
3.1 Hardware Specification
Processor   : Pentium IV
RAM         : 128 MB
Hard disk   : 20 GB
FDD         : 1.44 MB
Monitor     : 14 inch
Mouse       : 3 Button scroll
CD Drive    : 52X
Keyboard    : 108 keys

3.2 Software Specification

OPERATING SYSTEM : WINDOWS XP
FRONT END        : VISUAL BASIC .NET
PLATFORM         :

3.3 Software Description


Microsoft Visual Studio
Microsoft Visual Studio is Microsoft's flagship software development product for computer programmers. It centers on an integrated development environment which lets programmers create standalone applications, web applications and web services that run on any platform supported by Microsoft's .NET Framework. Supported platforms include Microsoft Windows servers and workstations, Pocket PC, Smartphone and World Wide Web browsers (not the Java Virtual Machine that all other Java tools target).

Overview of .NET
The .NET Framework is a new computing platform that simplifies application development in the highly distributed environment of the Internet. The .NET Framework is designed to fulfill the following objectives:

To provide a consistent object-oriented programming environment whether object code is stored and executed locally, executed locally but Internet-distributed, or executed remotely.

To provide a code-execution environment that minimizes software deployment and versioning conflicts. To provide a code-execution environment that guarantees safe execution of code, including code created by an unknown or semi-trusted third party.

To provide a code-execution environment that eliminates the performance problems of scripted or interpreted environments. To make the developer experience consistent across widely varying types of applications, such as Windows-based applications and web-based applications.

The .NET Framework has two main components: the Common Language Runtime (CLR) and the .NET Framework class library.

Common Language Runtime (CLR)
The Common Language Runtime is the foundation of the .NET Framework. You can think of the runtime as an agent that manages code at execution time, providing core services such as memory management and thread management while also enforcing strict type safety and other forms of code accuracy that ensure security and robustness. In fact, the concept of code management is a fundamental principle of the runtime. Code that targets the runtime is known as managed code, while code that does not target the runtime is known as unmanaged code. The .NET Framework can be hosted by unmanaged components that load the Common Language Runtime into their processes and initiate the execution of managed code, thereby creating a software environment that can exploit both managed and unmanaged features. The .NET Framework not only provides several runtime hosts, but also supports the development of third-party runtime hosts. Internet Explorer is an example of an unmanaged application that hosts the runtime (in the form of a MIME type extension). Using Internet Explorer to host the runtime enables you to embed managed components or Windows Forms controls in HTML documents.

.NET Framework Class Library
The .NET Framework class library is a collection of reusable types that integrate tightly with the Common Language Runtime (CLR). The class library is object oriented, providing types from which your own managed code can derive functionality. Managed code is intermediate language code along with metadata, contained in portable executable (PE) files. This not only makes the .NET Framework types easy to use, but also reduces the time associated with learning new features of the .NET Framework. In addition, third-party components can integrate seamlessly with classes in the .NET Framework. The class library supports several types of applications:

Console applications
Scripted and hosted applications
Windows GUI applications (Windows Forms)
VB.NET applications
XML web services

VISUAL BASIC.NET:

The system is developed using Visual Basic .NET, a very popular Microsoft product developed by Microsoft Corporation. It is one of the languages improved from the BASIC language. Visual Basic .NET includes a variety of ActiveX controls for designing application forms and user interfaces.

VB.NET uses the multiple document interface (MDI) format. The user interface is the part of the program that responds to key presses and mouse clicks; these actions are referred to as events of the form and the controls on the form. VB.NET provides vast properties and methods for each control, which help to utilize all those functions for record manipulation.

The menu is one of the most effective controls in VB.NET. In a menu-driven program, the menu names appear in the menu bar; when the user selects a menu, that menu opens. Each menu usually contains items arranged in a vertical list. These items are often grouped into functional groups with menu separators. When the user selects a menu item, that item appears highlighted; pressing Enter or releasing the mouse button opens that item. Each item should have a unique access character so users can choose commands with the keyboard. The user reaches the menu or menu item by pressing the Alt key and the access character. Shortcut keys are also useful; they are faster than access characters in that the user only needs to enter a shortcut to execute the corresponding menu item.
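As a minimal sketch of the menu-and-event model described above (the AdEditorForm name and the menu captions are illustrative assumptions, not part of the delivered system), the following VB.NET code builds a File menu with an access character and a Ctrl+O shortcut and handles its Click event:

Imports System
Imports System.Windows.Forms

Public Class AdEditorForm
    Inherits Form

    ' WithEvents lets the Handles clause below receive this item's events.
    Private WithEvents mnuOpenTemplate As New ToolStripMenuItem("&Open Template")

    Public Sub New()
        ' Build a File menu with an access character (Alt+F, then O) and a
        ' Ctrl+O shortcut key.
        Dim menu As New MenuStrip()
        Dim mnuFile As New ToolStripMenuItem("&File")
        mnuOpenTemplate.ShortcutKeys = Keys.Control Or Keys.O
        mnuFile.DropDownItems.Add(mnuOpenTemplate)
        menu.Items.Add(mnuFile)
        Me.MainMenuStrip = menu
        Me.Controls.Add(menu)
    End Sub

    ' Fired whether the item is chosen with the mouse, the access character
    ' or the shortcut key.
    Private Sub mnuOpenTemplate_Click(ByVal sender As Object, ByVal e As EventArgs) _
            Handles mnuOpenTemplate.Click
        MessageBox.Show("Open template selected")
    End Sub
End Class

Calling Application.Run(New AdEditorForm()) from Sub Main would display the form and route the menu events to the handler.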

Properties Window
The Properties window appears beneath the Solution Explorer on the right-hand side of the VS.NET main window. It displays the properties of the currently selected object in the main window. Pressing the F4 key also displays the Properties window for the selected object.

Solution Explorer
The Solution Explorer window is similar to the Project Explorer window in earlier versions of Visual Basic, but it is a bit more advanced, since it allows us to construct solutions out of several different projects, including projects written in different languages.

Class View
The Class View window is somewhat similar to the Solution Explorer, in that it provides a view into our solution and project. A view of classes, methods and properties, rather than a view of files, is provided by the Class View in the object-oriented world of .NET.

Server Explorer
Server Explorer is an exciting new feature of VS.NET, as it allows us to explore and access server components in a nice graphical environment. The Server Explorer lists the data connections and the servers that are available to the user. It can be used to examine and manipulate servers and the databases they contain.

Output Window
The Output window is similar to the Immediate window available in previous versions of Visual Basic. The Immediate window is used to view debug output from the application, and to interact with the environment by entering bits of code or even calling procedures within the user's code.

Web Development
Web development is now an integral part of Visual Basic .NET. The two major types of web applications are web forms and web services. Web forms let you create web-based applications with user interfaces. Web services are made up of code that can be called by other components on the Internet or by applications that use Internet protocols. Using web services, you can send and process data using HTTP and XML messaging standards on the Internet.

SPECIAL FEATURES IN VB.NET:

VB.NET is an ideal programming language for developing sophisticated professional applications for Microsoft Windows. It makes use of the graphical user interface for creating powerful applications, which enables the user to interact easily with an application.

VB.NET provides many aspects such as easier comprehension, user friendliness and faster application development, which help the developer to design the application more effectively.

VB.NET provides facilities such as the login dialog form, browser form, query form, option dialog form and wizard form, which enable the developer to design the application more effectively.

MS SQL
Microsoft SQL Server is a relational database management system developed by Microsoft. As a database, it is a software product whose primary function is to store and retrieve data as requested by other software applications, be it those on the same computer or those running on another computer across a network (including the Internet). There are at least a dozen different editions of Microsoft SQL Server aimed at different audiences and different workloads (ranging from small applications that store and retrieve data on the same computer, to millions of users and computers that access huge amounts of data from the Internet at the same time). Its primary query languages are T-SQL and ANSI SQL.

Introduction
Security is becoming increasingly important as more networks are connected together. Your organization's assets must be protected, particularly its databases, which contain your company's valuable information. Security is one of the critical features of a database engine, protecting the enterprise against myriad threats. The new security features of Microsoft SQL Server 2005 are designed to make it more secure and to make security more approachable and understandable to those who are responsible for data protection. During the past few years, the world has developed a far more mature understanding of what a secure, computer-based system must be. Microsoft has been at the forefront of this development, and SQL Server is one of the first server products that fully implements that understanding. It enables the important principle of least privilege, so you do not have to grant users more permissions than are necessary for them to do their jobs. It provides in-depth tools for defense so that you can implement measures to frustrate even the most skillful attackers.
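To illustrate the store-and-retrieve role described above from the VB.NET front end, here is a minimal ADO.NET sketch; the connection string, the AdCreator database and the AdTemplate table are assumptions for illustration only:

Imports System
Imports System.Data.SqlClient

Module AdDataAccess
    ' The server, database and table names below are illustrative assumptions.
    Private Const ConnStr As String = "Data Source=.\SQLEXPRESS;Initial Catalog=AdCreator;Integrated Security=True"

    Public Sub ListTemplates()
        Using cn As New SqlConnection(ConnStr)
            cn.Open()
            Using cmd As New SqlCommand("SELECT TemplateId, TemplateName FROM AdTemplate", cn)
                Using rdr As SqlDataReader = cmd.ExecuteReader()
                    While rdr.Read()
                        Console.WriteLine("{0}: {1}", rdr.GetInt32(0), rdr.GetString(1))
                    End While
                End Using
            End Using
        End Using
    End Sub
End Module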

FEATURES OF SQL SERVER
The OLAP Services feature available in SQL Server version 7.0 is now called SQL Server 2000 Analysis Services. The term OLAP Services has been replaced with the term Analysis Services. Analysis Services also includes a new data mining component. The Repository component available in SQL Server version 7.0 is now called Microsoft SQL Server 2000 Meta Data Services. References to the component now use the term Meta Data Services; the term repository is used only in reference to the repository engine within Meta Data Services.

A SQL Server database consists of the following types of objects:

1. TABLE

2. QUERY
3. FORM
4. REPORT
5. MACRO

SQL Server 2005 (code-named Yukon), released in October 2005, is the successor to SQL Server 2000. It included native support for managing XML data, in addition to relational data. For this purpose, it defined an xml data type that could be used either as a data type in database columns or as literals in queries. XML columns can be associated with XSD schemas; XML data being stored is verified against the schema. XML is converted to an internal binary data type before being stored in the database. SQL Server 2005 also allows a database server to be exposed over web services using TDS packets encapsulated within SOAP protocol requests. When the data is accessed over web services, results are returned as XML. SQL Server 2005 has also been enhanced with new indexing algorithms and better error-recovery systems. Data pages are checksummed for better error resiliency, and optimistic concurrency support has been added for better performance. Partitions on tables and indexes are supported natively, so scaling out a database onto a cluster is easier.

Architecture of SQL Server 2005
Protocol layer - implements the external interface to SQL Server. All operations that can be invoked on SQL Server are communicated to it via a Microsoft-defined format called Tabular Data Stream (TDS). TDS is an application-layer protocol used to transfer data between a database server and a client. Initially designed and developed by Sybase Inc. for their Sybase SQL Server relational database engine in 1984, and later by Microsoft in Microsoft SQL Server, TDS packets can be encased in other physical-transport-dependent protocols, including TCP/IP, named pipes, and shared memory. Consequently, access to SQL Server is available over these protocols. In addition, the SQL Server API is also exposed over web services.

Data storage - The main unit of data storage is a database, which is a collection of tables with typed columns. SQL Server supports different data types, including primary types such as Integer, Float, Decimal, Char (including character strings), Varchar (variable-length character strings), Binary (for unstructured blobs of data) and Text (for textual data), among others. It also allows user-defined composite types (UDTs) to be defined and used. SQL Server also makes server statistics available as virtual tables and views (called Dynamic Management Views or DMVs). A database can also contain other objects including views, stored procedures, indexes and constraints, in addition to tables, along with a transaction log.

Buffer management - SQL Server buffers pages in RAM to minimize disc I/O. The amount of memory available to SQL Server decides how many pages will be cached in memory. The buffer cache is managed by the Buffer Manager. Subsequent reads or writes are redirected to the in-memory copy, rather than the on-disc version. The page is updated on the disc by the Buffer Manager only if the in-memory cache has not been referenced for some time. While writing pages back to disc, asynchronous I/O is used, whereby the I/O operation is done in a background thread so that other operations do not have to wait for the I/O operation to complete.

Data retrieval - The main mode of retrieving data from an SQL Server database is querying for it. The query declaratively specifies what is to be retrieved; the query processor works out the sequence of steps that will be necessary to retrieve the requested data. The sequence of actions necessary to execute a query is called a query plan. There might be multiple ways to process the same query. In such a case, SQL Server chooses the plan that is expected to yield the results in the shortest possible time. This is called query optimization and is performed by the query processor itself. SQL Server includes a cost-based query optimizer, which tries to optimize on cost, in terms of the resources it will take to execute the query. Given a query, the query optimizer looks at the database schema and the database statistics and decides which sequence to execute the operations in and which access methods to use to access the tables. SQL Server also allows stored procedures to be defined. Stored procedures are parameterized T-SQL queries that are stored in the server itself. Stored procedures can accept values sent by the client as input parameters, and send back results as output parameters.

Distributed Databases
As more and more applications are used on an enterprise-wide basis or beyond, the ability of a single, centralized database to support dozens of major applications and thousands of concurrent users will continue to erode. Instead, major corporate databases will become more and more distributed, with dedicated databases supporting the major applications and functional areas of the corporation. To meet the higher service levels required of enterprise-wide or Internet-based applications, data must be distributed; but to ensure the integrity of business decisions and operations, the operation of these distributed databases must be tightly coordinated.

Roles and permissions
For a sense of the number of permissions available in SQL Server, you can invoke the fn_builtin_permissions system function:

SELECT * FROM sys.fn_builtin_permissions(default)
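The same catalog function can be read from the VB.NET front end. A minimal sketch follows, assuming only a valid connection string; it fills a DataTable and prints each securable class and permission name:

Imports System
Imports System.Data
Imports System.Data.SqlClient

Module PermissionCatalog
    ' The connection string is supplied by the caller and is an assumption;
    ' sys.fn_builtin_permissions is a documented SQL Server 2005 catalog function.
    Public Sub ListBuiltInPermissions(ByVal connStr As String)
        Dim permissions As New DataTable()
        Using cn As New SqlConnection(connStr)
            Using da As New SqlDataAdapter("SELECT class_desc, permission_name FROM sys.fn_builtin_permissions(default)", cn)
                da.Fill(permissions)
            End Using
        End Using
        For Each row As DataRow In permissions.Rows
            ' Each row names a securable class and a permission defined on it,
            ' e.g. OBJECT / SELECT or SERVER / CONTROL SERVER.
            Console.WriteLine("{0,-25} {1}", row("class_desc"), row("permission_name"))
        Next
    End Sub
End Module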

Here are the new permission types in SQL Server 2005:

CONTROL. Confers owner-like permissions that effectively grant all defined permissions to the object and all objects in its scope, including the ability to grant other grantees any permissions. CONTROL SERVER grants the equivalent of sysadmin privileges.

ALTER. Confers permission to alter any of the properties of the securable objects except to change ownership. Inherently confers permissions to ALTER, CREATE, or DROP securable objects within the same scope. For example, granting ALTER permissions on a database includes permission to change its tables.

ALTER ANY <securable object>. Confers permission to change any securable object of the type specified. For example, granting ALTER ANY ASSEMBLY allows changing any .NET assembly in the database, while at the server level granting ALTER ANY LOGIN lets the user change any login on that server.

IMPERSONATE ON <login or user>. Confers permission to impersonate the specified user or login. This permission is necessary to switch execution contexts for stored procedures. You also need this permission when impersonating in a batch.

TAKE OWNERSHIP. Confers the permission to the grantee to take ownership of the securable, using the ALTER AUTHORIZATION statement.

SQL Server 2005 still uses the familiar GRANT, DENY, and REVOKE scheme for assigning or refusing permissions on a securable object to a principal. The GRANT statement is expanded to cover all of the new permission options, such as the scope of the grant and whether the principal is able to grant the permission to other principals. Cross-database permissions are not allowed; to grant such permissions, you create a duplicate user in each database and separately assign each database's user the permission.

Like earlier versions of SQL Server, activating an application role suspends other permissions for the duration that the role is active. However, new in SQL Server 2005 is the ability to unset an application role. Another difference between SQL Server 2000 and 2005 is that when activating an application role, the role also suspends any server privilege, including public. For example, if VIEW ANY DEFINITION is granted to public, the application role won't honor it. This is most noticeable when accessing server-level metadata under an application role context.
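As a hedged sketch of how the GRANT/DENY scheme might be applied from the application's own setup code (the AdTemplate table and the AdDesigner database role are hypothetical names, not objects defined by this project):

Imports System.Data.SqlClient

Module PermissionSetup
    ' All names here (AdTemplate, AdDesigner) are hypothetical and used only
    ' to illustrate the GRANT/DENY syntax.
    Public Sub GrantDesignerAccess(ByVal connStr As String)
        Using cn As New SqlConnection(connStr)
            cn.Open()
            ' Let designers read the templates...
            Execute(cn, "GRANT SELECT ON OBJECT::dbo.AdTemplate TO AdDesigner")
            ' ...but refuse changes to the table definition.
            Execute(cn, "DENY ALTER ON OBJECT::dbo.AdTemplate TO AdDesigner")
        End Using
    End Sub

    Private Sub Execute(ByVal cn As SqlConnection, ByVal sql As String)
        Using cmd As New SqlCommand(sql, cn)
            cmd.ExecuteNonQuery()
        End Using
    End Sub
End Module

Because cross-database permissions are not allowed, the same statements would have to be repeated against a matching user in every database that needs them.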

Functional Requirements:
Functional requirements specify which outputs should be produced from the given inputs; they describe the relationship between the input and output of the system. For each functional requirement, a detailed description of all data inputs and their sources and the range of valid inputs must be specified.

Non-Functional Requirements:
Non-functional requirements describe user-visible aspects of the system that are not directly related to the functional behavior of the system. They include quantitative constraints, such as response time (i.e., how fast the system reacts to user commands) or accuracy (i.e., how precise the system's numerical answers are).

Pseudo Requirements:
These requirements are imposed by the client and restrict the implementation of the system. Typical pseudo requirements are the implementation language and the platform on which the system is to be implemented. They usually have no direct effect on the user's view of the system.

4. SOFTWARE ARCHITECTURE
4.1 Modules
Ad Creation
Ad Nature
Template Implementation
User Interface
File Management

4.2 Module Description

Ad Nature:


This module provides data support for creating ads, categorized into different sectors. It works like an ad server, supplying content that matches the user's requirements for a given topic.

Template Implementation:
This allows the user to use an available template or to add a new template to the design.
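A minimal sketch of how the available templates might be discovered, assuming they are stored as files in a template folder with a hypothetical .adt extension:

Imports System
Imports System.Collections.Generic
Imports System.IO

Module TemplateCatalog
    ' The folder layout and the .adt extension are assumptions; adjust them to
    ' however templates are actually stored. Adding a new template would then
    ' simply mean copying the user's file into this folder.
    Public Function GetAvailableTemplates(ByVal templateFolder As String) As List(Of String)
        Dim names As New List(Of String)
        For Each filePath As String In Directory.GetFiles(templateFolder, "*.adt")
            names.Add(Path.GetFileNameWithoutExtension(filePath))
        Next
        Return names
    End Function
End Module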

User Interface:
This window allows the user to select the nature of the ad and to provide details about the concern and any quote for the ad.

File Management:
This module handles the file operations of the tool. Attractive ads involve many multimedia files, and these files must be handled properly for the software to support them.
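As a hedged sketch of this module (the file-type filter and the single project folder are assumptions, not the project's actual layout), the user could pick multimedia files with an OpenFileDialog and the tool could copy them into the ad's project folder:

Imports System.IO
Imports System.Windows.Forms

Module MediaFiles
    ' The filter and the project-folder layout are assumptions made for
    ' illustration; the real tool may organise media differently.
    Public Sub ImportMediaFiles(ByVal projectFolder As String)
        Using dlg As New OpenFileDialog()
            dlg.Multiselect = True
            dlg.Filter = "Media files|*.jpg;*.png;*.gif;*.wav;*.mp3;*.avi|All files|*.*"
            If dlg.ShowDialog() = DialogResult.OK Then
                Directory.CreateDirectory(projectFolder)
                For Each sourcePath As String In dlg.FileNames
                    ' Keep one copy of every selected asset inside the ad project.
                    File.Copy(sourcePath, Path.Combine(projectFolder, Path.GetFileName(sourcePath)), True)
                Next
            End If
        End Using
    End Sub
End Module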

Ad creation:
Finally, this window allows the user to arrange the ad according to the selections made and to create the final ad.

Introduction to the Software Development Life Cycle (SDLC)

This section of the document will describe the Software Development Life Cycle (SDLC) for small to medium database application development efforts. This chapter presents an overview of the SDLC, alternate lifecycle models, and associated references. This chapter also describes the internal processes that are common across all stages of the SDLC and describes the inputs, outputs, and processes of each stage.

The SDLC Waterfall


Small to medium database software projects are generally broken down into six stages:

The relationship of each stage to the others can be roughly described as a waterfall, where the outputs from a specific stage serve as the initial inputs for the following stage. To follow the waterfall model, one proceeds from one phase to the next in a purely sequential manner. For example, after completing the project planning phase, one moves on to the requirements definition phase. When and only when the requirements are fully defined, one proceeds to design. This design should be a plan for implementing the requirements given. When and only when the design is fully completed, an implementation of that design is made by coders. Towards the later stages of this implementation phase, disparate software components produced by different teams are integrated. After the implementation and integration phases are complete, the software product is tested and debugged; any faults introduced in earlier phases are removed here. Then the software product is installed, and later maintained to introduce new functionality and remove bugs.

Thus the waterfall model maintains that one should move to a phase only when its preceding phase is completed and perfected. Phases of development in the waterfall model are thus discrete, and there is no jumping back and forth or overlap between them. The central idea behind the waterfall model is that time spent early on making sure that requirements and design are absolutely correct is very useful in economic terms (it will save much time and effort later). One should also make sure that each phase is 100% complete and absolutely correct before proceeding to the next phase of program creation. It is argued that the waterfall model is generally suited to software projects which are stable (especially those with unchanging requirements) and where it is possible and likely that designers will be able to fully predict problem areas of the system and produce a correct design before implementation is started. The waterfall model also requires that implementers follow the well-made, complete design accurately, ensuring that the integration of the system proceeds smoothly.

The waterfall model is, however, argued by many to be a bad idea in practice, mainly because of the belief that it is impossible to get one phase of a software product's lifecycle "perfected" before moving on to the next phases and learning from them (or at least, that this is impossible for any non-trivial program). For example, clients may not be aware of exactly what requirements they want before they see a working prototype and can comment upon it; they may change their requirements constantly, and program designers and implementers may have little control over this. If clients change their requirements after a design is finished, that design must be modified to accommodate the new requirements, invalidating a good deal of effort if overly large amounts of time have been invested. In response to the perceived problems with the "pure" waterfall model, many modified waterfall models have been introduced, namely Royce's final model, the sashimi model, and other alternative models. These models may address some or all of the criticism of the "pure" waterfall model. There are other alternative SDLC models, such as the Spiral and V models, which are explained in the later part of this chapter. After the project is completed, the Primary Developer Representative (PDR) and Primary End-User Representative (PER), in concert with other customer and development team personnel, develop a list of recommendations for enhancement of the current software.

Prototypes
The software development team, to clarify requirements and/or design elements, may generate mockups and prototypes of screens, reports, and processes. Although some of the prototypes may appear to be very substantial, they are generally similar to a movie set: everything looks good from the front but there is nothing in the back. When a prototype is generated, the developer produces the minimum amount of code necessary to clarify the requirements or design elements under consideration. No effort is made to comply with coding standards, provide robust error management or integrate with other database tables or modules. As a result, it is generally more expensive to retrofit a prototype with the necessary elements to produce a production module than it is to develop the module from scratch using the final system design document. For these reasons, prototypes are never intended for business use, and are generally crippled in one way or another to prevent them from being mistakenly used as production modules by end-users.

Allowed Variations
In some cases, additional information is made available to the development team that requires changes in the outputs of previous stages. In this case, the development effort is usually suspended until the changes can be reconciled with the current design, and the new results are passed down the waterfall until the project reaches the point where it was suspended. The PER and PDR may, at their discretion, allow the development effort to continue while previous stage deliverables are updated in cases where the impacts are minimal and strictly limited in scope. In this case, the changes must be carefully tracked to make sure all their impacts are appropriately handled.

Other SDLC Models


Apart from the waterfall model, other popular SDLC models are the Spiral model and the V-model, which are explained in this section.

Spiral Lifecycle

The spiral model starts with an initial pass through a standard waterfall lifecycle, using a subset of the total requirements to develop a robust prototype. After an evaluation period, the cycle is initiated again, adding new functionality and releasing the next prototype. This process continues, with the prototype becoming larger and larger with each iteration; hence the spiral. The spiral model is used most often in large projects and needs constant review to stay on target. For smaller projects, the concept of agile software development is becoming a viable alternative. Agile software development methods tend to be rather more extreme in their approach than the spiral model.

The theory is that the set of requirements is hierarchical in nature, with additional functionality building on the first efforts. This is a sound practice for systems where the entire problem is well defined from the start, such as modeling and simulation software. Business-oriented database projects do not enjoy this advantage. Most of the functions in a database solution are essentially independent of one another, although they may make use of common data. As a result, the prototype suffers from the same flaws as the prototyping lifecycle. For this reason, software development teams usually decide against the use of the spiral lifecycle for database projects.

V-Model
The V-model was originally developed from the waterfall software process model. The four main process phases (requirements, specification, design and implementation) each have a corresponding verification and validation testing phase: implementation of modules is tested by unit testing, system design is tested by integration testing, system specifications are tested by system testing and, finally, acceptance testing verifies the requirements. The V-model gets its name from the timing of the phases. Starting from the requirements, the system is developed one phase at a time until the lowest phase, the implementation phase, is finished. At this stage testing begins, starting from unit testing and moving up one test level at a time until the acceptance testing phase is completed. During the development stage, the program is tested at all levels simultaneously.

The different levels in the V-model are unit tests, integration tests, system tests and the acceptance test. The unit tests and integration tests ensure that the system design is followed in the code. The system and acceptance tests ensure that the system does what the customer wants it to do. The test levels are planned so that each level tests different aspects of the program and so that the testing levels are independent of each other. The traditional V-model states that testing at a higher level is started only when the previous test level is completed.

Verification and Validation

Verification: Are we building the product right?


The software should conform to its specification. Verification is the process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase; it also covers formal proof of program correctness.

Validation: Are we building the right product?


The software should do what the user really requires. The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.

The V & V process


V & V is a whole life-cycle process and must be applied at each stage in the software process. It has two principal objectives:

The discovery of defects in a system.
The assessment of whether or not the system is usable in an operational situation.

Verification vs Validation

Verification: Am I building the product right?
Validation: Am I building the right product?

Verification: The review of interim work steps and interim deliverables during a project to ensure they are acceptable; it determines whether the system is consistent, adheres to standards, uses reliable techniques and prudent practices, and performs the selected functions in the correct manner.
Validation: Determining whether the system complies with the requirements, performs the functions for which it is intended, and meets the organization's goals and user needs; it is traditional and is performed at the end of the project.

Verification: Am I accessing the data right (in the right place; in the right way)?
Validation: Am I accessing the right data (in terms of the data required to satisfy the requirement)?

Verification: Low-level activity.
Validation: High-level activity.

Verification: Performed during development on key artifacts, like walkthroughs, reviews and inspections, mentor feedback, training, checklists and standards.
Validation: Performed after a work product is produced, against established criteria, ensuring that the product integrates correctly into the environment.

Verification: Demonstration of consistency, completeness, and correctness of the software at each stage and between each stage of the development life cycle.
Validation: Determination of correctness of the final software product by a development project with respect to the user needs and requirements.

5. SYSTEM DESIGN
5.1 System Flow Diagram

[System flow diagram: the user provides the company profile; ad creation draws on the customer DB and the image collection; the user then selects a theme, modifies the slides, and adds audio, video and animation to produce the final ad.]

5.2 Data Flow Diagram

[Data flow diagram: the user sets a category and browses images from local memory; the selected images are stored in the image DB.]

6. SYSTEM TESTING
Testing is the process of executing a program with the intent of finding errors. During testing, the program to be tested is executed with a set of test cases, and the output of the program for those test cases is evaluated to determine whether the program is performing as expected. An error is defined as the difference between the actual output of the software and the correct output. Testing is usually relied upon to detect the faults introduced during the coding phase; for this, different levels of testing are used, which perform different tasks and aim to test different aspects of the system.

6.1 GOALS OF TESTING:
The famous statement by Dijkstra (in Dahl et al. 1972) is a perfect synthesis of the goals of testing. If the results delivered by the system are different from the expected ones in just one case, this unequivocally shows that the system is incorrect; by contrast, a correct behavior of the system on a finite number of cases does not guarantee correctness in the general case. For instance, we could have built a program that behaves properly for even integer numbers but not for odd numbers. Clearly, any number of tests with even input values will fail to show the error.

Testing should be based on sound and systematic techniques so that, after testing, we may have a better understanding of the product's reliability. Testing should help locate errors, not just detect their presence; the result of testing should not be viewed as simply providing a Boolean answer to the question of whether the software works properly or not. Tests should be organized in a way that helps to isolate errors; this information can then be used in debugging. Testing should be repeatable, i.e., tests should be arranged in such a way that repeating the same experiment (supplying the same input data to the same piece of code) produces the same results. Finally, testing should be accurate, as this will increase the reliability of testing. Here we should observe that the accuracy of the testing activity depends on the level of precision, and perhaps even formality, of the software specifications.

6.2 TESTING METHODOLOGIES:

6.2.1 Unit Testing
In unit testing, the different modules are tested against the specifications produced during design for those modules. It is essential for verification of the code produced during the coding phase, and the goal is to test the internal logic of each module.

6.2.2 Integration Testing
The goal here is to see whether the modules can be integrated properly, the emphasis being on testing the interfaces between modules. After structural testing and functional testing we get error-free modules; these modules are then integrated to obtain the required results of the system. After checking one module, another module is tested and integrated with the previous module. After the integration, the test cases are generated and the results are tested.

6.2.3 System Testing
Here the entire software is tested. The reference document for this process is the requirements document, and the goal is to see whether the software meets its requirements. The system was tested for various test cases with various inputs.

6.2.4 Validation Testing
In this testing, the software is tested to determine whether it suits the particular environment. Validation testing provides the final assurance that the software meets all functional, behavioral and performance requirements. Validation refers to the process of using the software in a live environment to find errors. During the course of validation, system failures may occur and the software will be changed. All the fields were tested to check whether they accept valid input or not.

6.2.5 Acceptance Testing
It is sometimes performed with realistic data of the client to demonstrate that the software is working satisfactorily. Testing here focuses on the external behavior of the system; the internal logic of the program is not emphasized.

In the acceptance test, the system is tested for various inputs. Thus different types of testing are performed.

6.2.6 Black Box Testing
Here the structure of the program is not considered. The test cases are decided solely on the basis of the requirements or specification of the program or module, and the internal details of the module or program are not considered when selecting test cases. This is also called functional testing. Black box testing attempts to find errors such as:

Incorrect or missing functions.
Performance errors.
Database access errors.
Initialization and termination errors.

6.2.7 White Box Testing
White box testing is concerned with testing the implementation of the program. The intention of structural testing is not to exercise all the different input and output conditions, but to exercise the different programming and data structures used in the program. This is also called structural testing.
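To make the unit-testing level described in 6.2.1 concrete, here is a hedged sketch of a unit test for a hypothetical helper that validates ad titles; both the helper and the expectations are illustrative assumptions rather than part of the delivered system:

Imports System

Module AdRules
    ' Hypothetical helper under test: an ad title must be non-empty and short
    ' enough to fit the caption area of a template.
    Public Function IsValidAdTitle(ByVal title As String) As Boolean
        Return Not String.IsNullOrEmpty(title) AndAlso title.Trim().Length <= 40
    End Function
End Module

Module AdRulesTests
    ' A minimal hand-rolled unit test; a framework such as NUnit or MSTest
    ' could be used instead.
    Public Sub RunAll()
        Check(AdRules.IsValidAdTitle("Summer Sale"), "accepts a normal title")
        Check(Not AdRules.IsValidAdTitle(""), "rejects an empty title")
        Check(Not AdRules.IsValidAdTitle(New String("x"c, 60)), "rejects an over-long title")
        Console.WriteLine("All unit tests passed.")
    End Sub

    Private Sub Check(ByVal condition As Boolean, ByVal testName As String)
        If Not condition Then Throw New Exception("Unit test failed: " & testName)
    End Sub
End Module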

7. SYSTEM IMPLEMENTATION
Implementation
Implementation is the stage in the project where the theoretical design of the project is turned into a working system. It is the stage where the operation of the system is monitored to ensure that it continues to work effectively. Education and training of the users are also essential to ensure smooth functioning of the system. The major tasks involved in the implementation are:

Computer-based/system testing.
Training the user personnel.
Full system testing and making the necessary changes as desired by the user.
Changeover.
Maintenance.

The implementation strategy used is the parallel changeover. The automated system has been put to use gradually so that its usage can prove better for the concern. After the system has been tested, the changeover from the existing system to the new system is a step-by-step process. At first only one module of the system is implemented and checked for suitability and efficiency; when the end-user related to that particular module is satisfied with the performance, the next step of implementation proceeds. Implementation is, to some extent, also parallel: modules which are not linked with other modules are implemented in parallel, and the remainder follows the step-by-step process. Backups are necessary, since unexpected events may happen at any time, and so during program execution the records are stored in the workspace. This helps to recover the original status of the records from any accidental updating or intentional deletion of records.

Implementation Procedures
Implementation means converting the older system to a new design in operation. This involves creating computer-compatible files and the basic software needed to run the system. The basic concepts needed for implementation are software installation and system requirements. So, in order to implement the system, suitable hardware and software must be available. Then the database must be created in the computer without changing the database names used in the table design.

Now the computer is ready for implementing the proposed system. There are three types of implementation:

Implementation of a new computer system to replace a manual one.
Implementation of a new computer system to replace an existing one.
Implementation of a modified application to replace an existing one.

User Training
Planning for user acceptance testing calls for the analyst and the user to agree on the conditions for the test. Many of these conditions may be derived from the test plan; others, such as the test schedule and the test duration, should be specified in advance.

Plan User Training
User training is designed to prepare the user for testing and converting the system. User involvement and training take place in parallel with programming, for three reasons:

The system group has time available to spend on training while the program is being written.
Initiating a user-training program gives the system group a clearer image of the user's interest in the new system.
A trained user participates more effectively in system testing.

For user training, preparation of a checklist is useful. Included are provisions for developing training materials and other documents to complete the training activity. In effect, the checklist calls for a commitment of personnel, facilities and effort for implementing the candidate system. The training plan is followed by preparation of the user training manual and other text materials. Facility requirements and the necessary hardware are specified and documented. A common procedure is to train supervisors and department heads who, in turn, train their staff as they see fit.

Operational Documentation
The operational document explains the different modules of this project, so that the form infrastructure is known in detail. There are different forms with different features; when a form is opened it displays the particular details about its contents, and the content displays all the details about the fields. In a particular form, records can be added, edited, deleted and updated. Each form presents the particular module that performs these operations.

System Maintenance
Software maintenance is, of course, far more than finding mistakes. Provision must be made for environmental changes, which may affect either the computer or other parts of the computer-based system. Such activity is normally called maintenance. It includes both the improvement of the system functions and the correction of faults which arise during the operation of a new system, and it may involve the continuing involvement of a large proportion of computer department resources. The main task may be to adapt existing systems in a changing environment. Systems should not be changed casually following informal requests. To avoid unauthorized amendments, all requests for changes should be channeled to a person nominated by management. The nominated person should have sufficient knowledge of the organization's computer-based systems to be able to judge the relevance of each proposed change.

8. CONCLUSION
The new Ad Maker / Ad Design Generator makes it easy to use high-quality graphic designs in your ads. Create eye-catching ads in three easy steps! Add images, a theme, audio and animation to produce a quality ad.

Benefits:

It's free!
No programming or software experience needed!
Attracts more attention to your ad message!
Multiple image category designs!
It's fun!

Future Enhancement
This tool can be made into a web-based tool; if it were, users could access it from anywhere and sellers could charge according to usage, which would benefit both buyers and sellers. At present the tool is ready to develop image and animation ads; in a future module it could be redesigned as a website designer that requires no coding.

9. BIBLIOGRAPHY

Book Reference:

10. APPENDIX

10.1 Screen Shots
