
ABSTRACT


On the Internet, the amount of web information increases rapidly, and users want to retrieve information that matches their preferences through search engines. This paper proposes a new type of search engine based on a web personalization approach. It captures the interests and preferences of the user in the form of concepts by mining search results and their clickthroughs.

Our approach improves search accuracy by separating the concepts into content concepts and location concepts, because of the important role location information plays in mobile search, and then organizing them into ontologies to build an ontology-based, multi-facet (OMF) user profile.

Moreover, recognizing the fact that different users and queries may have different

emphases on content and location information, we introduce the notion of content and

location entropies to measure the amount of content and location information associated with

a query, and click content and location entropies to measure how much the user is interested

in the content and location information in the results.

As a result, we propose to define personalization effectiveness based on the entropies and use it to balance the weights between the content and location facets. Finally, based on the derived ontologies and personalization effectiveness, we train an SVM to adapt a personalized ranking function for reranking future search results.

We perform extensive experiments to compare the precision produced by our OMF

profiles and that of a baseline method. Experimental results show that OMF improves the

precision significantly compared to the baseline.

CONTENTS

CHAPTER.NO TITLE PAGENO

LIST OF FIGURES 4

LIST OF TABLES 5

1. INTRODUCTION 6

1.1 OBJECTIVE 7

2. SYSTEM ANALYSIS 8

2.1 EXISTING SYSTEM 9


2.2 PROPOSED SYSTEM 9
2.3 FEASIBILITY STUDY 10
2.3.1 ECONOMICAL FEASIBILITY 10
2.3.2 OPERATIONAL FEASIBILITY 10
2.3.3 TECHNICAL FEASIBILITY 11

3. SYSTEM SPECIFICATION 12

3.1 HARDWARE REQUIREMENTS 13


3.2 SOFTWARE REQUIREMENTS 13

4. SOFTWARE DESCRIPTION 14
4.1. FRONT END 15
4.2. FEATURES 18

5. PROJECT DESCRIPTION 20

5.1 PROBLEM DEFINITION 21


5.2 OVERVIEW OF THE PROJECT 21
5.3 MODULE DESCRIPTION 22

5.3.1 PROFILE REGISTRATION 23
5.3.2 RANKING 23
5.3.3 CONTENT SEARCHING 24
5.3.4 LOCATION SEARCHING 25

5.4 DATA FLOW DIAGRAM 26

5.5 DATABASE DESIGN

5.5.1 TABLE 1 27
5.5.2 TABLE 2 27
5.5.3 TABLE 3 27

5.6 INPUT DESIGN 28

5.7 OUTPUT DESIGN 28

6. SYSTEM TESTING 29

6.1 UNIT TESTING 31

6.2 SYSTEM TESTING 31

6.3 INTEGRATION TESTING 31

6.4 VALIDATION TESTING 32

6.5 MAINTENANCE 32

7. SYSTEM IMPLEMENTATION 33

8. CONCLUSION & FUTURE ENHANCEMENTS 35

8.1 CONCLUSION 36

8.2 FUTURE ENHANCEMENTS 36

9. APPENDIX 37

9.1 SOURCE CODE 38

9.2 SCREEN SHOTS 56


10. REFERENCES 72

LIST OF FIGURES

FIGURE.NO FIGURE NAME PAGE.NO

1. SYSTEM ARCHITECTURE 30

2. DFD FOR PROFILE REGISTRATION 31

3. DFD FOR RANKING 32

4. DFD FOR CONTENT SEARCHING 33

5. DFD FOR LOCATION SEARCHING 34

6. DFD FOR SEARCH ENGINE 35

7. LOGIN FORM 56

8. INSERTING NEW URL FOR LOCATION BASED 56

9. INSERTING NEW URL FOR CONTENT BASED 57

10. VIEW DATABASE 58

11. DELETING WITHOUT SELECTING A RECORD 59

12. DELETE A RECORD FROM THE DATABASE 59

13. NEW USER REGISTERED 60

14. SEARCH ENGINE PAGE 60

15. HISTORY OF NEW USER 61

16. USER SEARCH RESULT FOR LOCATION BASED SEARCH 62

17. WEBSITE SELECTED BY THE USER FROM THE RESULT 63

18. USER SEARCH RESULT FOR CONTENT BASED SEARCH 64

19. WEBSITE SELECTED BY THE USER FROM THE RESULT 65

20. HISTORY OF THE USER SEARCHED DATA 66

LIST OF TABLES

TABLE.NO TABLE NAME PAGE.NO

1. DATABASE TABLE FOR LOGINDETAILS 37

2. DATABASE TABLE FOR SEARCHES 37

3. DATABASE TABLE FOR USER HISTORY 37

INTRODUCTION

CHAPTER 1

1.1 OBJECTIVE

In mobile search, the interaction between users and mobile devices is constrained by the small form factors of mobile devices. To reduce the amount of the user's interaction with the search interface, personalized search is one way to resolve the problem. By capturing the users' interests in user profiles, a personalized search middleware is able to adapt the search results obtained from general search engines to the users' preferences through personalized reranking of the search results. In the personalization process, user profiles play a key role in reranking search results and thus need to be trained constantly based on the user's search activities, i.e., the user's clicking and browsing behaviours. We propose an ontology-based, multi-facet (OMF) profiling strategy to capture both the users' content and location preferences (i.e., multi-facets) for building a personalized search engine for mobile users.
The general process of our approach consists of two major activities: 1) reranking and 2) profile updating.
Reranking: When a user submits a query, the search results are obtained from the
backend search engines (e.g., Google, MSNSearch, and Yahoo). The search results are
combined and reranked according to the user's profile trained from the user's previous search
activities.
Profile Updating: After the search results are obtained from the backend search
engines, the content and location concepts (i.e. important terms and phrases) and their
relationships are mined online from the search results and stored, respectively, as content
ontology and location ontology.
When the user clicks on a search result, the clicked result together with its associated
content and location concepts are stored in the user's clickthrough data. The content and
location ontologies, along with the clickthrough data, are then employed in RSVM [9]
training to obtain a content weight vector and a location weight vector for reranking the
search results for the user.
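
To make the reranking step concrete, the following C# sketch illustrates the idea in simplified form. It is only an illustration under assumed data structures (the class, method, and variable names are hypothetical and not taken from the project's source code): each result carries a content feature vector and a location feature vector, and its score is a weighted combination of the dot products with the trained content and location weight vectors.

using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch of the reranking step described above.
class SearchResult
{
    public string Url;
    public Dictionary<string, double> ContentFeatures = new Dictionary<string, double>();
    public Dictionary<string, double> LocationFeatures = new Dictionary<string, double>();
}

class RerankerSketch
{
    // Dot product of a trained weight vector with a result's feature vector.
    static double Dot(Dictionary<string, double> weights, Dictionary<string, double> features)
    {
        double sum = 0;
        foreach (var f in features)
            if (weights.TryGetValue(f.Key, out double w))
                sum += w * f.Value;
        return sum;
    }

    // Rerank by a weighted combination of the content and location scores.
    // contentEmphasis plays the role of the balance derived from personalization effectiveness.
    static IEnumerable<SearchResult> Rerank(IEnumerable<SearchResult> results,
        Dictionary<string, double> contentWeights,
        Dictionary<string, double> locationWeights,
        double contentEmphasis)
    {
        return results.OrderByDescending(r =>
            contentEmphasis * Dot(contentWeights, r.ContentFeatures) +
            (1 - contentEmphasis) * Dot(locationWeights, r.LocationFeatures));
    }

    static void Main()
    {
        var contentWeights = new Dictionary<string, double> { ["hotel"] = 0.8, ["booking"] = 0.3 };
        var locationWeights = new Dictionary<string, double> { ["hong kong"] = 0.9 };

        var results = new List<SearchResult>
        {
            new SearchResult { Url = "http://a.example",
                ContentFeatures = new Dictionary<string, double> { ["hotel"] = 1 } },
            new SearchResult { Url = "http://b.example",
                ContentFeatures = new Dictionary<string, double> { ["booking"] = 1 },
                LocationFeatures = new Dictionary<string, double> { ["hong kong"] = 1 } }
        };

        foreach (var r in Rerank(results, contentWeights, locationWeights, 0.4))
            Console.WriteLine(r.Url);
    }
}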

SYSTEM ANALYSIS

CHAPTER 2

2. SYSTEM ANALYSIS

2.1 EXISTING SYSTEM


In the existing system, the interaction between users and mobile devices is constrained by the small form factors of the mobile devices. To reduce the amount of the user's interaction with the search interface, an important requirement for a mobile search engine is to be able to understand the users' needs and deliver highly relevant information to the users. More complex queries have to be used to retrieve the information, and processing these queries makes retrieving information from the database more complex.
2.1.1 Drawbacks
 Time Complexity
 Complex Queries
 Less User Interactivity
2.2 PROPOSED SYSTEM
In our proposed system, we propose a content ontology and location ontology to
accommodate the extracted content and location concepts as well as the relationships among
the concepts. We introduce different entropies to indicate the amount of concepts associated
with a query and how much a user is interested in these concepts. With the entropies, we are
able to estimate the effectiveness of personalization for different users and different queries.
Based on the proposed ontologies and entropies, we adopt an SVM to learn personalized
ranking functions for content and location preferences. We use the personalization
effectiveness to integrate the learned ranking functions into a coherent profile for
personalized reranking. We implement a working prototype to validate the proposed ideas. It
consists of a middleware for capturing user clickthroughs, performing personalization, and
interfacing with commercial search engines at the backend. Empirical results show that OMF
can successfully capture users' content and location preferences and utilize the preferences to
produce relevant results for the users. Finally, it significantly outperforms strategies that
use either content or location preference only.
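
As a rough illustration of the entropy idea, the sketch below (a hypothetical helper, not part of the project code) computes a Shannon entropy over the distribution of a user's clicks across the concepts associated with a query; a low value suggests a focused interest, for which personalization is expected to be more effective.

using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: Shannon entropy over the click counts of the concepts
// associated with a query. This only illustrates the idea of a "click entropy";
// the exact formulation used in the project may differ.
class EntropySketch
{
    static double ClickEntropy(IDictionary<string, int> clicksPerConcept)
    {
        double total = clicksPerConcept.Values.Sum();
        if (total == 0) return 0;

        double entropy = 0;
        foreach (int clicks in clicksPerConcept.Values.Where(c => c > 0))
        {
            double p = clicks / total;
            entropy -= p * Math.Log(p, 2);   // log base 2
        }
        return entropy;
    }

    static void Main()
    {
        // User A clicks results about one concept only; user B spreads clicks out.
        var userA = new Dictionary<string, int> { ["hotel"] = 10 };
        var userB = new Dictionary<string, int> { ["hotel"] = 4, ["hostel"] = 3, ["resort"] = 3 };

        Console.WriteLine(ClickEntropy(userA)); // 0    : clicks concentrate on one concept
        Console.WriteLine(ClickEntropy(userB)); // ~1.57: interest is more diverse
    }
}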

2.3 FEASIBILITY STUDY

Every project is feasible given unlimited resources and infinite time. Unfortunately, the development of a computer-based system is more likely to be plagued by resource scarcity and a stringent delivery schedule. It is both necessary and prudent to evaluate the feasibility of a project at the earliest possible time.

Wastage of manpower and financial resources and untold professional embarrassment can be averted if an ill-conceived system is recognized in the definition phase. So, a detailed study was carried out to check the workability of the proposed system.

A feasibility study examines the system proposal regarding its workability, its impact on the organization, its ability to meet user needs, and its effective use of resources. When a new application is proposed, it normally goes through a feasibility study before it is approved for development. For any project to be successful there is a need for an effective feasibility study. The purpose of a feasibility study is not to solve the problem but to determine whether the problem is worth solving. There are three primary feasibility tests to be performed:

 Economical feasibility

 Technical feasibility

 Operational feasibility

2.3.1 Economical Feasibility

This study is carried out to check the economic impact that the system will have on the organization. The amount of funds that the company can pour into the research and development of the system is limited, so the expenditures must be justified. The developed system was well within the budget, and this was achieved because most of the technologies used are freely available; only the customized products had to be purchased.

2.3.2 Technical Feasibility

This study is carried out to check the technical feasibility, that is, the technical requirements of the system. Any system developed must not place a high demand on the available technical resources, as this will lead to high demands being placed on the client. The developed system must have modest requirements, since only minimal or no changes are required for implementing this system.

2.3.3 Operational Feasibility

This aspect of the study checks the level of acceptance of the system by the user. It includes the process of training the user to use the system efficiently. The user must not feel threatened by the system, but must instead accept it as a necessity. The level of acceptance by the users solely depends on the methods that are employed to educate the users about the system and to make them familiar with it. Their level of confidence must be raised so that they are also able to offer constructive criticism, which is welcomed, as they are the final users of the system.

SYSTEM SPECIFICATION

CHAPTER 3

3. SYSTEM SPECIFICATION

3.1 HARDWARE SPECIFICATION

PROCESSOR : PENTIUM 4 CPU 2.40GHZ

RAM : 128 MB
HARD DISK : 40 GB

KEYBOARD : STANDARD

MONITOR : 15"

3.2 SOFTWARE SPECIFICATION

FRONT END : .NET


BACK END : SQL SERVER 2005
OPERATING SYSTEM : WINDOWS XP
DOCUMENTATION : MS-OFFICE 2007

SOFTWARE DESCRIPTION

CHAPTER 4

4. LANGUAGE SPECIFICATION
4.1 FEATURES OF .NET
Microsoft .NET is a set of Microsoft software technologies for rapidly building and
integrating XML Web services, Microsoft Windows-based applications, and Web solutions.
The .NET Framework is a language-neutral platform for writing programs that can easily and
securely interoperate. There’s no language barrier with .NET: there are numerous languages
available to the developer including Managed C++, C#, Visual Basic and Java Script. The
.NET framework provides the foundation for components to interact seamlessly, whether
locally or remotely on different platforms. It standardizes common data types and
communications protocols so that components created in different languages can easily
interoperate.

“.NET” is also the collective name given to various software components built upon
the .NET platform. These will be both products (Visual Studio.NET and Windows.NET
Server, for instance) and services (like Passport, .NET My Services, and so on).

4.1.1 The .NET Framework

The .NET Framework has two main parts:

1. The Common Language Runtime (CLR).

2. A hierarchical set of class libraries.

The CLR is described as the “execution engine” of .NET. It provides the environment
within which programs run. The most important features are

 Conversion from a low-level assembler-style language, called Intermediate


Language (IL), into code native to the platform being executed on.
 Memory management, notably including garbage collection.
 Checking and enforcing security restrictions on the running code.
 Loading and executing programs, with version control and other such
features.
 The following features of the .NET framework are also worth description:

4.1.2 Managed Code

Managed code is code that targets .NET and contains certain extra information - "metadata" - to describe itself. Whilst both managed and unmanaged code can run in the runtime, only managed code contains the information that allows the CLR to guarantee, for instance, safe execution and interoperability.

4.1.3 Managed Data

With managed code comes managed data. The CLR provides memory allocation and deallocation facilities, and garbage collection. Some .NET languages use managed data by default, such as C#, Visual Basic.NET and JScript.NET, whereas others, namely C++, do not.
Targeting CLR can, depending on the language you’re using, impose certain constraints on
the features available. As with managed and unmanaged code, one can have both managed
and unmanaged data in .NET applications - data that doesn’t get garbage collected but instead
is looked after by unmanaged code.

4.1.4 Common Type System

The CLR uses something called the Common Type System (CTS) to strictly enforce
type-safety. This ensures that all classes are compatible with each other, by describing types
in a common way. CTS defines how types work within the runtime, which enables types in
one language to interoperate with types in another language, including cross-language
exception handling. As well as ensuring that types are only used in appropriate ways, the
runtime also ensures that code doesn’t attempt to access memory that hasn’t been allocated to
it.

4.1.5 Common Language Specification

The CLR provides built-in support for language interoperability. To ensure that you
can develop managed code that can be fully used by developers using any programming
language, a set of language features and rules for using them called the Common Language
Specification (CLS) has been defined. Components that follow these rules and expose only
CLS features are considered CLS-compliant.

4.1.6 The Class Library

.NET provides a single-rooted hierarchy of classes, containing over 7000 types. The
root of the namespace is called System; this contains basic types like Byte, Double, Boolean,
and String, as well as Object. All objects derive from System.Object. As well as objects,
there are value types. Value types can be allocated on the stack, which can provide useful

flexibility. There are also efficient means of converting value types to object types if and
when necessary.

The set of classes is pretty comprehensive, providing collections, file, screen, and
network I/O, threading, and so on, as well as XML and database connectivity.

The class library is subdivided into a number of sets (or namespaces), each providing
distinct areas of functionality, with dependencies between the namespaces kept to a
minimum.

4.1.7 Languages Supported by .NET

The multi-language capability of the .NET Framework and Visual Studio .NET
enables developers to use their existing programming skills to build all types of applications
and XML Web services. The .NET framework supports new versions of Microsoft’s old
favourites Visual Basic and C++ (as VB.NET and Managed C++), but there are also a number
of new additions to the family.

Visual Basic .NET has been updated to include many new and improved language
features that make it a powerful object-oriented programming language. These features
include inheritance, interfaces, and overloading, among others. Visual Basic also now
supports structured exception handling, custom attributes and also supports multi-threading.

Visual Basic .NET is also CLS compliant, which means that any CLS-compliant
language can use the classes, objects, and components you create in Visual Basic .NET.

Managed Extensions for C++ and attributed programming are just some of the
enhancements made to the C++ language. Managed Extensions simplify the task of migrating
existing C++ applications to the new .NET Framework.

C# is Microsoft’s new language. It’s a C-style language that is essentially “C++ for
Rapid Application Development”. Unlike other languages, its specification is just the
grammar of the language. It has no standard library of its own, and instead has been designed
with the intention of using the .NET libraries as its own.

Microsoft Visual J# .NET provides the easiest transition for Java-language developers
into the world of XML Web Services and dramatically improves the interoperability of Java-

language programs with existing software written in a variety of other programming
languages.

Active State has created Visual Perl and Visual Python, which enable .NET-aware
applications to be built in either Perl or Python. Both products can be integrated into the
Visual Studio .NET environment. Visual Perl includes support for Active State’s Perl Dev
Kit.

Other languages for which .NET compilers are available include

 FORTRAN
 COBOL
 EIFFEL
4.2 FEATURES OF C#.NET
C#.NET is compliant with the CLS (Common Language Specification) and supports structured exception handling. The CLS is a set of rules and constructs that are supported by the CLR (Common Language Runtime). The CLR is the runtime environment provided by the .NET Framework; it manages the execution of the code and also makes the development process easier by providing services.
C#.NET is a CLS-compliant language. Any objects, classes, or components created in C#.NET can be used in any other CLS-compliant language. In addition, we can use objects, classes, and components created in other CLS-compliant languages in C#.NET. The use of the CLS ensures complete interoperability among applications, regardless of the languages used to create them.

4.2.1 Constructors and Destructors

Constructors are used to initialize objects, whereas destructors are used to destroy them. In other words, destructors are used to release the resources allocated to the object. In C#.NET the Finalize method serves this purpose: it completes the tasks that must be performed when an object is destroyed and is called automatically when the object is destroyed. In addition, the Finalize method can be invoked only from the class it belongs to or from derived classes.
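
For illustration, a minimal C# example of a constructor and a destructor (finalizer); the class and messages are made up for the example.

using System;

class Connection
{
    private readonly string name;

    // Constructor: initializes the object.
    public Connection(string name)
    {
        this.name = name;
        Console.WriteLine("Opened " + name);
    }

    // Destructor (finalizer): run by the runtime when the object is destroyed,
    // typically used to release unmanaged resources.
    ~Connection()
    {
        Console.WriteLine("Released " + name);
    }
}

class Program
{
    static void Main()
    {
        var c = new Connection("demo");
        // The finalizer runs when the garbage collector destroys the object.
    }
}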

4.2.2 Garbage Collection

20
Garbage Collection is another new feature in C#.NET. The .NET Framework monitors
allocated resources, such as objects and variables. In addition, the .NET Framework
automatically releases memory for reuse by destroying objects that are no longer in use.

In C#.NET, the garbage collector checks for the objects that are not currently in use
by applications. When the garbage collector comes across an object that is marked for
garbage collection, it releases the memory occupied by the object.

4.2.3 Overloading

Overloading is another feature in C#. Overloading enables us to define multiple


procedures with the same name, where each procedure has a different set of arguments.
Besides using overloading for procedures, we can use it for constructors and properties in a
class.
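
A small example of overloading in C# (the Printer class below is made up for illustration):

using System;

class Printer
{
    // Three procedures with the same name but different argument lists.
    public void Print(string text)        { Console.WriteLine("Text: " + text); }
    public void Print(int number)         { Console.WriteLine("Number: " + number); }
    public void Print(string text, int n) { for (int i = 0; i < n; i++) Console.WriteLine(text); }
}

class Program
{
    static void Main()
    {
        var p = new Printer();
        p.Print("hello");   // resolves to Print(string)
        p.Print(42);        // resolves to Print(int)
        p.Print("hi", 2);   // resolves to Print(string, int)
    }
}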

4.2.4 Multithreading

C#.NET also supports multithreading. An application that supports multithreading can handle multiple tasks simultaneously; we can use multithreading to decrease the time taken by an application to respond to user interaction.
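
A minimal example of multithreading in C#, using a background thread so the main thread remains free (the task shown is made up for illustration):

using System;
using System.Threading;

class Program
{
    static void Main()
    {
        // A background thread handles a long-running task while the
        // main thread stays free to respond to the user.
        var worker = new Thread(() =>
        {
            Thread.Sleep(1000);   // simulate work
            Console.WriteLine("Background task finished");
        });
        worker.Start();

        Console.WriteLine("Main thread is still responsive");
        worker.Join();            // wait for the worker before exiting
    }
}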

4.2.5 Structured Exception Handling

C#.NET supports structured exception handling, which enables us to detect and handle errors at runtime. In C#.NET, we use Try…Catch…Finally statements to create exception handlers. Using Try…Catch…Finally statements, we can create robust and effective exception handlers to improve the reliability of our application.
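
A short example of structured exception handling using try…catch…finally (the error raised here is made up for illustration):

using System;

class Program
{
    static void Main()
    {
        try
        {
            int[] values = { 1, 2, 3 };
            Console.WriteLine(values[5]);                     // throws IndexOutOfRangeException
        }
        catch (IndexOutOfRangeException ex)
        {
            Console.WriteLine("Handled: " + ex.Message);      // error handled at runtime
        }
        finally
        {
            Console.WriteLine("Cleanup always runs");         // executes whether or not an exception occurred
        }
    }
}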

PROJECT DESCRIPTION

CHAPTER 5

5.1 PROBLEM DEFINITION

In mobile search, the interaction between users and mobile devices is constrained by the small form factors of the mobile devices. To reduce the amount of the user's interaction with the search interface, an important requirement for a mobile search engine is to be able to understand the users' needs and deliver highly relevant information to the users.
5.2 OVERVIEW OF THE PROJECT

In mobile search, the interaction between users and mobile devices is constrained by the small form factors of the mobile devices. To reduce the amount of the user's interaction with the search interface, an important requirement for a mobile search engine is to be able to understand the users' needs and deliver highly relevant information to the users. Personalized search is one way to resolve the problem. By capturing the users' interests in user profiles, a personalized search middleware is able to adapt the search results obtained from general search engines to the users' preferences through personalized reranking of the search results. In the personalization process, user profiles play a key role in reranking search results and thus need to be trained constantly based on the user's search activities. Several personalization techniques have been proposed to model users' content preferences via analysis of users' clicking and browsing behaviours. We recognize the importance of location information in mobile search and propose to incorporate the user's location preferences in addition to content preferences in user profiles. We propose an ontology-based, multi-facet (OMF) profiling strategy to capture both the users' content and location preferences (i.e., multi-facets) for building a personalized search engine for mobile users. The general process of our approach consists of two major activities: 1) reranking and 2) profile updating.

Reranking: When a user submits a query, the search results are obtained from the backend search engines (e.g., Google, MSNSearch, and Yahoo). The search results are combined and reranked according to the user's profile trained from the user's previous search activities.

Profile Updating: After the search results are obtained from the backend search engines, the content and location concepts (i.e., important terms and phrases) and their relationships are mined online from the search results and stored, respectively, as content ontology and location ontology. When the user clicks on a search result, the clicked result together with its associated content and location concepts are stored in the user's clickthrough data. The content and location ontologies, along with the clickthrough data, are then employed in RSVM [9] training to obtain a content weight vector and a location weight vector for reranking the search results for the user.

FIG 5.2.1 SYSTEM ARCHITECTURE OF PERSONALIZED WEB SEARCH WITH LOCATION PREFERENCE

5.3 MODULE DESCRIPTION


MODULES:

 Profile Registration
 Ranking
 Content Searching
 Location Searching

5.3.1 Profile Registration

In this module, the user registers their information, and the system provides a login for maintaining that information. It also maintains the searched data, which is useful for subsequent searches. Results are automatically ranked depending on the user's interest in a particular search and are reranked whenever the search criteria are modified. The user profile therefore contains not only profile information but also search content, which helps the system return immediate results for whatever information the user needs.

FIG 5.3.1 DATA FLOW DIAGRAM FOR PROFILE REGISTRATION

5.3.2 Ranking

In this module, when a user submits a query, the search results are obtained from the backend database. The search results are combined and reranked according to the user's profile trained from the user's previous search activities. After the search results are obtained from the backend search engines, the content and location concepts and their relationships are mined online from the search results and stored. When the user clicks on a search result, the clicked result together with its associated content and location concepts are stored in the user's clickthrough data. The content and location ontologies, along with the clickthrough data, are then used in training to obtain a content weight vector and a location weight vector for reranking the search results for the user.
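
In the prototype, this reranking is reflected in the database by a ranking counter that is incremented when the user clicks a result (see the appendix code). The sketch below illustrates the same idea with a parameterized query instead of string concatenation; the table and column names follow the table design in Section 5.5, and the connection string is assumed to be the one used in the appendix.

using System;
using System.Data.SqlClient;

class RankingSketch
{
    const string ConnStr = "Data Source=.;Initial Catalog=webSearch;Integrated Security=True";

    // Increment the ranking of a clicked site, then list sites for a keyword
    // ordered by their current ranking (highest first).
    static void ClickAndList(int clickedId, string keyword, string type)
    {
        using (var conn = new SqlConnection(ConnStr))
        {
            conn.Open();

            using (var update = new SqlCommand(
                "UPDATE sites SET ranking = ranking + 1 WHERE id = @id", conn))
            {
                update.Parameters.AddWithValue("@id", clickedId);
                update.ExecuteNonQuery();
            }

            using (var select = new SqlCommand(
                "SELECT url, description FROM sites " +
                "WHERE keywords LIKE @kw AND type = @type ORDER BY ranking DESC", conn))
            {
                select.Parameters.AddWithValue("@kw", "%" + keyword + "%");
                select.Parameters.AddWithValue("@type", type);   // 'CB' or 'LB'
                using (var reader = select.ExecuteReader())
                    while (reader.Read())
                        Console.WriteLine(reader["url"] + " - " + reader["description"]);
            }
        }
    }

    static void Main()
    {
        // Requires the assumed webSearch database to be available.
        ClickAndList(1, "hotel", "LB");
    }
}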

FIG 5.3.2 DATA FLOW DIAGRAM FOR RANKING

5.3.3 Content Searching

In content searching, the ontology shows the possible concept space arising from a user's queries. This ontology covers more than what the user actually wants. When a query is submitted, the data for the query is composed of various relevant data. If the user is indeed interested in some specific data, the clickthrough is captured and the clicked data is favoured. The content ontology together with the clickthroughs serves as the user profile in the personalization process. It is then transformed into a linear feature vector to rank the search results according to the user's content information preferences.
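
As a very small, hypothetical illustration of turning clicked content concepts into a score for ranking, the sketch below uses click counts of previously clicked concepts as feature weights; it is not the project's actual feature-vector construction.

using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: favour results whose snippets contain concepts
// the user has clicked on before (clicked concepts act as the feature weights).
class ContentFeatureSketch
{
    static double ContentScore(string snippet, IDictionary<string, int> clickedConcepts)
    {
        string text = snippet.ToLowerInvariant();
        // Sum the click counts of every clicked concept that appears in the snippet.
        return clickedConcepts
            .Where(c => text.Contains(c.Key))
            .Sum(c => (double)c.Value);
    }

    static void Main()
    {
        var clicked = new Dictionary<string, int> { ["hotel"] = 5, ["booking"] = 2 };
        Console.WriteLine(ContentScore("Cheap hotel booking in town", clicked)); // 7
        Console.WriteLine(ContentScore("Train timetable", clicked));             // 0
    }
}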

FIG 5.3.3 DATA FLOW DIAGRAM FOR CONTENT SEARCHING

5.3.4 Location Searching

In this module, extracting location concepts is different from extracting content concepts. First, a document usually embodies only a few location concepts; as a result, very few of them co-occur with the query terms in web snippets, so we extract location concepts from the full documents. Second, due to the small number of location concepts embodied in documents, the similarity and parent-child relationships cannot be accurately derived statistically.
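
One simple way to extract location concepts from the full document, sketched below, is to match the text against a predefined list of known location names (a gazetteer). This is an assumed illustration only; the project does not prescribe a particular gazetteer.

using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: extract location concepts by matching the full document
// text against a small, predefined list of location names.
class LocationExtractionSketch
{
    static readonly string[] Gazetteer = { "hong kong", "kowloon", "new york", "london" };

    static List<string> ExtractLocations(string documentText)
    {
        string text = documentText.ToLowerInvariant();
        return Gazetteer.Where(loc => text.Contains(loc)).ToList();
    }

    static void Main()
    {
        string doc = "Our office in Kowloon serves customers across Hong Kong.";
        foreach (string loc in ExtractLocations(doc))
            Console.WriteLine(loc);   // kowloon, hong kong
    }
}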

FIG 5.3.4 DATA FLOW DIAGRAM FOR LOCATION SEARCHING

5.4 DATA FLOW DIAGRAM

FIG. 5.4.1 DATA FLOW DIAGRAM FOR SEARCH ENGINE

5.5 DATABASE DESIGN

SLNO COLUMN NAME DATA TYPE

1 USR NAME VARCHAR(12)
2 PASWD VARCHAR(12)
3 TYPE VARCHAR(12)

TABLE 5.5.1 TABLE FOR LOGINDETAILS

SLNO COLUMN NAME DATA TYPE


1 ID INT
2 URL VARCHAR(25)
3 DESCRIPTION VARCHAR(25)
4 KEYWORDS VARCHAR(25)
5 TYPE VARCHAR(3)
6 RANKING INT
7 DATE CREATED DATETIME

TABLE 5.5.2 TABLE FOR SEARCHES

SLNO COLUMN NAME DATA TYPE


1 LINKED INT
2 GID INT
3 URL VARCHAR(100)
4 DESCRIPTION VARCHAR(100)
5 DATE CREATED DATE TIME

TABLE 5.5.3 TABLE FOR USER HISTORY
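
The schemas above can be created with T-SQL statements executed from C#. The following sketch is an assumed reconstruction that combines the table designs in this section with the column names used in the appendix code (logindetails, sites, usrhistry); it is not taken verbatim from the project.

using System.Data.SqlClient;

class SchemaSketch
{
    const string ConnStr = "Data Source=.;Initial Catalog=webSearch;Integrated Security=True";

    static void CreateTables()
    {
        // Column names follow the appendix code; lengths follow the table designs above.
        string[] ddl =
        {
            @"CREATE TABLE logindetails (
                uid  INT,
                name VARCHAR(12),
                pass VARCHAR(12),
                type VARCHAR(12))",

            @"CREATE TABLE sites (
                id          INT,
                url         VARCHAR(25),
                description VARCHAR(25),
                keywords    VARCHAR(25),
                type        VARCHAR(3),
                ranking     INT,
                datecreated DATETIME)",

            @"CREATE TABLE usrhistry (
                linkid      INT,
                uid         INT,
                url         VARCHAR(100),
                description VARCHAR(100),
                date        DATETIME)"
        };

        using (var conn = new SqlConnection(ConnStr))
        {
            conn.Open();
            foreach (string statement in ddl)
                using (var cmd = new SqlCommand(statement, conn))
                    cmd.ExecuteNonQuery();
        }
    }

    static void Main()
    {
        // Requires the assumed webSearch database to exist on the local server.
        CreateTables();
    }
}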

5.6 INPUT DESIGN

Input design is a part of the overall system design and requires very careful attention. If the data going into the system is incorrect, then the processing and the output will magnify the errors. The clear objective of input design is to ensure that the input is acceptable to and understood by the user.

The input is fed as the description of the title and the URL of the site, followed by keywords, and an indication of whether to include it in the content-based or location-based search. With the help of this input, the database stores the entry and retrieves the data when the user searches for it.
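
As an illustration of how such an input record could be stored, the sketch below inserts a new URL with a parameterized command. The appendix code builds this SQL by string concatenation; parameters are shown here as a safer variant under the same assumed schema.

using System;
using System.Data.SqlClient;

class InputSketch
{
    const string ConnStr = "Data Source=.;Initial Catalog=webSearch;Integrated Security=True";

    // type is "CB" for content-based or "LB" for location-based entries.
    static void AddSite(string url, string description, string keywords, string type)
    {
        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand(
            "INSERT INTO sites (url, description, keywords, type, ranking, datecreated) " +
            "VALUES (@url, @desc, @kw, @type, 0, @created)", conn))
        {
            cmd.Parameters.AddWithValue("@url", url);
            cmd.Parameters.AddWithValue("@desc", description);
            cmd.Parameters.AddWithValue("@kw", keywords);
            cmd.Parameters.AddWithValue("@type", type);
            cmd.Parameters.AddWithValue("@created", DateTime.Now);

            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    static void Main()
    {
        // Requires the assumed webSearch database and sites table to exist.
        AddSite("http://example.com", "Example site", "example demo", "CB");
    }
}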

5.7 OUTPUT DESIGN

The output information should be presented in a good format, in such a way that anyone who is entitled to the information finds it informative. Well-designed output increases clerical efficiency. The most common output medium is the CRT screen. The output design of the application is done in a very efficient way, and the output display is embedded in various text areas. The usage of text areas for the output display eliminates much of the complexity encountered while displaying output. The output is therefore easily distinguishable, and the user is provided with a very interactive interface in the output design.

SYSTEM TESTING

CHAPTER 6

6. TESTING

Testing is vital to the success of the system. System testing makes a logical assumption that if all parts of the system are correct, the goal will be successfully achieved.

In the testing process we test the actual system in an organization and gather errors
from the new system and take initiatives to correct the same. All the front-end and back-end
connectivity are tested to be sure that the new system operates in full efficiency as stated.
System testing is the stage of implementation, which is aimed at ensuring that the system
works accurately and efficiently.

The main objective of testing is to uncover errors in the system. For the uncovering process we have to give proper input data to the system, so we should be conscious about the input data we give. It is important to give correct inputs for efficient testing.

Testing is done for each module. After testing all the modules, the modules are integrated and testing of the final system is done with test data specially designed to show that the system will operate successfully in all aspects and conditions. Thus the system testing is a confirmation that all is correct and an opportunity to show the user that the system works. Inadequate testing or non-testing leads to errors that may appear a few months later.

This will create two problems:

 Time delay between the cause and the appearance of the problem.
 The effect of the system errors on files and records within the system.

The purpose of system testing is to consider all the likely variations to which the system will be subjected and to push the system to its limits.

The testing process focuses on the logical internals of the software, ensuring that all statements have been tested, and on the functional internals, i.e., conducting tests to uncover errors and ensure that defined inputs will produce actual results that agree with the required results. Testing is done using the two common steps of unit testing and integration testing. In the project, system testing is carried out as follows:

The procedure-level testing is done first. By giving improper inputs, the errors that occur are noted and eliminated. This is the final step in the system life cycle: here we implement the tested, error-free system in a real-life environment and make the necessary changes so that it runs in an online fashion. System maintenance is done every month or year based on company policies, and the system is checked for errors such as runtime errors and long-run errors, and for other maintenance such as table verification and reports.

6.1 UNIT TESTING

In unit testing the modules of the system are tested as individual units. Each unit has
definite input and output parameters and often a definite single function.

Hence unit testing is otherwise known as program testing. A program unit is usually small enough that the programmer who developed it can test it in great detail, and certainly in greater detail than will be possible when the unit is integrated into an evolving software product.

There are four categories of tests that a programmer will typically perform on a
program unit.

6.2 SYSTEM TESTING

In system testing, the system is tested as a whole; that is, the intercommunication among the individual units and the functions of the complete system are tested.

6.3 INTEGRATION TESTING

Integration testing is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with interfacing.

While performing integration testing, all the modules of the project were integrated as
a whole project and checked whether the correct output was produced.

The objective is to take unit-tested components and build a program structure that has been dictated by the design.

6.3.1 Top Down Integration Testing

It is an incremental approach for the construction of a program structure. Here,


modules are integrated by moving downwards through the control hierarchy, beginning with
the main control module.

6.3.2 Bottom Up Integration Testing

It begins with the construction and testing of atomic modules. Because components are integrated from the bottom up, the processing required for components subordinate to a given level is available and the need for stubs is eliminated.

6.4. VALIDATION TESTING

Software validation is achieved through a series of black box tests that demonstrate conformity with requirements.

Test plan outlines classes of test to be conducted and test procedure defines the
specific test cases that will be used to demonstrate conformity with requirements. Both the
plan and procedure are designed to ensure that all functional requirements are satisfied, all
behavioral characteristics are achieved and all performance requirements are attained.

6.5. MAINTENANCE

Once the project or system is subjected to various testing techniques, the final stage of
software development lifecycle is maintenance.

After the product has been installed at the customer's end, we carry out maintenance by continuously monitoring the performance of the system.

When there is a reduction in the level of efficiency, the cause and effects are analyzed
and rectified.

SYSTEM IMPLEMENTATION

CHAPTER 7

7. SYSTEM IMPLEMENTATION
Implementation is the stage of the project when the theoretical design is turned into a working system. It can thus be considered the most critical stage in achieving a successful new system and in giving the user confidence that the new system will work and be effective.

The implementation stage involves careful planning, investigation of the existing system and its constraints on implementation, design of methods to achieve changeover, and evaluation of changeover methods.

Implementation is the process of converting a new system design into operation. It is


the phase that focuses on user training, site preparation and file conversion for installing a
candidate system. The important factor that should be considered here is that the conversion
should not disrupt the functioning of the organization.

CONCLUSION

CHAPTER 8

8.1. CONCLUSION
We proposed an Ontology-Based, Multi-Facet (OMF) personalization framework for
automatically extracting and learning a user's content and location preferences based on the

user's clickthrough. In the OMF framework, we develop different methods for extracting
content and location concepts, which are maintained along with their relationships in the
content and location ontologies. We also introduced the notion of content and location
entropies to measure the diversity of content and location information associated with a query
and click content and location entropies to capture the breadth of the user's interests in these
two types of information. Based on the entropies, we derived personalization effectiveness
and showed with a case study that personalization effectiveness differs for different classes of
users and queries. Experimental results confirmed that OMF can provide more accurate
personalized results compared to the existing methods.

8.2. FUTURE ENHANCEMENT


As for the future work, we plan to study the effectiveness of other kinds of concepts
such as people names and time for personalization. We will also investigate methods to
exploit a user's content and location preference history to determine regular user patterns or
behaviours for enhancing future search.

APPENDIX

CHAPTER 9

9.1. SOURCE CODE

Admin

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using System.Data.SqlClient;
namespace Location_Search
{
public partial class AdminPage : Form
{
#region //========== Global Declaration of SQL connection =====================================
SqlConnection conn = new SqlConnection("Data Source=.;Initial Catalog=webSearch;Integrated Security=True");
#endregion

public AdminPage()
{
InitializeComponent();
}

#region //=========== Execute code to store values in database =====================
private void bt_add_Click(object sender, EventArgs e)
{
//Check whether user has specified all the contents in textboxes.
if ((tb_descripiton.Text != "") && (tb_keywords.Text != "") && (tb_url.Text != ""))
{
label7.Visible = label8.Visible = label9.Visible = false;
if (conn.State == ConnectionState.Closed)
conn.Open();

//Check whether the url already exist in database or not
SqlCommand cmd = new SqlCommand("select * from sites where url='" +
tb_url.Text + "'", conn);
SqlDataReader dr_submitdetails = cmd.ExecuteReader();
if (dr_submitdetails.HasRows)
{
MessageBox.Show("URL already exists");
dr_submitdetails.Close();
tb_descripiton.Text = tb_keywords.Text = tb_url.Text = "";
}
else
{
dr_submitdetails.Close();
SqlCommand cmd_insert;
if (rb_content.Checked)
//insert values into sites table url,desc,key,rank,date
cmd_insert = new SqlCommand("insert into sites(url,description,keywords,type,ranking,datecreated) values('" + tb_url.Text + "','" + tb_descripiton.Text + "','" + tb_keywords.Text + "','CB','0','" + DateTime.Now + "')", conn);
else
cmd_insert = new SqlCommand("insert into sites(url,description,keywords,type,ranking,datecreated) values('" + tb_url.Text + "','" + tb_descripiton.Text + "','" + tb_keywords.Text + "','LB','0','" + DateTime.Now + "')", conn);
cmd_insert.ExecuteNonQuery();
MessageBox.Show("Records Inserted Successfully", "Success");
}
}
else
label7.Visible = label8.Visible = label9.Visible = true;
}
#endregion
#region //=========== Navigate to view Database windows ====================

private void bt_view_Click(object sender, EventArgs e)
{
DatBase frm_DB = new DatBase();
frm_DB.ShowDialog();
}
#endregion
#region //============ When admin window is closed navigate to login window ==========
private void AdminPage_FormClosing(object sender, FormClosingEventArgs e)
{
LoginPage Login = new LoginPage();
Login.Show();
}
#endregion
private void btnAddUsr_Click(object sender, EventArgs e)
{
AddUser newuser = new AddUser();
newuser.ShowDialog();
}
}
}

New user

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using System.Data.SqlClient;

namespace Location_Search
{
public partial class AddUser : Form
{
#region //========== Global Declaration of SQL connection =====================================
SqlConnection conn = new SqlConnection("Data Source=.;Initial Catalog=webSearch;Integrated Security=True");
#endregion

public AddUser()
{
InitializeComponent();
}
private void AddUser_Load(object sender, EventArgs e)
{

userid();
}
private void btnAddUser_Click(object sender, EventArgs e)
{
if ((txtUsrName.Text != "") && (txtpass.Text != "") && (txtConPass.Text != ""))
{
lblerr1.Visible = lblerr2.Visible = lblerr3.Visible = false;
if ((txtpass.Text) == (txtConPass.Text))
{
conn.Open();
SqlCommand cmd = new SqlCommand("insert into logindetails values('" +
txtUid.Text + "','" + txtUsrName.Text + "','" + txtpass.Text + "','user')", conn);
cmd.Connection = conn;
cmd.ExecuteNonQuery();
MessageBox.Show("New User Added Successfully","DataBase");
cmd.Dispose();

conn.Close();
userid();
txtUsrName.Text = "";
txtpass.Text = "";
txtConPass.Text = "";
}
else
{
MessageBox.Show("Confirmation of Password Mismatch","Alert!!");
}
}
else
{
lblerr1.Visible = lblerr2.Visible = lblerr3.Visible = true;
}
}
public void userid()
{
int uid;
conn.Open();
SqlCommand cmd = new SqlCommand("select max(uid) from logindetails", conn);
txtUid.Text = cmd.ExecuteScalar().ToString();
uid = Convert.ToInt32(txtUid.Text) + 1;
txtUid.Text = uid.ToString();
cmd.Dispose();
conn.Close();
}
}
}

Database

using System;
using System.Collections.Generic;

using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using System.Data.SqlClient;
namespace Location_Search
{
public partial class DatBase : Form
{
#region //============= Global declaration of connection string =============================
SqlConnection conn = new SqlConnection("Data Source=.;Initial Catalog=webSearch;Integrated Security=True");
#endregion
public DatBase()
{
InitializeComponent();
}
#region //============== Load all data from database to display in grid ====================
private void DatBase_Load(object sender, EventArgs e)
{
if (conn.State == ConnectionState.Closed)
conn.Open();
SqlCommand cmd = new SqlCommand("select id as LinkID,url as URL, description
as Description,keywords as Keywords, type as Type,ranking as Ranking,datecreated as
DateCreated from sites", conn);
DataSet ds = new DataSet();
SqlDataAdapter myadapter = new SqlDataAdapter(cmd);
myadapter.Fill(ds);
DgView_DetaildDB.DataSource = ds.Tables[0].DefaultView;
if (DgView_DetaildDB.Rows.Count == 1)

MessageBox.Show("No Records Found", "Records not Found");
if (conn.State == ConnectionState.Open)
conn.Close();
}
#endregion
#region //============= Delete selected link from database ===========================
private void bt_delselectd_Click(object sender, EventArgs e)
{
if (DgView_DetaildDB.SelectedRows.Count > 0)
{
string sz_sno =
Convert.ToString(DgView_DetaildDB.SelectedRows[0].Cells[0].Value);
if (sz_sno != "")
{
if (conn.State == ConnectionState.Closed)
conn.Open();
SqlCommand cmd = new SqlCommand("Delete from sites where id='" + sz_sno
+ "'", conn);
cmd.ExecuteNonQuery();
foreach (DataGridViewRow dr in DgView_DetaildDB.SelectedRows)
{
DgView_DetaildDB.Rows.Remove(dr);
}
MessageBox.Show("Deleted Successfully");
if (conn.State == ConnectionState.Open)
conn.Close();
}
else
MessageBox.Show("Please select Valid Data");
}
else
MessageBox.Show("Please select any Data");

}
#endregion

}
}

Login

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using System.Data.SqlClient;
namespace Location_Search
{
public partial class LoginPage : Form
{
#region //========== Global Declaration of SQL connection =====================================
SqlConnection conn = new SqlConnection("Data Source=.;Initial Catalog=webSearch;Integrated Security=True");
#endregion
public LoginPage()
{
InitializeComponent();
}

#region //=========== Global Declaration of variables to store user name and id ==============
public static string str = "";
public static string id="";

#endregion
#region //============ Execute code when Login button clicked ======================
private void bt_login_Click(object sender, EventArgs e)
{
if (conn.State == ConnectionState.Closed)
conn.Open();
SqlCommand cmd = new SqlCommand("select type,uid from logindetails where
name='" + txt_UserName.Text + "' and pass = '" + txt_passwd.Text + "'", conn);
SqlDataReader dr_login = cmd.ExecuteReader();
if (dr_login.HasRows)
{
dr_login.Read();
if (dr_login[0].ToString() == "admin")
{
AdminPage frM_ad = new AdminPage();
frM_ad.ShowDialog();
this.Hide();
}
else
{
//===================== Store uid to global variable "id" from login table ==========================
id = dr_login["uid"].ToString();
UserPage frm_user = new UserPage();
str = txt_UserName.Text;
frm_user.ShowDialog();
//sz_frm1Dept = dr_login[0].ToString();
//sz_frm1logid = dr_login[1].ToString();
//MainPage frm_mp = new MainPage();
// frm_mp.Show();
}
dr_login.Close();

txt_passwd.Text = txt_UserName.Text = "";
//this.Hide();
}
else
{
MessageBox.Show("Invalid Login Credentials" + "," + "Please try again",
"ERROR");
dr_login.Close();
}
if (conn.State == ConnectionState.Open)
conn.Close();
}
#endregion
private void LoginPage_Load(object sender, EventArgs e)
{

}
#region //======= End Application when form is closed ===========================
private void LoginPage_FormClosing(object sender, FormClosingEventArgs e)
{
Application.Exit();
}
#endregion

}
}

User History

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;

using System.Drawing;
using System.Text;
using System.Windows.Forms;
using System.Data.SqlClient;
namespace Location_Search
{
public partial class UserHistory : Form
{
#region //================= Global declaration of variable to store in query ==================
public static string sz_link;
public static string desc = "";
public string linkid = "";
#endregion
#region //================= Global declaration SQL connection ==================
SqlConnection conn = new SqlConnection("Data Source=.;Initial Catalog=webSearch;Integrated Security=True");
#endregion
public UserHistory()
{
InitializeComponent();
}
#region //=========== Load user browsed history in grid and display =============
private void UserHistory_Load(object sender, EventArgs e)
{
if (conn.State == ConnectionState.Closed)
conn.Open();
SqlCommand cmd = new SqlCommand("select linkid as LinkID,uid as UserID,url as
URL, description as Description,date as LastAccess from usrhistry where uid='" +
LoginPage.id + "'", conn);
DataSet ds = new DataSet();

SqlDataAdapter myadapter = new SqlDataAdapter(cmd);
myadapter.Fill(ds);
dataGridView1.DataSource = ds.Tables[0].DefaultView;

if (dataGridView1.Rows.Count == 0)
{
MessageBox.Show("No History Found", "Records not Found");
btnLink.Enabled = false;
}
if (conn.State == ConnectionState.Open)
conn.Close();
}
#endregion
#region //============ Clear all the history for the particular user ============
private void btnHistory_Click(object sender, EventArgs e)
{
if (dataGridView1.RowCount == 0)
{
MessageBox.Show("History Cleared", "No Records Found");
btnLink.Enabled = false;
}
if (conn.State == ConnectionState.Closed)
conn.Open();
SqlCommand cmd = new SqlCommand("delete from usrhistry where uid='" +
LoginPage.id + "'", conn);
cmd.ExecuteNonQuery();
dataGridView1.DataSource = null;
btnLink.Enabled = false;
}
#endregion
#region //======== Goto Link to which user clicked ====================
private void btnLink_Click(object sender, EventArgs e)
{

if (dataGridView1.SelectedRows.Count > 0)
{
// Get link,desc and link id to store in user history
string sz_sno = Convert.ToString(dataGridView1.SelectedRows[0].Cells[2].Value);
desc = Convert.ToString(dataGridView1.SelectedRows[0].Cells[3].Value);
linkid = Convert.ToString(dataGridView1.SelectedRows[0].Cells[0].Value);
if (sz_sno != "")
{
sz_link = sz_sno;
if (conn.State == ConnectionState.Closed)
conn.Open();
SqlCommand cmd = new SqlCommand("Select ranking from sites where id='" +
dataGridView1.SelectedRows[0].Cells[0].Value + "'", conn);
int n_rnk = Convert.ToInt32(cmd.ExecuteScalar());
n_rnk++;
SqlCommand cmd_update = new SqlCommand("Update sites set ranking = " +
n_rnk + " where id='" + dataGridView1.SelectedRows[0].Cells[0].Value + "'", conn);
cmd_update.ExecuteNonQuery();
#region //====== store history of the user ========
//Call user function to update user record
user();
#endregion
if (conn.State == ConnectionState.Open)
conn.Close();
Engine_Data frm_engin = new Engine_Data();
frm_engin.ShowDialog();
}
else
MessageBox.Show("Please select Valid URL");
}
else
MessageBox.Show("Please select any URL");
#endregion

#region //========== Save history of the user ===============
public void user()
{
SqlConnection con = new SqlConnection("Data Source=.;Initial Catalog=webSearch;Integrated Security=True");
con.Open();
string query = "insert into usrhistry values ('" + linkid + "','" + LoginPage.id + "','" +
sz_link + "','" + desc + "','" + DateTime.Now + "')";
SqlCommand cmd = new SqlCommand(query, con);
cmd.ExecuteNonQuery();
}
#endregion

}
}

Userpage

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using System.Data.SqlClient;
namespace Location_Search
{
public partial class UserPage : Form
{
#region //========== Global declaration for sql connection =====================
SqlConnection conn = new SqlConnection("Data Source=.;Initial Catalog=webSearch;Integrated Security=True");
#endregion
#region //=========== Global declaration of variables for user history =======
public static string sz_link;
public static string desc = "";
public static string key = "";
public static string linkid = "";
#endregion
public UserPage()
{
InitializeComponent();
}
#region //========= Search the keywork in database ======================
private void bt_search_Click(object sender, EventArgs e)
{
if (tb_search.Text != "")
{
if (conn.State == ConnectionState.Closed)
conn.Open();
string sz_like = "%" + tb_search.Text + "%";
SqlCommand cmd; //= new SqlCommand("select id as ID,url as URL, description as Description from searches where description like '" + sz_like + "' order by ranking desc", conn);
if (rb_content.Checked)
cmd = new SqlCommand("select id as ID,url as URL, description as Description from sites where keywords like '" + sz_like + "' and type='CB' order by ranking desc", conn);
else
cmd = new SqlCommand("select id as ID,url as URL, description as Description from sites where keywords like '" + sz_like + "' and type='LB' order by ranking desc", conn);
DataSet ds = new DataSet();
SqlDataAdapter myadapter = new SqlDataAdapter(cmd);
myadapter.Fill(ds);
DgView_Links.DataSource = ds.Tables[0].DefaultView;
//DataGrid.Columns(0).HeaderText = MyTextBox.Text

if (DgView_Links.Rows.Count == 1)
MessageBox.Show("No Records Found", "Records not Found");
if (conn.State == ConnectionState.Open)
conn.Close();
}
else
{
MessageBox.Show("Please Enter Search Keyword", "Search Warning");
}
}
#endregion
#region //========== Goto the link to which user selected ==============
private void bt_link_Click(object sender, EventArgs e)
{
if (DgView_Links.SelectedRows.Count > 0)
{
//get link,desc and link id to save user history
string sz_sno = Convert.ToString(DgView_Links.SelectedRows[0].Cells[1].Value);
desc = Convert.ToString(DgView_Links.SelectedRows[0].Cells[2].Value);
linkid = Convert.ToString(DgView_Links.SelectedRows[0].Cells[0].Value);
if (sz_sno != "")
{
sz_link = sz_sno;
if (conn.State == ConnectionState.Closed)
conn.Open();
SqlCommand cmd = new SqlCommand("Select ranking from sites where id='" +
DgView_Links.SelectedRows[0].Cells[0].Value + "'", conn);
int n_rnk = Convert.ToInt32(cmd.ExecuteScalar());
n_rnk++;
SqlCommand cmd_update = new SqlCommand("Update sites set ranking = " +
n_rnk + " where id='" + DgView_Links.SelectedRows[0].Cells[0].Value + "'", conn);
cmd_update.ExecuteNonQuery();

#region //==== Call user function to store user history ===============
//Call user function to update user record
user();
#endregion
if (conn.State == ConnectionState.Open)
conn.Close();
Engine_Data frm_engin = new Engine_Data();
frm_engin.ShowDialog();
}
else
MessageBox.Show("Please select Valid URL");
}
else
MessageBox.Show("Please select any URL");
}
#endregion
private void SignIn_LinkClicked(object sender, LinkLabelLinkClickedEventArgs e)
{
LoginPage login = new LoginPage();
login.Show();
this.Hide();
}
#region //========== Display user name in screen =======================
private void UserPage_Load(object sender, EventArgs e)
{
label1.Text = LoginPage.str;
}
#endregion
#region //========== storing history of user =================
public void user()
{
SqlConnection con = new SqlConnection("Data Source=.;Initial Catalog=webSearch;Integrated Security=True");

con.Open();
string query = "insert into usrhistry values ('" + linkid + "','" + LoginPage.id + "','" +
sz_link + "','" + desc + "','" + DateTime.Now + "')";
SqlCommand cmd = new SqlCommand(query, con);
cmd.ExecuteNonQuery();
}
#endregion
#region //=========== Goto History ====================
private void btnHistory_Click(object sender, EventArgs e)
{
UserHistory histry = new UserHistory();
histry.ShowDialog();
}
#endregion

}
}

9.2 SCREEN SHOTS

FIG 9.2.1 LOGIN FORM

FIG 9.2.2 INSERTING NEW URL FOR LOCATION BASED

FIG 9.2.3 INSERTING NEW URL FOR CONTENT BASED

FIG 9.2.4.1 ADMIN PAGE

FIG 9.2.4.2 VIEW DATABASE

FIG 9.2.5 DELETING WITHOUT SELECTING A RECORD

FIG 9.2.6 DELETE A RECORD FROM THE DATABASE

FIG 9.2.7 NEW USER REGISTERED

FIG 9.2.8 SEARCH ENGINE PAGE

FIG 9.2.9 HISTORY OF NEW USER

FIG 9.2.10 USER SEARCH RESULT FOR LOCATION BASED SEARCH

FIG 9.2.11 WEBSITE SELECTED BY THE USER FROM THE RESULT

FIG 9.2.12 USER SEARCH RESULT FOR CONTENT BASED SEARCH

FIG 9.2.13 WEBSITE SELECTED BY THE USER FROM THE RESULT

FIG 9.2.14 HISTORY OF THE USER SEARCHED DATA

REFERENCES

CHAPTER 10

REFERENCES

[1] E. Agichtein, E. Brill, and S. Dumais, "Improving web search ranking by incorporating user behavior information," in Proc. of ACM SIGIR Conference, 2006.

[2] C. Burges, T. Shaked, E. Renshaw, A. Lazier, M. Deeds, N. Hamilton, and G. Hullender, "Learning to rank using gradient descent," in Proc. of ICML Conference, 2005.

[3] K. W. Church, W. Gale, P. Hanks, and D. Hindle, "Using statistics in lexical analysis," Lexical Acquisition: Exploiting On-Line Resources to Build a Lexicon, 1991.

[4] Q. Gan, J. Attenberg, A. Markowetz, and T. Suel, "Analysis of geographic queries in a search engine log," in Proc. of the International Workshop on Location and the Web, 2008.

[5] T. Joachims, "Optimizing search engines using clickthrough data," in Proc. of ACM SIGKDD Conference, 2002.

[6] K. W.-T. Leung, W. Ng, and D. L. Lee, "Personalized concept-based clustering of search engine queries," IEEE TKDE, vol. 20, no. 11, 2008.

[7] B. Liu, W. S. Lee, P. S. Yu, and X. Li, "Partially supervised classification of text documents," in Proc. of ICML Conference, 2002.

[8] W. Ng, L. Deng, and D. L. Lee, "Mining user preference using spy voting for search engine personalization," ACM TOIT, vol. 7, no. 4, 2007.

[9] Q. Tan, X. Chai, W. Ng, and D. Lee, "Applying co-training to clickthrough data for search engine adaptation," in Proc. of DASFAA Conference, 2004.

