
1. INTRODUCTION

1.1 Introduction of the Project

Competition in today's age has a new face: agility. Companies need to act and
react faster to a rapidly changing business environment. Smart information
management is the key to a company's growth. The human resource
segment of a company has often found itself constrained, perhaps because
each functional area works in isolation from the others, which causes confusion
and other adverse effects on the growth and development of the organization.
The On Line Recruitment Process & Employee Management System for HR
Group of Company is an enterprise-wide single application consisting of
solutions for all functional areas of an HR enterprise.

1.2 Title of the Project

On Line Recruitment Process & Employee Management System for HR


Group of Company

1.3 Human Resource Management System under ERP

Enterprise Resource Planning software, or ERP, doesn't live up to its acronym.
Forget about planning - it doesn't do that - and forget about resource, a
throwaway term. But remember the enterprise part: that is ERP's true ambition.
It attempts to integrate all departments and functions across a company onto a
single computer system that can serve each department's
particular needs.

That is a tall order: building a single software program that serves the needs
of people in finance as well as it does the people in human resources and in
the warehouse. Each of those departments typically has its own computer
system, each optimized for the particular ways that the department does its
work. But ERP combines them all into a single, integrated software
program that runs off a single database, so that the various departments can
more easily share information and communicate with each other.

That integrated approach can have a tremendous payback if companies install
the software correctly.

The project On Line Recruitment Process & Employee Management System
for HR Group of Company is a system which works on a local area network,
for ease of the HR department in maintaining the details of employees of the
various departments in any organization. It includes the following functions:

• Recruitment/Selection Process.
• Personal Details of the employees.
• Salary Details.
• Training attended by the employees (inside & outside the organization).
• Work Experience of employees.
• Report Generation.

Recruitment Management
This module covers all the activities from the requisition of a post till a
candidate is employed in the company. The major activities performed during
the course of action are: a requisition for a post is given by the HOD to the HR
department. Based on this, openings will be advertised, for which candidates
will submit their resumes; short-listing of candidates based on their skills is
done, and then they are called for interview, which can be taken in many
rounds. After all rounds are over, the HR department gets a clear picture of
the interview status of the selected candidates.

Employee Profile
What follows now is the employee profile, wherein existing employee records
and recruited candidates are maintained. Qualification details and family-related
records are maintained. Training given to him/her is kept on record. Employee
transfers and promotions are also handled in this profile.

Payroll Management
This module handles activities like the creation of salary heads, formulas for any
salary head, and assigning these to complete departments.

Scope of the Project


• Entry of applications received against an advertisement.
• Short-listing of the candidates on the basis of criteria.
• Printing of admit cards / interview letters of the short-listed candidates.
• Facility of storing the personal details of the selected employees.
• This software can run either on a standalone machine or on a network, so a number of users can access the data simultaneously.
• Facility of recording previous job experience, qualification, dependent information and job history in the current organization.
• Computerized leave maintenance of the employees.
• Facility of storing salary details of the employees.
• Facility of storing the training attended by the employees till date within the organization.
• Hard copies of various reports can be generated.
• Security features are implemented. Only the administrator and HR Manager can view/change the performance data of the employees and access top management queries.

1.4 Project’s Objectives

1. The main objective of developing the project is to make the HRP
system simple and easy and to increase the productivity of the managers
in taking decisions, because all the information is available in an
organized form.
2. This software provides a user-friendly interface for the user and
reduces data redundancy. Centralized information is available,
which can be accessed by a number of users.
3. Another objective of software project planning is to provide a
framework that enables the manager to make reasonable estimates
of resources and schedule.

These estimates are made within a limited time at the beginning of a
software project and should be updated regularly as the project
progresses.

There are some other objectives in developing this system. The most
important ones are:

1) Capability
Business activities are influenced by a company's or organization's
ability to process information quickly and efficiently. The HRP
system adds capability in the following ways:
• Improved processing speed
The inherent speed with which computers process information
is one reason why organizations seek the development of
system projects.
• Faster retrieval of information
Locating and retrieving information from storage quickly, and the
ability to conduct complex searches.

2) Control
• Greater accuracy and consistency
Carrying out computing steps, including arithmetic, correctly
and consistently.
• Better security
Safeguarding sensitive and important information in a form that
is accessible only to authorized persons.

3) Communication
• Enhanced communication
Speeding the flow of information and messages between
remote locations as well as within offices. This includes the
transmission of documents within offices.
• Integration of business areas
Coordinating business activities taking place in separate areas
of an organization through the capture and distribution of
information.

4) Cost
• Monitor costs
Tracking the performance of employees and overheads is
essential to determine whether a firm is performing in line with
expectations, within budget.
• Reduce costs
Using computing capability to process at a lower cost than is
possible with other methods, while maintaining accuracy and
performance levels.

1.5 Proposed System

The proposed system is an information system which is being designed to
replace the existing manual information system. The proposed system has
the following key features.

Features
• It reduces paperwork and increases automation.
• Very fast processing.
• Efficient management of information.
• Improved security.
• Data safety through redundancy.
• Quick response to ad hoc queries.
• Integrity of the data is maintained.
• Transparency in the system.

1.6 Introduction of Organisation

e.Soft Technologies Limited is a software development and business process
consulting company providing business process re-engineering consultancy
and services, enterprise solutions, ERP, engineering services, e-business
intelligence, data warehousing, e-commerce solutions and CAD solutions.

e.Soft was incorporated with the prime objective of providing on-site and off-site
professional services specializing in system integration, application
development, CAD and web services.

2. SYSTEM ANALYSIS

2.1 Identification of Need

If the system which is going to be developed is complex in nature, the goals of
the entire system cannot be easily comprehended. Hence the need for a
more rigorous system analysis phase arises.

[Diagram: users and managers generate a request, which yields a problem statement; user interviews and domain experience feed the building of models, producing the object model and the functional model.]

Figure 1 : System Analysis Phase

2.1.1 Problem Analysis


The basic aim of problem analysis is to obtain a clear understanding of the
needs of the clients and the users, what exactly is desired from the
software, and what the constraints on the solution are. Analysis leads to
the actual specification.

2.1.2 Problem Analysis Approaches


There are three basic approaches to problem analysis.
• Informal Approach
• Conceptual modeling-based Approach
• Prototyping Approach

In this project we use the Structured Analysis technique to understand the exact
requirements of the organization. In this technique we divide the main problem
into four sub-problems and solve them separately: the Recruitment module,
the Employee module, the Payroll module and the Master Database module.

2.2 Preliminary Investigation

The preliminary investigation starts as soon as someone, either a user or a
member of a particular department, recognizes a problem or initiates a request
to modify the current computerized system or to computerize the current
manual system. An important outcome of the preliminary investigation is
determining whether the system is feasible or not.

There are a number of employees, and keeping the details of all of them is
very tedious. The existing manual information system is not efficient enough
to answer all the queries related to the employees.

The HR personnel's major responsibility is to induct the right type of
professionals. For this purpose the organization has a separate
recruitment cell, which performs the following activities:

• Calling for applications
• Selecting the applications
• Conducting the examinations
• Compiling the results
• Selecting the candidates for interview
• Preparing details for the interview board.

The HR department could face a number of problems, listed below, in
maintaining the above information in a structured way.

• Preparation of the results takes a lot of time.
• It becomes very difficult for the examination section to find out the details
of those applicants who score the minimum marks for the interview.
• It becomes very difficult to generate the list of candidates meeting the
eligibility criteria mentioned in the advertisement.

The HR department wants a list of the technical people deputed in the various
departments and the training attended by these employees, to further identify
the training needs. The existing system was unable to answer the
above-mentioned query.

Salary details were available only with the accounts department. We
have included all the salary details in this project so that the information is
centralized and can be accessed by a number of users.

The main objective of developing the project is to make the Personnel
Information System simple and easy and to increase the productivity of the
managers in taking decisions, because all the information is available in an
organized form.

This software provides a user-friendly interface for the user and reduces
data redundancy. Centralized information is available, which can be accessed
by a number of users.

2.2.1 Drawbacks in current system

There is no existing computerized system; all the work is performed
manually, and the manual system is not secure enough.

The current system has a lot of problems, which are as follows:

• It is difficult to locate particular information regarding the
organization and its employees, such as the vacancies available in a
particular department.
• All the employees are working in isolation. Everyone maintains
their own database to keep the information related to their
employees, and there is no common database. Hence there is
duplication of data.
• A lot of work has to be done manually.
• No security of data.
• Maintenance of a large number of records is a hectic job.
• Inefficiency in responding to management queries.
• Time-consuming processes.
• Very slow processing.
• Loss of integrity of data.
• Inability to recover from data damage.
• The system is not transparent.

In order to cope with the above problems, this software has been designed and
developed to computerize the working of the Human Resource Department.
The main objective of developing the project is to make the system simple.

2.3 Feasibility Study

Feasibility is the determination of whether or not a project is worth doing. The
process followed in making this determination is called a feasibility study.
This type of study determines whether a project can and should be undertaken.
Once it has been determined that a project is feasible, the analyst can go ahead
and prepare a project specification, which finalizes the project requirements.

All projects are feasible given unlimited resources and infinite time!
Unfortunately, the development of a computer-based system is more likely to be
plagued by a scarcity of resources and difficult delivery dates. It is both
necessary and prudent to evaluate the feasibility of the project at the earliest
possible time. Months or years of effort, money lost and untold professional
embarrassment can be averted if we better understand the project at its study
time.

The feasibility of a project is analyzed within some framework. If the project
is found feasible and desirable, it is included in the management's schedule
so that approval can be taken. In conducting the feasibility study, the analyst
considers seven distinct but inter-related types of feasibility. They are:

• Technical Feasibility
• Economic Feasibility
• Operational Feasibility
• Social Feasibility
• Management Feasibility
• Legal Feasibility
• Time Feasibility

The assessment of the “HRP” has the following facts:

Technical Feasibility

Technical analysis begins with an assessment of the technical viability of the
proposed system. We have to identify what technologies are required to
accomplish the system's function and performance. We also have to study
how these technology issues affect cost.

The existing technology seems sufficient to run the new system. The
data-holding facility also seems sufficient, because we are using the SQL Server
RDBMS and it can handle large volumes of data; hence, if the number of
employees increases in the near future, the system can handle it very easily.

Operational Feasibility

It seems that the management of the company is very much interested in the new
system. The management and the users are normally the same members, so
there is no problem of conflict between the management and the users.

2.3.1 Financial and economical

Among the most important information contained in a feasibility study is the
cost-benefit analysis - an assessment of the economic justification for a computer-based
system project. Cost-benefit analysis calculates approximate costs for
project development and weighs them against the tangible and intangible benefits.

2.4 Project Planning

The project life cycle has three stages:

1. Project Initiation - The development team prepares the project plans and
finalizes the outcome of each phase. In this stage the team also prepares a
comprehensive list of the tasks involved in each phase, and the project assigns
responsibilities to the team members depending on their skills.

2. Project Execution - In this stage, the team develops the product. In
the case of the HR Management system, the development team will develop the
online post-resume application.

This stage consists of the following phases:

1. Requirement Analysis
2. High Level Design
3. Low Level Design
4. Construction
5. Testing
6. Acceptance

3. Project Completion - In this stage, the team has to update the site
regularly. Each new vacancy has to be added by the HR Manager according
to the needs and demands. This stage is very important for the freshness of the
site.

When any updation or upgradation is required for the website, the developers
or the maintenance team bring the website up to date.

There are lots of requirements after the completion of the project. As this
is a dynamic website, lots of changes are required, such as
updating current vacancies, interview dates or the selected candidates list, so
there must always be a way to do this.

2.5 Project Scheduling

2.5.1 PERT Chart

The Program Evaluation and Review Technique (PERT) chart is mainly used for
high-risk projects with various estimation parameters. For each module in a project,
the duration is estimated as follows:
1. Time taken to complete a project or module under normal conditions,
t_normal.
2. Time taken to complete a project or module in the minimum time, t_min.
3. Time taken to complete a project or module in the maximum time,
t_max.
4. Time taken to complete a project, estimated from previous related history,
t_history.

An average of t_normal, t_min, t_max and t_history is taken, depending upon the
project.
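For illustration, the classical PERT weighting counts the normal estimate four times; a minimal sketch in C#, assuming hypothetical durations in days and using t_history only as a sanity check (the report does not fix a specific formula):

using System;

class PertEstimate
{
    static void Main()
    {
        // Hypothetical module durations in days (not taken from this report).
        double tMin = 15, tNormal = 20, tMax = 35;
        double tHistory = 22;   // from previous related projects

        // Classical PERT expected time: a weighted average favouring t_normal.
        double tExpected = (tMin + 4 * tNormal + tMax) / 6.0;

        // Compare against t_history as a sanity check on the estimate.
        Console.WriteLine("Expected: {0:F1} days (history: {1} days)",
                          tExpected, tHistory);
    }
}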
[PERT chart: an activity network running from user requirement and analysis through SRS and design, programming, alpha testing and beta testing, in parallel with hardware purchase and installation, manual writing, user training and the user test, with estimated durations (in days) on each activity.]

Figure 2 : PERT Chart for On Line Recruitment Process & Employee Management System for HR Group of Company

2.5.2 Gantt Chart

A Gantt chart is also known as a time line chart. A Gantt chart can be
developed for the entire project, or a separate chart can be developed for
each function. A tabular form is maintained where rows indicate the
tasks with milestones and columns indicate duration (weeks/months).
The horizontal bars that span across columns indicate the duration of each
task.

[Gantt chart: requirement gathering, design, test cases, coding, testing and build scheduled across June to September.]

Figure 3 : Gantt Chart

2.6 Software requirement specifications (SRS)

The SRS is a document that completely describes what the proposed software
should do, without describing how the software will do it. The basic limitation
is that user needs keep changing as the environment in which the
system is to function changes with time. This leads to requests for
requirement changes even after the requirement phase is over and the SRS is
produced.

The origin of most software systems is the need of a client, who either
wants to automate an existing manual system or desires a new software
system. The developer creates the software system and the end users use
the completed system. There are thus three major parties interested in a new
system: the client, the users and the developer. Somehow the requirements for
the system that will satisfy the needs of the client and the concerns of the
users have to be communicated to the developer. The problem is that the
client usually does not understand software or the software development
process, and the developer often does not understand the client's problem and
application area.

The basic purpose of the SRS is to bridge this communication gap. The SRS is
the medium through which the client's and the users' needs are accurately
specified; indeed, the SRS forms the basis of software development. A good
SRS should satisfy all the parties - something very hard to achieve - and
involves trade-offs and persuasion. An important purpose of developing an SRS
is helping the client understand their own needs.

Advantages of SRS

• An SRS establishes the basis for agreement between the client and the
supplier on what the software product will do.
• An SRS provides a reference for validation of the final product.
• A high-quality SRS is a prerequisite for high-quality software.
• A high-quality SRS reduces the development cost.

Characteristics of an SRS

A good SRS is

1. Correct
2. Complete
3. Unambiguous
4. Verifiable
5. Consistent
6. Ranked for importance and/or stability
7. Modifiable
8. Traceable

Requirement specification document
The requirement for the proposed system is that it should be less vulnerable to
the mistakes made due to entry at two or three levels and in calculations.
Moreover, the control of the administrator would be centralized. This will
support the HR department's working process.

(1) Introduction
(a) Purpose of the software
The purpose of the proposed system is to provide an efficient
information system for the management, the departments and the employees.
The main objective of developing the project is to make the
information part simple and to provide user-friendly access to this
program for all the staff members of the organization, so that they can
locate and reply to the inquiries concerning them.

(b) Scope
The software is prepared for e.Soft Technologies Ltd., but it can be
implemented in any organization with a few minor changes. The
software finds good scope in any organization having an HR
department. Talking to the administrator and the employees who were
dealing with the HR department, we came to know that the manual
system was not up to the mark, due to the cumbersome data entry and
the ample calculations on the basis of which reports are generated.

(2) General description


(a) Product function and overview

Data Entry Section


User section This section is developed using ASP.NET as front-end
and SQL Server as back-end. Only valid user enters to this section by
providing login name and password to the system.
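A minimal sketch of how such a login check against the loginmaster table might look in C# with ADO.NET (the connection string, class name and routing note are assumptions, not taken from the project source):

using System;
using System.Data.SqlClient;

// Sketch: validate a login against the loginmaster table and return the
// user's type so the application can route to the matching section.
public class LoginCheck
{
    public static string GetUserType(string empName, string password)
    {
        using (SqlConnection con = new SqlConnection(
            "Server=.;Database=hronline;Integrated Security=true"))   // assumed
        using (SqlCommand cmd = new SqlCommand(
            "select emp_type from loginmaster where emp_name=@n and Password=@p", con))
        {
            cmd.Parameters.AddWithValue("@n", empName);
            cmd.Parameters.AddWithValue("@p", password);
            con.Open();
            object type = cmd.ExecuteScalar();   // null when no matching user exists
            return type == null ? null : type.ToString();
        }
    }
}

The returned emp_type can then decide which of the sections below the user may enter.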

Administrator section This section can be accessed by providing


administrator password. In this section the administrator can authorize
persons to view dataentry and reports. The administrator can edit the
master table information and payroll information.

HR Manager section This section can be accessed by providing HR


Manager password. In this section the HR Manager can authorize
persons to add and view data base. The HR Manager can edit the
master table information and payroll information.

Data Updation Section

HR Manager section: This section is developed using ASP.NET as the
front-end and SQL Server as the back-end. Only valid users, i.e. the
administrator and the HR Manager, can update employee profile
information.

Administrator section: This section can be accessed by providing the
administrator password. In this section the administrator can authorize
persons to update data. The administrator can edit the master table
information and payroll information.

Data Deletion Section

HR Manager section: Only a valid user, i.e. the HR Manager, can delete
records of employees & candidates, by providing a login name and
password to the system.

Administrator section: This section can be accessed by providing the
administrator password. In this section the administrator can authorize
persons to delete data. The administrator can delete the master table
information and payroll information.

Data Processing Section

This section can be accessed by providing a password. In this section
only the administrator and the HR Manager can process the payroll-related
information.

Report Section
This section is developed using Crystal Report as a report generation
tool and SQL Server as back-end.

(b) User Characteristics

The user at the entry section can be the HR Manager. The HR Manager
can feed the entries and can modify or delete these entries, while the
administrator can navigate through the entire system.

(c) General constraints

The back-end has to be either SQL Server 7.0 or 2000, and the system
should run on the Windows operating system.

(3) Specific Requirements

(a) Input and Output

The candidate CV entry, written exam details and interview details forms serve as
the input, and the candidate list, written exam list and selected candidate list are the
output of this software.

(b) Functional Requirement


• There should be no manual entry in the database table by directly
accessing the tables i.e. there should be security at database server.
• Only valid user can Input, update or delete record and only
administrator and HR Manager can perform any operation on master
database and payroll module

(c) External Interface Requirement
The software must be a user-friendly platform, to reduce the complexity of
operation. The Human Resource Management System should be capable
enough to support a multi-user environment. The software is based on
client-server architecture, so that one or more users can make entries in the
software system as well as view reports at a time.

(d) Performance Constraints


The software is supposed to have lacks of records so it should be capable
to generate reports and to perform cumbersome calculations in seconds.

(e) Acceptance Criteria


Before accepting the system, the developer must demonstrate that the
system works for Human Resource Planning. The developer will have to
show through test cases that all the conditions are satisfied.

Software Requirement Specification

Software Tools
Front-end Tool: Microsoft ASP.NET 2.0 with C# - features are
 User friendly
 Low cost solution
 GUI features
 Better designing aspects
Back-end Tool: Microsoft SQL Server 2000 - features are
 Security
 Portability
 Quality

Platform
Windows platform like: 2000 professional, XP & Vista

Hardware Specification:
 Intel Pentium and Celeron class processor
 Processor Speed – 1.2 GHz or above
 RAM - 512 MB
 HDD - 40 GB
 FDD-1.44MB
 Monitor-14”SVGA
 Printer – Dot Matrix /Inkjet /Laser Printer
 Mouse- Normal
 Keyboard- Normal

2.7 Software Engineering Paradigm applied

Requirement Analysis Phase

During the requirement analysis phase, the development team analyses the
requirements to be fulfilled by the On Line Recruitment Process & Employee
Management System for HR Group of Company and identifies the possible
approaches for meeting these requirements.

To identify the requirements needed for the application, the team studied the
existing HR Management portals.

Finally, the team identified that the On Line Recruitment Process & Employee
Management System for HR Group of Company should:
1. Enable a visitor to register with a resume and application, after
validations have been performed on the data provided by the user.
2. Enable a visitor to perform activities such as sending an application and
posting a resume.
3. Enable a registered user to view his/her different reports.
4. Enable the administrator to view the database and the different reports
generated by the HR Manager.
5. Enable the HR Manager to add to and view the database and to update the
records of employees.

Requirement Engineering Processes

1. Elicitation - determine the operational requirements
(user needs and customer expectations)
2. Analysis - translate operational requirements into technical specifications
3. Documentation - record the operational requirements and technical
specifications
4. Verification - check that the specifications are complete, correct and
consistent with needs and expectations
5. Generate acceptance test scenarios
6. Requirements Management - control changes to the requirements

What Is a Software Process?

A process is a way of doing something. It includes work activities and
procedures for conducting those work activities; the work activities transform
input work products into output work products. The procedures are supported
by methods and tools, as shown below.

[Diagram: a process transforms input work products (requirements, design documents) into output work products (designs, test plans), supported by methods and tools.]

Figure 4 : Software Process

Process Models for Software Development

Iterative Development
[Diagram: operational requirements lead to requirements specifications and architectural design; design partitioning produces incremental builds, with incremental verification and incremental validation at each cycle.]

Figure 5 : Iterative Development Model

Iteration is the process by which the desired result is approached through
repeated cycles. In software engineering, an iterative approach allows revision
of and addition to the work products. Different types of iterative models
support revision of:

1. Requirements
2. Design
3. Code

Analysis of website

Analysis is a great starting point for developing a website. Analysis identifies
the strengths that should be permanently promoted on the website, and shows
how the website can be used to overcome competitive weaknesses, such as
limited resources, for example by establishing a twenty-four-hours-a-day,
seven-days-a-week customer service center. The website also serves as a
reliable central point for taking advantage of opportunities: a website with a
list of potential candidates can quickly notify them.

Design Model

To solve the actual problem, a website developer or a team of developers must
incorporate a development strategy that encompasses the process, methods and
tools. A process model for the website is chosen based on the nature of the
project and the application.

Software Engineering Methodology Used

A modular approach is used for developing the proposed system. A system is
considered modular if it consists of discrete components, so that each
component can be implemented separately and a change to one component has
minimal impact on other components. Every system is a hierarchy of
components, and this system is not an exception. To design such a hierarchy
there are two approaches:

(1) Top down

(2) Bottom up

Both approaches have some merits and demerits. For this system the top-down
approach has been used. It starts by identifying the major components of the
system, decomposing them into their lower-level components and iterating
until the desired level of detail is achieved.

Top-down design methods often result in some form of stepwise refinement.
Starting from an abstract design, in each step the design is refined to a more
concrete level, until we reach a level where no more refinement is needed and
the design can be implemented directly. A top-down approach is suitable only
if the specifications of the system are clearly known and the system is being
developed from scratch.

A bottom-up approach starts with designing the most basic or primitive
components and proceeds to higher-level components that use these lower-level
components.

The user of the existing system defines his general objectives for the software
but does not identify detailed input, processing or output requirements. So
I have chosen the PROTOTYPING approach to develop this software.

Prototyping

The basic idea of prototyping is that instead of freezing the requirements
before any design or coding can proceed, a throwaway prototype is built to
help understand the requirements. This prototype is developed based on the
currently known requirements. The prototype obviously undergoes design,
coding & testing, but each of these phases is not done very formally or
thoroughly. By using this prototype the client can get an actual feel of the
system, because the interactions with the prototype can enable the client to
better understand the requirements of the desired system.

[Diagram: requirement analysis feeds a quick design, code and test cycle for the prototype, which loops back into requirement analysis before the final design and code of the full system.]

THE PROTOTYPE MODEL

Because the system is complicated and large and there is no existing
computerized system, prototyping is an attractive idea. In this situation, letting
the client test the prototype provides valuable inputs, which help in
determining the requirements of the system. It is also an effective method of
demonstrating the feasibility of a certain approach.

2.8 Data model

A data model is an abstract model that describes how data is represented and
accessed.

The term data model has two generally accepted meanings:

1. A data model theory, i.e. a formal description of how data may be
structured and accessed.
2. A data model instance, i.e. applying a data model theory to create a
practical data model instance for some particular application.

Data Model Theory

A data model theory has three main components:

• The structural part: a collection of data structures which are used to
create databases representing the entities or objects modeled by the
database.
• The integrity part: a collection of rules governing the constraints
placed on these data structures to ensure structural integrity.
• The manipulation part: a collection of operators which can be applied
to the data structures, to update and query the data contained in the
database.

For example, in the relational model, the structural part is based on a modified
concept of the mathematical relation; the integrity part is expressed in first-
order logic and the manipulation part is expressed using the relational algebra,
tuple calculus and domain calculus.

Data Model Instance

A data model instance is created by applying a data model theory. This is
typically done to solve some business enterprise requirement.

Business requirements are normally captured by a semantic logical data
model. This is transformed into a physical data model instance, from which a
physical database is generated.

For example, a Data modeler may use a data modeling tool to create an Entity-
relationship model of the Corporate data repository of some business
enterprise. This model is transformed into a relational model, which in turn
generates a relational database.

Entity-relationship model
The entity-relationship model or entity-relationship diagram (ERD) is a data
model or diagram for high-level descriptions of a conceptual data model, and it
provides a graphical notation for representing such data models in the form of
entity-relationship diagrams. Such models are typically used in the first stage
of information-system design; they are used, for example, to describe
information needs and/or the type of information that is to be stored in the
database during requirement analysis. The data modeling technique, however,
can be used to describe any ontology (i.e. an overview and classification of
used terms and their relationships) for a certain universe of discourse (i.e. area
of interest).

In the case of the design of an information system that is based on a database,
the conceptual data model is, at a later stage (usually called logical design),
mapped to a logical data model, such as the relational model; this in turn is
mapped to a physical model during physical design. Note that sometimes both
of these phases are referred to as "physical design".

There are a number of conventions for entity-relationship diagrams (ERDs).
The classical notation mainly relates to conceptual modeling, and there are a
range of notations more typically employed in logical and physical database
design.

[ER diagram: the central Human Resource Management entity is related to Recruitment Details (req_code, dept_code, desg_code, req_date, tot_vacan), the advertised Job_Advertise (job_code, specification, description, last_date), Employee's Details (emp_code, emp_name, desg_code), Interview Details (int_date, job_code, cv_code, can_code), Employee Transfer (emp_code, tran_to, tran_date, tran_type), Leave Details (emp_code, tot_cl, tot_ml) and Pay Details (emp_code, basic pay, deduction).]

Figure 6 : ER Diagram for On Line Recruitment Process & Employee Management System for HR Group of Company
3. SYSTEM DESIGN

3.1 Modularisation details

Designing the system deals with transforming the requirements of the system into a
form implementable using a programming language. We can broadly classify
the various design activities into two parts:

• Preliminary (or high-level) design.
• Detailed design.

In the preliminary design part we design the following items:

1. The different modules required to implement the design.
2. The control relationships among the identified modules.
3. The interfaces among the different modules.

The design of this software is done with high cohesion, i.e. each module
performs a single, well-defined task, and most of the modules are independent
of one another. At the same time, the modules are loosely coupled, i.e. the
interaction between different modules is minimized. Hence the software is
loosely coupled and highly cohesive.

3.1.1 System’s Modules


Since we use the structured approach to develop the system, we divide the
system into modules on the basis of the functions they perform. These modules are
again divided into sub-modules so that the problem can be solved easily and
accurately.

3.1.1.1 Module Division


The application is divided into four modules. They are listed below:

• Master Database Module


• Recruitment Module
• Employee Module
• Payroll Module

The Master Database module contains eight options:


1) Country
2) State
3) City
4) Department
5) Designation
6) Grade
7) Allowance
8) Deduction

Access to the master database is provided only to the HR Manager.

1) Country

This module provides an interface to the HR Manager through which the HR
Manager can Add, Update, Delete the records of the Country Database.

1. Country Code (in label, auto-generated)
2. Country Name (in textbox)

From here the HR Manager can change the details as well as delete the record of
a country. Proper validations and checks are provided for the entered data.
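A minimal sketch of the Add operation with an auto-generated country code, following the same select top 1 ... order by ... desc pattern that the optimization example in the coding chapter uses for city codes (the connection string and class name are assumptions):

using System;
using System.Data.SqlClient;

public class CountryDao
{
    const string Conn = "Server=.;Database=hronline;Integrated Security=true"; // assumed

    public static void AddCountry(string countryName)
    {
        using (SqlConnection con = new SqlConnection(Conn))
        {
            con.Open();

            // Auto-generate the next country_code from the highest existing one
            // (assumes fixed-length codes so varchar ordering works correctly).
            SqlCommand cmd = new SqlCommand(
                "select top 1 country_code from country_master order by country_code desc", con);
            object last = cmd.ExecuteScalar();
            int next = last == null ? 1 : Convert.ToInt32(last) + 1;

            cmd = new SqlCommand(
                "insert into country_master(country_code, country_name) values(@c, @n)", con);
            cmd.Parameters.AddWithValue("@c", next.ToString());
            cmd.Parameters.AddWithValue("@n", countryName);
            cmd.ExecuteNonQuery();
        }
    }
}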

2) State

This module provides an interface to the HR Manager through which the user can
Add, Update, Delete the records of the State Database. The existing country
names are displayed in a combo; the HR Manager can select a country from here
and can enter the state name for that country. He can also see the existing
records.

1. State Code (in label, auto-generated)
2. State Name (in textbox)

From here the HR Manager can change the details as well as delete the record of
a state. Proper validations and checks are provided for the entered data.

3) City

This module provides an interface to the HR Manager through which the user can
Add, Update, Delete the records of the City Database. The existing country
names are displayed in a combo; then the states of the selected country are
displayed. The HR Manager can select a country and then a state, and can
enter the city name for that record. He can also see the existing records.

1. City Code (in label, auto-generated)
2. City Name (in textbox)

From here the HR Manager can change the details as well as delete the record of
a city. Proper validations and checks are provided for the entered data.

4) Department

This module provides an interface to the HR Manager, through which the HR
Manager can Add, Update, Delete the records of the Department Database.

1. Department Code (in label, auto-generated)
2. Department Name (in textbox)

From here the HR Manager can change the details as well as delete the record of
a department. Proper validations and checks are provided for the entered data.
5) Designation

This module provides an interface to the HR Manager through which the user can
Add, Update, Delete the records of the Designation Database. He can also see the
existing records.

1. Designation Code (in label, auto-generated)
2. Designation Name (in textbox)

From here the HR Manager can change the details as well as delete the record of
a designation. Proper validations and checks are provided for the entered data.

6) Grade

This module provides an interface to the HR Manager through which the HR
Manager can Add, Update, Delete the records of the Grade Database.

1. Grade Code (in label, auto-generated)
2. Grade Name (in textbox)

From here the HR Manager can change the details as well as delete the record of
a grade. Proper validations and checks are provided for the entered data.

7) Allowance

This module provides an interface to the HR Manager through which the HR
Manager can Add, Update, Delete the records of the Allowance Database.

1. Allowance Code (in label, auto-generated)
2. Allowance Name (in textbox)
3. Allowance Type (in textbox)
4. Short Name (in textbox)

From here the HR Manager can change the details as well as delete the record of
an allowance. Proper validations and checks are provided for the entered data.

8) Deduction

This module provides an interface to the HR Manager through which the HR
Manager can Add, Update, Delete the records of the Deduction Database.

1. Deduction Code (in label, auto-generated)
2. Deduction Name (in textbox)
3. Deduction Type (in textbox)
4. Short Name (in textbox)

From here the HR Manager can change the details as well as delete the record of
a deduction. Proper validations and checks are provided for the entered data.

The Recruitment Module contains the following parts:

1) Job Opening
1.1) Job Identification
1.2) Job Advertising

2) Candidate Details
2.1) Candidate Entry
2.2) Candidate Shortlist

3) Written Exam Details
3.1) Written Marks
3.2) Short-listed Candidates

4) Interview
4.1) Interview Details
4.2) Selected Candidates

1) Job Opening

This module contains two parts: one for job identification and the other for
advertising the job.

1.1) Job Identification

This part identifies the vacancies for different designations in different
departments. Here the user can enter the details and also update, find and
delete the records. The existing department names are displayed in a
combo, and with respect to the selected department the designation names are
displayed in another combo. The user can update and retrieve the information as
well as delete any selected record.

1.2) Job Advertising

This part is used to fill in the details for advertising the identified job. In
this part the advertising details are filled in according to the job code. The user
can update and retrieve the information as well as delete any selected record.

2) Candidate Details

This module contains two parts: one for the CV entry of candidates and the
other for short-listing the candidates for the written exams.

2.1) Candidate Entry

This part is used to fill in some important details of the candidates and
stores the path where each candidate's CV is kept. The CV path can be stored
in the CV path textbox by clicking the browse button.

2.2) Candidate Shortlist
This part shows the details of candidates according to a job code in a
listview. By checking a particular CV code and clicking 'show detail cv', the
user can see the detailed CV of that particular candidate. Mail can also be
sent to a checked candidate: by checking a row in the listview and clicking
the send mail button, the mail sending form is opened, and that candidate's
mail id is automatically placed in the TO field, as in the sketch below.
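A minimal sketch of this send-mail action using the System.Net.Mail classes available in ASP.NET 2.0 (the SMTP server, sender address and message text are placeholders, not taken from the project):

using System.Net.Mail;

public class CandidateMailer
{
    // Sends a notification to the checked candidate's mail id.
    public static void NotifyCandidate(string candidateEmail)
    {
        MailMessage msg = new MailMessage();
        msg.From = new MailAddress("hr@example.com");            // assumed sender
        msg.To.Add(candidateEmail);                              // goes into the TO field
        msg.Subject = "Short-listed for the written examination";
        msg.Body = "You have been short-listed. Details will follow.";

        SmtpClient client = new SmtpClient("smtp.example.com");  // assumed SMTP host
        client.Send(msg);
    }
}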

3) Written Exam Details

This module also contains two parts: one for storing the written exam details of
every candidate selected for the written exam, and the other for showing the
list of candidates short-listed for interview.

3.1) Written Marks

This part stores all the details of the written exam. The marks of each
candidate are stored here for further processing. These details can be seen
by the user at any time.

3.2) Short-listed Candidates

This part shows the selected candidates' details according to job code. By
selecting a job code in a combo, the details of all candidates who were selected
in the written exam according to the criteria are displayed in a list, as sketched
below. A mail can be sent to notify them about their selection for the interview
round.
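A sketch of fetching the written-exam qualifiers for a job code from the written_marks table; the 60% cut-off is a hypothetical criterion, since the report does not state the actual one:

using System.Data;
using System.Data.SqlClient;

public class ShortlistQuery
{
    public static DataTable GetQualifiers(string jobCode)
    {
        string sql = "select cv_code, total_percent from written_marks " +
                     "where job_code = @j and total_percent >= 60 " +   // assumed cut-off
                     "order by total_percent desc";
        using (SqlConnection con = new SqlConnection(
            "Server=.;Database=hronline;Integrated Security=true"))     // assumed
        using (SqlDataAdapter adp = new SqlDataAdapter(sql, con))
        {
            adp.SelectCommand.Parameters.AddWithValue("@j", jobCode);
            DataTable dt = new DataTable();
            adp.Fill(dt);   // the result is bound to the list shown to the user
            return dt;
        }
    }
}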

4) Interview

This module contains two parts: one for storing the details of the interview and
the other for the list of selected candidates.

4.1) Interview Details

This part is used to store all the information of the interview round. All
information of a particular candidate can be stored here and can also be
retrieved later.

4.2) Selected Candidates

This part shows the list of finally selected candidates. Mail can be sent to
all the candidates about their selection and other information.

The Employee Module contains three parts:

1) Employee Profile
2) Employee Training
3) Employee Transfer

1) Employee Profile
This is used to store all the details of the company's employees. Here an
employee's personal, official, experience, qualification and family details
are stored, and all details of a particular employee can be retrieved, updated
or deleted. Each employee has a unique emp code. If an employee's
personal information is deleted, all other information related to him is
deleted as well, so that no leftover data remains; a sketch of this rule follows.
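A sketch of that rule as one transaction over the employee tables described in the database design (the table list and connection string are assumptions based on that design, not the project's actual code):

using System.Data.SqlClient;

public class EmployeeDao
{
    public static void DeleteEmployee(string empCode)
    {
        using (SqlConnection con = new SqlConnection(
            "Server=.;Database=hronline;Integrated Security=true"))   // assumed
        {
            con.Open();
            SqlTransaction tx = con.BeginTransaction();

            // Remove dependent rows first, then the master row.
            string[] children = { "emp_official", "emp_experience", "emp_qualification",
                                  "emp_family", "emp_training", "emp_transfer",
                                  "emp_salary", "emp_leave" };
            foreach (string table in children)
            {
                SqlCommand cmd = new SqlCommand(
                    "delete from " + table + " where emp_code = @e", con, tx);
                cmd.Parameters.AddWithValue("@e", empCode);
                cmd.ExecuteNonQuery();
            }

            SqlCommand master = new SqlCommand(
                "delete from emp_master where emp_code = @e", con, tx);
            master.Parameters.AddWithValue("@e", empCode);
            master.ExecuteNonQuery();

            tx.Commit();   // all-or-nothing: no leftover employee data remains
        }
    }
}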

2) Employee Training
When a person joins the company as an employee, the company gives him/her
training. All information related to the training of employees can be
maintained in this module.

3) Employee Transfer
Employees' transfer details are stored here. As the user selects an emp code
from a combo, all details such as the employee name, department, designation
and grade are displayed in text boxes. The new department, designation and grade
can be selected through combos. It can also be recorded whether it is a
promotion or a simple transfer. This information can be retrieved, updated
and deleted.

The Payroll Module contains three sections:

1) Allowance Details
2) Allowance Values
3) Salary Structure

1) Allowance Details
This part is used for all the details of the allowances given by the
company to its employees.

2) Allowance Values
This part is used for the information related to allowance values. Here
all the details of the values of the allowances are stored according to
department and designation. Particular information can be seen in the listview
by clicking the add button.

3) Salary Structure
This is used for determining the salary structure of a designation according
to department.

User Control
ASP.NET provides a facility to make user controls that can be used anywhere
in the project. It is a great facility provided by ASP.NET. In my project, I
have made a textbox in which only an email address in a valid
format can be filled. It is used in forms where such a requirement arises, such as
where the Candidate Email Address is filled.
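A minimal sketch of the code-behind for such an email-only textbox user control (the control and member names are illustrative, not the project's actual ones):

using System.Text.RegularExpressions;
using System.Web.UI;
using System.Web.UI.WebControls;

// Code-behind for a hypothetical EmailTextBox.ascx user control.
public partial class EmailTextBox : UserControl
{
    protected TextBox txtEmail;   // declared in the .ascx markup

    public string EmailAddress
    {
        get { return txtEmail.Text.Trim(); }
    }

    // True only when the entered value looks like a valid email address.
    public bool IsValidEmail()
    {
        return Regex.IsMatch(EmailAddress, @"^[\w\.\-]+@[\w\-]+(\.[\w\-]+)+$");
    }
}

Alternatively, ASP.NET's built-in RegularExpressionValidator can enforce the same format declaratively in the .ascx markup.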

3.2 Data integrity and constraints

Pictorial representations of systems have long been acknowledged as being
more effective than narrative. They are easier to grasp when being explained or
investigated; it is easier to find a particular aspect of interest; and pictorial
representations are less ambiguous than narrative.

The DFD is a simple graphical notation that can be used to represent a system
in terms of the input data to the system, the various processing carried out on
these data, and the output data generated by the system.

The 0-level DFD of the system is as follows:

[0-level DFD: the Human Resource Management System exchanges job details and CVs with CANDIDATE, job requests and selected candidate lists with DEPARTMENT, and candidate lists, employee profiles, salary structures and reports with HR MANAGER.]

Figure 7 : Data Flow Diagram

[DFD: the recruitment process (1.1) receives job requests and vacancy details from DEPARTMENT and applications with CV entries from CANDIDATE; it maintains candidate details, written exam marks, interview details and job advertisement data stores, informs candidates about their selection, and reports the selected candidates to DEPARTMENT and HR MANAGER.]

Figure 8 : Recruitment Module

[DFD: the employee registration process (1.2) takes employees' personal and official information and maintains the emp master, emp official, emp transfer, emp training, emp qual, emp experience and emp family data stores, serving employee profile, transfer, training, qualification, experience and family details to DEPARTMENT and HR MANAGER.]

Figure 9 : Employee Module

[DFD: the payroll process (1.3) maintains the allowance master data store and exchanges allowance details, salary formulas and salary structure details with HR MANAGER.]

Figure 10 : Payroll Module

3.3 Database design

Detail design is the most creative & challenging phase in the development life
cycle of the project. In the detail design of the system we design the tables of
the database, the schema of the tables, the relationships between the tables and
the file organization of the application.

3.3.1 Design of Database Tables


The data to be used in the system are stored in various tables. The number of
tables used and their structure are decided upon keeping in mind the logical
relations in the available data. The database design specifies:

• The various tables to be used
• The data to store in each table
• The format of the fields and their types

We are using a SQL Server database. To create the database we first start SQL
Server.

[Screenshot: the SQL Server starting window.]

Figure 11 : SQL Server

After starting SQL Server, I first created the database 'hronline' with
the user dbo. The initial size of the database is 3 MB.

DATABASE TABLES

• Country_master
This table is designed to store the information about country.

Structure for database: country_master

Field Name Type Constraints Description


country_code Varchar(10) Primary Key Auto generated
country_name Varchar(30) Not null
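As an illustration, the specification above maps to T-SQL DDL like the following, shown executed through ADO.NET in the same style as the project's other database code (a sketch, not the actual creation script):

using System.Data.SqlClient;

public class SchemaSetup
{
    public static void CreateCountryMaster(SqlConnection con)   // an open connection
    {
        SqlCommand cmd = new SqlCommand(
            "create table country_master (" +
            " country_code varchar(10) primary key," +   // auto-generated by the app
            " country_name varchar(30) not null)", con);
        cmd.ExecuteNonQuery();
    }
}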

• State_master
This table is designed to store the information about states.

Structure for database: state_master

Field Name Type Constraints Description


state_code Varchar(10) Primary Key Auto generated
state_name Varchar(30) Not null

• city_master
This table is designed to store the information about cities

Structure for database: city _master

Field Name Type Constraints Description


City_code Varchar(10) Primary Key Auto generated
city_name Varchar(30) Not null

• dept_master
This table is designed to store the information about departments of the
company

Structure for database: dept_master

Field Name Type Constraints Description


dept_code Varchar(10) Primary Key Auto generated
dept_name Varchar(30) Not null

• desg _master
This table is designed to store the information about the designations in
different department.

Structure for database: desg_master

Field Name Type Constraints Description


desg_code Varchar(10) Primary Key Auto generated
desg_name Varchar(30) Not null

• grade
This table is designed to store the information about the grades in different
department.

Structure for database: grade

Field Name Type Constraints Description


grade_code Varchar(10) Primary Key Auto generated
grade_name Varchar(30) Not null

• loginmaster
This table is designed to store the login information of the system's users.

Structure for database: loginmaster

Field Name Type Constraints Description


emp_code Varchar(10) Primary Key Auto generated
emp_name Varchar(30) Not null
emp_type Varchar(30) Not null
Password Varchar(10) Not null

• recr_master(Recruitment related Information)

This table is designed to store the information about recruitment of the


candidates for the posts that have to be filled by different departments.

Structure for database: recr_master

Field Name Type Constraints Description


Req_code Varchar(10) Primary Key Auto generated
dept_code Varchar(10) References
dept_master
desg_code Varchar(10) References
desg_master
total_vacancies Numeric(5)
Req_date Datetime(8)
remarks Varchar(50)
Priority Varchar(10)

• job_advertise

This table is designed to store the information regarding the advertisement


for vacancies

Structure for database: job_advertise

Field Name Type Constraints Description


Job_code Varchar(10) References
recr_master
specification Varchar(50)
description Varchar(50)
last_date Datetime(8)

• cv_entry
This table is designed to store some primary details about the candidates
who send their cv

Structure for database: cv_entry

Field Name Type Constraints Description


cv_code Varchar(10) Primary Key Auto
generated
job_code Varchar(10) References
recr_master
cand_name Varchar(50)
Address Varchar(50)
Fathersname Varchar(50)
date_of_bitrh Datetime(8)
contact Varchar(20)
email_id Varchar(30)
qualification Varchar(50)
Year_quali Varchar(4)
Prof_quali Varchar(50)
Year_prof Varchar(4)
skills Varchar(50)
cv_path Varchar(50) Not Null
selected Varchar(10)
sent_mail Varchar(10)
filename Varchar(30)
filepath Varchar(30)

• written_marks
This table is designed to store written exam details conducted for the
selection of candidates

Structure for database: written_marks


Field Name Type Constraints Description
cv_code Varchar(10) References
cv_entry
job_code Varchar(10) References
recr_master
written_date Datetime(8)
technical_paper Numeric(5)
general_marks Numeric(5)
gmax_marks Numeric(5)
paper1_marks Numeric(5)
paper2_marks Numeric(5)
grand_toatal Numeric(5)
totalmax_marks Numeric(5)
g_percent Numeric(5)
total_percent Numeric(5)
selected Varchar(10)
sent_mail Varchar(10)

• interview_detail
This table is designed to store interview details held for candidates
selected in written exams

Structure for database: interview_detail


Field Name Type Constraints Description
job_code Varchar(10) References
recr_master
cv_code Varchar(10) References
cv_entry
interview_date Datetime(8)
no_of_interviewers Numeric(5)
Int_1 Numeric(5)
Int_2 Numeric(5)
Int_3 Numeric(5)
Int_4 Numeric(5)
comm_skills Varchar(10)
tech_skills Varchar(10)
• emp_master
This table is designed to store personal details about the employees of the
company

Structure for database: emp_master

Field Name Type Constraints Description


emp_code Varchar(10) Primary Key Auto generated
emp_name Varchar(50)
address Varchar(50)
country_code Varchar(10) References
country_master
state_code Varchar(10) References
state_master
city_code Varchar(10) References
city_master
nationality Varchar(20)
contact Varchar(50)
email Varchar(50)
dob Datetime(8)
mar_status Varchar(10)
gender Varchar(10)
pp_no Varchar(10)
dl_no Varchar(10)
Pin_code Varchar(10)

• emp_official
This table is designed to store official details about the employees of the
company

Structure for database: emp_official

Field Name Type Constraints Description


emp_code Varchar(10) References
emp_master
dept_code Varchar(10) References
dept_master
desg_code Varchar(10) References
desg_master
Joining_date Datetime(8)
Grade_code Varchar(10) References
grade

• emp_experience
This table is designed to store previous experience details of the
employees of the company

Structure for database: emp_ experience

Field Name Type Constraints Description


s_no Numeric(5) Primary Key
emp_code Varchar(10) References
emp_master,
Primary Key
org_name Varchar(50)
work_period Varchar(20)
salarydrawan Varchar(20)
desg_name Varchar(30)
description Varchar(50)

• emp_qualification
This table is designed to store qualification details of the employees of the
company

Structure for database: emp_qualification

Field Name Type Constraints Description


emp_code Varchar(10) References
emp_master
institute Varchar(50)
Highest_qual Varchar(30)
percentage numeric(9)
year Numeric(5)

• emp_ family
This table is designed to store family details of the employees of the
company

Structure for database: emp_family

Field Name Type Constraints Description


Emp_code Varchar(10) References
emp_master,
Primary Key
relative_name Varchar(50) Primary Key
relationship Varchar(20)
• emp_training
This table is designed to store training details that are given to the
employees of the company
Structure for database: emp_ training

Field Name Type Constraints Description


emp_code Varchar(10) References emp_master,
Primary Key
tr_code Varchar(10) Primary Key
tr_description Varchar(50)
training_type Varchar(20)
tr_period Varchar(10)
comments Varchar(50)

• emp_transfer
This table is designed to store transfer details of the employees of the
company
Structure for database: emp_transfer

Field Name Type Constraints Description


emp_code Varchar(10) References emp_master
transfer_no Varchar(10) Primary Key
deptcode_from Varchar(10)
deptcode_to Varchar(10)
desg_from Varchar(10)
pre_grade Varchar(10)
new_grade Varchar(10)
transfer_date Datetime(8)
transfer_type Varchar(10)

• allowance_master
This table is designed to store primary information of the allowances given
by the company in salary.

Structure for database: allowance_master

Field Name Type Constraints Description


allowance_code Varchar(10) Primary Key
allowance_name Varchar(50)
allowance_type Varchar(15)
Short_name Varchar(10)
• deduction_master
This table is designed to store primary information of the deductions
deducted by the company from salary.

Structure for database: deduction_master

Field Name Type Constraints Description


deduction_code Varchar(10) Primary Key
deduction _name Varchar(50)
deduction _type Varchar(15)
Short_name Varchar(10)

• emp_salary
This table is designed to store details of the salary given to the employees.

Structure for database: emp_salary

Field Name Type Constraints Description


emp_code Varchar(10) References emp_master
allow1_code Varchar(10) References allowance_master
allow1_amt Numeric(9)
allow2_code Varchar(10) References allowance_master
allow2_amt Numeric(9)
allow3_code Varchar(10) References allowance_master
allow3_amt Numeric(9)
allow4_code Varchar(10) References allowance_master
allow4_amt Numeric(9)
total_all Numeric(9)
dedu1_code Varchar(10) References deduction_master
dedu 1_amt Numeric(9)
dedu 2_code Varchar(10) References deduction_master
dedu 2_amt Numeric(9)
dedu 3_code Varchar(10) References deduction_master
dedu 3_amt Numeric(9)
dedu 4_code Varchar(10) References deduction_master
dedu 4_amt Numeric(9)
Total_ded Numeric(9)
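The stored total_all and total_ded sums can be combined with the basic pay (shown in the ER diagram's pay details) into a net figure; a minimal sketch, with the formula itself being an assumption rather than something the report states:

public class Payroll
{
    // Net pay = basic pay + total allowances - total deductions (assumed formula).
    public static decimal NetSalary(decimal basicPay, decimal totalAll, decimal totalDed)
    {
        return basicPay + totalAll - totalDed;
    }
}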

• emp_leave
This table is designed to store primary information of the leaves taken by
the employee.

Structure for database: emp_leave

Field Name Type Constraints Description


emp_code Varchar(10) References
loginmaster
cl Numeric(5)
Rl Numeric(5)
medical Numeric(5)
erl Numeric(5)
special Numeric(5)

3.4 User Interface Design

The user interface portion of a software product is responsible for all
interactions with the user. Almost every software product has a user
interface. The user interacts with a software product through its user
interface, which is the primary component of any software product that is
directly relevant to the users.
User interface of our project has several characteristics. They are as follows:
• It is simple to learn.
• The time and effort required to initiate and execute different
commands is minimum.
• Once users learn how to use the interface, they can quickly recall
how to use the software.
• It is attractive to use.
• The commands supported by interface are consistent.

4. CODING

4.1 Complete Project Coding

4.2 Standardization of the coding

The process of optimization starts from the designing stage itself and
continues till the deployment and distribution stage.

Optimizing speed
In order to optimize the speed of the application the following techniques are
used:
• Use of appropriate data type
• Assigning property values to variables
• Using early binding instead of late binding

Use of appropriate data type


The use of appropriate data types optimizes the execution speed of the
application. Too many implicit data type conversions slow down the
execution. The use of the variant data type has been avoided in the
application as it causes the application to run slowly.
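
A small sketch of this point, contrasting a loosely typed counter (which
forces a conversion on every use) with a correctly typed one; the loop
bodies are illustrative only.

using System;

class DataTypeDemo
{
    static void Main()
    {
        // Loosely typed: the value is boxed and unboxed on every iteration.
        object boxed = 0;
        for (int i = 0; i < 1000000; i++)
            boxed = (int)boxed + 1;

        // Correctly typed: no conversions or boxing are needed.
        int counter = 0;
        for (int i = 0; i < 1000000; i++)
            counter++;

        Console.WriteLine(boxed + " " + counter);
    }
}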

Assigning property values to variables


Accessing a value from a variable is 10 to 20 times faster than accessing it
from a property because accessing a value from a property makes use of
calling an object. This is not true for a variable as there is no overhead of
making a call to the object. This is used to optimize the execution speed of the
application.

An unoptimized code would have been


k = Convert.ToString(Convert.ToInt32(k) + 1);

An optimized code would have been:

c1.cmd.CommandText = "select top 1 city_code from city_master order by city_code desc";
c1.adp.Fill(c1.ds, "code");
DataRow dr = c1.ds.Tables["code"].Rows[0];   // read the fetched row from the filled DataSet
string b = dr["city_code"].ToString();
k = Convert.ToString(Convert.ToInt32(b) + 1);

Using Early Binding instead of Late Binding

A client interacts with a component using its properties and methods. In order
to access the properties and methods of a component, the client needs to be
bound to the component. The process of associating a client with a component
is binding. When you implement early binding between a client’s call and a
component method, the method called is determined at compile time, i.e.
the call is associated with the appropriate method during the process of
compilation. Early binding provides the following advantages:
• Performance speed
• Syntax checking at compile time
• Display of objects in the Object Browser window
• Provision of help in the Object Browser
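
A short C# illustration of the difference: the first call is early-bound
and resolved at compile time, while the second is late-bound through
reflection and resolved by name only at run time.

using System;
using System.Reflection;

class BindingDemo
{
    static void Main()
    {
        string s = "hr";

        // Early binding: ToUpper is resolved at compile time,
        // so it is fast and syntax-checked by the compiler.
        Console.WriteLine(s.ToUpper());

        // Late binding: the method is looked up by name at run time,
        // which is slower and loses compile-time checking.
        MethodInfo m = typeof(string).GetMethod("ToUpper", Type.EmptyTypes);
        Console.WriteLine((string)m.Invoke(s, null));
    }
}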

Optimizing the Display Speed


In order to display a form frequently, the Hide and Show methods are used
instead of the Unload and Load events. Loading and unloading a form
involves memory overhead and is therefore slower.

Optimizing the Memory


The application has been optimized to occupy the least amount of memory
and still give good performance. In order to optimize memory, the data
held by a string is reclaimed by setting the string to "" (an empty
string).

Compiling to Native Code


Native code compilation offers more options for performance tuning than
are available with P-code compilation. We can use one of the following
options for compiling the application:
1) Optimize for fast code
2) Optimize for slow code
3) Favor Pentium Pro(tm)

Optimize for fast code


This is the best option when the application has been optimized on speed and
there is large storage space on the disk. The compiler compiles the code for
faster execution. It examines the program structure and changes some of it,
resulting in an optimized code but an enlarged executable file. This is the
default compile option in Visual Studio 2005.

Optimize for slow code


This is the best option when one is concentrating on the hard disk space and
not the speed. The compiler creates the smallest possible compiled code,
occupying less disk space but probably slow in execution.

Advanced Optimization Options


The following Advanced optimization techniques were used:
• Assume No Aliasing
As aliasing provides a name that refers to a memory location that is
already referred to by a different name, selecting this option allows the
compiler to apply optimizations that it could not otherwise apply.

• Remove Array Bounds Checks

Selecting this option can also optimize application speed. The Visual
Studio compiler by default checks array indexes against their dimensions
and reports an error if an array index is out of bounds. As the arrays used
here are sure not to go out of bounds, choosing this option will actually
optimize speed and thus give faster code.

4.3 Validation checks

Field Presence Check


To ensure that all necessary fields are present, this checks that an entry
has been made for the field. For example, the emp_name field in the
employee master table cannot be left blank; it is a required field.

Field Length Check


To ensure that an item of data has the correct number of characters, this
determines the minimum and maximum length of the field. It can make sure
the minimum length has been met, and it can also check the maximum length.

Range Check
To ensure that data value is within a pre-determined range.
This checks a value to be within a certain range of values. For example, the
month of a year must be between 1 and 12. Numbers less than 1 or greater
than 12 would be rejected.

Format check
To ensure the individual characters that make up the data are valid - e.g. no
letters in numerical data.
This checks that data is of the right format, that it is made up of the correct
combination of alphabetic and numeric characters. A Phone number must be
in the form of XXXXXXXXXX. The characters must be numbers. The total
length is ten characters. Any other format is rejected.
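
The checks above can be combined as in the following sketch; the method and
field names are hypothetical, chosen only to mirror the examples in this
section.

using System;
using System.Text.RegularExpressions;

class ValidationDemo
{
    static bool IsValid(string empName, int month, string phone)
    {
        // Presence check: a required field must not be blank.
        if (string.IsNullOrEmpty(empName)) return false;

        // Length check: emp_name must fit its Varchar(50) column.
        if (empName.Length > 50) return false;

        // Range check: the month must lie between 1 and 12.
        if (month < 1 || month > 12) return false;

        // Format check: the phone number must be exactly ten digits.
        if (!Regex.IsMatch(phone, @"^\d{10}$")) return false;

        return true;
    }

    static void Main()
    {
        Console.WriteLine(IsValid("Ravi Kumar", 7, "9876543210")); // True
        Console.WriteLine(IsValid("", 13, "98-76"));               // False
    }
}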

Batch header check


This is concerned with Batch processing - see later.
The total number of records in the batch should be calculated by the computer
and compared with the figure on the batch header. The control totals and hash
totals are also calculated and compared.

Check Digit
Allows a number to be self-checking.
This is used to check the validity of code numbers, for example paper1_marks
in written_marks table. These numbers are long and prone to data entry
errors. It is crucial that such numbers are entered correctly so that the right
record in the file or database is identified.
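
The project text does not specify which check digit scheme is used, so the
sketch below shows one simple weighted modulus-10 scheme of the kind
commonly applied to such code numbers.

using System;

class CheckDigitDemo
{
    // Weight each digit by its position, sum the products,
    // and take the remainder modulo 10 as the check digit.
    static int Compute(string code)
    {
        int sum = 0;
        for (int i = 0; i < code.Length; i++)
            sum += (code[i] - '0') * (i + 1);
        return sum % 10;
    }

    static void Main()
    {
        string code = "48372";
        int digit = Compute(code);
        Console.WriteLine(code + digit);           // code with check digit appended
        Console.WriteLine(Compute(code) == digit); // re-validation: True
    }
}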

5. TESTING

5.1 Testing techniques and Testing strategies

Software testing is a critical element of software quality assurance and
represents the ultimate review of specification, design, and coding. The
purpose of product testing is to verify and validate the various work
products, viz. units, integrated units, and the final product, to ensure
that they meet their requirements.

5.1.1 Testing Objectives

Basically, testing is done for the following purposes:


1. Testing is a process of executing a program with the intent of finding
an error.
2. A good test case is one that has a high probability of finding an as yet
undiscovered error.
3. A successful test case is one that uncovers an as yet undiscovered
error.
Our objective is to design test cases that systematically uncover different
classes of errors and do so with a minimum amount of time and effort. This
process has two parts:
• Planning: This involves writing and reviewing unit, integration,
functional, validation and acceptance test plans.
• Execution: This involves executing these test plans, measuring,
collecting data and verifying if it meets the quality criteria. Data
collected is used to make appropriate changes in the plans related to
development and testing.
The quality of a product or item can be achieved by ensuring that the product
meets the requirements by planning and conducting the following tests at
various stages.

5.1.2 Types of Testing Software

The main types of software testing are:

Component Testing
Starting from the bottom the first test level is “Component Testing”,
sometimes called Unit Testing. It involves checking that each feature specified
in the “Component Design” has been implemented in the component. In
theory an independent tester should do this, but in practice the developer
usually does it, as they are the only people who understand how a component
works. The problem with a component is that it performs only a small part of
the functionality of a system, and it relies on co-operating with other parts of
the system, which may not have been built yet. To overcome this, the
developer either builds or uses special software to trick the component
into believing it is working in a fully functional system.

Interface Testing
As the components are constructed and tested they are then linked together to
check if they work with each other. It is a fact that two components that
have passed all their tests may, when connected to each other, produce one
new component full of faults. These tests can be done by specialists, or by
the developers.
Interface testing is not focused on what the components are doing but on
how they communicate with each other, as specified in the “System Design”.
The “System Design” defines the relationships between components, and this
involves stating:

1) What a component can expect from another component in terms of services.
2) How these services will be asked for.
3) How they will be given.
4) How to handle non-standard conditions, i.e. errors.

Tests are constructed to deal with each of these.

The tests are organized to check all the interfaces, until all the components
have been built and interfaced to each other producing the whole system.

System Testing
Once the entire system has been built then it has to be tested against the
“System Specification” to check if it delivers the features required. It is still
developer focused, although specialist developers known as system testers are
normally employed to do it.
In essence System testing is not about checking the individual parts of the
design, but about checking the system as a whole. In effect it is one giant
component.
System testing can involve a number of specialist types of test to see if all the
functional and non-functional requirements have been met. In addition to
functional requirements these may include the following types of testing for
the non-functional requirements:

1) Performance: Are the performance criteria met?
2) Volume: Can large volumes of information be handled?
3) Stress: Can peak volumes of information be handled?
4) Documentation: Is the documentation usable for the system?
5) Robustness: Does the system remain stable under adverse circumstances?

There are many others, the needs for which are dictated by how the system is
supposed to perform.

Acceptance Testing
Acceptance testing checks the system against the “Requirements”. It is similar
to system testing in that the whole system is checked but the important
difference is the change in focus:

System testing checks that the system that was specified has been delivered.
Acceptance testing checks that the system delivers what was requested. The
customer and not the developer should always do acceptance testing. The
customer knows what is required from the system to achieve value in the
business and is the only person qualified to make that judgment. The forms of
tests may follow those in system testing, but at all times they are informed by
the business needs.

Release Testing
Even if a system meets all its requirements, there is still a case to be answered
that it will benefit the business. Release testing is about seeing if the new or
changed system will work in the existing business environment. Mainly this
means the technical environment, and checks concerns such as:

1) Does it affect any other systems running on the hardware?
2) Is it compatible with other systems?
3) Does it have acceptable performance under load?

These tests are usually run by the computer operations team in a business.
It would appear obvious that the operations team should be involved right
from the start of a project to give their opinion of the impact a new
system may have.

Test Case Design


Test case design focuses on a set of techniques for the creation of test cases
that meet overall testing objectives. In test case design phase, the engineer
creates a series of test cases that are intended to “demolish” the software that
has been built.
Any software product can be tested in one of two ways:

1) Knowing the specific function that a product has been designed to perform,
tests can be conducted that demonstrate each function is fully operational,
at the same time searching for errors in each function. This approach is
known as Black Box Testing.

2) Knowing the internal workings of a product, tests can be conducted to


ensure that internal operation performs according to specifications and all
internal components have been adequately exercised. This approach is
known as White Box Testing.

Black box testing is designed to uncover errors. It is used to demonstrate
that software functions are operational; that input is properly accepted
and output is correctly produced; and that the integrity of external
information is maintained. A black box test examines some fundamental
aspects of a system with little regard for the internal logical structure
of the software.

White box testing of software is predicated on close examination of
procedural details. Logical paths through the software are tested by
providing test cases that exercise specific sets of conditions and/or
loops. The “state of the program” may be examined at various points to
determine whether the expected or asserted status corresponds to the actual
status.

5.1.3 Testing the On Line Recruitment Process & Employee Management
System for HR Group of Company

The testing phase is a very important phase in software development, so it
was fully kept in mind while developing this software. In the case of this
software, testing has been done in the following areas and manner:

5.1.3.1) Functional Testing


According to the needs of the software, the following testing plans
have been planned on some amount of test data. Hypothetical data is
used to test the system before implementation. Some temporary user
ids are created to check the validity and authenticity of the users.
Various constraints are checked for their working. A demo case will
be taken with dummy data for new users.

5.1.3.2) Security Testing


• User id and password are checked and verified for secure login and
access.
• It will be demonstrated that two different login sessions have different
permissions on the menu items. In case a user forgets his password,
the administrator or HR Manager has rights to change the password
or allocate a new password.

5.1.3.3) Performance Testing

Based on the field conditions these testing for fine tuning can be
carried out at a later date.

• Peak load testing


• Storage testing
• Performance time testing
• Recovery testing

The testing team will take over the project after the initial unit testing,
which would mark the completion of the project.

5.2 Debugging and Code improvement

Debugging
The purpose of debugging is to locate and fix the offending code responsible
for a symptom violating a known specification. Debugging typically happens
during three activities in software development, and the level of granularity of
the analysis required for locating the defect differs in these three. The first is

during the coding process, when the programmer translates the design into an
executable code. During this process the errors made by the programmer in
writing the code can lead to defects that need to be quickly detected and fixed
before the code goes to the next stages of development. Most often, the
developer also performs unit testing to expose any defects at the module or
component level. The second place for debugging is during the later stages of
testing, involving multiple components or a complete system, when
unexpected behavior such as wrong return codes or abnormal program
termination (“abends”) may be found. A certain amount of debugging of the
test execution is necessary to conclude that the program under test is the cause
of the unexpected behavior and not the result of a bad test case due to
incorrect specification, inappropriate data, or changes in functional
specification between different versions of the system. Once the defect is
confirmed, debugging of the program follows and the misbehaving component
and the required fix are determined. The third place for debugging is in
production or deployment, when the software under test faces real operational
conditions. Some undesirable aspects of software behavior, such as inadequate
performance under a severe workload or unsatisfactory recovery from a
failure, get exposed at this stage and the offending code needs to be found and
fixed before large-scale deployment. This process may also be called
“problem determination,” due to the enlarged scope of the analysis required
before the defect can be localized.

Code Improvement
The process of optimization starts from the designing stage itself and
continues till the deployment and distribution stage.

Optimizing speed
In order to optimize the speed of the application the following techniques are
used:
 Use of appropriate data type
 Assigning property values to variables
 Using Early binding instead of late binding

Use of appropriate data type


The use of appropriate data type optimizes the execution speed of the
application. Too many implicit data type conversions slow down the
execution. The use of variant data type has been avoided in the application as
it causes application to run slowly.

Assigning property values to variables


Accessing a value from a variable is 10 to 20 times faster than accessing it
from a property because accessing a value from a property makes use of
calling an object. This is not true for a variable as there is no overhead of
making a call to the object. This is used to optimize the execution speed of the
application.

An unoptimized code would have been

k = Convert.ToString(Convert.ToInt32(k) + 1);

The optimized code is as follows:


c1.cmd.CommandText = "select top 1 city_code from city_master order by city_code desc";
c1.adp.Fill(c1.ds, "code");
DataRow dr = c1.ds.Tables["code"].Rows[0];   // read the fetched row from the filled DataSet
string b = dr["city_code"].ToString();
k = Convert.ToString(Convert.ToInt32(b) + 1);

Using Early Binding instead of Late Binding


A client interacts with a component using its properties and methods. In order
to access the properties and methods of a component, the client needs to be
bound to the component. The process of associating a client with a component
is binding. When you implement early binding between a client’s call and a
component method, the method called is determined at compile time i.e. the
call is associated with the appropriate method during the process of
compilation. Early binding provides the following advantages:
• Performance speed
• Syntax checking at compile time
• Display of objects in the Object Browser window
• Provision of help in the Object Browser

Optimizing the Display Speed


In order to display a form frequently, the Hide and Show methods are used
instead of the Unload and Load events. Loading and unloading a form
involves memory overhead and is therefore slower.

Optimizing the Memory


The application has been optimized to occupy the least amount of memory
and still give good performance. In order to optimize memory, the data
held by a string is reclaimed by setting the string to "" (an empty
string).

Compiling to Native Code


Native code compilation offers more options for performance tuning than
are available with P-code compilation. We can use one of the following
options for compiling the application:
1) Optimize for fast code
2) Optimize for slow code
3) Favor Pentium Pro(tm)

Optimize for fast code


This is the best option when the application has been optimized on speed and
there is large storage space on the disk. The compiler compiles the code for
faster execution. It examines the program structure and changes some of it,
resulting in an optimized code but an enlarged executable file. This is the
default compile option in Visual Studio 2005.

Optimize for slow code
This is the best option when one is concentrating on the hard disk space and
not the speed. The compiler creates the smallest possible compiled code,
occupying less disk space but probably slow in execution.

Advanced Optimization Options


The following Advanced optimization techniques were used:
• Assume No Aliasing
As aliasing provides a name that refers to a memory location that is
already referred to by a different name, selecting this option allows the
compiler to apply optimizations that it could not otherwise apply.

• Remove Array Bounds Checks


Selecting this option can also optimize application speed. The Visual
Studio compiler by default checks array indexes against their
dimensions and reports an error if an array index is out of bounds. As
the arrays used here are sure not to go out of bounds, choosing this
option will actually optimize speed and thus give faster code.

6. System Security measures

Prompting the user for a userid and password in our application is
a potential security threat, so credential information transferred from the
browser to the server is encrypted.

Cookies are an easy and useful way to keep user-specific information


available. However, because cookies are sent to the browser's computer,
they are vulnerable to spoofing or other malicious use. So we follow these
guidelines:

• Do not store any critical information in cookies. For example, do not
store a user's password in a cookie, even temporarily.
• Avoid permanent cookies if possible.
• Consider encrypting information in cookies.
• Set expiration dates on cookies to the shortest practical time.
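
A minimal ASP.NET sketch of these guidelines, assuming a Web Forms page;
the cookie name and value are illustrative and hold no sensitive data.

using System;
using System.Web;

public partial class CookieDemo : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        HttpCookie prefs = new HttpCookie("uiPrefs");    // hypothetical cookie name
        prefs.Value = "theme=blue";                      // no passwords or user ids
        prefs.Expires = DateTime.Now.AddMinutes(30);     // short-lived, not permanent
        Response.Cookies.Add(prefs);
    }
}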

6.1 Database security

A Database Security Strategy

Much attention has been focused on network attacks by crackers, and how to
stop these. But the vulnerability of data inside the database is somewhat
overlooked. Databases are far too critical to be left unsecured or incorrectly
secured.

Most companies solely implement perimeter-based security solutions, even


though the greatest threats are from internal sources. And information is more
often the target of the attack than network resources.

The best security practices protect sensitive data as it's transferred over the
network (including internal networks) and when it's at rest. One option for
accomplishing this protection is to selectively parse data after the secure
communication is terminated and encrypt sensitive data elements at the
SSL/Web layer. Doing so allows enterprises to choose at a very granular level
(usernames, passwords, and so on.) the sensitive data to secure throughout the
enterprise. Application-layer encryption and mature database-layer encryption
solutions allow enterprises to selectively encrypt granular data into a format
that can easily be passed between applications and databases without changing
the data. The focus here is on database-layer encryption.

Data encryption

The sooner data encryption occurs, the more secure the information is. Due to
distributed business logic in application and database environments,
organizations must be able to encrypt and decrypt data at different points in
the network and at different system layers, including the database layer.
Encryption performed by the DBMS can protect data at rest, but you must

decide if you also require protection for data while it’s moving between the
applications and the database and between different applications and data
stores. Sending sensitive information over the Internet or within your
corporate network as clear text defeats the point of encrypting the text in the
database to provide data privacy.
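
A minimal sketch of application-layer encryption of a single sensitive
field before it is stored, using the .NET cryptography classes. The
hard-coded key and IV are for illustration only; real key management is
outside the scope of this section.

using System;
using System.Security.Cryptography;
using System.Text;

class FieldEncryptionDemo
{
    // Demo key and IV; in practice these must come from secure key storage.
    static byte[] key = Encoding.ASCII.GetBytes("0123456789abcdef");
    static byte[] iv  = Encoding.ASCII.GetBytes("fedcba9876543210");

    static string Encrypt(string plain)
    {
        using (RijndaelManaged aes = new RijndaelManaged())
        using (ICryptoTransform enc = aes.CreateEncryptor(key, iv))
        {
            byte[] data = Encoding.UTF8.GetBytes(plain);
            byte[] cipher = enc.TransformFinalBlock(data, 0, data.Length);
            return Convert.ToBase64String(cipher); // safe to store in a Varchar column
        }
    }

    static void Main()
    {
        Console.WriteLine(Encrypt("account-password"));
    }
}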

6.2 Creation of User profiles and access rights

Determining user profiles and their privilege domains will contribute to the
creation of a personalized software experience. Effective software must
present only those functions and content that are relevant to a given user
and within the user's domain of privilege. These must also reflect the
specific granularity relevant to the user.
Application personalization requires the establishment of a three-dimensional
framework inclusive of the following:

1. User groups and hierarchies


2. Privilege domain
3. Content domain

Application personalization is thus an intersection of this
three-dimensional framework.

User Groups and Hierarchies

The main purpose of developing user groups and hierarchies is to avoid the
repetitive task of allocating certain sets of privilege and content access
to each individual user. Establishing such groupings allows a set of
privileges to be allocated to all users within a given group with a single
software command. Similarly, through inheritance, multiple user groups
within the
hierarchy may be allocated a set of privileges with a single click.
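
A small sketch of group-based allocation; the group, privilege and
employee names are illustrative only.

using System;
using System.Collections.Generic;

class GroupPrivilegesDemo
{
    static void Main()
    {
        // One privilege set defined per group, not per user.
        Dictionary<string, List<string>> groupPrivileges =
            new Dictionary<string, List<string>>();
        groupPrivileges["HRManagers"] =
            new List<string> { "ViewSalary", "EditSalary", "ApproveLeave" };

        List<string> hrManagers = new List<string> { "E0001", "E0002" };

        // A single assignment covers every member of the group.
        foreach (string emp in hrManagers)
            Console.WriteLine(emp + " -> " +
                string.Join(", ", groupPrivileges["HRManagers"].ToArray()));
    }
}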

Privilege Domain

These privileges are essentially functions that may be performed within a


dashboard software program. Therefore, privilege is simply a software
function, and each privilege domain is a unique collection of those functions.
Often, a set of privileges is collectively referred to as a role. For
example, a list of privileges could be grouped into sets, with each set
associated with a role.

Content Domain

The user groups are defined and the roles have been assigned, but still
unanswered is the question: What data and KPIs does a user see on a
dashboard? Answering this question leads us to the issue of the content
domain, the parameters of which would define the KPI granularity, the
reports, and the alerts for each dashboard user. Managing the content
domain involves two aspects: (1) security and (2) relevance. Security
refers to the restriction of
information delivery only to those with the privilege to access certain
information. Information is inherently confidential, and every organization has
its boundaries regarding who may access what information. The security
framework must be created during a dashboard deployment, determining the
permissions and restrictions on the content domain of each user. Relevance
refers to the filtering of the most relevant content to a given dashboard user.
From all of the permitted information for a given user, an effective dashboard
must present the most relevant content with flexibility for the user to access
more information as needed.

7. Cost Estimation of the Project
Resource Sharing:

The main goal is to make all programs, equipment and data available to
anyone on the network without regard to the physical location of the
resource and the user. Users need to share resources other than files as
well, a common example being printers. Printers are utilized only a small
percentage of the time, so a company may not want to invest in a printer
for each computer. A network can be used in this situation to allow all
users to access any of the available printers.

High-Reliability:

A goal of a computer network is to provide high reliability by having
alternative sources of supply. For example, all files could be replicated
on two or three machines, so if one of them is unavailable (due to hardware
failure), the other copies can be used. In addition, the presence of
multiple CPUs means that if one goes down, another may be able to take over
its work, although at reduced performance. For many applications the
ability to continue operating in the face of hardware problems is of utmost
importance.

Saving-Money:

Small computers have a much better price/performance ratio than larger
ones. Mainframes are roughly a factor of ten faster than personal
computers, but they cost a thousand times more. This imbalance has caused
many system designers to build systems consisting of personal computers,
one per user, with data kept on one or more shared file server machines. In
this model, the users are called ‘clients’ and the whole arrangement is
called the ‘client-server model’.

Scalability:

Scalability is the ability to increase system performance gradually as the
workload grows, just by adding more processors. With centralized
mainframes, when a system is full it must be replaced by a larger one,
usually at great expense and even greater disruption to the users. With the
client-server model, new clients and new servers can be added as needed.

Communication-Medium:

A computer network can provide a powerful communication medium among


widely separated users. Using a computer network, it is easy for two or
more people who are working on some project and who live far apart to write
a report together. When one worker makes a change to an online document,
the others can see the change immediately, instead of waiting several days
for a letter. Such a speedup makes cooperation among far-flung groups of
people easy where it had previously been impossible.

Increase Productivity:

A network increases productivity as several people can not only enter data
at the same time, but also evaluate and process the shared data. So, one
person can handle accounts receivable while someone else processes the
profit and loss statements. We have now seen what factors have to be kept
in mind when calculating the cost estimate of the full network setup. The
network would include:

Server Hardware:

An HP server running Windows 2000, which costs around Rs 1.5 lakh, will be
best suited for the organization's purpose, as it offers strong security
and is easy to manage through its GUI, making it more manageable.

Server Software:

The server runs Windows Server 2000, under which the workstations are
little more than interfaces to the server. The client machines or
workstations need to be powerful enough to lessen the load on the server. A
genuine Intel Pentium-3 machine costs around Rs 32,000, with a
configuration of 512 MB RAM, 40 GB hard disk, floppy drive, multimedia kit
and monitor. The organization requires 10 such client machines.

Antivirus Software and Printers:

The company needs antivirus software for protection against viruses;
antivirus software costs about Rs 10,000. Besides this, the organization
needs dot-matrix printers for printing issue details and other printing
operations. At least four printers are needed, each costing around Rs 4,000.
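
Putting the figures above together, a rough total can be computed as
follows (amounts in rupees, exactly as stated in this section).

using System;

class CostEstimateDemo
{
    static void Main()
    {
        int server    = 150000;      // HP server (Rs 1.5 lakh)
        int clients   = 10 * 32000;  // ten Pentium-3 workstations
        int antivirus = 10000;       // antivirus software
        int printers  = 4 * 4000;    // four dot-matrix printers

        Console.WriteLine("Total: Rs " + (server + clients + antivirus + printers));
        // Total: Rs 496000
    }
}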

8. Reports

9. PERT Chart, Gantt Chart
9.1 PERT Chart

A Program Evaluation and Review Technique (PERT) chart is mainly used for
high-risk projects with various estimation parameters. For each module in a
project, duration is estimated as follows:
1. Time taken to complete a project or module under normal conditions,
tnormal.
2. Time taken to complete a project or module in the minimum time, tmin.
3. Time taken to complete a project or module in the maximum time, tmax.
4. Time taken to complete a project or module based on previous related
history, thistory.
An average of tnormal, tmin, tmax and thistory is taken depending upon the
project.
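
A small sketch of the estimate described above, using a plain average of
the four figures; the sample durations are invented for illustration.
(Classic PERT instead weights the estimates as (tmin + 4*tnormal + tmax) / 6;
the text above uses a simple average.)

using System;

class PertEstimateDemo
{
    static void Main()
    {
        // Hypothetical durations, in days, for one module.
        double tNormal = 20, tMin = 15, tMax = 30, tHistory = 22;

        double estimate = (tNormal + tMin + tMax + tHistory) / 4.0;
        Console.WriteLine("Estimated duration: " + estimate + " days");
    }
}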
[PERT chart diagram: activities User Requirement and Analysis, SRS and
Design, Programming, Alpha Testing, Beta Testing, Buy Hardware,
Installation, Writing User Manuals, Training User, and User Test, with
estimated durations on the connecting arcs.]

Figure 56 : PERT Chart for On Line Recruitment Process & Employee
Management System for HR Group of Company

9.2 Gantt Chart

A Gantt chart is also known as a timeline chart. A Gantt chart can be
developed for the entire project, or a separate chart can be developed for
each function. A tabular form is maintained where rows indicate the tasks
with milestones and columns indicate duration (weeks/months). The
horizontal bars that span across columns indicate the duration of each task.
[Gantt chart: tasks Requirement Gathering, Design, Test Cases, Coding,
Testing, and Build plotted against the months June to September.]

Figure 57 : Gantt Chart

10. Future scope and further enhancement of the
Project

10.1 Further Scope

We test a system proposal according to its workability, impact on the
organization, ability to meet users' needs, and effective use of resources;
this focuses on the following major questions:

What are the user's demonstrable needs and how does a candidate system
meet them?
What resources are available for the given system?
Is the problem worth solving?
What is the likely impact of the system on the organization?

Each of these questions has to be answered carefully. They revolve around
investigation and evaluation of the problem; identification and description
of candidate systems; specification of the performance and cost of each
system; and the final selection of the best system.

• End User Support


1. The proposed system is developed in ASP.NET and SQL
Server.
2. If the organization adds more users, it just has to add more machines
and install the software, which is in the form of an exe, on them.

• Security
Security features are implemented. No unauthorized access to the
package, as the security is implemented through login and password.

Last but not least, one of the most important advantages of the HRP system
is that it can be used in any government or public organization to process
and manage the working of their HR department, with slight modifications.

There is no doubt that there always remains some scope for improvement.
The important thing is that the system developed should be flexible enough
to accommodate any future enhancements. This system can accommodate some
enhancements without rewriting the existing code.

10.2 Further Enhancement of the Project

Everything that is built leaves something to be added to make it better
than previous versions. The project “On Line Recruitment Process & Employee
Management System for HR Group of Company” also falls in the same domain.

Although an attempt has been made to develop a robust and fault-free
system, enough flexibility has still been provided for further enhancements
and modifications. As mentioned earlier, the designed forms are typically
reflections of the developer, so I strongly believe that enhancements to
the project would begin with design changes and coding changes. At the same
time, since one cannot claim to be a master of the technology, there is
always some scope for technical modifications in the project, such as
removing code redundancy and minimizing storage space.

• Since the data is retrieved from tables where everything is based on
the coding system, if the coding system is changed then the system needs
to be redesigned.
• The number of queries can always be increased when needed by the
user just by modifying the code a little; full concentration was maintained
on the design of the system so that it can be easily modified.
• The design of the system can be changed in terms of the flow of
control so that the coding can be decreased to a considerable level.
• The developed sub modules have all the related features but still
improvement can be done. The developed package is flexible enough to
incorporate the modifications or enhancements with less alteration.

The Human Resource Management System can easily be incorporated into an
ERP system, as it is in itself a separate module alongside the other
modules. In future, web-enabled features can also be included in the
software so that the information can be retrieved globally.

11. Bibliography

Black Book on ASP.NET

Microsoft SQL Server 2000 in 21 Days


by Richard Waymire, Rick Sawtell

Software Engineering
by Roger S. Pressman

Software Engineering: An Integrated Approach

by Pankaj Jalote

Referenced Sites

www.msdn.microsoft.com
www.w3schools.com
www.microsoft.com

12. Appendices

12.1 Introduction to Visual Studio.net

Visual Studio .NET is a complete set of development tools for building ASP
Web applications, XML Web services, desktop applications, and mobile
applications. Visual Basic .NET and Visual C++ .NET all use the same
integrated development environment (IDE), which allows them to share tools
and facilitates the creation of mixed-language solutions. In addition,
these languages leverage the functionality of the .NET Framework, which
provides access to key technologies that simplify the development of ASP
Web applications and XML Web services.
The architecture is explained from bottom to top in the following
discussion:

[Diagram: the .NET architecture as layers, from top to bottom: VB, C++, C#,
JScript; Common Language Specification; ASP.NET (Web Services and Web
Forms) and Windows Forms; ADO.NET (Data and XML); Base Classes; Common
Language Runtime.]

Figure 58 : Visual Studio 2005

1. At the bottom of the architecture is common language Runtime. The


common language runtime loads and executes code that targets the
runtime. This code is therefore called managed code.
2. .NET Framework provides a rich set of class libraries. These include
base classes, like networking and input/output classes, a data library
for data access, and classes for use by programming tools, such as
debugging services.

3. ADO.NET is Microsoft’s ActiveX Data Object (ADO) model for
the .NET Framework. ADO.NET is intended specifically for
developing web applications.
4. The fourth layer of the framework consists of the Windows application
model and, in parallel, the Web application model. The Web application
model, presented here as ASP.NET, includes Web Forms and Web Services.
ASP.NET comes with built-in Web Forms controls, which are responsible
for generating the user interface. They mirror typical HTML widgets
like text boxes or buttons.
5. One of the obvious themes of .NET is unification and interoperability
between various programming languages. In order to achieve this;
certain rules must be laid and all the languages must follow these rules.
6. The CLR and the .NET Framework in general, however, are designed in
such a way that code written in one language can be seamlessly used by
another language. Hence ASP.NET can be programmed in any of the
.NET-compatible languages, whether VB.NET, C#, Managed C++ or
JScript .NET.

The .NET Framework

The .NET Framework is a multi-language environment for building, deploying,
and running XML Web services and applications. It consists of three main
parts:

Common Language Runtime: Despite its name, the runtime actually has a role
in both a component's development time and run time. While a component is
running, the runtime is responsible for managing memory allocation,
starting up and stopping threads and processes, and enforcing security
policy, as well as satisfying any dependencies that the component might
have on other components.

The Common Language Runtime is the execution engine for .NET Framework
applications. It provides a number of services, including the following:

• code management – loading and execution;


• application memory isolation;
• verification of type safety;
• conversion of IL (platform-independent code generated by
compilers) to native, platform-dependent code;
• access to metadata, which is enhanced type information;
• managing memory for managed objects;
• enforcement of code access security;
• exception handling, including cross-language exceptions;
• interoperation between managed code, COM objects, and pre-
existing DLLs (unmanaged code and data);
• automation of object layout;
• support for developer services – profiling, debugging, and so on.

Unified programming classes: The Framework provides developers with a
unified, object-oriented, hierarchical, and extensible set of class
libraries (APIs). Developers use the Windows Foundation classes.

The .NET Framework class library

The .NET Framework includes classes, interfaces, and value types that are
used in the development process and provide access to system functionality.
To facilitate interoperability between languages, the .NET Framework types
are Common Language Specification (CLS) compliant and can therefore be
used from any programming language where the compiler conforms to the
CLS.

The .NET Framework types are the foundation on which .NET applications,
components, and controls are built. The .NET Framework includes types that
perform the following functions:

• represent base data types and exceptions;


• encapsulate data structures;
• perform I/O operations;
• access information about loaded types via reflections;
• invoke .NET Framework security checks;
• Provide data access, rich client-side GUI, and server-controlled, client-
side GUI.

The .NET Framework provides a rich set of interfaces, as well as abstract and
concrete (non-abstract) classes. You can use the concrete classes as is or, in
many cases, derive your own classes from them, as well as from abstract
classes. To use the functionality of an interface, you can either create a class
that implements the interface or derive a class from one of the .NET
Framework classes that implements the interface.

Microsoft, with the help of Hewlett-Packard and Intel, supplied the
OS-independent subset of the .NET class library to the ECMA standardization
board.

Objectives Of .Net Framework:

The .NET Framework is designed to fulfill the following objectives:


• To provide a consistent object-oriented programming
environment whether object code is stored and executed locally,
executed locally but Internet-distributed, or executed remotely.
• To provide a code-execution environment that minimizes
software deployment and versioning conflicts.

• To provide a code-execution environment that guarantees safe
execution of code, including code created by an unknown or
semi-trusted third party.
• To provide a code-execution environment that eliminates the
performance problems of scripted or interpreted environments.
• To make the developer experience consistent across widely
varying types of applications, such as Windows-based
applications and Web-based applications.
• To build all communication on industry standards to ensure that
code based on the .NET Framework can integrate with any
other code.

Server Application Development

Server-side applications in the managed world are implemented through
runtime hosts. Unmanaged applications host the common language runtime,
which allows your custom managed code to control the behavior of the server.
This model provides us with all the features of the common language runtime
and class library while gaining the performance and scalability of the host
server.

ADO.NET

ADO.NET is a set of libraries included with the Microsoft .NET Framework


that help you communicate with various data stores from .NET applications.
The ADO.NET libraries include classes for connecting to a data source,
submitting queries, and processing results. You can also use ADO.NET as a
robust, hierarchical, disconnected data cache to work with data offline.
The central disconnected object, the DataSet, allows you to sort, search,
filter, store pending changes, and navigate through hierarchical data. The
DataSet also includes a number of features that bridge the gap between
traditional data access and XML development. Developers can now work with
XML data through traditional data access interfaces and vice versa.
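
A minimal sketch of this disconnected pattern, assuming the project's
emp_master table and a placeholder connection string.

using System;
using System.Data;
using System.Data.SqlClient;

class DataSetDemo
{
    static void Main()
    {
        DataSet ds = new DataSet();
        using (SqlConnection con = new SqlConnection(
            "Data Source=.;Initial Catalog=HRMS;Integrated Security=True"))
        using (SqlDataAdapter adp = new SqlDataAdapter(
            "select emp_code, emp_name from emp_master", con))
        {
            adp.Fill(ds, "emp");  // the adapter opens and closes the connection itself
        }

        // The data can now be sorted, filtered, and edited offline.
        foreach (DataRow row in ds.Tables["emp"].Select("", "emp_name ASC"))
            Console.WriteLine(row["emp_code"] + " " + row["emp_name"]);
    }
}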

Microsoft Visual Studio .NET includes a number of data access features you
can use to build data access applications. Many of these features can save you
time during the development process by generating large amounts of tedious
code for you. Other features improve the performance of the applications you
build by storing metadata and updating logic in your code rather than fetching
this information at run time. Believe it or not, many of Visual Studio .NET’s
data access features accomplish both tasks.

The ADO.NET Object Model

Now that you understand the purpose of ADO.NET and where it fits into the
overall Visual Studio .NET architecture, it’s time to take a closer look at the

technology. We'll now take a brief look at the ADO.NET object
model and see how it differs from past Microsoft data access technologies.

ADO.NET is designed to help developers build efficient multi-tiered database


applications across intranets and the Internet, and the ADO.NET object
model provides the means. The object model can be divided into two halves.
On one side are the “connected” objects, which communicate directly with
your database to manage the connection and transactions as well as to
retrieve data from and submit changes to your database. On the other side
are the “disconnected” objects, which allow a user to work with data
offline.

.NET Data Providers

A .NET data provider is a collection of classes designed to allow you to


communicate with a particular type of data store. The .NET Framework
includes two such providers, the SQL Client .NET Data Provider and the OLE
DB .NET Data Provider. The OLE DB .NET Data Provider lets you
communicate with various data stores through OLE DB providers. The SQL
Client .NET Data Provider is designed solely to communicate with SQL
Server databases, version 7 and later.

Each .NET data provider implements the same base classes—Connection,


Command, DataReader, Parameter, and Transaction—although their actual
names depend on the provider. For example, the SQL Client .NET Data
Provider has a SqlConnection object, and the OLE DB .NET Data Provider
includes an OleDbConnection object. Regardless of which .NET data provider
you use, the provider’s Connection object implements the same basic features
through the same base interfaces. To open a connection to your data store, you
create an instance of the provider’s connection object, set the object’s
ConnectionString property, and then call its Open method.
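
The three steps just described look as follows with the SQL Client .NET
Data Provider; the connection string is a placeholder.

using System;
using System.Data.SqlClient;

class ConnectionDemo
{
    static void Main()
    {
        // 1) Create an instance of the provider's connection object.
        SqlConnection con = new SqlConnection();

        // 2) Set its ConnectionString property (placeholder values).
        con.ConnectionString =
            "Data Source=.;Initial Catalog=HRMS;Integrated Security=True";

        // 3) Call its Open method.
        con.Open();
        Console.WriteLine("State: " + con.State); // Open
        con.Close();
    }
}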

Each .NET data provider has its own namespace. The two providers included
in the .NET Framework are subsets of the System.Data namespace, where the
disconnected objects reside. The OLE DB .NET Data Provider resides in the
System.Data.OleDb namespace, and the SQL Client .NET Data Provider
resides in System.Data.SqlClient.

Namespaces
A namespace is a logical grouping of objects. The .NET Framework is large,
so to make developing applications with the .NET Framework a little easier,
Microsoft has divided the objects into different namespaces.
The most important reason for using namespaces is to prevent name collisions
in assemblies. With different namespaces, programmers working on different
components combined into a single solution can use the same names for
different items. Since these names are separated, they don’t interfere with each
other at compile time. A more practical reason for namespaces is that grouping
objects can make them easier to locate.
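
A short sketch of how namespaces (here with aliases) keep two similarly
named connection classes apart without collision.

using SqlClientAlias = System.Data.SqlClient;
using OleDbAlias = System.Data.OleDb;

class NamespaceDemo
{
    static void Main()
    {
        // Both providers define a connection class; the namespace
        // qualifies which one is meant.
        SqlClientAlias.SqlConnection a = new SqlClientAlias.SqlConnection();
        OleDbAlias.OleDbConnection b = new OleDbAlias.OleDbConnection();
        System.Console.WriteLine(a.GetType().Namespace); // System.Data.SqlClient
        System.Console.WriteLine(b.GetType().Namespace); // System.Data.OleDb
    }
}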

Crystal Reports
Crystal Reports for Visual Studio .NET is the standard reporting tool for
Visual Studio .NET: it brings the ability to create interactive,
presentation-quality content, which has been the strength of Crystal
Reports for years, to the .NET platform.

Crystal Reports for Visual Studio .NET is an integrated component of the
Visual Studio .NET development environment.

12.2 Microsoft SQL SERVER


Microsoft® SQL Server™ 2000 is designed to work effectively as:
• A central database on a server shared by many users who connect to it
over a network. The number of users can range from a handful in one
workgroup, to thousands of employees in a large enterprise, to
hundreds of thousands of Web users.
• A desktop database that services only applications running on the same
desktop.

Server Database Systems

Server-based systems are constructed so that a database on a central computer,


known as a server, is shared among multiple users. Users access the server
through an application:
• In a multi tier system, such as Windows® DNA, the client application
logic is run in two or more locations.
• A thin client is run on the user's local computer and is focused
on displaying results to the user.
• The business logic is located in server applications running on a
server. Thin clients request functions from the server application,
which is itself a multithreaded application capable of working with
many concurrent users. The server application is the one that opens
connections to the database server.
This is a typical scenario for an Internet application. For example, a
multithreaded server application can run on a Microsoft® Internet Information
Services (IIS) server and service thousands of thin clients running on the
Internet or an intranet. The server application uses a pool of connections to
communicate with one or more instances of SQL Server 2000. The instances
of SQL Server 2000 can be on the same computer as IIS, or they can be on
separate servers in the network.
• In a two-tier client/server system, users run an application on their
local computer, known as a client application, that connects over a
network to an instance of SQL Server 2000 running on a server
computer. The client application runs both business logic and the code
to display output to the user, so this is sometimes referred to as a thick
client.

Advantages of Server Database System
Having data stored and managed in a central location offers several
advantages:
• Each data item is stored in a central location where all users can work
with it.
• Business and security rules can be defined one time on the server and
enforced equally among all users.
• A relational database server optimizes network traffic by returning
only the data an application needs.
• Hardware costs can be minimized.
• Maintenance tasks such as backing up and restoring data are simplified
because they can focus on the central server.

Advantages of SQL Server 2000 as a Database Server

Microsoft SQL Server 2000 is capable of supplying the database services


needed by extremely large systems. Large servers may have thousands of
users connected to an instance of SQL Server 2000 at the same time. SQL
Server 2000 has full protection for these environments, with safeguards that
prevent problems, such as having multiple users trying to update the same
piece of data at the same time. SQL Server 2000 also allocates the available
resources effectively, such as memory, network bandwidth, and disk I/O,
among the multiple users. Extremely large Internet sites can partition their
data across multiple servers, spreading the processing load across many
computers, and allowing the site to serve thousands of concurrent users.
Multiple instances of SQL Server 2000 can be run on a single computer. For
example, an organization that provides database services to many other
organizations can run a separate instance of SQL Server 2000 for each
customer organization, all on one computer. This isolates the data for each
customer organization, while allowing the service organization to reduce
costs by only having to administer one server computer.
SQL Server 2000 applications can run on the same computer as SQL Server
2000. The application connects to SQL Server 2000 using Windows
Interprocess Communications (IPC) components, such as shared memory,
instead of a network. This allows SQL Server 2000 to be used on small
systems where an application must store its data locally.

[Diagram: Internet clients and a legacy client/server system connecting to
SQL Server 2000.]

Figure 59 : SQL Server 2000 Organisation
The illustration above shows an instance of SQL Server 2000 operating as the
database server for both a large Web site and a legacy client/server system.
The largest Web sites and enterprise-level data processing systems often
generate more database processing than can be supported on a single
computer. In these large systems, the database services are supplied by a group
of database servers that form a database services tier. SQL Server 2000
supports a mechanism that can be used to partition data across a group of
autonomous servers. Although each server is administered individually, the
servers cooperate to spread the database-processing load across the group.

What's New in Microsoft SQL Server 2000


Microsoft® SQL Server™ 2000 extends the performance, reliability, quality,
and ease-of-use of Microsoft SQL Server version 7.0. Microsoft SQL Server

2000 includes several new features that make it an excellent database platform
for large-scale online transactional processing (OLTP), data warehousing, and
e-commerce applications.

The OLAP Services feature available in SQL Server version 7.0 is now called
SQL Server 2000 Analysis Services. The term OLAP Services has been
replaced with the term Analysis Services. Analysis Services also includes a
new data mining component.

The Repository component available in SQL Server version 7.0 is now


called Microsoft SQL Server 2000 Meta Data Services. References to the
component now use the term Meta Data Services. The term repository is used
only in reference to the repository engine within Meta Data Services.

Relational Database Enhancements


Microsoft® SQL Server™ 2000 introduces several server improvements and
new features:
1. XML Support
2. Federated Database Servers
3. User-Defined Functions
4. Indexed Views
5. New Data Types
6. INSTEAD OF and AFTER Triggers
7. Cascading Referential Integrity Constraints
8. Full-Text Search Enhancements
9. Multiple Instances of SQL Server
10. Index Enhancements
11. Failover Clustering Enhancements
12. Net-Library Enhancements
13. 64-GB Memory Support
14. Distributed Query Enhancements
15. Updatable Distributed Partitioned Views
16. Kerberos and Security Delegation
17. Backup and Restore Enhancements
18. Scalability Enhancements for Utility Operations

Client Components
Clients do not access Microsoft® SQL Server™ 2000 directly; instead, clients
use applications written to access the data in SQL Server. SQL Server 2000
supports two main classes of applications:
• Relational database applications that send Transact-SQL statements to
the database engine; results are returned as relational result sets.
• Internet applications that send either Transact-SQL statements or
XPath queries to the database engine; results are returned as XML
documents.

Relational Database APIs


SQL Server 2000 provides native support for two main classes of database
APIs:

• OLE DB SQL Server 2000 includes a native OLE DB provider. The
provider supports applications written using OLE DB, or other APIs
that use OLE DB, such as ActiveX Data Objects (ADO). Through the
native provider, SQL Server 2000 also supports objects or components
using OLE DB, such as ActiveX, ADO, or Windows DNA
applications.
• ODBC SQL Server 2000 includes a native ODBC driver. The driver
supports applications or components written using ODBC, or other
APIs using ODBC, such as DAO, RDO, and the Microsoft Foundation
Classes (MFC) database classes.
Additional SQL Server API Support
SQL Server 2000 also supports:
• DB-Library
• Embedded SQL

Client Communications
The Microsoft OLE DB Provider for SQL Server 2000, the SQL Server 2000
ODBC driver, and DB-Library are each implemented as a DLL that
communicates to SQL Server 2000 through a component called a client Net-
Library.

MS DTC Service
The Microsoft Distributed Transaction Coordinator (MS DTC) is a transaction
manager that allows client applications to include several different sources of
data in one transaction. MS DTC coordinates committing the distributed
transaction across all the servers enlisted in the transaction.

An installation of Microsoft® SQL Server™ can participate in a distributed
transaction by:
• Calling stored procedures on remote servers running SQL Server.
• Automatically or explicitly promoting the local transaction to a
distributed transaction and enlisting remote servers in the transaction.

Figure 60 : SQL Server Transaction

Using Data Types


Objects that contain data have an associated data type that defines the kind of
data (character, integer, binary, and so on) the object can contain. The
following objects have data types:
• Columns in tables and views.
• Parameters in stored procedures.
• Variables.
• Transact-SQL functions that return one or more data values of a
specific data type.
• Stored procedures that have a return code, which always has an integer
data type.

Assigning a data type to an object defines four attributes of the object:
the kind of data contained by the object; the length or size of the stored
value; the precision of the number (numeric data types only); and the scale
of the number (numeric data types only).

Transact-SQL has these base data types:

bigint, binary, bit, char, cursor, datetime, decimal, float, image, int,
money, nchar, ntext, nvarchar, real, smalldatetime, smallint, smallmoney,
text, timestamp, tinyint, varbinary, varchar, and uniqueidentifier.

Constraints
Constraints allow you to define the way Microsoft® SQL Server™ 2000
automatically enforces the integrity of a database. Constraints define rules
regarding the values allowed in columns and are the standard mechanism for
enforcing integrity.

Classes of Constraints
SQL Server 2000 supports five classes of constraints.
• NOT NULL specifies that the column does not accept NULL values.
• CHECK constraints enforce domain integrity by limiting the values
that can be placed in a column.
• UNIQUE constraints enforce the uniqueness of the values in a set of
columns.
• PRIMARY KEY constraints identify the column or set of columns
whose values uniquely identify a row in a table.
• FOREIGN KEY constraints identify the relationships between tables.
For deletions of referenced rows, NO ACTION specifies that the
deletion fails with an error, while CASCADE specifies that all the
rows with foreign keys pointing to the deleted row are also deleted.
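
As a sketch, the constraint classes above could be declared in Transact-SQL
DDL and executed through ADO.NET; the emp_demo table, the referenced
dept_master table, and the connection string are hypothetical.

using System.Data.SqlClient;

class ConstraintDemo
{
    static void Main()
    {
        // Hypothetical table showing each class of constraint.
        string ddl = @"
            create table emp_demo (
                emp_code  varchar(10) not null primary key,           -- NOT NULL, PRIMARY KEY
                dept_code varchar(10) references dept_master(dept_code)
                          on delete no action,                        -- FOREIGN KEY
                email     varchar(50) unique,                         -- UNIQUE
                salary    numeric(9)  check (salary >= 0)             -- CHECK
            )";
        using (SqlConnection con = new SqlConnection(
            "Data Source=.;Initial Catalog=HRMS;Integrated Security=True"))
        {
            con.Open();
            new SqlCommand(ddl, con).ExecuteNonQuery();
        }
    }
}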

SQL Server Enterprise Manager


SQL Server Enterprise Manager is the primary administrative tool for
Microsoft® SQL Server™ 2000 and provides a Microsoft Management
Console (MMC)–compliant user interface that allows users to:
• Define groups of servers running SQL Server.
• Register individual servers in a group.
• Configure all SQL Server options for each registered server.
• Create and administer all SQL Server databases, objects, logins, users,
and permissions in each registered server.
• Define and execute all SQL Server administrative tasks on each
registered server.
• Design and test SQL statements, batches, and scripts interactively by
invoking SQL Query Analyzer.
• Invoke the various wizards defined for SQL Server.

SQL Server Query Analyzer


SQL Server Query Analyzer is a graphical user interface for designing and
testing Transact-SQL statements, batches, and scripts interactively. SQL
Server Query Analyzer offers the following features:
• Free-form text editor for keying in Transact-SQL statements.

• Color coding of Transact-SQL syntax to improve the readability of
complex statements.
• Results presented in either a grid or a free form text window.
• Graphical diagram of the show plan information showing the logical
steps built into the execution plan of a Transact-SQL statement.
• Index tuning wizard to analyze a Transact-SQL statement and the
tables it references to see if adding additional indexes will improve
the performance of the query.

