1. INTRODUCTION
Introduction of Project
Competition in today's age has a new face: agility. Companies need to act and
react faster in a rapidly changing business environment, and smart information
management is the key to a company's growth. The human resource segment of a
company has often found itself constrained, perhaps because each functional
area works in isolation from the others, causing confusion and other adverse
effects on the growth and development of the organization.
The On Line Recruitment Process & Employee Management System for HR Group of
Company is an enterprise-wide single application consisting of solutions for
all functional areas of an HR enterprise.
That is a tall order: building a single software program that serves the needs
of people in finance as well as it does the people in human resources and in
the warehouse. Each of those departments typically has its own computer
system, each optimized for the particular ways that the department does its
work. But ERP combines them all into a single, integrated software program
that runs on a single database, so that the various departments can more
easily share information and communicate with each other.
• Recruitment/Selection Process.
• Personal details of the employees.
• Salary details.
• Training attended by the employees (inside & outside the company).
• Work experience of employees.
• Report generation.
Recruitment Management
This module covers all the activities from the requisition of a post till a
candidate is employed in the company. The major activities performed during
the course of action are: a requisition for a post is given by the HOD to the
HR department. Based on this, openings are advertised, for which candidates
submit their resumes; shortlisting of candidates based on their skills is
done, and then they are called for interview, which can be taken in many
rounds. After all rounds are over, the HR department gets a clear picture of
the interview status of the selected candidates.
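As an illustration of the shortlisting step described above, here is a minimal sketch; the candidate data and skill names are hypothetical, not taken from the actual system:

```python
# Hypothetical sketch of the shortlisting step: candidates whose skills
# cover the skills required for a post are called for interview.
def shortlist(candidates, required_skills):
    """Return the candidates possessing every required skill."""
    required = set(required_skills)
    return [c for c in candidates if required <= set(c["skills"])]

candidates = [
    {"name": "A. Kumar", "skills": ["sql", "asp.net"]},
    {"name": "B. Singh", "skills": ["sql"]},
]
print(shortlist(candidates, ["sql", "asp.net"]))  # only A. Kumar qualifies
```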
Employee Profile
What follows is the employee profile, wherein existing employee records and
recruited candidates are maintained. Qualification details and family-related
records are maintained, and training given to him/her is kept on record.
Employee transfer and promotion are also handled in this profile.
Payroll Management
This module handles activities like the creation of salary heads, the formula
for any salary head, and assigning these to entire departments.
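The salary-head idea can be sketched as follows; the head names, percentages and formulas here are illustrative assumptions, not the system's actual configuration:

```python
# Minimal sketch: salary heads defined as formulas over basic pay,
# assigned department-wide (head names and percentages are illustrative).
salary_heads = {
    "hra": lambda basic: 0.40 * basic,   # house rent allowance
    "da":  lambda basic: 0.10 * basic,   # dearness allowance
    "pf":  lambda basic: -0.12 * basic,  # provident fund deduction
}

def gross_salary(basic):
    """Basic pay plus every head computed from its formula."""
    return basic + sum(f(basic) for f in salary_heads.values())

print(gross_salary(10000))  # 10000 + 4000 + 1000 - 1200 = 13800.0
```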
1.4 Project's Objectives
There are several objectives behind developing this system. The most
important are:
1) Capability
Business activities are influenced by an organization's ability to
process information quickly and efficiently. The HRP system adds
capability in the following ways:
• Improved processing speed
The inherent speed with which computers process information is one
reason why organizations seek the development of system projects.
• Faster retrieval of information
Locating and retrieving information from storage quickly, including
the ability to conduct complex searches.
2) Control
• Greater accuracy and consistency
Carrying out computing steps, including arithmetic, correctly
and consistently.
• Better security
Safeguarding sensitive and important information in a form that is
accessible only to authorized persons.
3) Communication
• Enhanced communication
Speeding the flow of information and messages between remote
locations as well as within offices. This includes the transmission
of documents within offices.
• Integration of business areas
Coordinating business activities taking place in separate areas of
an organization through the capture and distribution of information.
4) Cost
• Monitor costs
Tracking the performance of employees and overheads is essential to
determine whether a firm is performing in line with expectations and
within budget.
• Reduce costs
Using computing capability to process at a lower cost than is
possible with other methods, while maintaining accuracy and
performance levels.
Features
• Reduced paperwork and increased automation
• Very fast processing
• Efficient management of information
• Improved security
• Data safety through redundancy
• Quick response to ad hoc queries
• Integrity of the data is maintained
• Transparency in the system
Introduction of Organisation
e.Soft was incorporated with the prime objective of providing on-site and
off-site professional services specializing in system integration,
application development, CAD and web services.
2. SYSTEM ANALYSIS
[Figure 1: requirements-analysis process — elements: user, developer,
generated request, managers, problem statement, user interviews, experience,
object model, functional model.]
In this project we use the Structured Analysis technique to understand the
exact requirements of the organization. In this technique we divide the main
problem into sub-problems and solve them separately: the Recruitment module,
the Employee module, the Payroll module and the Master Database module.
There are a number of employees, and keeping details of all of them is very
tedious. The existing manual information system is not efficient enough to
answer all the queries related to the employees.
The main objective in developing the project is to make the Personnel
Information System simple and easy, and to increase the productivity of
managers in taking decisions, because all the information is available in an
organized form.
This software provides a user-friendly interface and reduces data redundancy.
Centralized information is available, which can be accessed by a number of
users.
In order to cope with the above problems, this software has been designed and
developed to computerize the working of the Human Resource Department.
All projects are feasible given unlimited resources and infinite time!
Unfortunately, the development of a computer-based system is more likely to
be plagued by a scarcity of resources and difficult delivery dates. It is
both necessary and prudent to evaluate the feasibility of the project at the
earliest possible time. Months or years of effort, money lost and untold
professional embarrassment can be averted if we understand the project better
at study time.
The feasibility of a project is analyzed within some framework. If the
project is found feasible and desirable, it is included in the management's
schedule so that approval can be obtained. In conducting the feasibility
study, the analyst considers seven distinct, but inter-related, types of
feasibility. They are:
• Technical Feasibility
• Economic Feasibility
• Operational Feasibility
• Social Feasibility
• Management Feasibility
• Legal Feasibility
• Time Feasibility
Technical Feasibility
Technical analysis begins with an assessment of the technical viability of
the proposed system. We have to mention what technologies are required to
accomplish system function and performance, and also study how these
technology issues affect cost.
The existing technology seems sufficient to run the new system. The
data-holding facility also seems sufficient because we are using a SQL
RDBMS, which can handle large volumes of data; hence, if the number of
employees increases in the near future, it can handle it easily.
Operational Feasibility
It seems that management of the company is very much interested in the new
system. The management and the users are normally the same members so
there is no problem of conflict between the management & users.
2.4 Project Planning
3. Project Completion - In this stage, the team has to update the site
regularly. Each new vacancy has to be added by the HR Manager according to
needs and demands. This stage is very important for the freshness of the
site.
When any update or upgrade is required for the website, the developers or
the maintenance team bring the website up to date.
There are many requirements after the completion of the project. As this is
a dynamic website, many changes are required, such as updating current
vacancies, interview dates or the selected-candidates list, so there is
always a way to do this.
2.5 Project Scheduling
An average of t_normal, t_min, t_max and t_history is taken, depending upon
the project.
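That averaging can be sketched as below; the classical PERT weighted estimate is shown alongside for comparison, and the estimate values are made up:

```python
def expected_time(t_min, t_normal, t_max, t_history):
    """Plain average of the four estimates, as described above."""
    return (t_min + t_normal + t_max + t_history) / 4

def pert_time(t_min, t_likely, t_max):
    """Classical PERT weighted estimate, for comparison."""
    return (t_min + 4 * t_likely + t_max) / 6

print(expected_time(3, 5, 8, 4))  # (3 + 5 + 8 + 4) / 4 = 5.0
print(pert_time(3, 5, 8))         # (3 + 20 + 8) / 6 ≈ 5.17
```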
[PERT network diagram: eleven numbered nodes with activity durations
(3, 5, 8, 10, 15, 20, 35, ...) on the arcs, including beta testing and
programming activities.]
Figure 2 : PERT Chart for On Line Recruitment Process & Employee
Management System for HR Group for a Company
A Gantt chart is also known as a timeline chart. A Gantt chart can be
developed for the entire project, or a separate chart can be developed for
each function. A tabular form is maintained where rows indicate the tasks
with milestones and columns indicate duration (weeks/months). The horizontal
bars that span the columns indicate the duration of each task.
[Gantt chart with bars for: Requirement Gathering, Design, Test Cases,
Coding, Testing, Build.]
2.6 Software requirement specifications (SRS)
The origin of most software systems is the need of a client, who either
wants to automate an existing manual system or desires a new software
system. The developer creates the software system, and the end users use the
completed system. There are three major parties interested in a new system:
the client, the users and the developer. Somehow the requirements for the
system that will satisfy the needs of the client and the concerns of the
users have to be communicated to the developer. The problem is that the
client usually does not understand software or the software development
process, and the developer often does not understand the client's problem
and application area.
The basic purpose of the SRS is to bridge this communication gap. The SRS is
the medium through which the client's and the users' needs are accurately
specified; indeed, the SRS forms the basis of software development. A good
SRS should satisfy all the parties -- something very hard to achieve -- and
involves trade-offs and persuasion. An important purpose of developing an
SRS is helping the client understand their own needs.
Advantages of SRS
• An SRS establishes the basis for agreement between the client and the
supplier on what the software product will do.
• An SRS provides a reference for validation of the final product.
• A high-quality SRS is a prerequisite to high-quality software.
• A high-quality SRS reduces development cost.
Characteristics of an SRS
A good SRS is
1. Correct
2. Complete
3. Unambiguous
4. Verifiable
5. Consistent
6. Ranked for importance and/or stability
7. Modifiable
8. Traceable
Requirement specification document
The requirement for the proposed system is that it should be less vulnerable
to the mistakes made due to entry at two or three levels and to calculation
errors. Moreover, the control of the administrator should be centralized.
This will support the HR department's working process.
(1) Introduction
(a) Purpose of the software
The purpose of the proposed system is to provide an efficient
information system for management, departments and employees. The main
objective in developing the project is to make the information part
simple and to provide user-friendly access to this program for all the
staff members of the organization, so that they can locate and reply to
the inquiries concerning them.
(b) Scope
The software is prepared for e.soft technologies Ltd., but it can be
implemented in any organization with a few minor changes. The software
finds good scope in any organization having an HR department. Talking to
the administrator and the employees who were dealing with the HR
department, we came to know that the manual system was not up to the
mark, due to the cumbersome data entry and the ample calculations on the
basis of which reports are generated.
Administrator section
This section can be accessed by providing the administrator password. In
this section the administrator can authorize persons to update data. The
administrator can edit the master table information and payroll information.
Report Section
This section is developed using Crystal Report as a report generation
tool and SQL Server as back-end.
(c) External Interface Requirement
The software must be a user-friendly platform to reduce the complexity of
operation. The Human Resource Management System should be capable of
supporting a multi-user environment. The software is based on a
client-server architecture, so that one or more users can make entries in
the software system as well as view reports at the same time.
Software Tools
Front-end tool: Microsoft ASP.NET 2.0 with C# - features are
User friendly
Low-cost solution
GUI features
Better designing aspects
Back-end tool: Microsoft SQL Server 2000 - features are
Security
Portability
Quality
Platform
Windows platforms like 2000 Professional, XP & Vista
Hardware Specification:
Intel Pentium or Celeron class processor
Processor speed - 1.2 GHz or above
RAM - 512 MB
HDD - 40 GB
FDD - 1.44 MB
Monitor - 14" SVGA
Printer - Dot Matrix/Inkjet/Laser printer
Mouse - Normal
Keyboard - Normal
2.7 Software Engineering Paradigm applied
During the requirement analysis phase, the development team analyzes the
requirements to be fulfilled by the On Line Recruitment Process & Employee
Management System for HR Group of Company and identifies the possible
approaches for meeting these requirements.
Finally, the team identifies that the On Line Recruitment Process & Employee
Management System for HR Group of Company should:
1. Enable a visitor to register with a resume and application, after
validations have been performed on the data provided by the user.
2. Enable a visitor to perform activities such as sending an application or
posting a resume.
3. Enable a registered user to view his/her different reports.
4. Enable the administrator to view the database and the different reports
generated by the HR Manager.
5. Enable the HR Manager to add to and view the database to update the
records of employees.
[Software process diagram showing methods and tools.]
Figure 4 : Software Process
Iterative Development
[Iterative development diagram: operational requirements and requirements
specifications feed an architectural design; design partitioning produces
incremental builds, each undergoing incremental design, verification and
validation, through repeated design and code steps.]
Figure 5 : Iterative Development Model
Analysis of website
Design Model
A process model for the website is chosen based on the nature of the project
and the application.
(2) Bottom up
Both approaches have their merits and demerits. For this system the top-down
approach has been used. It starts by identifying the major components of the
system, decomposing them into their lower-level components and iterating
until the desired level of detail is achieved.
Top-down design methods often result in some form of stepwise refinement.
Starting from an abstract design, in each step the design is refined to a
more concrete level, until we reach a level where no more refinement is
needed and the design can be implemented directly. A top-down approach is
suitable only if the specifications of the system are clearly known and the
system is being developed from scratch.
The user of the existing system defines his general objectives for the
software but does not identify detailed input, processing or output
requirements. So, I have chosen the PROTOTYPING approach to develop this
software.
Prototyping
In prototyping, a throwaway prototype is built to help understand the
requirements. This prototype is developed based on the currently known
requirements. The prototype obviously undergoes design, coding & testing,
but each of these phases is not done very formally or thoroughly. By using
this prototype the client can get an actual feel of the system, because the
interactions with the prototype can enable the client to better understand
the requirements of the desired system.
[Prototyping model diagram: a requirement analysis, design, code, test cycle
for the prototype, followed by requirement analysis, design and code for the
final system.]
Because the system is complicated and large and there is no existing
(computerized) system, prototyping is an attractive idea. In this situation,
letting the client test the prototype provides valuable inputs, which help
in determining the requirements of the system. It is also an effective
method of demonstrating the feasibility of a certain approach.
A data model is an abstract model that describes how data is represented and
accessed.
• The structural part: a collection of data structures which are used to
create databases representing the entities or objects modeled by the
database.
• The integrity part: a collection of rules governing the constraints
placed on these data structures to ensure structural integrity.
• The manipulation part: a collection of operators which can be applied
to the data structures, to update and query the data contained in the
database.
For example, in the relational model, the structural part is based on a modified
concept of the mathematical relation; the integrity part is expressed in first-
order logic and the manipulation part is expressed using the relational algebra,
tuple calculus and domain calculus.
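These three parts can be illustrated with a small example; SQLite is used here purely for demonstration (the project itself uses SQL Server), and the table is modeled loosely on the report's leave details:

```python
import sqlite3

# Illustration of the three parts of a data model described above, using
# SQLite: structures (CREATE TABLE), integrity rules (constraints), and
# manipulation operators (INSERT/SELECT). Names follow the report's tables.
con = sqlite3.connect(":memory:")
# Structural part: a data structure representing an entity.
con.execute("""CREATE TABLE leave_details (
    emp_code TEXT PRIMARY KEY,
    tot_cl INTEGER CHECK (tot_cl >= 0))""")  # integrity part: a constraint
# Manipulation part: operators that update and query the data.
con.execute("INSERT INTO leave_details VALUES ('E001', 12)")
row = con.execute(
    "SELECT tot_cl FROM leave_details WHERE emp_code = 'E001'").fetchone()
print(row[0])  # 12

rejected = False
try:
    con.execute("INSERT INTO leave_details VALUES ('E002', -1)")
except sqlite3.IntegrityError:
    rejected = True   # the integrity rule refused the bad row
print(rejected)  # True
```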
For example, a Data modeler may use a data modeling tool to create an Entity-
relationship model of the Corporate data repository of some business
enterprise. This model is transformed into a relational model, which in turn
generates a relational database.
Entity-relationship model
The entity-relationship model or entity-relationship diagram (ERD) is a data
model or diagram for the high-level description of a conceptual data model,
and it provides a graphical notation for representing such data models in
the form of entity-relationship diagrams. Such models are typically used in
the first stage of information-system design; they are used, for example, to
describe information needs and/or the type of information that is to be
stored in the database during requirement analysis. The data modeling
technique, however, can be used to describe any ontology (i.e. an overview
and classification of used terms and their relationships) for a certain
universe of discourse (i.e. area of interest). In the case of the design of
an information system that is based on a database, the conceptual data model
is, at a later stage (usually called logical design), mapped to a logical
data model, such as the relational model; this in turn is mapped to a
physical model during physical design. Note that sometimes both of these
phases are referred to as "physical design".
There are a number of conventions for entity-relationship diagrams (ERDs).
The classical notation is described in the remainder of this section, and
mainly relates to conceptual modeling. There is a range of notations more
typically employed in logical and physical database design.
[Entity-relationship diagram for Human Resource Management. Entities and
attributes (as recoverable): Job_Advertise (job_code, req_code, req_date,
dept_code, description, specification, last_date, tot_vacan, advertised),
Recruitment Details, Employee's Details (emp_code, emp_name, desg_code),
Interview Details (int_date, cv_code, can_code), Employee Transfer
(emp_code, tran_to, tran_date, tran_type), Leave Details (tot_cl, tot_ml),
Pay Details (emp_code, basic pay, deduction); relationships: takes, has,
transferred.]
3. SYSTEM DESIGN
Access to the master database is provided only to the HR Manager.
1) Country
From here the HR Manager can change the details as well as delete the record
of a country. Proper validations and checks are provided for entered data.
2) State
This module provides an interface to the HR Manager through which the user
can add, update and delete the records of the State database. The existing
country names are displayed in a combo; the HR Manager can select a country
and enter a state name for that country. He can also see the existing
records.
From here the HR Manager can change the details as well as delete the record
of a state. Proper validations and checks are provided for entered data.
3) City
This module provides an interface to the HR Manager through which the user
can add, update and delete the records of the City database. The existing
country names are displayed in a combo; then the states of the selected
country are displayed. The HR Manager can select a country, then a state,
and enter a city name for that record. He can also see the existing records.
From here the HR Manager can change the details as well as delete the record
of a city. Proper validations and checks are provided for entered data.
4) Department
From here the HR Manager can change the details as well as delete the record
of a department. Proper validations and checks are provided for entered data.
5) Designation
This module provides an interface to the HR Manager through which the user
can add, update and delete the records of the Designation database. The
existing department names are displayed in a combo; the HR Manager can
select a department and enter a designation name for that department. He can
also see the existing records.
6) Grade
From here the HR Manager can change the details as well as delete the record
of a grade. Proper validations and checks are provided for entered data.
7) Allowance
From here the HR Manager can change the details as well as delete the record
of an allowance. Proper validations and checks are provided for entered data.
8) Deduction
From here the HR Manager can change the details as well as delete the record
of a deduction. Proper validations and checks are provided for entered data.
2) Candidate Details
2.1) Candidate Entry
2.2) Candidate Shortlist
4) Interview
4.1) Interview Details
4.2) Selected Candidates
1) Job Opening
2) Candidate Details
2.2) Candidate Shortlist
This part shows the details of candidates for a given job code in a
listview. By checking a particular CV code and clicking "show detail CV",
the user can see the detailed CV of that particular candidate. Mail can
also be sent to checked candidates: by checking a row in the listview and
clicking the send-mail button, the mail-sending form is opened and that
candidate's mail id is automatically placed in the TO field.
4) Interview
1) Employee Profile
This is used to store all the details of the company's employees. Here the
employee's personal, official, experience, qualification and family details
are stored, and all details of a particular employee can be retrieved,
updated or deleted. Each employee has a unique emp code. If an employee's
personal information is deleted, all other information related to him is
deleted as well, so that no orphaned data is left behind.
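The cascading delete described above can be sketched as follows, using SQLite foreign keys for illustration (the project uses SQL Server, where the same effect is achieved with ON DELETE CASCADE constraints or triggers; the columns here are simplified):

```python
import sqlite3

# Sketch of the cascading delete described above, using SQLite for
# illustration; table names follow the report's database design.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs per connection
con.execute(
    "CREATE TABLE emp_master (emp_code TEXT PRIMARY KEY, emp_name TEXT)")
con.execute("""CREATE TABLE emp_qualification (
    emp_code TEXT REFERENCES emp_master(emp_code) ON DELETE CASCADE,
    degree TEXT)""")
con.execute("INSERT INTO emp_master VALUES ('E001', 'A. Kumar')")
con.execute("INSERT INTO emp_qualification VALUES ('E001', 'MCA')")

# Deleting the personal record removes the dependent rows with it.
con.execute("DELETE FROM emp_master WHERE emp_code = 'E001'")
left = con.execute("SELECT COUNT(*) FROM emp_qualification").fetchone()[0]
print(left)  # 0
```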
2) Employee Training
When a person joins the company as an employee, the company gives him/her
training. All information related to the training of employees can be
maintained in this module.
3) Employee Transfer
Employees' transfer details are stored here. As the user selects an emp code
from a combo, details such as the employee's name, department, designation
and grade are displayed in text boxes. A new department, designation and
grade can be selected through combos. Whether it is a promotion or a simple
transfer can also be stored. This information can be retrieved, updated and
deleted.
1) Allowance Details
This part is used for all the allowance-related details of the allowances
given by the company to its employees.
2) Allowance Values
This part is used for the information related to allowance values. Here all
the details of the values of the allowances are stored according to
department and designation. Particular information can be seen in the
listview by clicking the add button.
3) Salary Structure
This is used for determining the salary structure of a designation according
to its department.
User Control
ASP.NET provides a facility to make user controls that can be used anywhere
in the project. It is a great facility provided by ASP.NET. In my project, I
have made a textbox in which only an email address in a valid format can be
filled. It is used in forms where such a requirement arises, such as where
the Candidate Email Address is filled.
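The format restriction performed by that user control can be sketched with a simple pattern check; the regular expression below is an illustrative approximation, not the control's actual validation expression:

```python
import re

# Sketch of the email-format check performed by the user control described
# above (the real control is an ASP.NET validator; this pattern is a
# simplified illustration).
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)+$")

def is_valid_email(text):
    return bool(EMAIL_RE.match(text))

print(is_valid_email("candidate@example.com"))  # True
print(is_valid_email("not-an-email"))           # False
```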
3.2 Data integrity and constraints
The DFD is a simple graphical notation that can be used to represent a
system in terms of the input data to the system, the various processing
carried out on these data, and the output data generated by the system.
[Context-level DFD: the HUMAN RESOURCE MANAGEMENT SYSTEM receives CVs from
CANDIDATE, exchanges selected-candidate lists with DEPARTMENT, and provides
candidate lists, employee profiles, salary structures and reports to the
HR MANAGER.]
[DFD: RECRUITMENT PROCESS (1.1) — DEPARTMENT sends a job request and
receives the selected-candidates list; CANDIDATE sends CV entries and
candidate details and receives information about selection; the process
handles vacancy details, advertisement details, written exam marks and
details, and interview details for the HR MANAGER.]
Figure 8 : Recruitment Module
[DFD: EMPLOYEE REGISTRATION PROCESS (1.2) — employee personal and official
information from DEPARTMENT and HR MANAGER flows into the data stores
emp_master, emp_official, emp_qual and emp_training, covering employee
profile, transfer, training, qualification, experience and family details.
PAYROLL PROCESS (1.3) — draws on allowance_master, allowance details, the
salary structure and the salary formula for the HR MANAGER.]
3.3 Database design
Detailed design is the most creative and challenging phase in the
development life cycle of the project. In the detailed design of the system
we design the tables of the database, the schema of the tables, the
relationships between tables and the file organization of the application.
After running SQL Server, I first created the database 'hronline' with the
user dbo. The initial size of the database is 3 MB.
DATABASE TABLES
• Country_master
This table is designed to store the information about countries.
• State_master
This table is designed to store the information about states.
• city_master
This table is designed to store the information about cities.
• dept_master
This table is designed to store the information about the departments of the
company.
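The country → state → city hierarchy behind these masters can be sketched as follows; SQLite is used for illustration, and the column names are assumed from the report's naming style:

```python
import sqlite3

# Sketch of the master-table hierarchy described above (SQLite for
# illustration; the project uses SQL Server, and column names are assumed).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE country_master (country_code INTEGER PRIMARY KEY,
                             country_name TEXT);
CREATE TABLE state_master   (state_code INTEGER PRIMARY KEY,
                             state_name TEXT,
                             country_code INTEGER REFERENCES country_master);
CREATE TABLE city_master    (city_code INTEGER PRIMARY KEY,
                             city_name TEXT,
                             state_code INTEGER REFERENCES state_master);
""")
con.execute("INSERT INTO country_master VALUES (1, 'India')")
con.execute("INSERT INTO state_master VALUES (1, 'Uttar Pradesh', 1)")
con.execute("INSERT INTO city_master VALUES (1, 'Lucknow', 1)")

# The combo-box cascade: states for a chosen country, cities for a state.
states = con.execute(
    "SELECT state_name FROM state_master WHERE country_code = 1").fetchall()
print(states)  # [('Uttar Pradesh',)]
```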
• desg _master
This table is designed to store the information about the designations in
different department.
• grade
This table is designed to store the information about the grades in different
department.
• loginmaster
This table is designed to store the login information of the users of the
system.
• recr_master(Recruitment related Information)
• job_advertise
• cv_entry
This table is designed to store some primary details about the candidates
who send their CVs.
• written_marks
This table is designed to store written exam details conducted for the
selection of candidates
• interview_detail
This table is designed to store interview details held for candidates
selected in written exams
• emp_master
This table is designed to store the personal details of the employees of the
company.
• emp_official
This table is designed to store official details about the employees of the
company
• emp_experience
This table is designed to store the previous-experience details of the
employees of the company.
• emp_qualification
This table is designed to store qualification details of the employees of the
company
• emp_ family
This table is designed to store family details of the employees of the
company
• emp_training
This table is designed to store details of the training given to the
employees of the company.
Structure for database: emp_training
• emp_transfer
This table is designed to store details of the transfers of the employees
of the company.
Structure for database: emp_transfer
• allowance_master
This table is designed to store primary information of the allowances given
by the company in salary.
• deduction_master
This table is designed to store primary information about the deductions
made by the company from the salary.
• emp_salary
This table is designed to store details of the salary given to the employees.
• emp_leave
This table is designed to store primary information of the leave taken by
the employee.
3.4 User Interface Design
4. CODING
The process of optimization starts from the designing stage itself and
continues till the deployment and distribution stage.
Optimizing speed
In order to optimize the speed of the application the following techniques are
used:
Use of appropriate data type
Assigning property values to variables
Using Early binding instead of late binding
A client interacts with a component using its properties and methods. In
order to access the properties and methods of a component, the client needs
to be bound to the component. The process of associating a client with a
component is called binding. When you implement early binding between a
client's call and a component method, the method called is determined at
compile time, i.e. the call is associated with the appropriate method during
compilation.
Early binding offers the following advantages:
• Performance speed
• Syntax checking at compile time
• Display of objects in the Object Browser window
• Provision of help in the Object Browser
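The "assigning property values to variables" technique listed earlier has a direct analogue in most languages: read a repeatedly used property into a local variable once instead of re-reading it. A small Python sketch, with the class and names invented for illustration:

```python
# Python analogy of "assigning property values to variables": read a
# property once into a local instead of re-reading it in a loop.
class Employee:
    def __init__(self, name):
        self._name = name

    @property
    def name(self):          # each access runs this getter
        return self._name

def initials_slow(employees):
    # two getter calls per employee
    return [e.name[0] + "." for e in employees if e.name]

def initials_fast(employees):
    out = []
    for e in employees:
        n = e.name           # property read once, reused below
        if n:
            out.append(n[0] + ".")
    return out

staff = [Employee("Asha"), Employee("Ravi")]
print(initials_fast(staff))  # ['A.', 'R.']
```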
here (minimum) are sure not to go out of bounds; therefore choosing this
option will actually optimize speed and thus give faster code.
Range Check
To ensure that a data value is within a pre-determined range.
This checks that a value lies within a certain range of values. For example,
the month of a year must be between 1 and 12; numbers less than 1 or greater
than 12 are rejected.
Format check
To ensure the individual characters that make up the data are valid - e.g.
no letters in numerical data.
This checks that data is of the right format, that is, made up of the
correct combination of alphabetic and numeric characters. A phone number
must be in the form XXXXXXXXXX: the characters must be numbers and the total
length is ten characters. Any other format is rejected.
Check Digit
Allows a number to be self-checking.
This is used to check the validity of code numbers, for example paper1_marks
in the written_marks table. These numbers are long and prone to data-entry
errors. It is crucial that such numbers are entered correctly so that the
right record in the file or database is identified.
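The three checks above can be sketched as small validation routines; the check-digit scheme shown is an illustrative modulus-10 rule, since the report does not specify a particular scheme:

```python
import re

def range_check(month):
    """Range check: the month of a year must lie between 1 and 12."""
    return 1 <= month <= 12

def format_check(phone):
    """Format check: a phone number is exactly ten digits."""
    return re.fullmatch(r"\d{10}", phone) is not None

def check_digit_ok(code):
    """Check digit (illustrative modulus-10 rule): the digits must sum to a
    multiple of 10; the report does not name a specific scheme."""
    return sum(int(d) for d in code) % 10 == 0

print(range_check(13))             # False - rejected
print(format_check("9876543210"))  # True
print(check_digit_ok("12340"))     # True: 1+2+3+4+0 = 10
```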
5. TESTING
Component Testing
Starting from the bottom, the first test level is "Component Testing",
sometimes called Unit Testing. It involves checking that each feature
specified in the "Component Design" has been implemented in the component.
In theory an independent tester should do this, but in practice the
developer usually does it, as they are the only people who understand how a
component works. The problem with a component is that it performs only a
small part of the functionality of a system, and it relies on co-operating
with other parts of the system, which may not have been built yet. To
overcome this, the developer either builds, or uses, special software to
trick the component into believing it is working in a fully functional
system.
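The stub idea described above can be sketched as a small unit test; all component names here are invented for illustration:

```python
import unittest

# Sketch of component (unit) testing with a stub, as described above: the
# notification component depends on a mail sender that may not exist yet,
# so the test replaces it with a stub (all names here are illustrative).
class StubMailer:
    def __init__(self):
        self.sent = []

    def send(self, to, subject):
        self.sent.append((to, subject))  # record instead of really mailing

def notify_shortlisted(emails, mailer):
    """Send an interview call to every shortlisted candidate."""
    for address in emails:
        mailer.send(address, "Interview call")
    return len(emails)

class TestNotify(unittest.TestCase):
    def test_every_candidate_is_mailed(self):
        stub = StubMailer()
        count = notify_shortlisted(["a@x.com", "b@x.com"], stub)
        self.assertEqual(count, 2)
        self.assertEqual(len(stub.sent), 2)

unittest.main(exit=False, argv=["prog"])
```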
Interface Testing
As the components are constructed and tested, they are then linked together
to check if they work with each other. It is a fact that two components that
have each passed all their tests may, when connected to each other, produce
one new component full of faults. These tests can be done by specialists, or
by the developers.
Interface testing is not focused on what the components are doing but on how
they communicate with each other, as specified in the "System Design". The
"System Design" defines the relationships between components.
The tests are organized to check all the interfaces, until all the
components have been built and interfaced to each other, producing the whole
system.
System Testing
Once the entire system has been built then it has to be tested against the
“System Specification” to check if it delivers the features required. It is still
developer focused, although specialist developers known as system testers are
normally employed to do it.
In essence System testing is not about checking the individual parts of the
design, but about checking the system as a whole. In effect it is one giant
component.
System testing can involve a number of specialist types of test to see if all the
functional and non-functional requirements have been met. In addition to
functional requirements these may include the following types of testing for
the non-functional requirements:
There are many others, the needs for which are dictated by how the system is
supposed to perform.
Acceptance Testing
Acceptance testing checks the system against the “Requirements”. It is similar
to system testing in that the whole system is checked but the important
difference is the change in focus:
System testing checks that the system that was specified has been delivered.
Acceptance testing checks that the system delivers what was requested. The
customer and not the developer should always do acceptance testing. The
customer knows what is required from the system to achieve value in the
business and is the only person qualified to make that judgment. The forms of
tests may follow those in system testing, but at all times they are informed by
the business needs.
Release Testing
Even if a system meets all its requirements, there is still a case to be answered
that it will benefit the business. Release testing is about seeing if the new or
changed system will work in the existing business environment. Mainly this
means the technical environment, and the tests check concerns such as
compatibility with the hardware and software already in place. These tests are
usually run by the computer operations team in a business. It would seem
obvious that the operations team should be involved right from the start of a
project, to give their opinion of the impact a new system may have.
1) Knowing the specific function that a product has been designed to perform,
tests can be conducted that demonstrate each function is fully operational,
at the same time searching for errors in each function. This approach is
known as Black Box Testing.
Black-box tests are designed to uncover errors. They are used to demonstrate
that software functions are operational; that input is properly accepted and
output is correctly produced; and that the integrity of external information is
maintained. A black-box test examines some fundamental aspects of a system
with little regard for the internal logical structure of the software.
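A black-box test is therefore derived from the specification alone; for example (Python for illustration, the function is hypothetical):

```python
# System under test, treated as a black box: the tests below
# know only its specification, not its internal logic.
def age_band(age):
    if age < 0:
        raise ValueError("age cannot be negative")
    return "minor" if age < 18 else "adult"

# Specification-driven cases, including the boundary at 18
# and an invalid input.
assert age_band(17) == "minor"
assert age_band(18) == "adult"
try:
    age_band(-1)
    rejected = False
except ValueError:
    rejected = True
assert rejected  # invalid input is properly refused
```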
5.1.3 Testing the On Line Recruitment Process & Employee Management
System for HR Group of Company
Based on field conditions, this testing for fine-tuning can be carried out at a
later date. The testing team will take over the project after the initial unit
testing, which marks the completion of the development phase.
Debugging
The purpose of debugging is to locate and fix the offending code responsible
for a symptom violating a known specification. Debugging typically happens
during three activities in software development, and the level of granularity of
the analysis required for locating the defect differs in these three. The first is
during the coding process, when the programmer translates the design into an
executable code. During this process the errors made by the programmer in
writing the code can lead to defects that need to be quickly detected and fixed
before the code goes to the next stages of development. Most often, the
developer also performs unit testing to expose any defects at the module or
component level. The second place for debugging is during the later stages of
testing, involving multiple components or a complete system, when
unexpected behavior such as wrong return codes or abnormal program
termination (“abends”) may be found. A certain amount of debugging of the
test execution is necessary to conclude that the program under test is the cause
of the unexpected behavior and not the result of a bad test case due to
incorrect specification, inappropriate data, or changes in functional
specification between different versions of the system. Once the defect is
confirmed, debugging of the program follows and the misbehaving component
and the required fix are determined. The third place for debugging is in
production or deployment, when the software under test faces real operational
conditions. Some undesirable aspects of software behavior, such as inadequate
performance under a severe workload or unsatisfactory recovery from a
failure, get exposed at this stage and the offending code needs to be found and
fixed before large-scale deployment. This process may also be called
“problem determination,” due to the enlarged scope of the analysis required
before the defect can be localized.
Code Improvement
The process of optimization starts from the designing stage itself and
continues till the deployment and distribution stage.
Optimizing speed
In order to optimize the speed of the application the following techniques are
used:
• Use of the appropriate data type
• Assigning property values to variables
• Using early binding instead of late binding
For example, a counter kept in a string variable forces two conversions on
every increment:
k = Convert.ToString(Convert.ToInt32(k) + 1);
Declaring k with the appropriate numeric type avoids both conversions.
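The cost of keeping a counter in the wrong data type can be seen in a small sketch (Python for illustration; the function names are hypothetical):

```python
def slow_count(n):
    # Counter held as a string: two conversions per increment.
    k = "0"
    for _ in range(n):
        k = str(int(k) + 1)
    return k

def fast_count(n):
    # Counter held in the appropriate data type: no conversions.
    k = 0
    for _ in range(n):
        k += 1
    return str(k)

# Both produce the same result; the second avoids all the
# intermediate conversions.
assert slow_count(1000) == fast_count(1000) == "1000"
```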
Optimize for small code
This is the best option when one is concentrating on the hard disk space and
not the speed. The compiler creates the smallest possible compiled code,
occupying less disk space but probably slow in execution.
6. System Security measures
Prompting the user for a user id and password in our application is a
potential security threat, so credential information transferred from the
browser to the server is encrypted.
Do not store any critical information in cookies. For example, do not store
a user's password in a cookie, even temporarily. Avoid permanent cookies
if possible. Consider encrypting information in cookies. Set expiration
dates on cookies to the shortest practical time we can.
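Rather than storing a password anywhere (cookie or database), the usual practice is to store only a salted hash; a sketch using Python's standard library:

```python
import hashlib
import os

def hash_password(password, salt=None):
    """Return (salt, digest); only these are ever stored."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive the digest and compare; the plaintext is never stored."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return candidate == digest

salt, digest = hash_password("s3cret")
assert verify_password("s3cret", salt, digest)
assert not verify_password("wrong", salt, digest)
```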
Much attention has been focused on network attacks by crackers, and how to
stop these. But the vulnerability of data inside the database is somewhat
overlooked. Databases are far too critical to be left unsecured or incorrectly
secured.
The best security practices protect sensitive data as it's transferred over the
network (including internal networks) and when it's at rest. One option for
accomplishing this protection is to selectively parse data after the secure
communication is terminated and encrypt sensitive data elements at the
SSL/Web layer. Doing so allows enterprises to choose at a very granular level
(usernames, passwords, and so on.) the sensitive data to secure throughout the
enterprise. Application-layer encryption and mature database-layer encryption
solutions allow enterprises to selectively encrypt granular data into a format
that can easily be passed between applications and databases without changing
the data. The focus here is on database-layer encryption.
Data encryption
The sooner data encryption occurs, the more secure the information is. Due to
distributed business logic in application and database environments,
organizations must be able to encrypt and decrypt data at different points in
the network and at different system layers, including the database layer.
Encryption performed by the DBMS can protect data at rest, but you must
decide if you also require protection for data while it’s moving between the
applications and the database and between different applications and data
stores. Sending sensitive information over the Internet or within your
corporate network as clear text defeats the point of encrypting the text in the
database to provide data privacy.
Determining user profiles and their privilege domains will contribute to the
creation of a personalized software experience. Effective software must
present only the features and content that are relevant to a given user and
within the user's domain of privilege, and these must also reflect the specific
granularity relevant to the user. Application personalization requires
establishing user groups and roles, a privilege domain, and a content domain
for each user.
The main purpose for developing user groups and hierarchies is to avoid the
repetitive task of allocating certain sets of privilege and content access to each
individual user. Establishing such groupings allows allocating a set of
privileges to all users within a given group with a single stroke of software
command. Similarly, through inheritance, multiple user groups within the
hierarchy may be allocated a set of privileges with a single click.
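The single-stroke allocation and inheritance described above can be sketched as follows (Python for illustration; the group names and privileges are hypothetical):

```python
# Each group points at the group it inherits from (None = root).
HIERARCHY = {"hr_manager": "hr_staff", "hr_staff": "employee", "employee": None}

# Privileges allocated to a whole group with a single assignment.
PRIVILEGES = {
    "employee": {"view_own_profile"},
    "hr_staff": {"view_all_profiles"},
    "hr_manager": {"edit_salary"},
}

def effective_privileges(group):
    """A group's own privileges plus everything inherited up the hierarchy."""
    privs = set()
    while group is not None:
        privs |= PRIVILEGES.get(group, set())
        group = HIERARCHY.get(group)
    return privs

# hr_manager inherits from hr_staff, which inherits from employee.
assert effective_privileges("hr_manager") == {
    "edit_salary", "view_all_profiles", "view_own_profile"
}
```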
Privilege Domain
Content Domain
The user groups are defined and the roles have been assigned, but still
unanswered is the question: What data and KPIs does a user see on a
dashboard? Answering this question leads us to the issue of the content
domain, the parameters of which define the KPI granularity, the reports, and
the alerts for each dashboard user. Managing the content domain involves two
aspects: (1) security and (2) relevance. Security refers to the restriction of
information delivery only to those with the privilege to access certain
information. Information is inherently confidential, and every organization has
its boundaries regarding who may access what information. The security
framework must be created during a dashboard deployment, determining the
permissions and restrictions on the content domain of each user. Relevance
refers to the filtering of the most relevant content to a given dashboard user.
From all of the permitted information for a given user, an effective dashboard
must present the most relevant content with flexibility for the user to access
more information as needed.
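The two aspects of managing the content domain, security first and relevance second, can be sketched as a simple filter (Python for illustration; the reports, roles, and scores are hypothetical):

```python
# Hypothetical dashboard reports with an access level and relevance score.
REPORTS = [
    {"name": "payroll_summary", "min_role": "manager", "relevance": 0.9},
    {"name": "attrition_kpi", "min_role": "manager", "relevance": 0.4},
    {"name": "headcount", "min_role": "staff", "relevance": 0.7},
]
ROLE_RANK = {"staff": 1, "manager": 2}

def dashboard_content(role, limit=2):
    # (1) Security: restrict delivery to content the role may access.
    allowed = [r for r in REPORTS if ROLE_RANK[role] >= ROLE_RANK[r["min_role"]]]
    # (2) Relevance: surface the most relevant permitted content first.
    allowed.sort(key=lambda r: r["relevance"], reverse=True)
    return [r["name"] for r in allowed[:limit]]

assert dashboard_content("staff") == ["headcount"]
assert dashboard_content("manager") == ["payroll_summary", "headcount"]
```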
7. Cost Estimation of the Project
Resource Sharing:
The main goal is to make all programs, equipment, and data available to
anyone on the network without regard to the physical location of the resource
or the user. Users need to share resources other than files as well; a common
example is printers. A printer is utilized only a small percentage of the time,
so a company does not want to invest in a printer for each computer. A
network can be used in this situation to give every user access to any of the
available printers.
High-Reliability:
Saving-Money:
Small computers have a much better price/performance ratio than larger
ones: mainframes are roughly a factor of ten faster than personal computers,
but they cost a thousand times more. This imbalance has led many system
designers to build systems consisting of personal computers, one per user,
with data kept on one or more shared file-server machines. In this model the
users are called ‘clients’, and the whole arrangement is called the
‘client-server model’.
Scalability:
The ability to increase system performance gradually as the workload grows,
just by adding more processors. With centralized mainframes, when a system
is full it must be replaced by a larger one, usually at great expense and even
greater disruption to the users. With the client-server model, new clients and
new servers can be added as needed.
Communication-Medium:
A computer network provides a fast communication medium: a message can
reach a distant colleague far sooner than a letter. Such a speed-up makes
cooperation among far-flung groups of people easy where it previously had
been impossible.
Increase Productivity:
A network increases productivity because several people can not only enter
data at the same time but also evaluate and process the shared data: one
person can handle accounts receivable while someone else processes the
profit and loss statements. Having seen what factors have to be kept in mind,
we can now estimate the cost of the full network setup. The network would
include:
Server Hardware:
The HP server running Windows 2000, which costs around Rs 1.5 lakh, will
be best suited for the organization's purposes, as it offers strong security and
is easy to manage, with a clean GUI that makes administration simpler.
Server Software:
The client machine or workstation needs to be powerful enough to lessen the
load on a server such as Windows 2000 Server (under which the workstations
are little more than interfaces to the server). A genuine Intel Pentium III
machine costs around Rs 32,000, with a configuration of 512 MB RAM, a 40
GB hard disk, a floppy drive, a multimedia kit, and a monitor. This
organization requires ten client machines.
Firewall:
The company needs anti-virus software for protection against viruses; such
software costs about Rs 10,000. Besides this, the organization needs
dot-matrix printers for printing issue details and other printing operations. At
least four printers are needed, each costing around Rs 4,000.
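The rupee figures quoted in this section can be totalled directly (a rough estimate using only the amounts stated above):

```python
server = 150_000         # HP server, Rs 1.5 lakh
clients = 10 * 32_000    # ten Pentium III workstations at Rs 32,000 each
antivirus = 10_000       # anti-virus software
printers = 4 * 4_000     # four dot-matrix printers at Rs 4,000 each

total = server + clients + antivirus + printers
print(total)  # 496000, i.e. about Rs 4.96 lakh
```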
8. Reports
9. PERT Chart, Gantt Chart
9.1 PERT Chart
[PERT chart diagram: eleven numbered nodes (1–11) linked by project
activities with durations of 5 to 35; the “Programming” activity is labelled on
the chart.]
9.2 Gantt Chart
Gantt Chart is also known as Time Line Charts. A Gantt chart can be
developed for the entire project or a separate chart can be developed for each
function. A tabular form is maintained, where rows indicate the tasks with
milestones and columns indicate duration (weeks/months). The horizontal
bars that span the columns indicate the duration of each task.
[Gantt chart, June–September: Requirement Gathering, Design, Test Cases,
Coding, Testing, Build.]
10. Future scope and further enhancement of the
Project
• What are the user's demonstrable needs, and how does the system meet
them?
• What resources are available for the given system?
• Is the problem worth solving?
• What is the likely impact of the system on the organization?
• Security
Security features are implemented: there is no unauthorized access to the
package, as security is enforced through a login and password.
Last but not least, one of the most important advantages of the HRP system
is that it can be used in any government or public organization to process and
manage the working of the HR department, with slight modifications.
10.2 Further Enhancement of the Project
Everything that is built leaves something more to be added to make it better.
The project “On Line Recruitment Process & Employee Management System
for HR Group of Company” falls in the same domain.
Although the aim has been to develop a robust and fault-free system, enough
flexibility has been provided for further enhancements and modifications. As
mentioned earlier, the designed forms typically reflect the developer, so I
believe future enhancements to the project will come through design and
coding changes. At the same time, since no one can claim to be a master of
the technology, there is always scope for technical modifications that may
reduce code redundancy and minimize storage space.
• Since the data is retrieved from tables where everything is based on the
coding system, the system will need to be redesigned if the coding system
is changed.
• The number of queries can be increased when needed by the user just by
modifying the code a little; full attention has been paid to the design of the
system so that it can be easily modified.
• The design of the system can be changed in terms of the flow of control,
so that the amount of code can be reduced considerably.
• The developed sub modules have all the related features but still
improvement can be done. The developed package is flexible enough to
incorporate the modifications or enhancements with less alteration.
11. Bibliography
Software Engineering
by Roger S. Pressman
Referenced Sites
www.msdn.microsoft.com
www.w3schools.com
www.microsoft.com
12. Appendices
[.NET Framework architecture diagram: languages (VB, C++, C#, JScript)
layered over the base classes.]
3. ADO.NET is Microsoft’s ActiveX Data Object (ADO) model for
the .NET Framework. ADO.NET is intended specifically for
developing web applications.
4. The 4th layer of the framework consists of the windows application
model and, in parallel, the web application model. The Web
application model-in the slide presented as ASP .NET –includes Web
Forms and Web Services .ASP.NET comes with built in Web forms
controls, which are responsible for generating the user interface. They
mirror typical HTML widgets like text boxes or buttons.
5. One of the obvious themes of .NET is unification and interoperability
between various programming languages. To achieve this, certain rules
must be laid down, and all the languages must follow these rules.
6. The CLR and the .NET Framework in general, however, are designed
in such a way that code written in one language can seamlessly be used
by another language. Hence ASP.NET can be programmed in any of the
.NET-compatible languages, whether VB.NET, C#, Managed C++, or
JScript.NET.
Common Language Runtime Despite its name, the runtime actually has a
role in both a component's development time and its run time. While a
component is running, the runtime is responsible for managing memory
allocation, starting up and stopping threads and processes, and enforcing
security policy, as well as satisfying any dependencies that the component
might have on other components.
The Common Language Runtime is the execution engine for .NET Framework
applications. It provides a number of services, including the following:
Unified programming classes The Framework provides developers with a
unified, object-oriented, hierarchical, and extensible set of class libraries
(APIs). Developers use the Windows Foundation classes.
The .NET Framework includes classes, interfaces, and value types that are
used in the development process and provide access to system functionality.
To facilitate interoperability between languages, the .NET Framework types
are Common Language Specification (CLS) compliant and can therefore be
used from any programming language where the compiler conforms to the
CLS.
The .NET Framework types are the foundation on which .NET applications,
components, and controls are built, and they support a wide range of common
programming functions.
The .NET Framework provides a rich set of interfaces, as well as abstract and
concrete (non-abstract) classes. You can use the concrete classes as is or, in
many cases, derive your own classes from them, as well as from abstract
classes. To use the functionality of an interface, you can either create a class
that implements the interface or derive a class from one of the .NET
Framework classes that implements the interface.
Microsoft, with the help of Hewlett-Packard and Intel, supplied the OS-
independent subset of .NET class library to the ECMA standardization board.
For more information visit:
• To provide a code-execution environment that guarantees safe
execution of code, including code created by an unknown or
semi – trusted third party.
• To provide a code – execution environment that eliminates the
performance problems of scripted or interpreted environments.
• To make the developer experience consistent across widely
varying types of applications, such as windows – based
applications and Web – based applications.
• To build all communication on industry standards to ensure that
code based on the .NET Framework can integrate with any
other code.
ADO.NET
Microsoft Visual Studio .NET includes a number of data access features you
can use to build data access applications. Many of these features can save you
time during the development process by generating large amounts of tedious
code for you. Other features improve the performance of the applications you
build by storing metadata and updating logic in your code rather than fetching
this information at run time. Believe it or not, many of Visual Studio .NET’s
data access features accomplish both tasks.
Now that you understand the purpose of ADO.NET and where it fits into the
overall Visual Studio .NET architecture, it’s time to take a closer look at the
technology. In this chapter, we’ll take a brief look at the ADO.NET object
model and see how it differs from past Microsoft data access technologies.
Each .NET data provider has its own namespace. The two providers included
in the .NET Framework are subsets of the System.Data namespace, where the
disconnected objects reside. The OLE DB .NET Data Provider resides in the
System.Data.OleDb namespace, and the SQL Client .NET Data Provider
resides in System.Data.SqlClient.
Namespaces
A namespace is a logical grouping of objects. The .NET Framework is large,
so to make developing applications with the .NET Framework a little easier,
Microsoft has divided the objects into different namespaces.
The most important reason for using namespaces is to prevent name collisions
in assemblies. With different namespaces, programmers working on different
components combined into a single solution can use the same names for
different items. Since these names are separated, they don’t interfere with each
other at compile time. A more practical reason for namespaces is that grouping
objects can make them easier to locate.
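The same idea appears in most languages; a minimal Python illustration, where each module is its own namespace and identical names do not collide:

```python
import math   # real-number mathematics
import cmath  # complex-number mathematics

# Both modules define a function named sqrt; the namespace
# prefix keeps the two from interfering with each other.
assert math.sqrt(4) == 2.0
assert cmath.sqrt(-1) == 1j
```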
Crystal Reports
Crystal Reports for Visual Studio .NET is the standard reporting tool for
Visual Studio .NET: it brings the ability to create interactive, presentation-
quality content, which has been the strength of Crystal Reports for years, to
the .NET platform.
Advantages of Server Database System
Having data stored and managed in a central location offers several
advantages:
• Each data item is stored in a central location where all users can work
with it.
• Business and security rules can be defined one time on the server and
enforced equally among all users.
• A relational database server optimizes network traffic by returning
only the data an application needs.
• Hardware costs can be minimized.
• Maintenance tasks such as backing up and restoring data are simplified
because they can focus on the central server.
Internet Clients
An application service provider can, for example, run a separate instance
of SQL Server 2000 for each customer organization, all on one computer.
This isolates the data for each customer organization, while allowing the
service organization to reduce costs by having to administer only one server
computer.
SQL Server 2000 applications can run on the same computer as SQL Server
2000. The application connects to SQL Server 2000 using Windows
Interprocess Communications (IPC) components, such as shared memory,
instead of a network. This allows SQL Server 2000 to be used on small
systems where an application must store its data locally.
The illustration above shows an instance of SQL Server 2000 operating as the
database server for both a large Web site and a legacy client/server system.
The largest Web sites and enterprise-level data processing systems often
generate more database processing than can be supported on a single
computer. In these large systems, the database services are supplied by a group
of database servers that form a database services tier. SQL Server 2000
supports a mechanism that can be used to partition data across a group of
autonomous servers. Although each server is administered individually, the
servers cooperate to spread the database-processing load across the group.
SQL Server 2000 includes several new features that make it an excellent
database platform
for large-scale online transactional processing (OLTP), data warehousing, and
e-commerce applications.
The OLAP Services feature available in SQL Server version 7.0 is now called
SQL Server 2000 Analysis Services. The term OLAP Services has been
replaced with the term Analysis Services. Analysis Services also includes a
new data mining component.
Client Components
Clients do not access Microsoft® SQL Server™ 2000 directly; instead, clients
use applications written to access the data in SQL Server. SQL Server 2000
supports two main classes of applications:
• Relational database applications that send Transact-SQL statements to
the database engine; results are returned as relational result sets.
• Internet applications that send either Transact-SQL statements or
XPath queries to the database engine; results are returned as XML
documents.
• OLE DB SQL Server 2000 includes a native OLE DB provider. The
provider supports applications written using OLE DB, or other APIs
that use OLE DB, such as ActiveX Data Objects (ADO). Through the
native provider, SQL Server 2000 also supports objects or components
using OLE DB, such as ActiveX, ADO, or Windows DNA
applications.
• ODBC SQL Server 2000 includes a native ODBC driver. The driver
supports applications or components written using ODBC, or other
APIs using ODBC, such as DAO, RDO, and the Microsoft Foundation
Classes (MFC) database classes.
Additional SQL Server API Support
SQL Server 2000 also supports:
• DB-Library
• Embedded SQL
Client Communications
The Microsoft OLE DB Provider for SQL Server 2000, the SQL Server 2000
ODBC driver, and DB-Library are each implemented as a DLL that
communicates to SQL Server 2000 through a component called a client Net-
Library.
MS DTC Service
The Microsoft Distributed Transaction Coordinator (MS DTC) is a transaction
manager that allows client applications to include several different sources of
data in one transaction. MS DTC coordinates committing the distributed
transaction across all the servers enlisted in the transaction.
An installation of Microsoft® SQL Server™ can participate in a distributed
transaction by:
• Calling stored procedures on remote servers running SQL Server.
• Automatically or explicitly promoting the local transaction to a distributed
transaction and enlisting remote servers in the transaction.
Transact-SQL has these base data types.
Constraints
Constraints allow you to define the way Microsoft® SQL Server™ 2000
automatically enforces the integrity of a database. Constraints define rules
regarding the values allowed in columns and are the standard mechanism for
enforcing integrity.
Classes of Constraints
SQL Server 2000 supports five classes of constraints.
• NOT NULL specifies that the column does not accept NULL values.
• CHECK constraints enforce domain integrity by limiting the values
that can be placed in a column.
• UNIQUE constraints enforce the uniqueness of the values in a set of
columns.
• PRIMARY KEY constraints identify the column or set of columns
whose values uniquely identify a row in a table.
• FOREIGN KEY constraints identify the relationships between tables.
When a referenced row is deleted, one of two referential actions applies:
• NO ACTION specifies that the deletion fails with an error.
• CASCADE specifies that all the rows with foreign keys
pointing to the deleted row are also deleted.
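All five constraint classes, including the CASCADE referential action, can be exercised end-to-end. The sketch below uses Python's built-in SQLite driver rather than SQL Server, but the constraint syntax is essentially the same; the table and column names are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when asked

conn.execute("""
    CREATE TABLE department (
        dept_id INTEGER PRIMARY KEY,      -- PRIMARY KEY constraint
        name    TEXT NOT NULL UNIQUE      -- NOT NULL and UNIQUE constraints
    )""")
conn.execute("""
    CREATE TABLE employee (
        emp_id  INTEGER PRIMARY KEY,
        name    TEXT NOT NULL,
        salary  REAL CHECK (salary > 0),  -- CHECK constraint
        dept_id INTEGER REFERENCES department(dept_id)
                ON DELETE CASCADE         -- FOREIGN KEY with CASCADE
    )""")

conn.execute("INSERT INTO department VALUES (1, 'HR')")
conn.execute("INSERT INTO employee VALUES (1, 'Jane', 30000, 1)")

# CASCADE: deleting the department also deletes its employees.
conn.execute("DELETE FROM department WHERE dept_id = 1")
remaining = conn.execute("SELECT COUNT(*) FROM employee").fetchone()[0]
assert remaining == 0
```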
SQL Query Analyzer provides:
• Color coding of Transact-SQL syntax to improve the readability of
complex statements.
• Results presented in either a grid or a free form text window.
• Graphical diagram of the show plan information showing the logical
steps built into the execution plan of a Transact-SQL statement.
• Index tuning wizard to analyze a Transact-SQL statement and the
tables it references to see if adding additional indexes will improve
the performance of the query.