
A project report on

FACE RECOGNITION BASED CLASS ATTENDANCE

Submitted in partial fulfillment of the requirements for the degree of

B.TECH

by

PRINCE KADIWAR(14BIT0089)


SCHOOL OF INFORMATION AND TECHNOLOGY (SITE)

April, 2018

DECLARATION

I hereby declare that the project report entitled “FACE RECOGNITION BASED CLASS ATTENDANCE” submitted by me, for the award of the degree of B.Tech to VIT, is a record of bonafide work carried out by me under the supervision of DR. SWARNA PRIYA R. M. I further declare that the work reported in this project has not been submitted and will not be submitted, either in part or in full, for the award of any other degree or diploma in this institute or any other institute or university.

Place: Vellore

Date:

Signature of the Candidate

CERTIFICATE

This is to certify that the Project Report entitled “FACE RECOGNITION BASED CLASS ATTENDANCE” submitted by PRINCE KADIWAR (14BIT0089), SITE, VIT, Vellore, for the award of the degree of B.TECH is a record of bonafide work carried out by him under my supervision.

The contents of this report have not been submitted and will not be submitted either in part or in full, for the award of any other degree or diploma in this institute or any other institute or university. The Project report fulfills the requirements and regulations of VIT and in my opinion meets the necessary standards for submission.

Signature of the Guide

DR. SWARNA PRIYA R. M.

Internal Examiner

Signature of the Head of the Department

DR. DINAKARAN M.

External Examiner

ABSTRACT

This project mainly addresses the building of a face recognition system using Principal Component Analysis (PCA). PCA is a statistical approach used for reducing the number of variables in face recognition. In PCA, every image in the training set is represented as a linear combination of weighted eigenvectors called eigenfaces. These eigenvectors are obtained from the covariance matrix of a training image set. The weights are found after selecting a set of the most relevant eigenfaces. Recognition is performed by projecting a test image onto the subspace spanned by the eigenfaces, and classification is then done by finding the minimum Euclidean distance. A number of experiments were done to evaluate the performance of the face recognition based attendance management system.

ACKNOWLEDGEMENT

It is my pleasure to express with a deep sense of gratitude my thanks to DR. SWARNA PRIYA R M and DR. PARIMALAM M., School of Information Technology and Engineering, Vellore Institute of Technology, Vellore, for their constant guidance, continual encouragement and understanding; more than all, they taught me patience in my endeavor. My association with them is not confined to academics only, but it is a great opportunity on my part to work with intellectuals and experts in the field of Machine Learning.

I would like to express my gratitude to the chancellor DR. G. VISWANATHAN, the vice presidents DR. SANKAR VISWANATHAN, DR. SEKAR VISWANATHAN, MR. G. V. SELVAM, the vice chancellor DR. ANAND A. SAMUEL, the pro vice chancellor DR. S. NARAYANAN and DR. ASWANI KUMAR CHERUKURI, School of Information Technology and Engineering, for providing me with an environment to work in and for their inspiration during the tenure of the course.

In jubilant mood I express my whole-hearted thanks to DR. DINAKARAN M., Head of the Department of Information Technology, SITE, all teaching staff and members working as limbs of our university for their unselfish enthusiasm coupled with timely encouragement showered on me with zeal, which prompted the acquirement of the requisite knowledge to finalize my course study successfully.

I would like to thank my parents for their support. It is indeed a pleasure to thank my friends who persuaded and encouraged me to take up and complete this task. Last but not least, I express my gratitude and appreciation to all those who have helped me directly or indirectly towards the successful completion of this project.

Place: Vellore

Date:

PRINCE KADIWAR


CONTENTS

CONTENTS                                                    iii
LIST OF FIGURES                                              iv
LIST OF TABLES                                                v

CHAPTER 1   INTRODUCTION
    1.1    CHALLENGES AND OBJECTIVES                          2

CHAPTER 2   LITERATURE REVIEW
    2.1    LITERATURE REVIEW                                  3

CHAPTER 3   IMAGE PROCESSING FUNDAMENTALS
    3.1    INTRODUCTION                                       6
    3.2    IMAGE PROCESSING TECHNIQUES                        6
    3.2.1  IMAGE ENHANCEMENT                                  7
    3.2.2  IMAGE RESTORATION                                  7
    3.3.3  COMMUNICATION                                      8
    3.3.4  RADAR IMAGING SYSTEMS                              8
    3.3.5  DOCUMENT PROCESSING                                8
    3.3.6  DEFENSE/INTELLIGENCE                               9

CHAPTER 4   SOFTWARE TOOLS AND METHODOLOGY
    4.1    ABOUT MATLAB                                      10
    4.2    MATLAB TOOLBOXES                                  10
    4.3    BASICS OF MATLAB                                  11
    4.3.1  MATLAB WINDOWS                                    11
    4.3.2  ONLINE HELP                                       12
    4.3.3  INPUT-OUTPUT                                      12
    4.3.4  FILE TYPES                                        14
    4.3.5  PLATFORM INDEPENDENCE                             15
    4.3.6  IMAGES IN MATLAB                                  15
    4.4    MATLAB                                            16
    4.4.1  MATLAB OPERATIONS                                 16
    4.4.2  EXPRESSIONS                                       17
    4.4.3  VARIABLES                                         17
    4.4.4  NUMBERS                                           17
    4.4.5  OPERATORS                                         18
    4.4.6  FUNCTIONS                                         18

CHAPTER 5   MATLAB IMAGE PROCESSING
    5.1    INTRODUCTION                                      19
    5.2    FUNDAMENTALS                                      19
    5.3    IMAGE FORMATS SUPPORTED BY MATLAB                 20
    5.4    WORKING FORMATS IN MATLAB                         20
    5.4.1  INTENSITY IMAGE (GREY SCALE IMAGE)                20
    5.4.2  BINARY IMAGE                                      21
    5.4.3  INDEXED IMAGE                                     21
    5.4.4  RGB IMAGE                                         21
    5.4.5  MULTIFRAME IMAGE                                  22
    5.5    IMAGE FORMAT CONVERSION                           22
    5.6    READ MATLAB FILE                                  23

CHAPTER 6   OVERVIEW AND PLANNING
    6.1    EXISTING SYSTEM                                   26
    6.2    PROPOSED ALGORITHM                                26
    6.3    MODULES                                           28
    6.4    MODULES DESCRIPTION                               28
    6.5    SYSTEM REQUIREMENTS                               29
    6.5.1  HARDWARE REQUIREMENTS                             29
    6.5.2  SOFTWARE CONFIGURATION                            29

CHAPTER 7   SYSTEM DESIGN
    7.1    UML DIAGRAMS                                      30

CHAPTER 8   IMPLEMENTATION
    8.1    SOURCE CODE                                       34
    8.2    SCREENSHOTS                                       49

CHAPTER 9   TESTING STRATEGY
    9.1    SYSTEM TESTING                                    53
    9.2    TYPES OF TESTING                                  53
    9.2.1  UNIT TESTING                                      53
    9.2.2  INTEGRATION TESTING                               54
    9.2.3  FUNCTIONAL TEST                                   54
    9.2.4  SYSTEM TEST                                       54
    9.2.5  WHITE BOX TESTING                                 55
    9.2.6  BLACK BOX TESTING                                 55
    9.3    UNIT TESTING                                      55

CHAPTER 10  CONCLUSION
    10.1   FUTURE ENHANCEMENT                                57

REFERENCES                                                   58

LIST OF FIGURES

    3.1    IMAGE PROCESSING TECHNIQUES                        6
    7.1    USE CASE DIAGRAM                                  30
    7.2    CLASS DIAGRAM                                     31
    7.3    SEQUENCE DIAGRAM                                  31
    7.4    ACTIVITY DIAGRAM                                  32
    7.5    DATAFLOW DIAGRAM                                  33
    7.6    ER DIAGRAM                                        33
    8.1    REGISTRATION                                      49
    8.2    LOGIN                                             50
    8.3    MATCHING                                          51
    8.4    CLASS ATTENDANCE (EXCEL)                          52

LIST OF TABLES

    4.1    OUTPUT FORMAT                                     14
    5.1    IMAGE FORMAT CONVERSION                           22
    5.2    READ A MATLAB FILE                                24
    5.3    LOADING AND SAVING VARIABLES                      25

CHAPTER 1

Introduction

Facial recognition, or face recognition as it is often called, analyses the characteristics of a person's face image captured through a camera. It measures overall facial structure and the distances between the eyes, nose, mouth, and jaw edges. These measurements are retained in a database and used for comparison when a user stands before the camera. One of the strongest positive aspects of facial recognition is that it is non-intrusive: verification or identification can be accomplished from two feet away or more, without requiring the user to wait for long periods of time or do anything more than look at the camera. Traditionally, student attendance is taken manually using an attendance sheet given by the faculty member in class. The current attendance marking methods are monotonous and time consuming, and manually recorded attendance can be easily manipulated. Moreover, it is very difficult to verify students one by one in a large classroom environment with distributed branches and to check whether the authenticated students are actually responding or not. Hence this project is proposed to tackle all these issues [1]. The proposed system consists of a high-resolution digital camera to monitor the classroom or office room. It is mounted on a microcontroller-based motor system which enables it to rotate in the left and right directions. The data or images obtained by the camera are sent to a computer-programmed system for further analysis. The obtained images are then compared with a set of reference images of each of the employees or students, and the corresponding attendance is marked. The system also provides for continuous monitoring of the classroom by an operator if needed. The camera module can be a wireless or wired system.

1.1 CHALLENGES AND OBJECTIVES

The most creative and challenging phase of system development is system design. It provides the understanding and procedural details necessary for implementing the system recommended in the feasibility study. Design goes through the logical and physical stages of development.

In designing a new system, the system analyst must have a clear understanding of the objectives which the design is aimed to fulfill. The first step is to determine how the output is to be designed to meet the requirements of the desired output. The operational phases are handled through program construction and testing. Finally, details related to the justification of the system to the user and the organization are documented and evaluated by management.

The final step prior to the implementation phase includes procedural flowcharts, record and report layouts, and a workable plan for implementing the candidate system.

CHAPTER 2

Literature Review

2.1 LITERATURE REVIEW

Title: Automated Attendance Management System Based On Face Recognition Algorithms

Year: 2013

Author: Shireesha Chintalapati, M.V. Raghunadh,

The authors have proposed an Automated Attendance Management System based on a face recognition algorithm. The system, which is based on face detection and recognition algorithms, automatically detects the student when he enters the classroom and marks the attendance by recognizing him. The technique is meant to handle threats such as spoofing. The problem with this approach is that it captures only one student image at a time, when the student enters the classroom; thus it is time consuming and may distract the attention of the students.

Title: Design of Face Detection and Tracking System

Year: 2010

Author: Ruian Liu Mimi Zhang Shengtao Ma

Automatic Control of Students' Attendance in Classrooms Using RFID. In this system students carry an RFID-tag type ID card and need to place it on the card reader to record their attendance. RS232 is used to connect the reader to the computer, and the recorded attendance is stored in the database. This system may give rise to the problem of fraudulent access: an unauthorized student may make use of an RFID card and enter the organization.

Title: Encara:real-time detection of frontal faces

Year: 2011

Author: M.C. Santana, M.H. Tejera, J.C. Gámez

Wireless Fingerprint Attendance Management System. This system uses an iris recognition system that captures the iris image and performs extraction, storage and matching. However, difficulty occurs in laying the transmission lines in places where the topography is poor. In [4] the authors have considered a system based on real-time face recognition which is reliable, secure and fast, but which needs improvement under different lighting conditions.

Title: Attendance System Using Face Recognition and Class Monitoring System

Year: 2014

Author: Prof. Arun Katara, Mr. Sudesh V. Kolhe, Mr. Amar P. Zilpe, Mr. Nikhil D. Bhele, Mr. Chetan J. Bele

In this paper, a system that takes the attendance of students in the lecture is proposed. The system takes the attendance automatically using face recognition. However, it is difficult to estimate the attendance exactly using each result of face recognition independently, because the face detection rate is not sufficiently high. The paper therefore proposes a method for estimating the attendance exactly using all the results of face recognition obtained by continuous observation; continuous observation improves the performance of the attendance estimation. The authors constructed the attendance system based on face recognition and applied it to classroom lectures. The system uses a Raspberry Pi, with the OpenCV library installed on the Pi for face detection and recognition; the camera is connected to the Raspberry Pi and the student database is stored on the Pi. With the help of this system, time is reduced and attendance is marked automatically. The paper first reviews similar works in the field of attendance systems and face recognition, then presents the system structure and plan, and finally reports experiments carried out to support the plan. The results show that continuous observation improved the performance of the attendance estimation.

Title: Automatic Door Access System Using Face Recognition

Year: 2015

Author: Hteik Htar Lwin, Aung Soe Khaing, Hla Myo Tun

Most doors are controlled by persons with the use of keys, security cards, passwords or patterns to open the door. The aim of this paper is to help users improve the door security of sensitive locations by using face detection and recognition. The face is a complex multidimensional structure and needs good computing techniques for detection and recognition. The paper comprises mainly three subsystems: face detection, face recognition and automatic door access control. Face detection is the process of detecting the region of the face in an image; it is performed using the Viola-Jones method, and face recognition is implemented using Principal Component Analysis (PCA). Face recognition based on PCA is generally referred to as the use of eigenfaces. If a face is recognized, it is known; otherwise it is unknown. The door opens automatically for a known person on command of the microcontroller, while an alarm rings for an unknown person. Since PCA reduces the dimensions of face images without losing important features, facial images for many persons can be stored in the database, and even though many training images are used, computational efficiency is not decreased significantly. Therefore, face recognition using PCA can be more useful for a door security system than other face recognition schemes.

CHAPTER 3

Image Processing Fundamentals

3.1 INTRODUCTION

Digital image processing refers to the processing of an image in digital form. Modern cameras may directly capture the image in digital form, but generally images originate in optical form; they are captured by video cameras and digitized. The digitization process includes sampling and quantization. These images are then processed by at least one of the five fundamental processes described below, not necessarily all of them.

3.2 IMAGE PROCESSING TECHNIQUES

This section gives various image processing techniques.

[Figure: block diagram in which image processing (IP) branches into Image Enhancement, Image Restoration, Image Analysis, Image Compression and Image Synthesis]

Figure 3.1 IMAGE PROCESSING TECHNIQUES

3.2.1 IMAGE ENHANCEMENT

Image enhancement operations improve the qualities of an image, for example by improving the image's contrast and brightness characteristics, reducing its noise content, or sharpening its details. Enhancement only makes the information already present in the image more understandable; it does not add any new information to it.
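As a minimal illustration of such operations (a sketch only, assuming the Image Processing Toolbox is installed; 'face.jpg' is a placeholder file name), typical enhancement commands in MATLAB look like this:

% Minimal enhancement sketch (Image Processing Toolbox); 'face.jpg' is a placeholder.
I = imread('face.jpg');          % read the input image
G = rgb2gray(I);                 % work on the grey scale intensity image
J = imadjust(G);                 % stretch the contrast to the full intensity range
K = histeq(G);                   % alternatively, equalize the histogram
L = medfilt2(G, [3 3]);          % reduce salt-and-pepper noise with a 3x3 median filter
imshowpair(G, J, 'montage');     % compare the original and enhanced images side by side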

3.2.2 IMAGE RESTORATION

Image restoration, like enhancement, improves the qualities of an image, but all its operations are based on known, measured, or estimated degradations of the original image. Image restoration is used to restore images with problems such as geometric distortion, improper focus, repetitive noise, and camera motion; that is, it corrects images for known degradations.

3.2.3 IMAGE ANALYSIS

Image analysis operations produce numerical or graphical information based on characteristics of the original image. They break the image into objects and then classify them, relying on image statistics. Common operations are extraction and description of scene and image features, automated measurements, and object classification. Image analysis is mainly used in machine vision applications.

3.2.4 IMAGE COMPRESSION

Image compression and decompression reduce the data content necessary to describe the image. Most images contain a lot of redundant information, and compression removes these redundancies. Because of compression the size is reduced, so the image can be stored or transported efficiently. The compressed image is decompressed when displayed. Lossless compression preserves the exact data of the original image, whereas lossy compression does not exactly represent the original image but provides much higher compression.
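As a small sketch of the difference (the file names are placeholders), MATLAB's imwrite can save the same image losslessly as a PNG or lossily as a JPEG with a chosen quality factor:

% Lossless vs. lossy storage of the same image; file names are placeholders.
I = imread('face.jpg');
imwrite(I, 'face_lossless.png');               % PNG: lossless, larger file
imwrite(I, 'face_lossy.jpg', 'Quality', 50);   % JPEG: lossy, much smaller file
d = dir('face_loss*');                         % compare the resulting file sizes in bytes
disp([{d.name}; {d.bytes}])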

3.2.5 IMAGE SYNTHESIS

Image synthesis operations create images from other images or non-image data. Image synthesis operations generally create images that are either physically impossible or impractical to acquire.

3.3 APPLICATIONS OF DIGITAL IMAGE PROCESSING

Digital image processing has a broad spectrum of applications, such as remote sensing via satellites and other spacecrafts, image transmission and storage for business applications, medical processing, radar, sonar and acoustic image processing, robotics and automated inspection of industrial parts.

3.3.1 MEDICAL APPLICATIONS

In medical applications, one is concerned with the processing of chest X-rays, cineangiograms, projection images of transaxial tomography and other medical images that occur in radiology, nuclear magnetic resonance (NMR) and ultrasonic scanning. These images may be used for patient screening and monitoring or for the detection of tumors or other diseases in patients.

3.3.2 SATELLITE IMAGING

Images acquired by satellites are useful in tracking of earth resources; geographical mapping; prediction of agricultural crops, urban growth and weather; flood and fire control; and many other environmental applications. Space image applications include recognition and analysis of objects contained in image obtained from deep space-probe missions.

3.3.3 COMMUNICATION

Image transmission and storage applications occur in broadcast television, teleconferencing, and transmission of facsimile images for office automation, communication of computer networks, closed-circuit television based security monitoring systems and in military communications.

3.3.4 RADAR IMAGING SYSTEMS

Radar and sonar images are used for detection and recognition of various types of targets or in guidance and maneuvering of aircraft or missile systems.

3.3.5 DOCUMENT PROCESSING

It is used in scanning and transmission for converting paper documents to a digital image form, compressing the image, and storing it on magnetic tape. It is also used in document reading for automatically detecting and recognizing printed characters.

3.3.6 DEFENSE/INTELLIGENCE

It is used in reconnaissance photo-interpretation for automatic interpretation of earth satellite imagery to look for sensitive targets or military threats and target acquisition and guidance for recognizing and tracking targets in real-time smart-bomb and missile-guidance systems.


CHAPTER 4

Software Tools and Methodology

4.1 ABOUT MATLAB

MATLAB is a software package for high performance numerical computation and visualization. It provides an interactive environment with hundreds of built-in functions for technical computation, graphics and animation. Best of all, it provides easy extensibility with its own high-level programming language. The name MATLAB stands for MATrix LABoratory. The basic building block of MATLAB is the matrix. The fundamental data type is the array.

MATLAB's built-in functions provide excellent tools for linear algebra computations, data analysis, signal processing, optimization, numerical solution of ODEs, quadrature and many other types of scientific computations. Most of these functions use state-of-the-art algorithms. There are numerous functions for 2-D and 3-D graphics as well as for animation; for users who need to run external C or Fortran programs, MATLAB even provides an external interface to run those programs from within MATLAB. The user, however, is not limited to the built-in functions; he can write his own functions in the MATLAB language. Once written, these functions behave just like the built-in functions. MATLAB's language is very easy to learn and to use.

4.2 MATLAB TOOLBOXES

There are several optional 'toolboxes' available from the developers of MATLAB. These toolboxes are collections of functions written for special applications such as the Symbolic Computation Toolbox, Image Processing Toolbox, Statistics Toolbox, Neural Networks Toolbox, Communications Toolbox, Signal Processing Toolbox, Filter Design Toolbox, Fuzzy Logic Toolbox, Wavelet Toolbox, Database Toolbox, Control System Toolbox, Bioinformatics Toolbox, and Mapping Toolbox.

4.3 BASICS OF MATLAB

4.3.1 MATLAB WINDOWS

On all UNIX systems, Macs, and PCs, MATLAB works through three basic windows. They are:

a. Command window

This is the main window, characterized by the MATLAB command prompt '>>'. When you launch the application program, MATLAB puts you in this window. All commands, including those for running user-written programs, are typed in this window at the MATLAB prompt.

b. Graphics window

The output of all graphics commands typed in the command window is flushed to the graphics or figure window, a separate gray window with a (default) white background colour. The user can create as many figure windows as the system memory will allow.

c. Edit window

This is where you write, edit, create, and save your own programs in files called M-files. You can use any text editor to carry out these tasks. On most systems, such as PCs and Macs, MATLAB provides its own built-in editor. On other systems, you can invoke the edit window by typing the standard file-editing command that you normally use on your system. The command is typed at the MATLAB prompt following the special character '!'. After editing is completed, control is returned to MATLAB.

4.3.2 On-Line Help

a. On-line documentation

MATLAB provides on-line help for all its built-in functions and programming language constructs. The commands lookfor, help, helpwin, and helpdesk provide on-line help.

b. Demo

MATLAB has a demonstration program that shows many of its features. The program includes a tutorial introduction that is worth trying. Type demo at the MATLAB prompt to invoke the demonstration program, and follow the instructions on the screen.

4.3.3 Input-Output

MATLAB supports interactive computation taking the input from the screen, and flushing the output to the screen. In addition, it can read input files and write output files. The following features hold for all forms of input-output.

a. Data type

The fundamental data type in MATLAB is the array. It encompasses several distinct data objects: integers, doubles, matrices, character strings, and cells. In most cases, however, we never have to worry about the data type or the data object declarations. For example, there is no need to declare variables as real or complex. When a real number is entered as a variable, MATLAB automatically sets the variable to be real.

b. Dimensioning

Dimensioning is automatic in MATLAB. No dimensioning statements are required for vectors or arrays. We can find the dimension of an existing matrix or a vector with the size and length commands.
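A small sketch of this behaviour (the variable names are only examples):

% Dimensioning is automatic: no declaration is needed before assignment.
v = [1 2 3 4 5];        % a row vector created on the fly
A = [1 2 3; 4 5 6];     % a 2-by-3 matrix
size(A)                 % returns the dimensions of A, here [2 3]
length(v)               % returns the number of elements of v, here 5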


c. Case sensitivity

MATLAB is case sensitive, i.e. it differentiates between lowercase and uppercase letters. Thus a and A are different variables. Most MATLAB commands and built-in function calls are typed in lowercase letters. Case sensitivity can be turned on and off with the casesen command.

d. Output display

The output of every command is displayed on the screen unless MATLAB is directed otherwise. A semicolon at the end of a command suppresses the screen output, except for graphics and on-line help commands. The following facilities are provided for controlling the screen output.

i. Paged output: To direct MATLAB to show one screen of output at a time, type more on at the MATLAB prompt. Without it, MATLAB flushes the entire output at once, without regard to the speed at which we read.

ii. Output format: Though computations inside MATLAB are performed using double precision, the appearance of floating point numbers on the screen is controlled by the output format in use. There are several different screen output formats. The following table shows the printed value of 10*pi in different formats.

Table 4.1 OUTPUT FORMAT

Format command      Displayed value of 10*pi
format short        31.4159
format short e      3.1416e+01
format long         31.41592653589793
format long e       3.141592653589793e+01
format short g      31.416
format long g       31.4159265358979
format hex          403f6a7a2955385e
format rat          3550/113
format bank         31.42
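For instance, two of the rows in Table 4.1 can be reproduced at the command prompt as follows:

format short
10*pi            % displays 31.4159
format long e
10*pi            % displays 3.141592653589793e+01
format short     % restore the default format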

e. Command History

MATLAB saves previously typed commands in a buffer. These commands can be recalled with the up-arrow key. This helps in editing previous commands. You can also recall a previous command by typing the first characters and then pressing the up-arrow key. On most Unix systems, MATLAB's command line editor also understands the standard Emacs key bindings.

4.3.4 File Types

MATLAB has three types of files for storing information

M-files: M-files are standard ASCII text files, with a .m extension to the file name. There are two types of these files: script files and function files. Most programs we write in MATLAB are saved as M-files. All built-in functions in MATLAB are M-files, most of which reside on our computer in precompiled format. Some built-in functions are provided with source code in readable M-files so that they can be copied and modified.

Mat-files: Mat-files are binary data-files with a .mat extension to the file name. Mat-files are created by MATLAB when we save data with the save command. The data is written in a special format that only MATLAB can read. Mat-files can be loaded into MATLAB with the load command.

Mex-files: Mex-files are MATLAB-callable Fortran and C programs, with a .mex extension to the file name. Use of these files requires some experience with MATLAB and a lot of patience.

4.3.5 Platform independence

One of the best features of MATLAB is its platform independence. Programs written in the MATLAB language work exactly the same way on all computers. The user interface, however, varies from platform to platform. For example, on PCs and Macs there are menu-driven commands for opening, writing, editing, saving and printing files, whereas on Unix machines such as Sun workstations, these tasks are usually performed with Unix commands.

4.3.6 Images in MATLAB

The project has involved handling image data in MATLAB, so below is a brief review of how images are handled. Indexed images are represented by two matrices, a colour map matrix and an image matrix:

(i) The colour map is a matrix of values representing all the colours in the image.

(ii) The image matrix contains indexes corresponding to the colour map.

A colour map matrix is of size N×3, where N is the number of different colours in the image. Each row represents the red, green and blue components of one colour. E.g. the matrix [r1 g1 b1; r2 g2 b2] represents two colours, the first having components r1, g1, b1 and the second having components r2, g2, b2.

The wavelet toolbox only supports indexed images that have linear, monotonic color maps. Often color images need to be pre-processed into a grey scale image before using wavelet decomposition. The Wavelet Toolbox User’s Guide provides some sample code to convert color images into grey scale. This will be useful if it is needed to put any images into MATLAB.
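A minimal sketch of such a pre-processing step (the file name is a placeholder; ind2gray is an Image Processing Toolbox function):

% Convert an indexed colour image into a grey scale intensity image.
[X, map] = imread('flowers.png');   % X: index matrix, map: N-by-3 colour map (placeholder file)
G = ind2gray(X, map);               % grey scale intensity image
imshow(G)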

This chapter dealt with introduction to MATLAB software which we are using for our project. The 2-D wavelet Analysis, the decomposition of an image into approximations and details and the properties of different types of wavelets will be discussed in the next chapter.

4.4 MATLAB

Matlab is a high-performance language for technical computing. It integrates computation, programming and visualization in a user-friendly environment where problems and solutions are expressed in an easy-to-understand mathematical notation.

Matlab is an interactive system whose basic data element is an array that does not require dimensioning. This allows the user to solve many technical computing problems, especially those with matrix and vector operations, in less time than it would take to write a program in a scalar non-interactive language such as C or Fortran.

Matlab features a family of application-specific solutions which are called toolboxes. It is very important to most users of Matlab that toolboxes allow them to learn and apply specialized technology. These toolboxes are comprehensive collections of Matlab functions, so-called M-files, that extend the Matlab environment to solve particular classes of problems.

Matlab is a matrix-based programming tool. Although matrices often need not be dimensioned explicitly, the user always has to look carefully at matrix dimensions. If it is not defined otherwise, the standard matrix exhibits two dimensions n × m. Column vectors and row vectors are represented consistently by n × 1 and 1 × n matrices, respectively.

4.4.1 MATLAB OPERATIONS

Matlab operations can be classified into the following types of operations:

Arithmetic and logical operations,

Mathematical functions,

Graphical functions, and

Input/output operations.

In the following sections, individual elements of Matlab operations are explained in detail.

4.4.2 EXPRESSIONS

Like most other programming languages, Matlab provides mathematical expressions, but unlike most programming languages, these expressions involve entire matrices. The building blocks of expressions are

Variables

Numbers

Operators

Functions

4.4.3 VARIABLES

Matlab does not require any type declarations or dimension statements.

When a new variable name is introduced, it automatically creates the variable and allocates the appropriate amount of memory.

If the variable already exists, Matlab changes its contents and, if necessary, allocates new storage.

For example

>> books = 10

It creates a 1-by-1 matrix named books and stores the value 10 in its single element.

In the expression above, >> constitutes the Matlab prompt, where the commands can be entered.

Variable names consist of a string, which start with a letter, followed by any number of letters, digits, or underscores. Matlab is case sensitive; it distinguishes between uppercase and lowercase letters. A and a are not the same variable.

To view the matrix assigned to any variable, simply enter the variable name.

4.4.4 NUMBERS

Matlab uses the conventional decimal notation.

A decimal point and a leading plus or minus sign are optional. Scientific notation uses the letter e to specify a power-of-ten scale factor.

Imaginary numbers use either i or j as a suffix.


Some examples of legal numbers are:

7 -55 0.0041 9.657838 6.10220e-10 7.03352e21 2i -2.71828j 2e3i 2.5+1.7j.

4.4.5 OPERATORS

Expressions use familiar arithmetic operators and precedence rules. Some examples are:

+ Addition

- Subtraction

* Multiplication

/ Division

’ Complex conjugate transpose

( ) Brackets to specify the evaluation order.

4.4.6 FUNCTIONS

Matlab provides a large number of standard elementary mathematical functions, including sin, sqrt, exp, and abs.

Taking the square root or logarithm of a negative number does not lead to an error; the appropriate complex result is produced automatically.

Matlab also provides a lot of advanced mathematical functions, including Bessel and Gamma functions. Most of these functions accept complex arguments.

For a list of the elementary mathematical functions, type

>> help elfun

Some of the functions, like sqrt and sin are built-in. They are a fixed part of the Matlab core so they are very efficient.

The drawback is that the computational details are not readily accessible. Other functions, like gamma and sinh, are implemented in so called M-files.

You can see the code and even modify it if you want.


CHAPTER 5

Matlab Image Processing

5.1 INTRODUCTION

When working with images in Matlab, there are many things to keep in mind, such as loading an image, using the right format, saving the data as different data types, how to display an image, conversion between different image formats, etc. This worksheet presents some of the commands designed for these operations. Most of these commands require you to have the Image Processing Toolbox installed with Matlab. To find out if it is installed, type ver at the Matlab prompt. This gives you a list of the toolboxes that are installed on your system.

For further reference on image handling in Matlab you are recommended to use Matlab's help browser. There is an extensive (and quite good) on-line manual for the Image Processing Toolbox that you can access via Matlab's help browser.

The first sections of this worksheet are quite heavy. The only way to understand how the presented commands work is to carefully work through the examples given at the end of the worksheet. Once you can get these examples to work, experiment on your own using your favorite image!

5.2 FUNDAMENTALS

A digital image is composed of pixels which can be thought of as small dots on the screen. A digital image is an instruction of how to colour each pixel. We will see in detail later on how this is done in practice. A typical size of an image is 512-by-512 pixels. Later on in the course you will see that it is convenient to let the dimensions of the image be a power of 2. For example, 2^9 = 512. In the general case we say that an image is of size m-by-n if it is composed of m pixels in the vertical direction and n pixels in the horizontal direction.

Let us say that we have an image of format 512-by-1024 pixels. This means that the data for the image must contain information about 524288 pixels, which requires a lot of memory! Hence, compressing images is essential for efficient image processing. You will later on see how Fourier analysis and wavelet analysis can help us to compress an image significantly. There are also a few "computer scientific" tricks (for example entropy coding) to reduce the amount of data required to store an image.

5.3 IMAGE FORMATS SUPPORTED BY MATLAB

The following image formats are supported by Matlab:

BMP

HDF

JPEG

PCX

TIFF

XWD

Most images you find on the Internet are JPEG images, which is the name of one of the most widely used compression standards for images. If you have stored an image you can usually see from the suffix what format it is stored in. For example, an image named myimage.jpg is stored in the JPEG format, and we will see later on that we can load an image of this format into Matlab.

5.4 WORKING FORMATS IN MATLAB

If an image is stored as a JPEG-image on your disc we first read it into Matlab. However, in order to start working with an image, for example perform a wavelet transform on the image, we must convert it into a different format. This section explains four common formats.

5.4.1 INTENSITY IMAGE (GRAY SCALE IMAGE)

This is the equivalent to a "grey scale image" and this is the image we will mostly work with in this course. It represents an image as a matrix where every element has a value corresponding to how bright/dark the pixel at the corresponding position should be coloured. There are two ways to represent the number that represents the brightness of the pixel. The double class (or data type) assigns a floating point number ("a number with decimals") between 0 and 1 to each pixel; the value 0 corresponds to black and the value 1 corresponds to white. The other class is called uint8, which assigns an integer between 0 and 255 to represent the brightness of a pixel; the value 0 corresponds to black and 255 to white. The class uint8 only requires roughly 1/8 of the storage compared to the class double. On the other hand, many mathematical functions can only be applied to the double class. We will see later how to convert between double and uint8.

5.4.2 BINARY IMAGE

This image format also stores an image as a matrix but can only colour a pixel black or white (and nothing in between). It assigns a 0 for black and a 1 for white.

5.4.3 INDEXED IMAGE

This is a practical way of representing color images. (In this course we will mostly work with gray scale images but once you have learned how to work with a gray scale image you will also know the principle how to work with color images.) An indexed image stores an image as two matrices. The first matrix has the same size as the image and one number for each pixel. The second matrix is called the color map and its size may be different from the image. The numbers in the first matrix is an instruction of what number to use in the color map matrix.

5.4.4 RGB IMAGE

This is another format for color images. It represents an image with three matrices of sizes matching the image format. Each matrix corresponds to one of the colors red, green or blue and gives an instruction of how much of each of these colors a certain pixel should use.

5.4.5 MULTIFRAME IMAGE

In some applications we want to study a sequence of images. This is very common in biological and medical imaging where you might study a sequence of slices of a cell. For these cases, the multiframe format is a convenient way of working with a sequence of images. In case you choose to work with biological imaging later on in this course, you may use this format.

How to convert between different formats: the following table shows how to convert between the different formats given above. All these commands require the Image Processing Toolbox!

5.5 IMAGE FORMAT CONVERSION

(Within the parentheses you type the name of the image you wish to convert.)

Table 5.1 (IMAGE FORMAT CONVERSION)

Operation:                                                   Matlab command:
Convert intensity/indexed/RGB formats to binary format.      dither()
Convert intensity format to indexed format.                  gray2ind()
Convert indexed format to intensity format.                  ind2gray()
Convert indexed format to RGB format.                        ind2rgb()
Convert a regular matrix to intensity format by scaling.     mat2gray()
Convert RGB format to intensity format.                      rgb2gray()
Convert RGB format to indexed format.                        rgb2ind()

The command mat2gray is useful if you have a matrix representing an image but the values representing the grey scale range between, let's say, 0 and 1000. The command mat2gray automatically rescales all entries so that they fall within 0 and 255 (if you use the uint8 class) or 0 and 1 (if you use the double class).
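A minimal usage sketch (the matrix values here are just an example):

% Rescale a matrix with an arbitrary range into an intensity image.
M = [0 250 500; 750 900 1000];   % example data with values between 0 and 1000
G = mat2gray(M);                 % double intensity image, rescaled to the range [0,1]
imshow(G)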

How to convert between double and uint8

When you store an image, you should store it as a uint8 image since this requires far less memory than double. When you are processing an image (that is performing mathematical operations on an image) you should convert it into a double. Converting back and forth between these classes is easy.

I=im2double (I);

Converts an image named I from uint8 to double.

I=im2uint8 (I);

Converts an image named I from double to uint8.

5.6 READ MATLAB FILE

When you encounter an image you want to work with, it is usually in form of a file (for example, if you down load an image from the web, it is usually stored as a JPEG-file). Once we are done processing an image, we may want to write it back to a JPEG-file so that we can, for example, post the processed image on the web. This is done using the imread and imwrite commands. These commands require the Image processing tool box!

Reading and writing image files

Table 5.2 (READ A MATLAB FILE)

Operation:                                                                    Matlab command:
Read an image. (Within the parentheses you type the name of the image        imread()
file you wish to read. Put the file name within single quotes ' '.)
Write an image to a file. (As the first argument within the parentheses      imwrite( , )
you type the name of the image you have worked with; as a second
argument you type the name of the file and format that you want to
write the image to. Put the file name within single quotes ' '.)

Make sure to use a semicolon ; after these commands, otherwise you will get LOTS OF numbers scrolling on your screen. The commands imread and imwrite support the formats given in the section "Image formats supported by Matlab" above.
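A small usage sketch combining these commands (the file names are placeholders):

% Read an image, process it in double precision, and write the result back to disk.
I = imread('myimage.jpg');       % load the JPEG file into a uint8 array
G = im2double(rgb2gray(I));      % convert to a double grey scale intensity image
J = imadjust(G);                 % example processing step: contrast adjustment
imwrite(J, 'myimage_out.jpg');   % save the processed image as a new JPEG file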

Loading and saving variables in Matlab

This section explains how to load and save variables in Matlab. Once you have read a file, you probably convert it into an intensity image (a matrix) and work with this matrix. Once you are done you may want to save the matrix representing the image in order to continue to work with it at another time. This is easily done using the commands save and load. Note that save and load are commonly used Matlab commands and work independently of which toolboxes are installed.

Loading and saving variables

Table 5.3 (LOADING AND SAVING VARIABLES)

Operation:                 Matlab command:
Save the variable X.       save X
Load the variable X.       load X
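For example (the variable and file names are placeholders):

% Save the workspace to a MAT-file and restore it in a later session.
G = rand(256);      % placeholder for an intensity image matrix
save mydata         % writes the workspace variables (here G) to mydata.mat
clear               % clear the workspace
load mydata         % reads mydata.mat and restores the variable G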


CHAPTER 6

Overview and Planning

6.1 EXISTING SYSTEM

Face recognition systems have been attracting high attention from the commercial market as well as from the pattern recognition field. Face recognition has received substantial attention from researchers in the biometrics, pattern recognition and computer vision communities. Face recognition systems can extract the features of a face and compare them with an existing database. The faces considered here for comparison are still faces. Machine recognition of faces from still and video images is emerging as an active research area. The present work is based on still or video images captured either by a digital camera or by a webcam. The face recognition system detects only the faces from the image scene and extracts the descriptive features. It then compares them with the database of faces, which is a collection of faces in different poses. The present system is trained with a database where the images are taken in different poses, with glasses, and with and without a beard.

6.2 PROPOSED ALGORITHM

One of the simplest and most effective PCA approaches used in face recognition systems is the so-called eigenface approach. This approach transforms faces into a small set of essential characteristics, eigenfaces, which are the main components of the initial set of learning images (training set). Recognition is done by projecting a new image into the eigenface subspace, after which the person is classified by comparing its position in eigenface space with the positions of known individuals. The advantage of this approach over other face recognition systems is its simplicity, speed and insensitivity to small or gradual changes on the face. The problem is limited to the files that can be used to recognize the face: namely, the images must be vertical frontal views of human faces.

The whole recognition process involves two steps:

A. Initialization process

B. Recognition process

The initialization process involves the following operations:

i. Acquire the initial set of face images, called the training set.

ii. Calculate the eigenfaces from the training set, keeping only the M images corresponding to the highest eigenvalues. These M images define the face space. As new faces are experienced, the eigenfaces can be updated or recalculated.

iii. Calculate the distribution in this M-dimensional space for each known person by projecting his or her face images onto this face space. These operations can be performed from time to time whenever there is free excess operational capacity. This data can be cached and used in the further steps, eliminating the overhead of re-initializing, decreasing execution time and thereby increasing the performance of the entire system.

Having initialized the system, the recognition process involves the following steps:

i. Calculate a set of weights based on the input image and the M eigenfaces by projecting the input image onto each of the eigenfaces.

ii. Determine if the image is a face at all (known or unknown) by checking whether the image is sufficiently close to the face space.

iii. If it is a face, classify the weight pattern as either a known person or as unknown.

iv. Update the eigenfaces or weights, and if the same unknown face is seen several times, calculate its characteristic weight pattern and incorporate it into the known faces. The last step is not usually a requirement of every system; hence it is left optional and can be implemented when there is a requirement. (A compact MATLAB sketch of these two processes is given below.)
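The two processes above can be summarized in a short MATLAB sketch. This is only an illustrative outline, not the project's actual implementation (which is listed in Chapter 8); it assumes the training images have already been reshaped into the columns of a matrix X and that testvec holds the test image reshaped the same way:

% Illustrative eigenface sketch; X and testvec are assumed inputs, not part of the project code.
P = size(X, 2);                             % number of training images
m = mean(double(X), 2);                     % mean face vector
A = double(X) - repmat(m, 1, P);            % mean-subtracted training images
[V, D] = eig(A' * A);                       % eigenvectors of the small P-by-P matrix
keep = diag(D) > 1;                         % Kaiser's rule: keep components with eigenvalue > 1
eigenfaces = A * V(:, keep);                % eigenfaces spanning the face space
W = eigenfaces' * A;                        % projections (weights) of the training images
w = eigenfaces' * (double(testvec) - m);    % projection of the test image
d = sum((W - repmat(w, 1, P)).^2, 1);       % squared Euclidean distance to each training image
[~, idx] = min(d);                          % idx is the index of the recognized training image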

6.3 MODULES

USER

1. Login

2. Set Dataset path

3. Set Database path

4. Comparison process (PCA based Algorithm)

5. Show results.

6.4 MODULES DESCRIPTION

USER

1. The user can log in and proceed with further processing.

2. Set the dataset path for further processing.

3. Set the database path for further processing.

4. Using the PCA algorithm, the input image selected by the user is checked for similarity against the database images.

5. If the images match, the results are shown; otherwise a "not matched" message is shown.

6.5 SYSTEM REQUIREMENTS

6.5.1 HARDWARE REQUIREMENTS

Processor   :   INTEL i7 CORE
RAM         :   4 GB SD RAM
Monitor     :   15" COLOR
Hard Disk   :   500 GB
Keyboard    :   STANDARD 102 KEYS
Mouse       :   3 BUTTONS

6.5.2 SOFTWARE CONFIGURATION

Operating System   :   Windows 8
Environment        :   MATLAB
Matlab             :   Version 15a

CHAPTER 7

System Design

7.1 UML DIAGRAMS

7.1.1 USE CASE DIAGRAM

Figure 7.1 (USE CASE DIAGRAM)

7.1.2 CLASS DIAGRAM

Figure 7.2 (CLASS DIAGRAM)

7.1.3 SEQUENCE DIAGRAM

Figure 7.3 (SEQUENCE DIAGRAM)

7.1.4 ACTIVITY DIAGRAM

Figure 7.4 (ACTIVITY DIAGRAM)

7.1.5 DATA FLOW DIAGRAM

Figure 7.5 (DATA FLOW DIAGRAM)

7.1.6 ER DIAGRAM

Figure 7.6 (ER DIAGRAM)

CHAPTER 8

Implementation

8.1 SOURCE CODE

Main

function varargout = Main_Face(varargin)

% MAIN_FACE MATLAB code for Main_Face.fig
%      MAIN_FACE, by itself, creates a new MAIN_FACE or raises the existing
%      singleton*.
%
%      H = MAIN_FACE returns the handle to a new MAIN_FACE or the handle to
%      the existing singleton*.
%
%      MAIN_FACE('CALLBACK',hObject,eventData,handles,...) calls the local
%      function named CALLBACK in MAIN_FACE.M with the given input arguments.
%
%      MAIN_FACE('Property','Value',...) creates a new MAIN_FACE or raises the
%      existing singleton*. Starting from the left, property value pairs are
%      applied to the GUI before Main_Face_OpeningFcn gets called. An
%      unrecognized property name or invalid value makes property application
%      stop. All inputs are passed to Main_Face_OpeningFcn via varargin.
%
%      *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
%      instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help Main_Face
% Last Modified by GUIDE v2.5 24-Jan-2018 12:15:03

% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @Main_Face_OpeningFcn, ...
                   'gui_OutputFcn',  @Main_Face_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT

% --- Executes just before Main_Face is made visible.
function Main_Face_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to Main_Face (see VARARGIN)

% Choose default command line output for Main_Face
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

% UIWAIT makes Main_Face wait for user response (see UIRESUME)
% uiwait(handles.figure1);

% --- Outputs from this function are returned to the command line.
function varargout = Main_Face_OutputFcn(hObject, eventdata, handles)
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;

% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Select the training database folder and the test image folder
datapath = uigetdir('C:\Users\Matlab\Desktop\Face recognition real time\Database\.jpg');
testpath = uigetdir('C:\Users\Matlab\Desktop\Face recognition real time\Dataset\.jpg');

% Ask the user which test image to use
prompt = {'Enter test image name (a number between 1 to 5):'};
dlg_title = 'Input of PCA-Based Face Recognition System';
num_lines = 1;
def = {' '};
TestImage = inputdlg(prompt, dlg_title, num_lines, def);
TestImage = strcat(testpath, '\', char(TestImage), '.jpg');

% Run the PCA-based recognition and display the test and recognized images
recog_img = facerecog(datapath, TestImage);
selected_img = strcat(datapath, '\', recog_img);
test_img = imread(TestImage);
axes(handles.axes1);
imshow(test_img); title('Test Image');
select_img = imread(selected_img);
axes(handles.axes2)
imshow(select_img); title('Recognized Image');

% Verify the match by correlating the two binarized images
INP1 = im2bw(test_img);
INPUT1 = im2bw(select_img);
S = corr2(INPUT1, INP1);
S

if S > 0.5
    uiwait(msgbox('Matched'));
    % Index of the matched database image, derived from the recognized file
    % name (the original listing used an undefined variable "index" here)
    i = sscanf(recog_img, '%d');

if i ==1 || i==16 ||i==31 ||i==32 name='Rajesh Mishra'; Add='No:12 Anna Nagar,Chennai'; elseif i ==2 || i==17||i==32 name='Alok Nath'; Add='No:1/24 Anna Salai,Vellore'; elseif i ==3 || i==18 ||i==33 name='Brijesh Kumar'; Add='No: 32/10, Abdul Aziz Street,Chennai'; elseif i ==4 || i==19 ||i==34 name='Anurag Kesawar'; Add='No.41, Rangaiyya Street,Ayanavaram, Chennai'; elseif i ==5 || i==20 ||i==35 name='Shambhu Nath'; Add='Plot No. 160,1" Cross Street, Vellore'; elseif i ==6 || i==21 ||i==36 name='Ritesh Kumar'; Add='No:1/4 KK Nagar, Chennai'; elseif i ==7 || i==22 ||i==37 name='D K Mishra';


Add='F2 2/2, E.V.R Periyar Nagar,Chennai'; elseif i ==8 || i==23 ||i==38 name='Shivakanth Pathak'; Add='No. 6/50G, Shanti Path, Chanakyapuri, New Delhi'; elseif i ==9 || i==24 ||i==39 name='O P Tiwari'; Add='No:14/1 Shantipath, Chanakyapuri,New Delhi'; elseif i ==10 || i==25 ||i==40 name='Bijoy Jha'; Add='24, Kasturba Gandhi Marg,New Delhi'; elseif i ==11 || i==26 ||i==41 name='P K Mishra'; Add='38/A, Jawahar Lal Nehru Road,Kolkata'; elseif i ==12 || i==27 ||i==42 name='Dojer Kera'; Add='C-49, G-Block, Bandra East, Mumbai '; elseif i ==13 || i==28 ||i==43 name='Amit Kumar'; Add='M-26/6,Ashok Nagar, Chennai'; elseif i ==14 || i==29 ||i==44 name='Andrew Rendou'; Add='1st Avenue,Ambattur, Chennai'; elseif i ==15 || i==30 ||i==45 name='Tigmanshu Mishra'; Add='No:24/53 Manimegalai Street, Madipakkam, Chennai'; end

na = name; ad = Add;
s1 = strcat('Name : ', name);
s2 = strcat('Address : ', Add);
s = {s1; s2};
set(handles.edit2, 'String', s);

else

uiwait(msgbox('Notmatched'));

end

% --- Executes on button press in pushbutton2.
function pushbutton2_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global img
n = 2;
interval = 3;
photosave = inputdlg(' Do you want to save the files(y/n): ', 'Registration Form');
photosave = photosave{1};   % keep the answer as text (the original applied str2num, which returns empty for 'y'/'n')
outputFolder = fullfile('C:\Users\Matlab\Desktop\Face recognition real time\Dataset');
if ~exist(outputFolder, 'dir')
    mkdir(outputFolder);    % create the folder if missing (the if-block was empty in the extracted listing)
end
obj = videoinput('winvideo', 1);
set(obj, 'ReturnedColorSpace', 'RGB');
disp('First shot will be taken after 1 second');
pause(1);
for i = 1:n
    preview(obj)
    img = getsnapshot(obj);
    closepreview(obj);
    if strcmpi(strtrim(photosave), 'y')
        outputBaseFileName = sprintf('%d.jpg', i);
        outputFullFileName = fullfile(outputFolder, outputBaseFileName);
        imwrite(img, outputFullFileName, 'jpg');
    end
    pause(interval);
end
n = 5;
outputFolder = fullfile('C:\Users\Matlab\Desktop\Face recognition real time\Dataset');
if ~exist(outputFolder, 'dir')
    mkdir(outputFolder);
end
for i = 1:n
    outputBaseFileName = sprintf('%d.jpg', i);
    outputFullFileName = fullfile(outputFolder, outputBaseFileName);
    imwrite(img, outputFullFileName, 'jpg');
    axes(handles.axes1);
    imshow(img), title('Login Image')
end

% --- Executes on button press in pushbutton3.
function pushbutton3_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton3 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
global img1
n = 2;
interval = 3;
photosave = inputdlg(' Do you want to save the files(y/n): ', 'Registration Form');
photosave = photosave{1};   % keep the answer as text (the original applied str2num, which returns empty for 'y'/'n')
outputFolder = fullfile('C:\Users\Matlab\Desktop\Face recognition real time\Database');
if ~exist(outputFolder, 'dir')
    mkdir(outputFolder);    % create the folder if missing (the if-block was empty in the extracted listing)
end
obj = videoinput('winvideo', 1);
set(obj, 'ReturnedColorSpace', 'RGB');
disp('First shot will be taken after 1 second');
pause(1);
for i = 1:n
    preview(obj)
    img1 = getsnapshot(obj);
    closepreview(obj);
    if strcmpi(strtrim(photosave), 'y')
        outputBaseFileName = sprintf('%d.jpg', i);
        outputFullFileName = fullfile(outputFolder, outputBaseFileName);
        imwrite(img1, outputFullFileName, 'jpg');
    end
    pause(interval);
end
n = 5;
outputFolder = fullfile('C:\Users\Matlab\Desktop\Face recognition real time\Database');
if ~exist(outputFolder, 'dir')
    mkdir(outputFolder);
end
for i = 1:n
    outputBaseFileName = sprintf('%d.jpg', i);
    outputFullFileName = fullfile(outputFolder, outputBaseFileName);
    imwrite(img1, outputFullFileName, 'jpg');
    axes(handles.axes1);
    imshow(img1), title('Registered Image')
end

facerecog:

function [recognized_img]=facerecog(datapath,testimg)

% In this part of the function, we align a set of face images (the training set x1, x2, ..., xM).
% This means we reshape all 2D images of the training database into 1D column vectors,
% and then put these 1D column vectors in a row to construct the 2D matrix 'X'.
%
%   datapath  -  path of the data images used for training
%   X         -  a 2D matrix containing all 1D image vectors.
%                Suppose all P images in the training database have the same size MxN.
%                Then the length of the 1D column vectors is MxN and 'X' will be an (MxN)xP 2D matrix.

%%%%%%%%% finding the number of training images in the data path specified as argument %%%%%%%%%%
D = dir(datapath);   % D is an Lx1 structure with fields name, date, bytes, isdir for all L files present in the directory 'datapath'
imgcount = 0;
for i = 1 : size(D,1)
    % skip '.' , '..' and 'Thumbs.db' entries ('..' is assumed; the extracted listing showed a blank string here)
    if not(strcmp(D(i).name,'.') | strcmp(D(i).name,'..') | strcmp(D(i).name,'Thumbs.db'))
        imgcount = imgcount + 1;   % number of all images in the training database
    end
end

%%%%%%%%%%%%%%%%%%%%%%%%%% creating the image matrix X %%%%%%%%%%%%%%%%%%%%%%%%%%
X = [];
for i = 1 : imgcount
    str = strcat(datapath,'\',int2str(i),'.jpg');
    img = imread(str);
    img = rgb2gray(img);
    [r c] = size(img);
    temp = reshape(img',r*c,1);   %% reshaping 2D images into 1D image vectors
    %% here img' is used because reshape(A,M,N) reads the matrix A columnwise,
    %% whereas an image matrix is constructed with the first N pixels as the first row, the next N in the second row, and so on
    X = [X temp];
end
%% X is the image matrix, with a column getting added for each image

% Now we calculate m, A and the eigenfaces. The descriptions are below:
%
%   m           -  (MxN)x1   mean of the training images
%   A           -  (MxN)xP   matrix of image vectors after subtracting the mean vector m
%   eigenfaces  -  (MxN)xP'  P' eigenvectors of the covariance matrix C of the training
%                  database X, where P' is the number of eigenvalues of C that best
%                  represent the feature set

%%%%% calculating the mean image vector %%%%%

imgcount = size(X,2);
m = mean(X,2);   % computing the average face image: m = (1/P) * sum(Xj), j = 1..P

%%%%% calculating matrix A, i.e. subtracting the mean image vector from every image vector %%%%%
A = [];
for i = 1 : imgcount
    temp = double(X(:,i)) - m;
    A = [A temp];
end

%%%%% CALCULATION OF EIGENFACES %%%%%
%
% For an MxN matrix, the maximum number of non-zero eigenvalues of its covariance
% matrix is min(M-1, N-1). Since the number of dimensions (pixels) of each image
% vector is very large compared to the number of training images, the number of
% non-zero eigenvalues of C is at most P-1 (P being the number of training images).
% Computing the eigenvalues and eigenvectors of C = A*A' directly would be very
% expensive in both time and memory, so we instead compute the eigenvalues and
% eigenvectors of L = A'*A, whose eigenvectors are linearly related to those of C.
% These eigenvectors, corresponding to the non-zero eigenvalues of C, represent
% the best feature set.


L = A' * A;
[V, D] = eig(L);   % V : eigenvector matrix, D : eigenvalue matrix

% Kaiser's rule is used to decide how many principal components (eigenvectors) to
% keep: an eigenvector is chosen only if its corresponding eigenvalue is greater than 1.
L_eig_vec = [];
for i = 1 : size(V,2)
    if D(i,i) > 1
        L_eig_vec = [L_eig_vec V(:,i)];
    end
end

% finally the eigenfaces
eigenfaces = A * L_eig_vec;

% In the recognition step, we compare two faces by projecting the images into the
% facespace and measuring the Euclidean distance between them.
%
%   recogimg    -  the recognized image name
%   testimg     -  the path of the test image
%   m           -  mean image vector
%   A           -  mean-subtracted image vector matrix
%   eigenfaces  -  eigenfaces calculated above

%%%%% finding the projection of each image vector onto the facespace (the eigenfaces are the coordinates/dimensions) %%%%%
projectimg = [];   % projected image vector matrix
for i = 1 : size(eigenfaces,2)
    temp = eigenfaces' * A(:,i);
    projectimg = [projectimg temp];
end

%%%%% extracting PCA features of the test image %%%%%
test_image = imread(testimg);
test_image = test_image(:,:,1);          % use only the first colour channel
[r, c] = size(test_image);
temp = reshape(test_image', r*c, 1);     % creating a (MxN)x1 image vector from the 2D image
temp = double(temp) - m;                 % mean-subtracted vector
projtestimg = eigenfaces' * temp;        % projection of the test image onto the facespace

%%%%% calculating and comparing the Euclidean distance of all projected training images from the projected test image %%%%%
euclide_dist = [];
for i = 1 : size(eigenfaces,2)
    temp = (norm(projtestimg - projectimg(:,i)))^2;
    euclide_dist = [euclide_dist temp];
end
[euclide_dist_min, recognized_index] = min(euclide_dist);
recognized_img = strcat(int2str(recognized_index), '.jpg');
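For clarity, a minimal usage sketch of the facerecog function above is given here; the database path and the test image file name are assumptions used only for illustration.

% Hedged usage sketch (paths and file names are assumed)
datapath = 'C:\Users\Matlab\Desktop\Face recognition real time\Database';   % assumed training database path
testimg  = 'C:\Users\Matlab\Desktop\Face recognition real time\test.jpg';   % assumed test image, kept outside the database folder
recognized_img = facerecog(datapath, testimg);          % e.g. returns '3.jpg'
matched = imread(fullfile(datapath, recognized_img));   % load the matched database image
imshow(matched), title(['Matched image: ' recognized_img]);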

attendance:

function varargout = facerecog(varargin)

% FACERECOG M-file for facerecog.fig
%      FACERECOG, by itself, creates a new FACERECOG or raises the existing
%      singleton*.
%
%      H = FACERECOG returns the handle to a new FACERECOG or the handle to
%      the existing singleton*.
%
%      FACERECOG('CALLBACK',hObject,eventData,handles,...) calls the local
%      function named CALLBACK in FACERECOG.M with the given input arguments.
%
%      FACERECOG('Property','Value',...) creates a new FACERECOG or raises the
%      existing singleton*. Starting from the left, property value pairs are
%      applied to the GUI before facerecog_OpeningFcn gets called. An
%      unrecognized property name or invalid value makes property application
%      stop. All inputs are passed to facerecog_OpeningFcn via varargin.
%
%      *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
%      instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help facerecog

% Last Modified by GUIDE v2.5 21-Mar-2016 18:04:51


% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name',       mfilename, ...
                   'gui_Singleton',  gui_Singleton, ...
                   'gui_OpeningFcn', @facerecog_OpeningFcn, ...
                   'gui_OutputFcn',  @facerecog_OutputFcn, ...
                   'gui_LayoutFcn',  [], ...
                   'gui_Callback',   []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT

% --- Executes just before facerecog is made visible.
function facerecog_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)
% varargin   command line arguments to facerecog (see VARARGIN)

% Choose default command line output for facerecog
handles.output = hObject;

% Update handles structure
guidata(hObject, handles);

% UIWAIT makes facerecog wait for user response (see UIRESUME)
% uiwait(handles.figure1);

% --- Outputs from this function are returned to the command line.
function varargout = facerecog_OutputFcn(hObject, eventdata, handles)
% varargout  cell array for returning output args (see VARARGOUT);
% hObject    handle to figure
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure
varargout{1} = handles.output;

% --- Executes on button press in pushbutton1.
function pushbutton1_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton1 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Let the user pick a test image, convert it to grayscale, resize it to 200x200
% and display it on the GUI.
[file, path] = uigetfile('*.jpg', 'Select input image');
filename = strcat(path, file);
handles.data = imresize(rgb2gray(imread(filename)), [200, 200]);
guidata(hObject, handles);
axes(handles.axes5);
imshow(uint8(handles.data));

% --- Executes on button press in pushbutton2.
function pushbutton2_Callback(hObject, eventdata, handles)
% hObject    handle to pushbutton2 (see GCBO)
% eventdata  reserved - to be defined in a future version of MATLAB
% handles    structure with handles and user data (see GUIDATA)

% Read the 75 stored face images, convert them to grayscale and resize to 200x200
a = zeros(200,200,75);
h = waitbar(0, 'Searching Database ...');
for i = 1:75
    a(:,:,i) = imresize(rgb2gray(imread(strcat('C:\Users\Matlab\Desktop\Source code\TestData\', num2str(i), '.jpg'))), [200,200]);
end

% imshow(uint8(a(:,:,8)));

% reshape every 200x200 image into a 40000x1 column vector
b = zeros(40000,1,75);
for j = 1:75
    c = a(:,:,j)';
    d = c(:);
    b(:,1,j) = d;
    waitbar(j/300);
end

% average face vector
m = mean(b(:,1,:), 3);

% mean-subtracted (normalized) vectors
for i = 1:75
    n(:,1,i) = b(:,1,i) - m;
end
c = zeros(40000,75);


for i = 1:75
    c(:,i) = n(:,1,i);
    waitbar((75/300) + (i/300));
end

comat = c' * c;    % 75x75 surrogate covariance matrix
% eigenface vectors
for i = 1:75
    u(:,i) = c(:,:) * comat(:,i);
    waitbar(0.5 + (i/300));
end

% weights of each database image in the eigenface basis
for i = 1:75
    w(:,i) = inv(u'*u) * (u' * n(:,1,i));
    waitbar(0.75 + (i/300));
end

% weights of the test image selected in pushbutton1
newim = handles.data';
newimag = newim(:);
w(:,76) = inv(u'*u) * (u' * (double(newimag) - m));

% distance computations: Euclidean distance in weight space between the test
% image and every database image
d = zeros(75,1);
for i = 1:75
    v = w(:,76) - w(:,i);
    d(i) = sqrt(v'*v);
end
[a, index] = sort(d);
axes(handles.axes1);
imshow(imresize(rgb2gray(imread(strcat('C:\Users\Matlab\Desktop\Source code\Database\', num2str(index(1)), '.jpg'))), [200,200]));
i = index(1);

% Map the matched database index to the registered student's name and address
if i==1 || i==2 || i==31 || i==32 || i==33,          name='Rajesh Mishra';     Add='No:12 Anna Nagar, Chennai';
elseif i==3 || i==4 || i==34 || i==35 || i==36,      name='Alok Nath';         Add='No:1/24 Anna Salai, Vellore';
elseif i==5 || i==6 || i==37 || i==38 || i==39,      name='Brijesh Kumar';     Add='No:32/10, Abdul Aziz Street, Chennai';
elseif i==7 || i==8 || i==40 || i==41 || i==42,      name='Anurag Kesawar';    Add='No.41, Rangaiyya Street, Ayanavaram, Chennai';
elseif i==9 || i==10 || i==43 || i==44 || i==45,     name='Shambhu Nath';      Add='Plot No. 160, 1st Cross Street, Vellore';
elseif i==11 || i==12 || i==46 || i==47 || i==48,    name='Ritesh Kumar';      Add='No:1/4 KK Nagar, Chennai';
elseif i==13 || i==14 || i==49 || i==50 || i==51,    name='D K Mishra';        Add='F2 2/2, E.V.R Periyar Nagar, Chennai';
elseif i==15 || i==16 || i==52 || i==53 || i==54,    name='Shivakanth Pathak'; Add='No. 6/50G, Shanti Path, Chanakyapuri, New Delhi';
elseif i==17 || i==18 || i==55 || i==56 || i==57,    name='O P Tiwari';        Add='No:14/1 Shantipath, Chanakyapuri, New Delhi';
elseif i==19 || i==20 || i==58 || i==59 || i==60,    name='Bijoy Jha';         Add='24, Kasturba Gandhi Marg, New Delhi';
elseif i==21 || i==22 || i==61 || i==62 || i==63,    name='P K Mishra';        Add='38/A, Jawahar Lal Nehru Road, Kolkata';
elseif i==23 || i==24 || i==64 || i==65 || i==66,    name='Dojer Kera';        Add='C-49, G-Block, Bandra East, Mumbai';
elseif i==25 || i==26 || i==67 || i==68 || i==69,    name='Amit Kumar';        Add='M-26/6, Ashok Nagar, Chennai';
elseif i==27 || i==28 || i==70 || i==71 || i==72,    name='Andrew Rendou';     Add='1st Avenue, Ambattur, Chennai';
elseif i==29 || i==30 || i==73 || i==74 || i==75,    name='Tigmanshu Mishra';  Add='No:24/53 Manimegalai Street, Madipakkam, Chennai';
end

% tym=datestr(now,'HH:MM:SS');
na = name; ad = Add; dt = date;

% display the matched student's details on the GUI
s1 = strcat('Name : ', name);
s2 = strcat('Address : ', Add);
s = {s1; s2};
set(handles.text2, 'String', s);

% Append the recognized student's attendance record to the Excel sheet
filename = 'Studentdata.xlsx';
N = na; A = ad; Att = 'Present'; dd = dt;
if exist(filename, 'file') == 0
    header = {'Name', 'Address', 'Attendance', 'Date'};
    xlswrite(filename, [header; {N, A, Att, dd}]);    % create the sheet with a header row and the first record
else
    [~,~,input] = xlsread(filename);                  % read the existing sheet into a cell array
    new_data = {N, A, Att, dd};                       % the new attendance row
    xlswrite(filename, cat(1, input, new_data));      % append the row and write the sheet back
end
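As a side note, in MATLAB releases from R2019a onward the xlsread/xlswrite functions are discouraged in favour of readcell/writecell. The sketch below shows the same append step with those functions; the file name and column headers follow the code above, and this variant was not part of the original implementation.

% Hedged alternative for newer MATLAB releases (R2019a+)
filename = 'Studentdata.xlsx';
new_data = {na, ad, 'Present', date};                 % one attendance row
if ~isfile(filename)
    writecell([{'Name','Address','Attendance','Date'}; new_data], filename);
else
    existing = readcell(filename);                    % read the existing sheet
    writecell([existing; new_data], filename);        % append the new row and write back
end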


8.2 SCREENSHOTS

8.2.1 Registration


Figure 8.1 Registration


8.2.2 Login


Figure 8.2 Login


8.2.3 Matching


Figure 8.3 Matching


8.2.4 Store Student Details


Figure 8.4 Store Student Details


CHAPTER 9

Testing Strategy

9.1 SYSTEM TESTING

The purpose of testing is to discover errors. Testing is the process of trying to discover every conceivable fault or weakness in a work product. It provides a way to check the functionality of components, sub-assemblies, assemblies and/or a finished product. It is the process of exercising software with the intent of ensuring that the software system meets its requirements and user expectations and does not fail in an unacceptable manner. There are various types of tests; each type addresses a specific testing requirement.

9.2 TYPES OF TESTS

9.2.1 Unit testing

Unit testing involves the design of test cases that validate that the internal program logic is functioning properly and that program inputs produce valid outputs. All decision branches and internal code flow should be validated. It is the testing of individual software units of the application; it is done after the completion of an individual unit and before integration. This is structural testing that relies on knowledge of the unit's construction and is invasive. Unit tests perform basic tests at the component level and test a specific business process, application, and/or system configuration. Unit tests ensure that each unique path of a business process performs accurately to the documented specifications and contains clearly defined inputs and expected results.
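As an illustration of this strategy for the present system, a minimal function-based unit test is sketched below; the test file name, the database path, and the use of MATLAB's functiontests framework are assumptions, since the original work did not include automated test scripts.

% test_facerecog.m (assumed file name); run with: runtests('test_facerecog')
function tests = test_facerecog
tests = functiontests(localfunctions);
end

function testKnownImageIsRecognizedAsItself(testCase)
% A training image taken from the database should be matched to itself.
datapath = 'C:\Users\Matlab\Desktop\Face recognition real time\Database';   % assumed path
testimg  = fullfile(datapath, '1.jpg');
recognized = facerecog(datapath, testimg);
verifyEqual(testCase, recognized, '1.jpg');
end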

9.2.2 Integration testing

Integration tests are designed to test integrated software components to determine whether they actually run as one program. Testing is event driven and is more concerned with the basic outcome of screens or fields. Integration tests demonstrate that, although the components were individually satisfactory as shown by successful unit testing, the combination of components is correct and consistent. Integration testing is specifically aimed at exposing the problems that arise from the combination of components.

9.2.3 Functional test

Functional tests provide a systematic demonstration that the functions tested are available as specified by the business and technical requirements, system documentation, and user manuals.

Functional testing is centered on the following items:

Valid Input        : identified classes of valid input must be accepted.
Invalid Input      : identified classes of invalid input must be rejected.
Functions          : identified functions must be exercised.
Output             : identified classes of application outputs must be exercised.
Systems/Procedures : interfacing systems or procedures must be invoked.

Organization and preparation of functional tests is focused on requirements, key functions, or special test cases. In addition, systematic coverage pertaining to identified business process flows, data fields, predefined processes, and successive processes must be considered for testing. Before functional testing is complete, additional tests are identified and the effective value of current tests is determined.

9.2.4 System Test

System testing ensures that the entire integrated software system meets requirements. It tests a configuration to ensure known and predictable results. An example of system testing is the configuration-oriented system integration test. System testing is based on process descriptions and flows, emphasizing pre-driven process links and integration points.

9.2.5 White Box Testing

White Box Testing is testing in which the software tester has knowledge of the inner workings, structure and language of the software, or at least its purpose. It is used to test areas that cannot be reached from a black-box level.

9.2.6 Black Box Testing

Black Box Testing is testing the software without any knowledge of the inner workings, structure or language of the module being tested. Black box tests, like most other kinds of tests, must be written from a definitive source document, such as a specification or requirements document. It is testing in which the software under test is treated as a black box: you cannot "see" into it. The test provides inputs and responds to outputs without considering how the software works.

9.3 Unit Testing

Unit testing is usually conducted as part of a combined code and unit test phase of the software lifecycle, although it is not uncommon for coding and unit testing to be conducted as two distinct phases.

Test Strategy and Approach

Field testing will be performed manually and functional tests will be written in detail.

Test objectives

All field entries must work properly.

Pages must be activated from the identified link.

The entry screen, messages and responses must not be delayed.


Features to be tested

Verify that the entries are of the correct format

No duplicate entries should be allowed

All links should take the user to the correct page.

Acceptance Testing

User Acceptance Testing is a critical phase of any project and requires significant participation by the end user. It also ensures that the system meets the functional requirements.

Test Results: All the test cases mentioned above passed successfully. No defects encountered.


CHAPTER 10

Conclusion

Experimental results have shown that the proposed face recognition method is very sensitive to face background and head orientation. Changes in illumination did not cause a major problem for the system, and the presence of small details such as dark glasses or masks was far from being a real challenge to it. There exists a trade-off between the correct recognition rate and the threshold value: as the threshold value increases, the number of misses decreases, but possibly at the cost of more misclassifications. Conversely, as the number of eigenfaces involved in the recognition process increases, the misclassification rate decreases, but possibly at the cost of more misses. The eigenface method is very sensitive to head orientation, and most of the mismatches occur for images with large head orientations.
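To make the threshold trade-off concrete, the sketch below shows how a rejection threshold could be applied to the minimum Euclidean distance computed in the recognition code; the threshold value and the handling of unknown faces are assumptions that would have to be tuned on the training data.

% Hedged sketch: rejection threshold on the minimum Euclidean distance
threshold = 5.0e3;                                   % assumed, dataset-dependent value
[min_dist, recognized_index] = min(euclide_dist);
if min_dist <= threshold
    recognized_img = strcat(int2str(recognized_index), '.jpg');   % accepted match
else
    recognized_img = '';                             % distance too large: treat the face as unknown (a miss)
end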

10.1 FUTURE ENHANCEMENT

The current recognition system has been designed for frontal views of face images. A neural network architecture (possibly combined with a feature-based approach) could be implemented in which the orientation of the face is first determined and then the most suitable recognition method is selected. Also, the current recognition system acquires face images only from image files located on magnetic media; camera and scanner support should be implemented for greater flexibility.


REFERENCES

[1] M. A. Turk and A. P. Pentland, "Face Recognition Using Eigenfaces," in Proc. IEEE Conference on Computer Vision and Pattern Recognition, pp. 586-591, 1991.

[2] Nirmalya Kar, Mrinal Kanti Debbarma, Ashim Saha, and Dwijen Rudra Pal, "Study of Implementing Automated Attendance System Using Face Recognition Technique," International Journal of Computer and Communication Engineering, Vol. 1, No. 2, July 2012.

[3] Goldstein, A. J., Harmon, L. D., and Lesk, A. B., "Identification of human faces," Proc. IEEE 59, pp. 748-760, 1971.

[4] M. N. Shah Zainudin, Radi H. R., S. Muniroh Abdullah, Rosman Abd. Rahim, and M. Muzafar Ismail, "Face Recognition using PCA and LDA," International Journal of Electrical & Computer Sciences.

[5] Yohei Kawaguchi, Tetsuo Shoji, Weijane Lin, and Koh Kakusho, "Face Recognition-based Lecture Attendance System," Department of Intelligence Science and Technology, Graduate School of Informatics, Kyoto University.
