
SWE1015- BIOMETRIC SYSTEMS

REVIEW 3
FINAL REPORT

FACE RECOGNITION SYSTEM FOR STUDENT


ATTENDANCE
GROUP-7
16MIS0166 – P.MEDHAVINI
16MIS0227 – KOWSALYA V
16MIS0308 – M TANVI TABSHEEM
16MIS0423 – R.PRATHYUSHA
16MIS0468 – SRI LIKITHA M L

FACULTY ADVISOR – RAMYA G


CONTENTS

1. Abstract
2. Objectives
3. Literature Review
3.1. National Status
3.2. International Status
4. Requirements
4.1. Hardware - System Requirements
4.2. Software - Languages, Tools used
5. Invention Details
5.1. Objects of the Invention
5.2. Summary of Invention
5.3. Architecture of Project
5.4. Detailed Description of the Invention


ABSTRACT:

● Nowadays, educational institutions are concerned about the regularity of
student attendance, mainly because a student's overall academic
performance is affected by his or her attendance at the institute.
● There are two conventional methods of marking attendance: calling out
the roll or having students sign on paper. Both are time consuming and
error prone.
● Hence, there is a need for a computer-based student attendance
management system that assists the faculty in maintaining attendance
records automatically.
● In this project we have implemented an automated attendance system
using MATLAB.
● We have projected our idea as an "Automated Attendance System Based on
Facial Recognition", which has a wide range of applications.
● The application performs face identification, which saves time and
eliminates the chance of proxy attendance, because attendance is
authorized by the face itself.
● Hence, this system can be implemented in any field where attendance plays
an important role.

MODULES:

1. Create database
2. Eigen face core
3. Eigen face test
4. Face recognition
5. Main
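
A minimal sketch of how these modules are chained together (the folder path and
the test image name are placeholders; CreateDatabase, EigenfaceCore and
Recognition are the functions listed in the CODE section of this report):

TrainDatabasePath = 'TrainDatabase\';   % placeholder folder of numbered training photos (1.jpg, 2.jpg, ...)
TestImage = 'TestDatabase\1.jpg';       % placeholder probe image captured from the webcam
T = CreateDatabase(TrainDatabasePath);                                      % module 1
[m, A, Eigenfaces] = EigenfaceCore(T);                                      % modules 2-3 (PCA)
[OutputName, Recognized_index] = Recognition(TestImage, m, A, Eigenfaces);  % module 4
fprintf('Matched training image %s (student %d)\n', OutputName, Recognized_index);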

Objectives:

Our primary goal is to help lecturers improve and organize the process of
tracking and managing student attendance and absenteeism. Additionally, we seek to:

● Provide a valuable attendance service for both teachers and students.
● Reduce manual-process errors by providing an automated and reliable
attendance system that uses face recognition technology.
● Increase privacy and security, so that a student cannot mark himself or
a friend present when they are not.
● Produce monthly reports for lecturers.
● Provide flexibility: lecturers are able to edit attendance records.
● Calculate absenteeism percentages and send reminder messages to
students.

In this project we aim to build an attendance-marking system with
the help of facial recognition, owing to the difficulty of manual and
other traditional means of taking attendance.

Literature Review

For our project we got motivation by the research carried out by the following
people and their published papers:

“Eigenfaces for recognition’’ (Mathew Turk and Alex Pentland) [1], here they
have developed a near-real time computer system that can locate and track a
subject’s head, and then recognize the person by comparing characteristics of
the face to those of known individuals. The computational approach taken in
this system is motivated by both physiology and information theory, as well as
by the practical requirements of near-real time performance and accuracy. This
approach treats the face recognition problem as an intrinsically two-dimensional
recognition problem rather than one requiring recovery of three-dimensional
geometry, taking advantage of the fact that these faces are normally upright and
thus may be described by a small set of two-dimensional characteristic views.
Their experiments show that the eigenface technique can be made to perform at
very high accuracy, although with a substantial "unknown" rejection rate, and it
is thus potentially well suited to these applications. The future scope of this
project was, in addition to recognizing faces, to use eigenface analysis to
determine the gender of the subject and to interpret facial expressions.

“Face recognition using eigenfaces and artificial neural networks” (Mayank


Agarwal, Nikunj Jain, Mr. Manish Kumar and Himanshu Agrawal) [4]: this
paper presents a methodology for face recognition based on an information-theory
approach of coding and decoding the face image. The proposed methodology is a
combination of two stages: feature extraction using principal component
analysis, and recognition using a feed-forward back-propagation neural network.
The algorithm was tested on 400 images (40 classes). A recognition score for the
test lot is calculated by considering almost all the variants of feature
extraction. The proposed method was tested on the Olivetti and Oracle Research
Laboratory (ORL) face database. Test results gave a recognition rate of 97.018%.
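
A minimal MATLAB sketch of that two-stage idea (PCA features followed by a
feed-forward back-propagation network), assuming the Statistics and Deep
Learning toolboxes are available, that faces is a pixels-by-images matrix with
one class label per column in labels, and that the sizes (50 components, 20
hidden neurons) are illustrative choices rather than values from the paper:

[~, score] = pca(double(faces)');        % stage 1: PCA on the vectorized faces
features = score(:, 1:50)';              % keep the first 50 principal components per image
targets = full(ind2vec(labels));         % one-hot class targets
net = feedforwardnet(20);                % stage 2: feed-forward net trained with backpropagation
net = train(net, features, targets);
predicted = vec2ind(net(features));      % predicted class index for each image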

National Status:
1) Face Recognition Techniques - An Evaluation Study
(Department of Management Information Systems, Applied
Science University)

● Face recognition based on Principal Component Analysis: Principal
Component Analysis (PCA) is a well-known algorithm used in face
recognition.
● The basic idea of PCA is to determine a vector of much lower dimension
that best approximates, in some sense, a given data vector.
● Thus, in face recognition it takes an s-dimensional vector representation of
each face in a training set of images as input, and determines a
t-dimensional subspace whose basis vectors correspond maximally to the
original images.
● The dimension of this new subspace is lower than the original one (t << s).
If the original image elements are considered as random variables, then the
principal components correspond to the largest eigenvalues of the correlation
matrix, and error minimization is done in a least-squares sense (Qing Chen,
Xiaoli Yang, Jiying Zhao). A minimal sketch of this projection step is given
after this list.
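
As a minimal MATLAB sketch of that low-dimensional projection (T is a
pixels-by-images training matrix as built by the CreateDatabase module later in
this report; the choice t = 20 is purely illustrative and must not exceed the
number of training images):

t = 20;                                  % illustrative subspace dimension (t << s)
m = mean(T, 2);                          % mean face (s-dimensional)
A = double(T) - m;                       % centered faces (implicit expansion, R2016b or later)
L = A' * A;                              % small PxP surrogate of the covariance matrix
[V, D] = eig(L);
[~, order] = sort(diag(D), 'descend');   % rank eigenvectors by eigenvalue
W = A * V(:, order(1:t));                % basis vectors of the t-dimensional subspace
Y = W' * A;                              % t-dimensional representation of every training face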
2) REAL TIME FACE RECOGNITION USING ADABOOST
IMPROVED PCA ALGORITHM

AUTHORS: K. Susheel Kumar, Shitala Prasad, Vijay Bhaskar Semwal, R C Tripathi

➢ Represent the faces in the database in terms of the vector X. Compute
the average face AvgFace and subtract AvgFace from the vector X.
➢ Classify the images based on the number of unique subjects involved,
so the number of classes, C, will be the number of subjects who have
been imaged.
➢ Compute the scatter matrices. Use PCA to reduce the dimension of the
feature space to N - C. Let the eigenvectors obtained be W(PCA), and
project the scatter matrices onto this basis to obtain the non-singular
scatter matrices S(B) and S(W).
➢ Compute the generalized eigenvectors of the non-singular scatter
matrices so as to satisfy the equation

S(B) * W(LDA) = S(W) * W(LDA) * D

➢ where D is the matrix of eigenvalues. Retain only the C-1 eigenvectors
corresponding to the C-1 largest eigenvalues. This gives the basis
vector W(LDA).
➢ Then the image vector X is projected onto this basis vector and the
weights of the image are computed (a sketch of this step follows below),

where C = number of distinct classes, N = number of images, and
Xi = the face images that belong to class i.
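
A minimal MATLAB sketch of the generalized eigenvector step above, assuming Sb
and Sw hold the projected between-class and within-class scatter matrices, C is
the number of classes, and X_pca holds the PCA-reduced face vectors (all of
these names are illustrative):

[W, D] = eig(Sb, Sw);                    % solves S(B)*W = S(W)*W*D (generalized eigenproblem)
[~, order] = sort(diag(D), 'descend');   % rank directions by their generalized eigenvalue
W_lda = W(:, order(1:C-1));              % keep the C-1 most discriminative directions
weights = W_lda' * X_pca;                % project the PCA-reduced faces onto the LDA basis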

International Status:
1) SPARSE REPRESENTATION THEORY AND ITS APPLICATIONS FOR
FACE RECOGNITION

AUTHORS: Yongjiao Wang, Chuan Wang, and Lei Liang

We use several face databases to verify the performance of
different face recognition methods. We compare face recognition based on sparse
representation (SR) with common methods such as nearest neighbor (NN),
linear support vector machine (SVM), and nearest subspace (NS). In our
experiments, PCA is used to reduce the dimensionality of the original image
vector, and these low-dimensional features are then used as the facial features.
We randomly separate each database into two halves: one half is used as the
dictionary, and the other half as testing samples. After converting the problem,
the optimal solution can be obtained with a standard linear programming method.
Obviously, if we directly use the original high-dimensional images to construct
the training dictionary, the corresponding system of equations is over-determined,
but with a correspondingly high computational complexity. To reduce the
computational complexity and maintain the sparsity of the solution vector, the
original facial image vectors are projected by PCA to obtain low-dimensional
face vectors.
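
A minimal MATLAB sketch of the experimental setup described above (only the
dictionary/test split and the PCA projection; the l1 sparse-coding solver itself
is not shown). Here faces is assumed to be a pixels-by-images matrix, and
d = 100 is an illustrative feature dimension that must be smaller than the
number of dictionary images:

d = 100;                                            % illustrative feature dimension
P = size(faces, 2);
idx = randperm(P);                                  % random split of the database into two halves
dictIdx = idx(1:floor(P/2));
testIdx = idx(floor(P/2)+1:end);
dict = double(faces(:, dictIdx));
mu = mean(dict, 2);
coeff = pca(dict');                                 % PCA basis learned on the dictionary half
project = @(X) coeff(:, 1:d)' * (double(X) - mu);   % low-dimensional PCA projection
D_feat = project(faces(:, dictIdx));                % dictionary columns used by the SR classifier
testFeat = project(faces(:, testIdx));              % features of the testing samples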
Hardware Requirements:

PC

Webcam

High-resolution camera and screen

Software requirements:

MATLAB

Windows 7 or higher, SQL, and Visual Studio

INVENTION DETAILS

Objects of the Invention:

It is very useful in protected zones where many people are entering and exiting.
For example, in a bank, the system takes data from the user at entry, and the
installed cameras take photos, particularly of the users' faces, and combine
them with their user names.

Summary of Invention:

We use OpenCV with Python. It detects the faces, combines them with the IDs
given by names, and stores them in the database. Afterwards, while recognizing
a face, it displays the corresponding name.
Architecture of project:

BLOCK DIAGRAM (image processing pipeline):

START -> CAPTURE IMAGE -> FACE DETECTION AND CROPPING ->
FACE RECOGNITION USING EIGEN VALUES -> STORED RECOGNITION ENTRIES
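
A minimal MATLAB sketch of this pipeline, using the same toolbox functions as
the MAIN module later in this report (the camera adaptor name 'winvideo', the
folder name and the 240x320 crop size are taken from that code; it assumes at
least one face is detected in the snapshot):

vid = videoinput('winvideo', 1);                          % CAPTURE IMAGE from the webcam
img = getsnapshot(vid);
detector = vision.CascadeObjectDetector('FrontalFaceCART');
bbox = step(detector, img);                               % FACE DETECTION
face = imresize(imcrop(img, bbox(1, :)), [240, 320]);     % CROPPING the first detected face
imwrite(face, 'TestDatabase\1.jpg');                      % save the probe image
delete(vid);
% FACE RECOGNITION USING EIGEN VALUES and the STORED RECOGNITION ENTRIES step
% are then performed by CreateDatabase, EigenfaceCore and Recognition, as
% sketched under MODULES above.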
Detailed Description of the Invention:

➢ We use MATLAB; it detects the faces and stores them in the database. Our
duty is to assign the IDs and names for the corresponding images.

➢ During the sign-in process, the system checks whether the captured image is
present in the database.

➢ If it is, the result shows our name along with our image; if not, it displays
"Not an authorized user, please sign up".

CODE:

CREATE DATABASE
function T = CreateDatabase(TrainDatabasePath)
% Align a set of face images (the training set T1, T2, ... , TM)
%
% Description: This function reshapes all 2D images of the training database
%              into 1D column vectors. Then, it puts these 1D column vectors in a row to
%              construct the 2D matrix 'T'.
%
% Argument:    TrainDatabasePath - Path of the training database
%
% Returns:     T                 - A 2D matrix, containing all 1D image vectors.
%                                  Suppose all P images in the training database
%                                  have the same size of MxN. So the length of 1D
%                                  column vectors is MN and 'T' will be a MNxP 2D matrix.
%
% See also: STRCMP, STRCAT, RESHAPE

% Original version by Amir Hossein Omidvarnia, October 2007
% Email: aomidvar@ece.ut.ac.ir

%%%%%%%%%%%%%%%%%%%%%%%% File management

TrainFiles = dir(TrainDatabasePath);
Train_Number = 0;

for i = 1:size(TrainFiles,1)
    if not(strcmp(TrainFiles(i).name,'.') | strcmp(TrainFiles(i).name,'..') | strcmp(TrainFiles(i).name,'Thumbs.db'))
        Train_Number = Train_Number + 1; % Number of all images in the training database
    end
end

%%%%%%%%%%%%%%%%%%%%%%%% Construction of 2D matrix from 1D image vectors
T = [];
for i = 1 : Train_Number

    % I have chosen the name of each image in databases as a corresponding
    % number. However, it is not mandatory!
    str = int2str(i);
    str = strcat(str,'.jpg');
    str = strcat(TrainDatabasePath,str);

    img = imread(str);
    img = rgb2gray(img);

    [irow, icol] = size(img);

    temp = reshape(img',irow*icol,1); % Reshaping 2D images into 1D image vectors
    T = [T temp]; % 'T' grows after each turn
end
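
For example, if TrainDatabase holds six 240x320 face photos named 1.jpg to 6.jpg
(the size at which faces are saved by the MAIN module below), then
T = CreateDatabase(TrainDatabasePath) is a 76800x6 matrix, since each column
stores the 240*320 = 76800 pixels of one image.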

EIGEN FACE CORE
function [m, A, Eigenfaces] = EigenfaceCore(T)
% Use Principal Component Analysis (PCA) to determine the most
% discriminating features between images of faces.
%
% Description: This function gets a 2D matrix, containing all training image vectors,
%              and returns 3 outputs which are extracted from the training database.
%
% Argument:    T          - A 2D matrix, containing all 1D image vectors.
%                           Suppose all P images in the training database
%                           have the same size of MxN. So the length of 1D
%                           column vectors is M*N and 'T' will be a MNxP 2D matrix.
%
% Returns:     m          - (M*Nx1) Mean of the training database
%              Eigenfaces - (M*Nx(P-1)) Eigenvectors of the covariance matrix of the training database
%              A          - (M*NxP) Matrix of centered image vectors
%
% See also: EIG

%%%%%%%%%%%%%%%%%%%%%%%% Calculating the mean image

m = mean(T,2); % Computing the average face image m = (1/P)*sum(Tj's) (j = 1 : P)
Train_Number = size(T,2);

%%%%%%%%%%%%%%%%%%%%%%%% Calculating the deviation of each image from the mean image

A = [];
for i = 1 : Train_Number
    temp = double(T(:,i)) - m; % Computing the difference image for each image in the training set Ai = Ti - m
    A = [A temp]; % Merging all centered images
end

%%%%%%%%%%%%%%%%%%%%%%%% Snapshot method of the Eigenface method

% We know from linear algebra theory that for a PxQ matrix, the maximum
% number of non-zero eigenvalues that the matrix can have is min(P-1,Q-1).
% Since the number of training images (P) is usually less than the number
% of pixels (M*N), the most non-zero eigenvalues that can be found are equal
% to P-1. So we can calculate eigenvalues of A'*A (a PxP matrix) instead of
% A*A' (a M*NxM*N matrix). It is clear that the dimensions of A*A' are much
% larger than those of A'*A. So the dimensionality will decrease.

L = A'*A; % L is the surrogate of covariance matrix C=A*A'.

[V, D] = eig(L); % Diagonal elements of D are the eigenvalues for both L=A'*A and C=A*A'.

%%%%%%%%%%%%%%%%%%%%%%%% Sorting and eliminating eigenvalues

% All eigenvalues of matrix L are sorted and those which are less than a
% specified threshold are eliminated. So the number of non-zero
% eigenvectors may be less than (P-1).

L_eig_vec = [];
for i = 1 : size(V,2)
    if( D(i,i) > 1 )
        L_eig_vec = [L_eig_vec V(:,i)];
    end
end

%%%%%%%%%%%%%%%%%%%%%%%% Calculating the eigenvectors of covariance matrix 'C'

% Eigenvectors of covariance matrix C (or so-called "Eigenfaces")
% can be recovered from L's eigenvectors.
Eigenfaces = A * L_eig_vec; % A: centered image vectors
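
Continuing the example above, calling EigenfaceCore on that 76800x6 matrix T
returns m as the 76800x1 mean face, A as the 76800x6 matrix of centered faces,
and Eigenfaces with at most 5 columns, because eigenvalues of the 6x6 surrogate
matrix L that fall below the threshold of 1 are discarded.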

EIGEN FACE TEST


function [m, A, Eigenfaces] = Eigenfacetest(T)
% Use Principal Component Analysis (PCA) to determine the most
% discriminating features between images of faces.
%
% Description: This function gets a 2D matrix, containing all training image vectors,
%              and returns 3 outputs which are extracted from the training database.
%
% Argument:    T          - A 2D matrix, containing all 1D image vectors.
%                           Suppose all P images in the training database
%                           have the same size of MxN. So the length of 1D
%                           column vectors is M*N and 'T' will be a MNxP 2D matrix.
%
% Returns:     m          - (M*Nx1) Mean of the training database
%              Eigenfaces - (M*Nx(P-1)) Eigenvectors of the covariance matrix of the training database
%              A          - (M*NxP) Matrix of centered image vectors
%
% See also: EIG

% Original version by Amir Hossein Omidvarnia, October 2007
% Email: aomidvar@ece.ut.ac.ir

%%%%%%%%%%%%%%%%%%%%%%%% Calculating the mean image

m = mean(T,2); % Computing the average face image m = (1/P)*sum(Tj's) (j = 1 : P)
Train_Number = size(T,2);

%%%%%%%%%%%%%%%%%%%%%%%% Calculating the deviation of each image from the mean image

A = [];
for i = 1 : Train_Number
    temp = double(T(:,i)) - m; % Computing the difference image for each image in the training set Ai = Ti - m
    A = [A temp]; % Merging all centered images
end

%%%%%%%%%%%%%%%%%%%%%%%% Snapshot method of the Eigenface method

% We know from linear algebra theory that for a PxQ matrix, the maximum
% number of non-zero eigenvalues that the matrix can have is min(P-1,Q-1).
% Since the number of training images (P) is usually less than the number
% of pixels (M*N), the most non-zero eigenvalues that can be found are equal
% to P-1. So we can calculate eigenvalues of A'*A (a PxP matrix) instead of
% A*A' (a M*NxM*N matrix). It is clear that the dimensions of A*A' are much
% larger than those of A'*A. So the dimensionality will decrease.

L = A'*A; % L is the surrogate of covariance matrix C=A*A'.

[V, D] = eig(L); % Diagonal elements of D are the eigenvalues for both L=A'*A and C=A*A'.

%%%%%%%%%%%%%%%%%%%%%%%% Sorting and eliminating eigenvalues

% All eigenvalues of matrix L are sorted and those which are less than a
% specified threshold are eliminated. So the number of non-zero
% eigenvectors may be less than (P-1).

L_eig_vec = [];
for i = 1 : size(V,2)
    if( D(i,i) > 1 )
        L_eig_vec = [L_eig_vec V(:,i)];
    end
end

%%%%%%%%%%%%%%%%%%%%%%%% Calculating the eigenvectors of covariance matrix 'C'

% Eigenvectors of covariance matrix C (or so-called "Eigenfaces")
% can be recovered from L's eigenvectors.
Eigenfaces = A * L_eig_vec; % A: centered image vectors
MAIN
function varargout = MAIN(varargin)
% MAIN MATLAB code for MAIN.fig
% MAIN, by itself, creates a new MAIN or raises the existing
% singleton*.
%
% H = MAIN returns the handle to a new MAIN or the handle to
% the existing singleton*.
%
% MAIN('CALLBACK',hObject,eventData,handles,...) calls the local
% function named CALLBACK in MAIN.M with the given input arguments.
%
% MAIN('Property','Value',...) creates a new MAIN or raises the
% existing singleton*. Starting from the left, property value pairs are
% applied to the GUI before MAIN_OpeningFcn gets called. An
% unrecognized property name or invalid value makes property application
% stop. All inputs are passed to MAIN_OpeningFcn via varargin.
%
% *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
% instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES

% Edit the above text to modify the response to help MAIN

% Last Modified by GUIDE v2.5 29-Oct-2018 10:09:23

% Begin initialization code - DO NOT EDIT


gui_Singleton = 1;
gui_State = struct('gui_Name', mfilename, ...
'gui_Singleton', gui_Singleton, ...
'gui_OpeningFcn', @MAIN_OpeningFcn, ...
'gui_OutputFcn', @MAIN_OutputFcn, ...
'gui_LayoutFcn', [] , ...
'gui_Callback', []);
if nargin && ischar(varargin{1})
gui_State.gui_Callback = str2func(varargin{1});
end

if nargout
[varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT

% --- Executes just before MAIN is made visible.


function MAIN_OpeningFcn(hObject, eventdata, handles, varargin)
% This function has no output args, see OutputFcn.
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% varargin command line arguments to MAIN (see VARARGIN)

% Choose default command line output for MAIN


handles.output = hObject;

% Update handles structure


guidata(hObject, handles);

% UIWAIT makes MAIN wait for user response (see UIRESUME)


% uiwait(handles.figure1);

% --- Outputs from this function are returned to the command line.
function varargout = MAIN_OutputFcn(hObject, eventdata, handles)
% varargout cell array for returning output args (see VARARGOUT);
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Get default command line output from handles structure


varargout{1} = handles.output;

% --------------------------------------------------------------------
function FILE_Callback(hObject, eventdata, handles)
% hObject handle to FILE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function OPEN_Callback(hObject, eventdata, handles)
% hObject handle to OPEN (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
[fn, pn]=uigetfile('*.jpg;*.png');
a=[pn fn];
b=imread(a);
handles.b=b;
axes(handles.axes5);
imshow(b);
guidata(hObject,handles);
% --------------------------------------------------------------------
function Untitled_3_Callback(hObject, eventdata, handles)
% hObject handle to Untitled_3 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function ON_Callback(hObject, eventdata, handles)
% hObject handle to ON (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mkdir('E:\New folder\facerecattendencesystem\facerecattendencesystem\TestDatabase');
axes(handles.axes5);
vid = videoinput('winvideo',1);
handles.vid=vid;
vidRes = get(vid, 'VideoResolution');
nBands = get(vid, 'NumberOfBands');
hImage = image( zeros(vidRes(2), vidRes(1), nBands) );
preview(vid,hImage);
img = getsnapshot(vid);
axes(handles.axes6);
imshow(img);
guidata(hObject,handles);
FDetect=vision.CascadeObjectDetector('FrontalFaceCART');
BB=step(FDetect,img);
axes(handles.axes7);
imshow(img);
hold on
for i=1:size(BB,1)
rectangle('position',BB(i,:),'Linewidth',5,'Linestyle','-','Edgecolor','r');
end
hold off
N=size(BB,1);
handles.N=N;
counter=1;
for i=1:N
face=imcrop(img,BB(i,:));
savenam = strcat('E:\New folder\facerecattendencesystem\facerecattendencesystem\TestDatabase\', num2str(counter), '.jpg'); % this is where and what your image will be saved
baseDir = 'E:\New folder\facerecattendencesystem\facerecattendencesystem\TestDatabase\';
% baseName = 'image_';
newName = [baseDir num2str(counter) '.jpg'];
handles.face=face;
while exist(newName,'file')
counter = counter + 1;
newName = [baseDir num2str(counter) '.jpg'];
end
fac=imresize(face,[240,320]);
imwrite(fac,newName);
axes(eval(['handles.axes', num2str(i)]));
imshow(face);
guidata(hObject,handles);
pause(5);
end

% --------------------------------------------------------------------
function OFF_Callback(hObject, eventdata, handles)
% hObject handle to OFF (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
off=handles.vid;
delete(off);

function edit1_Callback(hObject, eventdata, handles)


% hObject handle to edit1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit1 as text


%        str2double(get(hObject,'String')) returns contents of edit1 as a double

% --- Executes during object creation, after setting all properties.


function edit1_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

% --- Executes on button press in pushbutton2.


function pushbutton2_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton2 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --- Executes on button press in checkbox1.


function checkbox1_Callback(hObject, eventdata, handles)
% hObject handle to checkbox1 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'Value') returns toggle state of checkbox1

% --- Executes on button press in checkbox2.


function checkbox2_Callback(hObject, eventdata, handles)
% hObject handle to checkbox2 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'Value') returns toggle state of checkbox2

% --- Executes on button press in checkbox3.


function checkbox3_Callback(hObject, eventdata, handles)
% hObject handle to checkbox3 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'Value') returns toggle state of checkbox3

% --- Executes on button press in checkbox4.


function checkbox4_Callback(hObject, eventdata, handles)
% hObject handle to checkbox4 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
% Hint: get(hObject,'Value') returns toggle state of checkbox4

% --------------------------------------------------------------------
function EXIT_Callback(hObject, eventdata, handles)
% hObject handle to EXIT (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
close;

% --- Executes on button press in RECOGNITION.


function RECOGNITION_Callback(hObject, eventdata, handles)
% hObject handle to RECOGNITION (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
rec=handles.N;
TrainDatabasePath = 'E:\New folder\facerecattendencesystem\facerecattendencesystem\TrainDatabase\';
TestDatabasePath = 'E:\New folder\facerecattendencesystem\facerecattendencesystem\TestDatabase\';
v=rec;
for j = 1:v
TestImage = num2str(j);
s=strcat('a',TestImage);
TestImage = strcat(TestDatabasePath,'\',char( TestImage),'.jpg');
T = CreateDatabase(TrainDatabasePath);
[m, A, Eigenfaces] = EigenfaceCore(T);
[OutputName,Recognized_index] = Recognition(TestImage, m, A, Eigenfaces);
SelectedImage = strcat(TrainDatabasePath,'\', OutputName);
SelectedImage = imread(SelectedImage);
axes(eval(['handles.axes', num2str(s)]));
imshow(SelectedImage);
switch Recognized_index
case 1
strmsg1 = 'The Person is Recognised. ';
msg = [strmsg1 ' '];
msgbox(msg);
sd=strcat('D',num2str(j));
% se=strcat('E',num2str(j));
% dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% dt=char(dt);
% xlswrite('java.xlsx',dt,'Sheet1',se);
xlswrite('java.xlsx','1','Sheet1',sd);

case 2
strmsg1 = 'The Person is Recognised. ';
msg = [strmsg1 ' '];
msgbox(msg);
sd=strcat('D',num2str(j));
% se=strcat('E',num2str(j));
% dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% dt=char(dt);
% xlswrite('java.xlsx',dt,'Sheet1',se);
xlswrite('java.xlsx','2','Sheet1',sd);

case 3
strmsg1 = 'The Person is Recognised.';
msg = [strmsg1 ' '];
msgbox(msg);
sd=strcat('D',num2str(j));
% se=strcat('E',num2str(j));
% dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% dt=char(dt);
% xlswrite('java.xlsx',dt,'Sheet1',se);
xlswrite('java.xlsx','3','Sheet1',sd);

case 4
strmsg1 = 'The Person is Recognised. ';
msg = [strmsg1 ' '];
msgbox(msg);
sd=strcat('D',num2str(j));
% se=strcat('E',num2str(j));
% dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% dt=char(dt);
% xlswrite('java.xlsx',dt,'Sheet1',se);
xlswrite('java.xlsx','4','Sheet1',sd);

case 5
strmsg1 = 'The Person is Recognised. ';
msg = [strmsg1 ' '];
msgbox(msg);
sd=strcat('D',num2str(j));
% se=strcat('E',num2str(j));
% dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% dt=char(dt);
% xlswrite('java.xlsx',dt,'Sheet1',se);
xlswrite('java.xlsx','5','Sheet1',sd);

case 6
strmsg1 = 'The Person is Recognised.';
msg = [strmsg1 ' '];
msgbox(msg);
sd=strcat('D',num2str(j));
% se=strcat('E',num2str(j));
% dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% dt=char(dt);
% xlswrite('java.xlsx',dt,'Sheet1',se);
xlswrite('java.xlsx','6','Sheet1',sd);

% case 7
% strmsg1 = 'The recognised person is ';
% msg = [strmsg1 'SOORAJ'];
% msgbox(msg);
% sd=strcat('D',num2str(j));
% % se=strcat('E',num2str(j));
% % dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% % dt=char(dt);
% % xlswrite('java.xlsx',dt,'Sheet1',se);
% xlswrite('java.xlsx','1','Sheet1',sd);
% case 8
% strmsg1 = 'The recognised person is ';
% msg = [strmsg1 'SOORAJ'];
% msgbox(msg);
% sd=strcat('D',num2str(j));
% % se=strcat('E',num2str(j));
% % dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% % dt=char(dt);
% % xlswrite('java.xlsx',dt,'Sheet1',se);
% xlswrite('java.xlsx','1','Sheet1',sd);
%
% case 9
% strmsg1 = 'The recognised person is ';
% msg = [strmsg1 'SOORAJ'];
% msgbox(msg);
% sd=strcat('D',num2str(j));
% % se=strcat('E',num2str(j));
% % dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% % dt=char(dt);
% % xlswrite('java.xlsx',dt,'Sheet1',se);
% xlswrite('java.xlsx','1','Sheet1',sd);
% case 10
% strmsg1 = 'The recognised person is ';
% msg = [strmsg1 'ramya'];
% msgbox(msg);
% sd=strcat('D',num2str(j));
% % se=strcat('E',num2str(j));
% % dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% % dt=char(dt);
% % xlswrite('java.xlsx',dt,'Sheet1',se);
% xlswrite('java.xlsx','1','Sheet1',sd);
%
% case 11
% strmsg1 = 'The recognised person is ';
% msg = [strmsg1 'ramya'];
% msgbox(msg);
% sd=strcat('D',num2str(j));
% % se=strcat('E',num2str(j));
% % dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% % dt=char(dt);
% % xlswrite('java.xlsx',dt,'Sheet1',se);
% xlswrite('java.xlsx','1','Sheet1',sd);
%
% case 12
% strmsg1 = 'The recognised person is ';
% msg = [strmsg1 'ramya'];
% msgbox(msg);
% sd=strcat('D',num2str(j));
% % se=strcat('E',num2str(j));
% % dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% % dt=char(dt);
% % xlswrite('java.xlsx',dt,'Sheet1',se);
% xlswrite('java.xlsx','1','Sheet1',sd);
% case 13
% strmsg1 = 'The recognised person is ';
% msg = [strmsg1 'shyam'];
% msgbox(msg);
% sd=strcat('D',num2str(j));
% % se=strcat('E',num2str(j));
% % dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% % dt=char(dt);
% % xlswrite('java.xlsx',dt,'Sheet1',se);
% xlswrite('java.xlsx','1','Sheet1',sd);
%
% case 14
% strmsg1 = 'The recognised person is ';
% msg = [strmsg1 'shyam'];
% msgbox(msg);
% sd=strcat('D',num2str(j));
% % se=strcat('E',num2str(j));
% % dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% % dt=char(dt);
% % xlswrite('java.xlsx',dt,'Sheet1',se);
% xlswrite('java.xlsx','1','Sheet1',sd);
%
% case 15
% strmsg1 = 'The recognised person is ';
% msg = [strmsg1 'shyam'];
% msgbox(msg);
% sd=strcat('D',num2str(j));
% % se=strcat('E',num2str(j));
% % dt = datestr(now,'mmmm dd, yyyy HH:MM:SS.FFF AM');
% % dt=char(dt);
% % xlswrite('java.xlsx',dt,'Sheet1',se);
% xlswrite('java.xlsx','1','Sheet1',sd);
end
end

% --- Executes on button press in checkbox5.


function checkbox5_Callback(hObject, eventdata, handles)
% hObject handle to checkbox5 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'Value') returns toggle state of checkbox5

% --- Executes on button press in checkbox6.


function checkbox6_Callback(hObject, eventdata, handles)
% hObject handle to checkbox6 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'Value') returns toggle state of checkbox6

% --- Executes on button press in checkbox7.


function checkbox7_Callback(hObject, eventdata, handles)
% hObject handle to checkbox7 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'Value') returns toggle state of checkbox7

% --- Executes on button press in checkbox8.


function checkbox8_Callback(hObject, eventdata, handles)
% hObject handle to checkbox8 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hint: get(hObject,'Value') returns toggle state of checkbox8

% --- Executes on button press in VIEWPROFILE.


function VIEWPROFILE_Callback(hObject, eventdata, handles)
% hObject handle to VIEWPROFILE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function TESTON_Callback(hObject, eventdata, handles)
% hObject handle to TESTON (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
axes(handles.axes5);
vid = videoinput('winvideo',1);
handles.vid=vid;
vidRes = get(vid, 'VideoResolution');
nBands = get(vid, 'NumberOfBands');
hImage = image( zeros(vidRes(2), vidRes(1), nBands) );
preview(vid,hImage);
guidata(hObject,handles);

% --------------------------------------------------------------------
function TESTOFF_Callback(hObject, eventdata, handles)
% hObject handle to TESTOFF (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
off=handles.vid;
delete(off);

% --------------------------------------------------------------------
function PREVIEW_Callback(hObject, eventdata, handles)
% hObject handle to PREVIEW (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function ADDPHOTO_Callback(hObject, eventdata, handles)
% hObject handle to ADDPHOTO (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% --------------------------------------------------------------------
function ADD_Callback(hObject, eventdata, handles)
% hObject handle to ADD (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
mkdir('E:\New folder\TrainDatabase');
for ind=1:3
axes(handles.axes5);
vid = videoinput('winvideo',1);
handles.vid=vid;
vidRes = get(vid, 'VideoResolution');
nBands = get(vid, 'NumberOfBands');
hImage = image( zeros(vidRes(2), vidRes(1), nBands) );
preview(vid,hImage);
img = getsnapshot(vid);
axes(handles.axes6);
imshow(img);
guidata(hObject,handles);
FDetect=vision.CascadeObjectDetector('FrontalFaceCART');
%htextinsface = vision.TextInserter('Text', 'face : %2d', 'Location', [5 2],'Font', 'Courier New','FontSize', 14);
BB=step(FDetect,img);
axes(handles.axes7);
imshow(img);
hold on
for i=1:size(BB,1)
rectangle('position',BB(i,:),'Linewidth',5,'Linestyle','-','Edgecolor','r');
end
hold off
N=size(BB,1);
handles.N=N;
counter=1;
for i=1:N
face=imcrop(img,BB(i,:));
savenam = strcat('E:\New folder\facerecattendencesystem\facerecattendencesystem\TrainDatabase\', num2str(counter), '.jpg'); % this is where and what your image will be saved
baseDir = 'E:\New folder\facerecattendencesystem\facerecattendencesystem\TrainDatabase\';
% baseName = 'image_';
newName = [baseDir num2str(counter) '.jpg'];
handles.face=face;
while exist(newName,'file')
counter = counter + 1;
newName = [baseDir num2str(counter) '.jpg'];
end
fac=imresize(face,[240,320]);
imwrite(fac,newName);
%axes(handles.axes14);
axes(eval(['handles.axes', num2str(i)]));
imshow(face);
guidata(hObject,handles);
pause(2);
end
delete(vid);
end

% --------------------------------------------------------------------
function REMOVE_Callback(hObject, eventdata, handles)
% hObject handle to REMOVE (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

rmdir('E:\New folder\facerecattendencesystem\facerecattendencesystem\TestDatabase','s')

% --- Executes on button press in pushbutton4.


function pushbutton4_Callback(hObject, eventdata, handles)
% hObject handle to pushbutton4 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
open('C:\Windows\System32\DriverStore\FileRepository\bth.inf_amd64_neutral_a1e8f56d586ec10b\fsquirt.exe');

function edit2_Callback(hObject, eventdata, handles)


% hObject handle to edit2 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)

% Hints: get(hObject,'String') returns contents of edit2 as text


%        str2double(get(hObject,'String')) returns contents of edit2 as a double

% --- Executes during object creation, after setting all properties.


function edit2_CreateFcn(hObject, eventdata, handles)
% hObject handle to edit2 (see GCBO)
% eventdata reserved - to be defined in a future version of MATLAB
% handles empty - handles not created until after all CreateFcns called

% Hint: edit controls usually have a white background on Windows.


% See ISPC and COMPUTER.
if ispc && isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
set(hObject,'BackgroundColor','white');
end

RECOGNITION
function [OutputName, Recognized_index] = Recognition(TestImage, m, A, Eigenfaces)
% Recognizing step....
%
% Description: This function compares two faces by projecting the images into facespace and
%              measuring the Euclidean distance between them.
%
% Argument:    TestImage  - Path of the input test image
%
%              m          - (M*Nx1) Mean of the training
%                           database, which is output of 'EigenfaceCore' function.
%
%              Eigenfaces - (M*Nx(P-1)) Eigenvectors of the
%                           covariance matrix of the training
%                           database, which is output of 'EigenfaceCore' function.
%
%              A          - (M*NxP) Matrix of centered image
%                           vectors, which is output of 'EigenfaceCore' function.
%
% Returns:     OutputName - Name of the recognized image in the training database.
%
% See also: RESHAPE, STRCAT

%%%%%%%%%%%%%%%%%%%%%%%% Projecting centered image vectors into facespace

% All centered images are projected into facespace by multiplying by the
% Eigenface basis. The projected vector of each face will be its corresponding
% feature vector.

ProjectedImages = [];
Train_Number = size(Eigenfaces,2);
for i = 1 : Train_Number
    temp = Eigenfaces'*A(:,i); % Projection of centered images into facespace
    ProjectedImages = [ProjectedImages temp];
end

%%%%%%%%%%%%%%%%%%%%%%%% Extracting the PCA features from the test image

InputImage = imread(TestImage);
temp = InputImage(:,:,1);

[irow, icol] = size(temp);

InImage = reshape(temp',irow*icol,1);
Difference = double(InImage)-m; % Centered test image
ProjectedTestImage = Eigenfaces'*Difference; % Test image feature vector

%%%%%%%%%%%%%%%%%%%%%%%% Calculating Euclidean distances

% Euclidean distances between the projected test image and the projections
% of all centered training images are calculated. The test image is
% supposed to have the minimum distance from its corresponding image in the
% training database.

Euc_dist = [];
for i = 1 : Train_Number
    q = ProjectedImages(:,i);
    temp = ( norm( ProjectedTestImage - q ) )^2;
    Euc_dist = [Euc_dist temp];
end

[Euc_dist_min, Recognized_index] = min(Euc_dist);

OutputName = strcat(int2str(Recognized_index),'.jpg');
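
For example, if the probe face saved as 1.jpg in TestDatabase is closest in
facespace to training image 3.jpg, Recognition returns OutputName = '3.jpg' and
Recognized_index = 3, and the RECOGNITION callback above then writes '3' into
cell D1 of java.xlsx to mark that student as present.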
Train database images:
Test database images:
OUTPUT
