
US 20190280869 A1

(19) United States
(12) Patent Application Publication (Streit)
(10) Pub. No.: US 2019/0280869 A1
(43) Pub. Date: Sep. 12, 2019

(54) SYSTEMS AND METHODS FOR PRIVACY-ENABLED BIOMETRIC PROCESSING

(71) Applicant: Open Inference Holdings LLC, Rockville, MD (US)

(72) Inventor: Scott Edward Streit, Woodbine, MD (US)

(73) Assignee: Open Inference Holdings LLC, Rockville, MD (US)

(21) Appl. No.: 15/914,942

(22) Filed: Mar. 7, 2018

Publication Classification

(51) Int. Cl.: H04L 9/32 (2006.01); G06F 21/32 (2006.01); H04L 9/00 (2006.01); G06N 3/08 (2006.01)

(52) U.S. Cl.: CPC .... H04L 9/3231 (2013.01); G06N 3/08 (2013.01); H04L 9/008 (2013.01); G06F 21/32 (2013.01)

(57) ABSTRACT

In one embodiment, a set of feature vectors can be derived from any biometric data, and then using a deep neural network ("DNN") on those one-way homomorphic encryptions (i.e., each biometric's feature vector) can determine matches or execute searches on encrypted data. Each biometric's feature vector can then be stored and/or used in conjunction with respective classifications, for use in subsequent comparisons without fear of compromising the original biometric data. In various embodiments, the original biometric data is discarded responsive to generating the encrypted values. In another embodiment, the homomorphic encryption enables computations and comparisons on ciphertext without decryption. This improves security over conventional approaches. Searching biometrics in the clear on any system represents a significant security vulnerability. In various examples described herein, only the one-way encrypted biometric data is available on a given device. Various embodiments restrict execution to occur on encrypted biometrics for any matching or searching.

[Representative drawing: privacy-enabled biometric processing component including a training generation component, a feature vector component, and a classifier component.]
[FIG. 1 (Sheet 1 of 13): enrollment process flow with steps to acquire an unencrypted biometric (102), generate training biometrics, apply a neural network (108), capture feature vectors (110), classify feature vectors from the acquired and training biometrics, discard the unencrypted data, and store the feature vectors and classification.]

[FIG. 2A (Sheet 2 of 13): authentication process flow with steps to acquire a biometric (202), pre-process the biometric (204), generate feature vectors, and classify the biometric information.]

[FIG. 2B (Sheet 3 of 13): one-to-many matching process flow with steps to acquire feature vectors, determine match information, vote on a match, and retrain the model.]

[FIG. 3 (Sheet 4 of 13): block diagram of a privacy-enabled biometric system (302) including a biometric processing component (306) with a training generation component, a feature vector component, and a classifier component.]
[FIG. 4A (Sheet 5 of 13): fully connected classification network with an input layer accepting a feature embedding (e.g., 128 dimensions), a first fully connected hidden layer (e.g., 500 dimensions), a second fully connected hidden layer (2x the number of classes in dimensions), and a fully connected output layer.]

[FIG. 4B (Sheet 6 of 13): training phase of the classification network with the same layer structure, an output layer of at least the number of classes, and output values scored with a hinge loss against one-hot encoded true labels (human labeling).]

[FIG. 4C (Sheet 7 of 13): prediction phase with an input layer, two fully connected hidden layers, and an output layer whose example output values resolve to PERSON 1.]

[FIG. 4D (Sheet 8 of 13): prediction phase with an input layer, two fully connected hidden layers, and an output layer whose example output values resolve to UNKNOWN PERSON.]
[FIG. 5A (Sheet 9 of 13): example model run output showing job status and timing (about 1 second, 10 seconds, and 12 seconds total) and example softmax output values for a test image against a generic model.]

[FIG. 5B (Sheet 10 of 13): example activation visualization showing the data shape, mean, and standard deviation of layer outputs.]

[FIG. 5C (Sheet 11 of 13): example activation visualization with data shape [64 126], mean 1.90067, and standard deviation 35.91.]

[FIG. 5D (Sheet 12 of 13): example activation visualizations, one with data shape (256 ...), mean 2.09114, and standard deviation 2.89049, and a softmax output with data shape (15 ...), mean 0.0666667, and standard deviation 0.249444.]
[FIG. 6 (Sheet 13 of 13): block diagram of a special purpose computer system 600 with a processor 610, memory 620, and non-volatile storage 630.]
SYSTEMS AND METHODS FOR PRIVACY-ENABLED BIOMETRIC PROCESSING

COPYRIGHT NOTICE

[0001] A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.

BACKGROUND

[0002] Biometrics offer the opportunity for identity assurance and identity validation. Many conventional uses for biometrics currently exist for identity and validation. These conventional approaches suffer from many flaws. For example, the IPHONE facial recognition service limits implementation to a one to one match. This limitation is due to the inability to perform one to many searching on the biometric, let alone on a secure encrypted biometric. In fact, most conventional approaches search or match biometrics using unencrypted information, and attempt to perform the search in secure computing spaces to avoid compromise of the biometrics.

SUMMARY

[0003] It is realized that there is a need for a solution that provides one to many searching, and that provides for operations on encrypted biometric information. There is a further need to establish such searches that accomplish one to many matching in polynomial time. Various embodiments of the privacy-enabled biometric system provide for scanning of multiple biometrics to determine matches or closeness. Further embodiments can provide for search and matching across multiple types of encrypted biometric information, improving accuracy of validation over many conventional approaches, while improving the security over the same approaches.

[0004] According to another aspect, conventional approaches are significantly burdened not only by biometric data that is to be searched in the clear but also by the key management overhead that is needed for securing those biometrics in storage. Using APPLE as an example, a secure enclave is provided on the IPHONE with encryption keys only available to the secure enclave, such that facial biometrics never leave a respective device or the secure enclave. Various embodiments described herein completely change this paradigm by fully encrypting the reference biometric, and executing comparisons on the encrypted biometrics (e.g., encrypted feature vectors of the biometric).

[0005] According to one embodiment, a set of feature vectors can be derived from any biometric data, and then, using a deep neural network ("DNN") on those one-way homomorphic encryptions (i.e., each biometric's feature vector), a system can determine matches or execute searches on the encrypted data. Each biometric's feature vector can then be stored and/or used in conjunction with respective classifications, for use in subsequent comparisons without fear of compromising the original biometric data. In various embodiments, the original biometric data is discarded responsive to generating the encrypted values.

[0006] According to one embodiment, the homomorphic encryption enables computations and comparisons on ciphertext without decryption. This improves security over conventional approaches. Searching biometrics in the clear on any system represents a significant security vulnerability. In various examples described herein, only the one-way encrypted biometric data is available on a given device. Various embodiments restrict execution to occur on encrypted biometrics for any matching or searching.

[0007] According to another aspect, encrypted search can be executed on the system in polynomial time, even in a one to many use case. This feature enables scalability that conventional systems cannot perform and enables security/privacy unavailable in many conventional approaches.

[0008] According to one aspect, a privacy-enabled biometric system is provided. The system comprises at least one processor operatively connected to a memory; a classification component executed by the at least one processor, comprising a classification network having a deep neural network ("DNN") configured to classify feature vector inputs during training and return a label for person identification or an unknown result during prediction; and the classification component is further configured to accept as an input feature vectors that are Euclidean measurable and return the unknown result or the label as output.

[0009] According to one embodiment, a set of biometric feature vectors is used for training in the DNN neural network for subsequent prediction. According to one embodiment, biometrics are morphed a finite number of times to create additional biometrics for training of the second (classification) neural network. The second neural network is loaded with the label and a finite number of feature vectors based on an input biometric. According to one embodiment, the classification component is configured to accept or extract from another neural network Euclidean measurable feature vectors. According to one embodiment, the another neural network comprises a pre-trained neural network. According to one embodiment, this network takes in a plaintext biometric and returns a Euclidean measurable feature vector that represents a one-way encrypted biometric. According to one embodiment, the classification neural network comprises a classification based deep neural network configured for dynamic training with label and feature vector input pairs for training. According to one embodiment, a feature vector is input for prediction.

[0010] According to one embodiment, the system further comprises a preprocessing component configured to validate plaintext biometric input. According to one embodiment, only valid images are used for subsequent training after the preprocessing. According to one embodiment, the classification component is configured with a plurality of modes of execution, including an enrollment mode configured to accept, as input, a label and feature vectors on which to train the classification network for subsequent prediction. According to one embodiment, the classification component is configured to predict a match, based on a feature vector as input, to an existing label or to return an unknown result. According to one embodiment, the classification component is configured to incrementally update an existing model, maintaining the network architecture and accommodating the unknown result for subsequent predictions. According to one embodiment, the system is configured to analyze the output values and, based on their position and the values, determine the label or unknown.
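For illustration only, the following sketch shows the general pattern described in paragraphs [0005] and [0006]: a pre-trained embedding network produces Euclidean measurable feature vectors that act as one-way encryptions, matching is performed only on those vectors, and the plaintext biometric can be discarded. The embed() helper, the gallery structure, and the distance threshold are assumptions made for the sketch, not details taken from the disclosure.

```python
# Minimal sketch (not the patent's reference implementation): treat a
# pretrained embedding network's output as the one-way, Euclidean-measurable
# "encryption" and match only on those vectors.
import numpy as np

def embed(image: np.ndarray) -> np.ndarray:
    """Placeholder for a pretrained network (e.g., a FaceNet-style model)
    that maps a biometric to a fixed-length vector."""
    raise NotImplementedError

def enroll(gallery: dict, label: str, images: list) -> None:
    # Store only feature vectors; the plaintext images can then be discarded.
    gallery[label] = [embed(img) for img in images]

def identify(gallery: dict, probe_image: np.ndarray, threshold: float = 0.9):
    probe = embed(probe_image)                 # one-way encrypted representation
    best_label, best_dist = None, float("inf")
    for label, vectors in gallery.items():
        for v in vectors:
            d = np.linalg.norm(probe - v)      # Euclidean comparison on vectors only
            if d < best_dist:
                best_label, best_dist = label, d
    # The threshold is illustrative; below it we report a match, otherwise unknown.
    return best_label if best_dist <= threshold else "UNKNOWN"
```

Because only the vectors are retained, a compromise of the stored gallery does not expose the original biometrics.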
[0011] According to one embodiment, the classification network further comprises an input layer for accepting feature vectors of a number of dimensions, the input layer having a number of classes at least equal to the number of dimensions of the feature vector input, first and second hidden layers, and an output layer that generates an array of values. According to one embodiment, the fully connected neural network further comprises an input layer for accepting feature vectors of a number of dimensions, the input layer having a number of nodes at least equal to the number of dimensions of the feature vector input, a first hidden layer of at least 500 dimensions, a second hidden layer of at least twice the number of input dimensions, and an output layer that generates an array of values that, based on their position and the values, determine the label or unknown. According to one embodiment, a set of biometric feature vectors is used for training the DNN neural network for subsequent prediction.

[0012] According to one aspect, a computer implemented method for executing privacy-enabled biometric training is provided. The method comprises instantiating, by at least one processor, a classification component comprising a classification network having a deep neural network ("DNN") configured to classify feature vector inputs during training and return a label for person identification or an unknown result during prediction; accepting, by the classification component, as an input, feature vectors that are Euclidean measurable and a label for training the classification network, and Euclidean measurable feature vectors for prediction functions with the classification network; and classifying, by a classification component executed on at least one processor, the feature vector inputs and the label during training.

[0013] According to one embodiment, the method further comprises accepting or extracting, by the classification component, from another neural network the Euclidean measurable feature vectors. According to one embodiment, the another neural network comprises a pre-trained neural network. According to one embodiment, the classification neural network comprises a classification based deep neural network configured for dynamic training with label and feature vector input pairs. According to one embodiment, the method further comprises an act of validating input biometrics used to generate a feature vector. According to one embodiment, the method further comprises an act of triggering a respective one of a plurality of modes of operation, including an enrollment mode configured to accept a label and feature vectors for an individual. According to one embodiment, the method further comprises an act of predicting a match to an existing label or returning an unknown result responsive to accepting a biometric feature vector as input.

[0014] According to one embodiment, the method further comprises an act of updating the classification network with respective vectors for use in subsequent predictions. To handle the case of a person's looks changing over time, the input for prediction may be used to re-train the individual. According to one embodiment, the method further comprises an act of updating, incrementally, an existing node in the classification network and maintaining the network architecture to accommodate the feature vector for subsequent predictions. According to one embodiment, the classification network further comprises an input layer for accepting feature vectors of a number of dimensions, the input layer having a number of nodes at least equal to the number of dimensions of the feature vector input, a first and second hidden layer, and an output layer that generates an array of values.

[0015] According to one aspect, a non-transitory computer readable medium containing instructions that when executed by at least one processor cause a computer system to execute a method for executing privacy-enabled biometric analysis is provided. The method comprises instantiating a classification component comprising a classification network having a deep neural network ("DNN") configured to classify feature vector and label inputs during training and return a label for person identification or an unknown result during prediction; accepting, by the classification component, as an input, feature vectors that are Euclidean measurable and a label for training the classification network, and Euclidean measurable feature vectors for prediction functions with the classification network; and classifying, by a classification component executed on at least one processor, the feature vector inputs and the label during training.

[0016] According to one embodiment, the method further comprises an act of accepting or extracting, by the classification component, from another neural network Euclidean measurable feature vectors. According to one embodiment, the another neural network comprises a pre-trained neural network. According to various embodiments, the computer readable medium contains instructions to perform any of the method steps above, individually, in combination, or in any combination.

[0017] According to one aspect, a privacy-enabled biometric system is provided. The system comprises a classification means comprising a classifying deep neural network ("DNN") executed by at least one processor, the FCNN configured to: classify feature vector inputs and return a label for person identification or an unknown result as a prediction; and accept as an input feature vectors that are Euclidean measurable and a label as an instance of training.

[0018] According to one aspect, a privacy-enabled biometric system is provided. The system comprises at least one processor operatively connected to a memory; a classification component executed by the at least one processor, including a classification network having a deep neural network ("DNN") configured to classify feature vector inputs during training and return a label for person identification or an unknown result during prediction, wherein the classification component is further configured to accept as an input feature vectors that are Euclidean measurable; and a feature vector generation component comprising a pre-trained neural network configured to generate Euclidean measurable feature vectors as an output of at least one layer in the neural network responsive to input of an unencrypted biometric input.

[0019] According to one embodiment, the classification component is further configured to accept one way homomorphic, Euclidean measurable vectors, and labels for person identification as input for training. According to one embodiment, the classification component is configured to accept or extract from the pre-trained neural network the feature vectors. According to one embodiment, the pre-trained neural network includes an output generation layer which provides Euclidean measurable feature vectors. According to one embodiment, the classification network comprises a deep neural network suitable for training and, for prediction, output of a list of values allowing the selection of labels or unknown as output. According to one embodiment, the pre-trained network generates feature vectors on a first biometric type (e.g., image, voice, health data, iris, etc.); and the classification component is further configured to accept feature vectors from another neural network that generates Euclidean measurable feature vectors on another biometric type.
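As one concrete, assumed reading of the classification network layout recited in paragraph [0011] and shown in FIGS. 4A and 4B, the sketch below builds a fully connected network whose input width equals the feature-vector dimensionality, with a first hidden layer of at least 500 nodes, a second hidden layer of at least twice the input width, and one output node per enrolled label. The exact sizes and the use of PyTorch are illustrative choices, not specified by the disclosure.

```python
# Sketch of a fully connected classification network per one reading of [0011].
import torch
import torch.nn as nn

class ClassificationNetwork(nn.Module):
    def __init__(self, embedding_dim: int = 128, num_classes: int = 100):
        super().__init__()
        self.model = nn.Sequential(
            nn.Linear(embedding_dim, 500),              # hidden layer 1 (>= 500 nodes)
            nn.ReLU(),
            nn.Linear(500, 2 * embedding_dim),          # hidden layer 2 (>= 2x input dims)
            nn.ReLU(),
            nn.Linear(2 * embedding_dim, num_classes),  # one output value per label
        )

    def forward(self, feature_vector: torch.Tensor) -> torch.Tensor:
        # The output is an array of values; their positions and magnitudes
        # determine the predicted label or an "unknown" result.
        return self.model(feature_vector)

# Usage example: scores = ClassificationNetwork()(torch.randn(1, 128))
```

The array of output values is what the classification component inspects, by position and value, to return a label or an unknown result.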
[0020] According to one embodiment, the system is configured to instantiate multiple classification networks, each associated with at least one different biometric type relative to another classification network, and classify input feature vectors based on executing at least a first or second classification network. According to one embodiment, the system is configured to execute a voting procedure to increase accuracy of identification based on multiple biometric inputs or multiple types of biometric input. According to one embodiment, the system is configured to maintain at least an executing copy of the classifying network and an updatable copy of the classification network that can be locked or put in an offline state to enable retraining operations while the executing copy of the classifying network handles any classification requests. According to one embodiment, the classification component is configured with a plurality of modes of execution, including an enrollment mode configured to accept a label for identification and the input feature vectors for an individual from the feature vector generation component.

[0021] According to one embodiment, the classification component is configured to predict a match to an existing label or to return an unknown result based on feature vectors enrolled in the classification network. According to one embodiment, the classification component is configured to incrementally update an existing node in the neural network, maintaining the network architecture and accommodating the unknown result for subsequent predictions. According to one embodiment, the classification network further comprises an input layer for accepting feature vectors of a number of dimensions, the input layer having a number of nodes at least equal to the number of dimensions of the feature vector input, a first hidden layer, a second hidden layer, and an output layer that generates an array of values that, based on their position and the values, determine the label or unknown. According to one embodiment, the classification network further comprises a plurality of layers including two hidden layers and an output layer having a number of nodes at least equal to the number of dimensions of the feature vector input.

[0022] According to one aspect, a computer implemented method for executing privacy-enabled biometric analysis is provided. The method comprises instantiating, by at least one processor, a classification component comprising a deep neural network ("DNN") configured to classify feature vector inputs during training and return a label for person identification or an unknown result during prediction, and a feature vector generation component comprising a pre-trained neural network; generating, by the feature vector generation component, Euclidean measurable feature vectors as an output of at least one layer in the pre-trained neural network responsive to input of an unencrypted biometric input; accepting, by the classification component, as an input, feature vectors that are Euclidean measurable generated by the feature vector generation component and a label for training the classification network, and Euclidean measurable feature vectors for prediction functions with the classification network; and classifying, by a classification component executed on at least one processor, the feature vector inputs and the label during training.

[0023] According to one embodiment, the method further comprises accepting or extracting, by the classification network, the Euclidean measurable feature vectors from the pre-trained neural network. According to one embodiment, the second neural network comprises a pre-trained neural network. According to one embodiment, the method further comprises an act of validating input feature vectors as Euclidean measurable. According to one embodiment, the method further comprises generating, by the classification component, feature vectors on a first biometric type (e.g., image, voice, health data, iris, etc.); and accepting, by the classification component, feature vectors from another neural network that generates Euclidean measurable feature vectors on a second biometric type.

[0024] According to one embodiment, the method further comprises: instantiating multiple classification networks, each associated with at least one different biometric type relative to another classification network, and classifying input feature vectors based on applying at least a first or second classification network. According to one embodiment, the method further comprises executing a voting procedure to increase accuracy of identification based on multiple biometric inputs or multiple types of biometric input and respective classifications. According to one embodiment, for a biometric to be considered a match, it must receive a plurality of votes based on a plurality of biometrics. According to one embodiment, the method further comprises instantiating multiple copies of the classification network to enable at least an executing copy of the classification network, and an updatable classification network that can be locked or put in an offline state to enable retraining operations while the executing copy of the classification network handles any classification requests. According to one embodiment, the method further comprises predicting a match to an existing label or returning an unknown result based, at least in part, on feature vectors enrolled in the classification network. According to one embodiment, the method further comprises updating, incrementally, an existing model in the classification network, maintaining the network architecture and accommodating the unknown result for subsequent predictions.

[0025] According to one aspect, a non-transitory computer readable medium containing instructions that when executed by at least one processor cause a computer system to execute a method for executing privacy-enabled biometric analysis is provided. The method comprises instantiating a classification component comprising a deep neural network ("DNN") configured to classify feature vector and label inputs during training and return a label for person identification or an unknown result during prediction, and a feature vector generation component comprising a pre-trained neural network; generating, by the feature vector generation component, Euclidean measurable feature vectors as an output of at least one layer in the pre-trained neural network responsive to input of an unencrypted biometric input; accepting, by the classification component, as an input, feature vectors that are Euclidean measurable generated by the feature vector generation component and a label for training the classification network, and Euclidean measurable feature vectors for prediction functions with the classification network; and classifying, by a classification component executed on at least one processor, the feature vector inputs and the label during training. According to various embodiments, the computer readable medium contains instructions to perform any of the method steps above, individually, in combination, or in any combination.
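The voting procedure of paragraphs [0020] and [0024] is left open in the disclosure; the following minimal sketch assumes one possible mechanics: each biometric type is classified by its own network, and an identity is returned only if it receives a plurality of the votes.

```python
# Illustrative sketch (an assumption, not the patent's implementation) of
# voting across per-modality classification results.
from collections import Counter

def vote_on_identity(predictions, min_votes: int = 2) -> str:
    """predictions: one predicted label (or 'UNKNOWN') per biometric modality."""
    counts = Counter(p for p in predictions if p != "UNKNOWN")
    if not counts:
        return "UNKNOWN"
    label, votes = counts.most_common(1)[0]
    return label if votes >= min_votes else "UNKNOWN"

# Example: face and voice networks agree, iris network abstains.
print(vote_on_identity(["alice", "alice", "UNKNOWN"]))  # -> "alice"
```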
[0026] According to one aspect, a privacy-enabled biometric system is provided. The system comprises a feature vector generation means comprising a pre-trained neural network configured to generate Euclidean measurable feature vectors responsive to an unencrypted biometric input; and a classification means comprising a deep neural network ("DNN") configured to: classify feature vector and label inputs and return a label for person identification or an unknown result for training; and accept feature vectors that are Euclidean measurable as inputs and return a label for person identification or an unknown result for prediction.

[0027] According to one aspect, a privacy-enabled biometric system is provided. The system comprises at least one processor operatively connected to a memory; a classification component executed by the at least one processor, including a classification network having a deep neural network ("DNN") configured to classify feature vector and label inputs during training and return a label for person identification or an unknown result during prediction, wherein the classification component is further configured to accept as an input feature vectors that are Euclidean measurable; the classification network having an architecture comprising a plurality of layers: at least one layer comprising nodes associated with feature vectors, the at least one layer having an initial number of identification nodes and a subset of the identification nodes that are unassigned; the system, responsive to input of biometric information for a new user, is configured to trigger an incremental training operation for the classification network integrating the new biometric information into a respective one of the unallocated identification nodes usable for subsequent matching.

[0028] According to one embodiment, the system is configured to monitor allocation of the unallocated identification nodes and trigger a full retraining of the classification network responsive to assignment of the subset of unallocated nodes. According to one embodiment, the system is configured to execute a full retraining of the classification network to include additional unallocated identification nodes for subsequent incremental retraining of the DNN. According to one embodiment, the system iteratively fully retrains the classification network upon depletion of unallocated identification nodes with additional unallocated nodes for subsequent incremental training. According to one embodiment, the system is further configured to monitor matching of new biometric information to existing identification nodes in the classification network.

[0029] According to one embodiment, the system is further configured to trigger integration of new biometric information into existing identification nodes responsive to exceeding a threshold associated with matching new biometric information. According to one embodiment, the pre-trained network is further configured to generate one way homomorphic, Euclidean measurable feature vectors for the individual. According to one embodiment, the classification component is further configured to return a set of probabilities for matching a set of existing labels. According to one embodiment, the classification component is further configured to predict an outcome based on a trained model, a set of inputs for the prediction, and a result of a class or unknown (all returned values dictating UNKNOWN).

[0030] According to one embodiment, the classification component is further configured to accept the feature vector inputs from a neural network model that generates Euclidean measurable feature vectors. According to one embodiment, the classification component is further configured to extract the feature vectors from layers in the neural network model. According to one embodiment, the system further comprises a feature vector component executed by the at least one processor comprising a neural network. According to one embodiment, the feature vector component is configured to extract the feature vectors from layers during execution of the neural network. According to one embodiment, the neural network comprises a set of layers wherein one layer outputs Euclidean measurable feature vectors. According to one embodiment, the system further comprises a retraining component configured to monitor a number of new input feature vectors or matches of new biometric information to a label and trigger retraining by the classification component on the new biometric information for the label. This can be additional training on a person, using predicted biometrics, that continues training as a biometric changes over time. The system may be configured to do this based on a certain number of consecutive predictions or may do it chronologically, say once every six months.

[0031] According to one embodiment, the classification component is configured to retrain the neural network on addition of new feature vectors. According to one embodiment, the neural network is initially trained with unallocated people classifications, and the classification component is further configured to incrementally retrain the neural network to accommodate new people using the unallocated classifications. According to one embodiment, the system further comprises a retraining component configured to: monitor a number of incremental retrainings; and trigger the classifier component to fully retrain the neural network responsive to allocation of the unallocated classifications. According to one embodiment, the classification component is configured to fully retrain the neural network to incorporate unallocated people classifications, and incrementally retrain for new people using the unallocated classifications. According to one embodiment, the classification component further comprises multiple neural networks for processing respective types of biometric information. According to one embodiment, the classification component is further configured to generate an identity of a person responsive to at least two probable biometric indicators that may be used simultaneously or as part of a "voting" algorithm.

[0032] According to one aspect, a computer implemented method for privacy-enabled biometric analysis is provided. The method comprises instantiating, by at least one processor, a classification component comprising a classification network having a deep neural network ("DNN") configured to classify feature vector and label inputs during training and return a label for person identification or an unknown result during prediction, and wherein the classification component is further configured to accept as an input feature vectors that are Euclidean measurable and return the unknown result or the label as output; instantiating the classification component includes an act of allocating, within at least one layer of the classification network, an initial number of classes and having a subset of the class slots that are unassigned; and triggering, responsive to input of biometric information for a new user, an incremental training operation for the classification network integrating the new biometric information into a respective one of the unallocated class slots usable for subsequent matching.
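One way to read the unallocated-node bookkeeping of paragraphs [0027], [0028], and [0031] is sketched below. The SlotManager class and the incremental_train()/retrain_all() classifier methods are hypothetical stand-ins; the disclosure does not name them.

```python
# Sketch (an interpretation, not the patent's code) of slot management:
# the classifier starts with spare, unassigned identification nodes, new
# enrollments incrementally train into a free slot, and a full retrain
# (adding more spare slots) is triggered only when the pool is depleted.
class SlotManager:
    def __init__(self, unallocated: int = 20):
        self.assigned = {}                      # label -> slot index
        self.free = list(range(unallocated))    # unassigned identification nodes
        self.next_slot = unallocated

    def enroll(self, label, feature_vectors, classifier):
        if not self.free:
            self.full_retrain(classifier)       # depleted: rebuild with new spare slots
        slot = self.free.pop(0)
        self.assigned[label] = slot
        classifier.incremental_train(slot, feature_vectors)   # assumed API
        return slot

    def full_retrain(self, classifier, extra: int = 20):
        # Rebuild the network with additional unassigned slots for future
        # incremental enrollments.
        self.free = list(range(self.next_slot, self.next_slot + extra))
        self.next_slot += extra
        classifier.retrain_all(self.assigned)                 # assumed API
```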
[0033] According to one embodiment, the method further comprises acts of accepting, by the classification component, as an input, feature vectors that are Euclidean measurable generated by a feature vector generation component; classifying, by the classification component executed on at least one processor, the feature vector inputs; and returning, by the classification component, a label for person identification or an unknown result. According to one embodiment, the method further comprises acts of instantiating a feature vector generation component comprising a pre-trained neural network, and generating, by the feature vector generation component, Euclidean measurable feature vectors as an output of at least one layer in the pre-trained neural network responsive to input of an unencrypted biometric input. According to one embodiment, the method further comprises an act of monitoring, by the at least one processor, allocation of the unallocated identification classes and triggering an incremental retraining of the classification network responsive to assignment of the subset of unallocated nodes to provide additional unallocated classes.

[0034] According to one embodiment, the method further comprises an act of monitoring, by the at least one processor, allocation of the unallocated identification nodes and triggering a full or incremental retraining of the classification network responsive to assignment of the subset of unallocated nodes. According to one embodiment, the method further comprises an act of executing a full retraining of the classification network to include additional unallocated classes for subsequent incremental retraining of the DNN. According to one embodiment, the method further comprises an act of fully retraining the classification network iteratively upon depletion of unallocated identification nodes, the full retraining including an act of allocating additional unallocated nodes for subsequent incremental training. According to one embodiment, the method further comprises an act of monitoring matching of new biometric information to existing identification nodes. According to one embodiment, the method further comprises an act of triggering integration of new biometric information into existing identification nodes responsive to exceeding a threshold associated with matching new biometric information. According to one embodiment, the method further comprises an act of generating one way homomorphic, Euclidean measurable labels for person identification responsive to input of Euclidean measurable feature vectors for the individual by the classification component.

[0035] According to one aspect, a non-transitory computer readable medium containing instructions that when executed by at least one processor cause a computer system to execute a method is provided, the method comprising: instantiating a classification component comprising a classification network having a deep neural network ("DNN") configured to classify feature vector and label inputs during training and return a label for person identification or an unknown result during prediction, and wherein the classification component is further configured to accept as an input feature vectors that are Euclidean measurable and return the unknown result or the label as output, wherein instantiating the classification component includes an act of allocating, within at least one layer of the classification network, an initial number of classes and having a subset of additional classes that are unassigned; and triggering, responsive to input of biometric information for a new user, an incremental training operation for the classification network integrating the new biometric information into a respective one of the unallocated identification nodes usable for subsequent matching. According to various embodiments, the computer readable medium contains instructions to perform any of the method steps above, individually, in combination, or in any combination.

[0036] According to one aspect, a privacy-enabled biometric system is provided. The system comprises at least one processor operatively connected to a memory; a classification component executed by the at least one processor, comprising a classification network having a deep neural network configured to classify Euclidean measurable feature vectors and label inputs for person identification during training, and accept as an input feature vectors that are Euclidean measurable and return an unknown result or the label as output; and an enrollment interface configured to accept biometric information and trigger the classification component to integrate the biometric information into the classification network.

[0037] According to one embodiment, the enrollment interface is accessible via a URI, and is configured to accept unencrypted biometric information and personally identifiable information ("PII"). According to one embodiment, the enrollment interface is configured to link the PII to a one way homomorphic encryption of an unencrypted biometric input. According to one embodiment, the enrollment interface is configured to trigger deletion of the unencrypted biometric information. According to one embodiment, the system is further configured to enroll an individual for biometric authentication, and the classification component is further configured to accept input of Euclidean measurable feature vectors for person identification during prediction. According to one embodiment, the classification component is further configured to return a set of probabilities for matching a feature vector. According to one embodiment, the classification component is further configured to predict an outcome based on a trained model, a set of inputs for the prediction, and a result of a class (persons) or UNKNOWN (all returned values dictating UNKNOWN).

[0038] According to one embodiment, the system further comprises an interface configured to accept a biometric input and return an indication of known or unknown to a requesting entity. According to one embodiment, the requesting entity includes any one or more of: an application, a mobile application, a local process, a remote process, a method, and a business object. According to one embodiment, the classification component further comprises multiple classification networks for processing different types of biometric information. According to one embodiment, the classification component is further configured to match an identity of a person responsive to at least two probable biometric indicators that may be used simultaneously or as part of a voting algorithm. According to one embodiment, the classification network further comprises an input layer for accepting feature vectors of a number of dimensions, the input layer having a number of classes at least equal to the number of dimensions of the feature vector input, a first and second hidden layer, and an output layer that generates an array of values.
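A hypothetical enrollment endpoint in the spirit of paragraphs [0036] and [0037] is sketched below; the disclosure only requires that the interface be reachable via a URI. The route name, form fields, the embed() and classifier helpers, and the use of Flask are assumptions made for illustration.

```python
# Hypothetical enrollment interface sketch: accept PII plus a plaintext
# biometric, convert the biometric to one-way feature vectors, link them to
# the PII, and discard the plaintext.
from flask import Flask, request, jsonify

app = Flask(__name__)
# embed() and classifier are assumed to be provided elsewhere
# (see the earlier embedding and SlotManager sketches).

@app.route("/enroll", methods=["POST"])
def enroll():
    pii = request.form.get("pii")                   # e.g., name or other PII
    image_bytes = request.files["biometric"].read()

    vectors = embed(image_bytes)                    # one-way, Euclidean-measurable vectors
    label = classifier.enroll(pii, vectors)         # link PII to the encrypted vectors

    del image_bytes                                 # plaintext biometric is discarded
    return jsonify({"label": label, "status": "enrolled"})
```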
[0039] According to one aspect, a computer implemented method for privacy-enabled biometric analysis is provided. The method comprises instantiating, by at least one processor, a classification component comprising a full deep neural network configured to classify feature vectors that are Euclidean measurable and label inputs for person identification during training, and accept as an input feature vectors that are Euclidean measurable and return an unknown result or the label as output during prediction, and an enrollment interface; accepting, by the enrollment interface, biometric information associated with a new individual; triggering the classification component to train the classification network on feature vectors derived from the biometric information and a label for subsequent identification; and returning the label for subsequent identification.

[0040] According to one embodiment, instantiating the enrollment interface includes hosting a portal accessible via a URI, and the method includes accepting biometric information and personally identifiable information ("PII") through the portal. According to one embodiment, the method further comprises linking the PII to a one way homomorphic encryption of unencrypted biometric input. According to one embodiment, the method further comprises triggering deletion of unencrypted biometric information on a submitting device. According to one embodiment, the method further comprises enrolling individuals for biometric authentication; and mapping labels and respective feature vectors for person identification, responsive to input of Euclidean measurable feature vectors and a label for the individual. According to one embodiment, the method further comprises returning a set of probabilities for matching a set of existing labels.

[0041] According to one embodiment, the method further comprises predicting an outcome based on a trained model, a set of inputs for the prediction, and a result of a class (e.g., persons) or unknown (e.g., all returned values dictating UNKNOWN). According to one embodiment, the method further comprises accepting, via an authentication interface, a biometric input and returning an indication of known or unknown to a requesting entity. According to one embodiment, the requesting entity includes any one or more of: an application, a mobile application, a local process, a remote process, a method, and a business object. According to one embodiment, the method further comprises processing different types of biometric information using multiple classification networks. According to one embodiment, the method further comprises generating an identity of a person responsive to at least two probable biometric indicators that may be used simultaneously or as part of a voting algorithm.

[0042] According to one embodiment, the classification network further comprises an input layer for accepting feature vectors of a number of dimensions, the input layer having a number of classes at least equal to the number of dimensions of the feature vector input, a second hidden layer of at least twice the number of input dimensions, and an output layer that generates an array of values. According to one embodiment, the fully connected neural network further comprises an input layer for accepting feature vectors of a number of dimensions, the input layer having a number of nodes at least equal to the number of dimensions of the feature vector input, a first hidden layer of at least 500 dimensions, a second hidden layer of at least twice the number of input dimensions, and an output layer that generates an array of values that, based on their position and the values, determine the label or unknown.

[0043] Still other aspects, examples, and advantages of these exemplary aspects and examples are discussed in detail below. Moreover, it is to be understood that both the foregoing information and the following detailed description are merely illustrative examples of various aspects and examples, and are intended to provide an overview or framework for understanding the nature and character of the claimed aspects and examples. Any example disclosed herein may be combined with any other example in any manner consistent with at least one of the objects, aims, and needs disclosed herein, and references to "an example," "some examples," "an alternate example," "various examples," "one example," "at least one example," "this and other examples" or the like are not necessarily mutually exclusive and are intended to indicate that a particular feature, structure, or characteristic described in connection with the example may be included in at least one example. The appearances of such terms herein are not necessarily all referring to the same example.

BRIEF DESCRIPTION OF DRAWINGS

[0044] Various aspects of at least one embodiment are discussed below with reference to the accompanying figures, which are not intended to be drawn to scale. The figures are included to provide an illustration and a further understanding of the various aspects and embodiments, and are incorporated in and constitute a part of this specification, but are not intended as a definition of the limits of any particular embodiment. The drawings, together with the remainder of the specification, serve to explain principles and operations of the described and claimed aspects and embodiments. In the figures, each identical or nearly identical component that is illustrated in various figures is represented by a like numeral. For purposes of clarity, not every component may be labeled in every figure. In the figures:

[0045] FIG. 1 is an example process flow for classifying biometric information, according to one embodiment;

[0046] FIG. 2A is an example process flow for authentication with secured biometric data, according to one embodiment;

[0047] FIG. 2B is an example process flow for one to many matching execution, according to one embodiment;

[0048] FIG. 3 is a block diagram of an embodiment of a privacy-enabled biometric system, according to one embodiment;

[0049] FIGS. 4A-D are diagrams of embodiments of a fully connected neural network for classification;

[0050] FIGS. 5A-D illustrate example processing steps and example outputs during identification, according to one embodiment; and

[0051] FIG. 6 is a block diagram of an embodiment of a special purpose computer system programmed to execute the processes and/or functions described herein.

DETAILED DESCRIPTION
[0052] According to some embodiments, the system is configured to provide one to many search and/or matching on encrypted biometrics in polynomial time. According to one embodiment, the system takes input biometrics and transforms the input biometrics into feature vectors (e.g., a list of floating point numbers (e.g., 128, 256, or within a range of 64 to 10,240, although some embodiments can use more feature vectors)). According to various embodiments, the number of floating point numbers in each list depends on the machine learning model being employed. For example, the known FACENET model by GOOGLE generates a feature vector list of 128 floating point numbers, but other embodiments use models with different feature vectors and, for example, lists of floating point numbers.

[0053] According to various embodiments, the biometrics processing model (e.g., a deep learning convolution network (e.g., for images and/or faces)) is configured such that each feature vector is Euclidean measurable when output. The input (e.g., the biometric) to the model can be encrypted using a neural network to output a homomorphic encrypted value. According to one aspect, by executing on feature vectors that are Euclidean measurable, the system produces and operates on one way homomorphic encryptions of input biometrics. These one way homomorphic encryptions can be used in encrypted operations (e.g., addition, multiplication, comparison, etc.) without knowing the underlying plaintext value. Thus, the original or input biometric can simply be discarded, and does not represent a point of failure for security thereafter. In further aspects, implementing one way encryptions eliminates the need for encryption keys that can likewise be compromised. This is a failing of many conventional systems.

[0054] Examples of the methods, devices, and systems discussed herein are not limited in application to the details of construction and the arrangement of components set forth in the following description or illustrated in the accompanying drawings. The methods and systems are capable of implementation in other embodiments and of being practiced or of being carried out in various ways. Examples of specific implementations are provided herein for illustrative purposes only and are not intended to be limiting. In particular, acts, components, elements and features discussed in connection with any one or more examples are not intended to be excluded from a similar role in any other examples.

[0055] Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. Any references to examples, embodiments, components, elements or acts of the systems and methods herein referred to in the singular may also embrace embodiments including a plurality, and any references in plural to any embodiment, component, element or act herein may also embrace embodiments including only a singularity. References in the singular or plural form are not intended to limit the presently disclosed systems or methods, their components, acts, or elements. The use herein of "including," "comprising," "having," "containing," "involving," and variations thereof is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. References to "or" may be construed as inclusive so that any terms described using "or" may indicate any of a single, more than one, and all of the described terms.

[0056] FIG. 1 is an example process flow 100 for enrolling in a privacy-enabled biometric system (e.g., FIG. 3, 304, described in greater detail below). Process 100 begins with acquisition of unencrypted biometric data at 102. The unencrypted biometric data (e.g., plaintext, reference biometric, etc.) can be directly captured on a user device, received from an acquisition device, or communicated from stored biometric information. In one example, a user takes a photo of themselves on their mobile device for enrollment. Pre-processing steps can be executed on the biometric information at 104. For example, given a photo of a user, pre-processing can include cropping the image to significant portions (e.g., around the face or facial features). Various examples exist of photo processing options that can take a reference image and identify facial areas automatically.

[0057] In another example, the end user can be provided a user interface that displays a reference area, and the user is instructed to position their face from an existing image into the designated area. Alternatively, when the user takes a photo, the identified area can direct the user to focus on their face so that it appears within the highlight area. In other options, the system can analyze other types of images to identify areas of interest (e.g., iris scans, hand images, fingerprints, etc.) and crop images accordingly. In yet other options, samples of voice recordings can be used to select data of the highest quality (e.g., lowest background noise), or can be processed to eliminate interference from the acquired biometric (e.g., filter out background noise).

[0058] Having a given biometric, the process 100 continues with generation of additional training biometrics at 106. For example, a number of additional images can be generated from an acquired facial image. In one example, an additional twenty five images are created to form a training set of images. In some examples, as few as three images can be used, but with the tradeoff of reduced accuracy. In other examples, as many as forty training images may be created. The training set is used to provide for variation of the initial biometric information, and the specific number of additional training points can be tailored to a desired accuracy (see, e.g., Tables I-VIII below, which provide example implementations and test results). Various ranges of training set production can be used in different embodiments (e.g., any set of images from two to one thousand). For an image set, the training group can include images of different lighting, capture angle, positioning, etc. For audio based biometrics, different background noises can be introduced, different words can be used, and different samples from the same vocal biometric can be used in the training set, among other options. Various embodiments of the system are configured to handle multiple different biometric inputs, including even health profiles that are based at least in part on health readings from health sensors (e.g., heart rate, blood pressure, EEG signals, body mass scans, genome, etc.). According to various embodiments, biometric information includes Initial Biometric Values (IBV), a set of plaintext values (pictures, voice, SSN, driver's license number, etc.) or any other Personally Identifiable Information ("PII") that together define a person. In some examples, the biometric value itself may be stored as PII, and this plaintext may become searchable and privacy enhanced by using homomorphic encryption generating Euclidean measurable ciphertext.

[0059] At 108, feature vectors are generated from the initial biometric information (e.g., one or more plaintext values that identify an individual). Feature vectors are generated based on all available biometric information, which can include a set of training biometrics generated from the initial unencrypted biometric information received on an individual or individuals. According to one embodiment, the IBV is used in enrollment, for example in process 100. The set of IBVs is processed into a set of initial biometric vectors (e.g., feature vectors) which are used downstream in a subsequent neural network.
[0060] In one implementation, users are directed to a website to input multiple data points for biometric information (e.g., multiple pictures including facial images) in conjunction with personally identifiable information ("PII"). The system and/or execution of process 100 can include tying the PII to encryptions of the biometric as discussed below.

[0061] In one embodiment, a convolutional deep neural network is executed to process the unencrypted biometric information and transform it into a feature vector which has the property of being one-way encrypted cipher text. The neural network is applied (108) to compute a one-way homomorphic encryption of the biometric, resulting in feature vectors (e.g., at 110). These outputs can be computed from an original biometric using the neural network, but the values are one-way in that the neural network cannot then be used to regenerate the original biometrics from the outputs.

[0062] Various embodiments take as input a neural network capable of taking plaintext input and returning Euclidean-measurable output. One such implementation is FaceNet, which takes in any image of a face and returns 128 floating point numbers as the feature vector. The neural network is fairly open ended: various implementations are configured to return a Euclidean-measurable feature vector that maps to the input. This feature vector is nearly impossible to use to recreate the original input biometric and is therefore considered a one-way encryption.
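A minimal sketch of the embedding interface this paragraph assumes is shown below. The embedding_model argument and the embed() helper are hypothetical stand-ins for any pre-trained network (FaceNet or similar) that maps a plaintext biometric to a fixed-length, Euclidean-measurable vector; the specific calls and the unit-normalization step are illustrative assumptions, not part of the original disclosure.

    import numpy as np

    def embed(image: np.ndarray, embedding_model) -> np.ndarray:
        # Map a plaintext biometric (e.g., an aligned face image) to a fixed-length
        # feature vector. The model is assumed pre-trained and is used only in
        # inference mode, so the mapping is one-way: the image cannot be recovered.
        vector = embedding_model.predict(image[np.newaxis, ...])[0]
        return vector / np.linalg.norm(vector)  # normalize for stable distances

    def is_same_person(vec_a: np.ndarray, vec_b: np.ndarray, threshold: float = 1.0) -> bool:
        # Euclidean-measurable property: two encodings of the same person lie close
        # together, so a simple distance threshold works directly on the cipher text.
        return float(np.linalg.norm(vec_a - vec_b)) < threshold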
[0063] Various embodiments are configured to accept the feature vector(s) produced by a first neural network and use them as input to a new neural network (e.g., a second classifying neural network). According to one example, the new neural network has additional properties. This neural network is specially configured to enable incremental training (e.g., on new users and/or new feature vectors) and configured to distinguish between a known person and an unknown person. In one example, a fully connected neural network with two hidden layers and a "hinge" loss function is used to process input feature vectors and return a known person identifier (e.g., person label or class) or indicate that the processed biometric feature vectors are not mapped to a known person. For example, the hinge loss function outputs one or more negative values if the feature vector is unknown. In other examples, the output of the second neural network is an array of values, wherein the values and their positions in the array determine a match to a person.
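A sketch of such a classifying network follows; the layer sizes (500 and 2*num_classes, mirroring the configurations listed in Tables IV and V), the optimizer, and the use of Keras are illustrative assumptions consistent with the two-hidden-layer, hinge-loss description rather than a definitive implementation.

    import tensorflow as tf

    def build_classifier(embedding_dim: int, num_classes: int) -> tf.keras.Model:
        # Second ("enrollment and prediction") network: a fully connected classifier
        # over one-way encrypted feature vectors. A hinge-style loss with a linear
        # output layer lets an all-negative output signal UNKNOWN.
        model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(embedding_dim,)),
            tf.keras.layers.Dense(500, activation="relu"),              # hidden layer 1
            tf.keras.layers.Dense(2 * num_classes, activation="relu"),  # hidden layer 2
            tf.keras.layers.Dense(num_classes, activation="linear"),    # one score per label
        ])
        model.compile(optimizer="adam", loss=tf.keras.losses.CategoricalHinge())
        return model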
[0064] Various embodiments use different machine learning models for capturing feature vectors in the first network. According to various embodiments, the feature vector capture is accomplished via a pre-trained neural network (including, for example, a convolutional neural network) where the output is Euclidean measurable. In some examples, this can include models having a softmax layer as part of the model, and capture of feature vectors can occur preceding such layers. Feature vectors can be extracted from the pre-trained neural network by capturing results from the layers that are Euclidean measurable. In some examples, the softmax layer or categorical distribution layer is the final layer of the model, and feature vectors can be extracted from the n-1 layer (e.g., the immediately preceding layer). In other examples, the feature vectors can be extracted from layers preceding the last layer. Some implementations may offer the feature vector as the last layer.
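One way to capture the n-1 layer output from a pre-trained classification model is sketched below; the Keras API and the assumption that the softmax layer is the model's final layer are illustrative.

    import tensorflow as tf

    def penultimate_layer_extractor(pretrained: tf.keras.Model) -> tf.keras.Model:
        # Build a feature extractor that returns the output of the layer immediately
        # preceding the final (softmax) layer of a pre-trained model.
        feature_layer = pretrained.layers[-2]  # n-1 layer, assuming softmax is last
        return tf.keras.Model(inputs=pretrained.input, outputs=feature_layer.output)

    # Usage (hypothetical): features = penultimate_layer_extractor(model).predict(images)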
[0065] The resulting feature vectors are bound to a specific user classification at 112. For example, deep learning is executed at 112 on the feature vectors based on a fully connected neural network (e.g., a second neural network). The execution is run against all the biometric data (i.e., feature vectors from the initial biometric and training biometric data) to create the classification information. According to one example, a fully connected neural network having two hidden layers is employed for classification of the biometric data. In another example, a fully connected network with no hidden layers can be used for the classification. However, the use of the fully connected network with two hidden layers generated better accuracy in classification (see, e.g., Tables I-VIII described in greater detail below). According to one embodiment, process 100 can be executed to receive an original biometric (e.g., at 102), generate feature vectors (e.g., at 110), and apply an FCNN classifier to generate a label to identify a person at 112 (e.g., output # people).

[0066] Process 100 continues with discarding any unencrypted biometric data at 114. In one example, an application on the user's phone is configured to enable enrollment of captured biometric information and configured to delete the original biometric information once processed (e.g., at 114). In other embodiments, a server system can process received biometric information and delete the original biometric information once processed. According to some aspects, requiring that original biometric information exists only for a short period during processing or enrollment significantly improves the security of the system over conventional approaches. For example, systems that persistently store or employ original biometric data become a source of vulnerability. Unlike a password that can be reset, a compromised biometric remains compromised, virtually forever.

[0067] Returning to process 100, at 116 the resulting cipher text (e.g., feature vector) biometric is stored. In one example, the encrypted biometric can be stored locally on a user device. In other examples, the generated encrypted biometric can be stored on a server, in the cloud, a dedicated data store, or any combination thereof. In one example, the biometrics and classification are stored for use in subsequent matching or searching. For instance, new biometric information can be processed to determine if the new biometric information matches any classifications. The match (depending on a probability threshold) can then be used for authentication or validation.

[0068] In cases where a single match is executed, the neural network model employed at 112 can be optimized for one-to-one matching. For example, the neural network can be trained on the individual expected to use a mobile phone (assuming no other authorized individuals for the device). In some examples, the neural network model can include training allocation to accommodate incremental training of the model on acquired feature vectors over time. Various embodiments, discussed in greater detail below, incorporate incremental training operations for the neural network to permit additional people and to incorporate newly acquired feature vectors.

[0069] In other embodiments, an optimized neural network model (e.g., FCNN) can be used for a primary user of a device, for example, stored locally, and remote authentication can use a data store and one-to-many models (e.g., if the first model returns unknown). Other embodiments may provide the one-to-many models locally as well. In some instances, the authentication scenario (e.g., primary user or not) can be used by the system to dynamically select a neural
network model for matching, and thereby provide additional options for processing efficiency.

[0070] FIG. 2A illustrates an example process 200 for authentication with secured biometric data. Process 200 begins with acquisition of multiple unencrypted biometrics for analysis at 202. In one example, the privacy-enabled biometric system is configured to require at least three biometric identifiers (e.g., as plaintext data, reference biometric, or similar identifiers). If, for example, an authentication session is initiated, the process can be executed so that it only continues to the subsequent steps if a sufficient number of biometric samples are taken, given, and/or acquired. The number of required biometric samples can vary, and can be as few as one.

[0071] Similar to process 100, the acquired biometrics can be pre-processed at 204 (e.g., images cropped to facial features, voice sampled, iris scans cropped to relevant portions, etc.). Once pre-processing is executed, the biometric information is transformed into a one-way homomorphic encryption of the biometric information to acquire the feature vectors for the biometrics under analysis (e.g., at 206). Similar to process 100, the feature vectors can be acquired using any pre-trained neural network that outputs Euclidean-measurable feature vectors. In one example, this includes a pre-trained neural network that incorporates a softmax layer. However, other examples do not require the pre-trained neural network to include a softmax layer, only that it output Euclidean-measurable feature vectors. In one example, the feature vectors can be obtained in the layer preceding the softmax layer as part of step 206.

[0072] At 208, a prediction (e.g., via a deep learning neural network) is executed to determine if there is a match for the person associated with the analyzed biometrics. As discussed above with respect to process 100, the prediction can be executed as a fully connected neural network having two hidden layers (during enrollment the neural network is configured to identify input feature vectors as individuals or unknown, and unknown individuals can be added via incremental training or full retraining of the model). In other examples, a fully connected neural network having no hidden layers can be used. Examples of neural networks are described in greater detail below (e.g., FIG. 4 illustrates an example neural network 400). Other embodiments of the neural network can be used in process 200. According to some embodiments, the neural network's features include operating as a classifier during enrollment to map feature vectors to identifications, and operating as a predictor to identify a known person or an unknown. In some embodiments, different neural networks can be tailored to different types of biometrics, with facial images processed by one while voice biometrics are processed by another.

[0073] According to some embodiments, process 200 is described agnostic to submitter security. In other words, process 200 relies on front-end application configuration to ensure submitted biometrics are captured from the person trying to authenticate. As process 200 is agnostic to submitter security, the process can be executed in local and remote settings in the same manner. However, according to some implementations, the execution relies on the native application or additional functionality in an application to ensure an acquired biometric represents the user to be authenticated or matched.

[0074] FIG. 2B illustrates an example process flow 250 showing additional details for a one-to-many matching execution (also referred to as prediction). According to one embodiment, process 250 begins with acquisition of feature vectors (e.g., step 206 of FIG. 2A or 110 of FIG. 1). At 254, the acquired feature vectors are matched against existing classifications via a deep learning neural network. In one example, the deep learning neural network has been trained during enrollment on a set of individuals. The acquired feature vectors will be processed by the trained deep learning network to predict if the input is a match to a known individual or does not match and returns unknown. In one example, the deep learning network is a fully connected neural network ("FCNN"). In other embodiments, different network models are used for the second neural network.

[0075] According to one embodiment, the FCNN outputs an array of values. These values, based on their position and the value itself, determine the label or unknown. According to one embodiment, returned from a one-to-many case is a series of probabilities associated with the match; assuming five people in the trained data, an output layer showing the probability of match by person of [0.1, 0.9, 0.3, 0.2, 0.1] yields a match on Person 2 based on a threshold set for the classifier (e.g., >0.5). In another run, the output layer [0.1, 0.6, 0.3, 0.8, 0.1] yields a match on Person 2 and Person 4 (e.g., using the same threshold).

[0076] However, where two results exceed the match threshold, the process and/or system is configured to select the maximum value and yield a (probabilistic) match on Person 4. In another example, the output layer [0.1, 0.2, 0.3, 0.2, 0.1] shows no match to a known person, hence an UNKNOWN person, as no values exceed the threshold. Interestingly, this may result in adding the person into the list of authorized people (e.g., via enrollment discussed above), or this may result in the person being denied access or privileges on an application. According to various embodiments, process 250 is executed to determine if the person is known or not. The functions that result can be dictated by the application that requests identification of an analyzed biometric.

[0077] For an UNKNOWN person, i.e., a person never trained to the deep learning enrollment and prediction neural network, an output layer of an UNKNOWN person looks like [-0.7, -1.7, -6.0, -4.3]. In this case, the hinge loss function has guaranteed that the vector output is all negative. This is the case of an UNKNOWN person. In various embodiments, the deep learning neural network must have the capability to determine if a person is UNKNOWN. Other solutions that appear viable, for example support vector machine ("SVM") solutions, break when considering the UNKNOWN case. According to various embodiments, the deep learning neural network (e.g., an enrollment and prediction neural network) is configured to train and predict in polynomial time.

[0078] Step 256 can be executed to vote on matching. According to one embodiment, multiple images or biometrics are processed to identify a match. In an example where three images are processed, the FCNN is configured to generate an identification on each and use each match as a vote for an individual's identification. Once a majority is reached (e.g., at least two votes for person A), the system returns as output the identification of person A. In other instances, for example where there is a possibility that an unknown person may result, voting can be used to facilitate determination of the match or no match. In one example, each result that exceeds the threshold probability can count
as one vote, and the final tally of votes (e.g., often 4 out of 5) is used to establish the match. In some implementations, an unknown class may be trained in the model; in the examples above, a sixth number would appear with a probability of matching the unknown model. In other embodiments, the unknown class is not used, and matching is made or not against known persons. Where a sufficient match does not result, the submitted biometric information is unknown.
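A compact sketch of the thresholding and voting logic described in paragraphs [0075]-[0078] follows; the 0.5 threshold matches the example above, while the exact tie-breaking and majority rules are illustrative assumptions.

    from typing import Optional
    import numpy as np

    def predict_label(scores: np.ndarray, threshold: float = 0.5) -> Optional[int]:
        # Map one output array to a person index, or None for UNKNOWN.
        # If several entries exceed the threshold, take the maximum, e.g.
        # [0.1, 0.6, 0.3, 0.8, 0.1] -> index 3 (Person 4); all-negative outputs
        # such as [-0.7, -1.7, -6.0, -4.3] yield UNKNOWN.
        best = int(np.argmax(scores))
        return best if scores[best] > threshold else None

    def vote(score_arrays: list, threshold: float = 0.5) -> Optional[int]:
        # Each processed biometric casts one vote; a majority establishes the match.
        votes = [predict_label(s, threshold) for s in score_arrays]
        candidates = [v for v in votes if v is not None]
        if not candidates:
            return None
        winner = max(set(candidates), key=candidates.count)
        return winner if candidates.count(winner) > len(score_arrays) // 2 else None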
[0079] Responsive to matching on newly acquired biometric information, process 250 can include an optional step 258 for retraining of the classification model. In one example, a threshold is set such that step 258 tests if a threshold match has been exceeded, and if yes, the deep learning neural network (e.g., classifier and prediction network) is retrained to include the new feature vectors being analyzed. According to some embodiments, retraining to include newer feature vectors accommodates biometrics that change over time (e.g., weight loss, weight gain, aging or other events that alter biometric information, haircuts, among other options).

[0080] FIG. 3 is a block diagram of an example privacy-enabled biometric system 304. According to some embodiments, the system can be installed on a mobile device or called from a mobile device (e.g., on a remote server or cloud-based resource) to return an authenticated-or-not signal. In various embodiments, system 304 can execute any of the preceding processes. For example, system 304 can enroll users (e.g., via process 100), identify enrolled users (e.g., process 200), and search for matches to users (e.g., process 250).

[0081] According to various embodiments, system 304 can accept, create or receive original biometric information (e.g., input 302). The input 302 can include images of people, images of faces, thumbprint scans, voice recordings, sensor data, etc. A biometric processing component (e.g., 308) can be configured to crop received images, sample voice biometrics, etc., to focus the biometric information on distinguishable features (e.g., automatically crop the image around the face). Various forms of pre-processing can be executed on the received biometrics, designed to limit the biometric information to important features. In some embodiments, the pre-processing (e.g., via 308) is not executed or available. In other embodiments, only biometrics that meet quality standards are passed on for further processing.

[0082] Processed biometrics can be used to generate additional training data, for example, to enroll a new user. A training generation component 310 can be configured to generate new biometrics for a user. For example, the training generation component can be configured to create new images of the user's face having different lighting, different capture angles, etc., in order to build a training set of biometrics. In one example, the system includes a training threshold specifying how many training samples to generate from a given or received biometric. In another example, the system and/or training generation component 310 is configured to build twenty-five additional images from a picture of a user's face. Other numbers of training images, or voice samples, etc., can be used.
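The training generation step can be sketched with standard image augmentation; the particular transforms and the use of TensorFlow's image ops are assumptions chosen to mirror the lighting and angle variations described above.

    import tensorflow as tf

    def generate_training_set(image: tf.Tensor, num_variants: int = 25) -> list:
        # Create additional enrollment images from one captured biometric by
        # varying lighting and geometry, as the training generation component does.
        variants = []
        for i in range(num_variants):
            augmented = tf.image.random_brightness(image, max_delta=0.3)   # lighting
            augmented = tf.image.random_contrast(augmented, 0.7, 1.3)
            augmented = tf.image.random_flip_left_right(augmented)         # coarse pose proxy
            augmented = tf.image.central_crop(augmented, central_fraction=0.9 + 0.1 * (i % 2))
            variants.append(tf.image.resize(augmented, tf.shape(image)[:2]))
        return variants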
[0083] The system is configured to generate feature vectors from the biometrics (e.g., processed images from the input and generated training images). In some examples, the system 304 can include a feature vector component 312 configured to generate the feature vectors. According to one embodiment, component 312 executes a convolutional neural network ("CNN"), where the CNN includes a layer which generates Euclidean-measurable output. The feature vector component 312 is configured to extract the feature vectors from the layers preceding the softmax layer (including, for example, the n-1 layer). As discussed above, various neural networks can be used to define feature vectors tailored to an analyzed biometric (e.g., voice, image, health data, etc.), where an output of or within the model is Euclidean measurable. Some examples of these neural networks include models having a softmax layer. Other embodiments use a model that does not include a softmax layer to generate Euclidean-measurable vectors. Various embodiments of the system and/or feature vector component are configured to generate and capture feature vectors for the processed biometrics in the layer or layers preceding the softmax layer.

[0084] According to another embodiment, the feature vectors from the feature vector component 312 or system 304 are used by the classifier component 314 to bind a user to a classification (i.e., mapping biometrics to a matchable/searchable identity). According to one embodiment, the deep learning neural network (e.g., enrollment and prediction network) is executed as an FCNN trained on enrollment data. In one example, the FCNN generates an output identifying a person or indicating an UNKNOWN individual (e.g., at 306). Other examples do not use fully connected neural networks.

[0085] According to various embodiments, the deep learning neural network (e.g., which can be an FCNN) must differentiate between known persons and the UNKNOWN. In some examples, this can be implemented as a sigmoid function in the last layer that outputs a probability of class matching based on newly input biometrics or shows a failure to match. Other examples achieve matching based on hinge loss functions.

[0086] In further embodiments, the system 304 and/or classifier component 314 is configured to generate a probability to establish when a sufficiently close match is found. In some implementations, an unknown person is determined based on negative return values. In other embodiments, multiple matches can be developed and voting can also be used to increase accuracy in matching.

[0087] Various implementations of the system have the capacity to use this approach for more than one set of inputs. The approach itself is biometric agnostic. Various embodiments employ feature vectors that are Euclidean measurable, which is handled using the first neural network. In some instances, different neural networks are configured to process different types of biometrics. Using that approach, the vector-generating neural network may be swapped for, or used with, a different neural network in conjunction with others, where each is capable of creating a Euclidean-measurable feature vector based on the respective biometric. Similarly, the system may enroll in both biometric types (e.g., use two or more vector-generating networks) and predict on the feature vectors generated for both types of biometrics using both neural networks, processing each respective biometric type simultaneously. In one embodiment, feature vectors from each type of biometric can likewise be processed in respective deep neural networks configured to predict matches based on feature vector inputs or return unknown. The simultaneous results (e.g., one from each biometric type) may be used to identify using a voting scheme, or the system may perform better by firing both predictions simultaneously.
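A sketch of the biometric-agnostic, multi-type prediction described in paragraph [0087] follows; the per-type embedder/classifier pairs and the simple vote-combination rule are assumptions for illustration only.

    from typing import Optional
    import numpy as np

    def multimodal_identify(samples_by_type: dict, pipelines: dict,
                            threshold: float = 0.5) -> Optional[int]:
        # pipelines maps a biometric type (e.g., 'face', 'voice') to an assumed
        # (embedder, classifier) pair; each type fires its own prediction and the
        # per-type labels are combined by simple majority vote.
        votes = []
        for btype, sample in samples_by_type.items():
            embedder, classifier = pipelines[btype]
            scores = classifier.predict(embedder(sample)[np.newaxis, :])[0]
            label = int(np.argmax(scores))
            if scores[label] > threshold:
                votes.append(label)
        if not votes:
            return None  # UNKNOWN across all biometric types
        return max(set(votes), key=votes.count)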
[0088] According to further embodiments, the system can be configured to incorporate new identification classes responsive to receiving new biometric information. In one embodiment, the system 304 includes a retraining component configured to monitor a number of new biometrics (e.g., per user/identification class or by total number of new biometrics) and automatically trigger a re-enrollment with the new feature vectors derived from the new biometric information (e.g., produced by 312). In other embodiments, the system can be configured to trigger re-enrollment on new feature vectors based on time or a time period elapsing.

[0089] The system 304 and/or retraining component 316 can be configured to store feature vectors as they are processed, and retain those feature vectors for retraining (including, for example, feature vectors that are unknown, to retrain an unknown class in some examples). Various embodiments of the system are configured to incrementally retrain the model on system-assigned numbers of newly received biometrics. Further, once a system-set number of incremental retrainings has occurred, the system is further configured to complete a full retrain of the model. The variables for incremental retraining and full retraining can be set on the system via an administrative function. Some defaults include an incremental retrain every 3, 4, 5, or 6 identifications, and a full retrain every 3, 4, 5, 6, 7, 8, 9, or 10 incremental retrains. Additionally, this requirement may be met by using calendar time, such as retraining once a year. These operations can be performed on offline (e.g., locked) copies of the model, and once complete the offline copy can be made live.

[0090] Additionally, the system 304 and/or retraining component 316 is configured to update the existing classification model with new users/identification classes. According to various embodiments, the system builds a classification model for an initial number of users, which can be based on an expected initial enrollment. The model is generated with empty or unallocated spaces to accommodate new users. For example, a fifty-user base is generated as a one-hundred-user model. This over-allocation in the model enables incremental training to be executed on the classification model. When a new user is added, the system and/or retraining component 316 is configured to incrementally retrain the classification model, ultimately saving significant computation time over conventional retraining executions. Once the over-allocation is exhausted (e.g., 100 total identification classes), a full retrain with an additional over-allocation can be made (e.g., fully retrain the 100 classes to a model with 150 classes). In other embodiments, an incremental retrain process can be executed to add additional unallocated slots.
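The over-allocation scheme can be sketched as follows; the class counts, epoch counts, and the Keras calls are illustrative assumptions consistent with the 2x allocation and full-retrain trigger described above (here the allocation is doubled on a full retrain, while the disclosure also describes adding a fixed block of slots).

    import tensorflow as tf

    class IncrementalEnrollment:
        # Classifier with over-allocated output slots so new people can be added by
        # incremental training; a full retrain with a larger allocation is triggered
        # only when the reserved slots are exhausted.

        def __init__(self, embedding_dim: int, initial_people: int, over_allocation: int = 2):
            self.embedding_dim = embedding_dim
            self.capacity = over_allocation * initial_people
            self.num_people = 0
            self.model = self._build(self.capacity)

        def _build(self, num_classes: int) -> tf.keras.Model:
            model = tf.keras.Sequential([
                tf.keras.layers.Input(shape=(self.embedding_dim,)),
                tf.keras.layers.Dense(500, activation="relu"),
                tf.keras.layers.Dense(2 * num_classes, activation="relu"),
                tf.keras.layers.Dense(num_classes, activation="linear"),
            ])
            model.compile(optimizer="adam", loss=tf.keras.losses.CategoricalHinge())
            return model

        def add_person(self, all_vectors, all_labels):
            # all_vectors/all_labels cover every enrolled person including the new one.
            self.num_people += 1
            if self.num_people > self.capacity:
                self.capacity *= 2                      # slots exhausted: rebuild and fully retrain
                self.model = self._build(self.capacity)
                epochs = 100
            else:
                epochs = 20                             # incremental retrain is much cheaper
            one_hot = tf.one_hot(all_labels, depth=self.capacity)
            self.model.fit(all_vectors, one_hot, epochs=epochs, verbose=0)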
[0091] Even with the reduced-time retraining, the system can be configured to operate with multiple copies of the classification model. One copy may be live and used for authentication or identification. A second copy may be an update version that is taken offline (e.g., locked from access) to accomplish retraining while permitting identification operations to continue with a live model. Once retraining is accomplished, the updated model can be made live and the other model locked and updated as well. Multiple instances of both live and locked models can be used to increase concurrency.

[0092] According to some embodiments, the system 300 can receive feature vectors instead of original biometrics, and processing of original biometrics can occur on different systems; in these cases system 300 may not include, for example, 308, 310, and 312, and instead receive feature vectors from other systems, components or processes.

[0093] FIGS. 4A-D illustrate example embodiments of a classifier network. The embodiments show a fully connected neural network for classifying feature vectors for training and for prediction. Other embodiments implement different neural networks, including, for example, neural networks that are not fully connected. Each of the networks accepts Euclidean-measurable feature vectors and returns a label or unknown result for prediction, or binds the feature vectors to a label during training.

[0094] FIGS. 5A-D illustrate examples of processing that can be performed on input biometrics (e.g., a facial image) using a neural network. Feature vectors can be extracted from such neural networks and used by a classifier (e.g., FIGS. 4A-D) during training or prediction operations. According to various embodiments, the system implements a first pre-trained neural network for generating Euclidean-measurable feature vectors that are used as inputs for a second classification neural network. In other embodiments, other neural networks are used to process biometrics in the first instance. In still other examples, multiple neural networks can be used to generate Euclidean-measurable feature vectors from unencrypted biometric inputs; each may feed the feature vectors to a respective classifier. In some examples, each generator neural network can be tailored to a respective classifier neural network, where each pair (or multiples of each) is configured to process a biometric data type (e.g., facial image, iris images, voice, health data, etc.).

[0095] Implementation Examples

[0096] The following example instantiations are provided to illustrate various aspects of privacy-enabled biometric systems and processes. The examples are provided to illustrate various implementation details and provide illustration of execution options as well as efficiency metrics. Any of the details discussed in the examples can be used in conjunction with various embodiments.

[0097] It is realized that conventional biometric solutions have security vulnerabilities and efficiency/scalability issues. Apple, Samsung, Google and MasterCard have each launched biometric security solutions that share at least three technical limitations. These solutions (1) are unable to search biometrics in polynomial time; (2) do not one-way encrypt the reference biometric; and (3) require significant computing resources for confidentiality and matching.

[0098] Modern biometric security solutions are unable to scale (e.g., Apple Face ID authenticates only one user) as they are unable to search biometrics in polynomial time. In fact, the current "exhaustive search" technique requires significant computing resources to perform a linear scan of an entire biometric datastore to successfully one-to-one record match each reference biometric and each new input record; this is a result of inherent variations in the biometric instances of a single individual.

[0099] Similarly, conventional solutions are unable to one-way encrypt the reference biometric because exhaustive search (as described above) requires a decryption key and a decryption to plaintext in the application layer for every attempted match. This limitation results in an unacceptable risk in privacy (anyone can view a biometric) and authentication (anyone can use the stolen biometric). And, once compromised, a biometric, unlike a password, cannot be reset.
[0100] Finally, modern solutions require the biometric to return to plaintext in order to match, since the encrypted form is not Euclidean measurable. It is possible to choose to make a biometric two-way encrypted and return it to plaintext, but this requires extensive key management and, since a two-way encrypted biometric is not Euclidean measurable, it also returns the solution to linear scan limitations.

[0101] Various embodiments of the privacy-enabled biometric system and/or methods provide enhancements over conventional implementations (e.g., in security, scalability, and/or management functions). Various embodiments enable scalability (e.g., via "encrypted search") and fully encrypt the reference biometric (e.g., "encrypted match"). The system is configured to provide an "identity" that is no longer tied independently to each application, and further enables a single, global "Identity Trust Store" that can service any identity request for any application.

[0102] Various operations are enabled by various embodiments, and the functions include, for example:

[0103] Encrypted Match: using the techniques described herein, a deep neural network ("DNN") is used to process a reference biometric to compute a one-way, homomorphic encryption of the biometric's feature vector before transmitting or storing any data. This allows for computations and comparisons on cipher texts without decryption, and ensures that only the Euclidean-measurable, homomorphic encrypted biometric is available to execute subsequent matches in the encrypted space. The plaintext data can then be discarded and the resultant homomorphic encryption is then transmitted and stored in a datastore.

[0104] Encrypted Search: using the techniques described herein, encrypted search is done in polynomial time according to various embodiments. This allows for comparisons of biometrics and achieves values for comparison that indicate "closeness" of two biometrics to one another in the encrypted space (e.g., a biometric to a reference biometric), while at the same time providing for the highest level of privacy.

[0105] Various examples detail implementation of one-to-many identification using, for example, the N-1 layer of a deep neural network. The various techniques are biometric agnostic, allowing the same approach irrespective of the biometric or the biometric type. Each biometric (face, voice, iris, etc.) can be processed with a different, fully trained neural network to create the biometric feature vector.

[0106] According to some aspects, an issue with current biometric schemes is that they require a mechanism for: (1) acquiring the biometric, (2) plaintext biometric match, (3) encrypting the biometric, (4) performing a Euclidean-measurable match, and (5) searching using the second neural network prediction call. Executing steps 1 through 5 for every biometric is time consuming, error prone and frequently nearly impossible to do before the biometric becomes deprecated. One goal of various embodiments is to develop a scheme, techniques and technologies that allow the system to work with biometrics in a privacy-protected and polynomial-time-based way that is also biometric agnostic. Various embodiments employ machine learning to solve issues with (2)-(5).

[0107] According to various embodiments, it is assumed there is no control over devices such as cameras or sensors that acquire the to-be-analyzed biometrics (which thus arrive as plaintext). According to various embodiments, if that data is encrypted immediately and the biometric information is processed only as cipher text, the system provides the maximum practical level of privacy. According to another aspect, a one-way encryption of the biometric, meaning that given the cipher text there is no mechanism to get to the original plaintext, reduces or eliminates the complexity of key management of various conventional approaches. Many one-way encryption algorithms exist, such as MD5 and SHA-512; however, these algorithms are not homomorphic because they are not Euclidean measurable. Various embodiments discussed herein enable a general purpose solution that produces biometric cipher text that is Euclidean measurable using a neural network. Applying a classifying algorithm to the resulting feature vectors enables one-to-many identification. In various examples, this maximizes privacy and runs between O(1) and O(log n) time.
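The "encrypted match" and "encrypted search" operations described in paragraphs [0103]-[0107] can be sketched as a closeness comparison on the stored, one-way encrypted vectors. The in-memory gallery and brute-force distance loop below are simplifying assumptions used only to show that all comparisons stay on cipher text; the disclosure's approach relies on the classifier network (or an index) to keep search sub-linear.

    import numpy as np

    def encrypted_search(probe: np.ndarray, gallery: dict, max_distance: float = 1.0):
        # gallery maps a person label to a stored, one-way encrypted feature vector.
        # All comparisons run on cipher text; no decryption to plaintext ever occurs.
        best_label, best_dist = None, float("inf")
        for label, reference in gallery.items():
            dist = float(np.linalg.norm(probe - reference))  # Euclidean "closeness"
            if dist < best_dist:
                best_label, best_dist = label, dist
        return best_label if best_dist <= max_distance else None  # else UNKNOWN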
[0108] As discussed above, some capture devices can encrypt the biometric via a one-way encryption and provide feature vectors directly to the system. This enables some embodiments to forgo biometric processing components, training generation components, and feature vector generation components, or alternatively to not use these elements for already encrypted feature vectors.

[0109] Example Execution and Accuracy

[0110] In some executions, the system is evaluated on different numbers of images per person to establish ranges of operating parameters and thresholds. For example, in the experimental execution the num-epochs parameter establishes the number of iterations, which can be varied on the system (e.g., between embodiments, between examples, and between executions, among other options). The LFW dataset is taken from the known labeled faces in the wild data set. "Eleven people" is a custom set of images, and faces94 is from the known source faces94. For these examples, the epochs are the number of new images that are morphed from the original images. So if the epochs are 25, and there are 10 enrollment images, then training uses 250 images. The morphing of the images changes the lighting, angles and the like to increase the accuracy in training.
TABLE I

(Fully connected neural network model with 2 hidden layers + output sigmoid layer): Input => [100, 50] => num_people (train for 100 people given 50 individuals to identify). Other embodiments improve over these accuracies for the UNKNOWN.

Dataset | Training Set | Test Set | UNKNOWN PERSON Set | # images in Test Set | # images in UNKNOWN PERSON Set | Parameters | Accuracy in Test Set | Accuracy in UNKNOWN PERSON Set
LFW dataset | 70% | 30% | 11 people | 1304 | 257 | min_images_per_person = 10; num-epochs = 25 | 98.90% | 86.40%
LFW dataset | 70% | 30% | 11 people | 2226 | 257 | min_images_per_person = 3; num-epochs = 25 | 93.90% | 87.20%
11 people | 70% | 30% | Copy 2 people from LFW | 77 | | min_images_per_person = 2; num-epochs = 25 | 100.00% | 50.00%
faces94 dataset | 70% | 30% | 11 people | 918 | 257 | min_images_per_person = 2; num-epochs = 25 | 99.10% | 79.40%

TABLE II

(0 hidden layers & output linear with decision f(x); decision at 0.5 value.) Improves accuracy for the UNKNOWN case, but other implementations achieve higher accuracy.

Dataset | Training Set | Test Set | UNKNOWN PERSON Set | # images in Test Set | # images in UNKNOWN PERSON Set | Parameters | Accuracy in Test Set | Accuracy in UNKNOWN PERSON Set
LFW dataset | 70% | 30% | 11 people | 1304 | 257 | min_images_per_person = 10; num-epochs = 25 | 98.80% | 91.10%
LFW dataset | 70% | 30% | 11 people | 2226 | 257 | min_images_per_person = 3; num-epochs = 25 | 96.60% | 97.70%
11 people | 70% | 30% | Copy 2 people from LFW | 77 | | min_images_per_person = 2; num-epochs = 25 | 98.70% | 50.00%
faces94 dataset | 70% | 30% | 11 people | 918 | 257 | min_images_per_person = 2; num-epochs = 25; Cut-off = 0.5 | 99.10% | 82.10%
faces94 dataset | 70% | 30% | 11 people | 918 | 257 | min_images_per_person = 2; num-epochs = 25; Cut-off = 1.0 | 98.30% | 95.70%

TABLE III

FCNN with 1 hidden layer (500 nodes) + output linear with decision.

Dataset | Training Set | Test Set | UNKNOWN PERSON Set | # images in Test Set | # images in UNKNOWN PERSON Set | Parameters | Accuracy in Test Set | Accuracy in UNKNOWN PERSON Set
LFW dataset | 70% | 30% | 11 people | 1304 | 257 | min_images_per_person = 10; num-epochs = 25 | 99.30% | 92.20%
LFW dataset | 70% | 30% | 11 people | 2226 | 257 | min_images_per_person = 3; num-epochs = 25 | 97.50% | 97.70%
11 people | 70% | 30% | Copy 2 people from LFW | 77 | | min_images_per_person = 2; num-epochs = 25 | |
faces94 dataset | 70% | 30% | 11 people | 918 | 257 | min_images_per_person = 2; num-epochs = 25; Cut-off = 0.5 | 99.20% | 92.60%
faces94 dataset | 70% | 30% | 11 people | 918 | 257 | min_images_per_person = 2; num-epochs = 25; Cut-off = 1.0 | |
TABLE IV

FCNN 2 hidden layers (500, 2*num_people) + output linear, decisions f(x).

Dataset | Training Set | Test Set | UNKNOWN PERSON Set | # images in Test Set | # images in UNKNOWN PERSON Set | Parameters | Accuracy in Test Set | Accuracy in UNKNOWN PERSON Set
LFW dataset | 70% | 30% | 11 people | 1304 | 257 | min_images_per_person = 10; num-epochs = 25 | 98.30% | 97.70%
LFW dataset | 70% | 30% | 11 people | 2226 | 257 | min_images_per_person = 3; num-epochs = 25; Cut-off = 0 | 98.50% | 98.10%
11 people | 70% | 30% | Copy 2 people from LFW | 77 | | min_images_per_person = 2; num-epochs = 25 | |
faces94 dataset | 70% | 30% | 11 people | 918 | 257 | min_images_per_person = 2; num-epochs = 25; Cut-off = 1.0 | 98.60% | 93.80%

[0111] In various embodiments, the neural network model is generated initially to accommodate incremental additions of new individuals to identify (e.g., 2*num_people is an example of a model initially trained for 100 people given an initial 50 individuals of biometric information). The multiple, or training room provided, can be tailored to the specific implementation. For example, where additions to the identifiable users are anticipated to be small, additional incremental training options can include any number, with ranges of 1% to 200%. In other embodiments, larger percentages can be implemented as well.
TABLE V

FCNN 2 hidden layers (500, 2*num_people) + output linear, decisions f(x), and voting, where the model is trained on 2x the number of class identifiers for incremental training.

Dataset | Training Set | Test Set | UNKNOWN PERSON Set | # images in Test Set | # images in UNKNOWN PERSON Set | Parameters | Accuracy in Test Set | Accuracy in UNKNOWN PERSON Set = 11 people | Accuracy in UNKNOWN PERSON Set = faces94
LFW dataset | 70% | 30% | 11 people | 1304 | 257 | min_images_per_person = 10; num-epochs = 25 | 98.20% (vote) | 98.80% (vote) | 86.40% (vote)
LFW dataset | 70% | 30% | 11 people | 2226 | 257 | min_images_per_person = 3; num-epochs = 25; Cut-off = 0 | | |
11 people | 70% | 30% | Copy 2 people from LFW | 77 | | min_images_per_person = 2; num-epochs = 25 | | |
faces94 dataset | 70% | 30% | 11 people | 918 | 257 | min_images_per_person = 2; num-epochs = 25; Cut-off = 0 | | |
(The accuracy values for the last three rows appear in the filed copy as 100.00%, 98.10% (vote), 98.60%, 100.00%, 98.40% (vote), 100.00%, 90.80%, 93.60% (vote), and 95.40%, but their assignment to individual cells is illegible.)

[0112] According to one embodiment, the system can be implemented as a REST-compliant API that can be integrated into and/or called by various programs, applications, systems, system components, etc., and can be requested locally or remotely.

[0113] In one example, the privacy-enabled biometric API includes the following specifications:
• Preparing data: this function takes the images and labels and saves them into the local directory.

    def add_training_data(list_of_images, list_of_labels):
        @params list_of_images: the list of images
        @params list_of_labels: the list of corresponding labels

Training model: each label (person/individual) should have at least 2 images. In some examples, if the person does not have the minimum, that person will be ignored.

    def train():
- continued

Prediction:

    def predict(list_of_images):
        @params list_of_images: the list of images of the same person
        @return label: a person name or "UNKNOWN_PERSON"
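A hypothetical calling sequence for this API is sketched below; the module name and the enrollment data (file paths standing in for whatever image objects the implementation expects) are illustrative assumptions, not part of the published specification.

    # Hypothetical client-side usage of the enrollment/prediction API above.
    from privacy_biometrics import add_training_data, train, predict  # assumed module name

    enrollment_images = ["alice_01.jpg", "alice_02.jpg", "alice_03.jpg"]
    add_training_data(enrollment_images, ["alice", "alice", "alice"])  # at least 2 images per label
    train()                                                            # (re)build the classifier

    label = predict(["probe_01.jpg", "probe_02.jpg", "probe_03.jpg"])  # vote across samples
    print(label)  # "alice" or "UNKNOWN_PERSON"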

[0114] Further embodiments can be configured to handle new people (e.g., labels or classes in the model) in multiple ways. In one example, the current model can be retrained every time a certain (e.g., threshold) number of new people are introduced. In this example, the benefit is improved accuracy: the system can guarantee a level of accuracy even with new people. There exists a trade-off in that full retraining is a slow, time-consuming and computation-heavy process. This can be mitigated with live and offline copies of the model, so the retraining occurs offline and the newly retrained model is swapped in for the live version.

[0115] In one example, training time executed in over 20 minutes. With more data the training time increases.

[0116] According to another example, the model is initialized with slots for new people. The expanded model is configured to support incremental training (e.g., the network structure is not changed when adding new people). In this example, the time to add new people is significantly reduced (even over other embodiments of the privacy-enabled biometric system). It is realized that there may be some reduction in accuracy with incremental training, and as more and more people are added the model can trend towards overfitting on the new people, i.e., become less accurate with old people. However, various implementations have been tested to operate at the same accuracy even under incremental retraining.

[0117] Yet another embodiment implements both incremental retraining and full retraining at a threshold level (e.g., build the initial model with a multiple of the people as needed, for example 2 times: 100 labels for an initial 50 people, 50 labels for an initial 25 people, etc.). Once the number of people reaches the upper bound (or approaches the upper bound), the system can be configured to execute a full retrain on the model, while building in the additional slots for new users. In one example, given 100 labels in the model with 50 initial people (50 unallocated), when the model reaches 50 new people the system will execute a full retrain for 150 labels and now 100 actual people. This provides for 50 additional users and incremental retraining before a full retrain is executed.

[0118] Stated generally, the system in various embodiments is configured to retrain the whole network from the beginning at every N-people step. Training data: have 100 people; step 1: train the network with N = 1000 people; assign 100 people and reserve 900 for incremental training; train incrementally with new people until 1000 people is reached; on reaching 1000 people, fully retrain. Full retrain: train the network with 2N = 2000 people; there are now 1000 people, with 1000 reserved for incremental training; train incrementally with new people until 2000 people is reached; and repeat the full retrain with open allocations when the limit is reached.

[0119] An example implementation of the API includes the following code:

[0120] drop database if exists trueid;
[0121] create database trueid;
[0122] grant all on trueid.* to trueid@'localhost' identified by 'trueid';
[0123] drop table if exists feature;
[0124] drop table if exists image;
[0125] drop table if exists PII;
[0126] drop table if exists subject;

    CREATE TABLE subject (
      id INT PRIMARY KEY AUTO_INCREMENT,
      when_created TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    );

    CREATE TABLE PII (
      id INT PRIMARY KEY AUTO_INCREMENT,
      subject_id INT,
      tag VARCHAR(254),
      value VARCHAR(254)
    );

    CREATE TABLE image (
      id INT PRIMARY KEY AUTO_INCREMENT,
      subject_id INT,
      image_name VARCHAR(254),
      is_train BOOLEAN,
      when_created TIMESTAMP DEFAULT CURRENT_TIMESTAMP
    );

    CREATE TABLE feature (
      id INT PRIMARY KEY AUTO_INCREMENT,
      image_id INT NOT NULL,
      feature_order INT NOT NULL,
      feature_value DECIMAL(32,24) NOT NULL
    );

[0127] ALTER TABLE image ADD CONSTRAINT fk_subject_id FOREIGN KEY (subject_id) REFERENCES subject(id);
[0128] ALTER TABLE PII ADD CONSTRAINT fk_subject_id_pii FOREIGN KEY (subject_id) REFERENCES subject(id);
[0129] ALTER TABLE feature ADD CONSTRAINT fk_image_id FOREIGN KEY (image_id) REFERENCES image(id);
[0130] CREATE INDEX piisubjectid ON PII(subject_id);
[0131] CREATE INDEX imagesubjectid ON image(subject_id);
[0132] CREATE INDEX imagesubjectidimage ON image(subject_id, image_name);
[0133] CREATE INDEX featureimage_id ON feature(image_id);
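An assumed example of writing a feature vector into this schema from application code is sketched below; the driver choice and function name are illustrative, not part of the published DDL.

    import mysql.connector  # assumed driver; any DB-API-compatible connector works similarly

    def store_feature_vector(conn, image_id: int, vector) -> None:
        # Persist a one-way encrypted feature vector into the feature table,
        # one row per vector component, keyed by feature_order.
        rows = [(image_id, i, float(v)) for i, v in enumerate(vector)]
        cur = conn.cursor()
        cur.executemany(
            "INSERT INTO feature (image_id, feature_order, feature_value) "
            "VALUES (%s, %s, %s)",
            rows,
        )
        cur.close()
        conn.commit()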
[0134] API Execution Example:

[0135] Push the known LFW feature embeddings to the biometric feature database.

[0136] Simulate the incremental training process:

[0137] num_seed = 50 # build the model network; the first num_seed people are trained fully

[0138] num_window = 50 # for every num_window people: build the model network, and people are trained fully

[0139] num_step = 1 # train incrementally for every num_step new people

[0140] num_eval = 10 # evaluate the model every num_eval people

[0141] Build the model network with # classes = 100. Train from the beginning (# epochs = 100) with the first 50 people. The remaining 50 classes are reserved for incremental training.

[0142] i) Incremental training for the 51st person. Train the previous model with all 51 people (# epochs = 20).

[0143] ii) Incremental training for the 52nd person. Train the previous model with all 52 people (# epochs = 20).

[0144] iii) Continue . . .

[0145] (Self or automatic monitoring can be executed by various embodiments to ensure accuracy over time; alert flags can be produced if deviation or excessive inaccuracy is detected; alternatively or in conjunction, full retraining can be executed responsive to excess inaccuracy and the fully retrained model evaluated to determine if accuracy issues are resolved; if so, the full retrain threshold can be automatically adjusted.) Evaluate the accuracy of the previous model (e.g., at every 10 steps), optionally recording the training time for every step.

[0146] Achieve incremental training for the maximum allocation (e.g., the 100th person). Fully train the previous model with all 100 people (e.g., # epochs = 20).

[0147] Build the model network with # classes = 150. Train from the beginning (e.g., # epochs = 100) with the first 100 people. The remaining 50 classes are reserved for incremental training.

[0148] i) Incremental training for the 101st person. Train the previous model with all 101 people (# epochs = 20).

[0149] ii) Continue . . .

[0150] Build the model network with # classes = 200. Train from the beginning (e.g., # epochs = 100) with the first 150 people. The remaining 50 classes are reserved for incremental training.

[0151] i) Incremental training for the 151st person. Train the previous model with all 151 people (# epochs = 20).

[0152] ii) Continue . . .
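The simulation in paragraphs [0137]-[0152] can be sketched as a driver loop; the classifier object and its full_train/incremental_train/evaluate methods are hypothetical stand-ins (for example, the over-allocated classifier sketched earlier), and the constants simply mirror the values listed above.

    num_seed = 50     # people trained fully when the network is first built
    num_window = 50   # add this many reserved slots at each full retrain
    num_step = 1      # incrementally retrain for every new person
    num_eval = 10     # evaluate accuracy every num_eval people

    def simulate(people, build_classifier):
        # people: ordered list of (feature_vectors, label) per person.
        # build_classifier(num_classes) returns a hypothetical classifier exposing
        # full_train(data, epochs), incremental_train(data, epochs), evaluate(data).
        capacity = 2 * num_seed                    # reserve empty slots for new people
        clf = build_classifier(capacity)
        clf.full_train(people[:num_seed], epochs=100)
        for n in range(num_seed + 1, len(people) + 1):
            if n > capacity:                       # reserved slots exhausted: full retrain
                capacity += num_window
                clf = build_classifier(capacity)
                clf.full_train(people[:n], epochs=100)
            elif n % num_step == 0:
                clf.incremental_train(people[:n], epochs=20)
            if n % num_eval == 0:
                print(n, clf.evaluate(people[:n]))
        return clf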
[0153] Refactor Problem:

[0154] According to various embodiments, it is realized that incremental training can trigger concurrency problems, e.g., a multi-thread problem with the same model; thus the system can be configured to avoid retraining incrementally at the same time for two different people (data can be lost if retraining occurs concurrently). In one example, the system implements a lock or a semaphore to resolve this. In another example, multiple models can be running simultaneously and reconciliation can be executed between the models in stages. In further examples, the system can monitor models to ensure only one retrain is executed across multiple live models, and in yet others use locks on the models to ensure singular updates via incremental retrain. Reconciliation can be executed after an update between models. In further examples, the system can cache feature vectors for subsequent access in the reconciliation.
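A minimal sketch of the lock-based option from paragraph [0154] follows; the use of a process-local threading lock is an assumption (a distributed deployment would need a shared lock or semaphore service).

    import threading

    _retrain_lock = threading.Lock()

    def incremental_retrain(classifier, vectors, labels):
        # Serialize incremental retraining so two new enrollments cannot update
        # the same model concurrently and silently drop each other's changes.
        with _retrain_lock:
            classifier.fit(vectors, labels, epochs=20)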
[0155] According to some embodiments, the system design resolves a data pipeline problem: in some examples, the data pipeline supports running only one time due to queue and thread characteristics. Other embodiments avoid this issue by extracting the embeddings. In examples that do not include that functionality, the system can still run multiple times by saving the embedding to file and loading the embedding from file. This approach can be used where the extracted embedding is unavailable via other approaches. Various embodiments can employ different options for operating with embeddings: when a value is given to a TensorFlow graph, there are several ways, such as feed_dict (a speed trade-off for easier access) and a queue (faster via multi-threads, but it can only run one time, as the queue ends after it is looped).

[0156] Table VI and Table VII (below) show execution timing during operation and accuracy percentages for the example.
TABLE VI
A B C D E
1 step action info time accuracy
2 50 Retrieving feature embedding 100 .939024
3 50 Training Deep Learning classifier 54 . 34578061
4 51 Retrieving feature embedding 104 .8042319
5 51 Training incrementally Deep Learning classifier 9 .755134106
52 Retrieving feature embedding 105 .692045
7 52 Training incrementally Deep Learning classifier 9 . 367767096
8 53 Retrieving feature embedding 95 .68940234
9 53 Training incrementally Deep Learning classifier 9 .33846755
10 54 Retrieving feature embedding 108.8445647
11 54 Training incrementally Deep Learning classifier 9.668224573
12 55 Retrieving feature embedding 108.7391896
13 55 Training incrementally Deep Learning classifier 10 .2577827
14 56 Retrieving feature embedding 107 . 1305535
15 56 Training incrementally Deep Learning classifier 9 .660038471
16 57 Retrieving feature embedding 111.1128619
17 57 Training incrementally Deep Learning classifier 9 .824867487
18 58 Retrieving feature embedding 109 .780278
19 58 Training incrementally Deep Learning classifier 10 .25701618
20 59 Retrieving feature embedding 114 .9919829
21 59 Training incrementally Deep Learning classifier 9 .752382278
22 60 Retrieving feature embedding 114 . 3731036
TABLE VI-continued
A B C D E
1 step action info time accuracy

23 60 Training incrementally Deep Learning classifier 10 . 15184236


24 60 Accuracy # test_images = 533 0.988743
25 60 Vote Accuracy # test_images = 533
26 61 Retrieving feature embedding 118.237993
27 61 Training incrementally Deep Learning classifier 10.0895071
28 62 Retrieving feature embedding 120.2519257
29 62 Training incrementally Deep Learning classifier 10.69825125
30 63 Retrieving feature embedding 119 . 3803787
31 63 Training incrementally Deep Learning classifier 10 .66580486
32 64 Retrieving feature embedding 138.031605
33 64 Training incrementally Deep Learning classifier 12. 32183456
34 65 Retrieving feature embedding 133. 2701755
35 65 Training incrementally Deep Learning classifier 12. 35964537
36 66 Retrieving feature embedding 136.8798289
37 66 Training incrementally Deep Learning classifier 12.07544327
38 67 Retrieving feature embedding 140.3868775
39 67 Training incrementally Deep Learning classifier 12.54206896
40 68 Retrieving feature embedding 140 . 855052
41 68 Training incrementally Deep Learning classifier 12 .59552693
42 69 Retrieving feature embedding 140 . 2500689
43 69 Training incrementally Deep Learning classifier 12. 55604577
44 70 Retrieving feature embedding 144 . 5612676
45 70 Training incrementally Deep Learning classifier 12. 95398426
46 70 Accuracy # test_ images = 673 0 .9925706
47 70 Vote Accuracy # test _ images = 673
48 71 Retrieving feature embedding 145 . 2458987
49 71 Training incrementally Deep Learning classifier 13.09439131
º indicates text missing or illegible when filed

TABLE VII
A B C D E
1 step action Info time accuracy
80 Training incrementally Deep Learning classifier 14 . 24880123
80 Accuracy # test_ images = 724 0 .9903315
80 Vote Accuracy # test_ images = 724
70 81 Retrieving feature embedding 153.8295755
81 Training incrementally Deep Learning classifier 14 .72389603
82 Retrieving feature embedding 157 .9210677
82 Training incrementally Deep Learning classifier 14 .57672453
83 Retrieving feature embedding 164 . 8383744
83 Training incrementally Deep Learning classifier 21. 83570766
84 Retrieving feature embedding 161.2950387
84 Training incrementally Deep Learning classifier 14 . 25801277
78 85 Retrieving feature embedding 155.9785285
85 Training incrementally Deep Learning classifier 14 . 45170879
86 Retrieving feature embedding 160 . 9079704
86 Training incrementally Deep Learning classifier 14 .81818509
87 Retrieving feature embedding 164.5734673

87 Training incrementally Deep Learning classifier 18.26664591
88 Retrieving feature embedding 169.8400548
88 Training incrementally Deep Learning classifier 15.75074983
89 Retrieving feature embedding 169.2413263
89 Training incrementally Deep Learning classifier 15.93148685
88 90 Retrieving feature embedding 172 .5191889
90 Training incrementally Deep Learning classifier 15 . 88449383
90 Accuracy # test_ images = 822 0 .986618
90 Vote Accuracy # test_ images = 822 0 .9963504
91 Retrieving feature embedding 170 . 162873
91 Training incrementally Deep Learning classifier 15 .72525668
92 Retrieving feature embedding 174 .9947026
92 Training incrementally Deep Learning classifier 15 .791049
93 Retrieving feature embedding 175 . 3449857
93 Training incrementally Deep Learning classifier 15 .8756597
94 Retrieving feature embedding 177.0825081
99 94 Training incrementally Deep Learning classifier 15 . 72812366
100 95 Retrieving feature embedding 178 .8840812
101 95 Training incrementally Deep Learning classifier 16 . 04615927
102 96 Retrieving feature embedding 171 .2114341
103 96 Training incrementally Deep Learning classifier 16 . 32442522
TABLE VII -continued


A B C D E
1 step action Info time accuracy
104 97 Retrieving feature embedding 177.8708515
105 97 Training incrementally Deep Learning classifier 15 . 90093112
106 98 Retrieving feature embedding 177 .5916936
107 98 Training incrementally Deep Learning classifier 16 . 57834721
108 99 Retrieving feature embedding 185 . 1854212
109 99 Training incrementally Deep Learning classifier 16.64935994
110 100 Retrieving feature embedding 179 .5375969
111 100 Training incrementally Deep Learning classifier 17 . 24395561
112 100 Accuracy # test_ images = 875 0 .9897143
113 100 Vote Accuracy # test_ images = 875
114 100 Retrieving feature embedding 184. 8017459
indicates text missing or illegible when filed
TABLE VIII

Table VIII shows summary information for additional executions.

Dataset | Training Set | Test Set | UNKNOWN PERSON Set | # people in Training Set | # images in Test Set | # images in UNKNOWN PERSON Set | Parameters | Accuracy in Test Set | Accuracy in UNKNOWN PERSON Set
LFW dataset | 70% | 30% | 11 people | 158 | 1304 | 257 | min_images_per_person = 10; num-epochs = 25; Cut-off = 0 | 98.20% (vote) | 100.00%
LFW dataset | 70% | 30% | 11 people | 901 | 2226 | 257 | min_images_per_person = 3; num-epochs = 25; Cut-off = 0 | 93.80% (vote) | 96.42%
[0157] According to one embodiment, the system can be described broadly to include any one or more or any combination of the following elements and associated functions:

[0158] Preprocessing: where the system takes in an unprocessed biometric, which can include cropping and aligning, and either continues processing or returns that the biometric cannot be processed.

[0159] Neural network 1: Pre-trained. Takes in unencrypted biometrics. Returns biometric feature vectors that are one-way encrypted and Euclidean measurable. That is its only purpose.

[0160] Neural network 2: Not pre-trained. It is a deep learning neural network that does classification. Includes incremental training; takes a set of label, feature vector pairs as input and returns nothing during training. The trained network is used for matching or prediction on newly input biometric information. Does prediction, which takes a feature vector as input and returns an array of values. These values, based on their position and the value itself, determine the label or unknown.

[0161] Voting functions can be executed with neural network 2, e.g., during prediction.

[0162] The system may have more than one neural network 1 for different biometrics. Each would generate feature vectors based on unencrypted input.

[0163] The system may have multiple neural network 2(s), one for each biometric type.
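
For illustration, the prediction path through these elements can be sketched as follows; the helper names (preprocess, embedding_model, classifier) and the cut-off value are assumptions, and any pre-trained network producing Euclidean measurable feature vectors together with any incrementally trainable classifier could fill the two roles.

    import numpy as np

    CUTOFF = 0.5    # assumed threshold below which the prediction is reported as unknown

    def identify(raw_biometric, preprocess, embedding_model, classifier):
        """Prediction path: preprocess, derive a one-way feature vector, then classify."""
        aligned = preprocess(raw_biometric)        # cropping/alignment; may reject the input
        if aligned is None:
            return "CANNOT_PROCESS"

        # Neural network 1 (pre-trained): unencrypted biometric in,
        # one-way, Euclidean measurable feature vector out.
        feature_vector = embedding_model(aligned)

        # The unencrypted biometric is no longer needed; only the vector is retained.
        del raw_biometric, aligned

        # Neural network 2 (deep learning classifier): feature vector in,
        # array of values out; position and value determine the label or unknown.
        values = classifier(feature_vector)
        best = int(np.argmax(values))
        return best if values[best] >= CUTOFF else "UNKNOWN"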
Modifications and variations of the discussed embodiments will be apparent to those of ordinary skill in the art and all such modifications and variations are included within the scope of the appended claims. An illustrative implementation of a computer system 800 that may be used in connection with any of the embodiments of the disclosure provided herein is shown in FIG. 8. The computer system 800 may include one or more processors 810 and one or more articles of manufacture that comprise non-transitory computer-readable storage media (e.g., memory 820 and one or more non-volatile storage media 830). The processor 810 may control writing data to and reading data from the memory 820 and the non-volatile storage device 830 in any suitable manner. To perform any of the functionality described herein, the processor 810 may execute one or more processor-executable instructions stored in one or more non-transitory computer-readable storage media (e.g., the memory 820), which may serve as non-transitory computer-readable storage media storing processor-executable instructions for execution by the processor 810.

[0164] The terms "program" or "software" are used herein in a generic sense to refer to any type of computer code or set of processor-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the disclosure provided herein need not reside on a single computer or processor, but may be distributed in a modular fashion among different computers or processors to implement various aspects of the disclosure provided herein.

[0165] Processor-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.

[0166] Also, data structures may be stored in one or more non-transitory computer-readable storage media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a non-transitory computer-readable medium that convey relationships between the fields. However, any suitable mechanism may be used to establish relationships among information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationships among data elements.

[0167] Also, various inventive concepts may be embodied as one or more processes, of which examples (e.g., the processes described with reference to FIGS. 1 and 2A-2B) have been provided. The acts performed as part of each process may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.

[0168] All definitions, as defined and used herein, should be understood to control over dictionary definitions and/or ordinary meanings of the defined terms. As used herein in the specification and in the claims, the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, "at least one of A and B" (or, equivalently, "at least one of A or B," or, equivalently, "at least one of A and/or B") can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.

[0169] The phrase "and/or," as used herein in the specification and in the claims, should be understood to mean "either or both" of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with "and/or" should be construed in the same fashion, i.e., "one or more" of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the "and/or" clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to "A and/or B", when used in conjunction with open-ended language such as "comprising," can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.

[0170] Use of ordinal terms such as "first," "second," "third," etc., in the claims to modify a claim element does not by itself connote any priority, precedence, or order of one claim element over another or the temporal order in which acts of a method are performed. Such terms are used merely as labels to distinguish one claim element having a certain name from another element having a same name (but for use of the ordinal term).

[0171] The phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of "including," "comprising," "having," "containing," "involving," and variations thereof, is meant to encompass the items listed thereafter and additional items.

[0172] Having described several embodiments of the techniques described herein in detail, various modifications and improvements will readily occur to those skilled in the art. Such modifications and improvements are intended to be within the spirit and scope of the disclosure. Accordingly, the foregoing description is by way of example only, and is not intended as limiting. The techniques are limited only as defined by the following claims and the equivalents thereto.

1. A privacy-enabled biometric system comprising:
   at least one processor operatively connected to a memory;
   a classification component executed by the at least one processor, including a classification network having a deep neural network ("DNN") configured to classify feature vector and label inputs during training and return a label for person identification or an unknown result during prediction, wherein the classification component is further configured to accept as an input feature vectors that are Euclidean measurable;
   the classification network having an architecture comprising a plurality of layers: at least one layer comprising nodes associated with feature vectors, the at least one layer having an initial number of identification nodes and a subset of the identification nodes that are unassigned;
   wherein the system, responsive to input of biometric information for a new user, is configured to trigger an incremental training operation for the classification network integrating the new biometric information into a respective one of the unallocated identification nodes usable for subsequent matching.

2. The system of claim 1, wherein the system is configured to monitor allocation of the unallocated identification nodes and trigger a full retraining of the classification network responsive to assignment of the subset of unallocated nodes.

3. The system of claim 2, wherein the system is configured to execute a full retraining of the classification network to include additional unallocated identification nodes for subsequent incremental retraining of the DNN.

4. The system of claim 3, where the system iteratively fully retrains the classification network upon depletion of unallocated identification nodes with additional unallocated nodes for subsequent incremental training.

5. The system of claim 1, wherein the system is further configured to monitor matching of new biometric information to existing identification nodes in the classification network.

6. The system of claim 5, wherein the system is further configured to trigger integration of new biometric information into existing identification nodes responsive to exceeding a threshold associated with matching new biometric information.

7. The system of claim 1, wherein the pre-trained network is further configured to generate one way homomorphic, Euclidean measurable, feature vectors for the individual.

8. The system of claim 1, wherein the classification component is further configured to return a set of probabilities for matching a set of existing labels.

9. The system of claim 2, wherein the classification component is further configured to predict an outcome based on a trained model, a set of inputs for the prediction, and a result of a class or unknown.

10. The system of claim 1, wherein the classification component further comprises multiple neural networks for processing respective types of biometric information.

11. The system of claim 1, wherein the classification component is further configured to generate an identity of a person responsive to at least two probable biometric indicators that may be used simultaneously or as part of a voting algorithm.

12. A computer implemented method for privacy-enabled biometric analysis, the method comprising:
   instantiating, by at least one processor, a classification component comprising a classification network having a deep neural network ("DNN") configured to classify feature vector and label inputs during training and return a label for person identification or an unknown result during prediction, and wherein the classification component is further configured to accept as an input feature vectors that are Euclidean measurable and return the unknown result or the label as output;
   wherein instantiating the classification component includes an act of allocating, within at least one layer of the classification network, an initial number of classes and having a subset of the class slots that are unassigned;
   triggering, responsive to input of biometric information for a new user, an incremental training operation for the classification network integrating the new biometric information into a respective one of the unallocated class slots usable for subsequent matching.

13. The method of claim 12, further comprising acts of:
   accepting, by the classification component, as an input feature vectors that are Euclidean measurable generated by a feature vector generation component;
   classifying, by the classification component executed on at least one processor, the feature vector inputs; and
   returning, by the classification component, a label for person identification or an unknown result.

14. The method of claim 12, further comprising acts of:
   instantiating a feature vector generation component comprising a pre-trained neural network; and
   generating, by the feature vector generation component, Euclidean measurable feature vectors as an output of at least one layer in the pre-trained neural network responsive to input of an unencrypted biometric input.

15. The method of claim 12, further comprising an act of monitoring, by the at least one processor, allocation of the unallocated identification classes and triggering an incremental retraining of the classification network responsive to assignment of the subset of unallocated nodes to provide additional unallocated classes.

16. The method of claim 12, further comprising an act of monitoring, by the at least one processor, allocation of the unallocated identification nodes and triggering a full retraining or incremental retraining of the classification network responsive to assignment of the subset of unallocated nodes.

17. The method of claim 16, further comprising an act of executing a full retraining of the classification network to include additional unallocated classes for subsequent incremental retraining of the DNN.

18. The method of claim 17, further comprising an act of fully retraining the classification network iteratively upon depletion of unallocated identification nodes, the full retraining including an act of allocating additional unallocated nodes for subsequent incremental training.

19. The method of claim 12, further comprising an act of monitoring matching of new biometric information to existing identification nodes.

20. The method of claim 19, further comprising an act of triggering integration of new biometric information into existing identification nodes responsive to exceeding a threshold associated with matching new biometric information.

21. The method of claim 12, further comprising an act of generating one way homomorphic, Euclidean measurable, labels for person identification responsive to input of Euclidean measurable feature vectors for the individual by the classification component.

22. A non-transitory computer readable medium containing instructions that, when executed by at least one processor, cause a computer system to execute a method for executing privacy-enabled biometric analysis, the method comprising:
   instantiating a classification component comprising a classification network having a deep neural network ("DNN") configured to classify feature vector and label inputs during training and return a label for person identification or an unknown result during prediction, and wherein the classification component is further configured to accept as an input feature vectors that are Euclidean measurable and return the unknown result or the label as output;
   wherein instantiating the classification component includes an act of allocating, within at least one layer of the classification network, an initial number of classes and having a subset of additional classes that are unassigned;
   triggering, responsive to input of biometric information for a new user, an incremental training operation for the classification network integrating the new biometric information into a respective one of the unallocated identification nodes usable for subsequent matching.
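
The node-allocation behaviour recited in claims 1-4 and 12 can be pictured as bookkeeping over identification slots: an initial number of nodes is created with a subset left unassigned, enrolling a new user consumes one unassigned node via incremental training, and depletion of the unassigned nodes triggers a full retraining that provisions additional ones. The sketch below illustrates only that bookkeeping; the slot counts and method names are assumptions, and the training calls are placeholders.

    class ClassificationNetwork:
        """Toy model of the claimed slot management; no real training is performed."""

        def __init__(self, initial_nodes=100, unassigned=20):
            self.assigned = {}                                    # label -> node index
            self.free_nodes = list(range(initial_nodes - unassigned, initial_nodes))
            self.next_node = initial_nodes

        def enroll(self, label, feature_vectors):
            if not self.free_nodes:                               # depletion of unassigned nodes
                self.full_retrain(extra_unassigned=20)            # triggers a full retraining
            node = self.free_nodes.pop(0)
            self.assigned[label] = node
            self.incremental_train(label, feature_vectors, node)  # incremental training for the new user

        def incremental_train(self, label, feature_vectors, node):
            pass    # placeholder: incrementally train the DNN on the (label, feature vector) pairs

        def full_retrain(self, extra_unassigned):
            # placeholder: fully retrain and provision additional unassigned nodes
            self.free_nodes = list(range(self.next_node, self.next_node + extra_unassigned))
            self.next_node += extra_unassigned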
