
sensors

Article
Analyte Quantity Detection from Lateral Flow Assay
Using a Smartphone
Kamrul H. Foysal 1,† , Sung Eun Seo 2,† , Min Ju Kim 2 , Oh Seok Kwon 2,3, * and Jo Woon Chong 1, *
1 Department of Electrical & Computer Engineering, Texas Tech University, Lubbock, TX 79409, USA;
kamrul.foysal@ttu.edu
2 Infectious Disease Research Center, Korea Research Institute of Bioscience and Biotechnology (KRIBB),
125 Gwahak-ro, Yuseong-gu, Daejeon 34141, Korea; eun93618@gmail.com (S.E.S.);
mjkim4427@gmail.com (M.J.K.)
3 Nanobiotechnology and Bioinformatics (Major), University of Science & Technology (UST), 125 Gwahak-ro,
Yuseong-gu, Daejeon 34141, Korea
* Correspondence: oskwon7799@gmail.com (O.S.K.); j.chong@ttu.edu (J.W.C.)
† These authors contributed equally to this work.

Received: 24 October 2019; Accepted: 30 October 2019; Published: 5 November 2019

Abstract: Lateral flow assay (LFA) technology has recently received interest in the biochemical field
since it is simple, low-cost, and rapid, while conventional laboratory test procedures are complicated,
expensive, and time-consuming. In this paper, we propose a robust smartphone-based analyte
detection method that estimates the amount of analyte on an LFA strip using a smartphone camera.
The proposed method can maintain high estimation accuracy under various illumination conditions
without additional devices, unlike conventional methods. The robustness and simplicity of the
proposed method are enabled by novel image processing and machine learning techniques. For the
performance analysis, we applied the proposed method to LFA strips where the target analyte is
albumin protein of human serum. We use two sets of training LFA strips and one set of testing LFA
strips. Here, each set consists of five strips having different quantities of albumin—10 femtograms,
100 femtograms, 1 picogram, 10 picograms, and 100 picograms. A linear regression analysis
approximates the analyte quantity, and then a machine learning classifier, a support vector machine
(SVM) trained on the regression results, classifies the analyte quantity on the LFA strip in an
optimal way. Experimental results show that the proposed smartphone application can detect the
quantity of albumin protein on a test LFA set with 98% accuracy, on average, in real time.

Keywords: LFA pad; analyte detection; smartphone; LFA reader

1. Introduction
Lateral flow assay (LFA) [1] strips are widely used for analyte detection in a given liquid sample.
This technology has rapidly gained interest due to its low-cost, simple, and swift detection of
biochemical compounds in samples [2–4]. Specifically, LFA pads are cellulose-based strips containing
specific reagents which are activated when the target analyte is present in the sample. LFA has
been adopted in various fields of study, e.g., point of care [5,6], agriculture [7], food [8], medicine,
and environmental science [9]. Moreover, LFA is important for diagnosis, such as pregnancy, heart
failure [10], contamination [11], and detection of drugs of abuse [12]. Some of these LFAs are particularly
designed for point of care testing [13,14].
Existing analyte detection methods using LFA strips can be categorized into qualitative and
quantitative approaches [15]. Qualitative analysis is performed by assessing colors at the region of
interest (ROI) visually [13]. The rapid results of qualitative tests help make immediate decisions.

However, qualitative analysis provides subjective judgement and may give a wrong interpretation [16].
Hence, quantitative methods are needed where the results are not clear with qualitative analysis.
Quantitative analysis of LFA is shown to facilitate point-of-care testing compared to qualitative
analysis [16–19]. Quantitative approaches require a method that can readily deliver quantitative
measurements in a fast and accurate way. Due to their superior processing ability and portability,
smartphones are being used for acquiring and processing biomedical signals and images [20–22].
Existing quantitative analysis requires external devices [23,24] in addition to a smartphone, such as an
external lens, LFA housing device, or LED strip/optical fiber. Standalone devices [25] adopt specific
processors, light sources, and housing for the LFA strip and are configured to exclusively perform analyte
quantity detection [26,27]. For automated quantification and rapid decisions, mobile camera-based
LFA detection methods have been highlighted, which adopt specific lighting conditions [28] or external
housing attached to a smartphone [23].
For quantification of the result, optical strip readers are commonly attached to a smartphone.
Images of the strips are captured with the smartphone camera, and then the optical strip reader
processes the image using image processing software that does, for example, threshold-based color
segmentation [29] to quantitatively measure the amount of analyte present in the sample. This procedure
requires placing the LFA and light source precisely when the smartphone captures the image and also
needs proper illumination for high contrast between ROI and background, which is not always easy to
achieve [30]. In order to provide more accurate quantitative analysis, an LFA analyte detection system
that is resilient to diverse illumination conditions needs to be developed [4]. A smartphone-based
automated method that quantifies the analyte present in a sample is promising due
to its portability and ease-of-use.
Several techniques have been adopted for analyte detection in LFA, and recent smartphone-based
detection techniques are highlighted. Carrio et al. [12] and Cooper et al. [28] demonstrated that
smartphone-based LFA detection can be applied to the quantification of drugs of abuse. However, these
methods require devices other than a smartphone. In this paper, we propose a novel smartphone-based
analyte quantity detection method for an LFA strip, which detects the amount of analyte in an LFA
strip using a smartphone without an external device. We also design the proposed method to be
resilient to diverse lighting conditions. Using optical methods, the application can easily detect test line
and control line pixels’ intensity values without using an external strip reader device, which increases
portability and convenience. Unlike a standalone LFA strip reader device, our developed system is
portable, accurate, and convenient. Using an image processing technique, this application can replace
a cumbersome reader device and effectively measure the presence of the analyte in a sample liquid in a
quantitative way. Compared to the conventional technologies, our proposed method maintains its
estimation performance under diverse lighting conditions without requiring an external LFA housing
device. The main contributions of this paper are summarized as follows:

• A convenient and portable alternative to read LFA strips using solely a smartphone.
The conventional techniques require external devices. Our proposed method uses a novel
smartphone application that instantly detects an analyte quantity in an LFA strip without any
prior calibration or any requirement of an external device.
• Resilience to various lighting conditions in detecting the analyte amount in an LFA strip.
The performance of existing techniques is affected by the environment such as lighting conditions.
Our proposed algorithm can accurately estimate the quantity of the analyte in an LFA strip in
diverse lighting conditions.

2. Materials

2.1. Input Data: LFA Pads


An LFA strip contains four primary parts—sample pad, conjugate pad, membrane, and absorbent
pad [31]. The sample pad is the region where the sample solution is distributed. The sample solution
is designed to flow from the sample pad to the conjugate pad, which contains the conjugate labels
(i.e., gold nanoparticles) and antibodies. Figure 1 shows the architecture and working principle of an
LFA strip. The conjugate tags bind to the analyte molecules of the sample solution and flow along
the strip. The membrane is made of nitrocellulose material. As the sample moves along, reagents
situated on the test lines of the porous membrane capture the conjugated labels bound to the target
analyte. The control line on the membrane consists of specific antibodies for the particular conjugate
and captures the conjugate tags. These phenomena are observed as the red color in the test/control
lines. The color intensity of the test line is expected to vary proportionally to the quantity of the
analyte in the test sample. However, there is no color change in the test line if the target analyte is
absent in the sample. A colored control line assures that the test has been performed well [32], as
shown in Figure 1.

Figure 1. Lateral flow assay (LFA) strip architecture where the analyte is detected in the test line, and
the red control line indicates that the test was performed.
The analyte quantity in the sample is expected to be proportional to the intensity of the red
colored test line region. Hence, a weighted summation of red color in the test line region can be
considered as the parameter for the detection of analyte quantity, where the weight is considered to
be the color intensity of each pixel. Figure 2a shows a sample LFA test strip with its constituent parts,
while Figure 2b shows the ROI (see rectangular box) and the test and the control lines (see orange
and blue lines).

Figure 2. (a) Sample LFA strip and (b) region of interest of an LFA strip with an analyte. The intensity
and density of red color in the test line region determine the amount of analyte in the sample.

2.2. Data Acquisition Using a Smartphone Camera

LFA strips of known quantities of target analyte on sample pads were obtained from the Korea
Research Institute of Bioscience & Biotechnology (KRIBB). The target analyte was albumin protein
(albumin from human serum acquired from Sigma Aldrich, Saint Louis, MO, USA [33], product
number A9511), and the conjugate tags were gold nanoparticles with an albumin antibody. A solvent
of pH 7.4 PBS (phosphate buffered saline, purchased from Gibco, Paisley, UK) was used to prepare
10 microliters (µL) of albumin sample solution. Here, we considered the following five concentration
levels: 10 ng/mL, 1 ng/mL, 100 pg/mL, 10 pg/mL, and 1 pg/mL. Three sets of samples were available
for the study, with each set containing five LFA strips having five different concentrations. As a result,
the corresponding analyte quantities for the five concentration levels are 10 ng/mL × 10 µL = 100 pg,
1 ng/mL × 10 µL = 10 pg, 100 pg/mL × 10 µL = 1 pg, 10 pg/mL × 10 µL = 100 fg, and
1 pg/mL × 10 µL = 10 fg, respectively. An Institutional Review Board (IRB) assessment was not
performed because the sample was purchased. Figure 3 shows sample images of LFA set #1.
Each reading was taken five times to reduce error. Hence, a total of 15 LFA strips were used, and
75 readings were obtained in this study. Two sets, each consisting of five LFA strips of the
aforementioned quantities (sets #1 and #2), were used to train the machine learning model, and the
other set was used for validating and testing the model of the proposed method. The data acquisition
procedure was performed with an Android smartphone (Samsung Galaxy S7 Edge, Samsung
Electronics, Seoul, South Korea [34]) running Android version 8.0.0. We used the rear camera of the
smartphone, which had a maximum resolution of 12 megapixels (4032 × 3024 pixels). The autofocus
function was enabled, and the flashlight was turned off during the data acquisition procedure.
Images of the LFA strips were captured under different ambient lighting conditions in the laboratory.
The room had a floor area of 41.475 m² and was illuminated with a 36-Watt ceiling light and a
27-Watt table lamp. The method was evaluated on the acquired data using MATLAB 2018b
(The MathWorks, Inc., Natick, Massachusetts, USA) [35].
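The conversion from concentration to per-strip analyte quantity above is plain unit arithmetic; the following minimal Python sketch (ours, not part of the paper's tooling) reproduces the five quantities:

```python
# Reproduce the analyte quantities of Section 2.2: quantity = concentration x volume.
# Concentrations in pg/mL; the dispensed sample volume is 10 uL = 0.01 mL.
concentrations_pg_per_ml = [10_000, 1_000, 100, 10, 1]  # 10 ng/mL down to 1 pg/mL
volume_ml = 0.01

for c in concentrations_pg_per_ml:
    quantity_pg = c * volume_ml
    # Report in pg or fg, matching the labels used in the paper (100 pg ... 10 fg).
    label = f"{quantity_pg:g} pg" if quantity_pg >= 1 else f"{quantity_pg * 1000:g} fg"
    print(f"{c:>6} pg/mL x 10 uL = {label}")
```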

Figure 3. LFA strips for different quantities of analyte under laboratory ambient lighting condition.
3. Methods

3.1. Feature Extraction

We considered the number of red pixels accumulated on the test line as a feature. We assumed that
the cumulative sum of red pixels' intensities in the test line varied proportionally with analyte quantity
under a specific lighting environment. To extract this feature, we performed the following: (1) selecting
the test line region and the control line region, (2) segmenting the image via optimal thresholding using
Otsu's method [36], and (3) calculating the weighted sum of red pixels' intensities, which are explained in
Sections 3.1.1, 3.1.2, and 3.1.3, respectively. During all these steps, image preprocessing and machine
learning techniques were applied as described in these sections.

3.1.1. Test and Control Line Region Selection

To extract the test and control line regions, the acquired image is cropped with fixed dimensions
and a fixed aspect ratio [37]. This technology has been readily available in current-generation cellular
devices, and the procedure has been used in printing, extracting thumbnails, and framing digital
pictures. Accurate placement of the LFA strip under the camera is required for exact approximation
of analyte quantity. Figure 4a shows an example of an accurate placement of an LFA under a
smartphone camera, where the ROI sits exactly in the center box of the 3 × 3 grid of the camera view.
Then, the center grid region's dimensions are readily extracted from the acquired LFA image.
Figure 4b shows the indexes of the center box's boundary points: ((1, 1), (2, 1), (1, 2), (2, 2)).
The developed application automatically crops and extracts the test and control line regions from the
center grid point locations of the image. The regions are obtained from inner grid indexes for the test
line ((0, 1), (3, 1), (0, 2), (3, 2)) and the control line ((0, 2), (3, 2), (0, 3), (3, 3)). As shown in Figure 4c,
the control line and the test line are identified as region A and region B, respectively. These regions
are then considered for further calculation.
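As a rough illustration of this grid-based selection, the sketch below slices the center cell of a 3 × 3 grid from the captured frame and splits it into the two line regions. The function name and the half-and-half split of the center cell are our assumptions for illustration; the paper's inner-grid indexes may partition the cell differently.

```python
import numpy as np

def extract_line_regions(img: np.ndarray):
    """Illustrative grid-based region selection (cf. Section 3.1.1 and Figure 4).

    Assumes the LFA ROI fills the center cell of the camera's 3 x 3 grid, with
    the control line in the upper half of the cell and the test line in the
    lower half (a simplification of the paper's inner-grid indexing).
    """
    h, w = img.shape[:2]
    # Center cell of the 3 x 3 grid: rows h/3..2h/3, columns w/3..2w/3.
    center = img[h // 3 : 2 * h // 3, w // 3 : 2 * w // 3]

    ch = center.shape[0]
    control_region = center[: ch // 2]  # region A in Figure 4c
    test_region = center[ch // 2 :]     # region B in Figure 4c
    return control_region, test_region
```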

Figure 4. (a) A 3 × 3 grid of the smartphone camera view with proper positioning of LFA, (b) dimension
extraction of regions from the center box ((1, 1), (2, 1), (1, 2), (2, 2)), with the test line and the control line
position indexes ((0, 1), (3, 1), (0, 2), (3, 2)) and ((0, 2), (3, 2), (0, 3), (3, 3)), respectively, and (c) the control
line and the test line regions (marked as A and B, respectively).

3.1.2. Image Segmentation via Optimal Thresholding Using Otsu's Method

As mentioned in Section 2, we captured the LFA strip images using a Samsung Galaxy S7
smartphone camera with a resolution of 12 megapixels (4032 × 3024 pixels). The Samsung Galaxy S7
camera generated LFA images in Alpha-Red-Green-Blue (ARGB) 32-bit color space consisting of
alpha (A), red (R), green (G), and blue (B) channels. Here, the A, R, G, and B channels represent opacity
and red, green, and blue intensities, respectively. Figure 5a,b show the ARGB representation of an
image and the 32-bit representation of a pixel in the image, respectively. Each pixel can have 2^32,
or 4,294,967,296, different representations since a pixel is represented by 32 bits.
Denoting the red color intensity by I_R(x, y) and the green color intensity by I_G(x, y), we observed
that the ratio of red to green channel intensity (I_R(x, y)/I_G(x, y)) was noticeably different near the
test/control line regions compared to other regions, as shown in Figure 6. Specifically, I_R(x, y)/I_G(x, y)
values for pixels of an LFA strip image were nearly unity except for the test and control line regions.

Figure 5. Representation of an image obtained using a Samsung Galaxy S7 smartphone. (a) Alpha-
Red-Green-Blue (ARGB) representation of an image. An ARGB image consists of four planes, which
are the alpha, red, green, and blue channels. Each channel's intensity value can vary from 0 to 255.
A pixel in the original image shows the combination of the intensity values of the color channels, and
(b) the 32-bit representation of a pixel of the image in (a). In an ARGB image, each pixel corresponds to
a specific 32-bit binary value, of which the lowest eight bits represent blue, the next eight bits represent
green, the next eight bits represent red, and the highest eight bits represent the alpha channel intensity
values. The pixel's color is the combination of these color intensities.

Figure 6. Ratio of red channel to green channel of a sample LFA strip is visualized by (a) a 2D surface,
and (b) a 3D surface. It is clear that the test and control lines had the maximum value that can be used
to differentiate them from other regions.

Otsu's multilevel thresholding method [36,38] has been included in this paper, which thresholds
an image automatically by minimizing the intra-class intensity variance. This method was applied
to the extracted control line to obtain an optimal threshold value (TH_mask) for masking. This masking
is used to only consider the ROI for calculation. The threshold value was applied to the whole image
to create the mask, which consists of the points whose red to green channel intensity ratio values are
higher than TH_mask. Denoting the mask by I_mask(x, y), the mask intensity value at the pixel
location (x, y) was calculated by the following equation:

    I_mask(x, y) = { 1, if I_R(x, y) / I_G(x, y) > TH_mask,
                   { 0, otherwise.                                   (1)
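A sketch of this masking step on RGB images stored as NumPy arrays. It substitutes scikit-image's single-level threshold_otsu for the multilevel variant cited in the paper, which is a simplification on our part:

```python
import numpy as np
from skimage.filters import threshold_otsu  # single-level Otsu; the paper cites a multilevel variant [36,38]

def red_green_mask(img_rgb: np.ndarray, control_region_rgb: np.ndarray) -> np.ndarray:
    """Sketch of Equation (1): keep pixels whose R/G ratio exceeds TH_mask."""
    eps = 1e-6  # guard against division by zero in dark pixels
    ratio = img_rgb[..., 0].astype(float) / (img_rgb[..., 1].astype(float) + eps)
    ratio_control = control_region_rgb[..., 0].astype(float) / (
        control_region_rgb[..., 1].astype(float) + eps
    )
    th_mask = threshold_otsu(ratio_control)   # optimal threshold from the control line
    return (ratio > th_mask).astype(np.uint8)  # I_mask(x, y) of Equation (1)
```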

Figure 7a has been obtained from MATLAB to visualize the color difference between the ROI
and the non-ROI region. In the test and the control line regions of the image, the red intensity values
(in the range near 160) are larger than the green intensity values (in the range near 130), while the
other regions had comparable red and green intensity values. Evidently, the intensity ratio of red
to green color channels showed a higher difference in the test and control line regions, i.e., a higher
I_R(x, y)/I_G(x, y) value. Using the optimal threshold (TH_mask), the mask image was obtained, which
is shown in Figure 7b. Figure 7c shows the cropped final image of the selected control line and test
line regions.
Figure 7. (a) Visualization of two pixels in a sample LFA strip image with red, green, and blue channel
intensity values. Pixel A is in the test line region and pixel B is in the non-ROI region, (b) the calculated
mask where the red to green intensity ratio was greater than the threshold TH_mask, and (c) the
extracted control line and test line regions are shown.

3.1.3. Calculation of the Weighted Sum of Red Pixels' Intensities

We observed that the sum of color intensities of the red pixels on the test line region tends to
increase as the analyte quantity increases under a specific lighting environment, as shown in Figure 8.
From this observation, we assumed that the counted red pixels' intensity values on the test line, taken
from the captured smartphone camera image, can be used to estimate the analyte quantity of the
LFA strip.

Figure 8. (a) Bar chart for the median of the readings for the weighted sum of the test line red pixels'
intensity values against the quantity of analyte present in the sample in a specific lighting condition;
(b) the boxplot for multiple readings for each analyte quantity.
Equation (2) shows the method for calculating the weighted summation of red pixels' intensity
values in a region (S_region):

    S_region = Σ I_R(x, y).    (2)

This equation is applied throughout the extracted control and test lines. By calculating the weighted
summation of the red pixels' intensity values of the control line and test line regions, S_control and
S_test were obtained from the masked image.
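Equation (2) reduces to a masked sum over the red channel; a minimal sketch, assuming the regions and masks produced in the previous steps:

```python
import numpy as np

def weighted_red_sum(region_rgb: np.ndarray, region_mask: np.ndarray) -> float:
    """Sketch of Equation (2): S_region as the sum of red intensities I_R(x, y)
    over the masked pixels of a region."""
    red = region_rgb[..., 0].astype(float)
    return float((red * region_mask).sum())

# S_control and S_test then follow from the two extracted regions, e.g.:
# s_control = weighted_red_sum(control_region, control_mask)
# s_test = weighted_red_sum(test_region, test_mask)
```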

3.2.
3.2. Multiclass
Multiclass Classification
Classification Using
UsingMachineLearning
Machine LearningTechniques
Techniques

3.2.1.
3.2.1. Input
Input Parameter
Parameter for
for Classification
Classification(Test
(Testto
toControl
ControlLine
LineSignal
SignalIntensity
Intensity(T/C)
(T/C)Ratio)
Ratio)
The
Theproposed
proposedmethod
methodutilized
utilizedaaregression
regressionanalysis
analysis[39,40]
[39,40]totoapproximate
approximatethe theanalyte
analyte quantity
quantity
and predict the value using machine learning techniques. As an input parameter
and predict the value using machine learning techniques. As an input parameter for regression for regression analysis,
we considered
analysis, the ratio ofthe
we considered theratio
test to
ofcontrol
the testline signal intensity
to control line signal(T/Cintensity
ratio), since
(T/Cthe T/C ratio
ratio), sinceincrease
the T/C
proportionally with analyte quantity
ratio increase proportionally despitequantity
with analyte of variation
despitein illumination. The
of variation in T/ C ratio is The
illumination. expressed in
T/C ratio
Equation (3),
is expressed in Equation (3),
Stest
TCratio = 𝑆test , (3)
T/C ratio =Scontrol , (3)
𝑆 control
where S_test and S_control are the weighted sums of red pixels' intensity values of the test and control
lines, respectively. In this paper, we observed that the T/C ratio remained stable in different luminous
environments for the same analyte quantity. Figure 9 shows two readings of the same LFA strip under
two different ambient light environments. To simulate these two different lighting environments,
a 36-Watt ceiling light and a 27-Watt table lamp were used in a laboratory with a 41.475 m² floor area
(see Section 2 for the setup). The T/C ratios obtained differed by only 0.15%, which validates the T/C
ratio as a classifier parameter since it is resilient to different lighting conditions.
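Equation (3) and the lighting-stability check are a few lines of Python; the two readings below are illustrative stand-ins, not the paper's raw data:

```python
def tc_ratio(s_test: float, s_control: float) -> float:
    """Equation (3): test to control line signal intensity ratio."""
    return s_test / s_control

# Stability check in the spirit of Figure 9: percentage difference between two
# readings of the same strip under the ceiling light and the table lamp.
r_ceiling, r_lamp = 0.764, 0.762  # hypothetical T/C readings
pct_diff = abs(r_ceiling - r_lamp) / ((r_ceiling + r_lamp) / 2) * 100
print(f"{pct_diff:.2f}% difference between lighting conditions")
```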

Figure 9. LFA strip and calculated T/C ratio for two different lighting conditions: (a) 36-Watt LED
ceiling lamp and (b) 27-Watt LED table lamp illuminated environments, showing a similar T/C ratio.

The T/C ratios from different readings of the available LFA sets (mentioned in Section 2), set #1,
set #2, and set #3, are shown in Tables 1–3.

Table 1. Calculated T/C ratio for LFA set #1.

Test Class   Reading 1   Reading 2   Reading 3   Reading 4   Reading 5
100 pg       0.764       0.782       0.772       0.758       0.762
10 pg        0.672       0.618       0.502       0.536       0.596
1 pg         0.434       0.404       0.394       0.452       0.380
100 fg       0.21        0.234       0.20        0.18        0.194
10 fg        0           0           0           0           0

Table 2. Calculated T/C ratio for LFA set #2.

Test Class   Reading 1   Reading 2   Reading 3   Reading 4   Reading 5
100 pg       0.806       0.807       0.82        0.85        0.836
10 pg        0.634       0.622       0.674       0.67        0.668
1 pg         0.456       0.456       0.438       0.472       0.426
100 fg       0.246       0.242       0.212       0.226       0.176
10 fg        0           0           0           0           0

Table 3. Calculated T/C ratio for LFA set #3.

Test Class   Reading 1   Reading 2   Reading 3   Reading 4   Reading 5
100 pg       0.796       0.828       0.882       0.64        0.798
10 pg        0.634       0.61        0.608       0.722       0.64
1 pg         0.426       0.5         0.496       0.416       0.476
100 fg       0.238       0.204       0.208       0.194       0.22
10 fg        0           0           0           0           0

3.2.2. Regression and Classification

Visual detection methods, which have been used to detect analyte quantity, are qualitative and
error-prone due to the absence of quantification parameters. From the conjugation of albumin with gold
nanoparticle tags, a comparative valuation of quantifiable color density is possible. Figure 10 shows
the box whisker plot of the acquired T/C ratios for different readings.

Figure 10. Analyte quantity against T/C ratio plot for three different LFA sets.

A regression analysis was performed based on the feature parameter (T/C ratio) from Tables 1–3.
The approximations obtained from the regression analysis were utilized for classification into the
categories mentioned in Section 2.2 using a machine learning technique. A linear support vector machine
(SVM) classifier [41] was adopted for the classification of analyte quantity categories. The input
parameter for the SVM is the approximated quantities obtained from the regression analysis. After
training the linear SVM classifier using two training LFA sets with known analyte quantities, the trained
classifier estimates the analyte quantity of a test LFA set.

3.3. Developed Smartphone Application

A smartphone application was developed on the Android platform following the proposed method
described in Sections 3.1 and 3.2. Figure 11a shows the flowchart of our developed application's
operation, and Figure 11b shows an example of data acquisition using our developed application.

Figure 11. (a) The flowchart describing the subsequent operations of our developed application; (b) the
data acquisition procedure with accurate placement using our developed application.

The smartphone application was tested on a sample LFA to evaluate the performance in a real-life
scenario. The testing procedure was as follows: a test strip was placed under the smartphone on
a white background. Using the grid view of the camera, the ROI was positioned inside the center box.
The image was captured, and the ROI was obtained. The application then created a mask using the
preprocessing technique mentioned in the proposed algorithm. In the masked region, the weighted
sums of red pixels' intensities of the test and the control line regions were calculated. Then, T/C ratio
values that corresponded to the quantity of the analyte were calculated. Figure 12 shows the
application screens and steps of operation of an LFA strip with a 100 pg analyte quantity. Figure 12c
shows the positioning of the LFA strip in the center box of the grid view. Figure 12e shows the test
and control line with the number of pixels in the mask, and Figure 12f shows the weighted sums of red
pixels' intensities of the control line (6191) and test line (7047).

Figure 12. Proposed application for the detection of an analyte in an LFA strip: (a) home screen,
(b) second screen with an image capture option, (c) camera view with a 3 × 3 grid spacing, (d) captured
raw image, (e) processed image using the created mask (total pixel number shown at the bottom of the
image), and (f) calculated control (left) and test (right) line weighted sum of pixels' intensity values.
For accurate estimation of the quantity of the analyte, it is essential to detect the ROI precisely.
Since mobile devices do not have a fixed position with respect to the test strip, smartphone-based
measurement is usually weak and prone to motion blur artifacts [12]. Moreover, due to the inherent
unstable nature of the human hand (i.e., movement and shaking) and lack of a stable supporting device
(i.e., tripod), the placement of the camera on the exact ROI is difficult, causing the measurement to be
erroneous [42]. Hence, for an accurate comparison analysis, the camera must be placed at a certain
distance from the test strip covering the ROI for each of the test cases. Figure 13 compares correct
(see Figure 13a) and incorrect placements (see Figure 13b) of an LFA strip under a smartphone camera.
As shown in Figure 13b, the larger field of view can lead to errors in calculating the number of pixels in
the ROI.
Figure 13. (a) Exact placement of the LFA strip under a smartphone camera and (b) an improper
placement of the LFA strip, with a larger field of view due to the proximity of the camera to the LFA strip.

In order to counter the challenge of accurate placement, we implemented an additional function
in our developed application which helps users to place the strip in a correct position using grids,
as shown in Figure 14. Specifically, the application shows a 3 × 3 grid view on the screen and prompts
the user to set the control line and test line in a certain position inside the center grid box, as shown in
Figure 14b. Thus, the reference distance is fixed for all experiments. To reduce the error due to motion
blur artifacts, five readings were taken for each case.

Figure 14. (a) Android smartphone (Samsung Galaxy) camera view with a 3 × 3 grid positioning and
(b) LFA strip ROI positioned in the center box.

4. Results

The proposed method first approximates the analyte quantities using the regression analysis,
and then predicts the quantities using the machine learning classifier, as described in Section 3.
Section 4.1 explains the performance metrics used to evaluate the regression and classification
performance of the proposed method. Then, the regression and classification performance are
evaluated in Section 4.2. Section 4.3 shows the smartphone application capability in terms of
functionality and efficiency.
4.1. Performance Metrics
The regression
4.1. Performance analysis of the proposed method shows the relationship between the analyte
Metrics
quantity and the T/C ratio for the acquired readings (mentioned in Section 3.2). The calibration curve,
The regression analysis of the proposed method shows the relationship between the analyte
which can be used to approximate the unknown quantity of analyte in the sample, was obtained from
quantity and the T/C ratio for the acquired readings (mentioned in Section 3.2). The calibration curve,
the regression analysis. From the curve, the R2 value and the standard error of detection (σ) were
which can be used to approximate the unknown quantity of analyte in the sample, was obtained from
calculated according to Equations (4) and (5), respectively.
the regression analysis. From the curve, the R2 value and the standard error of detection (𝜎) were
calculated according to Equations (4) and (5),2 respectively.
SSresidual
R = , (4)
𝑆𝑆
SStotal
𝑅 = , (4)
𝑆𝑆
where SSresidual is the measure of the discrepancy between the data and the estimation model, and SStotal
where
is the sumSSresidual
of theissquares
the measure
of the of the discrepancy
difference between
of the obtained data theand
data
itsand
mean,theand
estimation model, and
SStotal is the sum of the squares of the difference of the obtained data and its mean, and
q
2
∑( )0
P
𝜎 = (Y − Y ,) (5)
σ= , (5)
N
where Y is the actual data, Y′ is the predicted data, and N is the number of acquired data points. The R² value is the statistical measure of the proportion of variance explained by the model, whereas σ measures the precision of the sample mean of detection. Moreover, the performance of the regression is evaluated in terms of the following metrics: limit of detection (LOD), limit of quantification (LOQ), and coefficient of variation (CV) [43,44]. LOD is the lowest concentration of analyte that can be detected reliably and separated from the limit of blank (LOB), where LOB is the maximum apparent quantity of analyte measured when no analyte is present in the sample. LOQ is the lowest quantity of analyte detected with a high confidence level; it is a performance metric for the method or measurement system. CV is the relative variability, calculated as the ratio of the standard deviation to the mean. LOD, LOQ, and CV are calculated according to Equations (6)–(8), respectively.
$$ LOD = LOB + 1.645 \times \frac{\sigma}{m}, \qquad (6) $$

$$ LOQ = 10 \times \frac{\sigma}{m}, \qquad (7) $$

where σ is the standard error of detection, and m is the slope of the calibration curve;

$$ CV = \frac{s}{\mu}, \qquad (8) $$
where s is the standard deviation of the readings, and µ is the mean of the readings for each analyte quantity.
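For concreteness, the computations in Equations (4)–(8) can be reproduced in a few lines of MATLAB, the environment used in this work. The sketch below is illustrative only: the readings, the assumed zero limit of blank, and the logarithmic quantity axis (suggested by Figure 15) are stand-ins, not the released application code.

% Sketch of Equations (4)-(8); all numeric values are illustrative.
logQty  = (0:4)';                           % assumed log-scale quantity axis (decades)
tcRatio = [0; 0.211; 0.440; 0.625; 0.792];  % example mean T/C ratios

p    = polyfit(logQty, tcRatio, 1);         % calibration curve y = m*x + b
m    = p(1);                                % slope of the calibration curve
yHat = polyval(p, logQty);                  % predicted T/C ratios Y'

ssRes = sum((tcRatio - yHat).^2);           % SS_residual
ssTot = sum((tcRatio - mean(tcRatio)).^2);  % SS_total
R2    = 1 - ssRes/ssTot;                    % Equation (4)
sigma = sqrt(mean((tcRatio - yHat).^2));    % Equation (5)

LOB = 0;                                    % assumed limit of blank
LOD = LOB + 1.645*sigma/m;                  % Equation (6)
LOQ = 10*sigma/m;                           % Equation (7)

reps = [0.75 0.82 0.79 0.81];               % example repeated readings at one quantity
CV   = 100*std(reps)/mean(reps);            % Equation (8), as a percentage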
On the other hand, the performance of the classifier is evaluated in terms of accuracy, which is defined as follows:

$$ Accuracy = \frac{TP + TN}{TP + TN + FP + FN} \times 100\%, \qquad (9) $$

where TP, FP, TN, and FN are the numbers of true positives, false positives, true negatives, and false negatives, respectively.
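Applied to a multiclass confusion matrix, Equation (9) reduces to the sum of the diagonal entries over the total number of readings. A minimal MATLAB sketch with an illustrative 5 × 5 confusion matrix:

% Accuracy from a multiclass confusion matrix (Equation (9)):
% correct predictions lie on the diagonal, so TP + TN collapses
% to the trace when accuracy is computed over all classes at once.
C = [10 0 0 0 0;
      1 9 0 0 0;
      0 0 10 0 0;
      0 0 0 10 0;
      0 0 0 0 10];                     % illustrative 5-class confusion matrix
accuracy = 100 * trace(C) / sum(C(:)); % 49/50 readings correct = 98%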
4.2. Regression and Classification Results
Figure 15 shows the regression analysis of the analyte quantity against the T/C ratio. The calibration curve has an equation of the form y = 0.203x + 0.0118, where x is the analyte quantity and y is the T/C ratio. The obtained R² value of 0.9838 from Equation (4) shows a strong correlation between the readings and the fitted calibration curve. The standard error of detection (σ) value of 0.007 was calculated from Equation (5), and the slope of the calibration curve (m) of 0.203 was obtained from the calibration curve. The limit of detection (LOD) of 0.000026 nM (1.71 pg/mL) and the limit of quantification (LOQ) of 0.00039 nM (26.48 pg/mL) were calculated using the values of σ and m according to Equations (6) and (7), respectively. The coefficients of variation (CV) obtained from calculations using Equation (8) are shown in Table 4.

Figure 15. Calibration curves obtained from the regression analysis of the LFA sets: (a) logarithmic and (b) linear.

Table 4. Mean, standard deviation, and coefficient of variation of test-to-control line signal intensity (T/C) ratios for each of the analyte quantities of the LFA sets.

Analyte Quantity   Mean T/C Ratio   Std. Deviation of T/C Ratio   CV (%)
100 pg             0.791502         0.055011                      6.950253
10 pg              0.624684         0.055213                      8.838581
1 pg               0.440393         0.035544                      8.070897
100 fg             0.211211         0.021829                      10.33506
10 fg              0                0                             0
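Once the calibration curve is fixed, approximating an unknown sample reduces to inverting the linear relation. A minimal MATLAB sketch, assuming the fitted curve y = 0.203x + 0.0118 from Figure 15 and, purely for illustration, that x counts decades above 10 fg:

% Approximate an unknown analyte quantity by inverting the calibration
% curve y = m*x + b (coefficients from Figure 15). Mapping x back to
% femtograms as decades above 10 fg is an illustrative assumption.
m = 0.203;  b = 0.0118;          % fitted calibration coefficients
tcMeasured = 0.52;               % example T/C ratio of an unknown strip
xEst   = (tcMeasured - b) / m;   % invert the linear relation
qtyEst = 10 * 10^xEst;           % assumed decade mapping, in fg
fprintf('Estimated quantity: %.0f fg\n', qtyEst);  % ~3186 fg (about 3.2 pg)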

A linear SVM classifier with five-fold cross-validation was trained using the approximated quantities obtained from the two training LFA sets. Each set contained the following analyte quantities: 10 fg, 100 fg, 1 pg, 10 pg, and 100 pg. The confusion matrix for the training sets is shown in Table 5. The SVM decision boundary obtained from the training sets was applied to the test LFA set; each test strip was evaluated 10 times. The confusion matrix for the testing set is shown in Table 6.

Table 5. Confusion matrix for the training sets obtained by the linear support vector machine (SVM)
model. The asterisk sign (*) denotes the misclassified analyte quantity for the training LFA sets.

Predicted Class
Actual Class 100 pg 10 pg 1 pg 100 fg 10 fg
100 pg 10 0 0 0 0
10 pg 1* 9 0 0 0
1 pg 0 0 10 0 0
100 fg 0 0 0 10 0
10 fg 0 0 0 0 10

Table 6. Confusion matrix for testing the method with the SVM. The asterisk sign (*) denotes the
misclassified analyte quantity for the test LFA set.

Predicted Class
Actual Class 100 pg 10 pg 1 pg 100 fg 10 fg
100 pg 10 0 0 0 0
10 pg 1* 9 0 0 0
1 pg 0 0 10 0 0
100 fg 0 0 0 10 0
10 fg 0 0 0 0 10
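The training step itself is a standard multiclass SVM fit. A self-contained MATLAB sketch is given below, assuming the Statistics and Machine Learning Toolbox; the simulated quantities and 5% noise level are illustrative stand-ins for the approximated quantities of the training sets, not the study's data.

% Sketch of linear SVM training with five-fold cross-validation.
rng(1);                                               % reproducible simulated data
qty = [0.01 0.1 1 10 100];                            % the five classes (pg)
X   = repelem(qty', 10) .* (1 + 0.05*randn(50, 1));   % simulated approximated quantities
y   = categorical(repelem((1:5)', 10));               % class labels

t     = templateSVM('KernelFunction', 'linear');      % linear SVM learner
mdl   = fitcecoc(X, y, 'Learners', t);                % multiclass (one-vs-one) wrapper
cvMdl = crossval(mdl, 'KFold', 5);                    % five-fold cross-validation
fprintf('CV accuracy: %.1f%%\n', 100*(1 - kfoldLoss(cvMdl)));

% predict(mdl, xNew) then classifies a new approximated quantity.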

The analyte quantity detection accuracy of the proposed method was calculated to be 98% (49 of 50 readings classified correctly) according to Equation (9). The per-class results show that our smartphone application detected four of the five classes with 100% accuracy; only the 10 pg class had a single misclassification.

4.3. Smartphone Application Capability

Figure 16 shows the screen captures of five readings of a sample set of LFA strips from our application. The proposed quantitative method was able to detect the analyte quantity instantly after it was trained using the SVM. Training of the proposed method required 0.817 s on a cloud computer equipped with 64-bit Windows, an Intel Core i7-6700 CPU @ 3.40 GHz (eight CPUs), 16,384 MB of RAM, and MATLAB (2018b).

Figure 16. Estimation of analyte quantity: (a) 10 fg, (b) 100 fg, (c) 1 pg, (d) 10 pg, and (e) 100 pg, using the developed application. The smartphone application was used for the detection of analyte quantities under various lighting conditions. The number on the left is the control line weighted sum of red pixels' intensities, the number on the right is the test line weighted sum of red pixels' intensities, and the analyte quantity is displayed at the bottom of the screen.

5. Discussion

Ruppert et al. [45] compared their method to the method used by the ImageJ software [46,47], which is a well-known open-platform tool for scientific image analysis. Similarly, we compared our proposed method to the ImageJ software's method [46,47]. For a fair comparison, Otsu's thresholding, which is included in our method, was also included in the ImageJ software.
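For reference, Otsu's thresholding is a one-call operation in MATLAB's Image Processing Toolbox. The sketch below shows how it can be applied to a strip image; the file name and the choice of the red channel are illustrative assumptions, not the application's exact pipeline:

% Otsu's thresholding on a cropped LFA strip image (illustrative).
rgb   = imread('lfa_strip.png');   % hypothetical cropped strip image
red   = rgb(:, :, 1);              % red channel of the colorimetric signal
level = graythresh(red);           % Otsu's method: global threshold in [0, 1]
mask  = imbinarize(red, level);    % binary mask separating lines from background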
Table 7 compares the performance of the proposed method and the method used by the ImageJ software under the ambient lighting condition (mentioned in Section 2). The proposed method shows a 2.96 times lower LOD (0.000077/0.000026 ≈ 2.96) and a 798.92 times lower LOQ (0.31158/0.00039 ≈ 798.92) than the ImageJ method, which indicates that the proposed method was better at quantifying the analyte.

Table 7. Performance comparison between ImageJ software and our proposed method.

        Proposed Method             ImageJ Method
LOD     0.000026 nM (1.71 pg/mL)    0.000077 nM (5.12 pg/mL)
LOQ     0.00039 nM (26.48 pg/mL)    0.31158 nM (20,720.4 pg/mL)

Table 8 compares our proposed method to the conventional methods in terms of functionality. Our method provides a more convenient, portable, robust, and inexpensive solution compared to the existing methods. Moreover, this method has a strong advantage in that only a smartphone is needed to accurately detect the amount of analyte without any additional equipment.

Table 8. Method comparison between existing methods and the proposed method.

                                       Ruppert et al. [45],   RDS 2500 LFA Reader,   Mokkapati et al. [49],      Proposed Method
                                       Roda et al. [48],      Detekt Biomedical      Preechaburana et al. [50]
                                       Balaji et al. [23]     LLC. [25]
Uses smartphone                        Yes                    No                     Yes                         Yes
Requires external device               Yes                    N/A                    No                          No
Works in varying lighting conditions   N/A                    N/A                    No                          Yes
Approximates analyte quantity          Yes                    Yes                    Yes                         Yes
Predicts based on machine learning     No                     No                     No                          Yes

6. Conclusions
In this paper, we have explored the possibility of using a smartphone application to detect the
analyte quantity in a sample from an LFA. Using a novel image processing technique and SVM
algorithm, we have proposed an automated smartphone application–based method for the detection
of analyte quantity in LFA with acceptable accuracy. Our proposed smartphone application can
successfully measure the analyte quantity in a sample from a set of LFA strips of analyte quantity
varying from 10 fg to 100 pg (1 pg/mL to 10 ng/mL in concentration for a 10 µL solution). An overall
accuracy of 98.00% in the detection of unknown analyte quantities validated the promising performance
of our application, which provides a convenient and rapid LFA reading. Our proposed method also
supports the ubiquity (i.e., independent operability under a range of external lighting conditions),
portability (i.e., no external device is required), efficiency (i.e., the method can determine the amount
of analyte readily), and reliability (i.e., the proposed method achieved a reasonable accuracy in the test
case) of our application. It can replace existing methods for the low-cost detection of LFA analytes in
different application fields.

Author Contributions: K.H.F. designed the analysis, developed the software, wrote the original and revised
manuscript, and conducted data analysis and details of the work. S.E.S. collected the data set, re-designed the
research experiment, verified data, and conducted statistical analysis. M.J.K. collected the data and conducted the
analysis. O.S.K. conceptualized and designed the research experiment. J.W.C. wrote the original/revised drafts,
designed, re-designed, and verified image data analysis, and guided direction of the work.
Funding: This research was supported by a National Research Foundation of Korea (NRF) Grant funded by
the Ministry of Science and Information and Communication Technology (ICT) for First-Mover Program for
Accelerating Disruptive Technology Development (NRF-2018M3C1B9069834); the National Research Foundation
of Korea (NRF) Grant funded by the Ministry of Science and Information and Communication Technology (ICT)
(NRF-2019M3C1B8077544); Korea Institute of Planning and Evaluation for Technology in Food, Agriculture and
Forestry (IPET) through the Advanced Production Technology Development Program, funded by Ministry of
Agriculture, Food and Rural Affairs (MAFRA) (318104-3); the Brain Research Program through the National
Research Foundation of Korea (NRF) funded by the Ministry of Science, Information and Communication
Technology (ICT) & Future Planning (NRF-2016M3C7A1905384); and the Korea Research Institute of Bioscience
and Biotechnology (KRIBB) Initiative Research Program.
Conflicts of Interest: The authors declare no conflict of interest.

References
1. Wong, R.C.; Harley, Y.T. Quantitative, false positive, and false negative issues for lateral flow immunoassays
as exemplified by onsite drug screens. In Lateral Flow Immunoassay; Humana Press: New York, NY, USA,
2009; pp. 1–19.
2. Koczula, K.M.; Gallotta, A. Lateral Flow Assays. Essays Biochem. 2016, 60, 111–120. [PubMed]
3. Pronovost, A.D.; Lee, T.T. Assays and Devices for Distinguishing between Normal and Abnormal Pregnancy.
U.S. Patent 5,786,220, 28 July 1998.

4. Mak, W.C.; Beni, V.; Turner, A.P. Lateral-flow technology: From visual to instrumental. TrAC Trends Anal.
Chem. 2016, 79, 297–305. [CrossRef]
5. Manabe, Y.C.; Nonyane, B.A.S.; Nakiyingi, L.; Mbabazi, O.; Lubega, G.; Shah, M.; Moulton, L.H.; Joloba, M.;
Ellner, J.; Dorman, S.E. Point-of-Care Lateral Flow Assays for Tuberculosis and Cryptococcal Antigenuria
Predict Death in HIV Infected Adults in Uganda. PLoS ONE 2014, 9, e101459. [CrossRef] [PubMed]
6. Sun, J.; Xianyu, Y.; Jiang, X. Point-of-care biochemical assays using gold nanoparticle-implemented
microfluidics. Chem. Soc. Rev. 2014, 43, 6239–6253. [CrossRef] [PubMed]
7. Wang, X.; Li, K.; Shi, D.; Xiong, N.; Jin, X.; Yi, J.; Bi, D. Development of an Immunochromatographic
Lateral-Flow Test Strip for Rapid Detection of Sulfonamides in Eggs and Chicken Muscles. J. Agric. Food
Chem. 2007, 55, 2072–2078. [CrossRef]
8. Ching, K.H.; He, X.; Stanker, L.H.; Lin, A.V.; McGarvey, J.A.; Hnasko, R. Detection of Shiga Toxins by Lateral
Flow Assay. Toxins 2015, 7, 1163–1173. [CrossRef]
9. Wu, Y.; Xu, M.; Wang, Q.; Zhou, C.; Wang, M.; Zhu, X.; Zhou, D. Recombinase polymerase amplification (RPA)
combined with lateral flow (LF) strip for detection of Toxoplasma gondii in the environment. Vet. Parasitol.
2017, 243, 199–203. [CrossRef]
10. You, M.; Lin, M.; Gong, Y.; Wang, S.; Li, A.; Ji, L.; Zhao, H.; Ling, K.; Wen, T.; Huang, Y.; et al. Household
Fluorescent Lateral Flow Strip Platform for Sensitive and Quantitative Prognosis of Heart Failure Using
Dual-Color Upconversion Nanoparticles. ACS Nano 2017, 11, 6261–6270. [CrossRef]
11. Ngom, B.; Guo, Y.; Wang, X.; Bi, D. Development and application of lateral flow test strip technology
for detection of infectious agents and chemical contaminants: A review. Anal. Bioanal. Chem. 2010, 397,
1113–1135. [CrossRef]
12. Carrio, A.; Sampedro, C.; Sánchez-López, J.L.; Pimienta, M.; Campoy, P. Automated Low-Cost Smartphone-Based
Lateral Flow Saliva Test Reader for Drugs-of-Abuse Detection. Sensors 2015, 15, 29569–29593. [CrossRef]
13. Kaylor, R.; Yang, D.; Knotts, M. Reading Device, Method, and System for Conducting Lateral Flow Assays.
U.S. Patent 8,367,013, 5 February 2013.
14. Seo, S.E.; Tabei, F.; Park, S.J.; Askarian, B.; Kim, K.H.; Moallem, G.; Chong, J.W.; Kwon, O.S. Smartphone
with optical, physical, and electrochemical nanobiosensors. J. Ind. Eng. Chem. 2019, 77, 1–11. [CrossRef]
15. Imrich, M.R.; Zeis, J.K.; Miller, S.P.; Pronovost, A.D. Lateral Flow Medical Diagnostic Assay Device with
Sample Extraction Means. U.S. Patent 5,415,994, 16 May 1995.
16. Li, Z.; Chen, H.; Wang, P. Lateral flow assay ruler for quantitative and rapid point-of-care testing. Analyst 2019,
144, 3314–3322. [CrossRef] [PubMed]
17. Vashist, S.K.; Luppa, P.B.; Yeo, L.Y.; Ozcan, A.; Luong, J.H. Emerging Technologies for Next-Generation
Point-of-Care Testing. Trends Biotechnol. 2015, 33, 692–705. [CrossRef] [PubMed]
18. Zhang, J.-J.; Shen, Z.; Xiang, Y.; Lu, Y. Integration of Solution-Based Assays onto Lateral Flow Device
for One-Step Quantitative Point-of-Care Diagnostics Using Personal Glucose Meter. ACS Sensors 2016, 1,
1091–1096. [CrossRef]
19. You, D.J.; Park, T.S.; Yoon, J.-Y. Cell-phone-based measurement of TSH using Mie scatter optimized lateral
flow assays. Biosens. Bioelectron. 2013, 40, 180–185. [CrossRef]
20. Shen, L.; Hagen, J.A.; Papautsky, I. Point-of-care colorimetric detection with a smartphone. Lab Chip 2012, 12,
4240. [CrossRef]
21. Cate, D.M.; Adkins, J.A.; Mettakoonpitak, J.; Henry, C.S. Recent Developments in Paper-Based Microfluidic
Devices. Anal. Chem. 2014, 87, 19–41. [CrossRef]
22. Tabei, F.; Zaman, R.; Foysal, K.H.; Kumar, R.; Kim, Y.; Chong, J.W. A novel diversity method for smartphone
camera-based heart rhythm signals in the presence of motion and noise artifacts. PLoS ONE 2019, 14,
e0218248. [CrossRef]
23. Srinivasan, B.; O’Dell, D.; Finkelstein, J.L.; Lee, S.; Erickson, D.; Mehta, S. ironPhone: Mobile device-coupled
point-of-care diagnostics for assessment of iron status by quantification of serum ferritin. Biosens. Bioelectron.
2018, 99, 115–121. [CrossRef]
24. Lee, S.; O’Dell, D.; Hohenstein, J.; Colt, S.; Mehta, S.; Erickson, D. NutriPhone: A mobile platform for low-cost
point-of-care quantification of vitamin B12 concentrations. Sci. Rep. 2016, 6, 28237. [CrossRef]
25. Detekt Biomedical LLC. RDS 2500, Austin, TX, USA. Available online: http://idetekt.com/reader-systems (accessed
on 1 August 2019).

26. Li, Z.; Wang, Y.; Wang, J.; Tang, Z.; Pounds, J.G.; Lin, Y. Rapid and Sensitive Detection of Protein Biomarker
Using a Portable Fluorescence Biosensor Based on Quantum Dots and a Lateral Flow Test Strip. Anal. Chem.
2010, 82, 7008–7014. [CrossRef] [PubMed]
27. Larsson, K.; Kriz, K.; Kříž, D. Magnetic transducers in biosensors and bioassays. Analusis 1999, 27, 617–621.
[CrossRef]
28. Cooper, D.C.; Callahan, B.; Callahan, P.; Burnett, L. Mobile image ratiometry: A new method for instantaneous
analysis of rapid test strips. Nat. Preced. 2012, 10. [CrossRef]
29. Thieme, T.R.; Cimler, B.M.; Klimkow, N.M. Saliva assay method and device. U.S. Patent 5,714,341,
3 February 1998.
30. Eltzov, E.; Guttel, S.; Kei, A.L.Y.; Sinawang, P.D.; Ionescu, R.E.; Marks, R.S.; Adarina, L.Y.K.; Ionescu, E.R.
Lateral Flow Immunoassays—from Paper Strip to Smartphone Technology. Electroanalysis 2015, 27, 2116–2130.
[CrossRef]
31. O’Farrell, B. Evolution in lateral flow–based immunoassay systems. In Lateral Flow Immunoassay; Humana
Press: New York, NY, USA, 2009; pp. 1–33.
32. Danks, C.; Barker, I. On-site detection of plant pathogens using lateral-flow devices. EPPO Bull. 2000, 30,
421–426. [CrossRef]
33. Sigma Aldrich, Merck KGaA, Saint Louis, Missouri, USA. Available online: https://www.sigmaaldrich.com/
(accessed on 2 August 2019).
34. Samsung Galaxy S7, Samsung Electronics, Ridgefield Park, NJ, USA. Available online: https://www.samsung.
com/global/galaxy/galaxy-s7/hardware/ (accessed on 2 August 2019).
35. MATLAB 2018b, The Mathworks Inc. Natick, Massachusetts, USA. Available online: https://www.mathworks.
com/products/matlab.html (accessed on 2 August 2019).
36. Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man, Cybern. 1979, 9,
62–66. [CrossRef]
37. Gignac, J.-P. Method of cropping a digital image. U.S. Patent 10/487,995, 2 December 2004.
38. Priya, M.; Nawaz, D.G.K. Multilevel Image Thresholding using OTSU’s Algorithm in Image Segmentation.
Int. J. Sci. Eng. Res. 2017, 8, 101–105.
39. Zangheri, M.; Di Nardo, F.; Mirasoli, M.; Anfossi, L.; Nascetti, A.; Caputo, D.; De Cesare, G.; Guardigli, M.;
Baggiani, C.; Roda, A. Chemiluminescence lateral flow immunoassay cartridge with integrated amorphous
silicon photosensors array for human serum albumin detection in urine samples. Anal. Bioanal. Chem. 2016,
408, 8869–8879. [CrossRef]
40. Lee, S.; Kim, G.; Moon, J. Performance Improvement of the One-Dot Lateral Flow Immunoassay for Aflatoxin
B1 by Using a Smartphone-Based Reading System. Sensors 2013, 13, 5109–5116. [CrossRef]
41. Weston, J.; Watkins, C. Multi-Class Support Vector Machines; Technical Report CSD-TR-98-04; Department of
Computer Science, Royal Holloway, University of London: London, UK, May 1998.
42. Faulstich, K.; Gruler, R.; Eberhard, M.; Lentzsch, D.; Haberstroh, K. Handheld and portable reader devices
for lateral flow immunoassays. In Lateral Flow Immunoassay; Springer: New York, NY, USA, 2009; pp. 1–27.
43. Armbruster, D.A.; Pry, T. Limit of Blank, Limit of Detection and Limit of Quantitation. Clin. Biochem. Rev.
2008, 29, S49–S52.
44. Armbruster, D.A.; Tillman, M.D.; Hubbs, L.M. Limit of detection (LOD)/limit of quantitation (LOQ):
Comparison of the empirical and the statistical methods exemplified with GC-MS assays of abused drugs.
Clin. Chem. 1994, 40, 1233–1238. [PubMed]
45. Ruppert, C.; Phogat, N.; Laufer, S.; Kohl, M.; Deigner, H.-P. A smartphone readout system for gold
nanoparticle-based lateral flow assays: Application to monitoring of digoxigenin. Microchim. Acta 2019, 186,
119. [CrossRef] [PubMed]
46. ImageJ; National Institutes of Health: Bethesda, MD, USA. Available online: https://imagej.nih.gov/ij/
(accessed on 2 August 2019).
47. Abràmoff, M.D.; Magalhães, P.J.; Ram, S.J. Image processing with ImageJ. Biophotonics Int. 2004, 11, 36–42.
48. Roda, A.; Guardigli, M.; Calabria, D.; Calabretta, M.M.; Cevenini, L.; Michelini, E. A 3D-printed device
for a smartphone-based chemiluminescence biosensor for lactate in oral fluid and sweat. Analyst 2014, 139,
6494–6501. [CrossRef] [PubMed]

49. Mokkapati, V.K.; Sam Niedbala, R.; Kardos, K.; Perez, R.J.; Guo, M.; Tanke, H.J.; Corstjens, P.L. Evaluation of
UPlink–RSV: Prototype Rapid Antigen Test for Detection of Respiratory Syncytial Virus Infection. Ann. N. Y.
Acad. Sci. 2007, 1098, 476–485. [CrossRef] [PubMed]
50. Preechaburana, P.; Macken, S.; Suska, A.; Filippini, D. HDR imaging evaluation of a NT-proBNP test with a
mobile phone. Biosens. Bioelectron. 2011, 26, 2107–2113. [CrossRef]

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).
