PERFORMANCE MONITORING

By
M.S.Mani
N.A.Baxi
H.Madhvani
J.F.D’Souza
V.R.Patel
Objective
• To understand the need for Analyzer Performance Monitoring
• An analyzer differs from the standard test apparatus, and even the basic measurement technique may be different. Nevertheless, the analyzer and the standard test should give identical results, provided that due account is taken of any difference in the precision of the two methods.
Why Analyzer Performance Monitoring?
• True value
The value which characterizes a perfectly defined quantity under the conditions which exist when that quantity is considered.
• Measurement uncertainty
An estimate characterizing the range of values within which the error is asserted to lie. It expresses the fact that, for a given result of measurement, there is not one value but an infinite number of values dispersed about the result with varying degrees of credibility.
• Accuracy
Accuracy describes the nearness of a measurement to the standard or true value; i.e., a highly accurate measuring device will provide measurements very close to the standard, true or known value.
• Precision
Precision is the degree to which several measurements provide answers very close to each other. It is an indicator of the scatter in the data: the less the scatter, the higher the precision.
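As an illustration of the two definitions (a hypothetical sketch; the reference value and readings are invented, not from these slides), bias measures accuracy while the standard deviation measures precision:

```python
import statistics

TRUE_VALUE = 100.0  # hypothetical certified value of the reference sample

# Hypothetical repeated measurements from two analyzers
accurate_but_imprecise = [96.0, 104.0, 99.0, 101.0, 95.0, 105.0]
precise_but_biased = [103.1, 103.0, 103.2, 102.9, 103.1, 103.0]

for name, readings in [("accurate but imprecise", accurate_but_imprecise),
                       ("precise but biased", precise_but_biased)]:
    bias = statistics.mean(readings) - TRUE_VALUE   # nearness to the true value
    scatter = statistics.stdev(readings)            # nearness of readings to each other
    print(f"{name}: bias={bias:+.2f}, stdev={scatter:.2f}")
# -> accurate but imprecise: bias=+0.00, stdev=4.10
# -> precise but biased: bias=+3.05, stdev=0.10
```

The first analyzer centres on the true value but scatters widely; the second repeats tightly but is offset, i.e. precise yet inaccurate.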
Terms to know
• Bias
A consistent difference (either on the positive side or the negative side) between the actual value and the measured value is called the bias or offset.
• Confidence level
The probability, expressed as a decimal or percentage, that the true value lies within a specified range of values.
• Availability Rate
The percentage of the time during which the instrument is functioning correctly during actual operation of the plant.
Terms to know
[Figure: example illustrating Accuracy vs. Precision]
• Reproducibility
Closeness of agreement between independent test results obtained on identical material by different operators using different instruments in different laboratories. The difference between two successive results that would be exceeded in the long run in only 1 case in 20 (95% confidence level) when the analyzer is operating normally.
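The "1 case in 20" criterion can be made concrete. For two independent results, each with standard deviation s_R, their difference has standard deviation √2·s_R, so the 95% limit is 1.96·√2·s_R ≈ 2.8·s_R. A minimal sketch (the value of s_R is a hypothetical assumption):

```python
import math

# Hypothetical reproducibility standard deviation of the test method
s_R = 0.5

# The difference of two independent results has standard deviation sqrt(2)*s_R;
# the value exceeded only 1 time in 20 (95% confidence) is therefore:
R = 1.96 * math.sqrt(2) * s_R   # often rounded to 2.8 * s_R in practice
print(f"Reproducibility limit R = {R:.2f}")

def results_agree(result_a: float, result_b: float) -> bool:
    """Two results are consistent if their difference does not exceed R."""
    return abs(result_a - result_b) <= R

print(results_agree(70.0, 71.0))  # difference 1.0 is within R
```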
General Terms
• Variance
The expectation of the squared deviation of a random variable about its expectation: Var(X) = E[(X − E[X])²].
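A quick numeric check of the definition, using a hypothetical set of validation deviations as the sample:

```python
import statistics

# Hypothetical validation deviations (analyzer minus lab), one per check
deviations = [1.0, -1.0, 0.5, 1.5, 0.0, -0.5]

mean = statistics.mean(deviations)
# Variance as the mean squared deviation about the mean (population form)
variance = sum((x - mean) ** 2 for x in deviations) / len(deviations)

print(f"mean={mean:.3f} variance={variance:.3f}")
# matches the library implementation:
print(statistics.pvariance(deviations))
```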
Calibration / Validation
• Calibration
Adjusting the analyzer output to agree with a standard introduced into the analyzer.
• Validation
Observing and noting the difference (if any) between the analyzer reading and the reference standard/sample used, with no adjustments made to the analyzer. When differences are recorded, the random statistical error can be estimated and accurate adjustments made. This avoids the practice of too-frequent adjustments in an attempt to tune out normal system fluctuations.
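The validation idea can be sketched as follows (hypothetical log values; the 2-sigma decision rule is an assumed illustration, not a prescribed standard): accumulate validation differences and adjust only when the mean difference stands out from the normal scatter, rather than re-tuning after every check.

```python
import statistics

# Hypothetical validation log: analyzer reading minus reference value
validation_diffs = [0.4, -0.2, 0.5, 0.3, 0.6, 0.2, 0.4, 0.5]

mean_diff = statistics.mean(validation_diffs)   # estimated bias
scatter = statistics.stdev(validation_diffs)    # normal system fluctuation
n = len(validation_diffs)

# Adjust only if the bias clearly stands out from the scatter
# (roughly a 2-sigma test on the mean; an assumed decision rule)
if abs(mean_diff) > 2 * scatter / n ** 0.5:
    print(f"Apply a correction of {-mean_diff:+.2f} to the analyzer")
else:
    print("Within normal fluctuation: make no adjustment")
```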
Calibration / Validation methods
• Reference sample method
Samples which have been accurately prepared, tested and certified (as per well-laid-down standards) are used for calibration.
Validation of analyzers

Validation planning for analyzers
• Periodic validation of the analyser running on-line with process sample. The validation frequency shall be in accordance with the validation frequency flow-chart.
Frequency of validation
• The frequency of validation will depend upon many factors, such as:
– Application
– Type of analyzer
– The precision required
– Whether checking will be at regular intervals or related to parcel transfers
Validation frequency flow-chart (textual form)
• Start / commissioning: validation frequency 2 per week (or 1 per week).
• After 1 month (or sooner), check: performance OK?
– No: troubleshoot, take corrective actions, record the action points. If major corrective actions are required, restart at the commissioning frequency; otherwise repeat the check.
– Yes: reduce validation frequency to 2 per month (or 1 per month).
• After 6 months, check: performance OK?
– No: troubleshoot, take corrective actions, record the action points. If major corrective actions are required, return to the previous frequency; otherwise repeat the check.
– Yes: reduce validation frequency to 1 per month, then 1 per 2 months (branches a and b as required).
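The flow-chart logic can be sketched as a small state machine (a hypothetical encoding; the stage names "commissioning", "stable" and "mature" are invented labels for the three frequency levels on the chart):

```python
# Hypothetical encoding of the validation-frequency flow-chart.
FREQUENCY = {
    "commissioning": "2 per week / 1 per week",
    "stable": "2 per month / 1 per month",
    "mature": "1 per month / 1 per 2 months",
}

def next_stage(stage: str, performance_ok: bool, major_actions_required: bool) -> str:
    """Advance through the flow-chart after each periodic review."""
    if performance_ok:
        # Step down the frequency: commissioning -> stable -> mature
        return {"commissioning": "stable", "stable": "mature"}.get(stage, stage)
    # Otherwise: troubleshoot, take corrective actions, record the action points.
    if major_actions_required:
        return "commissioning"   # restart at the commissioning frequency
    return stage                 # re-check at the same frequency

stage = next_stage("commissioning", performance_ok=True, major_actions_required=False)
print(stage, "->", FREQUENCY[stage])  # stable -> 2 per month / 1 per month
```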
Performance monitoring: on-line analyzer (QMI) vs. laboratory
• Sampling: the analyzer uses an application-built sample conditioning system; the lab uses a different sample preparation method.
• Sampling point: the analyzer draws from a fixed point on the sample conditioning system; the lab sample collection point may be different.
• Sample transportation and handling: fixed for the analyzer; varies for the lab.
• Representativeness: the sample measured by QMI is representative of the actual process sample; manual sampling has inherent sampling errors such as
a) inadequate sample container purge,
b) evaporation of lighter components,
c) removal of separable water from the sample prior to analysis.
• Analyser type: differences in analyser hardware and detection element; the lab hardware is not identical.
• Ranges: not identical ranges.
• Reading: the analyzer reads continuously; the lab takes discrete samples, n times per day.
• Output: the analyzer reads directly; in the lab, normalisation/averaging may be carried out.
• Standards: the analyzer correlates to ASTM standards; the lab follows the ASTM standard.
• Operator bias: nil for the analyzer; the lab result can vary from technician to technician.
• Apparatus bias: the analyzer is a dedicated instrument for a dedicated stream; in the lab, different samples from the same stream may be analysed for a predetermined parameter on different (identical-model) instruments.
Why Control Charts ?
• Every process varies. If you write your name ten times, your signatures
will all be similar, but no two signatures will be exactly alike. There is
an inherent variation, but it varies between predictable limits. If, as you
are signing your name, someone bumps your elbow, you get an
unusual variation due to what is called a "special cause". If you are
cutting diamonds, and someone bumps your elbow, the special cause
can be expensive. For many, many processes, it is important to notice
special causes of variation as soon as they occur.
What Are Control Charts?
• Happily, there are easy-to-use charts which make it easy to see both special- and common-cause variation in a process. They are called control charts, or sometimes Shewhart charts, after their inventor, Walter Shewhart of Bell Labs.
– Type I or alpha errors occur when a point falls outside the control limits
even though no special cause is operating. The result is a witch-hunt for
special causes and adjustment of things here and there. The tampering
usually distorts a stable process as well as wasting time and energy.
– Type II or beta errors occur when you miss a special cause because the
chart isn't sensitive enough to detect it. In this case, you will go along
unaware that the problem exists and thus unable to root it out.
• All process control is vulnerable to these two types of errors. The reason
that 3-sigma control limits balance the risk of error is that, for normally
distributed data, data points will fall inside 3-sigma limits 99.7% of the
time when a process is in control. This makes the witch hunts infrequent
but still makes it likely that unusual causes of variation will be detected.
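The 99.7% figure can be checked numerically (a hypothetical simulation; the process mean and sigma are invented):

```python
import random
import statistics

random.seed(42)
# Simulate an in-control process: normally distributed readings
readings = [random.gauss(mu=100.0, sigma=2.0) for _ in range(100_000)]

mean = statistics.mean(readings)
sigma = statistics.stdev(readings)
ucl, lcl = mean + 3 * sigma, mean - 3 * sigma   # 3-sigma control limits

inside = sum(lcl <= x <= ucl for x in readings) / len(readings)
print(f"fraction inside 3-sigma limits: {inside:.4f}")  # ~0.997
```

Only about 3 points in 1000 fall outside the limits by chance, which is what keeps the "witch hunts" infrequent.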
Setting up a Control chart
• Before starting to use the control chart, ensure that the analyser is running.
Setting up a Control chart
• Calculate the average of the results, plus the standard deviation of the values obtained during the validation period.
b) Four successive points fall between the aim line and one warning limit: suspect a systematic error; shorten the test interval.
c) Six successive points fall between the aim line and one warning limit: a systematic error is present; adjustment is desirable.
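The limit calculation and rules b) and c) can be sketched as follows (hypothetical deviation data; warning limits at ±2 sigma about the aim line are an assumed convention):

```python
import statistics

# Hypothetical deviations (analyzer minus lab) from the validation period
history = [1.0, -1.0, 0.5, 1.0, -0.5, 0.0, 1.0, -1.0, 0.5, -0.5]

aim = statistics.mean(history)          # aim (reference) line
sigma = statistics.stdev(history)
warning = 2 * sigma                     # warning limits at aim +/- 2 sigma (assumed)

def judge(points):
    """Apply rules b) and c): runs of points between the aim line and one warning limit."""
    longest, run, side = 0, 0, 0
    for p in points:
        s = 1 if p > aim else -1 if p < aim else 0
        in_warning_band = s != 0 and abs(p - aim) <= warning
        if in_warning_band and s == side:
            run += 1
        else:
            run = 1 if in_warning_band else 0
            side = s
        longest = max(longest, run)
    if longest >= 6:
        return "systematic error present; adjustment desirable"   # rule c)
    if longest >= 4:
        return "suspect systematic error; shorten test interval"  # rule b)
    return "no run detected"

print(judge([0.5, 0.6, 0.4, 0.7, 0.5]))  # five points on one side -> rule b)
```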
[Control chart: deviation plotted against successive readings about the reference value line, with upper/lower warning limits and upper/lower control limits marked; zones a–e indicate the rule regions.]
Sr no Date Time Lab Analyser Deviation Control Chart
10
1 01/01/2002 8:00 276 277 1
8
2 01/07/2002 8:00 275 274 -1
6
Reading
Sr no Date Time Lab Analyser Deviation Control Chart
10
1 01/01/2002 8:00 276 277 1
8
2 01/07/2002 8:00 275 277 2
6
0
5 28/1/02 8:00 274 275 1 01/01/2002 01/07/2002 14/1/02 21/1/02 28/1/02 02/05/2002 02/12/2002 19/2/02 27/2/02 03/07/2002
-2
6 02/05/2002 8:00 272 273 1
-4
7 02/12/2002 8:00 275 276 1
-6
Case Study: Flashpoint Analyzer (before bias input)

Date & Time     IP21 Value  LAB  Delta
10/1/03  8:12   57.81       68   10.19
10/1/03 15:55   62.28       71    8.72
10/2/03  0:00   59.07       69    9.93
10/2/03  7:54   58.20       70   11.80
10/2/03 15:58   59.36       71   11.64
10/2/03 23:43   57.71       68   10.29
10/3/03  7:58   61.40       70    8.60
10/3/03 16:01   63.35       71    7.65
10/3/03 23:43   56.45       71   14.55
10/4/03  7:53   60.24       71   10.76
10/4/03 16:04   58.19       71   12.81
10/5/03  7:44   59.27       71   11.73
10/5/03 15:35   59.36       68    8.64
10/6/03  0:04   58.20       70   11.80
10/10/03 7:48   58.68       69   10.32
12/10/03 08:36  59.44       72   12.56
10/13/03 15:48  60.33       69    8.67

[RTF FP trend: on-line FP and lab FP plotted over the period; y-axis 65.00 to 90.00.]
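In this case study the lab consistently reads higher than the on-line value, which is why a bias input is warranted. A minimal sketch (the Delta values are from the table; the correction function is an assumed illustration of how the bias input would be applied):

```python
import statistics

# Delta column (LAB minus IP21 on-line value) from the case-study table
deltas = [10.19, 8.72, 9.93, 11.80, 11.64, 10.29, 8.60, 7.65, 14.55,
          10.76, 12.81, 11.73, 8.64, 11.80, 10.32, 12.56, 8.67]

bias = statistics.mean(deltas)
spread = statistics.stdev(deltas)
print(f"mean bias = {bias:.2f}, stdev = {spread:.2f}")

# Assumed correction: add the mean bias to the on-line reading
def corrected(online_value: float) -> float:
    return online_value + bias
```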
Thank You