How to conduct a plant performance test
Performance testing after initial start-up has value well beyond the short-term goal of validating equipment guarantees-it's your only opportunity to establish the baseline performance of the overall plant and many of its major systems. Corporate bean counters may be interested in short-term results, but a good plant engineer understands that a thorough performance test will be useful for many years. Here's your guide to each facet of a performance test-plus pitfalls to avoid.

By Tina L. Toburen, PE, and Larry Jones, McHale & Associates Inc.
Completing a power plant's start-up and commissioning usually means pushing the prime contractor to wrap up the remaining punch list items and getting the new operators trained. Staffers are tired of the long hours they've put in and are looking forward to settling into a work routine.

Just when the job site is beginning to look like an operating plant, a group of engineers arrives with laptops in hand, commandeers the only spare desk in the control room, and begins to unpack boxes of precision instruments. In a fit of controlled confusion, the engineers install the instruments, find primary flow elements, and make the required connections. Wires are dragged back to the control room and terminated at a row of neatly arranged laptops.

1. Trading spaces. This is a typical setup of data acquisition computers used during a plant performance test. Courtesy: McHale & Associates

When the test begins, the test engineers stare at their monitors as if they were watching the Super Bowl and trade comments in some sort of techno-geek language. The plant performance test has begun (Figure 1).

Anatomy of a test
The type and extent of plant performance testing activities are typically driven by the project specifications or the turnkey contract. They also usually are linked to a key progress payment milestone, although the value of the tests goes well beyond legalese. The typical test is designed to verify power and heat rate guarantees that are pegged to an agreed-upon set of operating conditions. Sounds simple, right? But the behind-the-scenes work to prepare for a test on which perhaps millions of dollars are at stake beyond the contract guarantees almost certainly exceeds your expectations (see box).
Long before arriving on site, the test team will have:

- Gathered site information.
- Reviewed the plant design for the adequacy and proper placement of test taps and for the type and location of primary flow elements.
- Developed plant mathematical models and test procedures.
- Met with the plant owner, contractor, and representatives of major original equipment manufacturers (OEMs) to iron out the myriad details not covered by contract specifications. Experienced owners will have made sure that the plant operations staff is included in these meetings.
Tests are normally conducted at full-load operation for a predetermined period of time. The test team collects the necessary data and runs them through the facility correction model to obtain preliminary results. Usually within a day, a preliminary test report or letter is generated to allow the owner to declare "substantial completion" and commence commercial operation. The results for fuel sample analysis (and/or ash samples) are usually available within a couple of weeks, allowing the final customer report to be finished and submitted.

The art and science of performance testing require very specialized expertise and experience that take years to develop.

Performance test economics are overpowering

Consider a 500-MW facility with a heat rate of 7,000 Btu/kWh. When operating at baseload with an 80% capacity factor, the plant will consume over 24 million mmBtu per year. At a fuel cost of $8/mmBtu, that's nearly $200 million in fuel costs for the year.

If an instrumentation or control error raises the heat rate of the facility by 0.5%, that would cost the plant an additional $1 million each year. If, on the other hand, a misreported heat rate causes the facility to be dispatched 0.5% less often, reducing the capacity factor to 79.5%, the losses in revenue at $50/MWh would amount to nearly $1.1 million for the year.

Performance tests can bring the right people together at the facility to identify losses in performance and to recapture or prevent such losses in facility profits.
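The sidebar's arithmetic is easy to reproduce. Here is a minimal sketch in Python, using only the sidebar's assumed plant size, heat rate, and prices:

```python
# Back-of-the-envelope test economics; all inputs are the sidebar's
# illustrative assumptions, not data from any particular plant.
capacity_mw = 500.0
heat_rate_btu_per_kwh = 7_000.0
capacity_factor = 0.80
fuel_cost_per_mmbtu = 8.0
energy_price_per_mwh = 50.0
hours_per_year = 8_760.0

# Annual generation (MWh) and fuel burn (mmBtu).
generation_mwh = capacity_mw * hours_per_year * capacity_factor
fuel_mmbtu = generation_mwh * 1_000 * heat_rate_btu_per_kwh / 1e6
fuel_cost = fuel_mmbtu * fuel_cost_per_mmbtu
print(f"Fuel burn: {fuel_mmbtu / 1e6:.1f} million mmBtu/yr, ~${fuel_cost / 1e6:.0f}M")

# A 0.5% heat-rate error raises annual fuel cost proportionally.
print(f"0.5% heat-rate error: ~${0.005 * fuel_cost / 1e6:.2f}M/yr in extra fuel")

# Dispatch falling from an 80% to a 79.5% capacity factor loses revenue.
lost_mwh = capacity_mw * hours_per_year * 0.005
print(f"0.5% less dispatch: ~${lost_mwh * energy_price_per_mwh / 1e6:.2f}M/yr in lost revenue")
```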

The science of crunching data is defined by industry standards, but the art rests in the ability to spot data inconsistencies, subtle instrument errors, skewed control systems, and operational miscues. The experienced tester can also quickly determine how the plant must be configured for the tests and can answer questions such as: Will the steam turbine be in pressure control or at valves wide open in sliding-pressure mode? What control loops will need to be in manual or automatic during testing? And at what level should the boiler or duct burners be fired?

For the novice, it's easy to miss a 0.3% error in one area and an offsetting 0.4% error in another area that together yield a poor result if they aren't resolved and accounted for. With millions of dollars on the line, the results have to be rock solid.

Mid-term exams
There are many reasons to evaluate the performance of a plant beyond meeting contract guarantees. For example, a performance test might be conducted on an old plant to verify its output and heat rate prior to an acquisition to conclusively determine its asset value.


Other performance tests might verify capacity and heat rate for the purpose of maintaining a power purchase agreement, bidding a plant properly into a wholesale market, or confirming the performance changes produced by major maintenance or component upgrades.

Performance tests are also an integral part of a quality performance monitoring program. If conducted consistently, periodic performance tests can quantify nonrecoverable degradation and gauge the success of a facility's maintenance programs. Performance tests also can be run on individual plant components to inform maintenance planning. If a component is performing better than expected, the interval between maintenance activities can be extended. If the opposite is the case, additional inspection or repair items may be added to the next outage checklist.
Whatever the reason for a test, its conduct should be defined by industry-standard specifications such as the Performance Test Codes (PTCs) published by the American Society of Mechanical Engineers (ASME), whose web site - www.asme.org - has a complete list of available codes. Following the PTCs allows you to confidently compare today's and tomorrow's results for the same plant or equipment. Here, repeatability is the name of the game.

The PTCs don't anticipate how to test every plant configuration but, rather, set general guidelines. As a result, some interpretation of the codes' intent is always necessary. In fact, the PTCs anticipate variations in test conditions and reporting requirements in a code-compliant test. The test leader must thoroughly understand the codes and the implications of how they are applied to the plant in question. Variances must be documented, and any test anomalies must either be identified and corrected before starting the test or be accounted for in the final test report.
A performance test involves much more than just taking data and writing a report. More time is spent in planning and in post-test evaluations of the data than on the actual test. Following is a brief synopsis describing the process of developing and implementing a typical performance test. Obviously, the details of a particular plant and the requirements of its owner should be taken into account when developing a specific test agenda.

Planning for the test

The ASME PTCs are often referenced in equipment purchase and/or engineering, procurement, and construction (EPC) contracts to provide a standard means of determining compliance with performance guarantees. The ASME codes are developed by balanced committees of users, manufacturers, independent testing agencies, and other parties interested in following best engineering practices. They include instructions for designing and executing performance tests at both the overall plant level and the component level.

Planning a performance test begins with defining its objective(s): the validation of contractual guarantees for a new plant and/or the acquisition of baseline data for a new or old plant. As mentioned, part of planning is making sure that the plant is designed so it can be tested. Design requirements include defining the physical boundaries for the test, making sure that test ports and permanent instrumentation locations are available and accessible, and ensuring that flow metering meets PTC requirements (if applicable).

After the design of the plant is fixed, the objectives of testing must be defined and documented along with a plan for conducting the test and analyzing its results. A well-written plan will include provisions for both expected and unexpected test conditions.

Understanding guarantees and corrections

The most common performance guarantees are the power output and heat rate that the OEM or contractor agrees to deliver. Determining whether contractual obligations have been met can be tricky. For example, a plant may be guaranteed to have a capacity of 460 MW at a heat rate of 6,900 Btu/kWh - but only under a fixed set of ambient operating conditions (reference conditions). Typical reference conditions may be a humid summer day with a barometric pressure of 14.64 psia, an ambient temperature of 78F, and relative humidity of 80%.

The intent of testing is to confirm whether the plant performs as advertised under those specific conditions. But how do you verify that a plant has met its guarantees when the test must be done on a dry winter day, with a temperature of 50F and 20% relative humidity? The challenging part of performance testing is correcting the results for differences in atmospheric conditions. OEMs and contractors typically provide ambient correction factors as a set of correction curves or formulas for their individual components. But it is often up to the performance test engineers to integrate the component information into the overall performance correction curves for the facility.

The reference conditions for performance guarantees are unique to every site. A simple-cycle gas turbine's ratings assume its operation under International Standardization Organization (ISO) conditions: 14.696 psia, 59F, and relative humidity of 60%. The condition of the inlet air has the biggest impact on gas turbine-based plants because the mass flow of air through the turbines (and consequently the power they can produce) is a function of pressure, temperature, and humidity. Performance guarantees for steam plants also depend on air mass flow, but to a lesser extent.

The barometric pressure reference condition is normally set to the average barometric pressure of the site. If a gas turbine plant is sited at sea level, its barometric pressure reference is 14.696 psia. For the same plant at an altitude of 5,000 feet, the reference would be 12.231 psia, and its guaranteed output would be much lower.

The relative humidity reference condition may or may not have a significant bearing on plant performance. In gas turbine plants the effect is not large (unless the inlet air is conditioned), but it still must be accounted for. The effect of humidity, however, is more pronounced on cooling towers. Very humid ambient air reduces the rate at which evaporation takes place in the tower, lowering its cooling capacity. Downstream effects are an increase in steam turbine backpressure and a reduction in the turbine-generator's gross capacity.


The most important correction for gas turbine plant performance tests involves compressor inlet air temperature. Although a site's barometric pressure typically varies by no more than 10% over a year, its temperatures may range from 20F to 100F over the period. Because air temperature has a direct effect on air density, temperature variation changes a unit's available power output. For a typical heavy-duty frame gas turbine, a 3-degree change in temperature can affect its capacity by 1%. A temperature swing of 30 degrees could raise or lower power output by as much as 10%. The effect can be even more pronounced in aeroderivative engines.
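Because OEMs publish these effects as correction curves, much of the correction work is curve lookup and interpolation. A minimal sketch, with a hypothetical inlet-temperature curve whose points are made up for illustration and roughly follow the 1% per 3 degrees rule of thumb:

```python
# Correct measured gas turbine output to the reference inlet temperature
# using an OEM-style correction curve. Curve points are illustrative only.
correction_curve = [  # (inlet air temperature F, output correction factor)
    (40.0, 1.065),
    (59.0, 1.000),   # reference condition: factor is 1.0 by definition
    (78.0, 0.937),
    (100.0, 0.865),
]

def output_factor(temp_f):
    # Linear interpolation between the bracketing curve points.
    for (t1, f1), (t2, f2) in zip(correction_curve, correction_curve[1:]):
        if t1 <= temp_f <= t2:
            return f1 + (f2 - f1) * (temp_f - t1) / (t2 - t1)
    raise ValueError("inlet temperature outside the curve range")

measured_mw = 447.2                                  # output on a 78F test day
corrected_mw = measured_mw / output_factor(78.0)     # refer back to the 59F rating
print(f"Output corrected to reference temperature: {corrected_mw:.1f} MW")
```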
ISO-standard operating conditions or site-specific reference conditions are almost impossible to achieve during an actual test. Accordingly, plant contractors and owners often agree on a base operating condition that is more in line with normal site atmospheric conditions. For example, a gas turbine plant built in Florida might be tested at reference conditions of 14.6 psia, 78F, and 80%. Establishing a realistic set of reference conditions increases the odds that conditions during a performance test will be close to the reference conditions. Realistic reference conditions also help ensure that the guarantee is representative of expected site output.

Establishing site-specific reference conditions also reduces the magnitude of corrections to measurements. When only small corrections are needed to relate measured performance from the actual test conditions to the reference conditions, the correction methods themselves become less prone to question, raising everyone's comfort level with the quality of the performance test results.
Beyond site ambient conditions, the PTCs define numerous other correction factors that the test designer must consider; a sketch of how such factors combine into a corrected result follows the list. Most are site-specific and include:

- Generator power factor.
- Compressor inlet pressure (after losses across the filter house).
- Turbine exhaust pressure (due to the presence of a selective catalytic reduction system or heat-recovery steam generator).
- Degradation/fired hours, recoverable and unrecoverable.
- Process steam flow (export and return).
- Blowdown (normally isolated during testing).
- Cooling water temperature (if using once-through cooling, or if the cooling tower is outside the test boundary).
- Condenser pressure (if the cooling water cycle is beyond the test boundary).
- Abnormal auxiliary loads (such as heat tracing or construction loads).
- Fuel supply conditions, including temperature and/or composition.
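How the individual factors roll up into a corrected result depends on the test procedure, but one common arrangement (in the spirit of PTC-46) applies additive deltas for effects like abnormal auxiliary loads and multiplicative factors for ambient-type effects. A minimal sketch with made-up values:

```python
# One common form of an overall plant correction: additive deltas plus
# multiplicative factors. All numbers are illustrative, not from any
# contract or code.
measured_power_mw = 455.0
additive_deltas_mw = [1.2]   # e.g., abnormal auxiliary (construction) load
multiplicative_factors = [   # e.g., power factor, inlet loss, exhaust loss
    0.993,
    1.004,
    0.998,
]

corrected_mw = measured_power_mw + sum(additive_deltas_mw)
for factor in multiplicative_factors:
    corrected_mw /= factor   # remove each off-reference effect in turn

print(f"Corrected power: {corrected_mw:.1f} MW")  # compare against the guarantee
```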

Choose the right instrumentation

Instrumentation used to record test measurements should be selected based on a pre-test uncertainty analysis (see "Understanding test uncertainty"). This analysis is important to fine-tune the instrumentation to ensure that the quality of the test meets expectations. The test instruments themselves are usually a combination of temporary units installed specifically for testing, permanently installed plant instrumentation, and utility instrumentation (billing or revenue metering). Temporary instruments are typically installed to make key measurements that have a significant impact on results and where higher accuracy is needed to reduce the uncertainty of test results. Among the advantages of using a piece of temporary instrumentation is that it has been calibrated specifically for the performance test in question following National Institute of Standards and Technology (NIST) procedures.


Another benefit of installing temporary instrumentation is to verify the readings of permanent plant instruments. Plant instrumentation typically lacks NIST-traceable calibration or has been calibrated by technicians who are more concerned with operability than with accurate performance testing. There's a good reason for the former: Performing a code-level calibration on plant instrumentation can be more expensive than installing temporary test instrumentation. An additional benefit of a complete temporary test instrumentation setup is that the instrumentation, signal conditioning equipment, and data acquisition system are often calibrated as a complete loop, as is recommended in PTC-46 (Overall Plant Performance).

All performance instruments should be installed correctly, and any digital readings should be routed to a central location. Choosing a good performance data center is very important. A performance command center should be out of the way of site operations yet close enough to observe plant instrumentation input and operation.

Obviously, performance instrument readings should be checked against those of plant instruments, where available. This is one of the most important checks that can be made prior to a performance test. When a performance tester can get the same result from two different instruments that were installed to independent test ports and calibrated separately, there's a good chance the measurement is accurate.
Understanding test uncertainty

Uncertainty is a measure of the quality of the test or calculation result. A pre-test uncertainty analysis can be used to design a test to meet predefined uncertainty limits. A post-test uncertainty analysis should be performed to verify that those uncertainty limits were met and to determine the impact of any random scatter recorded in the test data.

Each input to the calculation must be analyzed for its impact on the final result. This impact is identified as the sensitivity of the result to that input. For example, if inlet air temperature changes by 3 degrees F, and the corrected output changes by 1%, the sensitivity is 1% per 3 degrees F, or 0.33%/degree F.

The instrumentation information is used to identify the systematic error potential for each input. For example, a precision 4-wire resistance-temperature detector can measure inlet air temperature with an accuracy of +/-0.18F, based on information provided by the manufacturer and as confirmed during periodic calibrations.

During a test run, multiple recordings are made for any given parameter, and there will be scatter in the data. The amount of scatter in the data is an indication of the random error potential for each input. For example, during a 1-hour test run, the inlet air temperature may be recorded as an average of 75F, with a standard deviation in the measurements of 0.6F.

If more than one sensor is used to measure a parameter, there also will be variances between sensors based on location. These variances may be due to the variances either in the instrumentation or in the actual parameter measured. For example, if air temperature is being measured by an array of sensors, there may be effects due to ground warming or exhaust vents in the area, either of which would affect the uncertainty of the bulk average measurement. These variances will affect the average and standard deviation values for that parameter. Spatial variances are added into the systematic error potential, based on the deviation of each location from the average value for all locations.

Now that we've defined the three separate inputs to the uncertainty determination - sensitivity (A), systematic error potential/uncertainty (B), and random error potential/uncertainty (C) - it's time to put on our statistician's hats. The terms can be combined in the following equation:

Uncertainty = SQRT[(A x B)² + (t x A x C)²]


The "1" value on the right side of the equation is known as


the St~dent-t factor and is based on the number of degrees of
freedom (or number of data points recorded) in the data set. For
a 95"10 confidence interval and data taken at I-minute intervals
for a 6D-minute test run, the value of "1" is 2.0. If data are taken
less frequently (such as at 2-minute intervals), fewer recordin9s
are made and therefore either the test run must be longer (which
is not recommended, because ambient conditions may change)
or the value of "t" will increase.
The example given above is for a single parameter, such as inlet air temperature, and its effect on corrected output. For each correction made, the same process must be carried out to determine the sensitivity, systematic uncertainty, and random uncertainty of the corrected result on that correction parameter (such as barometric pressure and relative humidity).

Once each individual uncertainty has been identified, they can be combined to determine the overall uncertainty of the corrected result. Combining the individual uncertainties is a three-step process:

- Determine the total systematic uncertainty as the square root of the sum of the squares for all the individual systematic uncertainties.
- Determine the total random uncertainty as the square root of the sum of the squares for all the individual random uncertainties.
- Combine the total systematic uncertainty and total random uncertainty as follows: Total uncertainty = SQRT[(systematic_total)² + (t x random_total)²].

The result of the analysis is an expression stated in terms of the uncertainty calculated for an individual instrument or the overall system. We might normally say, "The inlet air temperature is 75F," but when including an uncertainty analysis of a temperature measurement system, a more accurate statement would be, "We are 95% certain that the inlet air temperature is between 74.6F and 75.4F."

Once again, the value for "t" will depend on the design of the test, including the number of multiple sensors and the frequency of data recordings. Additional information on the Student-t factor as well as a discussion of how to determine uncertainty can be found in ASME PTC 19.1 (Test Uncertainty).
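The sidebar's single-parameter example can be worked end to end in a few lines. In this sketch the random term C is taken as the standard deviation of the mean (s divided by the square root of n), which is one common reading of the random-uncertainty term; check PTC 19.1 for the treatment your procedure requires:

```python
import math

# Single-parameter uncertainty, following the sidebar's equation
# U = SQRT[(A x B)^2 + (t x A x C)^2], with the sidebar's example values.
A = 0.33                 # sensitivity, % corrected output per degree F
B = 0.18                 # systematic uncertainty of the RTD, degrees F
s, n = 0.6, 60           # scatter of the 60 one-minute readings (F), count
t = 2.0                  # Student-t, ~95% confidence, 59 degrees of freedom

C = s / math.sqrt(n)     # assumption: random term is the std dev of the mean
U = math.sqrt((A * B) ** 2 + (t * A * C) ** 2)
print(f"Inlet-temperature contribution to output uncertainty: +/-{U:.3f}%")
```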


If there's a difference between the readings that is close to or exceeds instrument error, something is likely to be amiss.

Typically, when plant guarantees are tied to corrected output and heat rate, the two most important instrument readings are measured power and fuel flow. If either is wrong, the test results will be wrong. For example, say you're testing a unit whose expected output is 460 MW. The plant instrument is accurate to within 1%, and the test instrument is even more accurate: +/-0.3%. In this case, the tester prefers to see the two readings well within 1% of each other (4.6 MW), but they still may be as far apart as 5.98 MW (1.3%) and technically be within the instruments' uncertainty.
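That cross-check is easy to automate. A minimal sketch (the helper name and tolerance handling are illustrative, not a code requirement) that flags a plant-versus-test meter disagreement approaching the sum of the two stated accuracies:

```python
# Flag disagreement between plant and test instruments. The worst-case
# allowable spread is the sum of the two stated accuracies, as in the
# article's 460-MW example (1% + 0.3% = 1.3%, or 5.98 MW).
def check_agreement(plant_mw, test_mw, plant_acc=0.010, test_acc=0.003):
    limit = (plant_acc + test_acc) * test_mw
    diff = abs(plant_mw - test_mw)
    status = "OK" if diff < limit else "INVESTIGATE"
    print(f"diff = {diff:.2f} MW, limit = {limit:.2f} MW -> {status}")

check_agreement(462.0, 460.0)   # 2.0 MW apart: well inside 5.98 MW
check_agreement(466.5, 460.0)   # 6.5 MW apart: beyond instrument error
```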
When setting up for a performance test, it is not uncommon to find errors in permanent plant instrumentation, control logic, or equipment installation. These errors can influence the operation of a generating unit, for example by causing over- or underfiring of a boiler or gas turbine and significantly impacting the unit's output and heat rate. In cases where the impact on actual operation continues undetected, the corrected test report values may still be in error due to corrections made based on faulty instrument readings. If these reported values are used as the basis of facility dispatch, a small error could have an enormous impact on the plant's bottom line, ranging from erroneous fuel nominations to the inability to meet a capacity commitment.

Conduct the test

The performance test should always be conducted in accordance with its approved procedure. Any deviations should be discussed and documented to make sure their impact is understood by all parties. If the test is conducted periodically, it is important to know what deviations were allowed in previous tests to understand if any changes in performance might have been due to equipment changes or simply to the setup of the test itself.

Calibrated temporary instrumentation should be installed in the predetermined locations, and calibration records for any plant or utility instrumentation should be reviewed. Check any data collection systems for proper resolution and frequency, and do preliminary test runs to verify that all systems are operating properly.

The performance test should be preceded by a walk-down of the plant to verify that all systems are configured and operating correctly. It's important to verify that plant operations are in compliance with the test procedure because equipment disposition, operating limits, and load stability affect the results.

"

Data can then be collected for the time periods defined in the test procedure and checked for compliance with all test stability criteria. Once data have been collected and the test has been deemed complete, the results can be shared with all interested parties.

Because the short preliminary test may be the most important part of the process, be sure to allow sufficient time for it in the test plan. The preliminary test must be done during steady-state conditions following load stabilization, or when the unit is operating at steady state during the emissions testing program. The preliminary test has three objectives: to verify all data systems, to make sure manual data takers are reading the correct instruments and meters, and to have the data pass a "sanity check."
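A data-system sanity check of this kind often includes a simple stability screen on the recorded readings. A minimal sketch, with an illustrative tolerance (actual stability criteria come from the test procedure):

```python
# Screen one-minute load readings for stability before accepting a run.
# The 1% tolerance is illustrative; real criteria come from the procedure.
def stable(readings, tol_pct=0.01):
    avg = sum(readings) / len(readings)
    return all(abs(r - avg) <= tol_pct * avg for r in readings)

print(stable([459.7, 460.2, 460.0, 459.9, 460.4]))  # True: steady load
print(stable([452.0, 460.2, 465.8, 459.9, 447.3]))  # False: unit still moving
```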
After the test data have been collected, the readings should be entered into the correction model as soon as possible and checked for test stability criteria (as defined by the test procedure). At this point, depending on the correction methods, the test director may be able to make a preliminary analysis of the results. If the numbers are way out of whack with expected values, a good director will start looking for explanations, possibly errors in the recorded data or something in the operational setup of the unit itself. Though everyone is concerned when a unit underperforms, a unit that performs unexpectedly well may have problems that have been overlooked. For example, a unit whose corrected test results indicate a 5% capacity margin may need to have its metering checked and rectified, or it may have been mistuned to leave it in an overfired condition.
Although an overtuned gas turbine may produce more megawatt-hours during initial operations, the gain comes with a price: increasing degradation of the unit's hot section, shortening parts life and increasing maintenance costs. The most common mistake in testing is acceptance of results that are too good. If results are bad, everyone looks for the problem. If the results are above par, everyone is happy, especially the plant owner, who seems to have gotten a "super" machine. However, there's a reason for every excursion beyond expected performance limits, for better or worse.

If all the pretest checks are done properly, the actual performance test should be uneventful and downright boring. It should be as simple as verifying that test parameters (load, stability, etc.) are being met. This is where the really good performance testers make their work look easy. They appear to have nothing to do during the test, and that's true because they planned it that way. Having done all the "real" work beforehand, they can now focus on making sure that nothing changes during the test that may affect the stability of the data.

Analyze the results

Almost immediately after the performance test (and sometimes even before it is complete), someone is sure to ask, "Do you have the results yet?" Everyone wants to know if the unit passed. As soon as practical, the performance group should produce a preliminary report describing the test and detailing the results. Data should be reduced to test run averages and scrutinized for any spurious outliers. Redundant instrumentation should be compared, and instrumentation should be verified or calibrated after the test in accordance with the requirements of the procedure and applicable test codes.

The test runs should be analyzed following the methods outlined in the test procedure. Results from multiple test runs can be compared with one another for the sake of repeatability. PTC 46 (Overall Plant Performance) outlines criteria for overlap of corrected test results. For example, if there are three test runs, a quality test should demonstrate that the overlap is well within the uncertainty limits of the test.
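One simple way to express that criterion: treat each run's corrected result as a band of plus-or-minus the test uncertainty and require all bands to share a common interval. A minimal sketch (the helper is hypothetical; PTC 46 states the actual criteria):

```python
# Repeatability screen: do the uncertainty bands of all corrected test
# runs overlap? Helper is illustrative; see PTC 46 for the real criteria.
def runs_overlap(results_mw, uncertainty_frac):
    bands = [(r * (1 - uncertainty_frac), r * (1 + uncertainty_frac))
             for r in results_mw]
    highest_low = max(low for low, _ in bands)
    lowest_high = min(high for _, high in bands)
    return highest_low <= lowest_high   # a common interval exists

print(runs_overlap([459.8, 460.5, 461.1], 0.005))  # True: runs agree
```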
Once test analysts are satisfied that the results were proper, the test report can be written to communicate them. This report should describe any exceptions to the test procedure that may have been required due to the conditions of the facility during the test. In the event that the results of the performance test are not as expected, the report may also suggest potential next steps to rectify them.

For sites where the fuel analysis is not available online or in real time, a preliminary efficiency and/or heat rate value may be reported based on a fuel sample taken days or even weeks before the test. Depending on the type and source of the fuel, this preliminary analysis may be significantly different than that for the fuel burned during the test. It's important to understand that preliminary heat rate and efficiency results are often subject to significant changes. Once the fuel analyses are available for the fuel samples taken during the test, a final report can be prepared and presented to all interested parties.
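The swing between preliminary and final numbers is usually just the heating value moving. A minimal sketch with illustrative fuel data (none of these values come from the article):

```python
# Heat rate recomputed when the as-tested fuel analysis arrives.
# Flow, heating values, and power are illustrative assumptions.
fuel_flow_lb_per_hr = 148_000.0
net_power_kw = 460_000.0
lhv_btu_per_lb = {
    "preliminary (pre-test sample)": 21_500.0,
    "final (as-tested samples)": 21_150.0,
}

for label, lhv in lhv_btu_per_lb.items():
    heat_rate = fuel_flow_lb_per_hr * lhv / net_power_kw  # Btu/kWh
    print(f"{label}: {heat_rate:,.0f} Btu/kWh")
```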
- Tina L. Toburen, PE, is manager of performance monitoring and Larry Jones is a testing consultant for McHale & Associates. Toburen can be reached at 425-557-8758 or tina.toburen@mchale.org; Jones can be reached at 855-588-2554 or larry.jones@mchale.org.
