
Thermo Fisher Scientific Niton

Analyzers

FXL
Version 8.0

Users Guide
Revision A

March 2011

© 2010 Thermo Fisher Scientific Inc. All rights reserved.

Thermo Fisher Scientific Inc. provides this document to its customers with a product purchase to use in the
product operation. This document is copyright protected and any reproduction of the whole or any part of this
document is strictly prohibited, except with the written authorization of Thermo Fisher Scientific Inc.
The contents of this document are subject to change without notice. All technical information in this
document is for reference purposes only. System configurations and specifications in this document supersede
all previous information received by the purchaser.
Thermo Fisher Scientific Inc. makes no representations that this document is complete, accurate or error-free and assumes no responsibility and will not be liable for any errors, omissions, damage or loss that might result from any use of this document, even if the information in the document is followed properly.
This document is not part of any sales contract between Thermo Fisher Scientific Inc. and a purchaser. This
document shall in no way govern or modify any Terms and Conditions of Sale, which Terms and Conditions of
Sale shall govern all conflicting information between the two documents.

Release history:

For Research Use Only. Not for use in diagnostic procedures.

Table of Contents

Intended Use Statement ................................................................................................... 1


Radiation and General Safety ........................................................................................... 1
Warning Symbols ................................................................................................................ 1
Radiation and General Safety Overview .............................................................................. 1
Radiation Protection Basics ................................................................................................. 2
Exposure to Radiation ......................................................................................................... 2
Other Sources of Radiation Dose ........................................................................................ 3
Monitoring Your Radiation Dose ........................................................................................ 4
Pregnancy and Radiation Exposure ..................................................................................... 4
How to Use your Thermo Scientific Niton FXL Analyzer Safely ......................................... 4
Radiation Dose Rate Specifications ..................................................................................... 5
Storage and Transportation ................................................................................................. 5
Emergency Response Information ....................................................................................... 6
Registration and Licensing .................................................................................................. 6
Hardware Tour ................................................................................................................ 7
Front View .......................................................................................................................... 7
Front View - Open .............................................................................................................. 8
Left View ............................................................................................................................. 9
Rear View ............................................................................................................................ 10
Right View .......................................................................................................................... 11
Control Panel ...................................................................................................................... 12
Touch Screen Display ......................................................................................................... 12
X-Y Jog Button ................................................................................................................... 13
Start Button ........................................................................................................................ 13
Stop Button ......................................................................................................................... 14
Cover .................................................................................................................................. 14
Warning Lights ................................................................................................................... 15
Ports .................................................................................................................................... 15
USB Ports ........................................................................................................................... 16
Download Port .................................................................................................................... 16
Power Port .......................................................................................................................... 17
Helium Port ........................................................................................................................ 17
On-Off Switch .................................................................................................................... 18
Power Supply ...................................................................................................................... 18
Ground Port ........................................................................................................................ 18
Battery ................................................................................................................................. 19
Startup Operations ........................................................................................................... 21
Starting Up your Analyzer ................................................................................................... 21
Performing a System Check ................................................................................................. 22
Batteries .............................................................................................................................. 23
Basic Operation ............................................................................................................... 27
The Top Menu ................................................................................................................... 27
Taking a Sample Analysis .................................................................................................... 27
Analysis Modes .................................................................................................................... 34
Using General Metals Mode ................................................................................................ 35
The Menu System ............................................................................................................ 39
The Test Menu ................................................................................................................... 39
The Data Menu .................................................................................................................. 40

The Method Setup Menu .................................................................................................... 41


The System Menu ............................................................................................................... 48
Date and Time .................................................................................................................... 49
Bluetooth ............................................................................................................................ 49
Start/Stop Settings ............................................................................................................... 50
Calibrate Touch .................................................................................................................. 50
Specs ................................................................................................................................... 51
System Check ...................................................................................................................... 52
Log Off ............................................................................................................................... 52
Common Operations ....................................................................................................... 53
Metal Sample Prep .............................................................................................................. 53
Soil Sample Prep ................................................................................................................. 63
Preparing Mining Samples .................................................................................................. 65
Setting Up Beep Times ....................................................................................................... 66
Sorting the Custom Element Display .................................................................................. 67
Max Measure Time ............................................................................................................. 67
Minimum Test Time ......................................................................................... 68
Virtual Keyboard ................................................................................................................. 69
Setting Display Units .......................................................................................................... 71
Adjusting the Element Range .............................................................................................. 72
Downloading to a USB Memory Stick or Thumb Drive ..................................................... 73
X-Y Movement .................................................................................................................... 75
Setting the Date and Time .................................................................................................. 76
Calibrating the Touch Screen .............................................................................................. 82
Advanced Topics .............................................................................................................. 85
TestAll Geo ......................................................................................................................... 85
Start/Stop Settings ............................................................................................................... 85
Adjusting the Calibration .................................................................................................... 86
Calculating Calibration Factors ........................................................................................... 88
Pseudo-Elements ................................................................................................................. 94
Helium Purged Analysis ...................................................................................................... 100
Standard Maintenance ..................................................................................................... 105
Analyzer Specifications ........................................................................................................ 105
Battery Pack Disposal: ......................................................................................................... 105
Shipping and Transportation (Air): ..................................................................................... 105
Product Registration: ........................................................................................................... 105
Customer Service Contact Information: .............................................................................. 106
Limited Warranty: ............................................................................................................... 106

Intended Use Statement


The Thermo Scientific Niton FXL is a field-mobile analytical instrument weighing 30 lbs (14 kg). It uses X-ray Fluorescence (XRF) for elemental analysis of samples of various matrices. These matrices can include, but are not limited to, metals, powders, liquids, and films. The system is intended to provide the user with non-destructive quantitative analysis (in ppm (mg/kg) or wt.%) of up to forty elements in any sample. The user can archive these results on the analyzer itself (up to 10,000 readings) or download the data to a PC or memory stick.
Factory calibrations are provided using reference samples or NIST-certified reference materials (when available). The analyzer is verified using an in-house testing protocol and is shipped with a copy of that verification (when available).
Use of the Niton FXL for purposes other than its intended use as an analytical instrument is at the user's discretion, with acknowledgement of the stated warnings against such usage. Please see the radiation safety section and the "how to analyze" section of the Thermo Scientific Niton FXL Resource Guide for further information on relevant safety requirements and on preparing samples for analysis.

Radiation and General Safety


CAUTION Thermo Scientific Niton analyzers are not intrinsically safe analyzers. All pertinent Hot Work procedures
should be followed in areas of concern.
WARNING Always treat radiation with respect. Do not attempt to override interlocks.

Warning Symbols

Figure 1. The Radiation Warning Symbol

On the fold-out Control Panel, below the Analysis Stop button, is the radiation warning symbol. This symbol means that the analyzer produces radiation when a sample is undergoing analysis, so proper care must be taken to minimize any risk to the operator.

Radiation and General Safety Overview

The Thermo Scientific Niton Model FXL analyzer contains an x-ray tube which emits radiation into a shielded
sample chamber. The x-ray tube emits radiation only when a sample is being measured. During this time, indicator
lights surrounding the analyzer light up to alert personnel that the x-ray tube is on - See Warning Lights on
page 15. The shielded sample chamber is designed to reduce radiation dose rates to less than 2.5 microsieverts (or
equivalently 0.25 millirem) per hour at a 5 centimeter distance from any point along the surface of the analyzer.
The sample chamber is also designed with an interlock system so that the x-ray tube cannot be energized unless the
sample chamber door is closed. The purpose of this interlock system is to prevent a person from placing any part of
their body into the primary, unshielded x-ray beam.
With the shielded and interlocked sample chamber, the Thermo Scientific Niton Model FXL analyzer can be
operated safely with minimal programmatic and administrative controls. The following list summarizes the key safe
use guidelines that all operators should be made aware of:
• Operators should be aware of the hazards and precautions associated with the use, storage, and transportation of lithium ion batteries. This information can be found in Standard Maintenance and Startup Operations in this Resource Guide.
• Operators should be provided basic radiation safety information. Consult with your local radiation safety authority to determine the specific operator training requirements in your jurisdiction. At a minimum, operators should read and understand the safety information in this Resource Guide.
• Where required by your local radiation safety authority, obtain authorization to use this analytical device. Authorization is typically granted in the form of a registration, license, certificate, or permit. Your local authority can be a valuable resource for safe use guidelines.
• It is recommended that your organization establish and follow a Radiation Protection Program.
• Use caution when opening and closing the LCD control panel and sample chamber door. Be aware of potential pinch points.
• Do not attempt to override interlocks or in any way interfere with their proper operation.
• The FXL model analyzer incorporates engineering controls that are designed to prevent access to the sample chamber during a measurement. Never place a part of your body in the sample chamber if the "X-RAY ON" indicator lights are on.
• Maintain the analyzer in accordance with the instructions in this User's Guide.
• High voltage and high intensity x-ray beam hazards exist inside the instrument. Do not attempt to disassemble or service your analyzer beyond the maintenance instructions described in this Resource Guide. Never remove parts or components except as described in this Resource Guide. Service must be performed at an authorized service center.
• Use caution when lifting or moving the analyzer to prevent strains or back injuries.

Radiation Protection Basics

Reasonable effort should always be made to maintain exposure to radiation as far below dose limits as is practical.
This is known as the ALARA (As Low as Reasonably Achievable) principle. For any given source of radiation, three
factors will help minimize your radiation exposure: Time, Distance, and Shielding.

Time

The longer you are exposed to a source of radiation the longer the radiation is able to interact in your body and the
greater the dose you receive. Dose increases in direct proportion to length of exposure.

Distance

The intensity of radiation becomes weaker as it spreads out from a source, since the same amount of radiation is spread over a larger area. Based on geometry alone, dose rate falls off with the inverse square of your distance from the source of radiation (additional dose rate reduction comes from attenuation in air).
For example, the radiation dose rate one foot from a source is nine times greater than the dose rate three feet from the source.
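The inverse-square relationship can be expressed as a short calculation. The following Python sketch is an illustration only, the dose-rate value used is hypothetical, not a measured value for this or any analyzer:

```python
def scale_dose_rate(rate_at_d1, d1, d2):
    """Scale a dose rate measured at distance d1 to distance d2
    using the inverse-square law (geometry only; real dose rates
    fall off slightly faster due to attenuation in air)."""
    return rate_at_d1 * (d1 / d2) ** 2

# Hypothetical example: 9.0 uSv/h measured at 1 ft.
# Moving to 3 ft reduces the geometric dose rate by a factor of
# nine, matching the one-foot vs. three-foot example above.
print(scale_dose_rate(9.0, 1.0, 3.0))  # approximately 1.0 uSv/h
```

The same function also illustrates the ALARA principle in reverse: halving your distance to a source quadruples the dose rate.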

Shielding

Shielding is any material that is placed between you and the radiation source. The more material between you and
the source, or the denser the material, the less you will be exposed to that radiation. The sample chamber of the
Thermo Scientific Niton Model FXL analyzer incorporates various forms of shielding.

Exposure to Radiation

Human dose from radiation is typically measured with either the unit sievert (Sv) or the unit rem. More commonly, for low-level occupational exposure, these units are expressed with prefixes, such as microsieverts (µSv) or millirem (mrem). The following conversions apply:
1 Sv = 100 rem
1 Sv = 1,000,000 µSv
1 Sv = 1,000 mSv
1 rem = 1,000 mrem
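Applying these conversion factors, a minimal Python sketch (for illustration only) converts between microsieverts and millirem; since 1 Sv = 100 rem, it follows that 1 mrem = 10 µSv:

```python
# Conversions between SI (sievert) and conventional (rem) dose
# units, per the factors listed above: 1 Sv = 100 rem, therefore
# 1 mrem = 10 uSv.
def usv_to_mrem(usv):
    return usv / 10.0

def mrem_to_usv(mrem):
    return mrem * 10.0

print(usv_to_mrem(2.5))  # 0.25 -- 2.5 uSv/h equals 0.25 mrem/h
print(mrem_to_usv(500))  # 5000.0 -- 500 mrem equals 5 mSv
```

The first example reproduces the analyzer's surface dose-rate specification; the second reproduces the gestation-period limit discussed later in this section.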
The following is a list of dose limits recommended by the International Commission on Radiological Protection
(ICRP) in ICRP Publication 60. These proposed limits are adopted by many countries worldwide.
Table 1. Recommended Dose Limits from ICRP Publication 60

  Application                    Occupational                          Public
  Effective dose (whole body)    20 mSv/year averaged over 5 years;    1 mSv in 1 year
                                 max 50 mSv/yr
  Annual equivalent dose to:
    Lens of eye                  150 mSv                               15 mSv
    Skin                         500 mSv                               50 mSv
    Hands & feet                 500 mSv                               --

Radiation dose to personnel from a properly used Niton FXL analyzer should be well within these example dose limits, even if the analyzer is used as much as 2,000 hours per year. These limits are examples only; it is important that users consult with local authorities to determine the limits that apply to their specific use situations.
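As an illustration of that 2,000-hour figure, the following Python sketch (a deliberately pessimistic, assumption-laden estimate, not a measured result) combines the analyzer's maximum surface dose-rate specification of 2.5 µSv per hour with 2,000 hours of annual use:

```python
# Worst-case annual dose estimate: assumes an operator spends
# 2,000 hours per year continuously at the analyzer's maximum
# surface dose rate of 2.5 uSv/h. Real operator positions and
# duty cycles give far lower doses, so this overstates exposure.
dose_rate_usv_per_h = 2.5
hours_per_year = 2000

annual_dose_msv = dose_rate_usv_per_h * hours_per_year / 1000.0
print(annual_dose_msv)  # 5.0 mSv -- well under the 20 mSv/yr occupational limit
```

Even under this pessimistic assumption, the estimate is a quarter of the ICRP occupational effective-dose limit shown in Table 1.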

Other
Sources of
Radiation
Dose

Shown in Table 2 are the typical background radiation doses received by the average member of the public. The radiation dose limits for radiation workers in the US are shown in Table 3.
Table 2. Typical Radiation Doses Received (Source: NCRP 1987)

  Category                                                Dose in mrem    Dose in mSv
  Average total dose in US (annual)                       360             3.6
  Average worker exposure (annual)                        210             2.1
  Average exposure for an underground miner               400             4.0
  Exposure for airline crew (1,000 hours at 35,000 ft)    500             5.0
  Additional from living in Denver at 5,300 ft (annual)   25              0.25
  Additional from 4 pCi/l radon in home                   1,000           10.0
  Typical chest x-ray                                     6               0.06
  Typical head or neck x-ray                              20              0.2
  Typical pelvis/hip x-ray                                65              0.65
  Typical lumbar spine x-ray                              30              0.3
  Typical upper G.I. x-ray                                245             2.45
  Typical barium enema x-ray                              405             4.05
  Typical CAT scan                                        110             1.10

Table 3. Annual Occupational Dose Limits for Radiation Workers (Source: Code of Federal Regulations Title 10, Part 20)

  Category                                                        Dose in mrem    Dose in mSv
  Whole Body                                                      5,000           50
  Pregnant Worker (during gestation period)                       500             5
  Eye Dose Equivalent                                             15,000          150
  Shallow dose equivalent to the skin or any extremity or organ   50,000          500
  Maximum allowable dose for the general public (annual)          100             1.0
  For a Minor                                                     500             5.0

Monitoring Your Radiation Dose

Individuals can be monitored for the radiation dose they receive by use of radiation dosimetry devices (dosimeters).
In some locations, dosimetry is required by regulation and in others it is optional. Thermo Fisher Scientific
recommends that you determine and obey the local regulatory requirements concerning radiation monitoring of
occupational workers.
Two common types of dosimeters are whole-body badges and ring badges. Whole-body badges are often attached to the user's torso (e.g., clipped to the collar, shirt pocket, or waist as appropriate). A ring badge is worn on the finger as a measure of maximum extremity dose. The dosimeter should be worn on the part of the body that is expected to receive the highest dose. This location will depend on how the analyzer is used, and so it may not be the same for all users.
Dosimetry services are offered by many companies. Two companies offering dosimetry services in the USA and
much of the world are:
Table 4. Sources for Dosimeters

  Company         Global Dosimetry Solutions    Landauer, Inc.
  Address         2652 McGaw Avenue             2 Science Road
  City and State  Irvine, CA 92614              Glenwood, IL 60425-9979
  Website         www.dosimetry.com             www.landauerinc.com
  Phone Number    (800) 251-3331                (800) 323-8830

Note Wearing a dosimeter badge does not protect you against radiation exposure. A dosimeter badge only measures
your exposure (at the dosimeter location).

Pregnancy and Radiation Exposure

International guidance documents (e.g., ICRP Publication 60 and NCRP Publication 116*) recommend that the
radiation dose to the embryo/fetus of a pregnant woman should not exceed a total of 5 mSv (500 mrem) during the
gestation period. While this dose limit far exceeds any anticipated operator dose, pregnant workers may choose to
take special precautions to reduce their exposure to radiation. For more information see the U.S. NRC Regulatory
Guide 8.13 "Instruction Concerning Prenatal Radiation Exposure".
* The International Commission on Radiological Protection (ICRP) is an independent registered charity, established to advance for the public benefit the science of radiological protection, in particular by providing recommendations and guidance on all aspects of protection against ionizing radiation.
* The National Council on Radiation Protection and Measurements (NCRP) was chartered by the U.S. Congress in 1964.

How to Use your Thermo Scientific Niton FXL Analyzer Safely

The Thermo Scientific Niton FXL analyzer is designed to be safe to operate provided that it is used in accordance with the manufacturer's instructions listed below.
• Operators should be aware of the hazards and precautions associated with the use, storage, and transportation of lithium ion batteries. This information can be found in Standard Maintenance and Startup Operations in this Resource Guide.
• Operators should be provided basic radiation safety information. Consult with your local radiation safety authority to determine the specific operator training requirements in your jurisdiction. At a minimum, operators should read and understand the safety information in this Resource Guide.
• Where required by your local radiation safety authority, obtain authorization to use this analytical device. Authorization is typically granted in the form of a registration, license, certificate, or permit. Your local authority can be a valuable resource for safe use guidelines.
• It is recommended that your organization establish and follow a Radiation Protection Program.
• Use caution when opening and closing the LCD control panel and sample chamber door. Be aware of potential pinch points.
• Do not attempt to override interlocks or in any way interfere with their proper operation.
• The Thermo Scientific Niton FXL model analyzer incorporates engineering controls that are designed to prevent access to the sample chamber during a measurement. Never place a part of your body in the sample chamber if the "X-RAY ON" indicator lights are on.
• Maintain the analyzer in accordance with the instructions in this User's Guide.
• High voltage and high intensity x-ray beam hazards exist inside the instrument. Do not attempt to disassemble or service your analyzer beyond the maintenance instructions described in this User's Guide. Never remove parts or components except as described in this User's Guide. Service must be performed at an authorized service center.
• Use caution when lifting or moving the analyzer to prevent strains or back injuries.

Radiation Dose Rate Specifications

The Niton FXL Analyzer is designed to limit the radiation dose rate to no more than 2.5 µSv (0.25 mrem) per hour at any point 5 cm from the instrument surface under worst-case operating conditions. Worst-case operating conditions are as follows:
• x-ray tube voltage at its maximum of 50 kilovolts
• x-ray tube current at its maximum of 0.2 milliamps
• lightest filter combination possible for a 50 kilovolt tube setting
• an unattenuated primary x-ray beam (i.e., no sample present), or
• a solid plastic sample present to maximize scatter radiation.
Surveys under the above stated worst-case conditions have been performed and demonstrate that the Niton FXL Analyzer produces no more than a 2.5 µSv (0.25 mrem) per hour dose rate above background at a 5 cm distance from any point along the surface. The manufacturer used a Thermo Scientific Model MicroRem LE dose rate meter to perform these measurements.

Storage and Transportation

Storage

Regulations in some jurisdictions may require that you store your analyzer in a secured area to prevent access, use,
and/or removal by unauthorized individuals. Storage requirements may vary by location, particularly with regard to
storage at temporary job sites or away from your primary storage location such as hotels and motels and in vehicles.
You should contact your local Radiation Control Authority to identify the specific storage requirements in your
jurisdiction.
Transportation

Transport of lithium ion batteries is regulated by most transport authorities. End-users should obtain additional
information and training regarding the local requirements for transport of lithium ion batteries, as appropriate for
the specific transport modes that may be used. In particular, for air transport, most jurisdictions have adopted the
regulatory guidance published by the International Air Transport Association (IATA). These IATA regulations
provide instructions for the safe transport of lithium ion batteries by air in Packing Instructions 965 (for batteries
packed alone) and Packing Instruction 966 (for batteries packed with equipment). You will find additional
information about lithium ion battery safety and transportation in the Startup Operations and Standard
Maintenance Sections of this guide.
It is recommended that you ship the Niton FXL in its original shipping container and foam to protect the sensitive measuring equipment inside the analyzer.

Emergency Response Information

Thermo Fisher Scientific Niton Analyzer Contact Numbers

Contact your local authorized service center whenever you suspect that an analyzer has been damaged or is malfunctioning in a way that may prevent it from being used safely. The following contact information should be made available to all operators for this purpose. Additionally, your local radiation safety authority should be notified of any event that may have resulted in a radiation overexposure.
Main Number (USA): (800) 875-1578
Additional Radiation Emergency #'s: (978) 790-8269 or (617) 901-3125
Outside the USA - Local Niton Service Center:___________________
Europe
Niton Analyzers Europe
Munich, Germany
Phone: +49 89 3681 380
Fax: +49 89 3681 3830
Email: niton.eur@thermofisher.com
Asia
Niton Analyzers Asia
Hong Kong
Phone: +852 2869-6669
Fax: +852 2869-6665
Email: niton.asia@thermofisher.com

Registration and Licensing

As a user of a Niton FXL analyzer, you may be required to register or obtain a license, certificate, or permit with your
local radiation control authority. In the US, if you intend to do work with your Niton FXL in states other than your
own, you may be required to register there as well. See the Safety and Compliance Web Hub in the Resource Guide
for much more information about US regulatory requirements.

Hardware Tour
Front View

This is the front of the Niton FXL Analyzer.

Figure 1. Front View

Front View - Open

This is the front of the Niton FXL Analyzer, opened for access to the Control Panel and the Cover.

Figure 2. Open Cover for Front View

Left View

This is the left side of the Niton FXL Analyzer.

Figure 3. Left View

Rear View

This is the rear of the Niton FXL Analyzer.

Figure 4. Rear View


Right View

This is the right side of the Niton FXL Analyzer.

Figure 5. Right View


Control Panel

This is the Control Panel of the Niton FXL Analyzer. All display and control buttons are located here.

Figure 6. Control Panel

Touch Screen Display

This is the Touch Screen Display. This is your interface to analyzer operation and setup. If you prefer, you can use a
USB mouse instead of the touch screen.

Figure 7. Touch Screen Display


X-Y Jog Button

This is the X-Y Jog Button. It controls movement of the X-Y Table, and thus the position of the measurement window under the sample. See X-Y Movement on page 75.

Figure 8. The X-Y Jog Button

Start Button

This is the Start Button. Press this button to begin analysis.

Figure 9. The Start Button


Stop Button

This is the Stop Button. Press this button to stop analysis.

Figure 10. The Stop Button

Cover

This is the cover. It serves to shield the operator from radiation, and must be fully shut before analysis can begin.

Figure 11. The Cover


Warning Lights

These are the Warning Lights. They serve to alert you that an analysis is taking place. Do not open the cover while
these lights are lit. See Radiation and General Safety on page 1 for more information.

Figure 12. The Warning Lights

Ports

These are the input and output ports, as well as the main On-Off Switch of the analyzer. When not in use, the ports are normally protected by a flexible rubber cover, which needs to be flipped down to expose the port for use.

Figure 13. The Ports


USB Ports

These are the USB Ports of the analyzer. There are four of them. They are used for communication with devices of
various kinds.

Figure 14. The USB Ports

Download Port

This is the Download Port for the analyzer. It is used to exchange data between the analyzer and your computer. It is a special USB port (Mini-USB) dedicated to communication with the computer.

Figure 15. Download Port

16

Power Port

This is the Power Port for the analyzer. Inserting the proper power cable into the port powers the analyzer. Select the
Power Port to see the Power Supply.

Figure 16. The Power Port

Helium Port

This is the Helium Port of the analyzer. This port is used to supply helium to the measurement head for better
analysis of light metals.

Figure 17. The Helium Port

17

On-Off
Switch

This switch turns your analyzer on or off if the power cable is properly inserted into the Power Port.

Figure 18. The On-Off Switch

Power Supply

This is the Power Supply for the analyzer. It supplies electricity to the analyzer, and trickle-charges the battery when
plugged in.

Figure 19. The Power Supply

Ground Port

This is the Ground Port for the analyzer. Its purpose is to supply separate grounding for the analyzer.

Figure 20. The Ground Port

18

Battery

This is the analyzer's battery, allowing it to run without being plugged into a power supply. Below the battery is a
picture of the battery's charge indicator, which is located on the battery's upper inside face, opposite the release
button.

Figure 21. The Battery

19

20

Startup Operations
Starting Up
your Analyzer
First

Read and follow the instructions in the laminated Quick Start Guide that came with your analyzer. If you have lost
or cannot find your Quick Start Guide, view and print the Quick Start Guide PDF included with this Resource
Guide.

Turning the
Analyzer On

Make sure that your analyzer is plugged into a wall outlet using the AC Adapter provided, or that the analyzer has a
charged battery properly inserted. Never use an AC Adapter that was not designed for this analyzer. See
Power Port on page 17.
At the back of your analyzer, press the On/Off rocker switch into the On position. See On-Off Switch on
page 18.
The analyzer will start up after initializing. This will take a few seconds.
When the Startup Screen appears, your analyzer is ready to be used.

Figure 1. The Startup Screen

21

Logging In

Touch the Startup Screen anywhere to get the Logon Screen. See Virtual Keyboard on page 69

Figure 2. The Logon Screen

When the Logon Screen appears, use the Virtual Keyboard to log in, selecting each number of the password in
sequence, followed by the Enter Key. The default sequence is 1234.

Performing a
System
Check

Select the System Check Icon from the System Menu to perform a system check. Thermo Scientific recommends
that you perform a system check once every working day, as part of your normal startup procedure, after allowing a
minute or so for the analyzer to warm up. See The System Menu on page 48

Figure 3. System Check Icon on System Menu

22

While performing the system check, your screen will show a progress bar indicating the progress of the check. When
it is done, the screen will show 100% completion. A second check will then start. When this second check is
complete, the system check is finished.

Figure 4. Performing a System Check

If you see any final result other than "System OK", perform another system check. If the result is still not "System
OK", please notify Thermo Scientific Service at 800-875-1578.

Batteries

Batteries must be inserted if you want your analyzer to work without being connected to an AC outlet. The battery
is inserted into the cavity on the lower right hand side of your analyzer.

Inserting the
Battery

Tilt your analyzer to the left, making the right side rise high enough for the battery housing to slip beneath
when held in the proper, upright position.
Holding the battery upright, with the curved side and Latch Button down and the flat side with the
connector up, place it under the cavity on the right side of your analyzer. Make sure that the rib on either
side of the battery aligns properly with the groove on either side of the cavity.

Figure 5. Inserting the Battery

23

Push the battery up, sliding the rib through the groove smoothly until the battery clicks solidly into place.

Removing the
Battery

Tilt your analyzer to the left, making the right side rise high enough for the battery housing to slip beneath
when held in the proper, upright position.

Figure 6. Battery Inserted

Press the Latch Button while pushing the battery up.


The battery will slide down into your hands.
Slide the battery out completely before setting your analyzer back upright.

Figure 7. Battery Removed

Charging the
Battery

With the battery properly inserted into your analyzer, and the analyzer drawing power through the AC
Adapter, the battery will automatically be trickle-charged.

24

If you have ordered a spare battery, the Battery Charger which comes with the spare battery can safely charge
batteries offline.
WARNING Do not attempt to use the AC Adapter directly to charge batteries! The AC Adapter does not contain the
circuitry necessary to allow safe charging of a battery. Only recharge batteries through your analyzer or with the
Battery Charger supplied with the spare battery.
See Standard Maintenance on page 105.
Do not place the battery pack or cells on or near fires, heaters and other high temperature locations or apply
heat to the battery.
Do not pierce the battery with any sharp objects, strike the battery with a hammer, tools, or heavy objects,
step on the battery pack, or otherwise damage the outer casing.
Do not subject the battery pack to strong impacts or shocks.
Do not expose the battery to water or any other type of liquid, or allow the battery to get wet.
Do not leave the battery in direct sunlight, and avoid storing spare battery packs inside cars in extreme hot
weather. Doing so may cause the battery to generate heat, rupture, or ignite. Using the battery in this
manner may also result in a loss of performance and a shortened life expectancy. When a battery becomes too
hot, the built-in protection circuitry is activated, preventing the battery from charging further. Heating the
battery can destroy the safety devices, and can cause additional heating, rupture or ignition of the battery
cells.
Never short-circuit, reverse polarity, disassemble, damage or heat the battery pack over 100 °C (212 °F).
If an exposed lithium-ion battery does start a fire, it may burn even more violently if it comes into contact
with water or even the moisture in the air. DO NOT THROW WATER ON A BURNING LI-ION
BATTERY! A class C fire extinguisher must be used.
Although most battery packs have protected (recessed) connectors, do not carry individual battery packs in
your pockets as they could short-circuit against other metal items.
In the case of a high-impact event to the test system or the battery pack (e.g. car crash or drop > 0.75m/30
in) you must carefully inspect the battery for damage and properly discard it if damaged. Always observe the
battery carefully for at least 20 minutes after an impact. The pack may look fine but a perforation or
damaged wire means the pack must be disposed of according to local regulation. Contact Thermo Fisher
Scientific Inc. if in doubt.
Do not disassemble or modify the battery pack. The battery contains safety and protection devices which, if
damaged, may cause the battery to generate heat, rupture or ignite.
Any modification may damage the battery pack or cells and will invalidate any warranty claim.
If you happen to get any electrolyte from the cells on your skin, wash thoroughly with soap and water. If in
your eyes, do not rub. Rinse thoroughly with water and seek medical assistance.
Keep battery packs away from untrained personnel and children!

Charging and
Storing Battery
Packs

Note For safety reasons, rechargeable battery packs are not fully charged when they are shipped.
Please read the following instructions carefully:
New battery packs need to be fully charged and discharged up to five times before performing at full capacity.
Always use the Thermo Fisher Scientific Inc. charger that came with the device. Do not attempt to charge
the battery pack by any other means.
The portable instrument, its external chargers, and the battery pack itself continuously monitor the
conditions of the cells for safety and maximum performance.
Do not use the Thermo Fisher Scientific Inc. charger with other lithium batteries or on any other type of
battery; fire or explosion may occur.
Never modify or repair the charger supplied.
Never use a NiCd charger or any other charger to recharge the Li-ion battery pack as this is very dangerous.
Never charge your Li-Ion battery pack near heat or flammable objects.

25

The required charging time will depend upon the remaining charge level of the battery, and will vary from
product to product. Charging the battery while the test set is being used will increase the charging time.
Required charging time may also increase at lower temperatures.
The temperature range over which the battery can be charged is typically 0 °C to 45 °C (32 °F to 113 °F).
Therefore, charging efforts outside the prescribed temperature range may automatically be blocked by the
protection circuitry of the battery pack.
If a battery pack can not maintain charge for long periods, even when it is being charged correctly, this may
indicate it is time to replace the battery.
If the product or battery pack becomes too hot to the touch during charging, disconnect and switch off
immediately. Contact Thermo Scientific.
Do not charge battery packs if the battery has expanded or swollen in size, or if the battery cells have been
punctured, even if this is the first time the battery is going to be charged.
Do not charge or use the battery if any mechanical damage has occurred.
Do not continue charging the battery if it does not recharge within the specified charging time. Doing so
may cause the battery to become hot, rupture, or ignite. Please consult the product's manual and datasheet.
Because batteries utilize a chemical reaction, battery performance will naturally deteriorate over time, even if
stored for a long period without being used. In addition, if the various conditions such as charge, discharge,
ambient temperature, etc. are not maintained within the specified ranges, the life expectancy of the battery
may be shortened, or the device in which the battery is used may be damaged by electrolyte leakage.
Storage: For long term storage, the battery pack should be stored at room temperature (around 20 °C/68 °F),
charged at about 30 to 50% of its capacity. We recommend that spare battery packs are charged and used at
least once per year to prevent over-discharge.
If you have spare (extra) battery packs, rotate the packs regularly, so they all stay active and avoid
over-discharge. It is recommended to charge and use battery packs at least every three months. Battery packs
shall not go without reconditioning (recharging) for more than six months.
After extended storage battery packs may reach deep discharge state and/or enter into sleep mode. For safety
reasons, Li-ion batteries in deep discharge state may take up to 24 hours to pre-recharge, before starting the
regular fast charging cycle.
Charging indicators (e.g. LEDs) may not turn on during the pre-charging state.
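The charging-temperature rule above can be sketched as a simple guard that mirrors what the pack's built-in protection circuitry does. This is an illustrative Python sketch only, using the 0 °C to 45 °C window stated in this guide; the function name and structure are hypothetical and are not part of any Thermo Scientific software.

```python
# Illustrative sketch of the protection-circuit behavior described above:
# charging is blocked outside the typical 0-45 °C window. The constants
# come from this guide; the function itself is hypothetical.

CHARGE_MIN_C = 0.0    # typical lower charging limit (32 °F)
CHARGE_MAX_C = 45.0   # typical upper charging limit (113 °F)

def charging_allowed(ambient_c: float) -> bool:
    """Return True if the ambient temperature permits charging."""
    return CHARGE_MIN_C <= ambient_c <= CHARGE_MAX_C
```

For example, charging at room temperature (20 °C) is permitted, while charging below freezing or above 45 °C would be blocked.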

Using Battery
Packs:

For the expected performance of each individual battery pack and test set, please refer to their specific manuals and
datasheets.
Thermo Fisher Scientific battery packs shall only be used with the Thermo Fisher Scientific product for
which they were intended to be used. Follow the product's low battery indication and warnings. Do not over
discharge a Li-ion battery pack. If the voltage does drop below specifications, and you can get your battery to
take a charge, it may not give its full capacity and deterioration in performance will occur. This will
invalidate all warranty claims.
Do not discharge the battery pack using any device except for the specified test set it came with. The test set
constantly monitors and controls the discharge rate to keep it within specifications. If used in devices aside
from the specified devices, it may damage the performance of the battery pack, reduce its life expectancy, and
if such device causes an abnormal current flow, it may cause the battery to become hot, rupture, or ignite,
and could cause serious injuries.
The temperature range over which the battery can be discharged is -10 °C to 60 °C (14 °F to 140 °F). Use of
batteries at temperatures outside this range may damage the performance of the battery pack or may reduce
its life expectancy.
To avoid short circuits, make sure the battery pack's contacts are not exposed when transported outside the
intended device (e.g. spares).
Every deep discharge cycle decreases a battery pack's capacity. Battery life will be extended by proper storage, and by
charging the pack at least once per year to prevent over-discharge.

26

Basic Operation
The Top Menu

Select the Top Menu button that you want to know more about.
The Test Menu on page 39
The Data Menu on page 40
The Method Setup Menu on page 41
The System Menu on page 48
The Start/Stop Button on page 50

Taking a
Sample
Analysis

1. Clean the sample to be analyzed so it is free of all surface contamination.

27

2. Place the sample so that it covers the analysis window.

28

3. Shut the cover securely.

4. Select the Method Setup Button.

29

4a. Select the proper Mode (in this case Mining Cu/Zn) from the Mode Menu.
Note See Analysis Modes on page 34 for more information on the Modes available.

5. Select the Test Button.

30

5a. Select the vertical Data Button on the lower left of the screen if you wish to do any data entry.

5b and 5c. Enter the data on the sample using the Virtual Keyboard.

31

6. Select the Start Button, or set up a Batch Analysis.

7. When the sample has been sufficiently analyzed, select the Stop Button.

32

8. View the composition returned.

8a. You can select to display All elements, whether detected or not, or only Detected elements by selecting the
appropriate button.

33

9. Remove the sample.

Analysis
Modes

Your analyzer has several Analysis Modes. Which Analysis Mode you should use depends on the nature of the sample
you are attempting to analyze.

General Metals
Mode

Use this mode to analyze samples entirely composed of metal alloys. This mode will attempt to return an Alloy
Grade Identification by matching the analyzed composition of the sample with the nominal composition of alloys in
the analyzer's Alloy Grade Library. It will also return an elemental composition of the alloy as analyzed. Alloy
Composition is output by default in terms of percent of composition by weight.
See Using General Metals Mode on page 35.

Electronic
Metals Mode

Use this mode to analyze electronic component samples - circuit boards, chips, etc. This mode will attempt to return
an Alloy Grade Identification by matching the analyzed composition of the sample with the nominal composition of
electronic alloys in the analyzer's Alloy Grade Library. It will also return an elemental composition of the electronic
alloy as analyzed. Electronic Metal Composition is output by default in terms of percent of composition by weight.
See Using Electronic Metals Mode on page 36.

Precious
Metals Mode

Use this mode to analyze samples composed primarily of precious metals. This mode will attempt to return an Alloy
Grade Identification by matching the analyzed composition of the sample with the nominal composition of alloys in
the analyzer's Precious Alloy Grade Library. It will also return an elemental composition of the precious metal
sample as analyzed. Precious Alloy Composition is output by default in terms of parts per million.
See Using Precious Metals Mode on page 36.

Plastics Mode

Use this mode to analyze samples composed primarily of plastic. This mode will return an elemental composition of
the plastic sample as analyzed. Plastic Composition is output by default in terms of parts per million.

34

See Using Plastics Mode on page 36.

Soils Mode

Use this mode to analyze samples composed primarily of soil and rock. This mode will return an elemental
composition of the soil sample as analyzed. Soil Composition is output by default in terms of parts per million.
See Using Soils Mode on page 37.

Mining Cu/Zn
Mode

Use this mode to analyze samples composed of potential metal ore - rock containing high proportions of metal - and
containing Cu and/or Zn. This mode will return an elemental composition of the ore sample as analyzed. Ore
Composition is output by default in terms of percent of composition by weight.
See Using Mining Cu/Zn Mode on page 37.

Mining Ta/Hf
Mode

Use this mode to analyze samples composed of potential metal ore - rock containing high proportions of metal - and
containing Ta and/or Hf. This mode will return an elemental composition of the ore sample as analyzed. Ore
Composition is output by default in terms of percent of composition by weight.
See Using Mining Ta/Hf Mode on page 37.

TestAll Mode

Use this mode to analyze samples composed of unknown and/or mixed composition, such as toys and consumer
products. This mode will attempt to return a general Material Identification by comparing the analysis with other
general types of materials. It will select the proper sub-mode for analysis and return an elemental composition of the
sample as analyzed. Material Elemental Composition is output by default in terms of parts per million.
See Using TestAll Mode on page 38.

TestAll Geo
Mode

Use this mode to analyze powder, mineral, and ore samples without first determining whether the samples would
best be analyzed with Mining or Soil Mode. This mode uses both the Compton Normalization calibration (Soil) and
the Fundamental Parameters calibration (Mining) to determine whether the soil calibration is acceptable or whether
the total metal content is too high for Compton mode. It will then return an elemental composition of the sample as
analyzed. If the sample can be analyzed via soil mode, then the analyzer will display results from both Soil and
Mining Modes in one unified list. If both calibrations contain the same element, then the mode that has the lower
detection limit will be displayed. Material Elemental Composition is output by default in terms of both parts per
million (mg/kg) and percent of composition by weight, with 0.10% being the cutoff point.
Note Due to the nature of this mode, your analyzer will only use factory calibrations. User modified Cal Factors will
not be available.
See Using TestAll Geo Mode on page 38.
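The TestAll Geo selection rule described above - when both calibrations report the same element, the result with the lower detection limit is displayed, and the 0.10% cutoff decides between percent and ppm display - can be sketched in Python. This is an illustrative sketch only: the function names and the tuple layout are assumptions for clarity, not the analyzer's actual software or data format.

```python
# Hedged sketch of the TestAll Geo merging rule described above.
# Each input maps element -> (concentration_ppm, detection_limit_ppm);
# both names and structures are hypothetical.

def merge_results(soil: dict, mining: dict) -> dict:
    """Keep, per element, the reading with the lower detection limit."""
    merged = dict(mining)
    for element, reading in soil.items():
        if element not in merged or reading[1] < merged[element][1]:
            merged[element] = reading
    return merged

def format_concentration(ppm: float) -> str:
    """Apply the guide's stated 0.10% ppm/percent display cutoff."""
    percent = ppm / 10_000          # 1% by weight == 10,000 ppm (mg/kg)
    if percent >= 0.10:
        return f"{percent:.2f}%"
    return f"{ppm:.0f} ppm"
```

For example, a copper reading present in both calibrations would be taken from whichever mode resolves it with the lower detection limit, and 60,000 ppm would display as 6.00% while 500 ppm stays in ppm.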

Using
General
Metals Mode

1. Clean the sample to be analyzed so it is free of all surface contamination, grinding the surface if appropriate.
2. Place the sample so that it covers the analysis window.
3. Shut the cover securely.
4. Select the Method Setup Button.
a. Select General Metals from the Mode Menu.
5. Select the Test Button.
a. Select the vertical Data Button on the lower left of the screen if you wish to do any data entry.
b. Enter the data on the sample using the Virtual Keyboard.
6. Select the Start Button.
7. When the sample has been sufficiently analyzed, select the Stop Button.
8. View the composition returned.
a. You can select to display All elements, whether detected or not, or only Detected elements by selecting the
appropriate button.
9. Remove the sample.

35

Using
Electronic
Metals Mode

1. Clean the sample to be analyzed so it is free of all surface contamination.


2. Place the sample so that it covers the analysis window.
3. Shut the cover securely.
4. Select the Method Setup Button.
a. Select Electronic Metals from the Mode Menu.
5. Select the Test Button.
a. Select the vertical Data Button on the lower left of the screen if you wish to do any data entry.
b. Enter the data on the sample using the Virtual Keyboard.
6. Select the Start Button.
7. When the sample has been sufficiently analyzed, select the Stop Button.
8. View the composition returned.
a. You can select to display All elements, whether detected or not, or only Detected elements by selecting the
appropriate button.
9. Remove the sample.

Using Precious
Metals Mode

1. Clean the sample to be analyzed so it is free of all surface contamination, grinding the surface if appropriate.
2. Place the sample so that it covers the analysis window.
3. Shut the cover securely.
4. Select the Method Setup Button.
a. Select Precious Metals from the Mode Menu.
5. Select the Test Button.
a. Select the vertical Data Button on the lower left of the screen if you wish to do any data entry.
b. Enter the data on the sample using the Virtual Keyboard.
6. Select the Start Button.
7. When the sample has been sufficiently analyzed, select the Stop Button.
8. View the composition returned.
a. You can select to display All elements, whether detected or not, or only Detected elements by selecting the
appropriate button.
9. Remove the sample.

Using Plastics
Mode

1. Clean the sample to be analyzed so it is free of all surface contamination.


2. Place the sample so that it covers the analysis window.
3. Shut the cover securely.
4. Select the Method Setup Button.
a. Select Plastics from the Mode Menu.
5. Select the Test Button.
a. Select the vertical Data Button on the lower left of the screen if you wish to do any data entry.
b. Enter the data on the sample using the Virtual Keyboard.
6. Select the Start Button.
7. When the sample has been sufficiently analyzed, select the Stop Button.

36

8. View the composition returned.


a. You can select to display All elements, whether detected or not, or only Detected elements by selecting the
appropriate button.
9. Remove the sample.

Using Soils
Mode

1. Pack the sample into a Sample Cup.


2. Place the sample so that it covers the analysis window, with the clear Mylar or Polypropylene film facing down
over the analysis window.
3. Shut the cover securely.
4. Select the Method Setup Button.
a. Select Soils from the Mode Menu.
5. Select the Test Button.
a. Select the vertical Data Button on the lower left of the screen if you wish to do any data entry.
b. Enter the data on the sample using the Virtual Keyboard.
6. Select the Start Button.
7. When the sample has been sufficiently analyzed, select the Stop Button.
8. View the composition returned.
a. You can select to display All elements, whether detected or not, or only Detected elements by selecting the
appropriate button.
9. Remove the sample.

Using Mining
Cu/Zn Mode

1. Clean the sample to be analyzed so it is free of all surface contamination.


2. Place the sample so that it covers the analysis window.
3. Shut the cover securely.
4. Select the Method Setup Button.
a. Select Mining Cu/Zn from the Mode Menu.
5. Select the Test Button.
a. Select the vertical Data Button on the lower left of the screen if you wish to do any data entry.
b. Enter the data on the sample using the Virtual Keyboard.
6. Select the Start Button.
7. When the sample has been sufficiently analyzed, select the Stop Button.
8. View the composition returned.
a. You can select to display All elements, whether detected or not, or only Detected elements by selecting the
appropriate button.
9. Remove the sample.

Using Mining
Ta/Hf Mode

1. Clean the sample to be analyzed so it is free of all surface contamination.


2. Place the sample so that it covers the analysis window.
3. Shut the cover securely.
4. Select the Method Setup Button.
a. Select Mining Ta/Hf from the Mode Menu.
5. Select the Test Button.

37

a. Select the vertical Data Button on the lower left of the screen if you wish to do any data entry.
b. Enter the data on the sample using the Virtual Keyboard.
6. Select the Start Button.
7. When the sample has been sufficiently analyzed, select the Stop Button.
8. View the composition returned.
a. You can select to display All elements, whether detected or not, or only Detected elements by selecting the
appropriate button.
9. Remove the sample.

Using TestAll
Mode

1. Clean the sample to be analyzed so it is free of all surface contamination.


2. Place the sample so that it covers the analysis window.
3. Shut the cover securely.
4. Select the Method Setup Button.
a. Select TestAll from the Mode Menu.
5. Select the Test Button.
a. Select the vertical Data Button on the lower left of the screen if you wish to do any data entry.
b. Enter the data on the sample using the Virtual Keyboard.
6. Select the Start Button.
7. When the sample has been sufficiently analyzed, select the Stop Button.
8. View the composition returned.
a. You can select to display All elements, whether detected or not, or only Detected elements by selecting the
appropriate button.
9. Remove the sample.

Using TestAll
Geo Mode

1. Clean the sample to be analyzed so it is free of all surface contamination.


2. Place the sample so that it covers the analysis window.
3. Shut the cover securely.
4. Select the Method Setup Button.
a. Select TestAll Geo from the Mode Menu.
5. Select the Test Button.
a. Select the vertical Data Button on the lower left of the screen if you wish to do any data entry.
b. Enter the data on the sample using the Virtual Keyboard.
6. Select the Start Button.
7. When the sample has been sufficiently analyzed, select the Stop Button.
8. View the composition returned.
a. You can select to display All elements, whether detected or not, or only Detected elements by selecting the
appropriate button.
9. Remove the sample.

38

The Menu System


The Test
Menu

The Test Menu is a screen divided into five areas.

Figure 1. The Test Menu

The Top Menu
Area

This area is always the same, except for the right-hand most button. There are four buttons in a strip running
horizontally across the top of the screen - Test, Data, Method Setup, and System - and a fifth, separate button to the
right, the Start or Stop Button. When the analyzer is analyzing, the Stop Button appears here. Pressing it stops the
analysis. At all other times, the Start Button is here. Pressing this button initiates an analysis.

The ID Area

This area displays the identification of the material, as well as meta-data associated with the reading such as the
Reading Number, the Analysis Mode, the Analysis Time, the size of the Small Spot, and the filters used.

The Video Area

This area displays a live video of the sample on the Analysis Window, as well as a button which both displays the
current Small Spot size and, when selected, brings up a menu to change between 8 mm, 3 mm, and 1 mm Small Spot
diameters.

The Display
Area

This area displays the elemental concentrations of the latest analysis. It can be displayed two ways - Detected, which
shows only elements detected to be above the Level of Detection, and All, which shows the concentrations of all
elements. The Display area shows the element, the concentration of that element in ppm or percent weight, and the
Sigma Value of the analysis. Each of these columns can be sorted to be either ascending or descending.
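The sortable three-column display described above can be sketched as a small helper. This Python sketch is purely illustrative: the row layout and function name are assumptions, not the analyzer's software.

```python
# Illustrative sketch of sorting the Display Area's columns described
# above: each row holds element, concentration, and sigma, and any
# column can be ordered ascending or descending. Names are hypothetical.

def sort_rows(rows, column, descending=False):
    """rows: list of dicts with 'element', 'concentration', 'sigma' keys."""
    return sorted(rows, key=lambda r: r[column], reverse=descending)
```

Sorting the same rows by concentration descending, or by element name ascending, simply re-orders the list without changing any values.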

The Data
Entry/Spectrum Display
Area

This area shows the data entered about the sample analyzed, or the spectrum of the analysis. To show the data
entered, select the Data button. To show the spectrum, select the Spectra button. The Data Entry display shows the
name of the field, the value input into that field, and a Keyboard Icon to bring up the Virtual Keyboard.

39

The Data
Menu

The Data Menu is a screen divided into four areas.

Figure 2. The Data Menu

The Top Menu
Area

This area is always the same, except for the right-hand most button. There are four buttons in a strip running
horizontally across the top of the screen - Test, Data, Method Setup, and System - and a fifth, separate button to the
right, the Start or Stop Button. When the analyzer is analyzing, the Stop Button appears here. Pressing it stops the
analysis. At all other times, the Start Button is here. Pressing this button initiates an analysis.

The Readings
Listing Area

This area shows your previous readings, listed by reading number, five at a time. There are four buttons at the top of
the listing which allow you to browse through the readings:
The |<< button lists the first five readings in memory.
The < button lists the previous five readings in memory.
The > button lists the next five readings in memory.
The >>| button lists the last five readings in memory.
You can select any of the readings displayed in the list.
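The five-at-a-time paging behavior of the four browse buttons can be sketched as follows. This is an illustrative Python sketch under the guide's description; the helper names are assumptions, not the analyzer's actual software.

```python
# Minimal sketch of the Readings Listing paging described above:
# readings are shown five at a time, and the |<<, <, >, and >>| buttons
# move between pages. Function names are hypothetical.

PAGE_SIZE = 5

def page_of_readings(readings: list, page: int) -> list:
    """Return the slice of readings shown for a given zero-based page."""
    start = page * PAGE_SIZE
    return readings[start:start + PAGE_SIZE]

def last_page(readings: list) -> int:
    """Zero-based index of the page the >>| button jumps to."""
    return max(0, (len(readings) - 1) // PAGE_SIZE)
```

With twelve readings in memory, |<< shows readings 1-5, and >>| jumps to the final page holding readings 11 and 12.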

The Modifier
Area

This area contains buttons which operate on the readings:


The Delete All button will delete all readings currently in memory.
The Export All button will export all readings to an attached thumb drive without deleting them. See Downloading
to a USB Memory Stick or Thumb Drive on page 73.

The Display
Area

This area displays the elemental concentrations of the currently selected reading. It can be displayed two ways -
Detected, which shows only elements detected to be above the Level of Detection, and All, which shows the
concentrations of all elements. The Display area shows the element, the concentration of that element in ppm or
percent weight, and the Sigma Value of the analysis. Each of these columns can be sorted to be either ascending or
descending.

40

The Method
Setup Menu

The Method Setup Menu is a screen divided into four areas.

Figure 3. The Method Setup Menu

The Top Menu
Area

This area is always the same, except for the right-hand most button. There are four buttons in a strip running
horizontally across the top of the screen - Test, Data, Method Setup, and System - and a fifth, separate button to the
right, the Start or Stop Button. When the analyzer is analyzing, the Stop Button appears here. Pressing it stops the
analysis. At all other times, the Start Button is here. Pressing this button initiates an analysis.

The Mode
Menu Area

This area is a vertical strip running from the bottom of the Top Menu to the bottom of the screen, along the left
hand side of the screen. From the scrollable Mode Menu located here, you can choose between all of the many
modes available for testing.

The Mode-Specific Action
Menu Area

This area is a vertical strip running from the bottom of the Top Menu to the bottom of the screen, down the left-center
of the screen. Here is a menu of actions that you can perform, which is mode-specific - that is, it changes for
different Modes. There are five possible Actions - Custom Element Display, Element Range, Measurement
Parameters, Consumer Goods, and Adjust Calibrations - but not all Actions are possible for all Modes.

The
Action-Specific Screen
Area

This area, taking up the balance of the screen, contains tools, listings, fields, and other ways for you to change the
analysis, to make it do what you want it to do.

41

The Method
Setup
Action-Specific Functions

Custom Element Display

Sorting the Custom Element Display on page 67

42

Element Range

Adjusting the Element Range on page 72

43

Measurement Parameters

Setting Up Beep Times on page 66


Setting Display Units on page 71
Adjusting the Sigma Values on page 71
Max Measure Time on page 67

44

Consumer Goods

Minimum Test Time on page 68


Report Settings on page 67

45

Adjust Calibrations

Adjusting the Calibration on page 86


Editing the Calibration Factors - Slope on page 86
Editing the Calibration Factors - Intercept on page 86
Oxides versus elemental concentrations on page 86
Calculating Calibration Factors on page 88

46

Pseudo Elements

Pseudo-Elements on page 94

47

The System
Menu

The System Menu is a screen divided into three areas.

Figure 4. The System Menu

The Top Menu
Area

This area is always the same, except for the right-hand most button. There are four buttons in a strip running
horizontally across the top of the screen - Test, Data, Method Setup, and System - and a fifth, separate button to the
right, the Start or Stop Button. When the analyzer is analyzing, the Stop Button appears here. Pressing it stops the
analysis. At all other times, the Start Button is here. Pressing this button initiates an analysis.

The Menu
Button Area

This area contains an array of menu buttons enabling you to select which aspect of the system you wish to work
with. The buttons allow you to select:
Date and Time - to set the date and/or time on your analyzer.
Bluetooth - to connect to Bluetooth wireless networked devices.
Start/Stop Settings - to enable and control Batch Analysis.
Calibrate Touch - to calibrate your touch screen precisely.
Specs - to list the instrument's current operating specs, and access diagnostic information.
System Check - to perform a system check for normalization.
Log Off - to log off your analyzer.

The Context-Sensitive Display Area

This area contains a display screen which is configured differently, depending on the menu button you selected. The
above screen shot shows the Date and Time screen. The screens contain various fields, buttons, radio buttons, and
toggles which allow you to interact with the system, along with displaying appropriate system data.

48

Date and Time

See Setting the Date and Time on page 76.

Bluetooth

49

Start/Stop Settings

See Start/Stop Settings on page 85.

Calibrate Touch

50

See Calibrating the Touch Screen on page 82.

Specs

51

System Check

See Performing a System Check on page 22.

Log Off

52

Common Operations
Metal Sample Prep

Until recently, sample preparation was not a major concern for XRF metals analysis, as the LOD of the analyzer was
seldom low enough for any but the heaviest contamination to be intrusive; but recent developments such as
He-purged analysis have brought analysis to a level where even light surface contamination can skew an analysis.
You should always prepare your samples before analysis, especially when using He-purged analysis, as these analyzers
will see even trace amounts of contaminants. Oils from fingerprints and other body contact, lint, oxidation
materials, and abrasive materials used in cleaning can all skew readings if not removed. Sample preparation is simple,
not time consuming, and usually well worth the effort.
The following is a list of problems that need correction before testing:
Oxidation or Rust may produce an increase or decrease in one or more element test values unless we remove
the rust or oxidation and expose the raw metal.
Paint may contain several elements which need to be tested at lower levels within metal alloys (Ti & Zn in
white paint, Fe in red paint, Cr in green paint).
Oil, grease or lubricants may contain high levels of the following elements: lithium, aluminum, barium,
strontium, molybdenum or calcium.
Plated surfaces may have high levels of the following elements: zinc, chromium, nickel, or copper.
CAUTION Anything on the metal surface will become part of your test results!

Sample Analysis Preparation

You need to clear the surface of your samples of any paint, plating, or any oxidation such as rust or verdigris before
analysis. In order to accomplish this, you need the following:
Isopropyl alcohol - not rubbing alcohol, which contains oils.
Lint-free paper.
Diamond paper - P/N 179-1202 - cut into 1 inch/2.5 cm squares. Never re-use this paper, as it may transfer
contaminants to the surface of the sample from previous cleanings. Depending on the state of the sample,
several squares may be needed per sample.
A Sample Grinder for removing deeper surface contamination. Choice of grinding wheel media also may be
important, depending on what you are testing for. Never re-use grinding media, as contaminants can be
transferred from sample to sample on the media itself.
For light contamination on hard metal reference standards, remove the oxidation by scrubbing the dry sample lightly
with the diamond paper square, using the fingers to maintain pressure. If the diamond paper begins to load up with
material, discard it and use a fresh square. When the oxidation is removed, wipe the sample with lint-free paper
soaked with isopropyl alcohol to remove any oils or dust. Let the sample dry before attempting analysis.
For soft metal reference standards, wipe the sample with lint-free paper soaked with isopropyl alcohol, then remove
the oxidation by scrubbing the wet sample lightly with the diamond paper square, using the fingers to maintain
pressure. If the diamond paper begins to load up with material, discard it and use a fresh square. When the oxidation
is removed, wipe the sample again with lint-free paper soaked with isopropyl alcohol to remove any oils or dust. Let
the sample dry before attempting analysis.
Oils, lint and dust can be removed by wiping the sample with lint-free paper soaked with isopropyl alcohol. Let the
sample dry before attempting analysis.

Surface Oxidation

With the exception of a limited number of metal types, most metal alloys form an oxide covering on the surface
when exposed to oxygen or air. This oxide covering is visible in carbon and low alloy steel as a red colored substance
called rust. Other metal alloys form oxidation which is not always visible, but that does not mean that it is not
present. If the test results for low concentration elements are higher or lower than expected, remove the oxide coating
by grinding and retest. Follow proper safety procedures when changing discs or grinding materials.

53

During a recent case study the effects of sample preparation became apparent. A customer asked for low detection
limits of nickel, chromium and copper in carbon steel pipe. The reported chemistry of the purchased material is
listed on the first line in the chart below. The test results of a handheld Niton XL3t 900S GOLDD instrument
appear on the second line of the chart. The results from a test on the unground surface appear on the bottom line of
the chart. Note the values for nickel and copper in this carbon steel alloy in the chart below. The oxidation on the
surface of this pipe was not visibly egregious. We need to always be wary of the presence of even low levels of
oxidation and their possible effects on analytical accuracy.
Table 1. Comparative test results with and without grinding

Sample                               % Mn    % Ni    % Cr    % Mo    % Cu
Reported Chemistry                   0.650   0.090   0.070   0.030   0.040
Test Results with Ground Surface     0.67    0.089   0.070   0.033   0.039
Test Results with Unground Surface   0.61    0.178   0.081   0.033   0.514

Painted Surfaces

Paint is a mixture of several items that are combined into a liquid which is applied to the surface of materials such as
metal. Once applied this liquid dries with time and adheres to the surface of metal. Paint is used to protect or
decorate the metal item. Paint can also be used to identify or mark the metal during the manufacturing process.
Components of paint are divided into classifications of pigments, binders, solvents, additives and fillers. The
inorganic elements in pigments will contribute to increases in displayed values for those elements if paint on the
metal surface is not removed prior to testing. Be especially careful of the presence of heavy elements, which can also
act to shield x-rays from lighter elements in the metal sample.
The following is a list of some of the most common components of paint:
White Paint
Antimony (Sb)
Lead (Pb)
Titanium (Ti)
Zinc (Zn)
Cobalt (Co)
Red Paint
Iron (Fe)
Lead (Pb)
Green Paint
Chromium (Cr)
An experiment was conducted to determine the effect and severity of surface problems on XRF results. Results from
analyses of a 1541 alloy steel sample are shown below, before and after surface grinding. The sample had painted
markings, of light to medium thickness, on the surface, as well as light rust. Note the change in titanium, zinc and
cobalt levels after surface grinding.
Table 2. Prepped and unprepped painted metal analysis

Sample             Mn     Ni     Cr     Mo      Ti      Zn       Co
Ground Surface     1.49   0.04   0.03   0.004   0.011   0.0001   0.03
Unground Surface   1.34   0.01   0.04   0.011   2.507   1.751    0.21

54

Oil, Grease & Cutting Oils

Oil and grease contain a number of elements combined into a viscous substance and applied to moving parts in
order to reduce friction. Grease coatings can remain on component surfaces after the component has been removed from service.
Grease can also be applied to a metal's surface by accidental contact with other materials coated in heavy grease.
Metals can also be coated in oil as a result of cutting and machining processes in manufacturing.
Grease and oil may contain the following elements:
Aluminum (Al)
Zinc (Zn)
Molybdenum (Mo)
Sodium (Na)
Calcium (Ca)
An experiment was performed to show how grease on metal surfaces affects XRF results. A carbon steel sample was
cleaned and ground as a control surface for the experiment. XRF tests were performed on the control surface, and
again after light and heavier layers of automotive wheel bearing grease were applied to the surface of the steel sample.
Results are shown below. Note the elevated levels of molybdenum, cobalt and zinc from the grease.
Table 3. Clean and greased sample metal analysis

Sample          Mn     Ni      Cr      Mo      Cu      Co      Zn
Clean Surface   1.18   0.001   0.041   0.004   0.001   0.001   0.019
Light Grease    1.07   0.001   0.001   0.067   0.033   0.322   0.416
Heavy Grease    0.96   0.001   0.001   0.500   0.062   1.760   3.430
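One way to read comparisons like the table above is as a ratio of the contaminated-surface reading to the clean-surface reading. A small sketch of that arithmetic (the helper name is illustrative; the values come from the Zn column of Table 3):

```python
def contamination_ratio(measured, reference):
    """How many times larger the contaminated-surface reading is
    than the clean-surface reference value."""
    return measured / reference

# Zn with heavy grease (3.430 %) versus the clean surface (0.019 %):
# the reading is roughly 180 times the clean value.
ratio = contamination_ratio(3.430, 0.019)
```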

If a sample's surface contains lubricants or cutting oil, use a solvent and a clean towel or rag to remove them before
analysis. You may then need to grind the surface to ensure good results. Clean first, grind second, test last.
Remember to follow safe techniques for handling and disposing of solvents and cleaning rags.

Anodized, Plated and Galvanized Surfaces

Anodizing is the process of polarizing the metal surface into a passive state which protects it against corrosion. This
process is most often applied to aluminum alloys.
Galvanized steel is one of the most common of the coated surfaces. In this process, steel is passed through a molten
bath of a zinc alloy. The zinc reacts with the steel to form a bonding layer on the steel surface. The zinc layer does
not separate from the steel and forms a protective layer that protects the steel from oxidation.
Galvanized layers are relatively thick compared to other plating elements and methods. When grinding to remove
the zinc coating, you will find increased zinc values even when you can see the steel surface. Grind a little further and
zinc values will disappear. Zinc clings to the surface of the sanding disc, so you will need to frequently change discs.
Electroplating is another common practice of applying a coating which not only protects the surface from oxidation,
but also improves the base material's wear resistance, lubricity and improves the overall aesthetics of the product.
The electroplated coating is generally thinner and more evenly applied than galvanizing. Electroplating has a wide
range of elements and in some situations there may be two or more different coatings on the same part.
The following is a partial list of elements that are used to plate the surface of base metals:
Ni, Cr, Cadmium (Cd), Tin (Sn), Zn, Al

Cordless Right Angle Drill

This style of drill is recommended for most surface preparation in the field because it gives the operator the greatest
amount of control, and thus safety, when grinding samples. When moving a sanding disc on a conventional drill
over a sample, forces tend to produce movement the operator may find difficult to control. Control and stability are
important in grinding from both effectiveness and safety perspectives.
A cordless right angle drill similar to the one pictured below is recommended for light to medium surface removal.
For materials with heavy oxidation such as carbon and low alloy steel, an angle grinder, explained in the next section,
is recommended. A kit with the drill, batteries and charging units, can be purchased from ThermoFisher, or
companies such as DeWalt, Hitachi, Makita, Milwaukee or Ryobi.

55

Figure 1. Example of Right Angle Drill

A disc holder is needed with the drill to hold the sanding disc. (In the US, we recommend a 3.0 inch disc holder. It
has a 0.25 inch shank to insert into the chuck of the drill.) If sanding discs are ordered from a local supplier,
attention should be paid to the method of attaching the sanding disc to the disc holder. There are three types of
connections: metal snap-on, plastic twist and plastic snap-on.

56

Figure 2. Sanding Disc

Before attaching the grinder and sanding disc as pictured below, first remove the battery to disable the grinder. Then
insert the shaft of the disc holder into the drill and securely tighten the chuck. Next, attach the appropriate sanding
disc. The method of attachment will vary depending upon the type of fastener on the sanding disc (snap-on or twist
connectors). Reinstall the battery and prepare for use.

57

Figure 3. Attaching the Sanding Disc 1

Figure 4. Attaching the Sanding Disc 2

58

Cordless Angle Grinder

A cordless angle grinder similar to the one pictured below will successfully remove medium to heavy oxidation or
paint. This grinder (which uses a 4.5 inch sanding disc with a rubber backup pad) can be purchased from
ThermoFisher or industrial tool manufacturers like DeWalt, Makita or Milwaukee.

Figure 5. Cordless Angle Grinder Kit

A grinder kit typically contains the grinder, a battery, and charging unit. If the kit contains a grinding stone wheel,
remove and dispose of it. Grinding stones are not to be used for XRF sample preparation. A rubber backup pad and
a retaining nut are needed for use with sanding discs. (See picture below.)

59

Figure 6. Rubber Backing Pad and Nut

In the US, sanding discs are 4.5 inch diameter and can be purchased in various grit sizes of 36 to 120. The surface
abrasive can be one of the following materials: aluminum oxide, silicon carbide or zirconia alumina. The selection of
sanding discs is covered in the next section.

60

Figure 7. Assembling the Grinder

Remove the battery before assembling the grinder, backup pad and sanding disc. Start by installing the backup pad
onto the drive shaft of the grinder; with some backup pads, you will need to screw the pad onto the threaded shaft.
Next place the sanding disc over the drive shaft onto the backup pad. Hold the locking button on the reverse side of
the grinder while tightening the retaining nut into the hole of the sanding disc.
Once the backup pad, sanding disc and locking nut are secured, reinstall the battery. The grinder is now ready for
use.

Sanding Discs

Testing has clearly established that samples can be easily contaminated by the abrasive material contained in
and on a sanding disc. An example would be the increase in aluminum content of carbon steel after grinding the
sample with a new aluminum oxide sanding disc. Aluminum from the aluminum oxide disc embeds itself in the
surface of the steel sample, and XRF analysis would show an unusually high aluminum concentration.
Aluminum oxide is the most common abrasive surface used today. For most applications it will be safe to use
aluminum oxide discs. But if test results for aluminum in any metal alloy are significantly higher than expected,
switch to another type of abrasive disc. Also, when grinding aluminum, aluminum oxide discs tend to trap
aluminum from the metal surface into the disc surface. Once this happens, the disc loses its efficiency and cross
contaminates the next sample.
Silicon Carbide
Silicon carbide discs are a good alternative to aluminum oxide, and the cost of a disc is only slightly higher than
aluminum oxide. This abrasive type is best for grinding aluminum, copper and other soft metals.

61

Zirconia Alumina
Zirconia alumina discs are more expensive than aluminum oxide or silicon carbide but they last much longer and so
may be the best investment. Few metal alloys have low additive levels of zirconium, so it is one of the safest abrasive
types for general use.
One exception is the Aluminum alloy Al 7050 which is a near twin to alloy Al 7075 except for the ~0.1% Zr in
7050. Therefore, if 7075 is ground with Zr grinding paper it may be erroneously identified as Al 7050 due to the Zr
transferred from the grinding disc to the surface of the Al 7075.
Diamond Sanding Paper
Do not use diamond sanding paper for surface preparation in the field. Even after extensive and aggressive sanding
with diamond paper, a metal surface will not be prepared properly. Diamond sanding paper is only recommended
for removal of very light oxide coatings on flat surfaces such as analytical reference standards.
Nickel, cobalt, and steel alloys should be ground using 36, 40, 50 or 60 grit discs; a grit size of 100 or
larger is not recommended for these alloys.
Aluminum, copper alloys, and other softer metals should be ground using 60 or 80 grit discs.
Grinding stones are not recommended because they will absorb surface material and transfer it onto the
next surface ground.

Safety Rules

When using a grinder, follow these safety rules:


When changing sanding discs, always remove the grinder battery to prevent accidental activation of the
grinder.
Allow the grinder to stop spinning before placing it on a flat surface.
Replace any damaged or torn sanding discs immediately.
Always wear impact eye protection to prevent eye damage from flying debris.
Place small samples or standards in a clamping device when grinding to prevent accidental contact between
the spinning disc and your hand.
Use proper techniques and safety precautions when grinding beryllium, beryllium copper, lead, or titanium
alloys.
Always follow the safety instructions outlined in the grinder manufacturer's instruction manual.

62

Soil Sample Prep

Examine the site for differences in surface characteristics before sampling. Valid results depend on a sufficient and
appropriate selection of sites to sample. Incorrect sample collection may give rise to misleading or meaningless
results, regardless of the analysis method. Delineate sections with different characteristics and treat them as different
areas. It may be desirable to subdivide larger areas even if they have the same characteristics to ensure a thorough
examination. Make certain to label each bag thoroughly. Common information included on each bag includes the
person and/or the company who collected the sample, the location and area where the sample was taken, and the
date the sample was collected.
Prepared sample analysis is the most accurate method for determining the concentration of elements in a bulk
medium using the instrument. Sample preparation will minimize the effects of moisture, large particle size,
variations in particle size and sample non-homogeneity.
Note More sample preparation (drying, milling and sifting) will yield greater accuracy. The drier, finer, and more
homogeneous the particles, the better the measurements.

Preparing Bulk Soil Samples

We recommend establishing a specific sample protocol. Following this protocol for preparing and testing samples is
vital for achieving a level of accuracy comparable with laboratory results. The equipment you need to prepare
samples is included in your kit. Among these are a mortar and pestle, several different sized metal sieves, and cups to
hold the samples.
CAUTION All test equipment must be kept clean to prevent contamination of samples.

Cleaning Your Equipment

The mortar, pestle, and grinding mill may be cleaned with dry paper towels. You can also clean the mortar, pestle,
and the mill's container with water, but be sure each is absolutely dry before using them on another sample. The
mortar and pestle may be cleaned by grinding clean, dry sand in the mortar. Use the short bristle brushes (included
in your Soil Testing Kit) to clean the sieves. If you have an electric soil grinder in your kit, when the soil grinder
blades wear out, unbolt the worn blades and replace them. Call the Thermo Sales Department at 1-800-875-1578
for replacement blades.
Note Using the soil grinder may artificially increase the amount of Fe in soil samples.

Sample Preparation

Prior to analysis, the material should be dry and well homogenized. Ideally, the entire sample should be dried to
constant weight, sifted to remove gravel and debris, and ground or milled to a fine powder. Dry the sample if it is
moist and cohesive. The sample can be dried in any of several ways. Choose one of the following:
Oven dry the sample for approximately 2 hours at 150 °C, until the sample reaches a constant weight. Note:
Oven drying is inappropriate when volatile compounds may be present in the sample. For example, lead
present as tetraethyl lead would be driven off by the heat of drying. Some forms of mercury and arsenic are
volatile. Air drying will preserve more of these volatile substances.
Air dry the sample overnight at room temperature in a shallow pan.
Stir gently and warm the sample in a pan over a hot plate or burner.
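"Dried to constant weight" in the oven-drying option above means that successive weighings no longer change meaningfully. A minimal sketch of that check (the function name and the 0.01 g tolerance are illustrative assumptions, not values from this manual):

```python
def at_constant_weight(weights_g, tolerance_g=0.01):
    """Return True once the last two successive weighings differ
    by less than the chosen tolerance in grams. Illustrative only."""
    if len(weights_g) < 2:
        return False
    return abs(weights_g[-1] - weights_g[-2]) < tolerance_g

# A sample losing moisture over repeated weighings:
readings = [12.80, 12.31, 12.10, 12.098]
```

Each drying pass adds a reading; drying is complete once the check returns True.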
Coning and Quartering
You may need to divide your sample at various times during preparation. Coning and quartering is a method for
dividing the sample into homogenous quarters.
Pour the dry material slowly and carefully onto a flat sheet or pan, forming a symmetrical cone. Divide the
cone into two equal piles using a flat, thin-bladed tool, such as a knife or ruler. Divide these in half again.
Now you have four samples, each one-quarter the size of the original and each more homogenous than the
original.
Grind the sample to break up dirt clods and/or paint chips.
WARNING Grinding and sifting dried samples produces dust. Even clean soil contains silica, which may be
hazardous when airborne. Prepare all samples in a ventilated area; wear a mask, gloves, and an apron; and spread a
drop cloth.

63

Sift using the #10 (2 mm) mesh and separate out the larger pieces (stones, organic matter, metallic objects, etc.).
Examine the larger particles by eye but do not include them in the sample. Grind the sample again so its particles will
be finer and more homogenous. Use a mortar and pestle, or an electrically powered grinding mill. Sift at least 10 grams
of the sample through #60 (250 µm) and #120 (125 µm) mesh. Re-grind the un-passed material until the entire
fraction is able to pass. Mix the resulting sample.

Placing the Sample in an XRF Sample Cup

Note The sample container should be a sample cup of a type that can be filled from the rear; that is, the side
opposite the window (e.g. Thermo NITON Part Number 187-466). Thermo recommends using a 1/4 mil
Polypropylene film (e.g. Thermo NITON Part Number 187-461). A supply of cups and films is included.
The container used to hold the sample will affect the accuracy of the measurement. Use a container with as
thin-walled a window as is convenient and use the same kind of container and window for each sample. Consistency
and careful attention to detail are keys to accurate measurement.
PLACE FILM

Place a circle of polypropylene film on top of an XRF sample cup. This film goes on the end of the cup with the
indented ring. Thermo recommends preparing the cup ahead of time, if possible.
SECURE FILM

Secure the film with the collar. The flange inside the collar faces down and snaps into the indented ring of the cup.
Inspect the installed film window for continuity and smooth, taut appearance.
FILL CUP

Set the cup on a flat surface film-window-side down. Fill it with at least five grams of the prepared sample, making
sure that there are no voids or uneven layers.
TAMP SAMPLE

Lightly tamp the sample into the cup. The end of the pestle makes a convenient tamper.

64

PLACE FILTER

Place a filter-paper disk on the sample after tamping it.


STUFF CUP

Fill the rest of the cup with polyester fiber stuffing to prevent sample movement. Use aquarium filter or pillow filling
as stuffing. A small supply of stuffing comes with your bulk sample kit.
CAP CUP

Place a cap on your cup.


LABEL CUP

Place a label on the cup. Using a pen with indelible ink, write identifying information on the cup. Keep a record of
the sample designation, the site and location, the date of the sample, and any other relevant comments.
The cup is now ready for testing.

Preparing Liquids and Sludge

Liquids
Fill an XRF sample cup with the liquid to be tested (do not pad the sample with cotton). The cup must be full so it
is best if some liquid is allowed to overflow when the cap is put on.
Sludge
Sludge can be placed directly into an XRF cup for screening. This is considered in-situ testing because no attempt
has been made to prepare the sample. For more accuracy, the sludge can be dried, sieved, and ground. Prepare in an
XRF sample cup and test the same way you would with a soil sample. For risk analysis, it is advisable to use a
60-mesh sieve to isolate and test only fine particles.

Preparing Mining Samples

Examine the site for differences in surface characteristics before sampling. Valid results depend on a sufficient and
appropriate selection of sites to sample. Incorrect sample collection may give rise to misleading or meaningless
results, regardless of the analysis method. Delineate sections with different characteristics and treat them as different
areas. It may be desirable to subdivide larger areas even if they have the same characteristics to ensure a thorough

65

examination. Make certain to label each bag thoroughly. Common information included on each bag includes the
person and/or the company who collected the sample, the location and area where the sample was taken, and the
date the sample was collected.
Prepared sample analysis is the most accurate method for determining the concentration of elements in a bulk
medium using the instrument. Sample preparation will minimize the effects of moisture, large particle size,
variations in particle size and sample non-homogeneity.
Note More sample preparation (drying, milling and sifting) will yield greater accuracy. The drier, finer, and more
homogeneous the particles, the better the measurements.

Specimen Preparation - Fused Glass Disk

The samples need to be pre-dried for 2-6 hours at 105 °C, depending on the moisture content.
1. Grind the dried samples to ~200 mesh (74 µm).
2. Calcine (ash) the sample:
a. About 4-6 g of dry pulverized sample is calcined in an alumina or platinum crucible in a muffle furnace at
1000 °C for 1 hour.
b. The sample is cooled in a desiccator and the loss on ignition (LOI) is calculated from the weight difference before
and after calcination.
3. Weigh 1.0 g of calcined sample into a fusion crucible; add 5.0 g of lithium tetraborate, 0.3 g of lithium fluoride,
and 10-20 mg of lithium bromide as a non-stick agent.
4. Fuse in a fluxer for at least 4 min in the flame.
5. The resulting disk is released from the mold, cooled, then presented to the spectrometer.
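The loss on ignition in step 2b is simply the fraction of sample weight driven off during calcination, expressed as a percent. A worked sketch of the arithmetic (the function name and example masses are illustrative):

```python
def loss_on_ignition_percent(mass_before_g, mass_after_g):
    """Percent of sample mass driven off during calcination."""
    return (mass_before_g - mass_after_g) / mass_before_g * 100.0

# Example: 5.00 g of dry sample that weighs 4.60 g after calcination
# has lost 0.40 g, i.e. an LOI of 8.0 %.
loi = loss_on_ignition_percent(5.00, 4.60)
```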

Specimen Preparation - Pressed Powder Briquette

1. Thoroughly remix the sample in its jar by rotating it in a figure-eight motion with two hands.
2. Weigh 7.0 g of sample into a weighing boat by taking several separate gram-size portions, then fine-grind the
sample using a swing mill.
3. Add 2 small drops of propylene glycol on top of the powder sample in the mill as a grinding aid; grind for 4 min
at 1000 rpm to obtain a 10 µm particle size.
4. Add 0.5 g of binder to the sample and continue grinding for 30 sec more.
5. Brush the finely ground sample into a 31 mm aluminum sample cup and press at 50,000 psi for 1 min.
CAUTION All test equipment must be kept clean to prevent contamination of samples.

Setting Up Beep Times

Selecting the Measurement Parameters icon allows you to set up Beep Times, enabling you to change the beep
settings for different modes independently. Select the Mode you want to change, then select the Measurement
Parameters icon to set up your preferred beep times.
First Beep
This option allows you to change the seconds of delay before the First Beep. Select the screen button labeled with the
number of seconds of delay for the First Beep. The Beep One Time editor will open. Clear the current number of
seconds with the "C" button, then select the E button to enter the information.
Second Beep
This option allows you to change the seconds of delay before the Second Beep. Select the screen button labeled with
the number of seconds of delay for the Second Beep. The Beep Two Time editor will open. Clear the current
number of seconds with the "C" button, then select the E button to enter the information.

66

Third Beep
This option allows you to change the seconds of delay before the Third Beep. Select the screen button labeled with
the number of seconds of delay for the Third Beep. The Beep Three Time editor will open. Clear the current
number of seconds with the "C" button, then select the E button to enter the information.
Beep on Grade Match
Selecting this option will enable a special beep when the reading chemistry matches an alloy grade, and put a check
mark in the box. Selecting the box again will remove the check mark and turn the beep off.

Sorting the Custom Element Display

Select the Custom Element Display icon to configure the sorting criteria used for the analysis display. Select the
mode you wish to change, then select the Custom Element Display icon to open the Custom Element Display Screen.
On the left of the display are elements, each with its currently selected display option beside it to the right. The
element list is ranked by importance, with the most important element on top, and each one lower down of less
importance than the one above it.
By selecting an element and using the arrow buttons to the right of the list, you can change its ranking. Use the Up
Button to move an element one rank closer to the top with each click. Use the Dn Arrow Button to move an element
one rank closer to the bottom with each click.
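Each click of the Up or Dn button described above amounts to a single swap in the ranked list. A sketch of the equivalent operation (the function name and example element list are illustrative, not factory defaults):

```python
def move_up(ranking, element):
    """Swap the element with the one ranked directly above it;
    the top-ranked element stays where it is."""
    i = ranking.index(element)
    if i > 0:
        ranking[i - 1], ranking[i] = ranking[i], ranking[i - 1]
    return ranking

elements = ["Fe", "Ni", "Cr", "Cu"]
move_up(elements, "Cr")  # "Cr" now ranks directly above "Ni"
```

The Dn button is the mirror image, swapping the element with the one ranked directly below it.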

Display
Options

The Display Options Drop Down Menu allows you to change the display status of any element to one of three
states:
Normal - The standard state. Element displays only when the elemental value is greater than the limit of
detection.
Always - Always display the results for this element. Use this state for elements critical to all of your analyses.
Never - Never display the results for this element. Use this state for elements which are unimportant to your
work. This makes your instrument display less complex.
Select the element you want to change, then select the menu option corresponding to your choice of display status.
The currently selected element is displayed in white on black.
Select the Save Button to save your current status as the new default. Select the Reset button to reset the settings to
the previously saved state. Select the Close button to exit the screen without saving.

Report Settings

Under Electronics Metals, Plastics, and Test All Modes, a field called Report Settings is available. Selecting the
triangle next to the Report Settings Field will open a pop-up menu allowing you to choose between the three Report
Settings Modes. Select the mode you wish to edit.
Changing the settings for one analysis mode will not affect the settings for other modes, and the configurations can
be saved independently.
RoHS Option
When the RoHS Option is selected, clicking on the Pass and Fail values works as it does in any other Mode.
Detection Option
When the Detection Option is selected, selecting the Pass/Fail field for that element acts as an On/Off Toggle,
which will switch Pass/Fail mode between On and Off for the selected element. Selecting it again will reverse the
toggle.
Consumer Products Option
When the Consumer Products Option is selected, clicking on the Pass and Fail values works as it does in any other
Mode. In addition, the total of Cl+Br is also calculated and used for Pass/Fail Testing.
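Under the Consumer Products Option, the Cl+Br total is then judged against its own Pass/Fail limit in the same way as a single element. A sketch of that check (the 0.1% limit below is a hypothetical placeholder, not a value from this manual or from any regulation):

```python
def halogen_total_passes(cl_percent, br_percent, limit_percent=0.1):
    """Pass/Fail on the combined chlorine + bromine concentration.
    The default limit is a hypothetical example value."""
    return (cl_percent + br_percent) <= limit_percent
```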

Max Measure Time

Under the Method Setup -> Measurement Parameters option is a field called Max Measure Time. Here you can set
up the maximum time your analyzer will continue to analyze the sample. Select the Max Measure Time field, and a
Virtual Numeric Keypad will pop up, allowing you to input a new Maximum Measurement Time in seconds. The
default Max Measure Time is set to 300 seconds.

67

Minimum Test Time

Under the Method Setup -> Consumer Goods option is a field called Minimum Test Time. Here you can set up the
minimum time your analyzer will continue to analyze the sample when using the Detection Option only. Select the
Minimum Test Time field, and a Virtual Numeric Keypad will pop up, allowing you to input a new Minimum Test
Time in seconds. The default Minimum Test Time is set to 60 seconds.

Virtual Keyboard

Whenever you see the Keyboard Icon, you can select it to bring up a Virtual Keyboard on your touch screen.
Generally, selecting the keys on the Virtual Keyboard will type the corresponding character into the field. The
exceptions are the meta-keys Del, Clear, Left, Right, Shift, Backspace, Cancel, and Enter.

Figure 8. Virtual Keyboard

Figure 9. Shifted Virtual Keyboard

Del
Clear
Left

Del is the Delete Key. Selecting this key will delete the character to the left of the cursor.
Clear is the Clear Key. Selecting this key will clear all characters from the field.
Left is the Left Cursor Key. Selecting this key will move the cursor one space to the left.

Right

Right is the Right Cursor Key. Selecting this key will move the cursor one space to the right.

Shift

Shift is the Shift Key. Selecting this key will bring up the alternate, shifted keyboard. See Figure 9. Selecting the
Shift Key on the shifted keyboard will bring up the normal keyboard. See Figure 8.

Backspace
Cancel
Enter

Backspace is the Backspace Key. Selecting this key will delete the character to the right of the cursor.
Cancel is the Cancel Key. Selecting this key will return you to the normal screen without inputting your changes
into the field.
Enter is the Enter Key. Selecting this key will return you to the normal screen, replacing the former contents of the
field with the changes you have made.

Setting Display Units

Select the Display Units radio button on the Method Setup page to choose between ppm (parts per million) and
percentage (parts per hundred) displays when taking readings, and to change the Sigma value you want for the
reading.
In the Display Units area, you can select between Percent composition and Parts per Million as the units displayed in
a measurement, and you can change this setting independently for any mode. You can also change the Sigma for
each of these modes independently. When you have changed the display units to the appropriate values, select the
Close button to save these settings for use.
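The two unit systems are related by a fixed factor: 1% = 10,000 ppm. A minimal sketch of the conversion (illustrative only, not part of the analyzer software):

```python
# 1 percent by weight equals 10,000 parts per million.
def ppm_to_percent(ppm):
    return ppm / 10_000.0

def percent_to_ppm(percent):
    return percent * 10_000.0

# A 2500 ppm result displayed in percent:
print(ppm_to_percent(2500))    # 0.25
```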

Changing Precision (Sigma Value)

Sigma is the symbol used for Standard Deviation, a measure of how much a set of numbers deviates from the mean.
For example, each of the three data sets {0, 0, 14, 14}, {0, 6, 8, 14} and {6, 6, 8, 8} has a mean of 7. Their
standard deviations are 7, 5, and 1, respectively. The third set has a much smaller standard deviation than the other
two because its values are all close to 7. In a loose sense, the standard deviation tells us how far from the mean the
data points tend to be.
The number of standard deviations between the process mean and the nearest specification limit is given in sigmas.
As process standard deviation goes up, or the mean of the process moves away from the center of the tolerance, the
sigma number goes down, because fewer standard deviations will then fit between the mean and the nearest
specification limit.
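The example above can be verified with Python's statistics module (a worked check of the numbers in the text, not analyzer code):

```python
import statistics

# The three example data sets from the text, each with a mean of 7.
data_sets = [[0, 0, 14, 14], [0, 6, 8, 14], [6, 6, 8, 8]]

for data in data_sets:
    # pstdev is the population standard deviation used in the example.
    print(statistics.mean(data), statistics.pstdev(data))
# Means: 7, 7, 7.  Standard deviations: 7.0, 5.0, 1.0.
```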

Confidence Intervals

Confidence intervals assume that the data are from an approximately normally distributed population - generally,
sums of many independent, identically distributed random variables tend towards the normal distribution as a limit.
Under this assumption, about 68% of the values fall within 1 standard deviation of the mean, about 95% fall
within two standard deviations, about 99.7% fall within 3 standard deviations, and about
99.99% fall within 4 standard deviations.
The greater the sigma value of the test, the more confident you can be that the sample is as it appears, but the more
difficult and time consuming the testing must be to verify this. That's why it's important to use the most appropriate
sigma value for the test. By adjusting the sigma value for each type of test, you can optimize the process for your
needs.
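For a normally distributed population, the fraction of values within k standard deviations of the mean is erf(k/√2); the snippet below reproduces the coverage figures quoted above (a worked illustration, not analyzer code):

```python
import math

def coverage(k):
    # Fraction of a normal population within k standard deviations of the mean.
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3, 4):
    print(f"{k} sigma: {coverage(k):.4%}")
# Approximately 68.27%, 95.45%, 99.73%, and 99.99% respectively.
```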
Adjusting the Sigma Values
The sigma values are listed in the column headed with the Greek letter "sigma". The default value is 2 sigma. You
can change this value by selecting the down arrow next to the value, which opens up a drop-down menu from which
you can select the desired sigma value by clicking on it.

Figure 10. Selecting the Sigma Value

When you have changed the sigma values to the appropriate number, select the Close button to save these settings
for use.

Adjusting the Element Range

Figure 11. Adjusting the Element Range

Multi-Range tests are used either to preferentially excite specific elements for increased sensitivity, or to cover a wider
element range than one Range alone can provide. Most modes, when enabled, will use several Ranges in sequence to
produce a combined analysis result. In typical Metals analysis applications, Main Range is used for the analysis of
most elements, Low Range is utilized for the subsequent high sensitivity analysis of V, Ti, and Cr, High Range is
used to optimize the sensitivity for the elements from Palladium (Pd) through Barium (Ba), and Light Range is
typically used in light element analysis. Multi-Range switching can be set to activate based on time alone, or, when
time switching is disabled, based on settings in the General Metals grade library. In most modes, Low and Light
Range add the capability to analyze light elements which cannot be efficiently excited by Main Range.
Select the mode you wish to configure from the Mode Menu. You can set different configurations for different
modes.
The Element Range Screen enables you to directly enable or disable any Range, or control the time that a Range
alters the irradiation of the sample before auto-switching to another Range.
Select the checkbox next to the Range you want to use to determine exactly which of the Ranges contained in your
Analyzer is used for sample testing. Selecting an empty checkbox will enable that range and place a check into the
box as an indicator. Selecting a checked box will disable the Range and clear the box.
In typical metals analysis applications, Main Range is used for the analysis of most elements. You cannot deselect the
Main Range in metals analysis.
Low Range is utilized for the subsequent high sensitivity analysis of V, Ti, and Cr.
Select the Element List Button - labeled with a question mark - to display the Element List for that Range. This list
shows the elements that the Range is best designed to detect.
Select the Range Time field for the intended range to change the switch time for that range. The Range Time Editor
will appear. This enables you to set the number of seconds each enabled range is allotted before auto-switching will
occur when needed during sample testing. Your analyzer will auto-switch from one range to another when the
testing time for that range is greater than or equal to the time you have chosen, and the identified alloy is flagged as
needing the switch in the Niton Alloy Library.
Select the C button to clear the current time, then from the virtual numeric key pad, select each digit you want to
input, then select the E button to enter.

Downloading to a USB Memory Stick or Thumb Drive

You can Export your readings from your analyzer to a USB memory device such as a thumb drive. Insert your USB
thumb drive into the USB Extension cable provided with your analyzer, then insert the other end of the Extension
cable into one of your analyzer's USB ports. This enables the Export function.
Note You must use the USB Extension Cable provided. Do not insert the Thumb Drive directly into one of your
analyzer's USB ports.

Figure 12. Exporting Data

To access the Export function, select the Data icon from the Top Menu, then select the Export All icon. If there is no
thumb drive in a USB port, this button will be grayed out and inaccessible.

Figure 13. Data Export Dialog Box

A Dialog Box will pop up. By default, the file will be saved under the file name FXL-(Serial number of your
analyzer). Selecting the Virtual Keyboard icon will make the Virtual Keyboard pop up, allowing you to change the
name of the file. Selecting the Export button will initiate the Data Export process, and selecting the Cancel button
will close the dialog box and bring you back to the previous screen.

Figure 14. Export Progress Meter

Once the Export process is initiated, a progress meter will pop up, showing you approximately how far along the
process is at the moment. The progress meter will disappear when the process is complete.
The Export process does not delete the readings in your analyzer after exporting them.

X-Y Movement

Using the joystick, the measurement area can be moved about the sample to capture different features of the sample,
a very useful feature for Consumer Goods and Mining applications, where the samples are rarely homogeneous.
Pushing the joystick forward will move the camera and the measurement area away from you, toward the rear of the
sample chamber. Moving the joystick left will move the camera and measurement area to the left, and similarly for
other directions. Combining directions is possible as well. Moving back and right on the joystick will result in the
camera moving toward you and to the right on a diagonal.

Figure 15. Joystick Movement and Corresponding Camera Movement

Setting the Date and Time

Figure 16. Setting the Date and Time

From the System Menu, select the Date & Time button from the System Screen to set the date and time as needed
for different time zones, daylight savings time, or any other reason. The date and time are factory preset prior to
shipping. The clock is a 24 hour clock, so add 12 to PM hours - i.e. 1:13 PM would be 13:13.

Figure 17. The Date & Time Screen

When the Date & Time button is selected, the Date & Time Screen comes up on your analyzer's LCD Screen. You
may change the Month, Year, Date, Hour, and Minute on your analyzer.

Changing the Month

To change the month, select the downward pointing triangle button next to the month displayed. A drop down
menu will appear, listing the months of the year in order of appearance.

Figure 18. Month Drop Down Menu

Select the month you want from the drop down menu, using the vertical slider button to display hidden months.
The display will change to show the month you selected.

Changing the Year

To change the year, select the downward pointing triangle button next to the year displayed. A drop down menu will
appear, listing the years in order of appearance.

Figure 19. Changing the Year

Select the year you want from the drop down menu, using the vertical slider button to display hidden years. The
display will change to show the year you selected.

Changing the Date

To change the date, select the date you want from the Date Selection Screen. The date you selected will be
highlighted, while the old date will be shown in red numbers.

Figure 20. Selecting the Date

Changing the Hour and Minute

To change the hour, select the hour numbers. The hour numbers will be highlighted in gray. Then select the
Upwards Pointing Chevron Button to increment (increase) the hour, or the Downward Pointing Chevron Button to
decrement (decrease) the hour.

Figure 21. Changing the Hour

To change the minute, select the minute numbers. The minute numbers will be highlighted in gray. Then select the
Upwards Pointing Chevron Button to increment (increase) the minute, or the Downward Pointing Chevron Button
to decrement (decrease) the minute.

Figure 22. Changing the Minute

Saving Your Changes
Exiting Without Saving

To save your changes, select the "Save" screen Button. The display will return to the previous screen and the Date
and Time will be saved.
To exit the screen without saving changes, select the "Cancel" Screen Button. The display will return to the previous
screen and the Date and Time will not be saved.

Calibrating the Touch Screen

Figure 23. Initiating Touch Screen Calibration

Select the Calibrate Touch Screen button from the System Screen to re-calibrate the analyzer's touch screen display.
This procedure establishes the display boundaries for the touch screen interface.
1. Select the Touch Screen icon.
2. The display will show a message asking you to confirm whether or not you want to calibrate your Touch Screen.
Select the Yes button.
3. The display will show the message: "Calibrate Touch Screen". There will be a small cross in the upper left-hand
corner of the display.
4. Tap on this cross with the stylus, and the cross will disappear and reappear in the upper right-hand corner of the
screen.
5. Tap on the cross again, and it will reappear in the lower right-hand corner of the screen.
6. Tap on the cross again and it will reappear in the lower left-hand corner of the screen.
7. Tap on the cross once more, and you will be presented with a Confirmation Screen.
8. Select the Yes Button to confirm that the parameters are good. Select the No Button to start the process again.
9. Once you have confirmed the parameters, the System Menu will be displayed. The screen is now calibrated.

Figure 24. Touch Screen Calibration Crosses

The Touch Screen can be calibrated - and the system operated - with a USB mouse plugged into the USB ports in
the rear of the analyzer.


Advanced Topics
TestAll Geo

TestAll Geo Mode allows you to test powder, mineral, and ore samples without first determining whether the
samples would best be analyzed with Mining or Soil Mode. TestAll Geo Mode uses both the Compton
Normalization calibration (Soil) and the Fundamental Parameters calibration (Mining) to determine whether the
soil calibration is acceptable or whether the total metal content is too high for Compton mode.
If the sample can be analyzed via Soil Mode, then the analyzer will display results from both Soil and Mining Modes
in one unified list. If both calibrations contain the same element, then the mode that has the lower detection limit
will be displayed. Elements will be displayed in both ppm (mg/kg) and wt.%, with 0.10% being the cutoff point.
Note that the column heading reads ppm, but % is listed with results where appropriate. TestAll Geo Mode may be
used with any Standard Operating Procedure appropriate for either Mining or Soil analysis.
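The selection rule described above - keep the mode with the lower detection limit, and display percent above the 0.10% cutoff - can be sketched as follows. The element names, concentrations, and detection limits here are hypothetical placeholders, not analyzer output:

```python
# Hypothetical results as element: (concentration in ppm, detection limit in ppm).
soil   = {"Pb": (150.0, 10.0), "Fe": (52000.0, 100.0)}
mining = {"Pb": (160.0, 50.0), "Fe": (51000.0, 80.0)}

def merge_results(soil, mining):
    """For elements present in both calibrations, keep the one with the lower LOD."""
    merged = dict(mining)
    for element, (ppm, lod) in soil.items():
        if element not in merged or lod < merged[element][1]:
            merged[element] = (ppm, lod)
    return merged

def display(ppm):
    # 0.10 % (1000 ppm) is the cutoff between ppm and wt.% display.
    return f"{ppm / 10_000:.2f} %" if ppm >= 1000 else f"{ppm:.0f} ppm"

for element, (ppm, _) in merge_results(soil, mining).items():
    print(element, display(ppm))
# Pb is reported from the Soil calibration (lower LOD), Fe from Mining.
```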

Start/Stop Settings

The Start/Stop Setting Screen enables you to change the preconditions for operation. The Start/Stop parameter
options are Batch Readings, Number of Readings, Duration, and Prompt with Each Reading.

Figure 1. Batch Control

Batch Readings

Selecting the empty checkbox will enable Batch Readings, where you can automatically run a set of analyses with
identical analysis parameters on a sample. Selecting the checked box will disable Batch Readings.
Number of Readings
Selecting the Number of Readings field enables you to input the number of readings to be taken in the batch. Select
the field, and a Virtual Numeric Keypad will pop up to change the Number of Readings parameter. To input the
number of readings, select the C button to clear the current value, then from the Virtual Numeric Keypad,
select each digit you want to input, then select the E button to enter. Of the non-numeric screen buttons, C = Clear
All, E = Enter, and ">" will backspace over the current value. Selecting the E button will enter the current value as
the Number of Readings, and return to the Start/Stop Settings Screen.

Duration
Duration is a field to set the maximum time for sample analysis before the analysis stops. Select the field, and a
Virtual Numeric Keypad will pop up to change the maximum analysis time parameter. To input the maximum
number of seconds before automatic shutoff, select the C button to clear the current time, then from the Virtual
Numeric Keypad, select each digit you want to input, then select the E button to enter. Of the non-numeric screen
buttons, C = Clear All, E = Enter, and ">" will backspace over the current value. Selecting the E button will enter the
current value as the Max Time, and return to the Start/Stop Settings Screen.
Prompt Before Each Reading
Selecting the empty checkbox will prompt you before each batch of readings, reminding you of what your
parameters are set to. Selecting the checked box will disable the prompt.
OK Button
Selecting the OK Button will save your current settings.

Adjusting the Calibration

Selecting Adjust Calibration enables you to change calibrations for various analysis modes.
When the Calfactors Screen appears, select the radio button of the Calibration Factor Set you wish to edit, then
select the appropriate Edit Button. The Calibration Edit Screen will open. You cannot edit the Factory Calibration.
You may edit and store up to four alternate calibrations per Mode.
Choose the element you wish to edit in the Element Column.
Select the intersection of the Slope Column and the Element Row to edit Slope.
Select the intersection of the Intercept Column and the Element Row to edit Intercept.
Select the Virtual Keyboard Icon to bring up the Virtual Keyboard and edit the Set name in the Set Name Field.

Editing the Calibration Factors - Slope

Selecting the intersection of the Slope Column and the Element Row will bring up the Slope Edit Screen, a variation
on the standard Virtual Numeric Keypad. To enter in a new value for this slope, select the C Button to clear the
field, enter in the new value for the slope, and select the E Button to enter this value. The ">" Button operates as a
backspace key.

Editing the Calibration Factors - Intercept

Selecting the intersection of the Intercept Column and the Element Row will bring up the Intercept Edit Screen, a
variation on the standard Virtual Numeric Keypad. To enter in a new value for this intercept, select the C Button to
clear the field, enter in the new value for the intercept, and select the E Button to enter this value. The ">" Button
operates as a backspace key.

Calibration Factors

Although the FP software automatically corrects for most inter-element matrix effects, Niton analyzers cannot
currently detect elements lighter than magnesium. As a result, the oxidation state of elements can bias measurements
of other elements. In many cases, this bias is small, and uncorrected results provide sufficient accuracy, especially
when data is for screening purposes only.
The degree of severity of the bias should be evaluated before proceeding with routine measurement. A few test
samples should be carefully measured by another technique, or by an outside lab. These samples should then be
analyzed using the analyzer. If the agreement is good enough to provide the accuracy required for the application, the
instrument can be operated as shipped. If it is determined that a bias correction is necessary, the procedure for
establishing calibration factors should be followed. As site characteristics change, it is good practice to run a few
check standards to ensure that the instrument results are still within an acceptable error range.

Oxides versus elemental concentrations

Laboratories, certificates of analysis and other XRF instruments sometimes report data as oxides. The analyzer
reports data only in elemental concentration. Therefore, oxide data must be converted to elemental concentration
for comparison with Niton FXL analyzer results. This may be achieved using the conversion factors below. The
oxide is multiplied by this factor to convert to elemental concentration.

Table 1. Oxides vs. Elemental Concentrations

Oxide     Conversion Factor
MgO       0.6030
Al2O3     0.5292
SiO2      0.4674
SO3       0.4005
K2O       0.8301
CaO       0.7147
TiO2      0.5994
V2O5      0.5602
Cr2O3     0.6842
Mn3O4     0.7203
MnO       0.7745
Fe2O3     0.6994
FeO       0.7773
Co3O4     0.7342
NiO       0.7858
CuO       0.7989
ZnO       0.8034
PbO       0.9283
Bi2O3     0.8970
ZrO2      0.7403
MoO3      0.6665
WO3       0.7930
Ta2O5     0.8190
Nb2O5     0.6990
SnO2      0.7877

This conversion factor is obtained by taking the atomic weight of the element multiplied by the number of its atoms
in the oxide, and dividing by the molecular weight of the oxide.

Worked examples:

To convert SnO2 to Sn:


Sn = concentration of SnO2 * 0.7877
To convert Sn to SnO2:
SnO2 = concentration of Sn * 1/0.7877
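The factor definition above can be checked directly against the table using standard atomic weights (a worked illustration; the weights are standard reference values):

```python
# Standard atomic weights in g/mol.
ATOMIC_WEIGHT = {"Sn": 118.710, "Fe": 55.845, "O": 15.999}

def oxide_factor(element, n_element, n_oxygen):
    """Atomic weight of the element times its atom count in the oxide,
    divided by the molecular weight of the oxide."""
    element_mass = ATOMIC_WEIGHT[element] * n_element
    oxide_mass = element_mass + ATOMIC_WEIGHT["O"] * n_oxygen
    return element_mass / oxide_mass

print(round(oxide_factor("Sn", 1, 2), 4))   # SnO2 -> 0.7877 (matches Table 1)
print(round(oxide_factor("Fe", 2, 3), 4))   # Fe2O3 -> 0.6994 (matches Table 1)

# Converting 2.0 % SnO2 to elemental Sn, as in the worked example:
print(round(2.0 * oxide_factor("Sn", 1, 2), 4))   # 1.5754 % Sn
```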

Measuring the Standard Samples

Use your analyzer to take readings of samples whose composition you already know. It is important that the known
composition of these samples be accurate. These samples provide the baseline to which your analyzer is adjusted. If
the known composition is inaccurate, your analyzer will also be inaccurate.
For each sample, take a reading of at least 120 seconds. Make a note of the reading numbers for these samples.
For each sample, your analyzer reports the percentage by weight for the elements present in the sample. With the
calibration factors set to the defaults, these percentages will differ from the known percentages, but are used to
calculate the calibration factors.
Note Readings from only three samples are required for each element, but increasing the number of readings also
increases the accuracy of the results.

Calculating Calibration Factors

Using the data that you collected by measuring the standard samples, you now need to plot the percentage of each
element as indicated by the standard samples, against the percentage of each element as reported by your analyzer.
Then use the linear regression function to calculate the slope and intercept for a straight line drawn through a graph
of those data points. The slope and intercept for this line are the calibration factors for the element.
You may use any tools that you prefer to make this calculation. This document shows one possible method, using
Excel.
Note Your analyzer reports percentages of elements, and the calibration factors are based on percentages of elements.
If your standard samples indicate percentages of oxides, see Oxides versus elemental concentrations to convert
percentages of oxides to percentages of elements.
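The slope and intercept described above are an ordinary least-squares fit of standard values against analyzer values; Excel is one way to compute them, but any tool works. A language-agnostic sketch (the sample values are invented for illustration):

```python
# Hypothetical data: analyzer-reported % (x) vs. certified standard % (y).
analyzer = [0.50, 1.20, 2.40, 4.10]
standard = [0.55, 1.30, 2.55, 4.30]

def linear_fit(x, y):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return slope, intercept

slope, intercept = linear_fit(analyzer, standard)
# These are the calibration factors you would enter into the analyzer.
print(f"slope = {slope:.4f}, intercept = {intercept:.4f}")
```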

Calculating Calibration Factors Using Excel

To calculate calibration factors:


Open Excel. To do this:
a. Click the Start button.
b. Select Programs.
c. Click Microsoft Excel.

Figure 2. Starting Excel

1. In the first column, enter all the percentages by weight for the element as reported by your analyzer.

Figure 3. Adding Data

2. In the second column, enter all the percentages by weight for the element as indicated by each standard.
3. Use the cursor to highlight all the numbers in both columns.

Figure 4. Setting Up the Chart

4. Click Chart Wizard.


5. Select the Scatter Chart with data points that are not connected by lines.

Figure 5. Selecting the Scatter Chart

6. Click Finish.
7. Right-click on one of the data points.
8. Click Add Trendline on the pop-up menu.

Figure 6. Adding the Trendlines

a. On the Type tab, click Linear.

Figure 7. Selecting Linear Trendline

b. On the Options tab, check the boxes for Display equation on chart and Display R-squared value on chart.

Figure 8. Selecting Options

c. Click OK.
d. The equation shows the slope and intercept for the trend line. These are the calibration factors that you enter into
your analyzer.

Figure 9. Slope and Intercept Displayed

Note If the intercept is negative in the equation, be sure that you enter the intercept as negative in your analyzer.
9. Repeat these steps to find the slope and intercept for every element that you are interested in.

Pseudo-Elements

Pseudo-Elements are constructs you can create which will be treated like elements in analysis, showing up in analysis
results as if they were actual elements. In setting up a Pseudo Element, you must give it a name of at least 3 and at
most 6 characters; the name cannot contain * (asterisk) or space characters. You may create up to 15 pseudo-elements,
but you may not have more than 64 displayed values. You can set up equations with the following operators:

Add

You can Add detected levels of elements or Pseudo Elements together or to a constant. For example: sum applicable
elements for FAC Analysis - (FAC = Cr+Cu+Mo), or sum elements for Residual Element Application - (REA =
Cr+Ni+Cu).

Subtract

You can Subtract detected levels of elements or Pseudo Elements from each other or a constant. For example: find
the level of clinker material C2S in cement - (C2S = (2.87*SiO2)-(0.754*Ca3SiO5))

Multiply

You can Multiply detected levels of elements or Pseudo Elements by a constant. For example: Convert elemental data
to oxide form - (CaO = Ca*1.4).

Divide

You can Divide detected levels of elements or Pseudo Elements by a constant. For example: Convert Au
concentration to Karat% - (Karat = ((Au/100)*24))

Power

You can raise levels of elements or fixed numbers to an arbitrary power. For example: Ba^2.

Parentheses

You can use parentheses to stack order of calculation, as in the Karat and C2S examples above.
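Each operator above maps onto ordinary arithmetic over the measured concentrations. A sketch of how the example equations evaluate (the concentration values below are invented for illustration):

```python
# Hypothetical measured concentrations in weight percent.
conc = {"Au": 75.0, "Cr": 1.2, "Cu": 0.3, "Mo": 0.1, "Ni": 0.8}

# Pseudo-element equations from the examples in the text:
fac = conc["Cr"] + conc["Cu"] + conc["Mo"]      # FAC = Cr+Cu+Mo
rea = conc["Cr"] + conc["Ni"] + conc["Cu"]      # REA = Cr+Ni+Cu
karat = (conc["Au"] / 100) * 24                 # Karat = ((Au/100)*24)

print(fac, rea, karat)   # 1.6 2.3 18.0
```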

Figure 10. Pseudo Elements Screen

The Pseudo Elements Screen enables you to Create, Edit, and Delete Pseudo Elements.

Figure 11. The Pseudo Element Editor

To set up a new Pseudo Element, select the New Button. To Edit an existing Pseudo Element, select the Edit Button.
To Delete an existing Pseudo Element, select the Delete Button. To Close the window and return to the previous
screen, select the Close Button.

New

Figure 12. The Pseudo Element Equation Editor

1. Select the Keyboard Button to bring up the Virtual Keyboard.


2. Type in the name you chose to call your Pseudo Element.

Figure 13. Using the Virtual Keyboard

3. Select Enter.
4. Select the inverted triangle button to open the Drop-Down Menu.

Figure 14. Selecting Elements from the Drop-Down List

5. Select the element you want to add to the equation. It will appear in the Current Element window.
6. Select the Add Button to add the Current Element to the equation.

Figure 15. Adding the Current Element to the Equation

7. Using the Virtual Numeric Keypad integral to the editor, type in the other parts of the equation.

Figure 16. Adding Operands and Constants

8. When the equation is complete, select the Close Button.


9. The Pseudo Element will show up in the Mode Editor Window.

Figure 17. The Finished Pseudo Element in the Mode Editor Window

Edit

Select the Pseudo Element you want to edit from the Mode Editor window, then select the Edit Button.

Figure 18. Selecting a Pseudo Element for Editing

This will open up the Pseudo Element Equation Editor with the selected Pseudo Element pre-loaded.

Delete

1. Select the Pseudo Element you want to delete from the Mode Editor window.

Figure 19. Selecting a Pseudo Element for Deletion

2. Select the Delete Button.


3. Select the Close Button to return to the previous screen.

Helium Purged Analysis

Attaching the Helium Supply to Your Analyzer


Make sure before you start that you are using a clean Prolyne Measurement Window intended for use with
He-purged analysis, rather than the standard Polypropylene Measurement Window. Securely attach the regulator
accompanying your analyzer to the helium tank. This custom regulator limits the helium flow to a 70 cc/min flow
rate. Confirm that the helium tank contains a sufficient supply for your testing purposes.

Figure 20. The Helium Tank Setup - Inset Showing Regulator Dial

1. Check the line for kinks, knots, or other obstructions before turning on helium flow.
2. Attach the Ferrule, shown in Figure 21, to the Helium Input Port, shown in Figure 22.

Figure 21. The Helium Ferrule

Figure 22. The Helium Input Port

3. Snap the Locking Clip into the Locking Groove, securing the ferrule to the instrument, as in Figure 1-4.
4. Turn on helium flow. Allow the helium to purge the instrument's front end for a minimum of five minutes before
starting analysis. Helium purging can be done at the same time as instrument warm-up.

He-Purged Analysis

Helium purge is required for optimum detection of light elements such as Mg, Al, Si and P. For analyses using
helium, we recommend using the default Range setup, as shown below in Figures 23 and 24. As with other energy
dispersive XRF analyzers, the longer the testing time, the better the statistical precision of the reading. See Adjusting
the Element Range for more on setting Ranges.

Figure 23. The Default Range Setup for Metals

Figure 24. The Default Range Setup for Metals

The user can see the precision improving with analysis time by watching the third column on the analysis display:
the 2-sigma error. This value decreases as reading time increases. Knowledge of this relationship is important,
especially for light element work.
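The link between testing time and precision follows counting statistics: the statistical error scales roughly as 1/√t, so quadrupling the measurement time halves the 2-sigma error. A sketch of that scaling (the starting error value is hypothetical):

```python
import math

def scaled_error(error_t0, t0, t):
    # Counting statistics: statistical error scales as 1 / sqrt(measurement time).
    return error_t0 * math.sqrt(t0 / t)

# A hypothetical 2-sigma error of 0.060 % after a 30 s measurement:
for t in (30, 60, 120, 240):
    print(f"{t:>3} s: {scaled_error(0.060, 30, t):.4f} %")
# Each doubling of time reduces the error by a factor of sqrt(2).
```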
Table 1 shows the approximate limits of detection of the light elements in various matrices. If testing time is
shortened, the limits of detection (which are directly based on statistical precision) will increase.
Table 2. Approximate LODs for Light Elements in Fe and Al Matrices

Element   Fe Matrix (concentration %)   Al Matrix (concentration %)
Zn        0.015                         0.010
Cu        0.030                         0.030
Ni        0.050                         0.010
Cr        0.015                         0.015
V         0.010                         0.010
Ti        0.010                         0.010
Al        0.031                         --
Si        0.075                         0.150
Mg        1.300                         0.650
The above LODs were calculated with 60-second-per-filter measurement times.
Sample Requirements
1. The sample must be flat, sit directly on top of, and cover the entire measurement window.
2. The surface must be clean and free of oxidation or other surface films.

See Metal Sample Prep for more information for metal sample preparation, and Preparing Mining Samples for more
information on preparing mining samples.
Alloy Grade Libraries
Using a library of alloy grade specifications, your alloy analyzer will attempt to identify an alloy sample being
analyzed. It does this by comparing the chemistry analysis results against the minimum and maximum specifications
in the library. The library is an .alb or .clb file, and is viewable, editable, and uploadable via the included NDT
program. Your analyzer can store two alloy grade libraries in its memory, and automatically chooses one of the two
for use while testing.
Selecting the He Library
Selecting the He button on the Test Screen will switch your analyzer into Helium-purged mode. In non-GOLDD
analyzers, this will select the Light Range and change to the 900 Library. Selecting it again will de-select the Light
Range and change to the 800 Library.

Figure 25. Selecting the He Button

If your analyzer is equipped with Helium Purge, selecting the Helium Button will not deselect the Light Range.
Instead it switches between Helium and Non-Helium applications - the Light Range remains enabled. You may also
wish to use the analyzer without the Light Range activated. To do this, deselect the light range via the Element
Range Screen. To use the analyzer this way, be sure helium has been completely purged from the system.

Note When switching between Helium and Non-Helium applications on your analyzer, allow the helium to
dissipate out of system for at least ten minutes before beginning non-helium analysis.


Standard Maintenance
Analyzer Specifications
Environmental Specifications

Ambient Temperature Range: -10 °C to +50 °C


Ambient Humidity Range: RH 0-95%, non-condensing

Voltage Current Specifications

12 Volts, 5 Amps

Battery Pack Disposal:

Batteries must be recycled or disposed of properly.


Follow local regulations and ordinances for the disposal of batteries.
Do not dispose of battery packs in the garbage can.
Before disposing of the battery pack or cells, insulate any exposed terminals with adhesive tape or similar
material to prevent short circuits.

Shipping and Transportation (Air):

Air transportation of Li-ion batteries is regulated in several countries and by the United Nations through the
International Air Transport Association (IATA) Dangerous Goods Regulations, among others. Please
check local regulations and with the common carrier before shipping Li-ion battery packs or products
containing relatively large Li-ion battery packs.
As of January 1, 2008, the US Department of Transportation (DOT) through the Pipeline and Hazardous
Materials Safety Administration (PHMSA) no longer allows loose lithium batteries in checked baggage.
Thermo Fisher Scientific's Li-ion batteries are tested in accordance with specifications detailed in UN 3090
(UN Manual of Tests and Criteria, Part III, subsection 38.3).
This safety precaution safeguards against the shipment of defective battery packs.
Do not ship or carry recalled or damaged batteries on an aircraft. Check battery information and instructions
at Thermo Fisher Scientific's website or contact Customer Support at 1 (800) 875-1578 (in the USA) or +1
(978) 670-7460 (anywhere) or niton@thermofisher.com.
If the original packaging is not available for shipping spare batteries, effectively insulate any exposed battery
terminals by isolating the batteries from contact with other batteries and metal. Do not permit a loose battery
to come in contact with any metal objects.
IATA Classification: Medium- and large-capacity Lithium-ion battery packs may be classified as Class 9
Miscellaneous dangerous goods. Shipments of such products must be identified by a Class 9 label on the
shipping package and may be considered restricted cargo on passenger aircraft. Individual Li-ion battery
packs must be declared as:
UN 3480 for Lithium-ion batteries or battery packs being shipped alone. If they are contained inside a piece
of equipment or packed along with a piece of equipment, they must be declared as:
UN 3481 for Lithium-ion batteries contained in equipment
UN 3481 for Lithium-ion batteries packed with equipment.
Figure 1-1. Li-ion Battery Category Definition & Restrictions
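The declaration rules above reduce to a simple decision: alone means UN 3480, otherwise UN 3481. The following sketch is illustrative only and is not shipping advice; always confirm the current classification with the IATA Dangerous Goods Regulations and your carrier.

```python
# Illustrative helper for the UN declaration rules described above.
# This is a sketch of the decision logic, not regulatory guidance.

def un_declaration(contained_in_equipment, packed_with_equipment):
    """Return the UN number for a Li-ion battery pack shipment."""
    if contained_in_equipment:
        return "UN 3481 (Lithium-ion batteries contained in equipment)"
    if packed_with_equipment:
        return "UN 3481 (Lithium-ion batteries packed with equipment)"
    # Battery or battery pack shipped on its own.
    return "UN 3480 (Lithium-ion batteries shipped alone)"

print(un_declaration(False, False))  # UN 3480 (Lithium-ion batteries shipped alone)
```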

Product Registration:

Registration is a very important step in the ownership of your instrument.


For warranty and safety purposes, all Thermo Fisher Scientific products must be registered by the owner
and/or end user.
Registered customers not only benefit from service notes and safety notifications, but also gain access to a
wide range of improvements (e.g., software updates, new features, new documentation). Please follow the
registration instructions that came with your product.


Registration is the sole responsibility of the end user and owner. Products must be re-registered, and contact
information updated, whenever the end user, responsible person, or contact information changes.
The customer and end user shall follow any calibration, verification, and/or preventive maintenance cycles
recommended by the manufacturer for each specific product to keep it under warranty and ensure safety.
During these procedures the battery pack will be checked for any sign of degradation, leakage, or any other
possible defect that may affect safety.
Refer to the specific manual and datasheet for the suggested maintenance interval.
A copy of this document shall be kept with the product at all times.

Customer
Service
Contact Information:

THERMO FISHER SCIENTIFIC INCORPORATED


900 Middlesex Turnpike, Building 8
Billerica, MA 01821
USA
1-800-875-1578 / 1-978-670-7460

Limited
Warranty:

THERMO FISHER SCIENTIFIC INC. WARRANTS ITS BATTERY PRODUCTS TO BE FREE FROM
DEFECTS IN MATERIAL AND WORKMANSHIP, UNDER NORMAL USE AND IF PROPERLY
INSTALLED, FOR A PERIOD OF ONE YEAR FROM DATE OF PURCHASE.
Note The information presented in this document is generally descriptive only and is not intended to make or
imply any representation, guarantee, or warranty with respect to battery packs. Please refer to individual product
manuals and datasheets for specific information. This document is subject to change without notice. Contact
Thermo Fisher Scientific for the latest information.

