
The British Journal of Radiology, 79 (2006), 592–596

A survey of MRI quality assurance programmes


C J KOLLER, MSc, J P EATOUGH, PhD, P J MOUNTFORD, PhD and G FRAIN, MMath

Medical Physics Directorate, University Hospital of North Staffordshire, Princes Road, Hartshill, Stoke on Trent ST4 7LN, UK
ABSTRACT. There are currently no national guidelines on appropriate quality assurance (QA) test frequencies for MRI equipment in clinical use. From a random selection of 45 hospitals in England, which were contacted by telephone, 35 hospitals agreed to participate in a survey of MRI QA and were sent a questionnaire requesting information on the range and frequency of QA tests, as well as the staff groups who conduct these tests. Twenty-four completed replies were received, representing a 68% response rate for the distributed questionnaires. Of these, 79% undertook some form of QA, typically conducted by the radiographic staff. Tests were most often undertaken on the head coil, but there was considerable variation in the frequency and range of tests undertaken at different hospitals. For example, exactly half of the respondents conducted signal to noise ratio (SNR) tests on both head and body coils, but only 13% of centres extended this test to other coils. The results of this survey should inform radiology departments about practice at other hospitals and should assist in formulating the frequency and scope of appropriate MRI QA programmes.

Received 13 July 2005. Revised 7 October 2005. Accepted 24 October 2005. DOI: 10.1259/bjr/67655734
© 2006 The British Institute of Radiology

The use of MRI has more than doubled over the last decade and, on average, 3000 scans per day were undertaken in England in 2004 [1]. There will be a variation in the quality of images produced in different centres, due to differences in equipment and scan parameters. Within a given centre, it is important to ensure that all medical images produced are consistent and of sufficient quality to answer the diagnostic question. A change in sensitivity of the system or an increase in the noise present in an image may cause fine detail to be lost. Image artefacts may affect the region of interest. These effects may be subtle, leaving the observer unaware that they are happening, yet they can result in loss of confidence in the diagnosis or even misdiagnosis. It may be argued that the skilled observer would detect any decrease in quality of the radiological image. However, a case was reported in the national press several years ago in which more than 1000 patients had to be recalled as a result of undetected image quality problems with an MRI scanner ("Hospital's brain scanner fails", Daily Telegraph, 6th June 2002).

Quality assurance (QA) is a process to ensure that any product or service meets a required standard. This is particularly important for diagnostic imaging equipment, as it may not be immediately obvious that there have been any changes in the performance of the equipment. A QA programme for an MRI scanner must be able to detect changes in system performance, allowing equipment faults to be identified and rectified before they become clinically significant. This goal can only be achieved if the tests are of an appropriate type and range, and are carried out at an appropriate frequency. Data from QA tests can be used to identify trends and hence anticipate deterioration in performance, and in some cases provide evidence to support equipment replacement business cases. The results also
enable system drift to be monitored and quantified. This is particularly important when comparing follow-up scans on a patient, as it is necessary to ensure that any change in the appearance of the image is not due to changes in the equipment performance. The importance of undertaking appropriate measurements on new systems should also not be overlooked. Acceptance and commissioning tests allow a judgement to be made on whether the equipment meets the manufacturer's specifications, as well as providing baseline performance data for future QA testing. McRobbie et al [2] found considerable value in undertaking acceptance tests on new MR installations: signal to noise ratio and geometric linearity were the most common parameters to fail acceptance tests, and these were successfully corrected and improved in the majority of instances.

For radiological equipment where ionizing radiation is used, not only are there national standards for the QA tests to be undertaken [3, 4], but the requirement to carry out these tests has also been enshrined in legislation [5]. For MRI the situation is somewhat different. Comprehensive guidelines exist that detail a range of appropriate QA tests and their methodology [6]. However, there appears to be no consensus as to how frequently these tests should be undertaken in clinical practice, and currently there are no statutory requirements for these tests to be carried out.

The purpose of this survey was to identify which MRI QA tests are currently undertaken in England, who undertakes them, and the frequency at which they are carried out. The results of this survey should assist radiology departments in the formulation of their own appropriate QA programme and contribute towards establishing which tests to undertake and their appropriate frequency.
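As an illustration of the trend monitoring described above, the following is a minimal sketch (in Python) of how routine QA readings might be compared against a commissioning baseline. The 10% action level, the function name and the example values are assumptions made for illustration only; they are not taken from the survey or from any published guideline.

import numpy as np

def flag_snr_drift(measurements, baseline, action_fraction=0.10):
    """Return the indices of routine SNR measurements that fall more than
    action_fraction below the commissioning baseline value."""
    values = np.asarray(measurements, dtype=float)
    threshold = baseline * (1.0 - action_fraction)   # e.g. 10% below baseline
    return np.flatnonzero(values < threshold)

# Hypothetical weekly head-coil SNR readings against a baseline of 250
weekly_snr = [252, 248, 251, 244, 224, 218]
print(flag_snr_drift(weekly_snr, baseline=250.0))    # -> [4 5]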

Method
Within the National Health Service (NHS) there are over 200 MRI scanners in England, located in around 180 hospitals. From an alphabetical list of hospital trusts with MR scanners, 45 were selected by contacting every fourth hospital and enquiring whether they would participate in this study. Of these, 35 hospitals agreed to take part in the survey, and a questionnaire was sent to each. Where a hospital had more than one MRI scanner, only one questionnaire was returned, reflecting its typical QA programme; in practice, this programme was similar for all the scanners within a particular hospital. For those hospitals that received medical physics support, a copy of the questionnaire was also sent to the relevant medical physics department. Follow-up calls were made to those hospitals that had not returned the questionnaire, in order to maximize the return rate. Information was requested on the nature and frequency of the QA tests undertaken on the MRI equipment, on the type, field strength and age of the MRI equipment, and on the staff group (radiographers, medical physicists, clinical engineers or service engineers) who undertook these tests.
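The selection procedure described above amounts to a simple systematic sample of the alphabetical list. A minimal sketch is given below; the trust names are placeholders and the list length of around 180 hospitals is taken from the text, but the code itself is illustrative only and was not part of the survey method.

def every_fourth(trusts):
    """Systematic sample: keep every fourth entry of an alphabetically
    ordered list of hospital trusts with MR scanners."""
    return sorted(trusts)[::4]

# Placeholder names - the real list of trusts is not reproduced here
trusts = ["Trust %03d" % i for i in range(1, 181)]   # around 180 hospitals
sample = every_fourth(trusts)
print(len(sample))                                   # 45 hospitals contacted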

Results
Of the 35 hospitals that received the questionnaire, completed replies were received from 24, corresponding to a response rate of just over 68%. Responses were received in two further cases where the MRI scanner was undergoing replacement and, as routine QA was under review, the users felt unable to complete the questionnaire. Overall, this corresponded to a 74% response rate. The responses covered a broad range of equipment manufacturers, field strengths and ages (as shown in Figure 1), and all four major MRI manufacturers were represented to varying degrees.

Of the 24 hospitals, 79% undertook some form of QA (Figure 2a). Most routine QA was undertaken by the radiographers who operated the equipment on a regular basis. Service engineers have been excluded from this graph as it is understood that they will invariably undertake some QA during each service visit. Figure 2b compares the different tests performed by radiographers and medical physicists. Both staff groups performed a range of tests, with signal to noise ratio (SNR) measurements most commonly undertaken by radiographic staff. However, there was a considerable difference in the range of tests undertaken at different hospitals, with only four hospitals carrying out all eight of the QA tests listed.

Test frequencies for head coils and for other coils are compared between radiographers and medical physicists in Figure 2c. The head coil was the most frequently tested of all the coils, typically tested on either a daily or a weekly basis by radiographic staff. The frequency of testing of the other coils was considerably lower. This is further highlighted in Figure 2d, which shows that, where SNR measurements were performed, they were always undertaken on the head coil. Half of the hospitals extended this test to the body coil, but less than 13% included any further coils.

Figure 1. Number of hospitals responding to the survey by (a) manufacturer, (b) field strength and (c) equipment age.

Figure 2e and f detail how often the other QA tests (i.e. those other than SNR) were performed by radiographers and medical physicists, respectively. Image uniformity testing was carried out most often by both groups of staff, and slice position was the least frequently tested parameter. The intervals between tests undertaken by the two groups of staff generally mirrored those of the SNR tests, with medical physicists tending to carry out a wider range of tests less often. Once again, there was a marked variation in practice between hospitals. The time interval between service visits varied between 1 month and 6 months, with 3 months being the most common (Figure 3a).
Figure 2. The percentage of hospitals that: (a) undertake quality assurance (QA), by staff group; (b) undertake QA, by type of test and by staff group (radiographers and medical physicists); (c) undertake signal to noise ratio (SNR) tests, by frequency, by coil (head coil and other coils) and by staff group (radiographers and medical physicists); (d) undertake SNR tests, by coil type; (e) undertake other QA tests (image uniformity, geometric distortion, spatial resolution, ghosting, slice thickness and slice position), by type and by frequency, for radiographers; (f) undertake other QA tests (image uniformity, geometric distortion, spatial resolution, ghosting, slice thickness and slice position), by type and by frequency, for medical physicists.

At each service visit, some QA was routinely undertaken on both the head and body coils. However, the testing of the other coils was not as regular (Figure 3b).

Discussion
One goal of undertaking QA is to maintain confidence in the performance of imaging equipment and to ensure that it is operating within specification. To that end, the measurement of SNR for an MRI unit gives a good, all-round indication of the performance of the whole system. The SNR is determined by many different parameters but, for a particular sequence and coil combination, it should remain stable over the long term [2, 7, 8]. Hence a significant change in SNR will indicate a potential problem, although more specific tests are required to locate the exact cause. It should be noted, however, that testing the SNR on one coil does not in itself provide information on the performance of any of the other coils. The other tests undertaken regularly, such as image uniformity, spatial resolution and slice thickness, examine, amongst other things, field gradient strength and selection. Since many MRI systems have integral gradient coils, this aspect of the system may be sufficiently tested using just one receiver coil.
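For illustration only, a commonly used single-image estimate of SNR on a uniform phantom (mean signal in a central region of interest divided by the standard deviation of an artefact-free background region) can be sketched as below. The ROI positions, the synthetic test image and the Rayleigh correction factor of 0.655 are assumptions of this example; they do not represent the respondents' protocols or the IPEM Report 80 methodology.

import numpy as np

def snr_background_roi(image, signal_roi, noise_roi):
    """Single-image SNR estimate for a uniform phantom.

    signal_roi and noise_roi are (row_slice, col_slice) pairs selecting a
    central region of the phantom and a background (air) region free of
    ghosting.  The 0.655 factor corrects for the Rayleigh statistics of
    background noise in a magnitude-reconstructed image."""
    signal_mean = image[signal_roi].mean()
    noise_sd = image[noise_roi].std(ddof=1)
    return 0.655 * signal_mean / noise_sd

# Synthetic 256 x 256 test image: uniform "phantom" region plus noise
rng = np.random.default_rng(0)
img = rng.normal(0.0, 5.0, (256, 256))
img[64:192, 64:192] += 1000.0                     # phantom signal
centre = (slice(96, 160), slice(96, 160))         # ROI inside the phantom
background = (slice(0, 32), slice(0, 32))         # ROI in air
print(round(snr_background_roi(img, centre, background), 1))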
SNR was the test most commonly performed within a QA programme, typically on a daily to weekly basis. This test was always performed on the head coil at hospitals where QA was undertaken. Only half the hospitals extended this test to the body coil, and even fewer hospitals (<13%) tested the other coils available. Hence these hospitals are unable to make any judgement of the long-term performance of these coils.

Another goal of QA is to maintain equipment calibrations, to ensure that there is no image distortion and that any numerical quantities measured are accurate. The importance of this will depend to a great extent on how the diagnostic images are to be interpreted and on the type of diagnostic information required. Hence certain specific QA tests will be more appropriate to some hospitals [9].

Due to the complexity of MRI equipment, all hospitals received regular service visits on a 1–6 monthly basis. During these service visits, the service engineer may undertake some form of QA tests, but this will depend upon the work required and the availability of the MRI equipment. The particular tests undertaken will vary between manufacturers, engineers and systems, and may well not be appropriate for the required diagnostic information. Of note is that 21% of hospitals did not perform any QA measurements themselves, but left these tests to the service engineers. These hospitals did not receive service visits with any greater frequency than the other hospitals. This approach should only be undertaken with due caution, as the users and the service engineers may have different goals. The user may wish to observe and minimize any drift in equipment performance, whereas the engineer is primarily concerned with ensuring that the equipment is just operating within specification. In addition, the responsibility for equipment performance and safety rests primarily with the user and not with the engineers.

Figure 3. Percentage of hospitals where service engineers undertake quality assurance (QA) tests: (a) by frequency; (b) by coil tested (head coil, body coil and other coils).

This survey was designed, in part, to capture a representative sample of MRI QA practice across hospitals in England. It should, of course, be borne in mind that the final response of 24 completed questionnaires from the random selection of 45 trusts might allow the possibility of bias if the response, be it positive or negative, was related to the extent of QA undertaken. There was no evidence, however, that this was the case and, indeed, it is possible to argue both for and against a bias towards hospitals that undertake greater QA. Moreover, any such bias would not affect the major finding of this study, namely that the range and frequency of MRI QA vary substantially between different hospitals.

Quality is a core principle of both clinical governance and the NHS Plan [10]. In terms of medical imaging, this means the assurance that equipment is always working optimally, so that the quality of, and confidence in, the diagnosis can be maintained. However, effective QA is also about maintaining a balance between undertaking sufficient meaningful tests to ensure the equipment is operating optimally and ensuring that the resource is used effectively in scanning patients. In the current climate, patient throughput is essential to managing waiting lists; hence too much QA may also be deleterious to the diagnostic service.

Conclusion
This national survey indicated that 21% of hospitals did not undertake any form of in-house QA on their MR scanners, and as such may be out of step with national practice. Among those hospitals that did undertake QA, there was a wide variation in the number and frequency of the tests carried out, and even for the most common test (SNR measurement) there was a wide variation in the frequency of testing and in the range of coils tested. A national standard would help users to achieve the balance between maintaining quality and effective patient throughput.

Acknowledgments
The authors wish to thank the MR staff and the service engineers who took part in this survey.

References
1. Imaging and radiodiagnostics (KH12). Department of Health, October 2004. http://www.performance.doh.gov.uk/hospitalactivity
2. McRobbie DW, Quest RA. Effectiveness and relevance of MR acceptance testing: results of an 8 year audit. Br J Radiol 2002;75:523–31.
3. The Institute of Physics and Engineering in Medicine. Recommended standards for the routine performance testing of diagnostic X-ray imaging systems. IPEM Report 91. York: IPEM, 2005.
4. The Institute of Physics and Engineering in Medicine. Quality control of gamma cameras and associated computer systems. IPEM Report 66. York: IPEM, 1997.
5. The Ionising Radiations Regulations 1999. London: The Stationery Office, 2000.
6. The Institute of Physics and Engineering in Medicine. Quality control in magnetic resonance imaging. IPEM Report 80. York: IPEM, 2000.
7. Colombo P, Baldassarri A, Del Corona M, Mascaro L, Strocchi S. Multicenter trial for the set-up of a MRI quality assurance programme. Magn Reson Imaging 2004;22:93–101.
8. Lerski RA, De Certaines JD. Performance assessment and quality control in MRI by Eurospin test objects and protocols. Magn Reson Imaging 1993;11:817–33.
9. Barker GJ, Tofts PS. Semiautomated quality assurance for quantitative magnetic resonance imaging. Magn Reson Imaging 1992;10:585–95.
10. The NHS Plan: a plan for investment, a plan for reform. London: The Stationery Office, 2000.
