Diana Guimarães,a Tracie M. Cleaver,a Steven F. Martin,b and Patrick J. Parsons*ac
aLaboratory of Inorganic and Nuclear Chemistry, Wadsworth Center, New York State Department of Health, PO Box 509, Albany, NY 12201-0509, USA. E-mail: patrick.parsons@health.ny.gov
bBureau of Community Environmental Health and Food Protection, Center for Environmental Health, New York State Department of Health, Albany, NY 12237, USA
cDepartment of Environmental Health Sciences, School of Public Health, University at Albany, State University of New York, Albany, NY 12201-0509, USA
First published on 24th September 2014
Childhood lead poisoning remains a significant public health issue, especially in the United States, where the most common source of exposure is lead-based paint (LBP). X-ray fluorescence (XRF) analysis is still the most widely used method for detecting LBP in the field. Although portable XRF instrumentation based on excitation from a 57Co radioisotope has been used for more than 30 years, there have been few reports documenting its performance. Here we describe a study that was conducted by the New York State Department of Health's Wadsworth Center laboratory in response to concerns raised by field users of the RMD LPA-1 XRF analyzer (Protec Instrument Corp.) working across the state. The performance issues were investigated for ten field units: five reported as problematic based on user feedback, and five that were not. Accuracy was assessed against NIST SRM 2579 “Lead in Paint”, which was developed specifically for use with portable XRF analyzers. On average, the absolute bias found was within ±20% at the threshold value for LBP (1.0 mg cm−2) based on the NIST SRM 2579 data. Calibration blocks provided with each analyzer for quality assurance monitoring were evaluated using a different XRF analyzer (Niton XL3t 700s GOLDD) operated in painted products mode (μg cm−2). However, when the Niton XRF analyzer was checked against NIST SRM 2579, it was found to have a negative bias. That negative bias was easily corrected using a “calibration” curve with a quadratic fit to the data. NIST-corrected data obtained for the calibration blocks showed assigned values were within the manufacturer's stated tolerance range, albeit with a consistent positive bias. The root cause for three of the five problematic devices was likely incorrect positioning of the device. A low bias for a fourth device was likely caused by a deteriorated calibration block, and the fifth device, while just within the manufacturer's technical specifications, was the only one confirmed with a low bias. Increased operator training may resolve some of the issues reported in the field; ongoing competency assessments may be warranted for this hand-held technology.
The use of LBP for interior applications was banned in 1971 under the Lead-Based Paint Poisoning Prevention Act (LBPPPA),5 which defined LBP as paint containing more than 1 wt% lead (10000 mg kg−1); this threshold was subsequently reduced to 0.5 wt% in 1973. In 1977 the US Consumer Product Safety Commission (CPSC) issued a final ban on lead-containing paint for residential use and on toys and coated furniture, and lowered the permissible amount from 0.5 wt% (5000 mg kg−1) to 0.06 wt% (600 mg kg−1).6 Later, in the early 1990s, the US Department of Housing and Urban Development (HUD) defined LBP as greater than or equal to 1.0 mg cm−2, or 0.5% by weight.7 In 2008, the CPSC reduced the permissible amount of lead in new painted coatings from 0.06 wt% to 0.009 wt% (90 mg kg−1), and set the limit for total lead content of children's consumer products to 0.01 wt%, or 100 mg kg−1.8 Although the limit now permitted for lead in paint on children's toys and products has been reduced to 90 mg kg−1, the household LBP compliance level set by HUD remains at 0.5 wt%.7
Portable X-ray fluorescence (XRF) analyzers have been used to screen homes for LBP for over 40 years. In 1971, Laurer et al. published one of the first papers describing the development of an XRF instrument for the in situ determination of lead in wall paint.9 As the control of lead exposure in public housing became a priority, HUD developed guidelines for using field portable XRF instruments to conduct home inspections and identify the presence of LBP.7
XRF analysis is based on the principle that all elements emit characteristic X-ray electromagnetic radiation when exposed to a suitable excitation source. For portable XRF analytical purposes, fluorescence X-rays are produced when a high-energy photon interacts with inner shell electrons, ionizing the atom and creating a vacancy in one of the fully occupied inner shells. The transition of an electron from an outer shell to fill this vacancy results in emission of a characteristic X-ray. The primary advantages of XRF devices are that they give fast, real-time results, are non-destructive, and offer a relatively low cost per analysis; they also allow a large number of homes to be screened readily with no damage to the painted surfaces.
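For lead specifically, the excitation condition is easily checked against standard tabulated X-ray energies (these values are from general reference tables, not from this study): the 122 keV photons from a 57Co source comfortably exceed the Pb K absorption edge, so the K series lines can be excited:

```latex
E_{\gamma}({}^{57}\mathrm{Co}) = 122\ \text{keV} \;>\; E_{K\text{-edge}}(\mathrm{Pb}) \approx 88.0\ \text{keV}
\;\Rightarrow\;
\mathrm{Pb}\ K\alpha_1 \approx 75.0\ \text{keV},\quad K\beta_1 \approx 84.9\ \text{keV}
```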
Earlier XRF technologies relied mostly on radioisotopes as the excitation source, and proportional counters as detectors to capture the K-shell fluorescence photons from lead. While progress in XRF technology and digital electronics has resulted in performance improvements for benchtop analyzers, detection limits for many portable XRF analyzers have remained largely unchanged. Recently, new XRF technology based on doubly curved crystal optics coupled with silicon drift detectors has been used to determine lead in paint layers and similar coatings, or in substrates and homogeneous materials.10 This new XRF technology was approved by the CPSC in 2010 for demonstrating conformance and producing reliable data, comparable to laboratory analytical methods, for the identification of LBP.11
Other field portable techniques, such as ultrasonic extraction with anodic stripping voltammetry (UE/ASV) and simple chemical spot tests, have also been used to determine lead in paint. However, both techniques have disadvantages. UE/ASV is destructive and time consuming, since it depends on grinding the sample.12 Chemical spot tests are not quantitative, have a limited lifetime, are destructive, and have considerable rates of false positive and false negative errors.13 These disadvantages make portable XRF instruments the preferred method for identifying LBP.
In 2013 the New York State Department of Health (NYSDOH) Lead Poisoning Prevention Program received several reports from local public health agencies (e.g., county, city and district health departments) expressing concerns about the performance of their portable XRF analyzers during standard calibration check procedures. Currently, more than 50 hand-held XRF analyzers (RMD LPA-1) are in active use among local health departments located throughout New York State.
The principal goal of this report is to document several problems raised by field users of the RMD LPA-1 XRF analyzer, and describe the results of a subsequent study carried out to characterize the analytical performance of 10 field units. Specifically, some users reported that quality control data for their calibration check blocks had shown a downward shift from 2010 to 2011. While the downward shift was reported to be within the manufacturer's specifications provided on the HUD performance characteristic sheet, the trend raised concerns since the root cause remained unknown. Consequently, the study design included an assessment of the accuracy and precision of these analyzers using standard reference materials (SRM) for LBP developed and certified for lead content by the US National Institute of Standards and Technology (NIST). An assessment was also carried out on the calibration blocks that are provided with each analyzer for quality assurance purposes and that are used to track daily performance. The calibration block assessment was accomplished using an independent, portable XRF analyzer from Thermo Scientific, operated in a “painted product calibration mode”, wherein results for LBP are reported based on area measurement units (μg cm−2).
The accuracy of these RMD LPA-1 XRF devices was evaluated against NIST SRM 2579 “Lead in Paint”, an international reference standard developed specifically for XRF analyzers. Complete details for NIST SRM 2579 are provided below (see “Study samples”). Using these SRMs, we obtained an instrument-specific “response curve” for each LPA-1 device, and used it to assess performance and bias based on two replicate measurements of each standard. All the RMD LPA-1 units were operated in time corrected mode for this study. Triplicate measurements were obtained on each calibration, or quality control (QC), block provided with each RMD LPA-1 unit. These measurements were obtained at the beginning and at the end of each analytical run, in accordance with the standard operating procedures (SOP) established for this analysis. The Pb content of these QC calibration blocks was evaluated using an independent XRF analyzer (Thermo Niton XL3t) operated in a painted product calibration mode (μg cm−2), to assess inter-block agreement (three replicates, at different positions). While this approach did not address accuracy (or trueness), it does provide an opportunity to compare data from another XRF technology against the value assigned for use with the RMD LPA-1. As a result of these studies, the Niton XL3t was also evaluated against NIST SRM 2579 “Lead in Paint”.
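The response-curve calculation itself is straightforward. A minimal sketch is given below; the found values are illustrative placeholders rather than data from this study, and numpy is our choice of tool (the fitting software used is not specified in the paper):

```python
import numpy as np

# NIST SRM 2579 certified Pb values (mg cm^-2), levels IV-I (Table 1)
certified = np.array([0.29, 1.02, 1.63, 3.53])

# Mean of duplicate readings from one hypothetical LPA-1 unit
# (illustrative placeholders, not data from this study)
found = np.array([0.33, 1.07, 1.70, 3.55])

# Ordinary least-squares response curve: found = slope*certified + intercept
slope, intercept = np.polyfit(certified, found, 1)
r = np.corrcoef(certified, found)[0, 1]

# Relative bias predicted at the 1.0 mg cm^-2 LBP action level
bias_pct = 100.0 * (slope * 1.0 + intercept - 1.0)
print(f"slope = {slope:.2f}, intercept = {intercept:.2f} mg cm^-2, r = {r:.3f}")
print(f"predicted bias at the action level: {bias_pct:+.0f}%")
```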
The RMD LPA-1 spectrometers14 use a 57Co radioactive source (max 12 mCi, 444 MBq) to excite the Pb K series XRF lines. The source emits mainly 122 keV radiation (about 85%) and has a half-life of 271 days, i.e., approximately 9 months. The spot size is 1.5″ × 1.25″ (approximately 38 × 32 mm), and objects beyond a depth of 3/8″ (approximately 1 cm) do not interfere with the measurement.
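The practical consequence of the 271 day half-life follows from the standard decay relation; for a source of initial activity A0:

```latex
A(t) = A_0 \, 2^{-t/T_{1/2}}, \qquad T_{1/2} = 271\ \text{d}
% e.g. a fresh 444 MBq source retains 444 * 2^{-271/271} = 222 MBq after
% one half-life (~9 months), halving the Pb K fluorescence count rate
```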
A cadmium telluride detector is used that has high sensitivity for the high energy K lines.15 Quantitation is achieved by calibration against samples of known concentration on substrates such as hardwood, cement, etc. The RMD LPA-1 weighs 3 lbs (approximately 1.4 kg), and the operator's radiation dose rate is approximately 0.3 mrem h−1 (3 μSv h−1). This is substantially below the dose rate derived from the annual 5 rem (50000 μSv) total effective dose equivalent for an occupationally exposed adult, assuming 2000 work hours in a year.16
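The arithmetic behind that comparison is simply:

```latex
\frac{5\ \text{rem}}{2000\ \text{h}} = 2.5\ \text{mrem h}^{-1} = 25\ \mu\text{Sv h}^{-1},
\qquad
\frac{0.3\ \text{mrem h}^{-1}}{2.5\ \text{mrem h}^{-1}} \approx 12\%
```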
The device has three testing modes: quick mode, standard mode and time corrected mode. According to the manufacturer, quick mode uses the shortest possible time to achieve a 95% confidence measurement; the closer the reading is to the critical value of 1.0 mg cm−2, the longer the measurement time will be. In standard mode the user selects a measurement time of 5, 10 or 30 seconds. In time corrected mode, the reading time is automatically adjusted based on the age of the source (see Table 5 for details). In this study, time corrected mode was used to ensure that the different source ages were taken into account.
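The manufacturer does not publish the exact adjustment formula, but the durations listed in Table 5 are consistent with a read time that doubles once per half-life, keeping the total counts collected roughly constant as the source decays. The sketch below encodes that assumed model; the ~30 s new-source base time is our own estimate, back-calculated from Table 5:

```python
# Hypothetical reconstruction of time corrected mode: the scaling model and
# the 30 s base read time are assumptions, chosen because they reproduce the
# durations listed in Table 5 to within a few seconds.
T_HALF_DAYS = 271.0        # Co-57 half-life quoted by the manufacturer

def time_corrected_duration(source_age_days: float,
                            base_read_s: float = 30.0) -> float:
    """Scale the read time as 2**(age/half-life) so that the total number
    of counts collected stays roughly constant as the source decays."""
    return base_read_s * 2.0 ** (source_age_days / T_HALF_DAYS)

# e.g. a 10-month-old source (~304 days) gives ~65 s, close to the 63 s
# reported for unit #2 in Table 5
print(f"{time_corrected_duration(304):.0f} s")
```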
The Niton XL3t has several modes, but only the consumer goods mode was used here. The consumer goods mode allows the operator to choose among plastics, metals and ceramics, test all, and painted products sample types. For determination of Pb in coatings, the painted products mode was selected (main filter, 120 s); test results were reported in μg cm−2 but were converted to mg cm−2 to maintain consistency with the RMD LPA-1 technology. The XL3t also provides the option to select a spot size diameter of either 8 mm or 3 mm. In this study both spot sizes were evaluated for their performance characteristics.
The accuracy of each of the 10 XRF instruments was assessed using NIST SRM 2579. SRM 2579 “Lead in Paint” consists of a set of five Mylar sheets, four coated with a lead-containing paint and one coated with a lead-free lacquer layer. Certified values for Pb content are provided by NIST in units of mg cm−2 along with the estimated uncertainty (Table 1). XRF measurements were taken for each SRM level by placing the sheet on top of the QC calibration block, blank side facing up, to ensure consistency across instruments based on a wood substrate.
Table 1 Certified Pb values for NIST SRM 2579 “Lead in Paint”

Level | Lead concentration (mg cm−2) | Estimated uncertainty (mg cm−2) |
---|---|---|
I | 3.53 | 0.24 |
II | 1.63 | 0.08 |
III | 1.02 | 0.04 |
IV | 0.29 | 0.01 |
Blank | <0.0001 | — |
Table 2 Response curve parameters (found vs. NIST certified values) for the ten RMD LPA-1 units based on NIST SRM 2579

Unit # | Correlation coefficient | Slope | y-Intercept (mg cm−2) |
---|---|---|---|
1 | 0.999 | 0.99 ± 0.02 | 0.04 ± 0.03 |
2 | 0.997 | 1.07 ± 0.04 | −0.12 ± 0.06 |
3 | 1.000 | 1.15 ± 0.01 | −0.18 ± 0.02 |
4 | 0.998 | 1.02 ± 0.03 | 0.05 ± 0.05 |
5 | 0.999 | 1.02 ± 0.01 | 0.11 ± 0.03 |
6 | 0.999 | 1.00 ± 0.02 | 0.15 ± 0.03 |
7 | 1.000 | 1.17 ± 0.01 | −0.16 ± 0.01 |
8 | 1.000 | 1.06 ± 0.01 | 0.02 ± 0.02 |
9 | 0.999 | 1.04 ± 0.02 | −0.06 ± 0.03 |
10 | 0.999 | 1.08 ± 0.02 | −0.21 ± 0.04 |
Difference plots were generated for each local health agency (identified arbitrarily as A through F) based on the data from NIST SRM 2579. In these plots the discrepancy between found and certified values is plotted as a function of the certified value (Fig. 1). The NIST expanded uncertainty is also shown, to allow an evaluation of RMD performance as a function of increasing NIST certified concentration. Examination of these plots shows the absence of any major bias at low levels of LBP (NIST blank, NIST level IV) for three XRF instruments (#1, #4 and #8). However, the latter two instruments do show a positive bias for NIST SRM level III (certified at 1.02 mg cm−2), which is close to the current action limit (1.0 mg cm−2) for residential LBP.
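A difference plot of this kind is simple to construct; the sketch below uses the certified values from Table 1 together with illustrative found values (not data from this study):

```python
import numpy as np
import matplotlib.pyplot as plt

# NIST SRM 2579 certified values and expanded uncertainties (Table 1)
certified = np.array([0.29, 1.02, 1.63, 3.53])
u_exp = np.array([0.01, 0.04, 0.08, 0.24])

# Illustrative found values for one XRF unit (placeholders only)
found = np.array([0.33, 1.07, 1.70, 3.55])

plt.plot(certified, found - certified, "o", label="found - certified")
plt.fill_between(certified, -u_exp, u_exp, alpha=0.3,
                 label="NIST expanded uncertainty")
plt.axhline(0.0, color="k", lw=0.8)
plt.xlabel("NIST certified Pb (mg cm$^{-2}$)")
plt.ylabel("Difference (mg cm$^{-2}$)")
plt.legend()
plt.show()
```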
Other XRF units (#2, #3, #7, #9 and #10) exhibited a negative bias at low levels, and this was most troublesome for unit #10, where a negative bias so close to the 1.0 mg cm−2 action threshold was especially concerning (Fig. 1F).
Fig. 2 shows the distribution of reported values obtained for NIST SRM 2579 “Lead in Paint” for each of the 10 XRF units. The data are ranked from low to high concentration, and arbitrary performance specifications (±10%, ±20%, ±30%) are shown so that inter-unit performance can be assessed. On average, the 10 RMD LPA-1 portable XRF units yielded results for NIST SRM 2579 that were accurate to within ±20% of the certified value for levels I through III, and thus are fit-for-purpose for screening homes for LBP at the action level of 1.0 mg cm−2. However, the average performance at 0.29 mg cm−2 for 6 of the 10 units tested exceeded ±30% (Fig. 2b).
Fig. 2 Distribution of found values for NIST SRM 2579 “Lead in Paint” from ten RMD LPA-1 XRF units ranked from low to high concentration, and showing arbitrary performance specifications.
Each RMD LPA-1 instrument comes with a calibration check block that is used in the field to assure that the device is operating within its tolerance limits. In this study, each of the 10 calibration blocks was measured on its respective instrument, according to the established field procedure. The manufacturer specifies that measurements should be within ±0.3 mg cm−2 of the assigned target value. Fig. 3 shows the overall performance achieved for the XRF units based on analysis of their respective calibration blocks: eight blocks assigned 1.0 mg cm−2 and two assigned 1.9 mg cm−2. The test showed all instruments operating within the tolerance range specified for the QC block, with the possible exception of one device (#9), where the performance was borderline passing (Fig. 3). Of the five devices that were reportedly reading low on their calibration check, two (#10 and #9) were consistently low, but nonetheless operating within the tolerance range for the calibration block. In fact, unit #10 also exhibited a low bias for NIST SRM 2579 level III, while unit #9 yielded results that were within 10% for NIST SRM levels III and II.
Fig. 3 Performance of all the RMD LPA-1 instruments for the respective calibration blocks. Error bars represent the SD.
Measurement repeatability based on the calibration block data (Fig. 3), defined as the % relative standard deviation (RSD), ranged from 3% to 10%. Fig. 3 indicates that unit #9 may have a negative bias, since half the found values were 1.5 mg cm−2 (acceptable range: 1.6–2.2 mg cm−2). However, when the RMD LPA-1 instruments were crosschecked against NIST SRM 2579 (Fig. 2), the bias for unit #9 ranged from just 3% at 1.02 mg cm−2 to −2% at 1.63 mg cm−2. This suggests that the root cause of the bias observed in Fig. 3 is likely the specific calibration block provided with the device. It was noted later that the values of 1.5 mg cm−2 were all obtained from one (left) side of the calibration block, which appeared on visual inspection to be in less than optimal condition. By contrast, the other (right) side of the block yielded higher values of 1.7 and 1.9 mg cm−2.
As a practical measure, we chose to analyze the NIST materials in duplicate to obtain a more robust estimate of the Pb content. In contrast, the number of measurements (n = 3) taken for the calibration blocks (Table 3) is consistent with the HUD protocol requirements for QC in the field. However, the same protocol requires just a single screening measurement of painted surfaces in the field. Of course, uncertainty decreases as the number of measurements increases, so this aspect of performance (i.e., the uncertainty) will be larger when just a single measurement is made.
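Quantitatively, the standard uncertainty of the mean of n repeat readings with sample standard deviation s scales as:

```latex
u(\bar{x}) = \frac{s}{\sqrt{n}}
% triplicate QC measurements (n = 3) therefore carry ~1.7x less
% uncertainty than a single screening measurement (n = 1)
```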
Table 3 Pb content of the RMD LPA-1 calibration blocks determined using the Niton XL3t operated in painted products mode

Unit # | Reference value (mg cm−2) | Accept. range (mg cm−2) | Found value,a small spot, mean (%RSD) | Found value,a large spot, mean (%RSD) | NIST corrected value,b small spot, mean (%RSD) | NIST corrected value,b large spot, mean (%RSD)
---|---|---|---|---|---|---
1 | 1.0 | 0.7–1.3 | 0.997 (1) | — | 1.141 (1) | —
2 | 1.0 | 0.7–1.3 | 1.022 (1) | — | 1.173 (1) | —
3 | 1.0 | 0.7–1.3 | — | 0.919 (1) | — | 1.198 (1)
4 | 1.9 | 1.6–2.2 | — | 1.546 (2) | — | 2.218 (2)
5 | 1.0 | 0.7–1.3 | 0.998 (1) | 0.862 (2) | 1.142 (1) | 1.115 (2)
6 | 1.0 | 0.7–1.3 | 1.053 (3) | 0.890 (2) | 1.212 (3) | 1.155 (2)
7 | 1.0 | 0.7–1.3 | — | 0.930 (1) | — | 1.214 (1)
8 | 1.0 | 0.7–1.3 | — | 0.911 (1) | — | 1.186 (1)
9 | 1.9 | 1.6–2.2 | — | 1.522 (2) | — | 2.174 (2)
10 | 1.0 | 0.7–1.3 | 1.026 (1) | — | 1.178 (1) | —

a Niton found values (μg cm−2) were multiplied by 10−3 to be consistent with the units used to report results using the LPA-1 (mg cm−2). b NIST “corrected values” (mg cm−2) were obtained from a bias calibration curve traceable to NIST SRM 2579 (see Fig. 4 and text for details).
Seven calibration blocks were analyzed using the 8 mm spot size and five were analyzed using the 3 mm spot size; two calibration blocks were analyzed using both spot sizes. Within-block consistency was assessed by triplicate measurements at different locations, and precision estimates of <5% RSD were judged as acceptable consistency. Table 3 shows the values obtained for Pb content in the calibration blocks using the Niton, operated in painted products mode, which reports in units of μg cm−2. These values were multiplied by 10−3 to make them consistent with the units (mg cm−2) used by the RMD LPA-1 instruments. Found values typically had RSDs of <3%, based on triplicate measurements. In Table 3, the found values obtained using the Niton XL3t show a greater negative bias with the 8 mm beam spot size than with the 3 mm spot size, relative to the assigned value. Indeed, for the two higher QC Pb values (1.9 mg cm−2), the found values by Niton using the 8 mm spot size fall outside the range deemed acceptable (1.6–2.2 mg cm−2). It should be noted that the “reference value” for Pb content assigned to each calibration QC block lacks a statement of traceability. However, we were able to calculate NIST-“corrected values” (Table 3) using data obtained from the Niton XL3t with traceability to NIST SRM 2579, as explained below.
During the course of the experimental study, we analyzed SRM 2579 using the Niton XL3t. Table 4 shows the analytical performance data obtained for the Niton XL3t with SRM 2579. The found values were multiplied by 10−3 for reporting unit consistency (mg cm−2). It is evident from the data that the specific Niton instrument used here has a negative bias relative to the NIST certified values, and the bias appears to be worse for the 8 mm spot size. We were able to correct the Niton found values by fitting the NIST data to a quadratic function to yield two calibration curves, as shown in Fig. 4: one for the large spot size and one for the small spot size. The curvature observed is probably due to absorption of the lower energy Pb L lines in the higher concentration samples; by contrast, the RMD unit relies on the much higher energy K lines, which are not absorbed at higher Pb content. The “NIST corrected” values shown in Table 4 virtually eliminate the negative bias, and thereby permit an independent validation of the assigned values for the RMD calibration blocks. Table 3 reflects the “NIST corrected” values found for the RMD calibration blocks using the Niton, likewise converted by a factor of 10−3 for reporting unit consistency (mg cm−2). Based on the “NIST-corrected” data, the assigned values for the RMD calibration blocks are within the manufacturer's stated tolerance range, albeit with a positive bias.
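A minimal sketch of that correction, fitting the certified values as a quadratic function of the Niton readings using the 3 mm spot data from Table 4 (the fitting direction and the use of numpy are our assumptions; the paper does not specify the fitting software):

```python
import numpy as np

# Niton XL3t found values (3 mm spot) vs NIST certified values, from Table 4
found_3mm = np.array([0.277, 0.905, 1.357, 2.425])   # mg cm^-2
certified = np.array([0.29, 1.02, 1.63, 3.53])       # mg cm^-2

# Quadratic bias-correction curve, certified = f(found), as in Fig. 4a
coeffs = np.polyfit(found_3mm, certified, 2)

def nist_correct(raw_reading: float) -> float:
    """Map a raw Niton reading onto the NIST SRM 2579 traceable scale."""
    return float(np.polyval(coeffs, raw_reading))

# e.g. the unit #5 calibration block read 0.998 mg cm^-2 (Table 3);
# the corrected value comes out near the 1.142 mg cm^-2 listed there
print(f"{nist_correct(0.998):.3f} mg cm^-2")
```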
Table 4 Analytical performance of the Niton XL3t 700s GOLDD for NIST SRM 2579 “Lead in Paint”

NIST SRM 2579 | Certified value ± U (mg cm−2) | Found valuea ± SD (n = 3), small spot 3 mm | Bias % | Found valuea ± SD (n = 3), large spot 8 mm | Bias % | NIST corrected valueb ± SD (n = 3), small spot 3 mm | Bias % | NIST corrected valueb ± SD (n = 3), large spot 8 mm | Bias %
---|---|---|---|---|---|---|---|---|---
Blank | <0.0001 | <LOD | — | <LOD | — | <LOD | — | <LOD | —
Level IV | 0.29 ± 0.01 | 0.277 ± 0.004 | −4 | 0.240 ± 0.002 | −17 | 0.295 ± 0.005 | 2 | 0.289 ± 0.002 | 0
Level III | 1.02 ± 0.04 | 0.905 ± 0.011 | −11 | 0.776 ± 0.004 | −24 | 1.025 ± 0.012 | 0 | 0.993 ± 0.004 | −3
Level II | 1.63 ± 0.08 | 1.357 ± 0.018 | −17 | 1.219 ± 0.007 | −25 | 1.622 ± 0.022 | 0 | 1.657 ± 0.009 | 2
Level I | 3.53 ± 0.24 | 2.425 ± 0.013 | −31 | 2.143 ± 0.010 | −39 | 3.524 ± 0.018 | 0 | 3.536 ± 0.016 | 0

a Niton found values (μg cm−2) were multiplied by 10−3 to be consistent with the units used on the NIST certificate (mg cm−2). b NIST “corrected values” (mg cm−2) were obtained from a bias calibration curve traceable to NIST SRM 2579 (see Fig. 4 and text for details). <LOD: below the limit of detection.
Fig. 4 NIST SRM 2579 calibration curves for the Niton XL3t 700s GOLDD XRF showing quadratic fits for (a) the 3 mm spot size and (b) the 8 mm spot size option.
Table 5 Time corrected mode duration, approximate source age, and measurement modes reported or observed for each RMD LPA-1 unit

Unit # | Time corrected mode duration (s) | Approx. age of source | Mode reported as used during inspection | Mode reported as used during calibration check | Mode found at start up
---|---|---|---|---|---
1 | 87 | 1 year and 2 monthsa | Quick mode | Time corrected | Quick mode
2 | 63 | 10 months | Quick mode | Quick mode | Standard mode
3 | 118 | 1 year and 6 months | Quick mode | Quick mode | Quick mode
4 | 104 | 1 year and 4 months | Quick mode | Time corrected | Time corrected
5 | 70 | 11 months | Quick mode | Time corrected | Time corrected
6 | 67 | 9.5 months | Quick mode | Time corrected | Time corrected
7 | 44 | 5 months | Quick mode | Quick mode | Quick mode
8 | 102 | 1 year and 4 months | Quick mode | Quick mode | Quick mode
9 | 42 | 4 months | Quick mode | Time corrected | Quick mode
10 | 70 | 11 months | Quick mode | Quick mode | Quick mode

a Purchased in December 2011; the source has not been replaced since then.
With the exception of one device (unit #9), the calibration blocks provided by RMD for use with the LPA-1 performed satisfactorily and data were within the range of values expected. The outlier calibration block appeared to have undergone some deterioration. Some judgment may be required when using calibration blocks to ensure the absence of visible wear and tear. An independent analysis of the RMD LPA-1 calibration blocks using the “NIST corrected” data from the Niton XL3t 700s GOLDD XRF yielded acceptable data for both spot sizes.
During the course of this study, it was observed that improper use of the RMD LPA-1 could easily lead to incorrect readings. For example, it was noted that readings in standard mode were adversely affected when the tip was not placed correctly in a plane perpendicular to the calibration block, i.e., when the tip was slightly raised and the unit was not flush with the block. Such readings are typically lower than they should be. However, this effect is not so evident in either quick mode or time corrected mode. Based on our observation that one of the devices had previously been used in standard mode, we conclude that end-users appear to be using different measurement modes. The NYSDOH requires that all local health units carrying out testing for LBP by XRF follow the instrument manufacturer's instructions, which in turn follow the HUD guidelines.7 However, the current protocol provides several options as to which measurement mode should be used, and end users may well be confused as to which mode to use. We recommend that protocols specify a single measurement mode for all field users, or provide more clarity on when and how to use the alternate measurement modes. Supervisors should ensure that field personnel are following the standard operating procedures correctly. Additionally, it is good QC practice to use the same mode for both the calibration check and the analysis of paint samples.
While not investigated here, use of the quick mode (where the measurement time is varied to give a 95% confidence measurement) may sacrifice accuracy to increase testing throughput. However, this may be justified for screening purposes, since quick mode eliminates the need to adjust the measurement time based on the age of the source. Further investigation of the quick mode may be warranted.