5
Methods for Assessing Exposure to Lead

INTRODUCTION

The purpose of this chapter is to discuss analytic methods to assess exposure to lead in sensitive populations. The toxic effects of lead are primarily biochemical, but a rapidly expanding research database indicates that lead has adverse effects on multiple organ systems, especially in infants and children. The early evidence of exposure, expressed by the age of 6–12 months, shows up in prenatal or postnatal blood as lead concentrations that are common in the general population and that until recently were not considered detrimental to human health (Bellinger et al., 1987, 1991a; Dietrich et al., 1987a; McMichael et al., 1988). As public-health concerns are expressed about low-dose exposures (Bellinger et al., 1987, 1991a; Dietrich et al., 1987a; McMichael et al., 1988; Landrigan, 1989; Rosen et al., 1989; Mahaffey, 1992), the use of currently applicable methods of quantitative assessment and the development of newer methods will generate more precise dosimetric information on small exposures in members of sensitive populations.

Ultraclean techniques have repeatedly shown that previously reported concentrations of lead can be erroneously high by a factor of several hundred (Patterson and Settle, 1976). The flawed nature of some reported lead data was initially documented in oceanographic research: reported concentrations of lead in seawater have decreased by a factor of 1,000 because of improvements in the reduction and control of lead contamination during sampling, storage, and analysis (Bruland, 1983). Parallel decreases have recently been noted in reports on lead concentrations








in fresh water (Sturgeon and Berman, 1987; Flegal and Coale, 1989). Similar decreases in concentrations of lead in biologic materials have been reported by laboratories that have adopted trace-metal clean techniques. The decreases have been smaller, because lead concentrations in biologic matrixes are substantially larger than concentrations in water, and the amounts of contaminant lead introduced during sampling, storage, and analysis are similar. Nevertheless, one study revealed that lead concentrations in some canned tuna were 10,000 times those in fresh fish, and that the difference had been overlooked for decades because all previous analyses of lead concentrations in fish were erroneously high (Settle and Patterson, 1980). Another study demonstrated that lead concentrations in human blood plasma were much lower than reported (Everson and Patterson, 1980). A third demonstrated, with trace-metal clean techniques, that natural lead concentrations in human calcareous tissues of ancient Peruvians were approximately one five-hundredth those in contemporary adults in North America (Ericson et al., 1991).

Problems of lead contamination are pronounced because of the ubiquity of lead, but they are not limited to that one element. Iyengar (1989) recently reported that it is not uncommon to come across order-of-magnitude errors in scientific data on concentrations of various elements in biologic specimens. The errors were attributed to failure to obtain valid samples for analysis and use of inappropriate analytic methods. The former includes presampling factors and contamination during sampling, sample preservation and preparation, and specimen storage. The latter includes errors in choice of methods and in the establishment of limits of detection and quantitation, calibration, and intercalibration (Taylor, 1987).
Decreases in blood lead concentrations reportedly are associated with the decrease in atmospheric emissions of gasoline lead aerosols. The correlation between the decreases in blood lead and gasoline lead emissions is consistent with other recent observations of decreases in environmental lead concentrations associated with decreases in atmospheric emissions of industrial lead (Trefry et al., 1985; Boyle et al., 1986; Shen and Boyle, 1988). However, the accuracy of the blood lead analyses has not been substantiated by rigorous, concurrent intercalibration with definitive methods that incorporate trace-metal clean techniques

in ultraclean laboratories. Moreover, previous blood lead measurements cannot be corroborated now, because no aliquots of samples have been properly archived. Nonetheless, within the context of internally consistent and carefully operated chemical research laboratories, valuable blood analyses have been obtained. Future decreases in blood lead concentrations will be even more difficult to document, because the problems of lead contamination will be greater.

Figure 5-1 depicts the relative amounts of blood lead and contaminant lead measured in people with high (50 µg/dL) and low (1 µg/dL) blood lead concentrations when amounts of contaminant lead introduced during sampling, storage, and analysis were kept constant (1 µg/dL each). The blood lead concentration measured in the person with a high blood lead concentration (53 µg/dL) will be relatively accurate to within 6%, because the sum of contaminant lead is small relative to blood lead. The same amount of contaminant lead, however, will erroneously increase the measured blood lead concentration of the other person by a factor of 4 (i.e., to 400%). That would seriously bias studies of lead metabolism and toxicity in the latter person. It would also lead to the erroneous conclusion that there was only about a 12-fold difference, rather than a 50-fold difference, in the blood lead concentrations of the two people. Both problems will become more important as the average lead concentration in the population decreases and as more studies focus on the threshold of lead toxicity.

In general, techniques to measure internal doses of lead involve measurement of lead in biologic fluids. Tissue concentrations of lead also provide direct information on the degree of lead exposure after lead leaves the circulation by traversing the plasma compartment and gaining access to soft and hard tissues.
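The contamination arithmetic illustrated in Figure 5-1 can be sketched in a few lines (a minimal illustration using the chapter's numbers; the function name and structure are ours, not part of the report):

```python
def measured_blood_lead(true_conc_ug_dl, contaminants_ug_dl=(1.0, 1.0, 1.0)):
    """Measured value = true blood lead plus contaminant lead introduced
    during sampling, storage, and analysis (1 ug/dL each, as in Figure 5-1)."""
    return true_conc_ug_dl + sum(contaminants_ug_dl)

high = measured_blood_lead(50.0)  # 53.0 ug/dL; relative error (53 - 50)/50 = 6%
low = measured_blood_lead(1.0)    # 4.0 ug/dL; a 4-fold overestimate
apparent_ratio = high / low       # ~13-fold, versus the true 50-fold difference
```

The same 3 µg/dL of contaminant lead is negligible at 50 µg/dL but dominates at 1 µg/dL, which is why contamination control becomes more critical as population blood lead concentrations fall.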
Once lead leaves the circulation and enters critical organs, toxic biochemical effects become expressed. For the protection of public health from lead toxicity, it is of great importance to be able to discern the quantities of lead in target organs that are prerequisites for biochemically expressed toxic effects to become evident. Such determinations have been difficult, if not impossible, in humans, but lead measurement in the skeleton and placenta might make them more approachable with respect to fetuses, infants, women of child-bearing age, and pregnant women. Furthermore, measurements of lead in the skeletons of workers in lead industries have substantial potential for revealing the body burden of lead required for evidence of biochemical

FIGURE 5-1 Illustration of relative problems of contamination in analysis of high and low blood lead concentrations. Source: Adapted from Flegal and Smith, 1992.

toxicity to become manifest. Hence, measurements of lead in bone and placenta have the potential to couple quantitative analyses of lead at the tissue level to biochemical expressions of toxicity at the cellular level. Noninvasive x-ray fluorescence (XRF) methods of measuring lead in bone, where most of the body burden of lead accumulates, have great promise for relating dosimetric assessments of lead to early biochemical expressions of toxicity in sensitive populations if their sensitivity can be improved by at least a factor of 10. The L-line XRF technique (LXRF) appears to be of potential value for epidemiologic and clinical research related to infants, children, and women of child-bearing age, including studies during pregnancy (Rosen et al., 1989, 1991; Wielopolski et al., 1989; Kalef-Ezra et al., 1990; Slatkin et al., 1992; Rosen and Markowitz, 1993). The K-line XRF method (KXRF) appears to be suited for studies in industrial workers and postmenopausal women, in addition to probing epidemiologic links between skeletal lead stores and both renal disease and hypertension (Somervaille et al., 1985, 1986, 1988; Armstrong et al., 1992). Measurements of bone lead and blood lead in pregnant women throughout the course of pregnancy and assessments of amniotic-fluid lead concentrations and placental lead concentrations at term collectively hold promise for further characterizing the dynamics of maternal-fetal lead transfer.

Clinical research studies are examining epidemiologic issues related to the best measures of exposure and of the duration of exposure. Needleman et al. (1990) have reported that tooth lead concentrations constitute the best short- and long-term predictors of lead-induced perturbations in neurobehavioral outcomes.
Longitudinal studies are examining whether cumulative measures (indexed by bone lead content based on LXRF), exposures during the preceding 30–45 days (indexed by blood lead concentrations), or exposures during critical periods are most important in the CNS effects of lead and in the reversibility of toxic effects on CNS function in children treated promptly with a chelating agent. Some health effects of lead most likely depend on recent exposure, but knowledge of whether exposure was in the preceding few days, few months, or few years is extremely relevant clinically and epidemiologically. Previous reliance on blood lead concentrations alone has limited the use of time in treatment and outcome protocols. The half-time of

lead in blood is short and reflects primarily recent exposure and absorption (Rabinowitz et al., 1976, 1977). Moreover, blood lead concentration does not reflect lead concentrations in target tissues that have different lead uptake and distribution or changes in tissue lead that occur when lead exposure is modified. Even lead in trabecular bone has a shorter duration than does lead in cortical bone. The most appropriate measure will likely vary with the end point in question. It is apparent, however, that current methods can strengthen epidemiologic and treatment efficacy studies by using multiple markers with different averaging times. The recent development of the ability to measure lead averaged over short periods (blood lead), intermediate periods (trabecular bone), and long exposure intervals (cortical bone) promises new techniques for measuring lead exposure in sensitive populations.

SAMPLING AND SAMPLE HANDLING

It is universally accepted that a crucial part of monitoring of lead in biologic material is the quality of sample collection and sample handling. Lead is pervasive and can contaminate samples randomly or systematically. In addition, the lead content of substances can be reduced by inappropriate collection, storage, or pretreatment. Protocols for acceptable sampling and sample handling vary with the material being sampled and the analytic technique being used, but most precautions apply across the board. In all cases, sample containers, including their caps, must be either scrupulously acid-washed or certified as lead-free. That is particularly important for capillary- and venous-blood sampling as now incorporated into the guidelines of the 1991 CDC statement (CDC, 1991). For example, as little as 0.01 µg (10 ng) of contaminant lead entering 100 µL of blood in a capillary tube adds the equivalent of a concentration of 10 µg/dL, the CDC action level.
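The unit arithmetic behind that example can be checked directly (a trivial sketch; the function name is ours):

```python
def added_concentration_ug_dl(contaminant_ng, sample_volume_ul):
    """Concentration (ug/dL) added by a mass of contaminant lead (ng)
    entering a sample of the given volume (uL).
    1 ug = 1,000 ng; 1 dL = 100,000 uL."""
    mass_ug = contaminant_ng / 1_000
    volume_dl = sample_volume_ul / 100_000
    return mass_ug / volume_dl

# 10 ng of contaminant lead entering a 100-uL capillary sample:
added_concentration_ug_dl(10, 100)  # 10 ug/dL, the CDC action level
```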
Reagents added to a biologic sample before, during, or after collection especially must be lead-free. Lead concentrations in plasma or serum are generally so low to begin with, and relative environmental lead concentrations so high, that it is extremely difficult to collect and handle such samples without contamination. Urine sampling, especially the 8-hour or 24-hour sampling associated with chelator administration, requires collection in acid-washed

containers. Although the amounts of lead being removed to urine with chelation are relatively high, the large volumes of sample and correspondingly large surface areas of collection bottles affect contamination potential. A particularly important step in sample collection is the rigorous cleaning of the puncture site for capillary- or venous-blood collecting. The cleaning sequence of particular usefulness for finger puncture, the first step in blood lead screening, is that recommended by the Work Group on Laboratory Methods for Biological Samples of the Association of State and Territorial Public Health Laboratory Directors (ASTPHLD, 1991). Fingers are first cleaned with an alcohol swab, then scrubbed with soap and water and swabbed with dilute nitric acid; and a silicone or similar barrier is used.

Sample storage is very important. Whole blood can be stored frozen for long periods; at -20°C in a freezer, blood samples can be stored for up to a year and perhaps longer. Sample handling within the laboratory entails as much risk of contamination as sample collection in the field. Laboratories should be as nearly lead-free as possible. Although it is probably impractical for most routine laboratories to meet ultraclean-facility requirements (see, e.g., Patterson and Settle, 1976; EPA, 1986a), minimal steps are required, including dust control, use of high-efficiency particulate air (HEPA) filters for incoming air, and use of ultrapure reagents.

Collection and analysis of shed children's teeth entail unavoidable surface contamination, but this complication can be reduced by confining analysis to the interior matrix of a tooth, preferably the secondary (circumpulpal) dentin segment. The contaminated surface material is discarded. Isolation of the secondary dentin requires use of lead-free surfaces of cutting tools, lead-free work surfaces, and so forth.
MEASUREMENT OF LEAD IN SPECIFIC TISSUES

Whole Blood

The most commonly used technique to measure blood lead concentrations involves analysis of venous blood after chemical degradation (for example, wet ashing with nitric acid), electrothermal excitation (in a

graphite furnace), and then measurement with atomic-absorption spectroscopy, or AAS (EPA, 1986a). With AAS, ionic lead is first vaporized and converted to the atomic state; that is followed by resonance absorption from a hollow-cathode lamp. After monochromatic separation and photomultiplier enhancement of the signal, lead concentration is measured electronically (Slavin, 1988). Because it is much more sensitive than flame methods, the electrothermal or graphite-furnace technique permits use of small sample volumes, 5–20 µL. Physicochemical and spectral interferences are severe with flameless methods, so careful background correction is required (Stoeppler et al., 1978). Diffusion of sample into the graphite furnace can be avoided by using pyrolytically coated graphite tubes and a diluted sample applied in larger volumes.

Electrochemical techniques are also widely used for measurement of lead. Differential pulse polarography (DPP) and anodic stripping voltammetry (ASV) offer measurement sensitivity sufficient for lead analyses at blood concentrations characteristic of the average populace. The sensitivity of DPP is close to borderline for this case, so ASV has become the method of choice. It involves bulk consumption of the sample and thus has excellent sensitivity, given a large sample volume (Jagner, 1982; Osteryoung, 1988). This property is, however, of little practical significance, because sample size and reagent blanks are finite. That ASV is a two-step process is advantageous. In the first step, lead is deposited on a mercury thin-film electrode simply by setting the electrode at a potential sufficient to cause lead reduction. The lead is thus concentrated into the mercury film for a specified period, which can be extended when higher sensitivity is needed; few techniques offer such preconcentration as an integral part of the process.
After electrodeposition, the lead is reoxidized (stripped) from the mercury film by anodically sweeping the potential. Typically, a pulsed or stepping operation is used, so differential measurements of the peak current for lead are possible (Osteryoung, 1988; Slavin, 1988). The detection limit for lead in blood with ASV is approximately 1 picogram (pg) and is comparable with that attainable with graphite-furnace AAS methods. The relative precision of both methods over a wide concentration range is ±5% (95% confidence limits) (Osteryoung, 1988; Slavin, 1988). As noted, AAS requires attention to spectral

interferences to achieve such performance. For ASV, the use of human blood for standards, the presence of coreducible metals and their effects on the measurement, the presence of reagents that complex lead and thereby alter its reduction potential, quality control of electrodes, and reagent purity must all be considered (Roda et al., 1988). It must be noted, however, that the electrodeposition step of ASV is widely used and effective for reagent purification. The practice of adding an excess of other high-purity metals to samples, thereby displacing lead from complexing agents and ameliorating their concomitant interference effects, has demonstrated merit. Copper concentrations, which might be increased during pregnancy or in other physiologic states, and chelating agents can cause positive interferences in lead measurements (Roda et al., 1988). The general sensitivity of ASV for lead has led to its use in blood lead analyses. The relative simplicity and low cost of the equipment have made ASV one of the more effective approaches to lead analysis.

As described in Chapter 4, the measurement of erythrocyte protoporphyrin (EP) in whole blood is not a sensitive screening method for identifying lead-poisoned people at blood lead concentrations below 50 µg/dL, according to analyses of results of the NHANES II general population survey (Mahaffey and Annest, 1986). Data from Chicago's screening program for high-risk children recently analyzed by CDC and the Chicago Department of Health indicated further the current limitations of EP for screening. The data, presented in Table 5-1, provide specificity and sensitivity values of EP screening at different blood lead concentrations. The sensitivity of a test is defined as its ability to detect a condition when it is present. The EP test has a sensitivity of 0.351, or about 35%, in detecting blood lead concentrations of 15 µg/dL or greater.
This means that, on average, the EP test result will be high in about 35% of children with blood lead concentrations of 15 µg/dL or greater; it will fail to detect about 65% of those children. As the blood lead concentration of concern increases, the EP test becomes more sensitive. At blood lead concentrations of 30 µg/dL or greater, the sensitivity of the EP test is approximately 0.87. However, if it is used to detect blood lead concentrations of 10 µg/dL or greater, the EP test has a sensitivity of only about 0.25. The specificity of a test is defined as its ability to detect the absence of a condition when that condition is absent. As seen in Table 5-1,

TABLE 5-1 Chicago Lead-Screening Data, 1988-1989 (a)

Definition of Increased   Sensitivity               Specificity   Predictive Value   Prevalence of Increased Blood
Blood Lead, µg/dL         (Confidence Interval) (b)               Positive           Lead as Defined at Left
≥10                       0.252 (0.211-0.294)       0.822         0.734              0.660
≥15                       0.351 (0.286-0.417)       0.833         0.503              0.325
≥20                       0.479 (0.379-0.579)       0.818         0.322              0.152
≥25                       0.700 (0.573-0.827)       0.814         0.245              0.079
≥30                       0.871 (0.753-0.989)       0.806         0.189              0.043
≥35                       1.00 (0.805-)             0.794         0.119              0.030
≥40                       1.00 (0.735-)             0.788         0.084              0.019
≥45                       1.00 (0.590-)             0.782         0.049              0.011
≥50                       1.00 (0.158-)             0.775         0.014              0.003

(a) Data indicate sensitivity, specificity, and predictive value positive of zinc protoporphyrin (ZPP) measurement for detecting increased blood lead concentrations. Increased ZPP is defined as ≥35 µg/dL. Definition of increased blood lead concentration varies. Data derived from a systematic sample (2% of total) of test results for children 6 mo to 6 yr old tested in Chicago screening clinics from July 22, 1988, to September 1, 1989; these clinics routinely measure ZPP and blood lead in all children. n = 642. Data from M.D. McElvaine, Centers for Disease Control, and H.G. Orbach, City of Chicago Department of Health, unpublished; and McElvaine et al., 1991.

(b) Confidence intervals calculated by normal approximation to the binomial method at the 95% level for two tails. For estimates of sensitivity of 1.00, only the lower-tail confidence limit is calculated, by the exact binomial method.
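The columns of Table 5-1 are mutually consistent through Bayes' rule: the predictive value positive can be computed from sensitivity, specificity, and prevalence. A small check (our construction, not part of the report) reproduces the tabulated values for the rows based on larger numbers of children; the high-cutoff rows agree less exactly because of rounding in small counts:

```python
def predictive_value_positive(sensitivity, specificity, prevalence):
    """P(increased blood lead | positive ZPP test), by Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Row ">=15" of Table 5-1: sensitivity 0.351, specificity 0.833, prevalence 0.325
predictive_value_positive(0.351, 0.833, 0.325)  # ~0.503, as tabulated
```

The same calculation shows why the predictive value positive of the table falls so sharply at high cutoffs: as prevalence drops, false positives from the 79–83% specificity swamp the true positives.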

they remain still during the 16.5-minute measurement. Chloral hydrate, which is used extensively in pediatric practice for mild sedation before many electrophysiologic procedures, is not known to have any substantial short-term side effects when appropriately administered. However, more recent concerns regarding its potential carcinogenicity have been raised by the findings of laboratory animal studies. The potential long-term effects have yet to be resolved.

The usefulness of the LXRF technique can probably be expanded by measuring the ratio of the L-line bone lead concentration to the K-line signal in the 10- to 16-keV interval of the XRF spectrum (Rosen et al., 1989). Standard reference materials are now needed for external and internal instrument calibrations for both L-line and K-line techniques. The calibrations should be carried out under strictly defined operating conditions that achieve the minimal detection limit of each instrument, with concurrent measurement of radiation exposure according to recommendations of the ICRP (1991). Internal and external calibrations should be assessed directly by independent experts. Calibration, with the dosimetry and systematic measurements noted earlier, should provide confidence that risk assessments of both L-line and K-line techniques have been thorough.

L-line and K-line XRF techniques are complementary and provide a new, exciting, and needed capability to assess lead exposure that has accumulated over time (many months to several years) in sensitive populations. Both techniques are based on the general principles of x-ray fluorescence, but the current characteristics of each technique indicate that each has specific applications for developing needed information on different sensitive populations.
Given the current state of development of both the L-line and K-line methods, it is not currently recommended that either instrument be used as a screening technique in general populations.

Research Needs

Although L-line and K-line XRF methods are becoming standard techniques to assess previous lead exposure over a person's lifetime, they entail critical research needs that must be addressed before they can be more generally applied for screening of populations.

Even though the radiation doses are low for both techniques, efforts should be made to develop methods to reduce radiation exposures further. Measurements of randomly selected men, in the near future, with both L-line and K-line instruments will provide important information as to whether the two techniques estimate the same or different compartments of lead in bone. Men have been designated for initial comparisons because dosimetry data on men have been detailed and published for the L-line technique, and estimates of dosimetry do not appear to be widely dissimilar for the K-line method in the same population. It is possible that the two methods measure lead from metabolically different skeletal compartments. Accordingly, experimental microlocalization studies of human limbs would be relevant. Such studies could be carried out with proton-induced x-ray emission (PIXE) after careful sectioning of limbs by experts and assessment of tissue sections no more than about 30–40 µm thick.

Before the K-line XRF method can be considered for use in children, women of child-bearing age, and pregnant women, more detailed and systematic studies are required to define dosimetry, precision, minimal detection limits, and clinical utility in these populations. Dosimetry calculations on all tissues considered to be radiosensitive should be carried out according to NCRP and ICRP guidelines. Calibration of both instruments with standard reference materials or amputated limbs that parallel the mineral mass of the subjects to be measured is clearly needed. For the use of the K-line XRF instrument in postmenopausal osteoporotic women, the instrument should be calibrated with relevant standard reference materials or limbs (to reflect decreased mineral mass) obtained from postmenopausal women.
For the L-line XRF instrument, studies are in progress to calibrate the instrument with the use of surgically amputated limbs from children. It is important to couple bone lead measurements in sensitive populations to multiple outcome measures; such outcome measures (biochemical, electrophysiologic, and neurobehavioral) can now be incorporated into cohort studies of infants, children, women of child-bearing age, pregnant women, and other adults with the L-line XRF method. For the K-line XRF instrument, similar or other outcome measures can be incorporated into cohort studies of occupationally exposed workers and postmenopausal women.

QUALITY ASSURANCE AND QUALITY CONTROL

Quality assurance includes all steps that are taken to ensure reliability of data, including use of scientifically and technically sound practices for the collection, transport, and storage of samples; laboratory analyses; and the recording, reporting, and interpretation of results (Friberg, 1988). Quality control focuses on the quality of laboratory results (Taylor, 1987) and consists of external quality control, which includes systems for objective monitoring of laboratory performance by an external laboratory, and internal quality control, which encompasses a defined set of procedures used by laboratory personnel to assess laboratory results as they are generated (Friberg, 1988). Those procedures lead to decisions about whether laboratory results are sufficiently reliable to be released. Auditing procedures are used to monitor sampling, transport of samples, and recording and reporting of data (Friberg, 1988); these procedures are intended to promote laboratory discipline and vigilance aimed at preventing errors, both inside and outside the laboratory (Taylor, 1987).

Statistical evaluation of laboratory data is essential for quality assurance and quality control; its primary objective is to assess analytic results for accuracy and precision. In this context, "precision" refers to the reproducibility of results, and "accuracy" specifies the validity of those results. Hence, precision is a measure of the random errors of a method, and accuracy assesses the systematic bias of the method. Random errors are always present, and systematic errors sometimes occur. In the absence of systematic errors, the reproducibility of measurements places the ultimate limit on the confidence ranges that can be assigned to a set of analytic results.
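How reproducibility limits the detection of bias can be made concrete with a small numerical sketch (our construction; the ±2-SD acceptance band is an illustrative convention, not a prescription from the report): when the relative standard deviation is 20%, a 20% inaccuracy equals one standard deviation, and most individual results still fall inside a naive acceptance band around the true value.

```python
from statistics import NormalDist

true_value = 1.0
rsd = 0.20    # 20% relative standard deviation (precision)
bias = 0.20   # 20% systematic inaccuracy

biased = NormalDist(mu=true_value * (1 + bias), sigma=rsd * true_value)

# Fraction of biased measurements still falling within +/-2 SD of the true value:
lo_lim, hi_lim = true_value - 2 * rsd, true_value + 2 * rsd
p_undetected = biased.cdf(hi_lim) - biased.cdf(lo_lim)  # ~0.84
```

About 84% of individual measurements would pass unremarked, which is why statistical evaluation across many results, rather than inspection of single values, is essential.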
Similarly, the ability to detect systematic bias is limited by the analytic precision of the method: an inaccuracy of ±20% is unlikely to be detected when the reproducibility of the measurement is 20% or more. Statistical evaluation of laboratory data is essential to detect systematic bias (Taylor, 1987). Whether a laboratory participates in a round-robin proficiency testing program or uses regression analyses, tests of homogeneity, or other statistical methods, the acceptability of the laboratory results is based on stringently defined methods for assessing potential error. A repeatedly observed deviation from an assumed value, whether statistically significant or not, should be a cause for concern

and demand close scrutiny of laboratory results, even if no quality-control test rejected the laboratory (Taylor, 1987).

Inside the laboratory, the validity of each procedure must be established within a specific matrix, which replicates the matrix of biologic fluids or tissues to be analyzed. Internal laboratory checks include measurement of the stability of samples, extensive calibration and replication of samples measured in duplicate, determination of precision and accuracy with matrix-matched samples and standards, carrying of blank solutions through every phase of analysis, and characterization of the range of linearity and variance of calibration curves over time (Vahter, 1982; Nieboer and Jusys, 1983; Stoeppler, 1983b).

External assessments of laboratory performance are best carried out through a laboratory's participation in a well-organized, formalized interlaboratory proficiency testing program. Such a program should include the use of centrally prepared certified samples in which a specific characteristic has been measured with a reference method (Friberg, 1988). For example, a suitable proficiency testing program for measurement of lead in whole blood would involve the use of blood samples from cows fed an inorganic lead salt; the samples would be certified as to lead content with isotope-dilution mass spectroscopy at the National Institute of Standards and Technology.

A definitive method is one in which various characteristics are clearly defined and instrumental measurements can be performed with a high level of confidence (Cali and Reed, 1976). For lead in biologic fluids, the definitive method is isotope-dilution mass spectroscopy (IDMS). IDMS reaches a high level of accuracy because manipulations are carried out on a weight basis.
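The principle of IDMS can be illustrated with a simplified two-isotope mass balance (our notation; practical lead IDMS accounts for all four stable isotopes and for instrumental mass-bias corrections): only the isotope ratios of the spike, the sample, and the spiked mixture are needed to recover the amount of analyte.

```python
def sample_isotope_b(spike_b_mol, r_spike, r_sample, r_mix):
    """Moles of reference isotope B in the sample, from the ratios R = A/B
    of the spike, the unspiked sample, and the measured mixture.
    From A/B mass balance: B_sample = B_spike * (R_spike - R_mix) / (R_mix - R_sample)."""
    return spike_b_mol * (r_spike - r_mix) / (r_mix - r_sample)

# Round-trip check: a sample containing 5 mol of isotope B at natural ratio
# R = 2.0 is spiked with 10 mol of B from enriched material (R = 0.01):
r_mix = (2.0 * 5 + 0.01 * 10) / (5 + 10)  # ratio measured in the mixture
sample_isotope_b(10, 0.01, 2.0, r_mix)    # -> 5.0 mol, as constructed
```

Because the result depends only on ratios and the weighed spike, losses of sample after spike equilibration cancel out, which is the source of the method's accuracy.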
Measurement involves determination of isotope ratios, not absolute determination of individual isotopes; analytic precision of one part in 10⁴–10⁵ can be routinely obtained. Atomic-absorption spectroscopy and anodic stripping voltammetry both qualify as reference methods for measurement of lead in whole blood, because results obtained with them can be assessed precisely by comparison with, or calibration against, results of IDMS analyses (Stoeppler, 1983b). As acceptable concentrations of lead in blood and other biologic media become lower, the availability of standard reference materials and their wider distribution to laboratories become increasingly important. Rapid advances in the development of sophisticated instrumentation,
increased awareness of background contamination of lead analyses outside and inside the laboratory, and the use of reference methods have contributed to laboratories' capability to measure lower concentrations of lead and to measure lead with increased precision (Stoeppler, 1983b; Taylor, 1987; Friberg, 1988).

Because lead is widely distributed in air, in dust, and in routinely used laboratory chemicals, the laboratory requirements for ultraclean or ultratrace analyses of lead, such as analyses of lead in plasma, are extremely demanding; very few laboratories have the ultraclean facilities and related instrumentation needed to measure trace amounts of lead in biologic fluids accurately (Patterson and Settle, 1976; Everson and Patterson, 1980).

In an ideal world, ultraclean techniques would be applied in all laboratory procedures. In the practical world of clinical and epidemiologic studies, such exacting techniques are unrealistic: excellence in laboratory standards is essential, but sampling conditions in the field, clinic, or hospital cannot be expected to match the ambient conditions of an ultraclean laboratory, and sampling under practical conditions is bound to introduce some positive analytic error (Stoeppler, 1983b). However, the amount of lead introduced by ambient exposure of biologic fluids and by routinely used laboratory reagents must be known and assessed frequently to identify significant compromise of an analytic procedure (Rabinowitz and Needleman, 1982; Friberg, 1988).

Temporal considerations are also important in the measurement of lead and of biologic indicators of lead toxicity, because these substances might circulate in biologic fluids with specific and different patterns of ultradian (between an hour and a day) and circadian rhythmicity (Rabinowitz and Needleman, 1982).
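The blank-assessment practice described above can be sketched as follows. The blank values, the sample result, and the three-standard-deviation convention for the detection limit are illustrative assumptions, not values drawn from this report:

```python
import statistics

def detection_limit(blanks, k=3.0):
    """Method detection limit estimated as k times the standard deviation
    of replicate reagent blanks (a common laboratory convention)."""
    return k * statistics.stdev(blanks)

# Hypothetical reagent-blank lead results (ug/dL) carried through every
# phase of the analysis, as recommended above.
blanks = [0.12, 0.15, 0.10, 0.14, 0.11, 0.13]
mean_blank = statistics.mean(blanks)
lod = detection_limit(blanks)

# A sample result is corrected for the mean blank; values near or below
# the detection limit should not be reported as quantitative.
raw_result = 0.60            # hypothetical uncorrected result (ug/dL)
corrected = raw_result - mean_blank
print(f"mean blank {mean_blank:.3f}, LOD {lod:.3f}, corrected {corrected:.3f}")
```

Tracking the mean and scatter of blanks over time, rather than a single blank, is what allows a laboratory to notice when ambient contamination has significantly compromised a procedure.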
Sample collection has the potential to account for the greatest amount of positive error during analyses of lead at low concentrations, in contrast with the smaller errors that are inherent in instrumentation (Nieboer and Jusys, 1983). In measurement of lead in whole blood, for example, blood for analysis is preferably obtained with venipuncture and less preferably with fingerstick. The choice of sampling technique is based on feasibility, setting (parent-child acceptance), and extent of training of the persons who are collecting the samples. Fingerstick or capillary sampling might be problematic because of skin contamination, contamination within capillary tubing, and inadequate heparinization of the sample (EPA, 1986a). Cleaning the skin thoroughly is more important
than the choice between alcohol and a phosphate-containing soap as the cleanser. Overestimates of the concentration of lead in whole blood can occur through contamination of the skin or capillary tubes, and underestimates can occur through dilution of blood with tissue fluids when the finger is squeezed too vigorously. For those reasons, only venous blood samples were obtained and included in the analysis of survey results during NHANES II (Annest et al., 1982).

The erythrocyte protoporphyrin (EP) test has commonly been used in the past to screen large populations of children, but it is an insensitive assay at blood lead concentrations below 50 µg/dL (Mahaffey and Annest, 1986); widespread use of properly obtained fingerstick samples therefore appears to be necessary for screening large populations. From a public-health standpoint, it is preferable to accept some false-positive values with the new fingerstick techniques and then to perform definitive venous sampling. Public-health officials should not rely on the old EP methods, which have been shown to yield a false-negative rate of about 50% at blood lead concentrations less than 50 µg/dL (Mahaffey and Annest, 1986).

Plastic and glass laboratory equipment must be cleaned rigorously in an acid bath and then washed thoroughly in high-purity (18-megohm) water. Only laboratory reagents with known lead concentrations should be used, and each of these contributors to positive laboratory error should be measured individually (Rabinowitz and Needleman, 1982). For ultratrace analyses, doubly distilled reagents and highly purified materials are necessary (Everson and Patterson, 1980).

SUMMARY

Most clinical and epidemiologic research laboratories now involved in measuring lead in biologic materials routinely use only a few well-studied analytic methods.
In coming years, most laboratories will probably retain these methods for measuring lead at the ever-lower guideline concentrations that are being promulgated, e.g., lead in whole blood at or below 10 µg/dL, the new CDC action level for childhood lead exposure. The routine methods used for such typical analyses as lead in whole blood are principally electrothermal or graphite-furnace atomic-absorption spectrometry (GF-AAS) and the electrochemical technique of
ASV.

Other variations on AAS and electrochemistry are either passing out of use or growing in use, but none is as popular with laboratories as those two methods. Both AAS and ASV are theoretically adequate for the new, more rigorous performance and proficiency demands being placed on laboratories in light of lower body lead burdens and lower exposure and toxicity guidelines, provided that adherence to rigorous protocols is scrupulous.

It appears that blood lead measurements will continue to have an important place in the human toxicology of lead, primarily as an index of recent exposure. L-line XRF measurements of lead in the tibial cortical bone of children, women of child-bearing age, pregnant women, and other adults appear to have considerable dosimetric relevance for assessing lead stores that have accumulated over a lifetime. K-line XRF techniques appear to have substantial biologic relevance for assessing cumulative lead exposure in workers in lead-related industries and possibly for relating cumulative lead exposure to epidemiologic study of renal disease, hypertension, and osteoporosis. Wider application of both techniques, coupled with sensitive high-performance liquid chromatographic methods for assessing lead's inhibition of the heme biosynthetic pathway in worker populations, holds considerable promise for further delineating toxic effects of lead in sensitive populations at relatively low exposures.

The primary concern with current lead concentration measurements is analytic error rather than instrumental limitation, specifically error caused by the introduction of lead into samples during collection, storage, treatment, and analysis. Contaminant lead, which is commonly not accurately quantified, increases measured lead concentrations.
Consequently, reported lead concentrations in both biologic and environmental samples are often erroneously high, as demonstrated with intercalibrated studies that used trace-metal-clean techniques and rigorous quality-control and quality-assurance procedures. Intercalibrations have demonstrated that many conventional instruments are capable of accurately and precisely measuring lead concentrations in biologic tissues and environmental samples. They include atomic-absorption spectrometers and anodic-stripping voltammeters, which are commonly used to measure lead concentrations in hospitals and commercial laboratories. Moreover, recent advances in instrumentation
and methods have demonstrated that parts-per-trillion lead concentrations can be accurately and precisely measured with either of those instruments.

Noninvasive XRF methods are increasingly being used for measuring lead in bone, where most of the body burden of lead accumulates. They include two complementary methods, LXRF and KXRF. Developments in in vivo XRF analysis of human bone lead are occurring in parallel with increases in knowledge of two closely related subjects: the quantitative relation of lead exposure to bone lead concentrations, and the quantitative relation of bone lead concentration, either total or compartment-specific, to the extent of resorption into the circulation or to the health risks associated with such resorption.

The direction of future XRF methodologic developments, and the ultimate potential of XRF methods for use in large-scale exposure screenings of diverse populations, will be influenced by the answers to these questions:

Are the sensitivity limits of present XRF methods adequate for reliable measurement of bone lead concentrations that reflect unacceptable cumulative exposures and indicate some potential health risk associated with resorption into blood?

Are thresholds of potential concern for adverse health effects of lead, when indexed as bone lead concentration, around or below the current measurement capability for in vivo lead quantitation?

Are XRF systems likely to be useful only for screening high-exposure subjects, or can this in vivo determination be extended to low-exposure subjects if instruments are refined?

How reliably can the apparently distinct bone lead pools probed with LXRF and KXRF methods be related to past lead exposures that could lead to mobilization of lead from bone stores and increased future toxicity?
Is tandem use of LXRF and KXRF measurements required in serial exposure screenings of sensitive populations for adequate risk assessment?

Who are the most appropriate sensitive populations for XRF analysis? Young children, and if so, in what age range? Women of child-bearing age?

To what purpose should XRF data be put? To determine whether cumulative past exposure that has produced irreversible intoxication is a marker
of current adverse health effects? To determine the extent of future risk of endogenous toxicity associated with resorption of lead from bone into blood?

Technical problems with XRF methods appear to fall into three general categories.

Difficulty in separating instrument background from lead signal. Research groups and instrument vendors use varied software to separate the lead signal from the instrument background. Some of that software is proprietary and has little publicly available documentation; sharing of applicable software could help to advance the field. Even some of the better methods for separating signal from background cannot identify a signal that is less than 2 times the background. Lead concentrations of 2–10 µg/g are the lowest quantifiable with XRF methods. Those concentrations are higher than the bone lead concentrations of many adults and most children, but not higher than those of occupationally exposed people or people with atypically high environmental lead exposures.

Variability between instruments. No appropriate certified standards are available to provide a basis for obtaining consistent, accurate, quantitative data. Standards would make it possible to assess and improve the accuracy and precision of different instruments; their availability would also improve the validity of the quality-assurance and quality-control programs in use by various groups and permit comparative measurements between instruments.

Physiologic aspects of bone turnover. Physiologic variability in lead concentration between and within various bones has not been explored with XRF methods. Bone turnover and remodeling are age- and sex-dependent, and that complicates interpretation among groups (especially young children) whose bone lead concentrations are substantially below the detection limit of the in vivo XRF methods now available.
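The signal-to-background rule of thumb described above can be sketched in a few lines. The counting data are hypothetical, and the factor-of-2 criterion is taken from the text rather than from any particular instrument's specification:

```python
def xrf_signal_quantifiable(gross_counts, background_counts, threshold=2.0):
    """Apply the rule of thumb above: the net lead signal (gross minus
    background counts) must exceed `threshold` times the instrument
    background before it can be identified."""
    net = gross_counts - background_counts
    return net > threshold * background_counts

# Hypothetical counting data from an in vivo bone lead measurement.
print(xrf_signal_quantifiable(3500, 1000))  # net 2500 > 2 * 1000 -> True
print(xrf_signal_quantifiable(2500, 1000))  # net 1500 < 2 * 1000 -> False
```

A criterion of this kind is what sets the 2–10 µg/g quantitation floor mentioned above: subjects whose bone lead produces a peak below the threshold cannot be distinguished from background.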
Reported LXRF and KXRF systems appear to measure bone lead pools that differ in concentration, anatomic region, and toxicokinetic mobility. LXRF data might reflect a bone lead pool more labile than the purely cortical mineral lead fraction measured by the KXRF system. It would be premature to delineate the age-developmental time boundaries
over which these two systems can be viewed as operating optimally. Consequently, it is appropriate to view the two approaches as providing complementary information. All such information might, in fact, be required to obtain a complete exposure profile and a comprehensive framework for assessment of health risks.

The definitive method of measuring lead concentration is still isotope-dilution thermal-ionization mass spectrometry (TIMS). Although TIMS is not appropriate for routine analyses, because of its cost and complexity, it is becoming more useful for medical research as a result of continuing improvements in sample processing and sensitivity. The applicability of TIMS in medical research is also expanding with the development of stable-isotope tracer methods, which eliminate the need for radioisotopes in studies of lead metabolism.

Other types of inorganic mass spectrometry are also becoming practical. Major developments in inductively coupled plasma mass spectrometry indicate that it will soon become common in hospitals and commercial laboratories, where it might be used for both lead-concentration and isotopic-composition analyses of biologic tissues and environmental materials. Comparable advances in other types of mass spectrometry, including glow-discharge mass spectrometry, secondary-ion mass spectrometry, and laser-microprobe mass analysis, indicate that they too could soon be used to measure lead in solid materials.

Nuclear magnetic resonance spectroscopy is also potentially valuable for analysis of lead in biologic materials; it might be used to measure intracellular lead, as well as calcium, zinc, iron, and other trace elements. However, applications of this technique will continue to be limited by the expense of the instrument and the technical expertise required to operate it.
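Isotopic-composition data of the kind mentioned above are often interpreted with simple mixing models. The sketch below is a two-endmember example with assumed 206Pb/207Pb ratios; it uses simplified linear mixing and ignores differences in total-lead normalization between sources, so it is an illustration of the idea rather than a complete source-apportionment method:

```python
def source_fraction(r_mixture, r_source_a, r_source_b):
    """Fraction of lead attributable to source A in a two-endmember mixture,
    from 206Pb/207Pb ratios (simplified linear mixing)."""
    return (r_mixture - r_source_b) / (r_source_a - r_source_b)

# Hypothetical isotope ratios for two lead sources and a blood sample.
r_paint = 1.20       # assumed ratio for source A
r_gasoline = 1.15    # assumed ratio for source B
r_blood = 1.17       # assumed measured ratio of the mixture

f_a = source_fraction(r_blood, r_paint, r_gasoline)
print(f"fraction from source A: {f_a:.2f}")
```

Because distinct lead sources often carry distinct isotopic signatures, calculations of this kind are one reason that isotopic-composition analysis by mass spectrometry is attractive for exposure studies.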
Lead concentrations and isotopic compositions in biologic and environmental matrices can be accurately and precisely measured with existing instrumentation, notably atomic-absorption spectrometry and anodic-stripping voltammetry. The capabilities of those commonly used instruments now exceed most analytic requirements and are still being improved. The applicability of other instruments, which often provide complementary lead measurements, has been demonstrated within the last decade. It is anticipated that newer techniques (L-line and K-line XRF and inductively coupled plasma mass spectrometry) will soon be common in clinical and epidemiologic studies. Other types of mass
spectrometry and nuclear magnetic resonance spectroscopy are also expected to be used in more specialized studies. The utility of all those types of analysis will continue to be limited by the degree of quality control and quality assurance applied to sample collection, storage, and analysis.

As the focus of public-health officials has turned to lower exposures, the errors-in-variables problem has become more severe, and the need for more careful measurements has increased. For example, the standard error of the blind quality-control data in NHANES II was approximately 10% of the mean blood lead concentration. To achieve a similar signal-to-noise ratio in the data (and hence a similar reliability coefficient in epidemiologic correlations), NHANES III will need to reduce absolute measurement error by about a factor of 3. Similarly, as screening programs focus on lower exposures, the probability of misclassification increases unless measurement errors are reduced proportionally. Analyses of data from NHANES II, which had much better contamination control and laboratory technique than most screening programs, showed a significant risk of misclassifying a child's blood lead concentration as being above 30 µg/dL because of analytic error (Annest et al., 1982). Reliable detection of blood lead concentrations of 10 µg/dL will require considerably more care, and probably different methods, from those now used.
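The misclassification risk discussed above can be sketched with a simple normal-error model. The true concentration, the cutoff, and the analytic standard deviations below are assumed for illustration; the factor-of-3 error reduction echoes the NHANES III target mentioned above:

```python
from statistics import NormalDist

def misclassification_prob(true_value, cutoff, measurement_sd):
    """Probability that Gaussian measurement error pushes a single reading
    across the cutoff, misclassifying the subject."""
    err = NormalDist(mu=true_value, sigma=measurement_sd)
    if true_value < cutoff:
        return 1.0 - err.cdf(cutoff)   # truly below, measured above
    return err.cdf(cutoff)             # truly above, measured below

# A child with an assumed true blood lead of 8 ug/dL screened against the
# 10-ug/dL action level, with an assumed analytic standard deviation of 2 ug/dL.
p_current = misclassification_prob(8.0, 10.0, 2.0)

# Cutting the measurement error by a factor of 3, as discussed above.
p_improved = misclassification_prob(8.0, 10.0, 2.0 / 3.0)
print(f"P(misclassified): {p_current:.3f} -> {p_improved:.4f}")
```

The model makes the qualitative point concrete: when the true value sits close to the cutoff relative to the analytic standard deviation, a substantial fraction of single measurements land on the wrong side, and shrinking the error shrinks that fraction sharply.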