A COST-EFFECTIVE METHOD TO CHARACTERIZE SOIL
CONTAMINATION: OPTIMIZING USE OF EXISTING
DATA AT THE NEVADA TEST SITE

Dennis D. Weber
Harry Reid Center for Environmental Studies
4505 Maryland Parkway Box 454009
Las Vegas, Nevada 89154

Edward J. Bentz
E. J. Bentz and Associates, Inc.
7915 Richfield Rd.
Springfield, VA 22153

ABSTRACT

In view of the large expanse of Nevada Test Site surface soils contaminated with man-made radionuclides from above-ground nuclear weapons testing, and in view of shrinking budgets, Department of Energy waste managers require accurate estimates of the areal extent (the distribution) of the contamination. In most cases, data acquired with high energy-resolution gamma detectors are required to extract individual radionuclide concentrations, but such data acquisition is prohibitively slow and costly for contaminated areas this large (over 100 square miles). In response, the authors developed a cost-effective method for calculating improved estimates of radionuclide concentrations from two existing data sets: one with high energy resolution but sparse coverage, and another with low energy resolution but higher data density and a more homogeneous distribution. The method exploits the unique features of each set to form a combined data set that yields improved concentration estimates and, hence, more accurate distribution estimates. The method is illustrated with a case study at the Nevada Test Site, the results are compared with earlier estimates made from only one set of data, and the cost implications for waste management are illustrated for this case study.

INTRODUCTION

Waste management of radiologically contaminated surface soils at the Nevada Test Site is problematic not in its type or complexity, but rather in its enormity. A rough estimate of the area contaminated at above-background levels by several radionuclides is over 100 square miles, and the area contaminated above 200 pCi/g of 239Pu (the DOE's choice of clean-up level) amounts to 760 hectares. At the DOE's cost of 4 million dollars per hectare (based on a recent off-site cleanup), the total cost would be 3.04 billion dollars, an amount that exceeds the total annual NTS waste management budget by a factor of 34. Even if remediation costs were slashed, the amount would still be staggering. It is clear that not all areas will be cleaned up, so management must decide which areas to remediate, which to demarcate, and which to release as is. These decisions are best made with accurate estimates of the areal distribution of the contamination so that the cost-versus-benefit relationship can be evaluated. Previous radionuclide distribution estimates were made with sparse data, which allowed only crude estimates. This paper describes a method that improves those estimates by supplementing the original data with a newer set of data that is lower in quality but more abundant and more evenly distributed.

THE PROBLEM

A moratorium on the testing of nuclear weapons at the NTS created a need to discuss alternative uses of its 1,350 square miles of land. Possible alternative land uses (e.g., recreation, mining, grazing, and residential use) necessitate an evaluation of the risk to potential participants from radionuclide contamination of the surface soils created by about 100 nuclear detonations and some safety shots. Since management decisions depend on cost/risk/benefit relationships, their quality depends strongly on accurate characterization of the radionuclide concentrations. Inaccuracies in the characterization of such areas could substantially affect the cost/risk/benefit relationships and, because of the enormity of the area, could lead to mismanagement of the land. This paper describes and demonstrates a cost-effective method designed to calculate more accurate estimates of the surface contamination at the NTS by combining different sets of existing data. The use of existing data is appropriate because data acquisition is costly and DOE budgets are shrinking. The method also suggests that more cost-effective and accurate contamination characterization could be accomplished at the NTS by designing future data acquisition to accommodate the method.

ESTIMATION OF AREAL DISTRIBUTIONS

To characterize the distribution of a radionuclide in an area, one would ideally like to know the concentration of the contamination at every location, but that is impractical in large areas for both technical and economic reasons. In practice, one takes as many samples as is affordable and then estimates the concentrations at unsampled locations on a regular grid that gives the required spatial resolution. Most estimators arrive at the value at an unsampled point by taking a weighted average of the measured data that surround it. These estimators work best when interpolating, that is, when the unsampled point is surrounded by near neighbors; they do not extrapolate well to points that are not surrounded by measurements. After the regular grid of values is estimated, isopleth contours, i.e., lines of equal concentration, are drawn, from which the area having concentrations greater or less than the isopleth value can be calculated.
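
To make the weighted-average estimation concrete, the sketch below (Python, not the software used in this study) computes an inverse-distance-weighted estimate at a single unsampled location; the sample coordinates, values, and power parameter are hypothetical and serve only to illustrate the idea.

```python
import numpy as np

def idw_estimate(xy_data, values, xy_target, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate at one unsampled location.

    xy_data   : (n, 2) array of sample coordinates
    values    : (n,)   array of measured concentrations
    xy_target : (2,)   coordinates of the unsampled grid node
    """
    d = np.linalg.norm(xy_data - xy_target, axis=1)
    if np.any(d < eps):                      # target coincides with a sample
        return values[np.argmin(d)]
    w = 1.0 / d**power                       # nearer samples get larger weights
    return np.sum(w * values) / np.sum(w)    # weighted average of neighbors

# Hypothetical sample coordinates (ft) and concentrations (nCi/m2)
xy = np.array([[0., 0.], [100., 0.], [0., 100.], [150., 150.]])
conc = np.array([12.0, 8.0, 5.0, 1.0])
print(idw_estimate(xy, conc, np.array([50., 50.])))
```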

The quality of the distribution estimate depends on the quality, density, and spatial homogeneity of the data, which are often inadequate to deliver the desired accuracy. In the NTS case, where the site is very large and sampling is very expensive, it is difficult to sample densely enough to make accurate source characterizations.

GOAL AND OBJECTIVES

The goal of this work is to develop a data analysis method for producing more accurate estimates of man-made radiological surface contamination at unsampled locations, and subsequently, to improve radionuclide distribution estimates. The objective is to accomplish the goal by using two sources of existing data and to obtain an accuracy significantly greater than that obtained by using previous methods which were based on only one set of data. The method entails combining two existing data sets which were acquired (over a decade apart) with different acquisition systems and sampling strategies. It accounts for the differences in the data sets while exploiting the fact that both systems measured gamma radiation from the same source.

The method is demonstrated with a case study at a site created by a nuclear detonation conducted under the Plowshare Program, which investigated the use of nuclear bombs for excavation projects and was later abandoned. A 4.3 kiloton bomb was detonated in 1965 at a depth of 280 feet below the surface. It created a crater approximately 300 feet across and 150 feet deep, and a plume of radiological contamination on the land surface that extends almost three miles from ground zero and is still easily detectable by gamma-ray sensors in 1997. Soil contamination concentrations of americium-241 (241Am) were calculated.

A sparse and poorly distributed set of 241Am data (see Figure 1) exists, from which it is impossible to make accurate estimates at unsampled locations. These measurements, made in 1983 as part of the DOE's Radionuclide Inventory and Distribution Program (RIDP), used a hyperpure germanium detector mounted on a vehicle-borne telescoping mast that could be raised to 7.4 m above the surface. Each measurement represented a 15-minute acquisition of a gamma spectrum from 50 to 3,000 keV (1). These high-quality gamma spectra allowed extraction of the concentrations of several different man-made radionuclides. Spatial coordinates were measured to within 3 meters. Spatial coverage, however, was sparse because of the long accumulation time, the huge expanse of contamination, and rough terrain. The case study used the 241Am concentrations extracted from these spectra.

Figure 1. Palanquin Site data used in this case study. The black square is Ground Zero. Dotted line is the NTS boundary.

A second set of data exists which yields only the total gamma energy (Exposure) due to man-made radiological contamination. The Exposure data were acquired in 1994 by using high-gain sodium iodide (NaI) scintillation detectors mounted on a helicopter flown at an average of 200 feet above the land surface. Individual radionuclides are difficult to extract from these data because of poor detector energy resolution, the low energy of the 241Am gamma emission, and interference from other radionuclides. North-south transects separated by about 500 feet were flown, along which a gamma spectrum was acquired about every 120 feet. The Global Positioning System provided spatial coordinates. The survey covered almost 100 percent of the NTS.

These two data sets measure gamma energy from exactly the same physical phenomenon, but they were measured at different times (1983 and 1994) by using different data acquisition techniques and systems. The RIDP's high energy resolution allowed extraction of 241Am concentrations, but the data are too sparse to allow good spatial estimates at unsampled locations. The aerial data yield, at best, only the total Exposure (microroentgens per hour, µR/h) from man-made radionuclides, but they provide excellent areal coverage. The data were "decay corrected" to account for the eleven years between acquisitions, but it is difficult to account for man-made terrain disturbances, soil erosion, and the difference in the "support" or "footprint" of the data. The aerial data, acquired at 200 feet above the surface, have a circular footprint approximately 400 feet in diameter, whereas the RIDP data, acquired at 22 feet, have a circular footprint with a radius of about 40 feet.

THE METHOD

The method developed here comprises a series of steps that produce a regular grid of estimates of 241Am concentrations. A regular grid of values is necessary in order to use modern computer contouring programs, which produce lines of equal contamination values from which the areas having concentrations above given values are calculated. The most noteworthy steps are those that exploit the information contained in each data set to create the combined data set from which the final regular grid of estimates is made.

The method takes advantage of the fact that both sets of data measure exactly the same physical phenomenon (one nuclear bomb detonation) and accounts for the differences in the way they were measured, for physical changes over time, and for the difference in support. It further takes advantage of the spatial correlation inherent in the physical process (the bomb blast) that created the soil contamination, and of the spatial cross-correlation between the two methods used to measure it. In this paper, estimation of the areal distribution of 241Am is the objective: 241Am is a gamma emitter that is used as a surrogate for 239Pu, which is an alpha emitter and therefore more difficult to measure.

The Palanquin data in Figure 1 show that there are large areas near ground zero that are void of 241Am samples. Note, however, that the Exposure data are relatively abundant and more homogeneously distributed. A major reason for using the Exposure data here is that the 241Am data do not include sufficient background data to allow interpolation over the area of interest.

DETERMINATION OF BACKGROUND CONCENTRATION

The use of the Exposure data to determine locations having background levels of 241Am is a major strength of this method. Because our 241Am data are sparse in areas near ground zero, they do not define continuous concentrations from peak to background, which makes valid estimation of those concentrations (including the concentrations we are most interested in) impossible. Fortunately, we can determine background values from the Exposure data; this determination is straightforward because of the nature of the technique used to process them.

The spatial statistics used here produce better results when the major feature of a data set (the trend) is removed. A second major strength of this method is that it enables removal of the trend. Before the trend can be removed, however, it must first be estimated, which in turn requires knowledge of the background data.

The Exposure data were processed by using a procedure (1) which assumes that the spectral shape of the gamma energy between 50 and 3,000 keV is relatively independent of flight altitude, of the concentration of airborne radium and thorium daughters, and of the natural radionuclide soil concentration. Furthermore, man-made radionuclides emit most of their gamma energy in the range from 50 to 1,390 keV. The procedure first evaluates the ratio K of the low-band (50 to 1,390 keV) to the high-band (1,400 to 3,000 keV) energy in areas free of man-made radionuclides. Assuming that K is constant, the man-made energy emission in a "hot spot" is then the measured low-band energy minus the product of K and the high-band energy. The values obtained in background regions (outside the plume) constitute a random noise component comprised of statistical noise, electronic noise, and random counts. A histogram of this noise should be centered on zero and have the approximate shape of a Gaussian curve. The histogram provides a tool for evaluating the quality of the processing and for determining the background values of the aerial data. Based on a mean of zero and a standard deviation of 0.52 µR/h, we can assume that data above +1.56 µR/h (three standard deviations above the mean, which bounds 99.87% of the noise) represent contamination. We assume that data below +1.56 µR/h are noise, contain no information, and can be set to zero. These zero-valued background data will later be appended to our primary data set to allow interpolation.
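
The calculation can be sketched as follows. The band limits and the three-standard-deviation rule are those described above; the simulated spectra, the choice of background records, and the variable names are ours and are purely illustrative of the kind of processing described in reference (1), not a reproduction of it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical band-integrated gamma energies at each aerial measurement point:
# low band 50-1,390 keV, high band 1,400-3,000 keV (arbitrary units).
n = 5000
high_band = rng.normal(100.0, 5.0, size=n)
low_band  = 3.0 * high_band + rng.normal(0.0, 1.5, size=n)   # natural emitters only
low_band[:50] += 40.0                                         # a few "hot spot" records

# K: ratio of low- to high-band energy in areas known to be free of
# man-made radionuclides (here, records well away from the simulated hot spots).
background_region = slice(1000, None)
K = np.mean(low_band[background_region] / high_band[background_region])

# Man-made component: measured low-band energy minus the natural contribution
# predicted from the high band (zero on average in clean areas).
man_made = low_band - K * high_band

# Noise statistics from the background region define the detection threshold:
# values below mean + 3 sigma (~99.87% of the noise) are treated as background
# and set to zero before being appended to the primary data set.
noise = man_made[background_region]
threshold = noise.mean() + 3.0 * noise.std()
man_made[man_made < threshold] = 0.0
print(f"K = {K:.3f}, threshold = {threshold:.2f}, hot records = {(man_made > 0).sum()}")
```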

RIDP DATA TRANSFORM

Because both of our data sets are measurements of the same gamma radiation, they are "similar" in the sense that, if they were perfect, a surface map of one of them could be transformed mathematically to have the same general features and magnitudes as the other one. Because of different support and detector characteristics, physical changes, measurement error, etc., one would expect the data maps to differ in their finer features.

In order to exploit this similarity, we seek a mathematical transform that makes their surface maps roughly similar. Noting Figure 1, we see that several north-south transects of 241Am data are nearly collocated with the flight lines of the Exposure data. To aid in determining the transform, the two sets of collocated measurements were projected onto a north-south line and plotted together. Based on the best visual "fit", a mathematical function, 0.1(241Am)^(7/12), was applied to the original 241Am values, while the Y coordinates of the Exposure data were shifted 200 feet to the south (to account for intentional error in the Global Positioning System). To obtain this spatial error correction, the Exposure data were shifted incrementally north-south and east-west with respect to the 241Am data until a best fit was obtained. Henceforth, the transformed 241Am values were used.
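
A minimal sketch of this fitting step is given below. The transform 0.1(241Am)^(7/12) and the 200-foot southward shift are taken from the text; the synthetic profiles, the nearest-neighbor pairing, and the mean-squared misfit used to automate the "fit" are assumptions made for illustration (the authors report a visual fit).

```python
import numpy as np

def transform_am(am):
    """Power-law transform applied to the 241Am data (from the text)."""
    return 0.1 * np.asarray(am) ** (7.0 / 12.0)

def best_shift(xy_exposure, exposure, xy_am, am_t, shifts_ft):
    """Grid-search the N-S / E-W shift of the Exposure coordinates that best
    matches the transformed 241Am profile. A hypothetical misfit measure
    (mean squared difference at nearest neighbors) is used here."""
    best = (None, np.inf)
    for dx in shifts_ft:
        for dy in shifts_ft:
            shifted = xy_exposure + np.array([dx, dy])
            # pair each 241Am point with its nearest shifted Exposure point
            d = np.linalg.norm(shifted[None, :, :] - xy_am[:, None, :], axis=2)
            nearest = exposure[np.argmin(d, axis=1)]
            misfit = np.mean((nearest - am_t) ** 2)
            if misfit < best[1]:
                best = ((dx, dy), misfit)
    return best

# Hypothetical collocated profiles (coordinates in feet)
xy_am  = np.column_stack([np.zeros(50), np.linspace(0, 5000, 50)])
am_t   = transform_am(np.abs(np.random.default_rng(1).normal(50, 20, 50)))
xy_exp = np.column_stack([np.zeros(200), np.linspace(-200, 4800, 200)])
expo   = np.interp(xy_exp[:, 1] + 200.0, xy_am[:, 1], am_t)   # offset "truth"
print(best_shift(xy_exp, expo, xy_am, am_t, np.arange(-400, 401, 100)))
```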

TREND REMOVAL

Spatial statistics assumes that the mean of the data is constant over the area represented by the data set, an assumption that is not satisfied when the data represent one major feature (the plume, with high values) surrounded by background (essentially zero). In order to satisfy the assumption, it is common to subtract the major feature (before estimation) from the original data, leaving the "residuals"; this is trend removal. The trend is never known exactly, but an estimate is sufficient because the trend is added back after the spatial statistical estimation is performed on the residuals. In our case, the 241Am data set is not only sparse but also contains large empty regions that border the plume. Therefore, it is not possible to define the trend by using only the 241Am data. Here is where we use the secondary data set. We established above that Exposure data below 1.56 µR/h are noise, which tells us where there is no man-made radiation. Therefore, the values at these locations are set to zero and appended to the transformed 241Am data set. This set of background locations is shown in Figure 2. Because 241Am contributes only about 2 percent of the total energy, when the total man-made exposure is at the noise level the 241Am content is certainly at background levels. This appended data set contains the background information that helps to define the plume (the trend).

Figure 2. Locations of background data that were applied to the Americium data set to define the trend of the contaminant plume.

An inverse-distance-squared interpolator was used to estimate the trend on a regular grid from the transformed 241Am data with the appended background data. At each original data location (for both 241Am and Exposure), a trend value was interpolated from the regular grid and subtracted from the respective original data value. The differences comprise the residuals of the 241Am and Exposure data which, when combined, become the final data set used for cokriging.
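
The trend-removal step can be sketched as follows, assuming hypothetical data locations and values. For simplicity, the sketch evaluates the inverse-distance-squared trend directly at each data location rather than reading it from the regular grid, a minor departure from the procedure described above.

```python
import numpy as np

def idw_grid(xy_data, values, targets, power=2.0, eps=1e-9):
    """Inverse-distance-squared interpolation of `values` onto `targets`."""
    d = np.linalg.norm(targets[:, None, :] - xy_data[None, :, :], axis=2)
    d = np.maximum(d, eps)
    w = 1.0 / d**power
    return (w @ values) / w.sum(axis=1)

# Hypothetical inputs: transformed 241Am data plus appended zero-valued
# background locations (see Figure 2), and the Exposure data.
rng = np.random.default_rng(2)
xy_am  = rng.uniform(0, 10000, (120, 2));  am_t  = rng.gamma(2.0, 2.0, 120)
xy_bkg = rng.uniform(0, 10000, (300, 2));  zeros = np.zeros(300)
xy_exp = rng.uniform(0, 10000, (2000, 2)); expo  = rng.gamma(2.0, 2.0, 2000)

trend_data_xy = np.vstack([xy_am, xy_bkg])
trend_data_v  = np.concatenate([am_t, zeros])

# Regular grid for the trend surface (100 ft spacing in the study)
gx, gy = np.meshgrid(np.arange(0, 10001, 100.0), np.arange(0, 10001, 100.0))
grid_xy = np.column_stack([gx.ravel(), gy.ravel()])
trend_grid = idw_grid(trend_data_xy, trend_data_v, grid_xy)

# Trend value at each original data location, then the residuals used for cokriging
resid_am  = am_t - idw_grid(trend_data_xy, trend_data_v, xy_am)
resid_exp = expo - idw_grid(trend_data_xy, trend_data_v, xy_exp)
print(resid_am.mean(), resid_exp.mean())
```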

GEOSTATISTICS

The next step was to determine the spatial correlation of the data by calculating variograms for each variable (241Am and Exposure) and a cross-variogram for the correlation between them, after which mathematical models were fit to the variograms for use in kriging and cokriging. Variograms were calculated by using GSLIB software (2). Background (noise) data were not included in the variogram calculations; the data sets used to produce the final geostatistical estimates, however, did include the background because it is needed for interpolation. Kriging and cokriging (also using GSLIB software) produced estimates on a 100 ft x 100 ft grid (about 930 m2 per cell) from the variogram models and the final data sets. The trend surface was added back to the gridded estimates of the residuals, after which the reverse transformation was applied to obtain the final estimates of the 241Am contamination. Inventory was calculated by summing the products of the concentration (nCi/m2) and the area (930 m2) of each grid cell. In addition, for comparison with previous studies at the NTS, the 241Am values were converted to 239Pu concentrations in picocuries per gram.
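
Because GSLIB (2) is a separate software package, no attempt is made here to reproduce its interface; the sketch below, with hypothetical residual data, only illustrates the quantities involved: the classical empirical semivariogram (the cross-variogram is the analogous average of 0.5(u_i - u_j)(v_i - v_j) over collocated pairs) and the inventory obtained by summing concentration times cell area over a finished grid.

```python
import numpy as np

def empirical_semivariogram(xy, z, lags, tol):
    """Classical estimator: gamma(h) = mean of 0.5*(z_i - z_j)^2 over pairs
    whose separation distance falls within +/- tol of each lag h."""
    d   = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    dz2 = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu  = np.triu_indices(len(z), k=1)                 # count each pair once
    gam = []
    for h in lags:
        mask = np.abs(d[iu] - h) <= tol
        gam.append(dz2[iu][mask].mean() if mask.any() else np.nan)
    return np.array(gam)

# Hypothetical residual data (see the trend-removal step)
rng  = np.random.default_rng(3)
xy   = rng.uniform(0, 5000, (150, 2))
z    = rng.normal(0, 1, 150)
lags = np.arange(250, 3001, 250)
print(empirical_semivariogram(xy, z, lags, tol=125.0))

# Inventory from a finished grid of 241Am estimates (nCi/m2), 100 ft x 100 ft
# cells of ~930 m2 each; 1 Ci = 1e9 nCi. Grid values here are hypothetical.
am_grid = rng.gamma(1.5, 2.0, 10000)
inventory_Ci = (am_grid * 930.0).sum() / 1e9
print(inventory_Ci)
```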

RESULTS AND COMPARISONS

The results from the three interpolation methods are shown in Figure 3, where the three contour maps show the progression from 1) kriging without knowledge of either the trend or the background, to 2) kriging with trend removal but without appended background, to 3) cokriging with trend removal. The improvement is dramatic between 1) and 2), and significant between 2) and 3). To further assess the differences, we compare our inventory and distributions quantitatively with two previous studies. Our results and those of the two previous studies are given in Table I.

Figure 3a shows the results of kriging with neither trend removal nor appended background data. Figure 3b shows kriging results with trend removal but without appended background data. Figure 3c shows cokriging results with trend removal (the background data are part of the cokriging data set).

An inventory (3) for the Palanquin area was made by the RIDP, but because the paucity of data prevented machine contouring, the investigators "inferred" isopleth contours from the values of the data points. Their estimate of the 241Am inventory was 13.3 Curies. By summing the products of the concentrations (nCi/m2) and the area per grid cell (m2) from our cokriging estimates, we obtain 16.4 Curies, a 23 percent increase.

The distribution, obtained by calculating the area between two contamination-level isopleths, is perhaps a more meaningful measure for comparison, since risk assessment is based on the concentration level of the contaminant to which a subject is exposed. A second study (4) appended zeros (the methodology was not explained) to the RIDP 239Pu concentration data at the Palanquin site in order to calculate isopleth contours by computer. 239Pu (pCi/g) was calculated from 241Am (nCi/m2) by using the depth of the contamination (2.5 cm), the density of the soil (1.5 g/cm3), and the ratio of 239Pu to 241Am (2.6). The contours from the DOE study are shown along with our cokriging result in Figure 4, and Table I gives the areas and 241Am inventories for both studies, calculated from mean interval concentrations and areas.
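
The conversion can be written out explicitly; the numerical factor below follows from the stated depth, density, and activity ratio and is derived here rather than quoted from the study.

```python
# Convert a 241Am areal concentration (nCi/m2) to an equivalent 239Pu
# mass concentration (pCi/g), using the parameters given in the text.
DEPTH_CM    = 2.5        # assumed depth of contamination, cm
DENSITY     = 1.5        # soil bulk density, g/cm3
PU_AM_RATIO = 2.6        # activity ratio 239Pu : 241Am

def pu_pci_per_g(am_nci_per_m2):
    soil_g_per_m2 = DEPTH_CM * DENSITY * 1.0e4            # 1 m2 = 1e4 cm2 -> 37,500 g
    am_pci_per_g  = am_nci_per_m2 * 1.0e3 / soil_g_per_m2  # 1 nCi = 1e3 pCi
    return PU_AM_RATIO * am_pci_per_g

# Example: with these parameters, roughly 2,885 nCi/m2 of 241Am corresponds
# to the 200 pCi/g 239Pu clean-up level.
print(pu_pci_per_g(2885.0))   # ~200 pCi/g
```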

Figure 4. Comparison of areal distribution results from previous investigators (left) and cokriging (right), overlaid with Americium data (round dots). Shaded areas represent the same levels in both plots. Labels are in pCi/g.

DISCUSSION

Two major features of this method make it possible to improve the accuracy of the 241Am estimates. The first is the fact that both the 241Am and the Exposure data are measurements of the same phenomenon, which allows us to transform one set to be similar to the other on the scale of its large features. This transformation subsequently allows the Exposure data to be used to improve estimates of 241Am where it is sparsely sampled.

The second main feature is the method of defining the background regions, which in turn allows us to remove the trend from the 241Am data set. By understanding the data processing technique that produced the Exposure data, we were able to identify background levels of exposure (gamma energy) from man-made radionuclides. Appending zeros at background locations to the 241Am data set helps to define and, subsequently, to remove the trend. Because we achieved similarity of the two data sets by a transformation, we were able to remove the same trend from both data sets.

In waste management it is the contaminant distribution that ultimately determines the area to be remediated or demarcated; therefore, knowing it accurately is more important than knowing the inventory. Table I shows, for five clean-up levels, the difference in the estimated cost of remediating contaminated soil that results from using the distributions calculated by this method rather than those of the previous study. A recent DOE remediation of 1.25 hectares (at a cost of 5 million dollars) was performed for 239Pu concentrations above 200 pCi/g. From Table I, the area having 239Pu concentrations above 200 pCi/g at the Palanquin site would be 55.4 hectares by the DOE estimate and 60.8 hectares by ours. The increase in cost for this site would be 21.6 million dollars for the extra 5.4 hectares. If, however, the remediation level were set at 100 pCi/g, there would be a reduction of 57.2 million dollars in the cost of remediation. These numbers are significant, and this is just one of many sites on the NTS.
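
The cost differential can be checked with the unit cost implied by the 1.25-hectare, 5-million-dollar remediation cited above; the short sketch below reproduces the 21.6-million-dollar figure (60.8 minus 55.4 hectares gives the 5.4-hectare difference).

```python
# Unit cost implied by the recent DOE remediation cited in the text.
unit_cost_musd_per_ha = 5.0 / 1.25          # = 4.0 million dollars per hectare

area_doe_ha = 55.4    # area above 200 pCi/g, previous DOE estimate (Table I)
area_ck_ha  = 60.8    # area above 200 pCi/g, cokriging estimate (this study)

extra_ha   = area_ck_ha - area_doe_ha               # 5.4 hectares
extra_cost = extra_ha * unit_cost_musd_per_ha       # 21.6 million dollars
print(f"{extra_ha:.1f} ha -> ${extra_cost:.1f} M")
```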

Table I. Comparison of 239Pu Distributions and Calculated 241Am Inventories for the Palanquin Site

According to previous DOE estimates (4), the areas having 239Pu concentrations above 200 pCi/g for sites on and near the NTS are 760 and 550 hectares, respectively. At a cost of 4 million dollars per hectare, the total cleanup would cost over 5 billion dollars. Not all of these sites will be cleaned up, but the DOE plans to remediate the 550 hectares near the NTS. If the previous estimates were in error by 10 percent, the error in the cost estimate for that remediation could be 220 million dollars, which could have a significant effect on waste management decisions. For the entire NTS, such an error could produce errors in cost estimates of about one-half billion dollars.

CONCLUSION

A method that combines two existing data sets to improve estimates of inventories and distributions of specific radionuclides in soils has been developed. It takes advantage of the high energy resolution of ground-based gamma measurements that were acquired by using a hyper-pure germanium detector, and of the high areal coverage of an aerial survey (Exposure data) that used high-gain (but low energy resolution) sodium iodide detectors.

The comparisons of our results with those from other studies are interesting but not conclusive because the "truth" is not known; the method, however, is technically defensible. It uses, but does not depend on, sophisticated statistical procedures, and its main features are grounded in well-established data acquisition and processing methods. A study is needed to calibrate and validate our method; it would include acquiring collocated data from both data acquisition systems over at least one profile that includes a hot spot. Such a study has been proposed.

If this method is found to be valid, the cost differentials between its results and those of previous studies would justify its application to the other Corrective Action Sites on the Nevada Test Site.

NEXT STEP: APPLICATIONS TO RISK ASSESSMENT
AND RISK MANAGEMENT

Use of this method will permit more accurate human health risk assessments consistent with anticipated exposure conditions. Currently, interim remediation radionuclide concentration standards have been established for only a few of the 82 identified Corrective Action Sites with contaminated surface soils. These interim standards have been based on in-situ concentration measurements, presumed future exposure scenarios, and human health radiation dose limits based on DOE orders. The improved source characterization method also could be used to determine the impacts on proposed remediation concentration standards (and the subsequent impacts on remediation cost) of potential changes in the basic human health dose limit. DOE is developing a proposed draft rule applicable to the cleanup of property containing residual radioactive material (10 CFR Part 834). Similarly, both EPA and NRC have already issued draft rules on this topic (40 CFR Part 196 and the Decommissioning Rule, respectively). In addition, several DOE sites have already proposed adopting interim concentration standards consistent with dose limits more stringent than the maximum allowed DOE dose limit (reflecting State and Federal EPA regulatory compliance concerns).

ACKNOWLEDGMENTS

The work reported in this paper would not have been possible without the support and cooperation of Leah Dever and David Hippensteel of the DOE, Nevada Operations Office. Appreciation is due to Dr. Evan Englund of the US E.P.A. and to Dr. Dale Easley of the University of New Orleans for their professional advice and review of this manuscript.

REFERENCES

  1. Hendricks, Thane J. 1985. Radiation and Environmental Data Analysis Computer Hardware, Software, and Analysis Procedures. EG&G Energy Measurements Report, Section 2.
  2. Deutsch, Clayton V. and Journel, André G. 1992. Geostatistical Software Library and User's Guide, Chapter IV, pp. 69-115. New York: Oxford University Press.
  3. McArthur, Richard D. and Mead, S. W. 1988. Nevada Test Site Radionuclide Inventory and Distribution Program: Report #4, Areas 18 and 20. DOE/NV/10384-22, Publication #45063.
  4. Department of Energy. 1995. Cost/Risk/Benefit Analysis of Alternative Cleanup Requirements for Plutonium-Contaminated Soils On and Near the Nevada Test Site. DOE/NV-339, UC-700.
