
SPWLA 31st Annual Logging Symposium, June 24-27, 1990

RESOLUTION ENHANCEMENT OF NUCLEAR MEASUREMENTS THROUGH DECONVOLUTION

L. A. Jacobson
D. F. Wyatt, Jr.
L. L. Gadeken
G. A. Merchant
Halliburton Logging Services, Inc.
Houston, TX

Abstract

Recently, resolution enhancement of nuclear log measurements has used either continuously
recalibrated sensor information of good intrinsic vertical character or the matching of
formation models employing narrow rectangular beds to log measurements. The former is
computationally fast but sensitive to the borehole environment, while the latter technique is
computationally intensive, depends on the validity of the model assumed and is not readily
applied to real-time processing. As an alternative, deconvolution of count rate information
from nuclear measurements is proposed. Since noise enhancement is the inevitable result
of deconvolution, only mild deconvolution is employed.

Noisy and noise free synthetic data are used to demonstrate that the technique does not
produce any significant artifacts in the sharpened data, to determine the degree of bed
boundary sharpening that results and to illustrate the extent of noise enhancement. A
simple combined smoothing filter and deconvolver that requires only a short interval of
data is described and demonstrated on field measurements from natural gamma-ray and
pulsed neutron tools. The advantages of this technique are that it is simple and fast. It also
makes no assumptions concerning the physical environment so that standard processing of
the count rate data is retained.

Introduction

Resolution enhancement of nuclear log measurements has generally followed either of


two distinctly different paths. One uses continuously recalibrated sensor information of
good intrinsic vertical character, such as the near spaced detectors of density or neutron
tools.[1-3] Recently, one of these procedures was evaluated and compared to inverse filtering.[8]
The other involves matching formation models employing narrow rectangular beds to log
measurements using sophisticated minimization schemes such as the maximum entropy
method.[5-7] The former is computationally fast but can have problems with rapid variations
in the borehole environment. Furthermore, it can never achieve depth sharpening
greater than the intrinsic resolution of the near spaced sensor. The latter techniques can
produce impressive results; however, they are computationally intensive, depend on the
validity of the assumed model, and are not readily applied in real-time.

An alternative is proposed here which employs the deconvolution of nuclear count rate
data in a manner similar to that suggested by Quirein and Purdy.[4] This procedure has been
assiduously avoided in the past because noise enhancement is the inevitable result of
deconvolution and nuclear data are inherently noisy. However, rather than completely
deconvolve the sensor's smearing effect, only mild deconvolution is utilized to minimize the
noise amplification. Furthermore, the scheme is restricted to sensors where count rates are
fairly high. Noisy and noise free synthetic data are used to demonstrate that the technique
does not produce any significant artifacts in the sharpened data such as ringing or serious
overshoots at sharp bed boundaries. These synthetic data are also used to determine the
degree of bed boundary sharpening that results for a given degree of deconvolution and to
illustrate the extent of noise enhancement as a function of the strength of the deconvolution
used. A simple combined smoothing filter and deconvolver is described that is fast
and requires only a short interval (several feet) of log data in real-time applications.

The combined filter is applied to real log data from a natural gamma-ray tool and a pulsed
neutron tool to demonstrate its utility. The procedure makes few assumptions save for
the sensor impulse response function. Previous work[9] has shown this not to be a very
stringent constraint. The advantages of this approach are that it is simple and fast, so that
consistent results are obtained in real-time log data processing. Furthermore, no assumptions
concerning the physical environment are made, so that standard processing of the
count rate data is retained. For the pulsed neutron log this means the usual computation
of Σ is retained along with all the standard environmental corrections. A procedure similar
to this, but more sophisticated, for the dual-spaced neutron porosity tool is discussed
elsewhere.[10]

Theoretical basis

Czubek” has shown that a simple gamma-ray detector has a spatial response function
that is closely approximated by the convolution of a rectangular box whose length is
the detector’s physical length with an exponential cusp function whose decay constant is
related to the average gamma-ray attenuation length in common reservoir formations. This
is shown schematically in figure 1. In-house evaluations and data published by Czubek
show that the cusp decay constant, (Y,has a value of approximately 3 ft-’ for sedimentary
formations. The similarity of the box-plus-cusp function to a gaussian function, which will
be discussed in detail below, is important to the success of the smoothing filter described
by Jacobson.’

The box-plus-cusp representation of the sensor's spatial response function suggests that
partial deconvolution may be obtained by simply removing the effect of the cusp. Moreover,
a sampled exponential cusp has an inverse spatial filter which is very simple. Figure 2
shows the spatial filter coefficients for the inverse cusp. This three point filter will be
familiar to many in the well log community since it has the same form as the shoulder bed
correction filter that has been applied to the deep induction log for over 30 years. The
spatial coefficients (current depth minus one sample, current depth and current depth plus
one sample) for this deconvolver are: -a, 2a + 1, -a, where a is obtained from the cusp decay
constant. For a decay constant of 3 ft⁻¹ and a sampling interval of 1/4 ft the value of a is
1.7.
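
To make the construction concrete, the following Python sketch (not part of the original processing code) builds and applies this three-point deconvolver. The relation a = 1/(α Δz)² used below is our reading of how a follows from the cusp decay constant: the Fourier transform of a normalized exponential cusp is α²/(α² + k²), so its inverse is 1 + k²/α², and replacing the second derivative by a centered difference over the sample interval Δz yields the coefficients -a, 2a + 1, -a with a = 1/(α Δz)². This assumption reproduces the quoted value of about 1.7 for α = 3 ft⁻¹ and Δz = 1/4 ft.

    import numpy as np

    def inverse_cusp_filter(alpha_per_ft, dz_ft):
        # Three-point inverse-cusp deconvolver [-a, 2a+1, -a]; the expression for a
        # is an assumption consistent with the value of 1.7 quoted in the text.
        a = 1.0 / (alpha_per_ft * dz_ft) ** 2
        return np.array([-a, 2.0 * a + 1.0, -a])

    def deconvolve_cusp(count_rate_log, coeffs):
        # Apply the symmetric three-point filter; end samples are replicated so the
        # output has the same length as the input log.
        padded = np.pad(np.asarray(count_rate_log, dtype=float), 1, mode="edge")
        return np.convolve(padded, coeffs, mode="valid")

    # Example: alpha = 3 ft^-1 sampled every 1/4 ft gives a of about 1.8 (roughly 1.7).
    coeffs = inverse_cusp_filter(3.0, 0.25)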

Thus, the approach taken here is to first smooth the log data using the filtering algorithm
of Jacobson and then apply the inverse cusp deconvolution filter to enhance the spatial
resolution. The former provides nearly optimal suppression of statistical noise consistent
with preservation of the sensor spatial resolution while the latter enhances the spatial
resolution at some increase in log noise.
It is important to remember that the object of resolution enhancement is solely to create
sharper bed character where it exists intrinsically. Some of the log examples in the field log results section
will clearly demonstrate that not all features are spatially enhanced.

Results on synthetic data

Simulated logging profiles are a useful mechanism for testing many data processing tech-
niques since the target objective is then known in advance. Field log data can be used
to show how the filtering and deconvolution schemes reported here perform, but since the
true geologic profile is often not well known, it is difficult to assess the accuracy of the
scheme. From field logs it is also difficult to determine if the processing introduces any
biases or overshoots. However, noise free synthetic data can be constructed that allow
accurate assessment of bias or overshoot effects. When a known amount of noise is added
to the synthetic data, the suppression or enhancement of noise can be accurately assessed
for the techniques developed here.

A simulated log profile was constructed which possessed many features found on real
well logs. A portion of this simulated log (dotted curve) is shown in figure 4. Depth
sampling at 1/4 ft intervals was used on beds of thickness down to 1 ft. This profile
was smeared by the finite spatial response of a sensor which was characterized by a box
length of 0.75 ft and a cusp decay constant of 3 ft⁻¹. These values produce a sensor response
function of roughly 1 ft spatial resolution (defined here as the full width at half
maximum of the response function), which is shown by the solid curve in figure 3. This
response characteristic would be typical of a gamma-ray logging tool. A Gaussian response
function of 1 ft spread (FWHM) is shown by the dotted line in figure 3 for comparison.
The close similarity of the two functions allows the use of the filtering algorithm described
by Jacobson. This algorithm has the property of significantly reducing statistical noise
consistent with minimal degradation of the vertical resolution of the sensor.

This can be seen in figure 4 by comparing the smeared response (dashed curve) with the
response after filtering (solid curve). They are nearly indistinguishable from one another.
Clearly there is virtually no loss of spatial resolution. Figure 6 shows the effectiveness of
this filter (solid curve) in suppressing noise which was added to the smeared profile (dashed
curve). The random noise added had a one-standard-deviation amplitude of 3% of full scale, with the
relative noise amplitude increasing as the reciprocal square root of the signal amplitude as the
amplitude decreases. This approximates the effect of statistical noise.
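
As an illustration, one way to realize this noise model in Python (a minimal sketch under the stated assumptions, with the full-scale value and noise fraction as parameters) is:

    import numpy as np

    def add_counting_noise(profile, full_scale, frac_at_full_scale=0.03, seed=0):
        # Noise standard deviation is 3% of full scale at full-scale amplitude and
        # scales as the square root of the signal, so the relative noise grows as the
        # reciprocal square root of the amplitude, mimicking counting statistics.
        rng = np.random.default_rng(seed)
        profile = np.asarray(profile, dtype=float)
        sigma = frac_at_full_scale * np.sqrt(np.clip(profile, 0.0, None) * full_scale)
        return profile + rng.normal(size=profile.shape) * sigma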

This smoothing filter, whose spatial filtering coefficients are shown by the short-dashed
curve in figure 3, preserves the sensor intrinsic spatial resolution but does not enhance
it. As mentioned above some enhancement can be obtained by applying the inverse cusp
filter (a deconvolver) to the filtered data. The effect of filtering and deconvolving in this
manner is illustrated by the solid curve in figure 5 for noise free data. Small overshoots of
a few percent do occur at the high contrast bed boundaries (for instance at a depth near
12 ft) and at the center of the 2 ft beds. Notice, however, the significantly better center
bed response in the 1 ft beds and the sharper bed boundaries in the thick beds. Figure 7
shows how the combination of filtering and deconvolution performs on noisy data. Clearly
the deconvolution process has slightly increased the noise as expected. The noise level is
about the same (perhaps slightly less) when compared to the unfiltered data. However,
the spatial resolution is enhanced. In sum, improved spatial resolution is obtained relative
to unfiltered raw data without any increase in noise. Because of the simplicity of the
inverse cusp deconvolution filter, it can easily be combined with the smoothing filter to
produce a single filter that accomplishes both processes. The spatial filter coefficients for
this combined filter are shown by the dot-dashed curve in figure 3. This single combined
filter has only two adjustable parameters: the smoothing filter’s spatial extent (FWHM)
and the deconvolver strength (a).
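
A minimal sketch of such a two-parameter combined filter follows. The Gaussian kernel below stands in for the matched smoothing filter of Jacobson,[9] whose actual coefficients are not reproduced here, so the sketch illustrates the structure of the combined filter rather than the exact field implementation.

    import numpy as np

    def gaussian_kernel(fwhm_samples):
        # Gaussian smoothing kernel specified by its full width at half maximum,
        # used here as a stand-in for the matched smoothing filter of reference 9.
        sigma = fwhm_samples / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        half_width = int(np.ceil(3.0 * sigma))
        x = np.arange(-half_width, half_width + 1)
        g = np.exp(-0.5 * (x / sigma) ** 2)
        return g / g.sum()

    def combined_filter(fwhm_samples, a):
        # Convolving the smoothing kernel with the [-a, 2a+1, -a] deconvolver gives a
        # single kernel with only two adjustable parameters: FWHM and a.
        decon = np.array([-a, 2.0 * a + 1.0, -a])
        return np.convolve(gaussian_kernel(fwhm_samples), decon)

    def apply_filter(count_rate_log, kernel):
        # Edge-padded, same-length application of the combined kernel.
        half = len(kernel) // 2
        padded = np.pad(np.asarray(count_rate_log, dtype=float), half, mode="edge")
        return np.convolve(padded, kernel, mode="valid")

    # For the 1 ft gamma-ray response sampled at 1/4 ft, combined_filter(4, 1.7)
    # would be the corresponding setting under these assumptions.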

Field log results

This section contains field log examples showing the results of using the combined data-
filtering plus deconvolution processing on natural gamma ray data and on pulsed neutron
data. The gamma ray logging runs were made in one of the test wells at the HLS facility in
Ft. Worth, Texas, at a logging speed of 1800 ft/hr and used a sampling rate of ten samples
per foot. The gamma ray tool had an 8 inch long sodium iodide detector.

The raw data and the processed log results are shown in Figure 8. The raw data are
displayed in track 1 at the left, the smoothed data are in track 2 in the middle and the
smoothed plus deconvolved data are in track 3 at the right. In each track the three
individual logging passes are shown as the lighter dashed traces surrounding a heavy solid
trace corresponding to the average value. The thinner solid trace at the base of each track
is the standard deviation associated with the average. Note that the full-scale value for
each track is 200 API.

The raw gamma ray data in track 1 show that there are some small structural features
which are masked when the data are processed with a smoothing filter using the spatial
coefficients obtained from the dashed curve in figure 3 and shown in track 2. However,
note several discernible thin beds, particularly the 1.5 ft bed at 1851 ft, the 1 ft bed at
1877 ft and the 1.5 ft bed at 1903 ft. The deconvolution enhances some of the lost smaller
structure and, for the beds noted, results in an enhanced minimum-to-maximum amplitude
when compared to the unfiltered data. However, the additional statistical noise tends to
mask some of the smaller real features when only a single pass is used (see the region 1880-
1895 ft). The standard deviation curves for the deconvolved logs clearly show this increased
uncertainty. Since natural gamma rays are emitted at relatively low counting rates, the
statistics of the measurements cannot be materially improved except by multiple logging
passes. Considering how low the counting rates actually are, the deconvolved logs in track
3 show significant improvement. All the thin bed features noted above exhibit excellent
repeatability. For nuclear logging tools that contain sources which produce the detected
gamma-rays, the issue of statistics becomes much less important, as is demonstrated in the
next example.

The combined data filtering and deconvolution process was implemented in pulsed neutron
logging software to demonstrate its utility to this application. These data were sampled
at 1/4 ft intervals. The data smoothing filter width was first determined through careful
matching of smoothed data to unfiltered results. A tool response width (FWHM) of 5
samples (1.25 ft) was obtained in this manner. Then the inverse cusp deconvolver was
added. A cusp decay constant of 4 ft⁻¹ was used here. With the filter-deconvolver finalized,
it was installed in the standard field logging software, allowing count rate resolution
sharpening prior to the neutron capture cross section (Σ) computation.
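
For reference, the pulsed neutron settings quoted above map onto the earlier sketch (again assuming the a = 1/(α Δz)² relation; the variable names are illustrative only) as follows:

    # Hypothetical parameterization of the sketch functions defined earlier for the
    # pulsed neutron case: 1/4 ft sampling, a 5 sample (1.25 ft) FWHM smoothing
    # filter, and a cusp decay constant of 4 ft^-1.
    dz = 0.25                     # ft, depth sampling interval
    fwhm_samples = 5              # 1.25 ft tool response width
    alpha = 4.0                   # ft^-1, cusp decay constant
    a = 1.0 / (alpha * dz) ** 2   # equals 1.0 under the assumed relation
    kernel = combined_filter(fwhm_samples, a)
    # sharpened = apply_filter(raw_count_rates, kernel)  # prior to the sigma computation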

Figure 9 is a log example comparing the results of the new processing scheme with Σ's
computed from: a) unfiltered count rates, b) standard filtering and c) a previously published
resolution enhancement technique of Smith and Wyatt.[3] Also shown are standard
deviations computed from multiple logging passes for the various Σ's displayed here. The
standard deviations between different schemes should be compared only in regions of little
log activity - elsewhere much of the apparent standard deviation arises from problems of
depth matching multiple runs and not statistics. Figure 9 shows that the deconvolved
log has slightly less statistical variation than the unfiltered data and an improved vertical
resolution. Compared to the processing technique of Smith and Wyatt, it is noisier but
has better vertical response. This is most clearly illustrated in figure 10 which expands a
portion of figure 9 and displays the curves overlaid to better show the differences. For the
thin bed at 6268 ft, the deconvolution process produces the largest peak Σ value of any
of the techniques shown. Elsewhere the deconvolved log follows the unfiltered and high
resolution processing closely without introducing horns or other structure - note the bed
at 6300 ft. Enhanced response can also be noted in figure 9 in the thin beds at 6360 ft and
6420 ft. The principal advantage, however, is that, since the filter-deconvolution process is
applied directly to the raw count rates, standard Σ and ratio processing software is used.
Thus, it is applicable in real-time and eliminates concerns over unknown or miscalibrated
count-rate-to-log functions used by other processing methods.

Conclusions

Mild deconvolution employing an inverse cusp filter was applied to gamma-ray and pulsed
neutron capture data. The reasonableness of this approach was demonstrated on synthetic
data. When applied to well log data, a noticeable improvement in vertical response was
observed with only a modest increase in log noise. Even so, the statistical uncertainties of
the filtered and deconvolved logs were generally less than those of the unfiltered logs. The
important advantages of this approach are the simplicity of the combined filter-deconvolver
(making it well suited to real-time applications) and the retention of standard log value
data computations and corrections since the filtering is applied directly to the count rates.

Acknowledgements

We thank the oil company responsible for the pulsed neutron log example and Dennis
Durbin of HLS for his invaluable assistance in developing the log processing software.

References

1. J. E. Galford, C. Flaum, W. A. Gilchrist, and S. W. Duckett, Enhanced resolution processing of compensated neutron logs, paper SPE 15441, SPE Ann. Tech. Conf., 1986.

2. C. Flaum, J. E. Galford, and A. Hastings, Enhanced vertical resolution processing of dual detector gamma-gamma density logs, paper M, SPWLA 28th Ann. Logging Symp., 1987.

3. H. D. Smith, Jr., and D. F. Wyatt, Jr., A technique for obtaining high vertical resolution formation capture cross section from the thermal multigate decay log, paper TT, SPWLA 30th Ann. Logging Symp., 1989.

4. J. A. Quirein and C. C. Purdy, Improved resolution of nuclear well logs, paper A, SAID 12th Int. Form. Eval. Symp. Trans., 1989.

5. C. J. Dyos, Inversion of well log data by the method of maximum entropy, paper H, Trans. 10th European Form. Eval. Symp., 1986.

6. W. D. Lyle and D. M. Williams, Deconvolution of well log data - an innovations approach, The Log Analyst, v. 28, n. 3, p. 321, 1987.

7. P. Sheng, B. White, B. Nair, and S. Kerford, Bayesian deconvolution of gamma-ray logs, Geophysics, v. 52, n. 11, p. 1535, 1987.

8. D. V. Ellis, Some insights on neutron measurements, presented at the 1989 IEEE Nucl. Sci. Symp., San Francisco, CA, Jan. 1990.

9. L. A. Jacobson, A matched filter data smoothing algorithm, IEEE Trans. Nucl. Sci., v. 36, n. 1, p. 1227, 1989.

10. M. P. Smith, Enhanced vertical resolution processing of dual-spaced neutron and density tools using standard shop calibration and borehole compensation procedures, to be presented at SPWLA 31st Ann. Logging Symp., June 1990.

11. J. A. Czubek, Quantitative interpretation of gamma-ray logs in presence of random noise, paper KKK, SPWLA 27th Ann. Logging Symp., 1986.


Figure 1. The convolution of a rectangular response with a cusp results in a Gaussian-like response function.


Figure 2. The simple spatial three point deconvolution filter for the cusp shown in figure 1.


[Figure 3 legend: Cusp + Box; Gaussian; Filter Function; Filter + Deconvolver. Horizontal axis in samples (0.25 ft).]
Figure 3. Response and smoothing filter functions for a sensor with 1 foot spatial resolution.


Figure 4. When a matched smoothing filter (solid) is applied to a simulated formation response (dotted) smeared with a box-plus-cusp response (dashed), the effect is negligible. The solid curve falls almost exactly on top of the dashed curve.

Figure 5. The same as figure 4, except the solid curve is now the effect of the smoothing filter plus the cusp deconvolver.


[Figure 6 plot: response amplitude versus depth (ft).]
Figure 6. A simulated formation response (dotted) is smeared with a box-plus-cusp response to which noise was added (dashed). The solid curve is the effect of a matched smoothing filter.

Figure 7. The same as figure 6, except the solid curve is now the effect of the smoothing filter plus the cusp deconvolver.



Figure 8. Comparison of raw gamma-ray data (left) with smoothed (middle) and smoothed-plus-deconvolved processed results (right). Three logging passes are shown as the lighter dashed traces in each track, with the corresponding average shown as the heavy solid trace. The lighter solid trace at the left is the standard deviation associated with the three passes.


[Figure 9 log display: Σ curves (unfiltered, deconvolved, HR, and standard) with their corresponding standard deviation traces.]

Figure 9. Curves at the right show a comparison of PNL data computed from: a) unfiltered count rates, b) matched filtered - deconvolved count rates, c) HRSIGMA processing, d) standard filtered count rates. Curves at the left show the corresponding standard deviations.



Figure 10. A detailed comparison from figure 9 of the bedding response for the various processing techniques.

