
Arguments against patching

Independent patches work reasonably well in many cases, but there are several arguments against them. The simplest is that patching effectively increases the size of the data. Patches usually need significant overlap to produce a good interpolation result, so along a single axis a given data point falls in roughly 1.5 to 2 patches on average. For a 3-D cube of input data, as in a prestack 2-D survey or a prestack 3-D survey considered as individual source/streamer combinations, that factor applies along each of the three axes, effectively increasing the data volume by roughly 3.4 (1.5 cubed) to 8 (2 cubed) times.

Using adaptive PEFs means storing a larger volume of filter coefficients rather than a larger volume of data. Unlike a larger data volume, the larger volume of PEFs does not add to the number of computations required. The computational cost of convolution scales with the size of the data and the number of coefficients that multiply each data point (the size of a single filter), but it is indifferent to whether the coefficients that multiply two different data points belong to the same filter or to different filters.
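
To make the cost argument concrete, the following minimal sketch in Python (illustrative only, not the code used in this work) compares a stationary 1-D convolution with a nonstationary one that draws a different filter for every output sample. Both loops perform the same number of multiply-adds per output sample; only the storage for the filter coefficients differs. The function names and the 1-D setting are assumptions made for illustration.

    import numpy as np

    def stationary_convolve(data, filt):
        # One filter applied everywhere:
        # roughly len(data) * len(filt) multiply-adds.
        nf = len(filt)
        out = np.zeros_like(data)
        for i in range(nf, len(data)):
            out[i] = np.dot(filt, data[i - nf:i])
        return out

    def adaptive_convolve(data, filters):
        # One filter per output sample (filters has shape [len(data), nf]):
        # the same number of multiply-adds as above, even though the table
        # of coefficients is much larger than a single filter.
        nf = filters.shape[1]
        out = np.zeros_like(data)
        for i in range(nf, len(data)):
            out[i] = np.dot(filters[i], data[i - nf:i])
        return out

The memory footprint of the filter table grows, but the application cost stays tied to the data size and the length of a single filter.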

More important than the cost is the argument that the data are not really aligned along linear, constant slopes. Large portions of the data often nearly are, usually at late times and large offsets, where events approach their asymptotes. However, other portions of the data do not fit that model, especially near the apexes of hyperbolas, where events have the most curvature. As shown in Figure bpcmpdiff, interpolation results suffer where events have significant curvature. Modifying our assumptions to fit curved data should yield better interpolation results.

Also, while there is some convenience in dividing the interpolation into many small independent problems, doing so ignores some potentially useful information. Dips in prestack seismic data are not independently distributed throughout the data volume. The seismic data response of a point reflectivity anomaly is large relative to an interpolation patch and has a somewhat predictable shape, even with variable velocities. In other words, even when events are nonhyperbolic, they are likely to be nearly hyperbolic, in some rough sense.

Instead of estimating filters in independent patches, in this chapter we estimate sets of filters in smaller, non-overlapping, non-independent micropatches. Micropatches that are near each other in the data volume are assumed to have similar sets of dips, and thus similar filter coefficients; that is, we assume the dips of seismic events change gradually as we move around in the data. The micropatches are small enough that events with tight curvature can still be reasonably resolved into locally linear events, but large enough to avoid excessive memory consumption from allocating more filters than necessary.
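
As a rough illustration of the bookkeeping only (the estimation itself is described in the next section), the Python sketch below tiles a 2-D data array with non-overlapping micropatches and allocates one filter per micropatch. The function names, micropatch sizes, and the idea of coupling neighboring filters through a smoothness penalty are assumptions consistent with the text, not the actual implementation.

    import numpy as np

    def micropatch_index(i, j, patch_nt, patch_nx):
        # Map a data sample (i, j) to the micropatch that contains it.
        # Micropatches tile the data without overlap, so each sample
        # belongs to exactly one micropatch.
        return (i // patch_nt, j // patch_nx)

    def allocate_filters(nt, nx, patch_nt, patch_nx, filter_len):
        # One filter per micropatch, rather than one per overlapping patch.
        npatch_t = -(-nt // patch_nt)   # ceiling division
        npatch_x = -(-nx // patch_nx)
        # During estimation, filters of neighboring micropatches would be
        # tied together by a smoothness (roughening) penalty, reflecting
        # the assumption that dips change gradually across the data.
        return np.zeros((npatch_t, npatch_x, filter_len))

    # Example: a 1000 x 200 data panel with 20 x 10 micropatches.
    filters = allocate_filters(1000, 200, 20, 10, filter_len=15)
    it, ix = micropatch_index(537, 42, 20, 10)   # -> (26, 4)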

