
Data continuity assumptions

To eliminate noise, some description of the signal is needed. Here, signal is assumed to be anything that is predictable from trace to trace, while noise is unpredictable between traces. Although this notion of predictability could be refined to define as signal anything outside the evanescent zone (Claerbout, 1985), for simplicity I keep the assumption that signal is the only part of the data that is predictable between traces. Although I later show an example where some coherent noise was removed, the process described in this chapter is not designed to attack coherent noise such as ground roll; the emphasis is on removing isolated noise rather than coherent noise.
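The sketch below is only a minimal illustration of this predictability assumption, not the algorithm developed in this chapter: each trace is predicted from its neighbours, and samples with an anomalously large prediction residual are flagged as suspected isolated noise. The array name `data` and the simple two-neighbour predictor are assumptions made for the example.

  import numpy as np

  def prediction_residual(data):
      """Residual after predicting each trace as the mean of its two
      neighbouring traces.  `data` has shape (n_time, n_traces); the
      edge traces wrap around, which is acceptable for a sketch."""
      predicted = 0.5 * (np.roll(data, 1, axis=1) + np.roll(data, -1, axis=1))
      return data - predicted

  def flag_unpredictable(data, threshold=3.0):
      """Boolean mask of samples whose residual is far larger than typical,
      i.e. samples that are not predictable between traces (noise)."""
      resid = prediction_residual(data)
      scale = np.median(np.abs(resid)) + 1e-12
      return np.abs(resid) > threshold * scale

Predictable events (signal) largely cancel in the residual, while isolated bursts do not, which is the distinction the assumption relies on.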

Another assumption made here is that good data can be predicted with filters. For a valid prediction filter to be computed, the missing samples can make up only a fraction of the original data; otherwise there is not enough information to estimate the filter reliably. A related assumption is that enough good data remain to estimate such a filter: if the data are completely dominated by noise, calculating a meaningful filter is difficult.
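As a hedged illustration of the "enough good data" requirement (again not the thesis code), a short trace-to-trace prediction filter can be fit by least squares using only samples marked as good; when too few good samples survive, the fit is refused. The names `good_mask` and `fit_prediction_filter`, the filter length, and the sample-count cutoff are all assumptions of this example.

  import numpy as np

  def fit_prediction_filter(data, good_mask, nlags=2):
      """Least-squares filter predicting trace j from traces j-1 .. j-nlags,
      using only rows where all the involved samples are marked good."""
      n_t, n_x = data.shape
      rows, rhs = [], []
      for j in range(nlags, n_x):
          ok = np.all(good_mask[:, j - nlags:j + 1], axis=1)  # all inputs good
          rows.append(data[ok, j - nlags:j][:, ::-1])          # predictor samples
          rhs.append(data[ok, j])                               # samples to predict
      A, b = np.vstack(rows), np.concatenate(rhs)
      if b.size < 10 * nlags:
          # Too few good samples: the filter cannot be estimated reliably.
          raise ValueError("not enough good data to estimate the filter")
      coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
      return coeffs

When noise dominates, either too few rows pass the mask or the least-squares fit describes the noise rather than the signal, which is why the assumption is needed.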

