Trace interpolation by simulating a prediction-error filter

Jun Ji

By simulating a prediction-error filter using a dip spectrum, I devise an interpolation scheme. The procedure has three stages: (1) the dips of the linear events in a given data set are estimated from the dip spectrum obtained by slant stacking; (2) filters in the $(f,x)$ domain are designed by placing zeros along the picked dips in the $(f,k)$ spectrum; (3) missing traces are found by minimizing the filtered output in the least-squares sense. The advantage of this approach is that it applies to both regularly and irregularly sampled traces.
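The abstract describes the three stages only at a high level. Below is a minimal NumPy sketch of how they might fit together, not the paper's exact implementation: it assumes regularly sampled traces in a single 2-D panel, a small set of already-picked dips, and hypothetical helper names (`dip_spectrum`, `pef_coefficients`, `fill_slice`, `interpolate`). For irregular spacing, the phase factor $e^{-2\pi i f p\,\Delta x}$ would instead vary per trace pair.

    import numpy as np

    def dip_spectrum(data, dt, dx, p_axis):
        """Slant-stack energy as a function of trial dip p."""
        nt, nx = data.shape
        t = np.arange(nt) * dt
        energy = np.zeros(len(p_axis))
        for ip, p in enumerate(p_axis):
            stack = np.zeros(nt)
            for ix in range(nx):
                # shift trace ix by p*x so events with dip p align, then stack
                stack += np.interp(t + p * ix * dx, t, data[:, ix],
                                   left=0.0, right=0.0)
            energy[ip] = np.sum(stack ** 2)
        return energy

    def pef_coefficients(f, dips, dx):
        """Spatial filter whose (f,k) response has a zero on k = f*p per dip."""
        coeffs = np.array([1.0 + 0j])
        for p in dips:
            # a plane wave of dip p obeys D(f, x+dx) = exp(-2j*pi*f*p*dx) * D(f, x),
            # so the two-point filter (1, -root) annihilates it
            root = np.exp(-2j * np.pi * f * p * dx)
            coeffs = np.convolve(coeffs, np.array([1.0, -root]))
        return coeffs

    def fill_slice(d, known, coeffs):
        """Least-squares estimate of the unknown samples of one frequency slice."""
        nx, na = len(d), len(coeffs)
        A = np.zeros((nx - na + 1, nx), dtype=complex)
        for r in range(nx - na + 1):
            A[r, r:r + na] = coeffs[::-1]   # convolution (filtering) matrix
        rhs = -A[:, known] @ d[known]       # move known traces to the right side
        m, *_ = np.linalg.lstsq(A[:, ~known], rhs, rcond=None)
        out = d.copy()
        out[~known] = m
        return out

    def interpolate(data, known, dips, dt, dx):
        """Fill missing traces (columns where known is False) of a 2-D panel."""
        nt = data.shape[0]
        D = np.fft.rfft(data, axis=0)
        for i, f in enumerate(np.fft.rfftfreq(nt, dt)):
            D[i] = fill_slice(D[i], known, pef_coefficients(f, dips, dx))
        return np.fft.irfft(D, n=nt, axis=0)

Picking the dips from the peaks of `dip_spectrum` is left to the user. Note that once the filter is fixed, the filtered output is linear in the missing trace values, so the minimization in stage (3) reduces to an ordinary linear least-squares solve per frequency.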

SEP Report, vol. 73, 203-219, 1992.

