Density is a desirable property in the input to most seismic processing. Sparse or irregular data lead to edge effects and aliasing: Kirchhoff methods leave acquisition footprints or generate spurious events, while frequency-domain and finite-difference methods may be unstable or entirely inapplicable. Velocity space may be used to interpolate irregular or sparse CMP gathers, but the success of such a method depends greatly on the spikiness of the model. The reason is that the null space of the operator changes when the data space changes, and that change is precisely what accomplishes the interpolation and/or regularization. A spike in velocity space models a hyperbola in any data space, whereas some other shape in velocity space is likely to work only for a single data space. With preconditioning, the model may be made spikier and the interpolation results greatly enhanced.
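The idea can be sketched numerically. The toy below (all grid values and names are hypothetical, and iteratively reweighted least squares stands in for whatever preconditioning the processing system actually uses) builds an explicit hyperbolic Radon matrix, inverts sparse-offset data for a velocity-space model, sharpens that model with reweighting, and then remodels the event onto a regular offset grid:

```python
import numpy as np

def radon_matrix(taus, vels, times, offsets):
    # Hyperbolic Radon operator as an explicit matrix (toy sizes only):
    # a unit spike at (tau, v) maps to ones along t = sqrt(tau^2 + (x/v)^2).
    nt, nx = len(times), len(offsets)
    dt = times[1] - times[0]
    L = np.zeros((nt * nx, len(taus) * len(vels)))
    ix = np.arange(nx)
    for j, (tau, v) in enumerate([(t0, v0) for t0 in taus for v0 in vels]):
        t = np.sqrt(tau**2 + (offsets / v) ** 2)
        it = np.rint((t - times[0]) / dt).astype(int)
        ok = (it >= 0) & (it < nt)
        L[it[ok] * nx + ix[ok], j] = 1.0
    return L

# Hypothetical grids: time in s, offset in km, velocity in km/s.
times = np.arange(0.0, 1.0, 0.01)
taus = np.arange(0.2, 0.80001, 0.05)
vels = np.arange(1.5, 3.00001, 0.25)
x_sparse = np.array([0.0, 0.3, 0.7, 1.0, 1.4])   # irregular acquired offsets
x_dense = np.linspace(0.0, 1.4, 29)              # regular output offsets

L_sparse = radon_matrix(taus, vels, times, x_sparse)
L_dense = radon_matrix(taus, vels, times, x_dense)

# Synthetic data: a single hyperbola at tau0 = 0.4 s, v0 = 2.0 km/s.
j_true = 4 * len(vels) + 2        # taus[4] = 0.4, vels[2] = 2.0
m_true = np.zeros(L_sparse.shape[1])
m_true[j_true] = 1.0
d = L_sparse @ m_true

# Damped least squares, then IRLS model weighting as a stand-in for
# preconditioning: reweighting drives energy into a few spikes, which
# generalize across data spaces better than a smeared model.
eps = 0.1
m = np.linalg.solve(L_sparse.T @ L_sparse + eps * np.eye(L_sparse.shape[1]),
                    L_sparse.T @ d)
for _ in range(3):
    W = np.abs(m) + 1e-3          # model weights from the previous estimate
    A = L_sparse * W              # column scaling = preconditioned operator
    z = np.linalg.solve(A.T @ A + eps * np.eye(A.shape[1]), A.T @ d)
    m = W * z

# The spiky model remodels the hyperbola on the regular offset grid.
d_interp = L_dense @ m
```

The point of the sketch is the last line: because the recovered model is a spike, applying the forward operator with a different (here, denser and regular) offset geometry reproduces the event there, which is exactly the interpolation mechanism described above.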
This method gave good results on the examples studied here. Especially pleasing is its ability to handle non-hyperbolic moveouts, such as those from point scatterers. Interesting questions arise with the next step, a move to 3D. How should azimuth be dealt with? Some subsequent processes will want offset and azimuth coordinates, others x and y. Regardless, Kirchhoff methods do best with regularly spaced data, where amplitudes may be properly calculated. Will preconditioning be able to regularize as well as interpolate, or will the moving null spaces sink it?
Further, although the synthetic example interpolated aliased data without creating spurious events, and the method appeared equally effective in a real-data test, more exhaustive testing is warranted to be certain that the preconditioning will not actually create new events in the data space.