Vast regions of the world have good petroleum potential but are hard to explore because good-quality reflection seismic data is difficult to obtain there, often for reasons unknown. What is ``poor quality'' data? From an experimental view, almost all seismic data is good in the sense that it is repeatable. The real problem is that the data makes no sense.
Take as an earth model a random arrangement of point reflectors. Its migrated zero-offset section should look random too. Given the repeatability experienced in data collection, data with a random appearance implies a random jumble of reflectors. With only zero-offset data little else can be deduced. But with the full range of offsets at our disposal, a more thoughtful analysis can be tried. This chapter provides some of the required techniques.
An interesting model of the earth is a random jumble of point scatterers in a constant-velocity medium. The data would be a random function of time and a random function of the horizontal location of the shot-geophone midpoint. But after suitable processing, for each midpoint, the data should be a perfectly hyperbolic function of shot-geophone offset. This would determine the earth velocity exactly, even if the random scatterers were distributed in three dimensions, and the survey were only along a surface line.
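The velocity determination described above can be sketched numerically. This is a minimal illustration, not the book's own code: it assumes a single event with the standard hyperbolic moveout $t(h)^2 = t_0^2 + (2h/v)^2$ at one midpoint (half-offset $h$, two-way zero-offset time $t_0$), and the chosen values of $v$ and $t_0$ are arbitrary. Because $t^2$ is linear in the squared offset, a straight-line fit recovers the velocity:

```python
import numpy as np

# Assumed model parameters for illustration only.
v_true = 2000.0   # medium velocity (m/s)
t0 = 1.0          # two-way zero-offset traveltime (s)

# Half-offsets h; the full shot-geophone offset is 2h.
h = np.linspace(0.0, 1000.0, 21)

# Hyperbolic moveout: t(h)^2 = t0^2 + (2h/v)^2
t = np.sqrt(t0**2 + (2.0 * h / v_true)**2)

# t^2 is linear in (2h)^2 with slope 1/v^2 and intercept t0^2,
# so a degree-1 least-squares fit recovers both parameters.
slope, intercept = np.polyfit((2.0 * h)**2, t**2, 1)
v_est = 1.0 / np.sqrt(slope)
t0_est = np.sqrt(intercept)
print(v_est, t0_est)
```

With noise-free hyperbolic data the fit returns the exact velocity and zero-offset time; in practice the same redundancy across offsets is what lets velocity be estimated even when each individual trace looks random.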
This particular model could fail to explain the ``poor quality'' data. In that case other models could be tried. The effects of random velocity variations in the near surface or the effects of multiple reflections could be analyzed. Noise in seismology can usually be regarded as a failure of analysis rather than as something polluting the data. It is the offset dimension that gives us the redundancy we need to try to figure out what is really happening.