Although the interval velocities obtained from surface seismic data normally contain errors (caused by poor processing, anisotropy, and finite-aperture effects, among other factors), prospects are often drilled using only depth-converted seismic data. Unsurprisingly, depth-converted seismic data often poorly predict the true depths of important horizons. This ``mis-tie'' is more than a mere inconvenience; inadvertently drilling into salt (Payne, 1994) or into an overpressured layer (Kulkarni et al., 1999) can result in expensive work interruptions or dangerous drilling conditions. For depth conversion, vertical seismic profile (VSP) data generally produce better estimates of interval velocity than surface seismic data do. For this reason, VSP data have been used to ``calibrate'' seismic velocities to improve depth conversion before, during, and after drilling.

Methods to independently estimate interval velocity from VSP and from surface seismic data exist and are more or less mature. Surprisingly, no robust, ``industry-standard'' method exists for jointly integrating these two data types to estimate a common velocity model. The main challenge in developing such a method lies in measuring and accounting for the errors in each data type. Data errors may be random, correlated, or, most commonly, both; a viable integration scheme must account for both types of error.

Some calibration algorithms (Ensign et al., 2000) directly compute the depth misfit between depth-converted seismic and VSP data (or sonic log picks) and use it to compute a correction velocity. The reliability of such algorithms is hampered by the assumption that the VSP data are error-free, when in fact these data contain first-break picking errors that are (hopefully) random, and may also exhibit correlated errors resulting from the correction of deviated-well VSP data to vertical (Noponen, 1995).

Various authors have employed least-squares optimization algorithms to solve a related problem: estimating an optimal time shift to tie crossing 2-D seismic lines at their intersections (Bishop and Nunns, 1994; Harper, 1991). While these algorithms correctly assume that all data contain errors, they also assume that these errors are uncorrelated, or in other words, that the data are realizations of the same random variable. We expect seismic time/depth pairs to differ from VSP time/depth pairs by a low-frequency shift, and we expect both data types to contain random errors. Figure 1 illustrates this relationship as shifted probability distribution functions. A common view in practice, and one espoused by geostatistics, is that the inherent inaccuracy, or ``softness,'' of seismic data causes the observed misfit between seismic and wellbore data (Mao, 1999). No attempt is made to estimate the joint data correlation, and the net effect is a faulty assumption that the seismic data are less accurate than they really are.
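The effect described above can be illustrated with a simple numerical sketch (all quantities here are invented for illustration and are not the paper's data): if a systematic shift between seismic and wellbore depths is ignored, it is absorbed into the apparent random error of the ``soft'' seismic data; if the shift is estimated jointly, the residual spread shrinks to the true random-error level.

```python
# Illustrative sketch (hypothetical numbers): a correlated (systematic) shift
# between two measurements of the same depths, if ignored, inflates the
# apparent random error attributed to the "softer" data type.
import numpy as np

rng = np.random.default_rng(1)
n = 500
truth = np.linspace(1000.0, 3000.0, n)        # true depths (m), illustrative

vsp = truth + rng.normal(0.0, 5.0, n)         # VSP: random picking error only
shift = 40.0                                  # correlated (systematic) error (m)
seis = truth + shift + rng.normal(0.0, 5.0, n)  # seismic: same random error + shift

# Naive view: the whole misfit is attributed to seismic "softness", so the
# apparent seismic error greatly exceeds the true 5 m random level.
apparent_std = np.sqrt(np.mean((seis - vsp) ** 2))

# Estimating the correlated component jointly (here, simply the mean misfit)
# leaves a residual spread consistent with the random errors alone.
shift_est = np.mean(seis - vsp)
resid_std = np.std(seis - vsp - shift_est)
```

With the shift ignored, `apparent_std` is on the order of the 40 m systematic error; once the shift is estimated, `resid_std` drops to roughly the combined random-error level (about 7 m here).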

In this paper, we present a nonlinear least-squares algorithm that uses VSP and surface seismic data to simultaneously estimate interval velocity and an additive seismic correction velocity. We test the algorithm on a real VSP dataset. To simulate seismic data, we perturb the VSP data with depth errors derived from a positive velocity anomaly. The tests show that our algorithm correctly handles the errors in VSP data and leads to an unbiased residual. We also add random errors to both the VSP and seismic data and show that by assuming that the data are correlated, we can improve the final result.
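The joint-estimation idea can be sketched as follows. This is a minimal toy implementation assuming a vertically layered model with one-way vertical times; the layer count, velocities, noise levels, and the single scalar correction velocity are all invented for illustration and do not reproduce the paper's actual algorithm or data.

```python
# Minimal sketch of jointly fitting interval velocities and an additive
# seismic correction velocity with nonlinear least squares (toy model).
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Synthetic layered model: true interval velocities (m/s) over 100 m layers.
dz = 100.0
v_true = np.array([1800.0, 2000.0, 2300.0, 2600.0, 3000.0])
t_true = np.cumsum(dz / v_true)             # one-way vertical times at layer bases

# VSP times: true times plus small random picking errors (0.2 ms).
t_vsp = t_true + rng.normal(0.0, 2e-4, size=t_true.size)

# "Seismic" times: computed from a biased velocity field, mimicking a
# correlated error such as a positive velocity anomaly.
v_bias = 150.0                              # true additive correction (m/s)
t_seis = np.cumsum(dz / (v_true + v_bias)) + rng.normal(0.0, 2e-4, size=t_true.size)

def residuals(p):
    v = p[:5]                               # interval velocities
    dv = p[5]                               # additive seismic correction velocity
    r_vsp = t_vsp - np.cumsum(dz / v)
    r_seis = t_seis - np.cumsum(dz / (v + dv))
    return np.concatenate([r_vsp, r_seis])

p0 = np.concatenate([np.full(5, 2200.0), [0.0]])
fit = least_squares(residuals, p0)
v_est, dv_est = fit.x[:5], fit.x[5]
```

Because the VSP residuals constrain the interval velocities directly while the seismic residuals constrain their sum with the correction, the two parameters separate cleanly in this toy setup; the full algorithm additionally weights the residuals according to the assumed error statistics.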

Figure 1: We assume that seismic time/depth pairs differ from VSP time/depth pairs by a shift (due to poor processing, anisotropy, finite-aperture effects, etc.), and that both are random variables. The respective probability distribution functions (pdf's) are displayed as bell-shaped curves. If the seismic data are considered ``soft,'' and no effort is made to estimate correlated errors (the shift in the pdf), then a common, incorrect tendency is to assume that the seismic data are much less accurate than they are in reality.

4/29/2001