SEP140 -- TABLE OF CONTENTS

We show 3D real-data results of migration velocity analysis by wavefield extrapolation using data synthesized by the pre-stack exploding-reflector model (PERM). PERM is a generalization of the exploding-reflector model (ERM) in the sense that migration of PERM data can generate a pre-stack image, which is not achievable with ERM. PERM data allow fast migration velocity analysis, since their size can be orders of magnitude smaller than that produced by conventional data-reduction approaches such as those used in plane-wave migration. Also, because the modeling can be limited to the region where the velocity is to be updated, additional savings are possible by solving for the velocity in a target-oriented manner.

Target-oriented wavefield tomography using demigrated Born data [pdf 1.4M][source]

We present a method to reduce the computational cost of image-domain wavefield tomography. Instead of using the originally recorded data for velocity estimation, the proposed method simulates a new data set obtained using Born modeling or demigration based on the initial image and gathers. The modeling can be performed in a target-oriented fashion, and it can use arbitrary types of source functions and acquisition geometries. Hence the size of the new data set can be substantially smaller than that of the original one. We demonstrate with numerical examples that the new data set correctly preserves velocity information useful for velocity estimation, and that it generates wavefield-tomography gradients similar to those obtained using the original data set. We apply the proposed method to a modified version of the Sigsbee2A model, where two square anomalies below the salt have been successfully recovered in a target-oriented fashion at a much lower computational cost.

Wave-equation tomography by beam focusing [pdf 560K][source]

Velocity can be estimated using a wave-equation operator by maximizing an objective function that measures the flatness of the crosscorrelation computed between a source wavefield and a receiver wavefield. The proposed objective function depends on the parameters of a residual moveout applied to the computed correlation. It is composed of two terms: the first term maximizes the energy of the stack computed on local subarrays as a function of the local curvature. The second term maximizes the power of the stack computed globally as a function of time shifts applied to the stacks of the local subarrays. The first term is essential to ensure global convergence in the presence of large velocity errors. The second term plays a role in estimating localized velocity anomalies. Numerical examples of computation of the gradients of the proposed objective function confirm its potential for velocity estimation.

Near-surface velocity estimation by weighted early-arrival waveform inversion [pdf 1.1M][source]

In this paper, I present a modified version of the conventional waveform-inversion objective function to bridge the gap between the acoustic waveform-inversion engine and the more complicated physics in recorded data. The proposed method weights the amplitudes of the observed and modeled data so that the inversion relies more on the phase information than on the amplitude information in the recorded data, since phase information is more robust in the presence of visco-acoustic or even elastic physics. Synthetic examples with inversion of visco-acoustic and elastic data show that the proposed objective function can recover detailed velocity structures even in the presence of non-acoustic physics, where conventional waveform inversion tends to fail.
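As an illustration of this kind of amplitude weighting, one simple scheme (an assumption for illustration, not necessarily the exact weighting used in the paper) normalizes each trace by its own RMS amplitude before measuring the misfit, so that the residual is controlled mainly by phase differences:

```python
import numpy as np

def amplitude_balanced_misfit(d_obs, d_mod, eps=1e-8):
    """L2 misfit between per-trace RMS-normalized data.

    Dividing each trace (a row of the array) by its own RMS amplitude
    discards most of the absolute-amplitude information, so the residual
    is driven mainly by phase (arrival-time and waveform-shape)
    differences. Illustrative sketch, not the paper's exact weighting.
    """
    w_obs = d_obs / (np.sqrt(np.mean(d_obs**2, axis=1, keepdims=True)) + eps)
    w_mod = d_mod / (np.sqrt(np.mean(d_mod**2, axis=1, keepdims=True)) + eps)
    r = w_obs - w_mod
    return 0.5 * np.sum(r**2)
```

With this measure, a modeled trace that matches the observed trace up to a constant amplitude factor produces a near-zero misfit, while a time-shifted trace does not.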

Wave-equation tomography for anisotropic parameters [pdf 392K][source]

Anisotropic models are recognized as more realistic representations of the subsurface where complex geological environments exist. These models are widely needed by many anisotropic migration and interpretation schemes. However, anisotropic model building is still a challenging problem in the industry. In this paper, we propose an approach to build the anisotropic model using Wave-Equation Tomography (WETom) on surface seismic data in the image space. To reduce the null space of the inversion, we parametrize our model space using only vertical velocity (V

Wave-equation traveltime tomography by global optimization [pdf 156K][source]

Wave-equation traveltime tomography is conventionally done by picking the maximum cross-correlation lags between the modeled and observed data. However, a trace-by-trace method of picking makes the velocity update more susceptible to local noise in the correlation as well as inconsistencies in the data. In this paper, I compare the local method of picking the maximum correlation to a global method based on maximizing the stacking power along an interpolated spline surface in the correlation window. The results show that the global scheme is more robust to local noise but sacrifices accuracy and convergence rate.
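The local, trace-by-trace picking step can be sketched as follows (a minimal illustration with my own names and sign convention; the global spline-surface scheme is not shown):

```python
import numpy as np

def pick_max_lag(d_mod, d_obs, max_lag):
    """Trace-by-trace traveltime pick: for each trace (row), return the
    lag in samples that maximizes the cross-correlation of modeled and
    observed traces. With the convention used here, the returned lag is
    the delay of the observed trace relative to the modeled one.
    """
    lags = np.arange(-max_lag, max_lag + 1)
    picks = np.empty(d_mod.shape[0], dtype=int)
    for i, (m, o) in enumerate(zip(d_mod, d_obs)):
        # circular correlation is adequate for this illustration
        xc = np.array([np.dot(m, np.roll(o, -l)) for l in lags])
        picks[i] = lags[np.argmax(xc)]
    return picks
```

Because each trace is picked independently, a noise burst in one correlation panel shifts that pick alone, which is exactly the sensitivity the global stack-power scheme is designed to reduce.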

We present a method for computing the wave-equation-based angle-domain illumination for subsurface structures. It creates subsurface illumination for different scattering and/or dip angles for a given acquisition geometry, velocity model and frequency bandwidth. The proposed method differs from the conventional method in that it does not require local plane-wave decompositions of each source and receiver Green's function. Instead, it transforms a precomputed subsurface-offset-indexed sensitivity kernel into the angle domain using either a Fourier-domain mapping or a space-domain slant stack. We show that the computational cost can be significantly reduced by phase encoding the receiver-side Green's functions, or by simultaneously encoding both the source- and receiver-side Green's functions. Numerical examples demonstrate the accuracy and efficiency of our method. The main anticipated applications of our method are in areas of: (1) accurate amplitude-versus-angle (AVA) analysis by compensating depth-migrated images with angle-dependent illumination, (2) migration velocity analysis that incorporates angle-dependent illumination for robust residual parameter estimation, and (3) optimum seismic survey planning.

Joint least-squares inversion of up- and down-going signal for ocean bottom data sets [pdf 1.6M][source]

We present a joint least-squares inversion method for imaging the acoustic primary (up-going) and mirror (down-going) signals for ocean-bottom seismic processing. Joint inversion combines the benefits of wider illumination from the mirror signal and better signal-to-noise ratio from the primary signal into one image. Results from two modified 2D Marmousi models show a better illumination of the subsurface and improved resolution in geologically complex areas.

More fun with random boundaries [pdf 312K][source]

Transferring wave-field checkpoints from disk to the compute engine is often a bottleneck in Reverse Time Migration (RTM). The need to create wave-field checkpoints can be eliminated by running the wave equation backwards and choosing a time-reversible boundary condition. In acoustic propagation, velocity can be made more and more random within a boundary region. Reflections from the incoherent boundaries are random in nature and produce minimal coherent correlations when applying the RTM imaging condition. In Vertical Transverse Isotropic (VTI) media, the horizontal and Normal MoveOut (NMO) velocities can be modified to cause wavefronts to propagate parallel to the boundary region, causing further degradation of coherent signals. Coherent reflections can be further reduced by decreasing the randomized velocity within the boundary region.
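A minimal sketch of such a randomized boundary, assuming a simple linear ramp and a perturbation size of my own choosing (the paper's exact taper and perturbation are not reproduced here):

```python
import numpy as np

def randomize_boundary(vel, width, max_pert=0.4, seed=0):
    """Make velocity increasingly random inside a boundary strip.

    Within a strip of `width` cells around the 2-D model, each cell's
    velocity is perturbed by a random factor whose amplitude ramps from
    0 at the interior edge of the strip to +/- max_pert at the outer
    edge, so energy reflected off the model boundary is incoherent.
    Illustrative sketch under assumed ramp shape and perturbation size.
    """
    rng = np.random.default_rng(seed)
    nz, nx = vel.shape
    iz, ix = np.meshgrid(np.arange(nz), np.arange(nx), indexing="ij")
    # distance (in cells) to the nearest model edge
    dist = np.minimum.reduce([iz, nz - 1 - iz, ix, nx - 1 - ix])
    ramp = np.clip(1.0 - dist / width, 0.0, 1.0)  # 1 at edge, 0 inside
    pert = max_pert * ramp * rng.uniform(-1.0, 1.0, vel.shape)
    return vel * (1.0 + pert)
```

The interior of the model is left untouched, so only waves that enter the strip see the incoherent medium.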

Implicit finite difference in time-space domain with the helix transform [pdf 348K][source]

Spectral factorization is a method of creating causal filters which have causal inverses. I use spectral factorization of an implicit finite-difference stencil of the two-way wave-equation approximation in order to model wave propagation by a sequence of deconvolutions. I deconvolve this filter's coefficients with the wave-field propagating in a constant-velocity medium using the helix approach. In comparison with explicit approximations, implicit approximations have unconditional stability, enabling the use of larger time steps during the modeling process. The advantages are both in reduced computation time and in the extension and scalability to multiple dimensions enabled by the helix operator.

Predicting rugged water-bottom multiples through wavefield extrapolation with rejection and injection [pdf 1.8M][source]

Although convolution-based and WEM-modeling-based methods for predicting surface-related multiples are well-recognized in marine seismic data processing today, the effectiveness and efficiency of these methods are still a challenge in practice. In this paper, I present a WEM-modeling-based approach to multiple prediction. When wave-field rejection and injection are used during wave-field extrapolation, rugged water-bottom multiples can be accurately predicted when only the water-bottom elevation and water velocity are known.

To retrieve a sparse model, we applied the hybrid norm conjugate-direction (HBCD) solver proposed by Claerbout to two interesting geophysical problems: least-squares imaging and blind deconvolution. The results showed that this solver is robust for generating sparse models.

Blocky velocity inversion by hybrid norm [pdf 1.7M][source]

Inverting a regularized Dix equation using the

Geophysical applications of a novel and robust L1 solver [pdf 412K][source]

L1-norm is better than L2-norm at dealing with noisy data and yielding blocky models, features crucial in many geophysical applications. In this report, we develop a hybrid-norm solver proposed by Claerbout (2009) to perform L1 regressions. The solver is tested on a 1-D field RMS velocity inversion, a 2-D regularized Kirchhoff migration inversion and a 2-D velocity analysis problem. The results of the inversions show that this solver can yield "blocky" models, and has the advantage of straightforward parametrization.
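A toy illustration of why the hybrid norm gives robust, L1-like behavior: Claerbout's hybrid norm behaves like L2 for residuals much smaller than a threshold epsilon and like L1 for larger ones, so an iteratively-reweighted least-squares fit (used here as a simple stand-in for the conjugate-direction solver of the report) down-weights outliers:

```python
import numpy as np

def hybrid_weights(r, eps):
    """IRLS weights for the hybrid norm h(r) = sqrt(1 + r^2/eps^2) - 1.

    h is quadratic for |r| << eps and linear for |r| >> eps, so large
    (outlier) residuals receive small weights.
    """
    return 1.0 / np.sqrt(1.0 + (r / eps) ** 2)

def hybrid_fit_constant(d, eps, niter=50):
    """Toy hybrid-norm regression: fit a single constant to data d by
    iteratively reweighted least squares. A sketch of the norm's
    robustness only; the report applies the same idea inside a
    conjugate-direction solver for imaging and deconvolution problems.
    """
    m = np.mean(d)                       # L2 starting guess
    for _ in range(niter):
        w = hybrid_weights(d - m, eps)
        m = np.sum(w * d) / np.sum(w)    # weighted least-squares update
    return m
```

With a few gross outliers in the data, the plain mean (the L2 answer) is dragged far from the bulk of the samples, while the hybrid-norm fit stays close to them.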

Hydrocarbon reservoirs can be efficiently monitored with simultaneous-source seismic data sets. Because simultaneous-source acquisition reduces time and cost requirements, seismic data sets can be recorded cheaply at short regular intervals, thereby allowing for near real-time monitoring. Although, in many cases, the recorded multiplexed data can be separated into independent records, we choose to leverage the efficiency of direct imaging of such data sets. However, direct imaging with a migration algorithm introduces cross-talk artifacts and does not account for differences in acquisition geometry and relative shot-timing between surveys. To attenuate cross-talk artifacts and acquisition discrepancies between data sets, we propose a joint least-squares migration/inversion method. By incorporating spatio-temporal and sparseness constraints in our inversion algorithm, we ensure that the resulting time-lapse images are geologically plausible. Using a 2D numerical model, we show that our method can give results of comparable quality to migrated single-source data sets.

Low frequency passive seismic interferometry for land data [pdf 4.4M][source]

Here we report results achieved by low-frequency seismic interferometry on a passive seismic land dataset recorded at a field in Saudi Arabia. Computed spectra for different portions of the data show a time-varying ambient seismic wavefield displaying a diurnal pattern. At low frequencies (< 10 Hz) the ambient seismic wavefield mainly consists of surface waves in two modes; the fundamental mode propagates with a velocity of about 1250 m/s. Results suggest that sufficient coherent energy is recorded between 1 Hz and 7 Hz for retrieval of a Rayleigh surface wave. The strength of the ambient seismic field affects the convergence rate of the correlations. The directionality in the ambient seismic field affects the radiation pattern of the virtual sources. Retrieved Rayleigh waves at low frequencies show spatial variation and dispersive behavior. Dispersion curve estimation opens opportunities for reservoir monitoring by background velocity estimation.
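The core interferometric step, cross-correlating the record at a reference receiver with the records at all other receivers and stacking over time windows, can be sketched as follows (a minimal version; real processing adds spectral whitening, temporal normalization, and band-pass filtering):

```python
import numpy as np

def virtual_source_gather(noise, ref_idx):
    """Retrieve a virtual-source gather by ambient-noise interferometry.

    `noise` has shape (n_windows, n_receivers, n_samples). The record
    at receiver `ref_idx` is cross-correlated with every receiver in
    the frequency domain (zero-padded to avoid wrap-around), and the
    correlations are stacked over time windows, turning the reference
    receiver into a virtual source. The output has the zero lag at the
    center sample. Minimal sketch of the correlation/stack step only.
    """
    nw, nr, ns = noise.shape
    N = np.fft.rfft(noise, n=2 * ns, axis=-1)
    xc = N * np.conj(N[:, ref_idx:ref_idx + 1, :])  # correlate with reference
    gather = np.fft.irfft(xc.sum(axis=0), axis=-1)  # stack over windows
    return np.fft.fftshift(gather, axes=-1)         # zero lag at center
```

For a noise field that reaches the receivers as a single propagating wave, the stacked correlations peak at lags equal to the inter-receiver traveltimes, which is what makes dispersion-curve estimation from the retrieved Rayleigh waves possible.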

Cyclic 1D matching of time-lapse seismic data sets: A case study of the Norne Field [pdf 22M][source]

Seismic cross-equalization attenuates unwanted or non-production related artifacts from time-lapse seismic data sets. Two robust post-imaging cross-equalization methods are considered. First, an efficient cyclic 1D correlation method is used to estimate vertical and lateral displacement components between baseline and monitor images. To obtain accurate estimates of production-related changes, all displacement components must be taken into account. Next, a cyclic 1D match-filtering method that estimates the optimal filtering parameters attenuates residual artifacts. Application to the Norne time-lapse seismic data set shows that these methods provide a powerful time-lapse cross-equalization scheme.

Seismic images of the subsurface are often very large and tedious to interpret manually; as such, automatic segmentation algorithms can be highly useful for tasks such as locating large, irregularly-shaped salt bodies within the images. However, seismic images present unique challenges for image segmentation algorithms. Here, a new segmentation algorithm using a "pairwise region comparison" strategy is implemented and tested on seismic images. Numerous modifications to the original algorithm are necessary to make it appropriate for use with seismic data, including: (1) changes to the nature of the input data, (2) the way in which the graph is constructed, and (3) the formula for calculating edge weights. Initial results, including a preliminary 3D implementation, indicate that the new method compares very favorably with an existing implementation of the eigenvector-based normalized cuts approach, both in terms of accuracy and efficiency.
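For reference, the pairwise-region-comparison strategy (in the style of Felzenszwalb and Huttenlocher's graph-based segmentation) can be illustrated on a 1-D signal; the function below is a simplified sketch of that generic algorithm, not the modified seismic implementation described above:

```python
import numpy as np

def fh_segment_1d(x, k=1.0):
    """Minimal 1-D pairwise-region-comparison segmentation.

    Edges connect neighboring samples with weight |x[i+1] - x[i]|.
    Processing edges from smallest to largest weight, two regions are
    merged when the connecting edge is no larger than the internal
    variation of either region plus a size-dependent tolerance
    k / |region| (the pairwise comparison predicate). Returns a label
    per sample. Union-find with path halving tracks the regions.
    """
    n = len(x)
    parent = list(range(n))
    size = [1] * n
    internal = [0.0] * n            # max internal edge weight per region

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    edges = sorted((abs(x[i + 1] - x[i]), i, i + 1) for i in range(n - 1))
    for w, a, b in edges:
        ra, rb = find(a), find(b)
        if ra == rb:
            continue
        # pairwise region comparison predicate
        if w <= min(internal[ra] + k / size[ra],
                    internal[rb] + k / size[rb]):
            parent[rb] = ra
            size[ra] += size[rb]
            internal[ra] = max(internal[ra], internal[rb], w)
    return np.array([find(i) for i in range(n)])
```

On a noisy piecewise-constant signal, the predicate merges samples within each plateau but refuses to merge across the large jump between plateaus.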

Hypercube viewer update [pdf 1.7M][source]

Efficient viewing of and interaction with multi-dimensional data volumes is an essential part of many scientific fields. This interaction ranges from simple visualization to steering computationally demanding tasks. SEP uses a multi-dimensional slice viewer called

Short note: Enhanced visualization for seismic imaging [pdf 48K][source]

I have developed a new visualization package for seismic images. SEP's existing visualization toolkit, including Grey, Graph, Wiggle, Contour, and Cube, will be integrated into a novel, unified, open-source tool. Enhanced features include interactive manipulation of graphics, efficient file-access for large data sets, network-streaming for large files, 3D acceleration options, and platform portability. This new suite of technologies will be capable of tight integration with processing workflows, enabling interactive processing. At the same time, it preserves the best features of our legacy tools, and provides a compatibility layer for seamless integration into existing environments. JTube charges forward with new features, and capitalizes on a quarter-century of technological advances since the last major design of SEP's visualization tools.

The mechanics of vertically stratified porous media has some similarities to and some differences from the more typical layered analysis for purely elastic media. Assuming welded solid contact at the solid-solid interfaces implies the usual continuity conditions, which are continuity of the horizontal strain components and the vertical stress components. These conditions are valid for both elastic and poroelastic media. Differences arise through the conditions for the pore pressure and the increment of fluid content in the context of fluid-saturated porous media. The two distinct conditions most typically considered between any pair of contiguous layers are: (1) an undrained fluid condition at the interface, meaning that the increment of fluid content is zero (i.e., δζ = 0), or (2) fluid pressure continuity at the interface, implying that the change in fluid pressure is zero across the interface (i.e., δp


2010-05-19