SEP148 -- TABLE OF CONTENTS

I speed up the log-decon method by replacing slow steepest descent with a faster quasi-Newton technique, limited-memory BFGS (L-BFGS).

Modeling data error during deconvolution [PDF 72K]

Our current decons treat the data as sacrosanct and find the best noncausal wavelet to deconvolve it with. We propose allowing the data to include an explicit noise component that does not fit the convolutional model. We write regressions to define this noise, and develop an expression for the gradient needed to fit the regressions.

Decon comparisons between Burg and conjugate-gradient methods [PDF 408K]

In tests on several nearby data sets, three of which are shown here, the Burg method of deconvolution exhibited no numerical round-off issues. In every case it did exhibit whiteness, an aspect of the theory normally considered desirable. Prediction-error code based on conjugate gradients (actually conjugate directions) showed some minor issues. Output comparisons of the two were never perceptibly different on paper documents such as this. When those same PDF documents are viewed on a screen, differences might be noticeable with ``blink'' screen presentation. Running only the theoretically predicted number of iterations (equal to the number of filter coefficients) gave differences generally noticeable with blink presentation. Tripling the number of iterations made the differences much smaller, sometimes confined to a mere handful of pixels. Although discrepancies were minuscule on the filtered data, the differences are quite clear in a spectral comparison. Differences tend to occur at the very high and very low frequencies that are weak in the field data.
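
For reference, Burg's recursion is compact. The sketch below is a generic implementation (not the SEP code): it estimates a prediction-error filter by choosing, at each lattice stage, the reflection coefficient that minimizes the combined forward and backward prediction-error energy.

```python
import numpy as np

def burg_pef(x, order):
    """Prediction-error filter a (with a[0] = 1) by Burg's lattice recursion."""
    f = np.asarray(x, float).copy()    # forward prediction error
    b = f.copy()                       # backward prediction error
    a = np.array([1.0])
    for _ in range(order):
        fp, bp = f[1:], b[:-1]
        k = -2.0 * (fp @ bp) / (fp @ fp + bp @ bp)   # reflection coefficient
        f, b = fp + k * bp, bp + k * fp              # update both error series
        # Levinson step: a_new = [a, 0] + k * [0, reversed(a)]
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
    return a
```

Because |k| < 1 by construction, the resulting filter is guaranteed minimum-phase, which is consistent with the numerical robustness noted above.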

Decon in the log domain - practical considerations [PDF 1.7M]

We apply deconvolution in the log domain to marine seismic data. The inversion promotes sparsity. We compare deconvolution results using two types of regularization: filter symmetry and filter length. We show that the regularizations aid in recovering the correct shot waveform and in sparsifying the data. A further consideration that we can add to the inversion is elimination of the marine acquisition notch frequency.

Six tests of sparse log decon [PDF 2.8M]

Previously, we developed a sparseness-goaled decon method. Here we test it on six data sets. None showed the unfortunate phase-shift we always see with minimum-phase decon. None showed the polarity reversals or time shifts that perplexed our earlier work. Results on all six data sets enhance polarity visibility. We had expected to see sparseness decon limit the bandwidth in some natural way unlike prediction-error decon with its white output. Instead, in all the cases our sparseness decon boosted frequencies much the way predictive decon does. We had not expected to see an estimated shot waveform containing a lot of low-frequency sea surface waves. One such result provoked a new theoretical development not yet tested (Claerbout and Guitton, 2012).
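
The sparsifying behavior described above can be illustrated with a toy: plain L1-regularized spiking deconvolution by iterative soft thresholding (ISTA), with an assumed-known short wavelet. This is not the log-domain method of the paper, only a sketch of how a sparsity penalty collapses data back to isolated spikes.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
r = np.zeros(n)                         # sparse reflectivity
r[[30, 90, 150]] = [1.0, -0.7, 0.5]
w = np.array([1.0, -0.5, 0.2])          # short, assumed-known wavelet
d = np.convolve(r, w, mode="same") + 0.01 * rng.standard_normal(n)

A  = lambda x: np.convolve(x, w, mode="same")         # modeling operator
At = lambda x: np.convolve(x, w[::-1], mode="same")   # its adjoint

L = np.sum(np.abs(w)) ** 2       # safe Lipschitz bound for the step size
lam, x = 0.05, np.zeros(n)
for _ in range(500):             # ISTA: gradient step, then soft threshold
    x = x - (1.0 / L) * At(A(x) - d)
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)
```

The recovered x concentrates on the three true spike locations, with amplitudes slightly shrunk by the penalty.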

I present a tomographic full-waveform inversion method that is based on an extension of the velocity model in time. The resulting wavefield-modeling operator is linear with respect to the non-zero time lags of the extended velocity, yet can effectively model multiple scattering caused by velocity perturbations. This property is attractive for achieving robust global convergence in a waveform-inversion algorithm. A simple 1D numerical example illustrates the properties of the new modeling operator and its promise for robust waveform inversion.

Computational analysis of extended full waveform inversion [PDF 432K]

I compare the computational cost of conventional full-waveform inversion to full-waveform inversion extended in both space and time. These model-space extensions provide accurate results but increase the cost drastically. I also compare the cost of the nonlinear inversion to linearized inversion by scale separation. I then propose extending the inversion in data space, which rests on more underlying assumptions but whose cost is competitive with the conventional inversion. I test extending the inversion by source ray parameter on the Marmousi model. The synthetic tests show that convergence is possible even with large errors in the initial model that would have prevented convergence of conventional full-waveform inversion.

We propose a computationally efficient technique for extrapolating seismic waves in an arbitrary isotropic elastic medium. The method is based on factorizing the full elastic wave equation into a product of pseudo-differential operators. The method extrapolates displacement fields, hence can be used for modeling both pressure and shear waves. A significant reduction in the cost of elastic modeling can be achieved compared to the currently prevalent time- and frequency-domain numeric modeling methods and can contribute to making multicomponent elastic modeling part of the standard seismic processing work flow.
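
In the scalar (acoustic) case, factoring the wave equation into a product of operators reduces to the familiar one-way phase-shift extrapolator. The sketch below (constant velocity, single frequency; far simpler than the elastic pseudo-differential factorization proposed here) shows the basic mechanics: extrapolation is a per-wavenumber phase rotation.

```python
import numpy as np

nx, dx, dz = 256, 10.0, 10.0          # toy grid (assumed values)
v, freq = 2000.0, 15.0                # constant velocity, one frequency
w = 2.0 * np.pi * freq
kx = 2.0 * np.pi * np.fft.fftfreq(nx, dx)
kz2 = (w / v) ** 2 - kx ** 2
prop = kz2 > 0                        # propagating (non-evanescent) region
kz = np.sqrt(np.where(prop, kz2, 0.0))

def extrap(p, step):
    """One-way phase-shift extrapolation of a monochromatic wavefield."""
    P = np.fft.fft(p)
    return np.fft.ifft(np.where(prop, P * np.exp(1j * kz * step), 0.0))

p0 = np.exp(-0.5 * ((np.arange(nx) - nx // 2) / 4.0) ** 2).astype(complex)
p1 = extrap(p0, dz)        # one depth step down
p2 = extrap(p1, -dz)       # and back up: the phase shifts cancel exactly
```

On the propagating part of the spectrum the operator is unitary, which is what makes recursive extrapolation stable.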

Modeling ocean-bottom seismic rotation rates [PDF 964K]

Seismic systems today record up to four components, which provide the particle displacement and the pressure. The pressure is proportional to the divergence of the displacement. We need the hydrophones because the divergence is useful and cannot be calculated in processing. Just as the divergence cannot be calculated directly from the displacements, the curl cannot be calculated from four-component data. If the curl proves useful, we can add rotation sensors to today's four-component recorders and have seven-component data. To evaluate the added information that would come from rotation sensors, we used elastic modeling. In our synthetic data experiment, we predicted the effect of a seabed scatterer on fully multicomponent data. We used a pressure source that generates P waves. The P waves are converted to S waves and to surface waves propagating on the seabed. Our evaluation is that the added information from rotation sensors will be useful for identifying and separating surface waves from body waves.
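
The divergence/curl distinction can be made concrete on a 2-D grid. This is a generic finite-difference sketch (unrelated to the elastic modeling code used in the paper): a curl-free "P-like" displacement registers only on the divergence, and a divergence-free "S-like" displacement only on the curl.

```python
import numpy as np

n = 64
h = 2.0 / (n - 1)
y, x = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n),
                   indexing="ij")
phi = np.exp(-8.0 * (x**2 + y**2))        # smooth scalar potential

dphidy, dphidx = np.gradient(phi, h)
uP = (dphidx, dphidy)                      # gradient field: curl-free
uS = (-dphidy, dphidx)                     # rotated gradient: divergence-free

def div(u):   # what a hydrophone senses (up to a constant factor)
    return np.gradient(u[0], h, axis=1) + np.gradient(u[1], h, axis=0)

def curl(u):  # what a rotation sensor senses (2-D scalar curl)
    return np.gradient(u[1], h, axis=1) - np.gradient(u[0], h, axis=0)
```

A divergence measurement says nothing about uS, and a curl measurement nothing about uP, which is the sense in which rotation sensors add independent information.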

Two point raytracing for reflection off a 3D plane [PDF 476K]

I present a simple, elegant approach to calculating two-point rays reflecting off a 3D dipping plane and investigate extensions to converted wave reflection and offset-vector map demigration.
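
For the basic specular case, the calculation reduces to the classical image (mirror) method, sketched here under the assumption of a straight-ray, constant-velocity medium (the paper's extensions to converted waves and demigration go beyond this).

```python
import numpy as np

def reflection_point(S, R, p0, nhat):
    """Two-point specular reflection point on the plane through p0 with
    normal nhat: mirror the receiver across the plane, then intersect the
    straight source-to-image ray with the plane."""
    S, R, p0 = (np.asarray(v, float) for v in (S, R, p0))
    nhat = np.asarray(nhat, float)
    nhat = nhat / np.linalg.norm(nhat)
    Rimg = R - 2.0 * np.dot(R - p0, nhat) * nhat     # mirror image of R
    d = Rimg - S
    t = np.dot(p0 - S, nhat) / np.dot(d, nhat)       # ray-plane intersection
    return S + t * d
```

The total path length from S to the reflection point to R equals the straight-line distance from S to the image receiver, which is why the construction satisfies Fermat's principle.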

Interactive processing: Geometry manipulation [PDF 304K]

Header manipulation for regularization, registration, and quality control is often a time-consuming task with 3-D datasets. These manipulation tasks are often performed multiple times before achieving the desired result, and each pass requires reading a large amount of data while performing very few operations per byte read. The problem can be made more tractable by reading in a random subset of the headers. Operations such as rotating, windowing, and gridding can then be performed interactively. With each processing step a record is created. These records can then be used to process the entire dataset, requiring only a single read and write of the volume.
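
A minimal sketch of the record-and-replay idea follows; the header layout, parameter names, and record format are invented for illustration, not the SEP implementation. Geometry parameters are chosen on a random subset, saved as a record, then applied to all headers in one pass.

```python
import numpy as np

rng = np.random.default_rng(0)
nheaders = 200_000
xy = rng.uniform(0.0, 10_000.0, size=(nheaders, 2))   # stand-in header table

# 1) interact on a small random subset only
idx = rng.choice(nheaders, size=2_000, replace=False)
subset = xy[idx]
record = {                       # parameters the user would pick interactively
    "rotate_deg": 30.0,
    "origin": subset.min(axis=0),
    "dx": 25.0,
}

# 2) replay the saved record over the full dataset in a single pass
def apply_record(pts, rec):
    th = np.deg2rad(rec["rotate_deg"])
    rot = np.array([[np.cos(th), np.sin(th)],
                    [-np.sin(th), np.cos(th)]])
    local = (np.atleast_2d(pts) - rec["origin"]) @ rot.T
    return np.floor(local / rec["dx"]).astype(int)     # grid-cell indices

cells = apply_record(xy, record)
```

Because the record is just parameters, the expensive full-volume pass happens exactly once, after all interactive decisions are final.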

Microseismic data are not created ready for imaging. They can be extremely noisy, and identifying reflections is not a straightforward task. In previous work, we were able to use multiplets, or events with the same waveform, to identify some reflections. However, realizing that reflections were weaker on the stack than on the individual seismograms, we inferred a misalignment issue between the different seismograms. In this work, we use fractional shifts and receiver-by-receiver shifting to align seismograms more effectively. We also investigate some aspects of multiplets and how they are related to each other in space and time, as well as test the use of cross-correlations of direct P arrivals with whole seismograms to identify the unfound P reflections.
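
Sub-sample alignment is commonly implemented as a Fourier phase ramp. The function below is a generic sketch of such a fractional shift (assuming band-limited traces and ignoring window-edge effects), not the code used in the paper.

```python
import numpy as np

def fractional_shift(trace, shift):
    """Delay a trace by a (possibly non-integer) number of samples by
    applying a linear phase ramp in the frequency domain (circular)."""
    n = len(trace)
    freqs = np.fft.rfftfreq(n)                    # cycles per sample
    ramp = np.exp(-2j * np.pi * freqs * shift)
    return np.fft.irfft(np.fft.rfft(trace) * ramp, n)
```

Shifting each receiver by its own fractional lag before stacking avoids smearing reflections across the stack, which is the misalignment effect described above.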

Earthquake extraction and correlation energy at Long Beach, California seismic survey [PDF 3.3M]

Seismic interferometry of passive data offers a potential route to creating reservoir-scale images in urban environments. A four-month, high-station-density passive seismic dataset collected in Long Beach, California is ideal for testing this hypothesis. Preliminary work on these data is promising. We clearly capture waveforms from earthquakes both near (less than 15 km) and far (greater than 250 km). We successfully construct virtual sources through cross-correlation of low-frequency energy (0.175 to 1.75 Hz). The correlated energy is quite noisy and appears to be directed toward the northeast, suggesting, respectively, that longer correlation times are needed for land data and that the Pacific Ocean is likely producing strong directed energy. Furthermore, the quality of the correlation results differs depending on the time window. We argue that these differences can be attributed to weather conditions, with records during stormier periods producing cleaner Green's functions than records during calmer periods.
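
The virtual-source construction rests on a simple identity: cross-correlating two receivers that record the same band-limited wavefield recovers their relative traveltime. Below is a toy version with synthetic noise and invented parameters, not the Long Beach processing flow.

```python
import numpy as np

rng = np.random.default_rng(0)
n, delay = 4096, 25                       # inter-receiver delay, in samples

# band-limited random "ambient" wavefield (a crude stand-in for the
# 0.175 to 1.75 Hz bandpass applied to the field data)
src = np.convolve(rng.standard_normal(n), np.hanning(7), mode="same")

rec_a = src
rec_b = np.concatenate([np.zeros(delay), src[:-delay]])   # delayed copy

xcorr = np.correlate(rec_b, rec_a, mode="full")   # lags -(n-1) .. (n-1)
lag = int(np.argmax(xcorr)) - (n - 1)             # peak lag ~ traveltime
```

Longer records sharpen the correlation peak against the noise floor, which is the intuition behind needing longer correlation times for noisier land data.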


2012-10-29