
Formal inversion

We have thought of equation (1) as a formula for finding $\bold y$ from $\bold x$. Now consider the opposite problem, finding $\bold x$ from $\bold y$. Begin by multiplying equation (2) by the transpose matrix to define a new quantity $\tilde{\bold x}$:
\begin{displaymath}
\left[
\begin{array}{c}
\tilde x_1 \\
\tilde x_2 \\
\tilde x_3 \\
\tilde x_4
\end{array}
\right]
\eq
\left[
\begin{array}{cccccc}
1 & 0 & 0 & 0 & 0 & 0 \\
0 & 1 & 1 & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 & 1
\end{array}
\right]
\left[
\begin{array}{c}
y_1 \\
y_2 \\
y_3 \\
y_4 \\
y_5 \\
y_6
\end{array}
\right]
\end{displaymath} (3)
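As a concrete check on equation (3), here is a minimal NumPy sketch. The specific 6-by-4 matrix $\bold B$ below is an assumption reconstructed from equations (3) and (7), in which inputs $x_2$ and $x_4$ each feed two output locations; the true $\bold B$ is defined in the preceding section.

    import numpy as np

    # Assumed nearest-neighbor interpolation matrix B (6 outputs, 4 inputs);
    # x2 and x4 each reach two output locations, consistent with equation (7).
    B = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1],
                  [0, 0, 0, 1]], dtype=float)

    x = np.array([1.0, 2.0, 3.0, 4.0])  # model
    y = B @ x                           # data: y = B x
    x_tilde = B.T @ y                   # image: equation (3)
    print(x_tilde)                      # [1. 4. 3. 8.]: x2 and x4 come back doubled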
$\tilde{\bold x}$ is not the same as $\bold x$, but these two vectors have the same dimensionality, and in many applications it may happen that $\tilde{\bold x}$ is a good approximation to $\bold x$. In general, $\tilde{\bold x}$ may be called an ``image'' of $\bold x$. Finding the image is the first step of finding $\bold x$ itself. Formally, the problem is
 \begin{displaymath}
\bold y \eq \bold B \, \bold x\end{displaymath} (4)
and the formal solution to the problem is
 \begin{displaymath}
\bold x \eq ({\bf B'\, B})^{-1} \, {\bf B'} \, \bold y\end{displaymath} (5)
Formally, we verify this solution by substituting (4) into (5):
\begin{displaymath}
\bold x \eq ( {\bf B' \, B} )^{-1} \, ({\bf B'} \, \bold B) \,\bold x
 \eq \bold I \, \bold x \eq \bold x\end{displaymath} (6)
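A quick numeric sanity check of the verification in equation (6), assuming any $\bold B$ with linearly independent columns (a sketch, not part of the original text):

    import numpy as np

    rng = np.random.default_rng(0)
    B = rng.normal(size=(6, 4))                # full column rank with probability 1
    x = rng.normal(size=4)
    y = B @ x                                  # equation (4)
    x_rec = np.linalg.solve(B.T @ B, B.T @ y)  # equation (5)
    print(np.allclose(x_rec, x))               # True: (B'B)^{-1} (B'B) x = x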
In applications, the possible nonexistence of an inverse for the matrix $({\bf B'\,B})$ is always a topic for discussion. For now we simply examine this matrix for the interpolation problem. We see that it is diagonal:
\begin{displaymath}
\bold B' \, \bold B
\eq
\left[
\begin{array}{cccc}
1 & 0 & 0 & 0 \\
0 & 2 & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 2
\end{array}
\right] \;
\end{displaymath} (7)
So, $\tilde x_1 = x_1$, but $\tilde x_2 = 2 x_2$. To recover the original data, we need to divide $\tilde{\bold x}$ by the diagonal matrix $\bold B'\,\bold B$. Thus, matrix inversion is easy here.
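Because $\bold B'\,\bold B$ is diagonal here, equation (5) collapses to an elementwise division of the image by the diagonal entries; a short sketch using the same assumed $\bold B$ as above:

    import numpy as np

    B = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1],
                  [0, 0, 0, 1]], dtype=float)
    x = np.array([1.0, 2.0, 3.0, 4.0])
    x_tilde = B.T @ (B @ x)   # image of x
    d = np.diag(B.T @ B)      # diagonal of B'B: [1. 2. 1. 2.]
    print(x_tilde / d)        # [1. 2. 3. 4.]: x recovered exactly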

Equation (5) has an illustrious reputation, which arises in the context of ``least squares.'' Least squares is a general method for solving sets of equations that have more equations than unknowns.
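The standard derivation, not spelled out here, is that equation (5) minimizes the squared residual: setting the gradient of $(\bold y - \bold B\,\bold x)'(\bold y - \bold B\,\bold x)$ with respect to $\bold x$ to zero gives the normal equations, whose solution is (5).
\begin{displaymath}
\bold 0 \eq \frac{\partial}{\partial \bold x} \,
(\bold y - \bold B\,\bold x)' (\bold y - \bold B\,\bold x)
\eq -2\, \bold B' (\bold y - \bold B\,\bold x)
\quad\Longrightarrow\quad
\bold B'\,\bold B\,\bold x \eq \bold B'\,\bold y
\end{displaymath}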

Recovering $\bold x$ from $\bold y$ using equation (5) presumes the existence of the inverse of $\bold B'\,\bold B$. As you might expect, this matrix is nonsingular when ${\bf B}$ stretches the data, because then a few data values are distributed among a greater number of locations. Where the transformation squeezes the data, $\bold B'\,\bold B$ must become singular, since returning uniquely to the uncompressed condition is impossible.
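To see the squeezing case concretely, here is a sketch with an assumed 4-by-6 subsampling matrix (not from the text): two input locations never reach the output, so the corresponding columns of $\bold B$ are zero and $\bold B'\,\bold B$ is singular.

    import numpy as np

    # Assumed squeezing operator: 6 input locations, 4 outputs;
    # inputs 3 and 6 are never visited, so columns 3 and 6 of B are zero.
    B = np.array([[1, 0, 0, 0, 0, 0],
                  [0, 1, 0, 0, 0, 0],
                  [0, 0, 0, 1, 0, 0],
                  [0, 0, 0, 0, 1, 0]], dtype=float)
    BtB = B.T @ B
    print(np.diag(BtB))                # [1. 1. 0. 1. 1. 0.]
    print(np.linalg.matrix_rank(BtB))  # 4 < 6, so B'B is singular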

We can now understand why an adjoint operator is often an approximate inverse. This equivalence holds in proportion to the nearness of the matrix $\bold B'\,\bold B$ to an identity matrix. The interpolation example we have just examined is one in which $\bold B'\,\bold B$ differs from an identity matrix merely by a scaling.

