
Proof that PE filter output is white


The basic idea of least-squares fitting is that the residual is orthogonal to the fitting functions. Applied to the PE filter, this idea means that the output of a PE filter is orthogonal to lagged inputs. The orthogonality applies only for lags in the past, because prediction knows only the past while aiming at the future. What we want to show is different, namely, that the output is uncorrelated with itself (as opposed to the input) for lags in both directions; hence the output spectrum is white.

We are given a signal $y_t$ and filter it by
\begin{displaymath}
x_t \quad = \quad y_t - \sum_{\tau > 0} a_\tau y_{t-\tau}\end{displaymath} (3)
We found $a_\tau$ by setting to zero $d\,(\sum_t x_t^2)/da_\tau$:
\begin{displaymath}
\sum_t x_t y_{t-\tau} \quad = \quad 0 \quad\quad\quad {\rm for~} \tau > 0\end{displaymath} (4)
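The normal equations (4) can be checked numerically. The sketch below (a minimal illustration assuming NumPy; the signal and all variable names are made up for the demonstration) fits the coefficients $a_\tau$ by least squares, forms the prediction error $x_t$, and verifies that $x_t$ is orthogonal to each lagged input $y_{t-\tau}$ for $\tau = 1, \ldots, p$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A correlated test signal y_t: white noise passed through a short smoother.
n, p = 2000, 5                       # samples, number of filter lags
y = np.convolve(rng.standard_normal(n + 50), np.ones(4) / 4, mode="valid")[:n]

# Least-squares fit of a_tau in x_t = y_t - sum_{tau>0} a_tau y_{t-tau}:
# regress y_t on its own past values y_{t-1}, ..., y_{t-p}.
Y = np.column_stack([y[p - tau : n - tau] for tau in range(1, p + 1)])
a, *_ = np.linalg.lstsq(Y, y[p:], rcond=None)

# Prediction error (PE filter output).
x = y[p:] - Y @ a

# Normal equations (4): sum_t x_t y_{t-tau} = 0 for tau = 1..p.
cross = np.array([x @ y[p - tau : n - tau] for tau in range(1, p + 1)])
print(np.max(np.abs(cross)))  # zero to machine precision
```

The orthogonality holds to machine precision because it is exactly the condition that defines the least-squares solution.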
We interpret this to mean that the residual is orthogonal to the fitting function, or the present PE filter output is orthogonal to its past inputs, or one side of the crosscorrelation vanishes. Taking an unlimited number of time lags and filter coefficients, the crosscorrelation vanishes not only for $\tau > 0$ but for larger values, say $\tau+s$ where $\tau \ge 0$ and $s > 0$. In other words, the future PE filter outputs are orthogonal to present and past inputs:
\begin{displaymath}
\sum_t x_{t+s} y_{t-\tau} \quad = \quad 0
\quad\quad\quad {\rm for~} \tau \ge 0 {\rm ~and~} s > 0\end{displaymath} (5)
Recall that if ${\bf r} \cdot {\bf u} = 0$ and ${\bf r} \cdot {\bf v} = 0$, then ${\bf r} \cdot ( a_1 {\bf u} \pm a_2 {\bf v}) = 0$ for any $a_1$ and $a_2$. So for any $a_\tau$ we have
\begin{displaymath}
\sum_t x_{t+s} (y_t \pm a_\tau y_{t-\tau}) \quad = \quad 0
\quad\quad\quad {\rm for~} \tau \ge 0 {\rm ~and~} s > 0\end{displaymath} (6)
and for any linear combination we have
\begin{displaymath}
\sum_t x_{t+s} (y_t - \sum_{\tau > 0} a_\tau y_{t-\tau}) \quad = \quad 0
\quad\quad\quad {\rm for~} \tau \ge 0 {\rm ~and~} s > 0\end{displaymath} (7)
Therefore, substituting from (3), we get
\begin{displaymath}
\sum_t x_{t+s} x_t
\quad = \quad 0 \quad\quad\quad {\rm for~} s > 0\end{displaymath} (8)
which is an autocorrelation function and must be symmetric. Thus,
\begin{displaymath}
\sum_t x_{t+s} x_t
\quad = \quad0 \quad\quad\quad {\rm for~} s \ne 0\end{displaymath} (9)
Since the autocorrelation of the prediction-error output is an impulse, its spectrum is white.
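The conclusion can also be illustrated numerically. The sketch below (assuming NumPy; the AR(2) test signal and its coefficients are invented for the demonstration) whitens a short autoregressive signal with a least-squares PE filter and checks that the output autocorrelation is close to an impulse, so that, within sampling error, the output spectrum is flat:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthesize y_t as an AR(2) process, so a short PE filter suffices to whiten it.
n, p = 20000, 2
e = rng.standard_normal(n)            # unit-variance white innovations
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * y[t - 1] - 0.2 * y[t - 2] + e[t]

# Fit the PE filter by least squares (regress y_t on its past values).
Y = np.column_stack([y[p - tau : n - tau] for tau in range(1, p + 1)])
a, *_ = np.linalg.lstsq(Y, y[p:], rcond=None)
x = y[p:] - Y @ a                     # prediction-error output

# Normalized autocorrelation of x: 1 at s = 0, near 0 for s != 0,
# i.e. an impulse, so the spectrum of x is (statistically) white.
r0 = x @ x
r = np.array([x[s:] @ x[:-s] for s in range(1, 11)]) / r0
print(np.max(np.abs(r)))  # small, O(1/sqrt(n))
```

With a finite data length the off-zero lags vanish only up to sampling fluctuations of order $1/\sqrt{n}$; the exact statement in (9) holds in the limit of unlimited data and filter lags.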

Stanford Exploration Project
11/18/1997