
General LSL algorithm

We are interested in the prediction of a time series $x_T$ from a related process $y_T$. We want to minimize:

\begin{displaymath}
\sum_{t=0}^T (\varepsilon^x_{n,T}(t))^2 = \sum_{t=0}^T [x(t)-f_{n,T}(1)y(t-1)-\cdots-f_{n,T}(n)y(t-n)]^2\;.\end{displaymath}

Once again, we are interested only in the last residual $\varepsilon^x_{n,T}(T)$, so that the residuals we keep are affected only by past values and not by future ones. This minimization problem leads to:
\begin{eqnarray*}
f_{n,T}&=&(A'_{1,n,T}A_{1,n,T})^{-1}A'_{1,n,T}.x_T \\
\varepsilon_{n,T}^x&=& x_T-A_{1,n,T}f_{n,T} = (I-P_{1,n,T}).x_T
\end{eqnarray*}
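For reference, a minimal numerical sketch of this batch solution is given below (the function name, the pre-windowed construction of the lagged matrix $A_{1,n,T}$, and the use of a generic least-squares solver are my illustrative choices, not part of the algorithm developed here):

\begin{verbatim}
import numpy as np

def batch_joint_fit(x, y, n):
    """Direct (non-recursive) least-squares fit of x(t) from
    y(t-1), ..., y(t-n); returns the filter f and the residual vector.
    Illustrative sketch: the lattice recursions below obtain the same
    residuals without forming A explicitly."""
    T = len(x)
    # Lagged data matrix A = (Z y, ..., Z^n y), pre-windowed:
    # y(t-j) is taken as 0 whenever t - j < 0.
    A = np.zeros((T, n))
    for j in range(1, n + 1):
        A[j:, j - 1] = y[:T - j]
    # f = (A'A)^{-1} A' x, computed here by a generic solver
    f, *_ = np.linalg.lstsq(A, x, rcond=None)
    eps_x = x - A @ f
    return f, eps_x
\end{verbatim}

The point of the lattice algorithm is precisely to avoid this explicit solve and to update the residuals recursively in both order and time.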

If we are interested in the residuals $\varepsilon^x_{k,T}$ themselves, then we have to find the component of $x_T$ orthogonal to the space $Y_{1,k,T}=(Zy_T,\cdots,Z^ky_T)$. So the idea is to use the backward residuals of the self-prediction of $y_T$ to form an orthogonal basis of the space $Y_{1,k,T}$, and then project the vector $x_T$ onto this particular basis. Consequently, we carry out two recursions at the same time: one to compute the variables $\varepsilon_{k,T}$ and $r_{k,T}$ of the self-prediction of $y_T$, and the other to compute the residuals $\varepsilon^x_{k,T}$ of the joint prediction of $x_T$ from $y_T$.
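Written out, if the backward residuals $r_{0,T-1},\cdots,r_{k-1,T-1}$ do form an orthogonal basis of $Y_{1,k,T}$, the projection reduces to a sum of one-dimensional projections (this is just the order-update recursion below, unrolled from order 0):

\begin{displaymath}
\varepsilon^x_{k,T}=x_T-\sum_{j=0}^{k-1}{r'_{j,T-1}x_T\over r'_{j,T-1}r_{j,T-1}}\,r_{j,T-1} \;.\end{displaymath}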

The order-updating of the residuals $\varepsilon^x_{k,T}$ is very similar to the updating of the residuals $\varepsilon_{k,T}$. Indeed,
   \begin{eqnarray}
\varepsilon^x_{k+1,T}&=&x_T - P_{1,k+1,T}.x_T \nonumber \\
&=&\varepsilon^x_{k,T} - {r'_{k,T-1}x_T\over r'_{k,T-1}r_{k,T-1}}r_{k,T-1} \;.\end{eqnarray}
(7)
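As a sketch, this order-update acts on full residual vectors as follows (the names are mine; x, eps_x_k, and r_k_prev stand for $x_T$, $\varepsilon^x_{k,T}$, and $r_{k,T-1}$):

\begin{verbatim}
import numpy as np

def order_update_joint(x, eps_x_k, r_k_prev):
    """One order-update of the joint-prediction residual (the recursion
    just above): subtract from eps^x_{k,T} its projection onto the
    backward residual r_{k,T-1}."""
    delta_x = r_k_prev @ x                    # cross-correlation Delta^x_{k+1,T}
    k_x = delta_x / (r_k_prev @ r_k_prev)     # reflection coefficient K^x_{k+1,T}
    return eps_x_k - k_x * r_k_prev
\end{verbatim}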
We can define the cross-correlation $\Delta^x_{k+1,T}$ and the reflection coefficient $K^x_{k+1,T}$:

\begin{displaymath}
\Delta^x_{k+1,T}=r'_{k,T-1}x_T\;, \mbox{\hspace{1.0cm}and\hspace{1.0cm}} K^x_{k+1,T}={\Delta^x_{k+1,T}\over R^r_{k,T-1}} \;.\end{displaymath}

The time-updating is similar to equation (5). I use here an identity derived from Lee et al. (1981), which states that for any vectors $u_T$ and $v_T$ we have the relationship:
 \begin{displaymath}
u'_T(P_{1,k,T}^{\perp}v_T)-u'_{T-1}(P_{1,k,T-1}^{\perp}v_{T-1}) = [u'_T(P_{1,k,T}^{\perp}\pi_T)][\pi'_T(P_{1,k,T}^{\perp}v_T)].{1\over \cos^2\theta_{1,k,T}} \;.\end{displaymath} (8)
This equation, used with $(u_T,v_T)\!=\!(y_T,Z^{k+1}y_T)$, leads to the time-updating of $\Delta_{k+1,T}$ (equation (5)) in the self-prediction of $y_T$. Here, using equation (8) with $u_T=x_T$ and $v_T=Z^{k+1}y_T$, we get the time-update recursion:
 \begin{displaymath}
\Delta^x_{k+1,T}=\Delta^x_{k+1,T-1} + {\varepsilon^x_{k,T}(T)r_{k,T-1}(T)\over\cos^2\theta_{1,k,T}} \;.\end{displaymath} (9)
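In a per-sample implementation this time update is a single scalar operation per order. A minimal sketch (the argument names are mine; eps_x_last and r_last stand for the last components $\varepsilon^x_{k,T}(T)$ and $r_{k,T-1}(T)$, cos2 for $\cos^2\theta_{1,k,T}$, and the factor lam anticipates the tapered recursion given next):

\begin{verbatim}
def time_update_cross(delta_x_prev, eps_x_last, r_last, cos2, lam=1.0):
    """Time update of the cross-correlation Delta^x_{k+1,T}.
    With lam = 1 this is the growing-memory recursion above; with
    0 < lam < 1 it is the exponentially tapered version."""
    return lam * delta_x_prev + eps_x_last * r_last / cos2
\end{verbatim}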
The use of exponential tapering also gives a recursion similar to equation (6):

\begin{displaymath}
\Delta_{k+1,T}^x=\lambda\Delta_{k+1,T-1}^x+{\varepsilon^x_{k,T}(T)r_{k,T-1}(T)\over\cos^2\theta_{1,k,T}} \;.\end{displaymath}

In conclusion, the recursions (7) and (9), combined with the previous recursions (3), (4), and (5) of the basic LSL algorithm, give the entire set of recursions needed for the general LSL algorithm.
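To make the bookkeeping concrete, the sketch below assembles a pre-windowed, exponentially tapered joint-process lattice in a standard textbook form, with per-order scalar state instead of the vector notation used above. All names, the small regularization delta, and the exact arrangement of the conversion factors are my own illustrative choices, not the paper's notation.

\begin{verbatim}
import numpy as np

def joint_process_lsl(x, y, order, lam=0.99, delta=1e-6):
    """Predict x(t) from past values of y(t) with a joint-process LSL
    lattice and return the highest-order joint residuals."""
    T = len(x)
    Delta = np.zeros(order)             # forward/backward cross-correlations
    p = np.zeros(order)                 # joint cross-correlations (Delta^x)
    b_prev = np.zeros(order + 1)        # backward residuals at time t-1
    B_prev = np.full(order + 1, delta)  # backward error energies at t-1
    g_prev = np.ones(order + 1)         # conversion factors (cos^2 theta) at t-1
    e_out = np.zeros(T)

    for t in range(T):
        f = np.zeros(order + 1)         # forward residuals at time t, by order
        b = np.zeros(order + 1)         # backward residuals at time t
        F = np.zeros(order + 1)         # forward error energies
        B = np.zeros(order + 1)         # backward error energies
        g = np.ones(order + 1)          # conversion factors
        e = np.zeros(order + 1)         # joint residuals, by order
        f[0] = b[0] = y[t]
        F[0] = B[0] = lam * B_prev[0] + y[t] ** 2
        e[0] = x[t]

        for m in range(order):
            # self-prediction lattice: order m -> m+1
            Delta[m] = lam * Delta[m] + b_prev[m] * f[m] / g_prev[m]
            f[m + 1] = f[m] - (Delta[m] / B_prev[m]) * b_prev[m]
            b[m + 1] = b_prev[m] - (Delta[m] / F[m]) * f[m]
            F[m + 1] = F[m] - Delta[m] ** 2 / B_prev[m]
            B[m + 1] = B_prev[m] - Delta[m] ** 2 / F[m]
            g[m + 1] = g[m] - b[m] ** 2 / B[m]
            # joint prediction of x from y: order m -> m+1
            p[m] = lam * p[m] + b[m] * e[m] / g[m]
            e[m + 1] = e[m] - (p[m] / B[m]) * b[m]

        e_out[t] = e[order]
        b_prev, B_prev, g_prev = b, B, g

    return e_out
\end{verbatim}

A call such as joint_process_lsl(x, y, order=n, lam=0.99) then plays the role of the general LSL algorithm described above: the returned samples correspond to the joint residuals $\varepsilon^x_{n,T}(T)$ as $T$ advances.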

