Next: Conclusion
Up: THE LSL ALGORITHM
Previous: Statistical significance of
We are interested in the prediction of a time series
x_{T} from a related process y_{T}: we want to minimize the squared error
of the prediction of x_{T} from the past values of y_{T}.
Once again, we are interested only in the residual, so that
the residuals are affected only by past values, and not by future values.
This minimization problem leads to the normal equations of the
corresponding least-squares problem.
If we are interested in the residuals themselves, then
we have to find the orthogonal complement of x_{T}
with respect to the space Y_{1,k,T} spanned by the delayed copies of y_{T}.
So, the idea is to use the backward residuals of the self-prediction of y_{T}
to form an orthogonal basis of the space Y_{1,k,T}, and then
project the vector x_{T} onto this particular basis.
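This construction can be checked with a small numerical sketch (illustrative names, not the SEP code): Gram-Schmidt orthogonalization of the delayed copies of y plays the role of the backward residuals, and projecting x off that basis leaves exactly the least-squares residual, orthogonal to every past value of y.

```python
# Sketch: project x onto an orthogonal basis of the space spanned by
# delayed copies of y, and check that what is left of x is orthogonal
# to every past value of y.  All names here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
T, k = 200, 4                      # series length, number of delays
y = rng.standard_normal(T)
# x depends on past values of y plus noise
x = 0.8 * np.roll(y, 1) - 0.3 * np.roll(y, 2) + 0.1 * rng.standard_normal(T)
x[:k] = 0.0                        # discard start-up samples

# Columns are the delayed copies Z y, Z^2 y, ..., Z^k y of y.
Y = np.column_stack([np.roll(y, i) for i in range(1, k + 1)])
Y[:k] = 0.0                        # remove wrap-around from np.roll

# Gram-Schmidt: an orthogonal basis of the column space of Y.
basis = []
for col in Y.T:
    v = col.copy()
    for b in basis:
        v -= (b @ col) / (b @ b) * b
    basis.append(v)

# Subtract the projection of x onto each basis vector: the remainder is
# the orthogonal complement of x with respect to the space.
residual = x.copy()
for b in basis:
    residual -= (b @ residual) / (b @ b) * b

# The residual is orthogonal to every delayed copy of y ...
assert np.allclose(Y.T @ residual, 0.0, atol=1e-8)
# ... and matches the residual of a direct least-squares fit.
coef, *_ = np.linalg.lstsq(Y, x, rcond=None)
assert np.allclose(residual, x - Y @ coef, atol=1e-8)
```

The point of the basis is computational: once the backward residuals are orthogonal, the projection of x decomposes into independent one-dimensional projections, one per order.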
Consequently, we will run two recursions at the same time:
one to compute the variables of the self-prediction of y_{T}
(in particular the backward residuals r_{k,T}), and the other to compute
the residuals of the joint prediction of x_{T} from y_{T}.
The order-updating of the residuals e^{x}_{p,T} of the joint prediction
is very similar to the updating of the residuals of the self-prediction
of y_{T}. Effectively,

e^{x}_{p+1,T} = e^{x}_{p,T} - K^{x}_{p+1,T} r_{p,T}    (6)
We can define the cross-correlation Δ^{x}_{p+1,T} between e^{x}_{p,T}
and r_{p,T}, and the reflection coefficient
K^{x}_{p+1,T} = Δ^{x}_{p+1,T}/R_{p,T},
where R_{p,T} is the energy of the backward residual r_{p,T}.
The time-updating is similar to equation (5). I use here an identity
derived from Lee et al. (1981), saying that for any vectors u_{T} and
v_{T}, we have the relationship:

(u_{T}, v_{T}) = λ (u_{T-1}, v_{T-1}) + u_{T}(T) v_{T}(T) / γ_{T},    (7)

where γ_{T} is the likelihood variable.
This equation, used with u_{T}=y_{T} and v_{T}=Z^{k+1}y_{T}, leads
to the time-updating of the cross-correlation (equation (5)) in the
self-prediction of y_{T}. Here, using equation (7)
with u_{T}=x_{T} and v_{T}=Z^{k+1}y_{T}, we get the time-update recursion:

Δ^{x}_{k+1,T} = λ Δ^{x}_{k+1,T-1} + e^{x}_{k,T} r_{k,T} / γ_{k,T}    (8)
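The recursive accumulation behind an update of this form can be sketched as follows, with the likelihood variable γ set to 1 for simplicity (so this is the plain exponentially tapered cross-correlation, not the exact LSL update):

```python
# Sketch of the time-update idea: an exponentially tapered
# cross-correlation can be accumulated one sample at a time instead of
# re-summing the whole history at every step.  The likelihood variable
# gamma of the exact recursion is taken as 1 here for simplicity.
import numpy as np

rng = np.random.default_rng(2)
lam = 0.98                     # exponential taper
T = 300
u = rng.standard_normal(T)
v = rng.standard_normal(T)

delta = 0.0
for t in range(T):
    delta = lam * delta + u[t] * v[t]      # recursive time update

# Direct definition: Delta_T = sum over t of lam^(T-1-t) u_t v_t
weights = lam ** np.arange(T - 1, -1, -1)
delta_direct = np.sum(weights * u * v)
assert np.isclose(delta, delta_direct)
```

The recursion costs O(1) per time step, which is what makes the exact least-squares solution affordable sample by sample.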
The use of exponential tapering also gives a recursion (9) similar to
equation (6).
In conclusion, the recursions (6), (8), and (9), together
with the previous recursions (3), (4), and (5)
of the basic LSL algorithm, give us the entire set of recursions necessary
for the general LSL algorithm.
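Putting the pieces together, the overall structure can be sketched in batch form (illustrative names; block reflection coefficients computed from whole-series inner products replace the exact time recursions of the LSL algorithm): the self-prediction lattice on y generates the backward residuals, and the joint channel projects the x-residual off each of them in turn.

```python
# Batch sketch of the whole scheme: run a self-prediction lattice on y
# to generate (approximately orthogonal) backward residuals, and use
# them, order by order, to reduce the joint-prediction residual of x.
# This shows the order-recursive structure only, not the exact
# sample-by-sample LSL recursions.
import numpy as np

rng = np.random.default_rng(3)
T, order = 1000, 6
y = rng.standard_normal(T)
x = np.convolve(y, [0.0, 0.9, -0.4, 0.2], mode="full")[:T]
x += 0.05 * rng.standard_normal(T)

f = y.copy()                 # forward residual of y, order 0
b = y.copy()                 # backward residual of y, order 0
e_x = x.copy()               # joint-prediction residual, order 0
energy = [e_x @ e_x]

for p in range(order):
    b_prev = np.concatenate(([0.0], b[:-1]))    # Z b: one-sample delay
    # Joint channel: project the x-residual off the backward residual.
    K_x = (e_x @ b_prev) / (b_prev @ b_prev)
    e_x = e_x - K_x * b_prev
    # Self-prediction lattice stage for y (block reflection coefficients).
    K_f = (f @ b_prev) / (b_prev @ b_prev)
    K_b = (b_prev @ f) / (f @ f)
    f, b = f - K_f * b_prev, b_prev - K_b * f
    energy.append(e_x @ e_x)

# Each lattice stage can only decrease the joint-residual energy.
assert all(e2 <= e1 + 1e-12 for e1, e2 in zip(energy, energy[1:]))
assert energy[-1] < energy[0]
```

Each stage performs a one-dimensional least-squares deflation, so the joint-residual energy is non-increasing in the order, mirroring the order-update recursion (6).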
Stanford Exploration Project
1/13/1998