I come back to the general problem: estimating a time series x(t) from the past values of another series y(t). As in the LSL algorithm, we need to project the data vector x onto the space of regressors spanned by $\{y(t-1), \ldots, y(t-k)\}$. If $\lambda = 1$, the backward residuals $r_k(t)$ in the self-prediction of y form an exact orthogonal basis of this space. So the residuals in the joint prediction of x from y are the component of the series x(t) orthogonal to this basis, and verify:

$$\epsilon^x_{k+1}(t) = \epsilon^x_k(t) - \frac{\langle \epsilon^x_k, r_k\rangle}{\langle r_k, r_k\rangle}\, r_k(t), \qquad \epsilon^x_0(t) = x(t)$$
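To make the geometry concrete, here is a small batch-mode illustration in Python (as an assumption of this sketch, plain unweighted least squares stands in for the adaptive recursions): orthogonalizing the lagged regressors reproduces the backward residuals, their Gram matrix comes out diagonal, and the joint-prediction residual of x is orthogonal to the basis.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 400, 3
y = rng.standard_normal(n + p)
# x(t) depends on past values of y plus a little noise
x = 0.8 * y[p - 1 : n + p - 1] - 0.5 * y[p - 2 : n + p - 2] \
    + 0.1 * rng.standard_normal(n)

# Regressor matrix: columns are y(t-1), ..., y(t-p)
Y = np.column_stack([y[p - j : n + p - j] for j in range(1, p + 1)])

# Backward residual of order k: the part of the k-th regressor
# orthogonal to the more recent lags (batch Gram-Schmidt)
R = np.empty((n, p))
R[:, 0] = Y[:, 0]
for k in range(1, p):
    coef, *_ = np.linalg.lstsq(Y[:, :k], Y[:, k], rcond=None)
    R[:, k] = Y[:, k] - Y[:, :k] @ coef

# The Gram matrix is (numerically) diagonal: the residuals are orthogonal
G = R.T @ R

# Projecting x on this basis gives the joint-prediction residual,
# itself orthogonal to the basis
K = (R.T @ x) / np.diag(G)
eps = x - R @ K
```

Because the basis is orthogonal, each coordinate K[k] can be computed independently, which is exactly what makes the order-recursive lattice form possible.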
In the more general case (when $\lambda$ is close to 1), I assume that this formulation still holds. Using a reflection coefficient $K^x_k$, we get:

$$\epsilon^x_{k+1}(t) = \epsilon^x_k(t) - K^x_k\, r_k(t) \qquad (16)$$
To compute $K^x_k$, we minimize the energy:

$$E^x_{k,T} = \sum_t w_T(t)\,\big[\epsilon^x_k(t) - K^x_k\, r_k(t)\big]^2$$
As I already said, this expression of the energy forces the residuals to depend on past and future data, because the summation extends over the whole time window. But the formalism is still adaptive, because the weighting $w_T(t)$ is non-uniform and centered around the time of interest T. The minimization with respect to $K^x_k$ leads to the expression:

$$K^x_{k,T} = \frac{\sum_t w_T(t)\,\epsilon^x_k(t)\, r_k(t)}{\sum_t w_T(t)\, r_k(t)^2} \qquad (17)$$

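A quick numerical check of this closed form (the two-sided exponential window $\lambda^{|T-t|}$ used below is an assumed shape for $w_T(t)$, for illustration only): the ratio of weighted correlations does minimize the weighted energy, which is quadratic in the coefficient.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T, lam = 200, 120, 0.95
r = rng.standard_normal(n)              # backward residual r_k(t) of y
eps = 0.7 * r + rng.standard_normal(n)  # joint residual eps^x_k(t)
w = lam ** np.abs(np.arange(n) - T)     # weighting centered on T (assumed shape)

def energy(K):
    # weighted energy of the next-order residual eps^x_k - K * r_k
    return np.sum(w * (eps - K * r) ** 2)

# Closed-form minimizer: ratio of weighted cross- and auto-correlations
K_opt = np.sum(w * eps * r) / np.sum(w * r * r)
```

Since the energy is a parabola in K with positive curvature, perturbing K_opt in either direction can only increase it.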
The denominator is indeed the one involved in the computation of the reflection coefficient $K^r_{k,T}$, and we already saw how to compute it recursively. The numerator can also be split into past and future summations $N^{x-}_k(t)$ and $N^{x+}_k(t)$ verifying:

$$N^{x-}_k(T) = \lambda\, N^{x-}_k(T-1) + \epsilon^x_k(T)\, r_k(T), \qquad N^{x+}_k(T) = \lambda\,\big[N^{x+}_k(T+1) + \epsilon^x_k(T+1)\, r_k(T+1)\big] \qquad (18)$$

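Assuming the same two-sided exponential window $\lambda^{|T-t|}$ (an assumption of this sketch, not necessarily the exact weighting of the text), the past part can be updated by a forward recursion and the future part by a backward recursion, and their sum matches the direct weighted summation:

```python
import numpy as np

rng = np.random.default_rng(2)
n, lam = 100, 0.9
c = rng.standard_normal(n)   # pointwise product eps^x_k(t) * r_k(t)

# Past part: N-(T) = sum over t <= T of lam^(T-t) * c(t), forward recursion
Nm = np.zeros(n)
Nm[0] = c[0]
for T in range(1, n):
    Nm[T] = lam * Nm[T - 1] + c[T]

# Future part: N+(T) = sum over t > T of lam^(t-T) * c(t), backward recursion
Np = np.zeros(n)
for T in range(n - 2, -1, -1):
    Np[T] = lam * (Np[T + 1] + c[T + 1])

# Direct two-sided weighted sum at an arbitrary T, for comparison
T = 40
direct = sum(lam ** abs(T - t) * c[t] for t in range(n))
```

Each recursion costs O(1) per time step, so the full numerator is obtained in two linear passes instead of one O(n) summation per T.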
In conclusion, the idea of the algorithm is to perform simultaneously an adaptive self-prediction of the data y(t), with my modified version of Burg's adaptive algorithm, and the adaptive prediction of x from y, with the recursions derived in this section. The self-prediction of y is processed with the recursion formulas (13), (15), and (16). The prediction of x is performed with the recursions (17), (18), and (19).
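As a rough sketch of how the pieces fit together, the following Python implements a simplified variant with a one-sided exponential window (forgetting factor lam), so the "future" halves of the recursions are dropped; the function name, initialization constants, and update order are illustrative, not the exact recursions (13)-(19) of the text.

```python
import numpy as np

def joint_lattice(y, x, order, lam=0.99, eps0=1e-6):
    """Sketch: adaptive lattice self-prediction of y (Burg-like stages)
    plus joint prediction of x from the backward residuals of y.
    One-sided exponential forgetting lam; eps0 avoids division by zero."""
    n = len(y)
    f = np.zeros(order + 1)        # forward residuals f_k(t)
    b = np.zeros(order + 1)        # backward residuals r_k(t)
    b_old = np.zeros(order + 1)    # r_k(t-1)
    num = np.zeros(order)          # Burg numerators for K^r_k
    den = np.full(order, eps0)     # Burg denominators (residual energies)
    numx = np.zeros(order)         # numerators for K^x_k
    denx = np.full(order, eps0)    # denominators for K^x_k
    err = np.zeros(n)              # final joint residual eps^x_order(t)
    for t in range(n):
        b_old[:] = b
        f[0] = b[0] = y[t]
        e = x[t]                   # joint residual, order 0
        for k in range(order):
            # joint-process stage: project the residual on r_k(t)
            numx[k] = lam * numx[k] + e * b[k]
            denx[k] = lam * denx[k] + b[k] ** 2
            e = e - (numx[k] / denx[k]) * b[k]
            # lattice stage: Burg-type reflection coefficient for y
            num[k] = lam * num[k] + 2.0 * f[k] * b_old[k]
            den[k] = lam * den[k] + f[k] ** 2 + b_old[k] ** 2
            kr = num[k] / den[k]
            f[k + 1] = f[k] - kr * b_old[k]
            b[k + 1] = b_old[k] - kr * f[k]
        err[t] = e
    return err
```

With x(t) generated from past values of a white series y(t), the residual energy drops well below the energy of x once the coefficients have adapted.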