Once again, we are interested only in the residuals, so that they are affected only by past values and not by future values. This minimization problem leads to:
If we are interested in the residuals themselves, then we have to find the orthogonal complement of $x_T$ with respect to the space $Y_{1,k,T}$. The idea is therefore to use the backward residuals of the self-prediction of $y_T$ to form an orthogonal basis of the space $Y_{1,k,T}$, and then to project the vector $x_T$ onto this particular basis. Consequently, we will carry out two recursions at the same time: one to compute the variables of the self-prediction of $y_T$, in particular the backward residuals $r_{k,T}$, and the other to compute the residuals of the joint prediction of $x_T$ from $y_T$.
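To make this geometric picture concrete, here is a minimal NumPy sketch (the variable names and the delays $1,\dots,k$ used to build the space are assumptions of this sketch, not taken from the text): it checks that projecting $x_T$ onto an orthogonal basis of delayed copies of $y_T$ leaves exactly the residual of a direct least-squares fit, with the QR factorization standing in for the backward-residual orthogonalization.

```python
import numpy as np

rng = np.random.default_rng(0)
T, k = 50, 4
y = rng.standard_normal(T + k)
x = rng.standard_normal(T)

# Columns of Y are delayed copies of y; they span the analogue of Y_{1,k,T}.
Y = np.column_stack([y[k - d : k - d + T] for d in range(1, k + 1)])

# QR orthogonalizes the columns in order, which is (up to scaling) what the
# backward residuals of the self-prediction of y provide.
Q, _ = np.linalg.qr(Y)
eps_lattice = x - Q @ (Q.T @ x)           # project x, keep orthogonal complement

coef, *_ = np.linalg.lstsq(Y, x, rcond=None)
eps_direct = x - Y @ coef                 # residual of the direct LS fit

print(np.allclose(eps_lattice, eps_direct))  # True: same orthogonal complement
```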
The order-updating of the joint-prediction residuals is very similar to the order-updating of the self-prediction residuals.
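Indeed, because the backward residuals are mutually orthogonal, passing from order $k$ to order $k+1$ simply removes from the joint residual its component along the newest basis vector $r_{k,T}$. Writing $\varepsilon_{k,T}$ for the order-$k$ joint-prediction residual of $x_T$ (a notation assumed here for this sketch), the update takes the form

$$
\varepsilon_{k+1,T} \;=\; \varepsilon_{k,T} \;-\; \frac{\langle x_T,\, r_{k,T}\rangle}{\langle r_{k,T},\, r_{k,T}\rangle}\, r_{k,T},
$$

and, since $r_{k,T}$ is orthogonal to the components already removed, $\langle x_T, r_{k,T}\rangle = \langle \varepsilon_{k,T}, r_{k,T}\rangle$.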
The time-updating is similar to equation (5). Here we use an identity derived from Lee et al. (1981), which states that for any vectors $u_T$ and $v_T$ we have the relationship:
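As a hedged guide only, time-update relations of this family in the lattice literature typically take a shape such as

$$
\langle u_T,\, v_T\rangle \;=\; \langle u_{T-1},\, v_{T-1}\rangle \;+\; \frac{u_T(T)\, v_T(T)}{\gamma_T},
$$

where $u_T(T)$ and $v_T(T)$ denote the most recent entries of the vectors and $\gamma_T$ is a likelihood (angle) variable; the exact placement of the time indices and of $\gamma_T$ is an assumption here, not a quotation of Lee et al. (1981).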
In conclusion, the recursions (7) and (9), together with the previous recursions (3), (4), and (5) of the basic LSL algorithm, give the complete set of recursions needed for the general LSL algorithm.
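As an illustration of how the two recursions mesh, the following batch sketch runs the self-prediction orthogonalization and the joint-prediction order-update side by side. All function and variable names are hypothetical, and a plain Gram-Schmidt loop replaces the exact recursions (3), (4), (5), (7), and (9); this shows the geometry of the general LSL algorithm, not its exact time-recursive form.

```python
import numpy as np

def joint_lattice_batch(y, x, order):
    """Batch sketch of the two parallel recursions of the general LSL idea:
    one recursion orthogonalizes the past of y (stand-ins for the backward
    residuals r_{k,T}), the other peels the corresponding component off the
    joint-prediction residual of x."""
    T = len(x)
    eps = x.astype(float).copy()          # order-0 joint residual: x itself
    basis = []                            # orthogonal backward residuals so far
    for k in range(order):
        # "Self-prediction" recursion: delay y by k+1 samples, then remove the
        # components lying in the span of the residuals already computed.
        r = np.concatenate([np.zeros(k + 1), y[: T - k - 1]])
        for b in basis:
            r -= (b @ r) / (b @ b) * b
        basis.append(r)
        # "Joint-prediction" recursion: order-update of the residual of x,
        # subtracting its component along the new backward residual.
        eps -= (r @ eps) / (r @ r) * r
    return eps

rng = np.random.default_rng(0)
y = rng.standard_normal(200)
x = 0.8 * np.roll(y, 1) + 0.1 * rng.standard_normal(200)
x[0] = 0.0                                # prewindowed convention: no past data
eps = joint_lattice_batch(y, x, order=3)
print(float(eps @ eps) < float(x @ x))    # residual energy shrinks -> True
```

The inner loop over `basis` plays the role of the self-prediction recursion, while the final subtraction is the joint-prediction order-update; the true LSL algorithm obtains the same residuals with per-order scalar recursions in time.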