next up previous print clean
Next: Exponential weighting Up: THE LSL ALGORITHM Previous: Recursions: order updating

Recursions: time updating

In this part, the length of the filter is fixed, but we increase the length of the processing window (from $T\!-\!1$ to T). Equation (3) tells us how to compute $\varepsilon_{k+1,T}$ from $\varepsilon_{k,T}$. But actually, we are only interested in the $\varepsilon_{k,T}(T)$, and we don't want to compute, nor store, the other values $\varepsilon_{k,T}(t)$ ($t=0,\cdots,T-1$) which form the vector $\varepsilon_{k,T}$. It seems difficult then to compute $\Delta_{k+1,T}$, $R^{\varepsilon}_{k,T}$, or Rrk,T-1 !

However, time-updating formulas also exist for these values, which only imply $\varepsilon_{k,T}(T)$ and rk,T-1(T). Their derivation is long, and I refer to Lee et al. (1981) for an elegant proof. These authors use two other intermediate variables, the angles $\theta_{1,k,T}$, and $\theta_{0,k,T}$; especially, $\theta_{1,k,T}$ is defined as:

\begin{displaymath}
\sin^2\theta_{1,k,T}=\pi'_T(P_{1,k,T}\pi_T) \; \mbox{\hspace{0.5cm}with } \pi_T=(0,\cdots,0,1)'\end{displaymath}

The recursions are now:  
 \begin{displaymath}
\left\{\begin{array}
{ccl}
\Delta_{k+1,T}&=&\Delta_{k+1,T-1}...
 ...lon^2_{k,T}(T)\over R^{\varepsilon}_{k,T-1}}}\end{array}\right.\end{displaymath} (4)
In conclusion, the recursions (3), (4), and (5) give us the entire set of recursions we need for the basic LSL algorithm. The initialization procedures are precised in the sketch of the general LSL algorithm I add at the end of the paper. The most important one is the choice of an a priori value for the covariances, which will stabilize the process by avoiding divisions by 0: it has more or less the stabilizing effect of a prewhitening process on the data. After a few samples, the algorithm should not be sensitive anymore to this a priori value.


next up previous print clean
Next: Exponential weighting Up: THE LSL ALGORITHM Previous: Recursions: order updating
Stanford Exploration Project
1/13/1998