Least mean squares (LMS) is an adaptation algorithm with low computational complexity. While least squares (LS) is an off-line batch algorithm, LMS is a fast on-line algorithm.
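For illustration, the following is a minimal sketch of the standard LMS recursion that realizes this on-line behavior: the weights are updated sample by sample from the instantaneous error, rather than by solving a batch problem. The function name `lms_filter`, the step size `mu`, the tap count `num_taps`, and the desired signal `d` are illustrative assumptions; this section does not spell out the update rule itself.

```python
import numpy as np

def lms_filter(x, d, num_taps=4, mu=0.01):
    """Standard LMS sketch: adapt the weight vector on-line, one sample at a time."""
    w = np.zeros(num_taps)               # filter coefficients, adapted on-line
    y = np.zeros(len(x))                 # filter outputs
    for t in range(num_taps, len(x)):
        x_t = x[t - num_taps:t][::-1]    # most recent num_taps input samples
        y[t] = w @ x_t                   # current filter output
        e = d[t] - y[t]                  # instantaneous error against the desired signal
        w = w + mu * e * x_t             # LMS weight update
    return w, y
```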

Trends

Up to now, batch signal processing algorithms have been preferred to their instantaneous (on-line) counterparts, because they are simpler to implement and because fast digital signal processing cores have been developed for them. Recently, researchers have realized that batch algorithms cannot be optimally efficient for parallel and nonlinear signal processing, whereas biological neural networks operate on the basis of instantaneous algorithms.

Algorithm

System modeling

We assume that the input signal is denoted by $ x_i(t) $, where $ i $ is the sampling index and $ t $ is the time index. The filtered output signal is modeled as follows:

$ y(t) = \sum_{i=1}^{N} w_i x_i(t) + n(t) $

where $ w_i $ is the $ i $th filter coefficient and $ n(t) $ is the noise signal at time $ t $.
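As a concrete sketch of this model, the snippet below forms the output as a weighted sum of the $ N $ input signals plus additive noise. The names `simulate_output`, `noise_std`, and the array layout (inputs stacked as an $ N \times T $ matrix) are illustrative assumptions, not from the article.

```python
import numpy as np

def simulate_output(x, w, noise_std=0.1):
    """x: array of shape (N, T) with x[i, t] = x_i(t); w: length-N weight vector."""
    N, T = x.shape
    noise = noise_std * np.random.randn(T)   # n(t), assumed Gaussian here
    return w @ x + noise                     # y(t) = sum_i w_i x_i(t) + n(t)
```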

Derivation of adaptive algorithms

When the set of input signal patterns and the corresponding output signals is fixed, the optimal weight vector can be defined in terms of an output error criterion. If there is no noise signal, the exact weight vector can be found by matrix inversion. First, all the output signals are rearranged as follows:

$ \mathbf{y} = \mathbf{X} \mathbf{w} $

where $ \mathbf{y} $ is the concatenated output signal vector and $ \mathbf{X} $ is the concatenated input signal matrix. Then, the exact weight vector is given by

$ \mathbf{w} = \mathbf{X}^{-1} \mathbf{y} $

where we assume that $ \mathbf{X} $ is a mathematically invertible matrix.
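The following is a minimal sketch of this noise-free, exactly determined case, assuming a square invertible $ \mathbf{X} $. It solves $ \mathbf{X}\mathbf{w} = \mathbf{y} $ with `np.linalg.solve` rather than forming $ \mathbf{X}^{-1} $ explicitly; the example data and the name `w_true` are assumptions for illustration.

```python
import numpy as np

N = 3
w_true = np.array([0.5, -1.0, 2.0])      # weights to be recovered
X = np.random.randn(N, N)                # N samples of the N input signals, assumed invertible
y = X @ w_true                           # noise-free concatenated outputs
w_hat = np.linalg.solve(X, y)            # equivalent to X^{-1} y
print(np.allclose(w_hat, w_true))        # True: the exact weights are recovered
```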