Least mean squares (LMS) is an adaptation algorithm with low computational complexity. Whereas least squares (LS) is an off-line batch algorithm, LMS is a fast on-line algorithm.
Until now, batch-type signal processing algorithms have been preferred to their instantaneous counterparts because they are simple to implement on fast digital signal processing cores. Recently, however, researchers have realized that batch-type algorithms cannot be optimally efficient for parallel and nonlinear signal processing, whereas our own neural networks operate on instantaneous-type algorithms.
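To make the on-line character of LMS concrete, the following is a minimal sketch of a single LMS weight update. The step size `mu`, the toy 2-tap target system, and the iteration count are illustrative assumptions, not values from the text.

```python
import numpy as np

def lms_update(w, x, d, mu=0.01):
    """One on-line LMS step: correct the weights from a single sample.

    w  -- current weight vector
    x  -- current input vector (one sample pattern)
    d  -- desired (observed) output for this sample
    mu -- step size; a hypothetical small constant, not from the text
    """
    e = d - np.dot(w, x)      # instantaneous output error
    return w + mu * e * x     # stochastic-gradient correction

# Toy usage: adapt toward a known 2-tap system (noiseless, for clarity).
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3])
w = np.zeros(2)
for _ in range(2000):
    x = rng.standard_normal(2)
    d = np.dot(w_true, x)     # noiseless desired signal
    w = lms_update(w, x, d, mu=0.05)
```

Unlike a batch LS solve, each update touches only the current sample, which is what makes the algorithm cheap and on-line.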
System modeling
We assume that the input signal is denoted as $x_k(n)$, where $k$ is the sampling index and $n$ is the time index. The filtered output signal is modeled as follows:

$$y(n) = \sum_{k=1}^{N} w_k x_k(n) + v(n),$$

where $w_k$ is the $k$-th filter coefficient and $v(n)$ is the noise signal at time $n$.
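The output model above can be evaluated directly as a matrix-vector product. The weights, input values, and noise samples below are made-up numbers chosen only to illustrate the computation.

```python
import numpy as np

# Sketch of the output model y(n) = sum_k w_k * x_k(n) + v(n).
# All numeric values here are illustrative assumptions.
w = np.array([0.4, 0.2, -0.1])          # filter coefficients w_k, N = 3 taps
x = np.array([[1.0, 0.5, -0.2],         # x_k(n): rows are time n, columns are taps k
              [0.3, -0.7, 0.9]])
v = np.array([0.01, -0.02])             # noise v(n) at each time index

y = x @ w + v                           # filtered output y(n) for each time index
```

Each row of `x` holds the input samples for one time index, so the sum over $k$ becomes one row-times-vector product.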
Derivation of adaptive algorithms
When the set of input signal patterns and the corresponding output signals is fixed, the optimal weight vector can be defined in terms of an output-error criterion. If there is no noise signal, the exact weight vector can be found by matrix inversion. First, all the output signals are rearranged as follows:

$$\mathbf{y} = X \mathbf{w},$$

where $\mathbf{y}$ is the concatenated output signal vector and $X$ is the concatenated input signal matrix. Then, the exact weight vector is given by

$$\mathbf{w} = X^{-1} \mathbf{y},$$

where we assume that $X$ is a mathematically invertible matrix.
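The noiseless recovery above can be sketched as follows. The weights and input patterns are random illustrative values, and `numpy.linalg.solve` is used in place of forming $X^{-1}$ explicitly, which is numerically preferable but computes the same $\mathbf{w} = X^{-1}\mathbf{y}$.

```python
import numpy as np

# Noiseless case: stack N input patterns into a square matrix X and
# recover the exact weights from y = X w. Values are illustrative.
rng = np.random.default_rng(1)
N = 4
w_true = rng.standard_normal(N)      # unknown weight vector to recover
X = rng.standard_normal((N, N))      # concatenated input signal matrix
y = X @ w_true                       # concatenated output signal vector (no noise)

w_hat = np.linalg.solve(X, y)        # exact weights, assuming X is invertible
```

With noise present or with more patterns than weights, $X$ is no longer square and this exact inversion gives way to the least-squares and LMS solutions the section builds toward.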