An, P.E., Brown, M. and Harris, C.J.
On the Convergence Rate Performance of Normalized Least-Mean-Square Adaptation.
IEEE Trans. on Neural Networks, 6(6).
This paper compares the convergence rate performance of the Normalized Least-Mean-Square (NLMS) algorithm with that of the standard Least-Mean-Square (LMS) algorithm. The comparison is based on a well-known interpretation of the NLMS algorithm as a form of LMS with input normalization. With this interpretation, the analysis is considerably simplified, and the difference in the rate of parameter convergence can be compared directly by evaluating the condition numbers of the normalized and unnormalized input correlation matrices. The main contribution of this paper has two parts. Firstly, it derives exact condition number expressions for the normalized input correlation matrix for an arbitrary odd filter length when the unnormalized matrix has two distinct eigenvalues; the corresponding even-order NLMS condition numbers are shown to be bounded between their odd-order counterparts. These expressions require that the input samples be statistically stationary, zero-mean and Gaussian distributed, and they provide an important insight into the convergence performance of the NLMS algorithm, relative to that of the LMS, as a function of the filter length. Secondly, the paper provides a conjecture that sets bounds on the NLMS condition number for an arbitrary number of distinct unnormalized eigenvalues; this conjecture has been found to be consistent with extensive computer simulations. Given the same maximum and minimum unnormalized eigenvalues but varying power levels associated with the intermediate eigenvalues, this bound suggests that the NLMS convergence rate decreases with the number of unnormalized eigenvalues carrying excessive power.
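The comparison described above can be illustrated numerically. The following numpy sketch (not taken from the paper; the eigenvalue choices are illustrative assumptions) constructs a zero-mean Gaussian input whose correlation matrix has two distinct eigenvalues, estimates the normalized input correlation matrix E[xxᵀ/‖x‖²] by Monte Carlo, and compares the two condition numbers:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5  # odd filter length, matching the case treated exactly in the paper

# Build an unnormalized correlation matrix R with two distinct eigenvalues
# (illustrative values, not from the paper): one at 10, the rest at 1.
lam_max, lam_min = 10.0, 1.0
Q = np.linalg.qr(rng.standard_normal((n, n)))[0]   # random orthonormal basis
eigs = np.array([lam_max] + [lam_min] * (n - 1))
R = Q @ np.diag(eigs) @ Q.T

# Draw stationary zero-mean Gaussian inputs x_k ~ N(0, R).  NLMS effectively
# adapts with x_k / ||x_k||, so its relevant correlation matrix is
# E[x xᵀ / ||x||²], estimated here by averaging over many samples.
L = np.linalg.cholesky(R)
X = (L @ rng.standard_normal((n, 200_000))).T
R_norm = (X[:, :, None] * X[:, None, :] / (X**2).sum(1)[:, None, None]).mean(0)

cond = np.linalg.cond(R)            # LMS eigenvalue spread: lam_max/lam_min
cond_norm = np.linalg.cond(R_norm)  # NLMS eigenvalue spread after normalization
print(f"LMS  condition number: {cond:.2f}")
print(f"NLMS condition number: {cond_norm:.2f}")
```

For this two-eigenvalue Gaussian input, the estimated NLMS condition number comes out smaller than the LMS one, which is the mechanism behind the faster NLMS parameter convergence that the paper quantifies exactly.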