Chen, S., Samingan, A.K. and Hanzo, L.
Adaptive near minimum error rate training for neural networks with application to multiuser detection in CDMA communication systems
Signal Processing, 85(7).
Adaptive training of neural networks is typically done using a stochastic gradient algorithm that attempts to minimize the mean square error (MSE). For many applications, such as channel equalization and code-division multiple-access (CDMA) multiuser detection, the true goal is to minimize the error probability, and adopting the MSE criterion can lead to poor performance. A novel adaptive near minimum error rate algorithm, called the least bit error rate (LBER) algorithm, is developed for training neural networks in these kinds of applications. The proposed method is applied to multiuser detection in CDMA communication systems. Simulation results show that the LBER algorithm has a good convergence speed, and a small radial basis function (RBF) network trained by this adaptive algorithm can closely match the performance of the optimal Bayesian multiuser detector. The results also confirm that training the neural network multiuser detector with the least mean square (LMS) algorithm, although it converges well in the MSE, can produce poor error-rate performance.
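The core idea behind LBER-style training is to replace the MSE cost with a kernel-smoothed estimate of the bit error rate and follow its stochastic gradient. The following is a minimal sketch of that idea, not the paper's implementation: it trains a simple linear detector (rather than an RBF network) for a hypothetical two-user synchronous CDMA system, and the spreading codes, step size `mu`, and kernel width `rho` are illustrative values chosen for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy synchronous 2-user CDMA system (illustrative, not from the paper):
# two unit-norm length-4 spreading codes with nonzero cross-correlation.
S = np.array([[+1.0, +1.0, +1.0, +1.0],
              [+1.0, +1.0, -1.0, +1.0]]) / 2.0
sigma = 0.3   # channel noise standard deviation
mu = 0.05     # step size (assumed value)
rho = 0.4     # kernel width of the smoothed BER estimate (assumed value)

# Linear detector for user 1, initialised to the matched filter (its own code).
w = S[0].copy()

for _ in range(5000):
    b = rng.choice([-1.0, 1.0], size=2)            # users' bits
    x = S.T @ b + sigma * rng.standard_normal(4)   # received chip vector
    y = w @ x                                      # detector output
    ys = b[0] * y                                  # signed decision variable
    # LBER-style update: stochastic gradient descent on a Gaussian-kernel
    # smoothed estimate of Pr(ys < 0); the 1/(sqrt(2*pi)*rho) factor of the
    # kernel derivative is absorbed into mu.
    w += mu * np.exp(-ys**2 / (2.0 * rho**2)) * b[0] * x
    w /= np.linalg.norm(w)                         # keep the detector scale fixed

# Estimate the trained detector's BER on fresh data.
n_test = 20000
b_test = rng.choice([-1.0, 1.0], size=(n_test, 2))
x_test = b_test @ S + sigma * rng.standard_normal((n_test, 4))
ber = np.mean(np.sign(x_test @ w) != b_test[:, 0])
print(f"estimated BER: {ber:.4f}")
```

The update only moves `w` when the decision variable is near (or on the wrong side of) the decision boundary, which is exactly why it targets the error rate rather than the average squared error; an LMS update, by contrast, pushes on every sample, including ones already detected with a large margin.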
Available Versions of this Item
Adaptive near minimum error rate training for neural networks with application to multiuser detection in CDMA communication systems (deposited 06 Jun 2005)