Publication
IJCNN 1998
Conference paper

A case study on Bagging, Boosting, and Basic ensembles of neural networks for OCR

Abstract

We study the effectiveness of three neural network ensembles in improving OCR performance: (i) Basic, (ii) Bagging, and (iii) Boosting. Three random character degradation models are introduced in the training of the individual networks in order to reduce the error correlation between networks and to improve their generalization ability. We compare the recognition accuracies of the three ensembles at various reject rates. An interesting finding in our comparison is that, although the Boosting ensemble is slightly more accurate than the Basic and Bagging ensembles at zero reject rate, the advantage of Boosting over the Basic and Bagging ensembles quickly disappears as more patterns are rejected. Eventually, the Basic and Bagging ensembles outperform the Boosting ensemble at high reject rates. An explanation of this phenomenon is provided in the paper. We also apply the optimal linear combiner (in the least-squares sense) to each of the three ensembles to capture their different error correlation characteristics. We find that the optimal linear combiner is very effective in reducing mean square error, but is not necessarily as effective as a simple averaging method in reducing classification error.
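The two combiners compared in the abstract can be sketched as follows: simple averaging takes the unweighted mean of the member outputs, while the optimal linear combiner fits combination weights by least squares on validation outputs. This is a minimal sketch, not the paper's implementation; the data is synthetic and all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N validation patterns, K ensemble members.
# Each member's output is the true target plus member-specific noise.
N, K = 200, 3
target = rng.random(N)                       # desired outputs
noise = rng.normal(0.0, 0.1, size=(N, K))    # correlated only by chance here
outputs = target[:, None] + noise            # member outputs, shape (N, K)

# (i) Simple averaging combiner.
avg = outputs.mean(axis=1)
mse_avg = np.mean((avg - target) ** 2)

# (ii) Optimal linear combiner in the least-squares sense:
# weights w minimize ||outputs @ w - target||^2.
w, *_ = np.linalg.lstsq(outputs, target, rcond=None)
opt = outputs @ w
mse_opt = np.mean((opt - target) ** 2)

# On the data used to fit w, the least-squares combiner cannot have
# higher MSE than the simple average, since uniform weights are feasible.
assert mse_opt <= mse_avg + 1e-12
```

Note that a lower mean square error does not guarantee fewer classification errors, which is exactly the distinction the abstract draws between the two combiners.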
