Publication
Proceedings of the IEEE
Paper
Performance and efficiency: Recent advances in supervised learning
Abstract
This paper reviews recent advances in supervised learning with a focus on two of the most important issues: performance and efficiency. Performance addresses the generalization capability of a learning machine on randomly chosen samples that are not included in the training set. Efficiency deals with the complexity of a learning machine in both space and time. Since these two issues are common to various learning machines and learning approaches, we focus on a special type of adaptive learning system with a neural architecture. We discuss four types of learning approaches: training an individual model; combinations of several well-trained models; combinations of many weak models; and evolutionary computation of models. We explore the advantages and weaknesses of each approach and their interrelations, and we pose open questions for possible future research.
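As an illustration of the third approach mentioned above, combining many weak models, the following is a minimal AdaBoost-style sketch that boosts decision stumps on a toy one-dimensional dataset. The dataset, function names, and parameters are hypothetical examples, not taken from the paper; no single stump can fit the interval-shaped labels here, but a weighted combination of weak stumps can.

```python
import math

# Toy 1-D dataset (hypothetical): labels are +1 on the outer intervals
# and -1 in the middle, so no single threshold rule is sufficient.
X = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
y = [1, 1, 1, -1, -1, -1, -1, 1, 1, 1]

def stump_predict(threshold, polarity, x):
    """A weak model: predict `polarity` below the threshold, else its negation."""
    return polarity if x < threshold else -polarity

def best_stump(weights):
    """Exhaustively pick the stump with the lowest weighted training error."""
    best, best_err = None, float("inf")
    for threshold in [x + 0.05 for x in X]:
        for polarity in (1, -1):
            err = sum(w for w, x, t in zip(weights, X, y)
                      if stump_predict(threshold, polarity, x) != t)
            if err < best_err:
                best_err, best = err, (threshold, polarity)
    return best, best_err

def adaboost(rounds):
    """Combine many weak stumps into a strong weighted-vote ensemble."""
    n = len(X)
    weights = [1.0 / n] * n
    ensemble = []  # list of (alpha, threshold, polarity)
    for _ in range(rounds):
        (threshold, polarity), err = best_stump(weights)
        err = max(err, 1e-10)  # avoid log/divide-by-zero on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, threshold, polarity))
        # Reweight the samples: misclassified points gain weight,
        # so the next weak model focuses on them.
        weights = [w * math.exp(-alpha * t * stump_predict(threshold, polarity, x))
                   for w, x, t in zip(weights, X, y)]
        z = sum(weights)
        weights = [w / z for w in weights]
    return ensemble

def predict(ensemble, x):
    """Weighted vote of all weak models."""
    score = sum(a * stump_predict(th, p, x) for a, th, p in ensemble)
    return 1 if score >= 0 else -1

ensemble = adaboost(rounds=5)
accuracy = sum(predict(ensemble, x) == t for x, t in zip(X, y)) / len(X)
print(accuracy)  # the ensemble fits the training set exactly: 1.0
```

The best individual stump misclassifies 3 of the 10 points (70% accuracy), while the five-stump ensemble reaches 100% training accuracy, illustrating the performance gain available from combining weak models; the efficiency cost is that space and prediction time grow with the number of boosting rounds.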