Publication
ICNN 1997
Conference paper

A note on the effective number of parameters in nonlinear learning systems

Abstract

Moody's notion of the effective number of parameters in a nonlinear learning system has been used to study the generalization ability of feedforward neural networks. It is more meaningful than the number of free parameters because it explains explicitly how the generalization error is related to the expected training set error. In this paper, we extend Moody's model to include a more general noise model. We show that adding noise to both the sampling points of the test data and the observations increases the deviation of the expected test set mean-squared error (MSE) from the expected training set MSE, and also increases the effective number of parameters. Our extension makes less restrictive assumptions about the data-generation process than Moody's original formulation. Monte Carlo experiments have been conducted to verify our extension and to demonstrate the role of weight-decay regularization in improving the generalization ability of feedforward networks. © 1997 IEEE.
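The relation underlying Moody's notion, stated here in its standard form as an assumption since the paper's exact notation is not reproduced on this page, ties the expected test-set error to the expected training-set error through the effective number of parameters:

    \langle \mathrm{MSE}_{\mathrm{test}} \rangle \;\approx\; \langle \mathrm{MSE}_{\mathrm{train}} \rangle + \frac{2\,\sigma_{\mathrm{eff}}^{2}}{n}\, p_{\mathrm{eff}}(\lambda)

where n is the number of training examples, \sigma_{\mathrm{eff}}^{2} is the effective noise variance, and \lambda is the regularization (e.g., weight-decay) strength. Larger noise or a larger p_eff widens the gap between test and training error, which is the deviation the abstract refers to.

The sketch below is a minimal stand-in for the paper's Monte Carlo experiments, using a linear model with weight decay (ridge regression) rather than a feedforward network; all names and settings are illustrative assumptions, not the paper's actual setup. For a linear smoother, p_eff(\lambda) has the closed form tr[X (X^T X + \lambda I)^{-1} X^T], which the code evaluates directly.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (all values hypothetical): a linear model as a
# stand-in for the paper's feedforward networks.
n, p = 50, 15                        # training examples, raw parameters
X = rng.normal(size=(n, p))          # training sampling points
w_true = rng.normal(size=p)
sigma = 0.5                          # observation-noise std
y = X @ w_true + sigma * rng.normal(size=n)

# Test data with noise in BOTH the sampling points and the observations:
# targets come from the clean inputs, but the model only sees noisy ones.
X_clean = rng.normal(size=(n, p))
X_test = X_clean + 0.3 * rng.normal(size=(n, p))        # input noise
y_test = X_clean @ w_true + sigma * rng.normal(size=n)  # output noise

for lam in [0.0, 0.1, 1.0, 10.0]:
    # Weight-decay (ridge) solution: w = (X^T X + lam I)^{-1} X^T y
    A = X.T @ X + lam * np.eye(p)
    w = np.linalg.solve(A, X.T @ y)

    # Effective number of parameters: trace of the smoother matrix,
    # p_eff(lam) = tr[ X (X^T X + lam I)^{-1} X^T ]
    p_eff = np.trace(X @ np.linalg.solve(A, X.T))

    mse_train = np.mean((y - X @ w) ** 2)
    mse_test = np.mean((y_test - X_test @ w) ** 2)
    print(f"lam={lam:5.1f}  p_eff={p_eff:5.2f}  "
          f"train MSE={mse_train:.3f}  test MSE={mse_test:.3f}")

As \lambda grows, p_eff shrinks and the training MSE rises slightly, while the test MSE on data with noise in both the sampling points and the observations typically improves over some range of \lambda, illustrating the regularization effect the abstract describes.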
