An effective algorithm for hyperparameter optimization of neural networks

Abstract

A major challenge in designing neural network (NN) systems is to determine the best structure and parameters for the network given the data for the machine learning problem at hand. Examples of such parameters are the number of layers and nodes, the learning rates, and the dropout rates. Typically, these parameters are chosen based on heuristic rules and manually fine-tuned, which can be very time-consuming because evaluating the performance of a single parametrization of the NN may require several hours. In this paper, we address the problem of choosing appropriate parameters for the NN by formulating it as a box-constrained mathematical optimization problem and applying a derivative-free optimization tool that automatically and effectively searches the parameter space. The optimization tool employs a radial basis function model of the objective function (the prediction accuracy of the NN) to accelerate the discovery of configurations yielding high accuracy. Candidate configurations explored by the algorithm are trained for a small number of epochs, and only the most promising candidates receive full training. The performance of the proposed methodology is assessed on benchmark sets and in the context of predicting drug-drug interactions, showing promising results. The optimization tool used in this paper is open source.

Date

01 Jul 2017

Publication

IBM Journal of Research and Development
