Publication
INFORMS 2021
Talk
Black-box Optimization for Optimizing Expensive Functions with Mixed Inputs
Abstract
We propose a deep neural network-based optimization method for minimizing expensive black-box functions with mixed categorical-continuous inputs and linear constraints. A ReLU deep neural network is fit to the historical data to serve as a surrogate model. To overcome the non-smoothness and poor local minima of the training problem, a smoothed DNN is trained with a second-order optimization method. A new sample is then obtained by solving a linearized version of the DNN surrogate model.
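Below is a minimal sketch, not the authors' code, illustrating the pipeline the abstract describes on a purely continuous toy problem (the categorical handling is omitted for brevity). It fits a smoothed ReLU surrogate to historical data with a quasi-Newton optimizer as a stand-in for the second-order method, then proposes the next sample by minimizing a linearization of the surrogate under a linear constraint. All function names, hyperparameters, and the example constraint are illustrative assumptions.

```python
import torch
import torch.nn as nn
from scipy.optimize import linprog

# Historical data: inputs X (n x d) and values y of the expensive function (toy example).
torch.manual_seed(0)
X = torch.rand(50, 2)
y = (X[:, 0] - 0.3) ** 2 + torch.sin(3 * X[:, 1])

# Surrogate: small MLP; Softplus acts as a smooth surrogate for ReLU so that
# second-order training is well behaved.
surrogate = nn.Sequential(
    nn.Linear(2, 32), nn.Softplus(beta=10),
    nn.Linear(32, 32), nn.Softplus(beta=10),
    nn.Linear(32, 1),
)

# Train with L-BFGS (a quasi-Newton stand-in for the second-order method).
opt = torch.optim.LBFGS(surrogate.parameters(), max_iter=200)
loss_fn = nn.MSELoss()

def closure():
    opt.zero_grad()
    loss = loss_fn(surrogate(X).squeeze(-1), y)
    loss.backward()
    return loss

opt.step(closure)

# Propose a new sample: linearize the surrogate at the current best point and
# minimize the linear model over [0, 1]^2 with an illustrative linear
# constraint x1 + x2 <= 1.5.
x_best = X[torch.argmin(y)].clone().requires_grad_(True)
surrogate(x_best).sum().backward()
grad = x_best.grad.detach().numpy()

res = linprog(c=grad, A_ub=[[1.0, 1.0]], b_ub=[1.5], bounds=[(0, 1), (0, 1)])
print("next candidate:", res.x)
```

In this sketch the linearized subproblem reduces to a linear program, which is why a generic LP solver suffices; handling the categorical inputs would require an additional mixed-integer formulation that is not shown here.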