K. Warren, R. Ambrosio, et al.
IBM J. Res. Dev
In this paper, we propose the StepDIRECT algorithm for derivative-free optimization (DFO) of black-box objective functions with a stepwise landscape. Our framework is based on the well-known DIRECT algorithm. By incorporating local variability to capture the flatness of the landscape, we provide a new criterion for selecting potentially optimal hyper-rectangles. In addition, we introduce a stochastic local search algorithm, performed on potentially optimal hyper-rectangles, to improve solution quality and convergence speed. Global convergence of the StepDIRECT algorithm is established. Numerical experiments on optimization over random forest models and on hyper-parameter tuning are presented to support the efficacy of our algorithm. The proposed StepDIRECT algorithm shows competitive performance compared with other state-of-the-art baseline DFO methods, including the original DIRECT algorithm.
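To make the setting concrete, the sketch below shows a toy piecewise-constant ("stepwise") objective, on which gradients are zero almost everywhere, together with a generic stochastic local search that perturbs the incumbent within a shrinking radius. This is only an illustration of the kind of local search one might run inside a selected hyper-rectangle; the objective, function names, and acceptance rule are hypothetical and not the StepDIRECT procedure from the paper.

```python
import random

def stepwise_objective(x):
    # Toy piecewise-constant objective: flat plateaus make
    # derivative-based methods useless. Hypothetical example,
    # not a benchmark from the paper.
    return sum(int(abs(xi) * 4) for xi in x)

def stochastic_local_search(f, x0, radius=0.5, iters=200, seed=0):
    # Generic stochastic local search: sample a random perturbation
    # of the incumbent, keep strict improvements, and slowly shrink
    # the sampling radius otherwise. Illustrative only.
    rng = random.Random(seed)
    best_x, best_f = list(x0), f(x0)
    for _ in range(iters):
        cand = [xi + rng.uniform(-radius, radius) for xi in best_x]
        fc = f(cand)
        if fc < best_f:      # accept only strict improvements
            best_x, best_f = cand, fc
        else:
            radius *= 0.99   # contract the search radius
    return best_x, best_f

x, fx = stochastic_local_search(stepwise_objective, [0.9, -0.7])
print(fx)
```

On a stepwise landscape a strict-improvement rule like this can only escape a plateau by sampling far enough to land on a lower step, which is why the global, space-partitioning component of a DIRECT-style method remains essential.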
Trang H. Tran, Katya Scheinberg, et al.
ICML 2022
Haoran Zhu, Pavankumar Murali, et al.
NeurIPS 2020
Marten van Dijk, Lam Nguyen, et al.
ICML 2019