Symmetry Teleportation for Accelerated Optimization
Bo Zhao, Nima Dehmamy, et al.
NeurIPS 2022
Partially monotone regression is a regression task in which the target values are monotonically increasing with respect to a subset of the input features. The TensorFlow Lattice library is one of the standard machine learning libraries for partially monotone regression. It consists of several neural network layers, and its core component is the lattice layer. One problem with the lattice layer is that training it requires a projected gradient descent algorithm with many constraints. Another is that it cannot handle high-dimensional input vectors because of its memory consumption. We propose a novel neural network layer, the hierarchical lattice layer (HLL), as an extension of the lattice layer: HLL can be trained with a standard stochastic gradient descent algorithm while satisfying the monotonicity constraints, and it can handle high-dimensional input vectors. Our experiments on real datasets demonstrate that HLL does not sacrifice prediction performance compared with the lattice layer.
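For reference, below is a minimal sketch of the lattice layer the abstract refers to, using the TensorFlow Lattice (tensorflow_lattice) Keras API; the lattice sizes, toy data, and training settings are illustrative assumptions, not taken from the paper.

    import numpy as np
    import tensorflow as tf
    import tensorflow_lattice as tfl

    # A lattice over three features; the first two are constrained to be
    # monotonically increasing in the output, the third is unconstrained.
    lattice = tfl.layers.Lattice(
        lattice_sizes=[2, 2, 2],   # parameter count = 2*2*2 lattice vertices
        monotonicities=['increasing', 'increasing', 'none'],
        output_min=0.0,
        output_max=1.0,
    )
    model = tf.keras.Sequential([lattice])
    model.compile(loss='mse', optimizer=tf.keras.optimizers.Adam(0.01))

    # Toy data (illustrative): inputs must lie in the lattice domain,
    # here [0, 1] per dimension since each lattice size is 2.
    x = np.random.uniform(0.0, 1.0, size=(256, 3)).astype('float32')
    y = (0.6 * x[:, 0] + 0.3 * x[:, 1] + 0.1 * np.sin(x[:, 2])).reshape(-1, 1).astype('float32')

    model.fit(x, y, batch_size=32, epochs=5, verbose=0)

During training the library keeps the monotonicity constraints satisfied by projecting the lattice parameters after gradient updates (the projected gradient descent the abstract mentions), and the number of lattice parameters grows as the product of the lattice sizes, which is why high-dimensional inputs become a memory problem; HLL is proposed to address both issues.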
Jihun Yun, Aurelie Lozano, et al.
NeurIPS 2021
Ben Huh, Avinash Baidya
NeurIPS 2022
Hongyu Tu, Shantam Shorewala, et al.
NeurIPS 2022