Publication
CDC 2018
Conference paper

Generalized Deterministic Perturbations for Stochastic Gradient Search


Abstract

The stochastic optimization (SO) problem consists of optimizing an objective function in the presence of noise. Most solution techniques in SO estimate gradients from noise-corrupted observations of the objective and adjust the parameters of the objective along the direction of the estimated gradients to obtain locally optimal solutions. Two prominent algorithms in SO, namely Random Direction Kiefer-Wolfowitz (RDKW) and Simultaneous Perturbation Stochastic Approximation (SPSA), obtain noisy gradient estimates by randomly perturbing all the parameters simultaneously. This forces the search direction in these algorithms to be random and introduces additional noise on top of the noise incurred from the samples of the objective. For better convergence properties, the idea of using deterministic perturbations instead of randomized perturbations for gradient estimation has also been studied. Two specific constructions of the deterministic perturbation sequence, based on lexicographical ordering and on Hadamard matrices, have been explored, and encouraging results have been reported previously in the literature. In this paper, we characterize the class of deterministic perturbation sequences that can be utilized in the RDKW algorithm. This class expands the set of deterministic perturbation sequences known in the literature. Using our characterization, we propose a construction of a deterministic perturbation sequence that has the least cycle length among all such sequences. Through simulations, we illustrate the performance gain of the proposed deterministic perturbation sequence in the RDKW algorithm over the Hadamard as well as the random perturbation counterparts. We also establish the convergence of the RDKW algorithm for this generalized class of deterministic perturbations.
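To make the perturbation-based gradient estimation concrete, the sketch below contrasts a random (Rademacher) perturbation with a deterministic sequence cycled from the rows of a Hadamard matrix. It is an illustrative approximation only: the toy objective `noisy_objective`, the step-size schedule, and the simplified Sylvester-based Hadamard construction are assumptions, and the paper's proposed minimal-cycle-length sequence is not reproduced here.

```python
import numpy as np

_rng = np.random.default_rng(0)


def hadamard(order):
    """Sylvester construction of a Hadamard matrix; order must be a power of 2."""
    H = np.array([[1.0]])
    while H.shape[0] < order:
        H = np.block([[H, H], [H, -H]])
    return H


def hadamard_perturbations(dim):
    """Cycle through rows of a Hadamard matrix (constant first column dropped)
    to obtain a deterministic +/-1 perturbation sequence (illustrative variant)."""
    order = 1
    while order < dim + 1:
        order *= 2
    rows = hadamard(order)[:, 1:dim + 1]
    k = 0
    while True:
        yield rows[k % order]
        k += 1


def perturbation_gradient(f, theta, delta, c=1e-2):
    """Two-sided simultaneous-perturbation estimate of the gradient at theta."""
    return (f(theta + c * delta) - f(theta - c * delta)) / (2.0 * c) * delta


def noisy_objective(theta):
    """Toy noisy quadratic: only noise-corrupted function values are observed."""
    return float(np.sum(theta ** 2) + 0.01 * _rng.normal())


if __name__ == "__main__":
    dim = 4
    theta = np.ones(dim)
    deterministic = hadamard_perturbations(dim)
    for n in range(1, 501):
        delta = next(deterministic)                    # deterministic perturbation
        # delta = _rng.choice([-1.0, 1.0], size=dim)   # random (Rademacher) alternative
        grad_hat = perturbation_gradient(noisy_objective, theta, delta)
        theta -= (0.5 / (n + 10)) * grad_hat           # gradient step; offset keeps early steps stable
    print("final iterate:", np.round(theta, 3))
```

Swapping the commented line in for `next(deterministic)` recovers the randomized-perturbation variant, so the two search-direction choices the abstract compares can be tried on the same toy problem.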

Date

18 Jan 2019

