Cyclic stochastic optimization with noisy function measurements

Jointly optimizing a function over multiple parameters can be very costly, particularly when the number of parameters is large. Cyclic optimization, in which one optimizes over a subset of the parameters while the remaining parameters are held fixed, is often significantly simpler; by combining algorithms for these conditional optimization subproblems, it aims to recover a solution to the joint optimization problem. In this paper we focus on cyclic stochastic optimization, where only noisy measurements of the loss function are available. We give a set of convergence conditions for cyclic versions of the stochastic gradient (SG) and simultaneous perturbation stochastic approximation (SPSA) algorithms. A numerical experiment compares the behavior of cyclic and non-cyclic stochastic optimization on the Rosenbrock function corrupted by Gaussian noise.
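
To make the setting concrete, the sketch below implements a cyclic variant of SPSA on the Rosenbrock function with additive Gaussian measurement noise. It is only an illustration of the general idea, not the paper's algorithm or experimental setup: the gain sequences, noise level, block schedule, starting point, and iteration count are all assumed, illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rosenbrock(theta):
    """Two-dimensional Rosenbrock function; minimized at (1, 1)."""
    x, y = theta
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

def noisy_loss(theta, sigma=0.1):
    """Loss measurement corrupted by additive Gaussian noise (noise level assumed)."""
    return rosenbrock(theta) + sigma * rng.standard_normal()

def spsa_step(theta, idx, k, a=0.01, A=50.0, c=0.1, alpha=0.602, gamma=0.101):
    """One SPSA update restricted to the coordinates listed in idx.

    Only the coordinates in idx are perturbed and updated; the rest are
    held fixed, which is the cyclic (block) structure described above.
    Gain-sequence constants are illustrative, not the paper's values.
    """
    ak = a / (k + 1.0 + A) ** alpha
    ck = c / (k + 1.0) ** gamma
    delta = rng.choice([-1.0, 1.0], size=len(idx))  # Bernoulli +/-1 perturbation
    theta_plus, theta_minus = theta.copy(), theta.copy()
    theta_plus[idx] += ck * delta
    theta_minus[idx] -= ck * delta
    # Simultaneous-perturbation estimate of the partial gradient on the block.
    ghat = (noisy_loss(theta_plus) - noisy_loss(theta_minus)) / (2.0 * ck * delta)
    theta_new = theta.copy()
    theta_new[idx] -= ak * ghat
    return theta_new

# Cyclic schedule: alternate between the two coordinates, one block per iteration.
theta = np.array([0.0, 0.0])
blocks = [np.array([0]), np.array([1])]
for k in range(5000):
    theta = spsa_step(theta, blocks[k % len(blocks)], k)

print(f"final iterate: {theta}, noise-free loss: {rosenbrock(theta):.4f}")
```

A non-cyclic comparison, as in the numerical experiment mentioned above, would simply pass the full index set `np.array([0, 1])` at every iteration so that all coordinates are perturbed and updated simultaneously.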