Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)
Moran Feldman, Amin Karbasi
In this paper, we propose the first continuous optimization algorithms that achieve a constant factor approximation guarantee for the problem of monotone continuous submodular maximization subject to a linear constraint. We first prove that a simple variant of the vanilla coordinate ascent, called Coordinate-Ascent+, achieves a $(\frac{e-1}{2e-1}-\epsilon)$-approximation guarantee while performing $O(n/\epsilon)$ iterations, where the computational complexity of each iteration is roughly $O(n/\sqrt{\epsilon}+n\log n)$ (here, $n$ denotes the dimension of the optimization problem). We then propose Coordinate-Ascent++, which achieves the tight $(1-1/e-\epsilon)$-approximation guarantee while performing the same number of iterations, but at a higher computational complexity of roughly $O(n^3/\epsilon^{2.5}+n^3\log n/\epsilon^2)$ per iteration. However, the computation of each round of Coordinate-Ascent++ can be easily parallelized so that the computational cost per machine scales as $O(n/\sqrt{\epsilon}+n\log n)$.
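To make the setting concrete, the following is a minimal, illustrative sketch of a generic greedy coordinate ascent for maximizing a monotone continuous submodular function $F$ over $[0,1]^n$ subject to a linear budget constraint $\langle c, x\rangle \le B$. It is not the paper's Coordinate-Ascent+ or Coordinate-Ascent++ procedure; the objective `F`, costs `c`, budget `B`, and step size are hypothetical placeholders chosen only to show the type of update the abstract refers to.

```python
# Illustrative sketch only: greedy coordinate ascent under a linear budget.
# F, c, B, and `step` below are hypothetical placeholders, not the paper's
# actual Coordinate-Ascent+ / Coordinate-Ascent++ algorithms or parameters.
import numpy as np

def coordinate_ascent(F, c, B, n, step=0.05):
    """Repeatedly increase one coordinate by `step`, always picking the
    coordinate with the largest marginal gain per unit of budget spent,
    until the budget B is exhausted or no feasible move improves F."""
    x = np.zeros(n)
    spent = 0.0
    while True:
        best_i, best_ratio = None, 0.0
        for i in range(n):
            delta = min(step, 1.0 - x[i])              # stay inside [0, 1]^n
            if delta <= 0 or spent + c[i] * delta > B:
                continue                               # move would be infeasible
            x_new = x.copy()
            x_new[i] += delta
            gain = F(x_new) - F(x)                     # marginal gain of the move
            ratio = gain / (c[i] * delta)              # gain per unit of budget
            if ratio > best_ratio:
                best_i, best_ratio = i, ratio
        if best_i is None:                             # no feasible improving move left
            return x
        delta = min(step, 1.0 - x[best_i])
        x[best_i] += delta
        spent += c[best_i] * delta

if __name__ == "__main__":
    # Toy monotone DR-submodular objective: a concave function of a linear map.
    rng = np.random.default_rng(0)
    n = 5
    w = rng.uniform(1.0, 2.0, size=n)
    F = lambda x: np.sqrt(w @ x)
    c = rng.uniform(0.5, 1.5, size=n)                  # linear costs
    print(coordinate_ascent(F, c, B=2.0, n=n))
```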