Efficient Convex Optimization Requires Superlinear Memory

03/29/2022
by   Annie Marsden, et al.

We show that any memory-constrained, first-order algorithm which minimizes d-dimensional, 1-Lipschitz convex functions over the unit ball to 1/poly(d) accuracy using at most d^(1.25 - δ) bits of memory must make at least Ω̃(d^(1 + (4/3)δ)) first-order queries (for any constant δ ∈ [0, 1/4]). Consequently, the performance of such memory-constrained algorithms is a polynomial factor worse than the optimal Õ(d) query bound for this problem obtained by cutting-plane methods, which use Õ(d^2) memory. This resolves a COLT 2019 open problem of Woodworth and Srebro.
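As a quick sanity check of the stated trade-off (illustrative only; the helper function below is ours, not from the paper), the two endpoints of the range δ ∈ [0, 1/4] correspond to roughly d^1.25 bits of memory with Ω̃(d) queries, and linear (d^1) memory with Ω̃(d^(4/3)) queries:

```python
def tradeoff_exponents(delta):
    """Return (memory exponent, query exponent) from the abstract's bound:
    memory at most d^(1.25 - delta) bits forces Omega~(d^(1 + (4/3)*delta)) queries.
    """
    assert 0 <= delta <= 0.25, "the bound is stated for delta in [0, 1/4]"
    return 1.25 - delta, 1 + (4.0 / 3.0) * delta

# Endpoint delta = 0: memory exponent 1.25, query exponent 1 (near-optimal queries)
print(tradeoff_exponents(0.0))
# Endpoint delta = 1/4: linear memory, query exponent 4/3
print(tradeoff_exponents(0.25))
```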
