# No Quantum Speedup over Gradient Descent for Non-Smooth Convex Optimization

Ankit Garg,
Robin Kothari,
Praneeth Netrapalli,
Suhail Sherif

January 2021

### Abstract

We study the first-order convex optimization problem, where we have black-box access to a (not necessarily smooth) function $f:\mathbb{R}^n \to \mathbb{R}$ and its (sub)gradient. Our goal is to find an $\epsilon$-approximate minimum of $f$ starting from a point that is distance at most $R$ from the true minimum. If $f$ is $G$-Lipschitz, then the classic gradient descent algorithm solves this problem with $O((GR/\epsilon)^2)$ queries. Importantly, the number of queries is independent of the dimension $n$, and gradient descent is optimal in this regard: no deterministic or randomized algorithm can achieve better complexity that is still independent of the dimension $n$.
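The $O((GR/\epsilon)^2)$ upper bound quoted above is achieved by the projected subgradient method with a fixed step size $\eta = R/(G\sqrt{T})$, for which the averaged iterate satisfies $f(\bar{x}) - f(x^*) \le GR/\sqrt{T}$. A minimal sketch (the function names and the $1$-D test function are illustrative, not from the paper):

```python
import numpy as np

def subgradient_method(subgrad, x0, G, R, T):
    """Fixed-step subgradient method for a G-Lipschitz convex f whose
    minimizer is within distance R of x0. Returns the averaged iterate,
    whose suboptimality is at most G*R/sqrt(T) in theory."""
    eta = R / (G * np.sqrt(T))      # standard fixed step size
    x = np.array(x0, dtype=float)
    avg = np.zeros_like(x)
    for _ in range(T):
        avg += x / T                # running average of the iterates
        x = x - eta * subgrad(x)    # one subgradient step
    return avg

# Illustrative example: f(x) = |x| is 1-Lipschitz, minimized at 0.
# With G = R = 1 and eps = 0.1, T = (G*R/eps)^2 = 100 queries suffice.
x_bar = subgradient_method(np.sign, [1.0], G=1.0, R=1.0, T=100)
```

Here $T = (GR/\epsilon)^2$ iterations (one subgradient query each) guarantee $f(\bar{x}) - f(x^*) \le \epsilon$, matching the query bound stated above.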

In this paper we reprove the randomized lower bound of $\Omega((GR/\epsilon)^2)$ using a simpler argument than previous lower bounds. We then show that although the function family used in the lower bound is hard for randomized algorithms, it can be solved using $O(GR/\epsilon)$ quantum queries. We then show an improved lower bound against quantum algorithms using a different set of instances and establish our main result that in general even quantum algorithms need $\Omega((GR/\epsilon)^2)$ queries to solve the problem. Hence there is no quantum speedup over gradient descent for black-box first-order convex optimization without further assumptions on the function family.
