Summarizing https://arxiv.org/pdf/2011.01929.pdf

Here's my try:

The paper presents a theoretical analysis of the complexity of gradient descent for non-convex optimization, showing that computing a Karush-Kuhn-Tucker (KKT) point of a continuously differentiable function over the domain [0, 1]² is PPAD ∩ PLS-complete. This result implies that the class CLS (Continuous Local Search) - which Daskalakis and Papadimitriou defined as a more "natural" counterpart to PPAD ∩ PLS and which contains many interesting problems - is itself equal to PPAD ∩ PLS. The paper also underscores the value of complexity-theoretic analysis for understanding how gradient descent behaves in non-convex optimization.
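To make the object of study concrete, here is a minimal sketch (my own illustration, not code or definitions from the paper) of the kind of search problem the result is about: projected gradient descent on the unit square, stopped once an approximate KKT condition holds. The toy objective, the step size, and the projected-gradient residual used as the stopping rule are all assumptions for this example; the paper's formal ε-KKT definition is stated directly in terms of the gradient and the box constraints.

```python
import numpy as np

def project_box(x):
    """Project a point onto the unit square [0, 1]^2."""
    return np.clip(x, 0.0, 1.0)

def kkt_residual(x, grad):
    """Projected-gradient residual; it is zero exactly at a KKT point of the box."""
    return np.linalg.norm(x - project_box(x - grad(x)))

def projected_gradient_descent(grad, x0, eps=1e-6, step=0.05, max_iter=200_000):
    """Iterate x <- Proj(x - step * grad(x)) until the KKT residual drops below eps."""
    x = project_box(np.asarray(x0, dtype=float))
    for _ in range(max_iter):
        if kkt_residual(x, grad) <= eps:
            break  # approximate KKT point found
        x = project_box(x - step * grad(x))
    return x

# A toy smooth non-convex objective on [0, 1]^2 (illustrative, not from the paper):
# f(x) = sin(3*x0) * cos(3*x1) + 0.5 * (x0 - 0.3)^2
def grad_f(x):
    return np.array([
        3.0 * np.cos(3.0 * x[0]) * np.cos(3.0 * x[1]) + (x[0] - 0.3),
        -3.0 * np.sin(3.0 * x[0]) * np.sin(3.0 * x[1]),
    ])

x_star = projected_gradient_descent(grad_f, x0=[0.5, 0.5])
print("candidate KKT point:", x_star, "residual:", kkt_residual(x_star, grad_f))
```

The hardness result concerns finding any such point to within the required accuracy, not the per-iteration cost of running this loop.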

The paper also presents a complexity result for the General-Brouwer problem, showing it is PPAD-complete.
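The note does not spell out what "General-Brouwer" denotes, so as context here is the standard approximate Brouwer fixed-point search problem, which is PPAD-complete under the usual formulations in this literature (this is my reading, not necessarily the paper's exact definition):

```latex
\[
\textsc{Brouwer}:\quad \text{given } \varepsilon > 0 \text{ and a continuous } f\colon [0,1]^n \to [0,1]^n,
\ \text{find } x \in [0,1]^n \text{ with } \lVert f(x) - x \rVert_\infty \le \varepsilon.
\]
```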
