Affiliation:
1. Laboratoire d’Analyse et d’Architecture des Systèmes du CNRS, 31031 Toulouse, France
Abstract
We show that the vanishing step size subgradient method—widely adopted for machine learning applications—can display rather messy behavior even under favorable assumptions. We establish that convergence of bounded subgradient sequences may fail even with a Whitney stratifiable objective function satisfying the Kurdyka-Łojasiewicz inequality. Moreover, when the objective function is path-differentiable, we show that each of the following properties may fail: criticality of the limit points, convergence of the sequence, convergence in values, codimension one of the accumulation set, equality of the accumulation and essential accumulation sets, connectedness of the essential accumulation set, spontaneous slowdown, oscillation compensation, and oscillation perpendicularity to the accumulation set.
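For context, the method under study iterates x_{k+1} = x_k − α_k g_k, where g_k is a subgradient of the objective at x_k and the step sizes α_k vanish while remaining non-summable. A minimal illustrative sketch follows; the objective f(x) = |x| and the step schedule α_k = 1/(k+1) are hypothetical choices for demonstration, not examples from the paper:

```python
# Vanishing-step-size subgradient method: x_{k+1} = x_k - alpha_k * g_k,
# with g_k in the subdifferential of f at x_k, alpha_k -> 0, sum alpha_k = inf.

def subgradient_method(subgrad, x0, step, n_iter):
    """Run the subgradient method and return the sequence of iterates."""
    x = x0
    trajectory = [x]
    for k in range(n_iter):
        g = subgrad(x)          # any element of the subdifferential at x
        x = x - step(k) * g     # vanishing step size
        trajectory.append(x)
    return trajectory

def abs_subgrad(x):
    """A subgradient of f(x) = |x|: sign(x), with the choice 0 at x = 0."""
    return float(x > 0) - float(x < 0)

# Diminishing, non-summable steps alpha_k = 1/(k+1) (harmonic schedule).
traj = subgradient_method(abs_subgrad, x0=5.0,
                          step=lambda k: 1.0 / (k + 1), n_iter=2000)
```

On this well-behaved convex example the iterates approach the minimizer 0, oscillating around it with amplitude bounded by the current step size; the paper's point is that for merely path-differentiable objectives such favorable behavior can break down in many distinct ways.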
Publisher
Institute for Operations Research and the Management Sciences (INFORMS)
Subject
Management Science and Operations Research, Computer Science Applications, General Mathematics
Cited by
2 articles.