Abstract
In recent years, algorithms for convex optimization have revolutionized algorithm design, for both discrete and continuous optimization problems. For problems such as maximum flow, maximum matching, and submodular function minimization, the fastest known algorithms rely on core methods from convex optimization: gradient descent, mirror descent, interior point methods, and ellipsoid methods. The goal of this self-contained book is to enable researchers and professionals in computer science, data science, and machine learning to gain an in-depth understanding of these algorithms. The text emphasizes how to derive key algorithms for convex optimization from first principles and how to establish precise running-time bounds. It explains both the success of these methods on problems in discrete optimization and how they have significantly advanced the state of the art of convex optimization itself.
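As a first taste of the methods the abstract mentions, plain gradient descent on a smooth convex function can be sketched in a few lines. The quadratic objective, step size, and iteration count below are illustrative choices of ours, not taken from the book:

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Minimal gradient descent: x_{t+1} = x_t - step * grad(x_t)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Minimize the convex quadratic f(x) = ||x - b||^2, whose gradient
# is 2(x - b); the unique minimizer is x = b.
b = np.array([1.0, -2.0, 3.0])
x_star = gradient_descent(lambda x: 2 * (x - b), x0=np.zeros(3))
```

For this objective each iteration contracts the error by a constant factor, so the iterate converges rapidly to the minimizer `b`; the book develops the general theory behind such convergence guarantees.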
Publisher
Cambridge University Press
Cited by 11 articles.