First-Order Methods in Optimization


Amir Beck, "First-Order Methods in Optimization"
English | ISBN: 1611974984 | 2017 | 494 pages | PDF | 9 MB

The primary goal of this book is to provide a self-contained, comprehensive study of the main first-order methods that are frequently used in solving large-scale problems. First-order methods exploit information on the values and gradients/subgradients (but not Hessians) of the functions composing the model under consideration. With the increase in the number of applications that can be modeled as large- or even huge-scale optimization problems, there has been a revived interest in simple methods that have low per-iteration cost and low memory requirements.
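As a concrete illustration of this method class (a minimal sketch, not taken from the book), the Python snippet below runs plain subgradient descent on a nonsmooth least-absolute-deviations problem. Each iteration uses only a function value and a subgradient, so the per-iteration cost is a couple of matrix-vector products; the problem data and the diminishing step-size rule are illustrative assumptions.

```python
# Sketch: subgradient descent for min_x ||Ax - b||_1, a nonsmooth problem.
# Uses only function values and subgradients -- no Hessians.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
b = A @ rng.standard_normal(50) + 0.1 * rng.standard_normal(200)

x = np.zeros(50)
best_val = np.inf
for k in range(1, 2001):
    r = A @ x - b
    g = A.T @ np.sign(r)                    # a subgradient of ||Ax - b||_1 at x
    x -= (1.0 / np.sqrt(k)) * g / np.linalg.norm(g)   # diminishing step size
    best_val = min(best_val, np.linalg.norm(A @ x - b, 1))

print(f"best objective value found: {best_val:.4f}")
```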
The author has gathered, reorganized, and synthesized (in a unified manner) many results that are currently scattered throughout the literature, many of which typically cannot be found in optimization books.
First-Order Methods in Optimization offers a comprehensive study of first-order methods together with their theoretical foundations; provides plentiful examples and illustrations; emphasizes rates of convergence and complexity analysis of the main first-order methods used to solve large-scale problems; and covers both variable decomposition and functional decomposition methods.
Audience: This book is intended primarily for researchers and graduate students in mathematics, computer science, and electrical and other engineering departments. Readers with a background in advanced calculus and linear algebra, as well as prior knowledge of the fundamentals of optimization (some convex analysis, optimality conditions, and duality), will be best prepared for the material.
Contents: Chapter 1: Vector Spaces; Chapter 2: Extended Real-Valued Functions; Chapter 3: Subgradients; Chapter 4: Conjugate Functions; Chapter 5: Smoothness and Strong Convexity; Chapter 6: The Proximal Operator; Chapter 7: Spectral Functions; Chapter 8: Primal and Dual Projected Subgradient Methods; Chapter 9: Mirror Descent; Chapter 10: The Proximal Gradient Method; Chapter 11: The Block Proximal Gradient Method; Chapter 12: Dual-Based Proximal Gradient Methods; Chapter 13: The Generalized Conditional Gradient Method; Chapter 14: Alternating Minimization; Chapter 15: ADMM; Appendix A: Strong Duality and Optimality Conditions; Appendix B: Tables; Appendix C: Symbols and Notation; Appendix D: Bibliographic Notes.