Full Description
Conditional Gradient Methods: From Core Principles to AI Applications offers a definitive and modern treatment of one of the most elegant and versatile algorithmic families in optimization: the Frank-Wolfe method and its many variants. Originally proposed in the 1950s, these projection-free techniques have seen a powerful resurgence, now playing a central role in machine learning, signal processing, and large-scale data science.
This comprehensive monograph unites deep theoretical insights with practical considerations, guiding readers through the foundations of constrained optimization and into cutting-edge territory, including stochastic, online, and distributed settings. With a clear narrative, rigorous proofs, and illuminating illustrations, the book demystifies adaptive variants, away steps, and the nuances of working with structured convex sets. The FrankWolfe.jl Julia package, which implements most of the algorithms in the book, is available on a supplementary website.
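To give a flavor of what "projection-free" means in practice, here is a minimal, self-contained Julia sketch of the classic Frank-Wolfe iteration on the probability simplex. The least-squares objective, the 2/(t+2) step-size rule, and the function name are illustrative assumptions for this sketch; it does not reproduce the book's material or the FrankWolfe.jl package's API.

```julia
# Sketch of the classic Frank-Wolfe (conditional gradient) iteration on the
# probability simplex. Instead of projecting onto the feasible set, each step
# calls a linear minimization oracle, which for the simplex reduces to picking
# the vertex (standard basis vector) at the smallest gradient coordinate.
# Objective (an illustrative assumption): f(x) = 1/2 * ||A*x - b||^2.
function frank_wolfe_simplex(A, b; iterations = 100)
    n = size(A, 2)
    x = fill(1.0 / n, n)              # feasible starting point (uniform weights)
    for t in 0:iterations-1
        grad = A' * (A * x - b)       # gradient of the least-squares objective
        v = zeros(n)
        v[argmin(grad)] = 1.0         # LMO: best vertex of the simplex
        gamma = 2.0 / (t + 2)         # classic open-loop step-size rule
        x .+= gamma .* (v .- x)       # convex combination keeps x feasible
    end
    return x
end

# Small usage example with made-up data.
A = [1.0 2.0; 3.0 4.0; 5.0 6.0]
b = [1.0, 2.0, 3.0]
println(frank_wolfe_simplex(A, b))
```

Because every iterate is a convex combination of simplex vertices, the method never needs a projection step, which is the property that makes these algorithms attractive for the structured constraint sets discussed in the book.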