by Dimitri P. Bertsekas
Publication: 2016, 880 pages, hardcover
This is a thoroughly rewritten version of the 1999 2nd edition of our best-selling nonlinear programming book. New material was included, some of the old material was discarded, and a large portion of the remainder was reorganized or revised. The number of pages has increased by about 100.
The book provides a comprehensive and accessible presentation of algorithms for solving continuous optimization problems. It relies on rigorous mathematical analysis, but also aims at an intuitive exposition that makes use of visualization where possible. It places particular emphasis on modern developments, and their widespread applications in fields such as large-scale resource allocation problems, signal processing, and machine learning.
The 3rd edition brings the book in closer harmony with the companion works Convex Optimization Theory (Athena Scientific, 2009), Convex Optimization Algorithms (Athena Scientific, 2015), Convex Analysis and Optimization (Athena Scientific, 2003), and Network Optimization (Athena Scientific, 1998).
These works are complementary in that they deal primarily with convex, possibly nondifferentiable, optimization problems and rely on convex analysis. By contrast, the nonlinear programming book focuses primarily on analytical and computational methods for possibly nonconvex differentiable problems. It relies primarily on calculus and variational analysis, yet it still contains a detailed presentation of duality theory and its uses for both convex and nonconvex problems.
Among its special features, the book:
Provides extensive coverage of iterative optimization methods within a unifying framework
Covers in depth duality theory from both a variational and a geometric point of view
Provides a detailed treatment of interior point methods for linear programming
Includes much new material on a number of topics, such as proximal algorithms, alternating direction methods of multipliers, and conic programming
Focuses on large-scale optimization topics of much current interest, such as first order methods, incremental methods, and distributed asynchronous computation, and their applications in machine learning, signal processing, neural network training, and big data analysis
Includes a large number of examples and exercises
Was developed through extensive classroom use in first-year graduate courses
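As a small illustration of the kind of iterative optimization method the book covers within its unifying framework, the following sketch implements the basic gradient method with a constant stepsize. It is illustrative only, not taken from the book; the function, stepsize, and tolerance are chosen for the example.

```python
# Illustrative sketch (not from the book): the gradient method
# x_{k+1} = x_k - alpha * grad f(x_k), the prototypical iterative
# scheme for minimizing a differentiable function.

def gradient_method(grad, x0, stepsize=0.1, tol=1e-8, max_iters=10_000):
    """Iterate until the gradient magnitude falls below tol."""
    x = x0
    for _ in range(max_iters):
        g = grad(x)
        if abs(g) < tol:          # near-stationary point: stop
            break
        x = x - stepsize * g      # step in the negative gradient direction
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3).
x_star = gradient_method(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

With a sufficiently small constant stepsize the iterates contract toward the minimizer; for this quadratic example they converge to x = 3. Analysis of such stepsize conditions and convergence rates is among the topics the book treats in depth.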
From the review by Olvi Mangasarian (Optima, March 1997):
"This is a beautifully written book by a prolific author ... who has taken
painstaking care in making the presentation extremely lucid ... The style is
unhurried and intuitive yet mathematically rigorous."
"The numerous figures in the book are extremely well thought out and are
used in a very effective way to elucidate the text. The detailed and
self-explanatory long captions accompanying each figure are extremely
"The 80 pages constituting the four appendixes serve as a masterfully
written introduction to the field of nonlinear programming that can be used
as a self-contained monograph. Teachers using this book could easily assign
these appendixes as introductory or remedial material."
"This book contains a wealth of material... Throughout this book, well-prepared graphics illustrate ideas and results.
The text contains many examples and each section is followed by a set of nice exercises."
The author is McAfee Professor of Engineering at the Massachusetts Institute of Technology and a member of the prestigious US National Academy of Engineering. He is the recipient of the 2001 John R. Ragazzini ACC Education Award, the 2009 INFORMS Expository Writing Award, the 2014 Khachiyan Prize, the 2014 AACC Bellman Heritage Award, and the 2015 SIAM/MOS George B. Dantzig Prize. He has been teaching the material included in this book in introductory graduate courses for more than forty years.
The material listed below can be freely downloaded, reproduced, and distributed.