Nonsmooth Optimization: Analysis And Algorithms With Applications To Optimal Control


Book Description

This book is a self-contained, elementary study of nonsmooth analysis and optimization and of their use in the solution of nonsmooth optimal control problems. The first part of the book is concerned with nonsmooth differential calculus and contains the tools necessary for nonsmooth optimization. The second part is devoted to methods of nonsmooth optimization and their development; a proximal bundle method for nonsmooth, nonconvex optimization subject to nonsmooth constraints is constructed. In the last part, nonsmooth optimization is applied to problems arising from the optimal control of systems governed by partial differential equations. Several practical problems, such as process control and optimal shape design, are considered.
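
To make the nonsmooth ingredients concrete, here is a minimal, hedged sketch of a proximal-gradient iteration on a one-dimensional nonsmooth problem. It is my own toy example, not the book's proximal bundle algorithm; the function, step size, and starting point are illustrative assumptions.

# Minimal proximal-gradient sketch (illustrative only, not the book's
# proximal bundle method): minimize f(x) = 0.5*(x - 3)**2 + |x|.
# The proximal step for the nonsmooth term |x| is soft-thresholding.

def soft_threshold(v, tau):
    """Proximal operator of tau*|.| evaluated at v."""
    if v > tau:
        return v - tau
    if v < -tau:
        return v + tau
    return 0.0

def proximal_gradient(x0, step=0.5, iters=50):
    x = x0
    for _ in range(iters):
        grad_smooth = x - 3.0                             # gradient of 0.5*(x - 3)**2
        x = soft_threshold(x - step * grad_smooth, step)  # prox of step*|.|
    return x

print(proximal_gradient(x0=-5.0))  # approaches the minimizer x* = 2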




Evaluation Complexity of Algorithms for Nonconvex Optimization


Book Description

A popular way to assess the “effort” needed to solve a problem is to count how many evaluations of the problem functions (and their derivatives) are required; in many cases this is the dominant computational cost. Given an optimization problem satisfying reasonable assumptions, and given access to problem-function values and derivatives of various degrees, how many evaluations might be required to approximately solve the problem? Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation, and Perspectives addresses this question for nonconvex optimization problems, those that may have local minimizers and appear most often in practice. This is the first book on complexity to cover topics such as composite and constrained optimization, derivative-free optimization, subproblem solution, and optimal (lower and sharpness) bounds for nonconvex problems. It is also the first to address the disadvantages of traditional optimality measures, to propose useful surrogates leading to algorithms that compute approximate high-order critical points, and to compare traditional and new methods, highlighting the advantages of the latter from a complexity point of view. This is the go-to book for those interested in solving nonconvex optimization problems. It is suitable for advanced undergraduate and graduate students in courses on advanced numerical analysis, data science, numerical optimization, and approximation theory.
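
To make the notion of evaluation complexity concrete, the hedged sketch below (my own illustration, not an example from the book) counts how many gradient evaluations plain steepest descent with a fixed step needs to reach an epsilon-approximate first-order point of a simple smooth nonconvex function; the test function, step size, and tolerance are assumptions.

# Count derivative evaluations until |f'(x)| <= eps for the smooth
# nonconvex function f(x) = x**4/4 - x**2/2 (a double well).

def grad(x):
    return x**3 - x        # derivative of x**4/4 - x**2/2

def evaluations_to_stationarity(x0, step=0.1, eps=1e-6, max_evals=100_000):
    x, evals = x0, 0
    g = grad(x); evals += 1
    while abs(g) > eps and evals < max_evals:
        x = x - step * g
        g = grad(x); evals += 1
    return x, evals

x_star, n_evals = evaluations_to_stationarity(x0=2.5)
print(f"reached |f'(x)| <= 1e-6 at x = {x_star:.6f} after {n_evals} gradient evaluations")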




Convex Optimization


Book Description

This monograph presents the main complexity theorems in convex optimization and their corresponding algorithms. It begins with the fundamental theory of black-box optimization and proceeds to guide the reader through recent advances in structural optimization and stochastic optimization. The presentation of black-box optimization, strongly influenced by Nesterov's seminal book, includes the analysis of cutting-plane methods as well as (accelerated) gradient descent schemes. Special attention is given to non-Euclidean settings (relevant algorithms include Frank-Wolfe, mirror descent, and dual averaging) and to their relevance in machine learning. The text provides a gentle introduction to structural optimization with FISTA (to optimize a sum of a smooth and a simple nonsmooth term), saddle-point mirror prox (Nemirovski's alternative to Nesterov's smoothing), and a concise description of interior-point methods. In stochastic optimization it discusses stochastic gradient descent, mini-batches, random coordinate descent, and sublinear algorithms. It also briefly touches upon convex relaxations of combinatorial problems and the use of randomness to round solutions, as well as methods based on random walks.
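
As a rough illustration of the structural-optimization viewpoint mentioned above, the following hedged sketch applies FISTA-style accelerated proximal gradient steps to a tiny lasso-type problem, 0.5*||Ax - b||^2 + lam*||x||_1. The data, step size, and iteration count are my own assumptions, not taken from the monograph.

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2         # 1/L with L = ||A||_2^2

def soft(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

x = y = np.zeros(5)
t = 1.0
for _ in range(200):
    grad = A.T @ (A @ y - b)                   # gradient of the smooth term at y
    x_new = soft(y - step * grad, step * lam)  # proximal (shrinkage) step
    t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
    y = x_new + (t - 1) / t_new * (x_new - x)  # Nesterov momentum extrapolation
    x, t = x_new, t_new

print(x)   # approximate solution of the lasso-type problem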




Introduction to Nonsmooth Optimization


Book Description

This book is the first easy-to-read text on nonsmooth optimization (NSO, i.e., not necessarily differentiable optimization). Solving these kinds of problems plays a critical role in many industrial applications and real-world modeling systems, for example in the context of image denoising, optimal control, neural network training, data mining, economics, and computational chemistry and physics. The book covers both the theory and the numerical methods used in NSO and provides an overview of the different problems arising in the field. It is organized into three parts: 1. convex and nonconvex analysis and the theory of NSO; 2. test problems and practical applications; 3. a guide to NSO software. The book is ideal for anyone teaching or attending NSO courses. As an accessible introduction to the field, it is also well suited as an independent learning guide for practitioners already familiar with the basics of optimization.




Fixed-Point Algorithms for Inverse Problems in Science and Engineering


Book Description

"Fixed-Point Algorithms for Inverse Problems in Science and Engineering" presents some of the most recent work from top-notch researchers studying projection and other first-order fixed-point algorithms in several areas of mathematics and the applied sciences. The material presented provides a survey of the state-of-the-art theory and practice in fixed-point algorithms, identifying emerging problems driven by applications, and discussing new approaches for solving these problems. This book incorporates diverse perspectives from broad-ranging areas of research including, variational analysis, numerical linear algebra, biotechnology, materials science, computational solid-state physics, and chemistry. Topics presented include: Theory of Fixed-point algorithms: convex analysis, convex optimization, subdifferential calculus, nonsmooth analysis, proximal point methods, projection methods, resolvent and related fixed-point theoretic methods, and monotone operator theory. Numerical analysis of fixed-point algorithms: choice of step lengths, of weights, of blocks for block-iterative and parallel methods, and of relaxation parameters; regularization of ill-posed problems; numerical comparison of various methods. Areas of Applications: engineering (image and signal reconstruction and decompression problems), computer tomography and radiation treatment planning (convex feasibility problems), astronomy (adaptive optics), crystallography (molecular structure reconstruction), computational chemistry (molecular structure simulation) and other areas. Because of the variety of applications presented, this book can easily serve as a basis for new and innovated research and collaboration.




Convex Optimization


Book Description

Convex optimization problems arise frequently in many different fields. This book provides a comprehensive introduction to the subject, and shows in detail how such problems can be solved numerically with great efficiency. The book begins with the basic elements of convex sets and functions, and then describes various classes of convex optimization problems. Duality and approximation techniques are then covered, as are statistical estimation techniques. Various geometrical problems are then presented, and there is detailed discussion of unconstrained and constrained minimization problems, and interior-point methods. The focus of the book is on recognizing convex optimization problems and then finding the most appropriate technique for solving them. It contains many worked examples and homework exercises and will appeal to students, researchers and practitioners in fields such as engineering, computer science, mathematics, statistics, finance and economics.
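
To illustrate the "recognize the problem, then hand it to a suitable technique" workflow described above, here is a small hedged sketch using the cvxpy modeling package (my own choice of tool, not part of the book) to pose and solve a nonnegative least-squares problem with random placeholder data.

import numpy as np
import cvxpy as cp

# Once a problem is recognized as convex, it can be stated almost literally:
#     minimize ||A x - b||_2^2   subject to   x >= 0.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)

x = cp.Variable(10)
problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)), [x >= 0])
problem.solve()

print(problem.status, problem.value)
print(x.value)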




Numerical Nonsmooth Optimization


Book Description

Solving nonsmooth optimization (NSO) problems is critical in many practical applications and real-world modeling systems. The aim of this book is to survey various numerical methods for solving NSO problems and to provide an overview of the latest developments in the field. Experts from around the world share their perspectives on specific aspects of numerical NSO. The book is divided into four parts, the first of which considers general methods including subgradient, bundle and gradient sampling methods. In turn, the second focuses on methods that exploit the problem’s special structure, e.g. algorithms for nonsmooth DC programming, VU decomposition techniques, and algorithms for minimax and piecewise differentiable problems. The third part considers methods for special problems like multiobjective and mixed integer NSO, and problems involving inexact data, while the last part highlights the latest advancements in derivative-free NSO. Given its scope, the book is ideal for students attending courses on numerical nonsmooth optimization, for lecturers who teach optimization courses, and for practitioners who apply nonsmooth optimization methods in engineering, artificial intelligence, machine learning, and business. Furthermore, it can serve as a reference text for experts dealing with nonsmooth optimization.
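
As a hedged illustration of the simplest method family mentioned above, the sketch below applies the basic subgradient method, with Polyak's step length, to the nonsmooth convex problem min_x ||Ax - b||_1. The data and the use of Polyak's rule (which requires the optimal value, here 0 because the system is consistent) are my own illustrative choices, not an example from the book.

import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((15, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true                       # consistent system, so the optimal value is 0

x = np.zeros(3)
for _ in range(500):
    r = A @ x - b
    val = np.sum(np.abs(r))          # f(x) = ||A x - b||_1
    if val < 1e-10:
        break
    g = A.T @ np.sign(r)             # a subgradient of f at x
    x = x - (val / (g @ g)) * g      # Polyak step: (f(x) - f*) / ||g||^2

print(x)                             # approaches x_true = [1, -2, 0.5]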




Non-convex Optimization for Machine Learning


Book Description

Non-convex Optimization for Machine Learning takes an in-depth look at the basics of non-convex optimization with applications to machine learning. It introduces the rich literature in this area and equips the reader with the tools and techniques needed to apply and analyze simple but powerful procedures for non-convex problems. The monograph is as self-contained as possible without losing focus on its main topic, non-convex optimization techniques. It opens with entire chapters devoted to a tutorial-like treatment of basic concepts in convex analysis and optimization, as well as their non-convex counterparts. It concludes with a look at four interesting applications in machine learning and signal processing and explores how the non-convex optimization techniques introduced earlier can be used to solve these problems. For each of the topics discussed, the monograph also contains exercises and figures designed to engage the reader, as well as extensive bibliographic notes pointing toward classical works and recent advances. Non-convex Optimization for Machine Learning can be used for a semester-length course on the basics of non-convex optimization with applications to machine learning. It is also possible to cherry-pick individual portions, such as the chapter on sparse recovery or the EM algorithm, for inclusion in a broader course; courses in machine learning, optimization, and signal processing may benefit from the inclusion of such topics.
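
As a hedged, self-contained taste of one technique the monograph treats (the EM algorithm), the sketch below fits a two-component, one-dimensional Gaussian mixture by alternating expectation and maximization steps. The data, initialization, and simplifications (known unit variances) are my own illustrative choices, not an example taken from the monograph.

import numpy as np

rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 200)])

def gauss(x, mu):
    # Unit-variance Gaussian density
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)

mu1, mu2, pi1 = -1.0, 1.0, 0.5       # crude initialization
for _ in range(100):
    # E-step: posterior responsibility of component 1 for each point
    p1 = pi1 * gauss(data, mu1)
    p2 = (1 - pi1) * gauss(data, mu2)
    r1 = p1 / (p1 + p2)
    # M-step: re-estimate the means and the mixing weight
    mu1 = np.sum(r1 * data) / np.sum(r1)
    mu2 = np.sum((1 - r1) * data) / np.sum(1 - r1)
    pi1 = np.mean(r1)

print(mu1, mu2, pi1)   # approximately -2, 3, and 0.6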




Introductory Lectures on Convex Optimization


Book Description

It was in the middle of the 1980s when the seminal paper by Karmarkar opened a new epoch in nonlinear optimization. The importance of this paper, containing a new polynomial-time algorithm for linear optimization problems, was not only in its complexity bound. At that time, the most surprising feature of this algorithm was that the theoretical prediction of its high efficiency was supported by excellent computational results. This unusual fact dramatically changed the style and directions of the research in nonlinear optimization. Thereafter it became more and more common that the new methods were provided with a complexity analysis, which was considered a better justification of their efficiency than computational experiments. In a new, rapidly developing field, which got the name "polynomial-time interior-point methods", such a justification was obligatory. After almost fifteen years of intensive research, the main results of this development started to appear in monographs [12, 14, 16, 17, 18, 19]. Approximately at that time the author was asked to prepare a new course on nonlinear optimization for graduate students. The idea was to create a course which would reflect the new developments in the field. Actually, this was a major challenge. At the time only the theory of interior-point methods for linear optimization was polished enough to be explained to students. The general theory of self-concordant functions had appeared in print only once, in the form of the research monograph [12].