Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation, and Perspectives


Book Description

A popular way to assess the "effort" needed to solve a problem is to count how many evaluations of the problem functions (and their derivatives) are required; in many cases this is the dominating computational cost. Given an optimization problem satisfying reasonable assumptions, and given access to problem-function values and derivatives of various degrees, how many evaluations might be required to approximately solve the problem? Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation, and Perspectives addresses this question for nonconvex optimization problems, those that may have local minimizers and appear most often in practice. This is the first book on complexity to cover topics such as composite and constrained optimization, derivative-free optimization, subproblem solution, and optimal (lower and sharpness) bounds for nonconvex problems. It is also the first to address the disadvantages of traditional optimality measures, to propose useful surrogates leading to algorithms that compute approximate high-order critical points, and to compare traditional and new methods, highlighting the advantages of the latter from a complexity point of view. This is the go-to book for those interested in solving nonconvex optimization problems. It is suitable for advanced undergraduate and graduate students in courses on advanced numerical analysis, special topics in numerical analysis, data science, numerical optimization, and approximation theory.
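
The evaluation-counting viewpoint can be made concrete with a small sketch: wrap the objective in a counter so every call is tallied, then run a simple method until an approximate first-order point is reached. The toy nonconvex function, step size, and tolerance below are illustrative assumptions, not material from the book.

class CountingOracle:
    """Wraps an objective and its derivative, tallying every evaluation."""
    def __init__(self, f, df):
        self.f, self.df = f, df
        self.f_evals = 0
        self.g_evals = 0

    def value(self, x):
        self.f_evals += 1
        return self.f(x)

    def slope(self, x):
        self.g_evals += 1
        return self.df(x)

# Toy nonconvex objective with two local minimizers: f(x) = x^4 - 3x^2 + x.
oracle = CountingOracle(lambda x: x**4 - 3*x**2 + x,
                        lambda x: 4*x**3 - 6*x + 1)

x, step, eps = 2.0, 0.01, 1e-6
while abs(g := oracle.slope(x)) > eps:   # stop at an approximate first-order point
    x -= step * g

print(f"approximate minimizer {x:.4f}, f = {oracle.value(x):.4f}, "
      f"gradient evaluations: {oracle.g_evals}")

The count reported at the end is exactly the quantity that evaluation-complexity bounds control: how it grows as the tolerance eps shrinks and as the problem class changes.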




Conjugate Gradient Algorithms in Nonconvex Optimization


Book Description

This book details algorithms for large-scale unconstrained and bound-constrained optimization. It presents optimization techniques from the perspective of conjugate gradient algorithms, together with the methods of shortest residuals developed by the author.
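
For orientation, the following is a minimal sketch of the linear conjugate gradient iteration for minimizing the convex quadratic 0.5*x'Ax - b'x; the matrix and right-hand side are illustrative assumptions, and the nonlinear and shortest-residuals variants treated in the book build on this basic template.

import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Linear CG for minimizing 0.5*x@A@x - b@x with A symmetric positive definite."""
    x = x0.copy()
    r = b - A @ x          # residual = negative gradient
    p = r.copy()           # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p    # conjugate (Fletcher-Reeves-type) direction update
        rs = rs_new
    return x

# Illustrative example (an assumption, not taken from the book):
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b, np.zeros(2)))   # approx [0.0909, 0.6364]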










Global Optimization with Non-Convex Constraints


Book Description

Everything should be made as simple as possible, but not simpler. (Albert Einstein, Reader's Digest, 1977)

The modern practice of creating technical systems and technological processes of high efficiency includes, besides the employment of new principles, new materials, new physical effects, and other new solutions (which is very traditional and plays the key role in selecting the general structure of the object to be designed), the choice of the best combination of parameters (geometrical sizes, electrical and strength characteristics, etc.) that concretize this general structure, because varying these parameters (with the structure or linkage already defined) can essentially affect the objective performance indexes. The mathematical tools for choosing these best combinations are exactly what this book is about. With the advent of computers and computer-aided design, the testing of the selected variants is usually performed not on real examples (which may require the very expensive building of sample options and of special installations to test them), but through analysis of the corresponding mathematical models. The sophistication of the mathematical models of the objects to be designed, which is the natural consequence of the rising complexity of these objects, greatly complicates the objective performance analysis. Today, the main (and very often the only) available instrument for such analysis is computer-aided simulation of an object's behavior, based on numerical experiments with its mathematical model.
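
A toy numerical experiment illustrates the parameter-selection task described above: evaluate a (possibly expensive) simulation model over candidate parameter combinations and keep the best feasible one. The model, the nonconvex constraint, and the brute-force grid below are illustrative assumptions only; the book develops far more efficient global-search strategies for such problems.

import itertools
import math

# Illustrative "simulation model": a performance index of two design parameters.
def performance(x, y):
    return math.sin(3 * x) * math.cos(2 * y) + 0.1 * (x**2 + y**2)

# Illustrative nonconvex design constraint.
def feasible(x, y):
    return x**2 + math.sin(y) <= 1.0

# Naive exhaustive search over a parameter grid -- a stand-in for the far
# smarter global-optimization machinery developed in the book.
grid = [i / 20 for i in range(-40, 41)]        # each parameter ranges over [-2, 2]
best = min(
    ((performance(x, y), x, y)
     for x, y in itertools.product(grid, grid) if feasible(x, y)),
    key=lambda t: t[0],
)
print(f"best value {best[0]:.3f} at x = {best[1]:.2f}, y = {best[2]:.2f}")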







Handbook of Mathematical Models and Algorithms in Computer Vision and Imaging


Book Description

This handbook gathers together the state of the art on mathematical models and algorithms for imaging and vision. Its emphasis lies on rigorous mathematical methods, which represent the optimal solutions to a class of imaging and vision problems, and on effective algorithms, which are necessary for the methods to be translated to practical use in various applications. Viewing discrete images as data sampled from functional surfaces enables the use of advanced tools from calculus, the calculus of variations, and nonlinear optimization, and provides the basis of high-resolution imaging through geometry and variational models. Moreover, optimization naturally connects traditional model-driven approaches to the emerging data-driven approaches of machine and deep learning. No other framework provides comparable accuracy and precision for imaging and vision. Written by leading researchers in imaging and vision, the chapters in this handbook all start with gentle introductions, which make the work accessible to graduate students. For newcomers to the field, the book provides a comprehensive, fast-track introduction that saves time in getting on with new and emerging challenges. For researchers, exposure to the state of the art offers an overall view of the field, helping to guide new research directions and to avoid pitfalls as the field moves into the next decades of imaging and information services. This work can greatly benefit graduate students, researchers, and practitioners in imaging and vision; applied mathematicians; medical imagers; engineers; and computer scientists.
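
As a concrete, hedged illustration of the variational viewpoint mentioned above, the sketch below denoises a 1-D signal by gradient descent on a Tikhonov-style energy (data fidelity plus a smoothness penalty). The signal, weight, and step size are assumptions for illustration, not material from the handbook.

import numpy as np

def denoise(y, lam=5.0, step=0.05, iters=500):
    """Gradient descent on E(u) = 0.5*||u - y||^2 + 0.5*lam*||D u||^2,
    where D is the forward-difference operator (Tikhonov smoothing)."""
    u = y.copy()
    for _ in range(iters):
        du = np.diff(u)                                           # D u
        lap = np.concatenate(([du[0]], np.diff(du), [-du[-1]]))   # -D^T D u
        grad = (u - y) - lam * lap                                # gradient of E
        u -= step * grad
    return u

# Illustrative noisy step signal (an assumption, not from the handbook).
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(50), np.ones(50)])
noisy = clean + 0.2 * rng.standard_normal(100)
print(np.round(denoise(noisy)[45:55], 2))   # the smoothed transition region

Swapping the quadratic smoothness term for a total-variation penalty, and the explicit gradient step for a proximal or primal-dual update, leads toward the kind of edge-preserving variational models emphasized in the handbook.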







An Introduction to Convexity, Optimization, and Algorithms


Book Description

This concise, self-contained volume introduces convex analysis and optimization algorithms, with an emphasis on bridging the two areas. It explores cutting-edge algorithms—such as the proximal gradient, Douglas–Rachford, Peaceman–Rachford, and FISTA—that have applications in machine learning, signal processing, image reconstruction, and other fields. An Introduction to Convexity, Optimization, and Algorithms contains algorithms illustrated by Julia examples and more than 200 exercises that enhance the reader’s understanding of the topic. Clear explanations and step-by-step algorithmic descriptions facilitate self-study for individuals looking to enhance their expertise in convex analysis and optimization. Designed for courses in convex analysis, numerical optimization, and related subjects, this volume is intended for undergraduate and graduate students in mathematics, computer science, and engineering. Its concise length makes it ideal for a one-semester course. Researchers and professionals in applied areas, such as data science and machine learning, will find insights relevant to their work.
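
To give a flavor of such algorithms (the book's own worked examples are in Julia), here is a minimal proximal-gradient (ISTA) sketch for the l1-regularized least-squares problem min_x 0.5*||Ax - b||^2 + lam*||x||_1. The data, regularization weight, and step-size rule are illustrative assumptions, not taken from the book.

import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    """ISTA for min_x 0.5*||A x - b||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)   # proximal step on the l1 part
    return x

# Illustrative sparse recovery problem (an assumption, not from the book).
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
b = A @ x_true + 0.01 * rng.standard_normal(30)
print(np.round(proximal_gradient(A, b, lam=0.1), 2))

FISTA adds a momentum term to the same proximal step, while Douglas-Rachford and Peaceman-Rachford rearrange the same proximal building blocks in splitting form; the book explores all of these.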