Proceedings of the International Congress of Mathematicians 2018 (ICM 2018) (In 4 Volumes)


Book Description

The Proceedings of the ICM publishes the talks given by invited speakers at the congress organized by the International Mathematical Union every four years. It covers several areas of mathematics and includes the laudations for the Fields Medals, the Nevanlinna, Gauss, and Leelavati Prizes, and the Chern Medal.




Evaluation Complexity of Algorithms for Nonconvex Optimization


Book Description

A popular way to assess the “effort” needed to solve a problem is to count how many evaluations of the problem functions (and their derivatives) are required; in many cases this is the dominating computational cost. Given an optimization problem satisfying reasonable assumptions, and given access to problem-function values and derivatives of various degrees, how many evaluations might be required to approximately solve the problem? Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation, and Perspectives addresses this question for nonconvex optimization problems, those that may have local minimizers and appear most often in practice. This is the first book on complexity to cover topics such as composite and constrained optimization, derivative-free optimization, subproblem solution, and optimal (lower and sharpness) bounds for nonconvex problems. It is also the first to address the disadvantages of traditional optimality measures, to propose useful surrogates leading to algorithms that compute approximate high-order critical points, and to compare traditional and new methods, highlighting the advantages of the latter from a complexity point of view. This is the go-to book for those interested in solving nonconvex optimization problems. It is suitable for advanced undergraduate and graduate students in courses on advanced numerical analysis, data science, numerical optimization, and approximation theory.
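
As a minimal sketch of the evaluation-counting viewpoint (not taken from the book; the quadratic objective, step size, and tolerance below are illustrative assumptions), one can wrap an objective so that every function and gradient call is tallied:

import numpy as np

# A wrapper that tallies every function and gradient evaluation, the
# "effort" measure used in evaluation-complexity analysis.
class CountingOracle:
    def __init__(self, f, grad):
        self.f, self.grad = f, grad
        self.f_evals = 0
        self.g_evals = 0

    def value(self, x):
        self.f_evals += 1
        return self.f(x)

    def gradient(self, x):
        self.g_evals += 1
        return self.grad(x)

# Illustrative smooth objective f(x) = 0.5*||x||^2, minimized by a
# fixed-step gradient method until an epsilon-approximate first-order
# point (gradient norm below epsilon) is reached.
oracle = CountingOracle(lambda x: 0.5 * x @ x, lambda x: x)
x = np.ones(10)
g = oracle.gradient(x)
while np.linalg.norm(g) > 1e-6:      # epsilon = 1e-6
    x = x - 0.5 * g
    g = oracle.gradient(x)
print("gradient evaluations used:", oracle.g_evals)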




Trust Region Methods


Book Description

Mathematics of Computing -- General.




Optimization for Machine Learning


Book Description

This book offers an up-to-date account of the interplay between optimization and machine learning, accessible to students and researchers in both communities. That interplay is one of the most important developments in modern computational science: optimization formulations and methods are proving vital in designing algorithms to extract essential knowledge from huge volumes of data, and machine learning is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties, but the increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions. This book starts that process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.
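
As one small, hedged illustration of the first-order and stochastic-approximation themes above (the synthetic data, batch size, and step size are invented for the example), a mini-batch stochastic gradient method for least squares:

import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(1000, 20))            # synthetic design matrix (hypothetical data)
x_true = rng.normal(size=20)
b = A @ x_true + 0.01 * rng.normal(size=1000)

x = np.zeros(20)
step = 0.01
for k in range(2000):
    i = rng.integers(0, 1000, size=32)     # sample a mini-batch of 32 rows
    g = A[i].T @ (A[i] @ x - b[i]) / 32    # stochastic gradient of the batch loss
    x -= step * g                          # plain SGD step with a fixed step size

print("distance to generating model:", np.linalg.norm(x - x_true))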




Nonsmooth Optimization


Book Description

Nonsmooth Optimization contains the proceedings of a workshop on nonsmooth optimization (NSO) held from March 28 to April 8, 1977, in Austria under the auspices of the International Institute for Applied Systems Analysis. The papers explore the techniques and theory of NSO and cover topics ranging from systems of inequalities to smooth approximation of nonsmooth functions, as well as quadratic programming and line searches. Comprising nine chapters, this volume begins with a survey of Soviet research on subgradient optimization carried out since 1962, followed by a discussion of rates of convergence in subgradient optimization. The reader is then introduced to the method of subgradient optimization in an abstract setting and the minimal hypotheses required to ensure convergence; NSO and nonlinear programming; and bundle methods in NSO. A feasible descent algorithm for linearly constrained least-squares problems is described. The book also considers sufficient minimization of piecewise-linear univariate functions before concluding with a description of the method of parametric decomposition in mathematical programming. This monograph will be of interest to mathematicians and mathematics students.
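
A minimal sketch of the subgradient method surveyed in those opening chapters, applied to the nonsmooth function f(x) = ||x||_1 (the step-size rule, iteration budget, and starting point are illustrative assumptions, not drawn from the proceedings):

import numpy as np

# Subgradient method for f(x) = ||x||_1. At nonzero coordinates the
# subgradient is sign(x); at zero, any value in [-1, 1] is valid, and
# np.sign conveniently returns 0 there. Diminishing, non-summable steps
# a/k guarantee convergence of the best function value found, which is
# why the method tracks "best" rather than the last iterate.
def subgradient_method(x0, steps=500, a=1.0):
    x = x0.copy()
    best = np.sum(np.abs(x))
    for k in range(1, steps + 1):
        g = np.sign(x)                  # one valid subgradient of ||x||_1
        x = x - (a / k) * g
        best = min(best, np.sum(np.abs(x)))
    return x, best

x, best = subgradient_method(np.array([3.0, -2.0, 1.0]))
print("best value found:", best)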




Recent Advances In Nonsmooth Optimization


Book Description

Nonsmooth optimization covers the minimization or maximization of functions that do not have the differentiability properties required by classical methods. The field of nonsmooth optimization is significant, not only because of the existence of nondifferentiable functions arising directly in applications, but also because several important methods for solving difficult smooth problems lead directly to the need to solve nonsmooth problems, which are either smaller in dimension or simpler in structure. This book contains twenty-five papers written by forty-six authors from twenty countries on five continents. It includes papers on theory, algorithms, and applications for problems with first-order nondifferentiability (the usual sense of nonsmooth optimization), second-order nondifferentiability, nonsmooth equations, nonsmooth variational inequalities, and other problems related to nonsmooth optimization.




Lectures on Convex Optimization


Book Description

This book provides a comprehensive, modern introduction to convex optimization, a field that is becoming increasingly important in applied mathematics, economics and finance, engineering, and computer science, notably in data science and machine learning. Written by a leading expert in the field, this book includes recent advances in the algorithmic theory of convex optimization, naturally complementing the existing literature. It contains a unified and rigorous presentation of the acceleration techniques for first- and second-order minimization schemes. It provides readers with a full treatment of the smoothing technique, which has tremendously extended the abilities of gradient-type methods. Several powerful approaches in structural optimization, including optimization in relative scale and polynomial-time interior-point methods, are also discussed in detail. Researchers in theoretical optimization as well as professionals working on optimization problems will find this book very useful. It presents many successful examples of how to develop very fast specialized minimization algorithms. Based on the author’s lectures, it can naturally serve as the basis for introductory and advanced courses in convex optimization for students in engineering, economics, computer science, and mathematics.
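
As a minimal sketch of the acceleration idea for first-order schemes (a Nesterov-style momentum step; the quadratic test problem and iteration count are illustrative assumptions, and the gradient's Lipschitz constant L is assumed computable):

import numpy as np

# Accelerated gradient method for a smooth convex f with L-Lipschitz
# gradient: take a gradient step at an extrapolated point y, then update
# the momentum. This achieves an O(1/k^2) error decay in function value,
# versus O(1/k) for plain gradient descent.
rng = np.random.default_rng(1)
Q = rng.normal(size=(50, 50))
Q = Q.T @ Q + np.eye(50)                   # positive definite Hessian
c = rng.normal(size=50)
grad = lambda x: Q @ x - c                 # gradient of 0.5*x'Qx - c'x
L = np.linalg.eigvalsh(Q).max()            # Lipschitz constant of the gradient

x = y = np.zeros(50)
t = 1.0
for k in range(200):
    x_next = y - grad(y) / L                           # gradient step at extrapolated point
    t_next = (1 + np.sqrt(1 + 4 * t * t)) / 2
    y = x_next + ((t - 1) / t_next) * (x_next - x)     # momentum extrapolation
    x, t = x_next, t_next

print("gradient norm after 200 steps:", np.linalg.norm(grad(x)))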




Numerical Optimization


Book Description

Optimization is an important tool in decision science and in the analysis of physical systems in engineering. One can trace its roots to the Calculus of Variations and the work of Euler and Lagrange. This natural and reasonable approach to mathematical programming covers numerical methods for finite-dimensional optimization problems. It begins with very simple ideas and progresses through more complicated concepts, concentrating on methods for both unconstrained and constrained optimization.
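
As a small illustration of the progression from simple ideas to practical unconstrained methods of the kind such a text covers (the Rosenbrock test function and the constants are assumptions for the example, not drawn from the book), steepest descent with an Armijo backtracking line search:

import numpy as np

# Steepest descent with Armijo backtracking: shrink the trial step until
# the sufficient-decrease condition f(x + t*d) <= f(x) + c*t*g'd holds.
def backtracking_descent(f, grad, x, iters=5000, c=1e-4, shrink=0.5):
    for _ in range(iters):
        g = grad(x)
        d = -g                                   # steepest-descent direction
        t = 1.0
        while f(x + t * d) > f(x) + c * t * (g @ d):
            t *= shrink                          # backtrack until Armijo holds
        x = x + t * d
    return x

# Illustrative test problem: the Rosenbrock function. Steepest descent
# makes slow progress on this ill-conditioned valley, which is exactly
# what motivates the more sophisticated methods a numerical optimization
# text goes on to develop.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(backtracking_descent(f, grad, np.array([-1.2, 1.0])))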




Proximal Algorithms


Book Description

Proximal Algorithms discusses proximal operators and proximal algorithms, and illustrates their applicability to standard and distributed convex optimization in general and many applications of recent interest in particular. Just as Newton's method is a standard tool for solving unconstrained smooth optimization problems of modest size, proximal algorithms can be viewed as an analogous tool for nonsmooth, constrained, large-scale, or distributed versions of these problems. They are very generally applicable, but they are especially well suited to problems of substantial recent interest involving large or high-dimensional datasets. Proximal methods sit at a higher level of abstraction than classical algorithms like Newton's method: the base operation is evaluating the proximal operator of a function, which itself involves solving a small convex optimization problem. These subproblems, which generalize the problem of projecting a point onto a convex set, often admit closed-form solutions or can be solved very quickly with standard or simple specialized methods. The book discusses different interpretations of proximal operators and algorithms, looks at their connections to many other topics in optimization and applied mathematics, surveys some popular algorithms, and provides a large number of examples of proximal operators that commonly arise in practice.
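
As a hedged illustration of such closed-form solutions (the definitions are standard; the penalty parameter and test vector are invented for the example), the proximal operator of the l1 norm is soft-thresholding, and the proximal operator of a convex set's indicator function is Euclidean projection:

import numpy as np

# prox_{lam*||.||_1}(v) = argmin_x lam*||x||_1 + 0.5*||x - v||^2
# has the closed form known as soft-thresholding.
def prox_l1(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# The prox of the indicator function of a convex set is projection onto
# that set; for the box [lo, hi] this is coordinatewise clipping.
def prox_box(v, lo, hi):
    return np.clip(v, lo, hi)

v = np.array([3.0, -0.2, 0.7])
print(prox_l1(v, 0.5))          # [2.5, -0., 0.2]
print(prox_box(v, -1.0, 1.0))   # [1., -0.2, 0.7]

Once a proximal operator like these is available in closed form, it can be reused unchanged as the base operation inside proximal gradient and other splitting-style iterations, which is the level of abstraction the book exploits.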




Introduction to Derivative-Free Optimization


Book Description

The first contemporary comprehensive treatment of optimization without derivatives, this text explains how sampling and model techniques are used in derivative-free methods and how those methods are designed to solve optimization problems. It is intended to be readily accessible both to researchers and to those with a modest background in computational mathematics.
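
A minimal sketch of one sampling-based idea on which derivative-free methods build, compass (coordinate) search, which uses function values only (the test function, tolerances, and budget below are illustrative assumptions, not drawn from the book):

import numpy as np

# Compass search: poll the objective along the +/- coordinate directions,
# move to any improving point, and otherwise halve the step. No
# derivatives are used anywhere, only function evaluations.
def compass_search(f, x, step=1.0, tol=1e-6, max_evals=10000):
    n = len(x)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(n):
            for s in (+step, -step):
                y = x.copy()
                y[i] += s
                fy = f(y)
                evals += 1
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5              # no improving direction: refine the mesh
    return x, fx, evals

f = lambda x: (x[0] - 1)**2 + (x[1] + 2)**2   # illustrative smooth test problem
print(compass_search(f, np.array([0.0, 0.0])))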