An Easy Path to Convex Analysis and Applications


Book Description

This book examines the most fundamental parts of convex analysis and its applications to optimization and location problems. Accessible techniques of variational analysis are employed to clarify and simplify some basic proofs in convex analysis and to build a theory of generalized differentiation for convex functions and sets in finite dimensions. The book serves as a bridge for readers who have just started using convex analysis to reach deeper topics in the field. Detailed proofs are presented for most of the results, and many figures and exercises are included for a better understanding of the material. The applications cover both classical topics of convex optimization and important problems of modern convex optimization, convex geometry, and facility location.




An Easy Path to Convex Analysis and Applications


Book Description

Convex optimization has an increasing impact on many areas of mathematics, the applied sciences, and practical applications. It is now taught at many universities and used by researchers in a variety of fields. As convex analysis is the mathematical foundation of convex optimization, a deep knowledge of convex analysis helps students and researchers apply its tools more effectively. The main goal of this book is to provide easy access to the most fundamental parts of convex analysis and its applications to optimization. Modern techniques of variational analysis are employed to clarify and simplify some basic proofs in convex analysis and to build the theory of generalized differentiation for convex functions and sets in finite dimensions. We also present new applications of convex analysis to location problems, in connection with many interesting geometric problems such as the Fermat-Torricelli problem, the Heron problem, the Sylvester problem, and their generalizations. Of course, we do not expect to touch on every aspect of convex analysis, but the book contains sufficient material for a first course on the subject. It can also serve as supplementary reading for a course on convex optimization and applications.
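The Fermat-Torricelli problem mentioned above asks for a point minimizing the sum of Euclidean distances to finitely many given points. As a minimal illustration (not taken from the book), the classical Weiszfeld iteration approximates this point; the specific sample points below are an assumption chosen for demonstration:

```python
import numpy as np

def weiszfeld(points, iters=200, eps=1e-12):
    """Approximate the Fermat-Torricelli point of a finite set of points:
    the point minimizing the sum of Euclidean distances to them."""
    pts = np.asarray(points, dtype=float)
    x = pts.mean(axis=0)  # start at the centroid
    for _ in range(iters):
        d = np.linalg.norm(pts - x, axis=1)
        d = np.maximum(d, eps)  # guard against division by zero at a data point
        w = 1.0 / d
        # Fixed-point update: distance-weighted average of the data points.
        x = (w[:, None] * pts).sum(axis=0) / w.sum()
    return x

# Three vertices of a triangle; the minimizer lies inside it.
fermat = weiszfeld([(0.0, 0.0), (1.0, 0.0), (0.5, 0.9)])
```

Each update is the stationarity condition of the objective rewritten as a fixed-point map, which is why the scheme needs no step-size tuning.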




Convex Analysis and Beyond


Book Description

This book presents a unified theory of convex functions, sets, and set-valued mappings in topological vector spaces, with specifications to locally convex, Banach, and finite-dimensional settings. The developments and expositions are based on the powerful geometric approach of variational analysis, which rests on set extremality and its characterizations and specifications in the presence of convexity. Using this approach, the text consolidates the fundamental facts of generalized differential calculus to obtain novel results for convex sets, functions, and set-valued mappings in finite and infinite dimensions. It also explores topics beyond convexity, using the fundamental machinery of convex analysis to develop nonconvex generalized differentiation and its applications. The text utilizes an adaptable framework designed with researchers as well as multiple levels of students in mind. It includes many exercises and figures suited to graduate classes in the mathematical sciences that are also accessible to advanced students in economics, engineering, and other application areas. In addition, the chapters on convex analysis and optimization in finite-dimensional spaces will be useful to upper-level undergraduate students, while the work as a whole provides an ample resource for mathematicians and applied scientists, particularly experts in convex and variational analysis, optimization, and their applications.




The Projected Subgradient Algorithm in Convex Optimization


Book Description

This focused monograph presents a study of subgradient algorithms for constrained minimization problems in a Hilbert space. The goal is to obtain a good approximate solution of the problem in the presence of computational errors. The discussion takes into account the fact that each iteration of an algorithm consists of several steps and that the computational errors for different steps are, in general, different. The book is especially useful for the reader because it contains solutions to a number of difficult and interesting problems in numerical optimization. The subgradient projection algorithm is one of the most important tools in optimization theory and its applications. An optimization problem is described by an objective function and a set of feasible points. For this algorithm, each iteration consists of two steps: the first requires a calculation of a subgradient of the objective function; the second requires a calculation of a projection onto the feasible set. The computational errors in these two steps are different. This book shows that the algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. Moreover, if the computational errors for the two steps of the algorithm are known, one can determine what approximate solution can be obtained and how many iterations are needed for this. In addition to their mathematical interest, the generalizations considered in this book have significant practical meaning. The book is of interest to experts in applications of optimization to engineering and economics.
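The two-step iteration described above can be sketched in finite dimensions (the book works in a Hilbert space; this toy version, with a bounded random perturbation standing in for the computational error in each step, is an illustrative assumption, not the book's analysis):

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, steps=2000, err=1e-3, seed=0):
    """Projected subgradient method x_{k+1} = P_C(x_k - t_k g_k), where each
    of the two steps (subgradient, projection) carries its own small error."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        # Step 1: inexact subgradient of the objective at x.
        g = subgrad(x) + err * rng.uniform(-1, 1, size=x.shape)
        t = 1.0 / np.sqrt(k)  # diminishing step size
        # Step 2: inexact projection onto the feasible set.
        x = project(x - t * g) + err * rng.uniform(-1, 1, size=x.shape)
    return x

# Minimize f(x) = ||x - b||_1 over the box [0, 1]^2; the exact minimizer is b.
b = np.array([0.3, 0.7])
x = projected_subgradient(lambda x: np.sign(x - b),
                          lambda y: np.clip(y, 0.0, 1.0),
                          x0=np.zeros(2))
```

With all errors bounded by a small constant, the iterates settle in a small neighborhood of the minimizer, mirroring the qualitative conclusion stated in the description.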




Convex Optimization with Computational Errors


Book Description

The book is devoted to the study of approximate solutions of optimization problems in the presence of computational errors. It contains a number of results on the convergence behavior of algorithms in a Hilbert space, which are known to be important tools for solving optimization problems. The research presented here continues and further develops the author's book Numerical Optimization with Computational Errors (Springer, 2016). Both books study algorithms while taking into account the computational errors that are always present in practice. The main goal is, for a known computational error, to find out what approximate solution can be obtained and how many iterations are needed for this. The main difference between this book and the 2016 book is that here the discussion takes into account the fact that for every algorithm, each iteration consists of several steps and that the computational errors for different steps are, in general, different. This fact, which was not taken into account in the previous book, is indeed important in practice. For example, the subgradient projection algorithm consists of two steps: the first is a calculation of a subgradient of the objective function, while in the second one calculates a projection onto the feasible set. In each of these two steps there is a computational error, and these two errors are different in general. It may happen that the feasible set is simple and the objective function is complicated. As a result, the computational error made when one calculates the projection is essentially smaller than the computational error of the subgradient calculation. Clearly, the opposite case is possible too. Another feature of this book is the study of a number of important algorithms that appeared recently in the literature and are not discussed in the previous book. This monograph contains 12 chapters. Chapter 1 is an introduction.
In Chapter 2 we study the subgradient projection algorithm for minimization of convex and nonsmooth functions. We generalize the results of [NOCE] and establish results that have no prototype in [NOCE]. In Chapter 3 we analyze the mirror descent algorithm for minimization of convex and nonsmooth functions in the presence of computational errors. For this algorithm, each iteration consists of two steps: the first is a calculation of a subgradient of the objective function, while in the second one solves an auxiliary minimization problem on the set of feasible points. In each of these two steps there is a computational error. We again generalize the results of [NOCE] and establish results that have no prototype there. In Chapter 4 we analyze the projected gradient algorithm with a smooth objective function in the presence of computational errors. In Chapter 5 we consider an algorithm that is an extension of the projected gradient algorithm used for solving linear inverse problems arising in signal and image processing. In Chapter 6 we study the continuous subgradient method and the continuous subgradient projection algorithm for minimization of convex nonsmooth functions and for computing saddle points of convex-concave functions, in the presence of computational errors. None of the results of this chapter has a prototype in [NOCE]. In Chapters 7-12 we analyze several algorithms, not considered in [NOCE], in the presence of computational errors. Again, each step of an iteration has a computational error, and we take into account that these errors are, in general, different. An optimization problem with a composite objective function is studied in Chapter 7. A zero-sum game with two players is considered in Chapter 8. A predicted-decrease-approximation-based method is used in Chapter 9 for constrained convex optimization. Chapter 10 is devoted to minimization of quasiconvex functions. Minimization of sharp weakly convex functions is discussed in Chapter 11. Chapter 12 is devoted to a generalized projected subgradient method for minimization of a convex function over a set that is not necessarily convex. The book is of interest to researchers and engineers working in optimization, and it can also be useful in courses for graduate students. The main feature of the book that appeals specifically to this audience is the study of the influence of computational errors on several important optimization algorithms. The book is also of interest to experts in applications of optimization to engineering and economics.
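The mirror descent iteration outlined in the description of Chapter 3 alternates a subgradient step with an auxiliary minimization over the feasible set. As an illustrative sketch (not the book's setting), when the feasible set is the probability simplex and the Bregman divergence comes from the negative entropy, that auxiliary problem has a closed-form multiplicative solution; the cost vector below is an assumed example:

```python
import numpy as np

def mirror_descent_simplex(subgrad, x0, steps=500):
    """Mirror descent with the entropy mirror map on the probability simplex.
    Each iteration: (1) compute a subgradient g of the objective,
    (2) solve min_x <t*g, x> + D(x, x_k) over the simplex, which for the
    entropy divergence reduces to a normalized multiplicative update."""
    x = np.asarray(x0, dtype=float)
    for k in range(1, steps + 1):
        g = subgrad(x)            # step 1: subgradient of the objective
        t = 1.0 / np.sqrt(k)      # diminishing step size
        w = x * np.exp(-t * g)    # step 2: multiplicative update ...
        x = w / w.sum()           # ... renormalized back onto the simplex
    return x

# Minimize the linear function f(x) = <c, x> over the simplex; the minimum
# is attained at the vertex with the smallest cost coefficient.
c = np.array([0.9, 0.2, 0.5])
x = mirror_descent_simplex(lambda x: c, x0=np.ones(3) / 3)
```

Because the auxiliary step is available in closed form here, any computational error would enter only through the subgradient evaluation; in general both steps are inexact, which is exactly the situation the book analyzes.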




Convex Analysis


Book Description

This book is an introduction to convex analysis and some of its applications. It starts with basic theory, explained within the framework of finite-dimensional spaces. The only prerequisites are basic analysis and simple geometry. The second chapter presents some applications of convex analysis, including problems of linear programming, geometry, and approximation. Special attention is paid to applications of convex analysis to Kolmogorov-type inequalities for derivatives of functions of one variable. Chapter 3 collects some results on geometry and convex analysis in infinite-dimensional spaces. A comprehensive introduction written for beginners illustrates the fundamentals of convex analysis in finite-dimensional spaces. The book can be used for an advanced undergraduate or graduate-level course on convex analysis and its applications. It is also suitable for independent study of this extremely important area of mathematics.




Convex and Set-Valued Analysis


Book Description

This textbook is devoted to a compressed and self-contained exposition of two important parts of contemporary mathematics: convex and set-valued analysis. In the first part, properties of convex sets, the theory of separation, convex functions and their differentiability, and properties of convex cones in finite- and infinite-dimensional spaces are discussed. The second part covers some important topics of set-valued analysis: properties of the Hausdorff metric and various continuity concepts of set-valued maps are considered. Great attention is also paid to measurable set-valued functions; continuous, Lipschitz, and some special types of selections; fixed point and coincidence theorems; covering set-valued maps; topological degree theory; and differential inclusions.

Contents:

Preface

Part I: Convex analysis
- Convex sets and their properties
- The convex hull of a set. The interior of convex sets
- The affine hull of sets. The relative interior of convex sets
- Separation theorems for convex sets
- Convex functions
- Closedness, boundedness, continuity, and Lipschitz property of convex functions
- Conjugate functions
- Support functions
- Differentiability of convex functions and the subdifferential
- Convex cones
- A little more about convex cones in infinite-dimensional spaces
- A problem of linear programming
- More about convex sets and convex hulls

Part II: Set-valued analysis
- Introduction to the theory of topological and metric spaces
- The Hausdorff metric and the distance between sets
- Some fine properties of the Hausdorff metric
- Set-valued maps. Upper semicontinuous and lower semicontinuous set-valued maps
- A base of the topology of the space Hc(X)
- Measurable set-valued maps. Measurable selections and measurable choice theorems
- The superposition set-valued operator
- The Michael theorem and continuous selections. Lipschitz selections. Single-valued approximations
- Special selections of set-valued maps
- Differential inclusions
- Fixed points and coincidences of maps in metric spaces
- Stability of coincidence points and properties of covering maps
- Topological degree and fixed points of set-valued maps in Banach spaces
- Existence results for differential inclusions via the fixed point method

Notation. Bibliography. Index.




Optimization in Banach Spaces


Book Description

The book is devoted to the study of constrained minimization problems on closed and convex sets in Banach spaces with a Fréchet differentiable objective function. Such problems are well studied in finite-dimensional spaces and in infinite-dimensional Hilbert spaces. When the space is a Hilbert space, there are many algorithms for solving optimization problems, including the gradient projection algorithm, which is one of the most important tools in optimization theory, nonlinear analysis, and their applications. An optimization problem is described by an objective function and a set of feasible points. For the gradient projection algorithm, each iteration consists of two steps: the first is a calculation of the gradient of the objective function, while in the second one calculates a projection onto the feasible set. In each of these two steps there is a computational error. In our recent research we showed that the gradient projection algorithm generates a good approximate solution if all the computational errors are bounded from above by a small positive constant. It should be mentioned that the properties of a Hilbert space play an important role here. When we consider an optimization problem in a general Banach space, the situation becomes more difficult and less understood. On the other hand, such problems arise in approximation theory. The book is of interest to mathematicians working in optimization and can also be useful in courses for graduate students. The main feature of the book that appeals specifically to this audience is the study of algorithms for convex and nonconvex minimization problems in a general Banach space. The book is also of interest to experts in applications of optimization to approximation theory. In this book the goal is to obtain a good approximate solution of the constrained optimization problem in a general Banach space in the presence of computational errors.
It is shown that the algorithm generates a good approximate solution if the sequence of computational errors is bounded from above by a small constant. The book consists of four chapters. In the first we discuss several algorithms that are studied in the book and prove a convergence result for an unconstrained problem, which is a prototype of our results for the constrained problem. In Chapter 2 we analyze convex optimization problems. Nonconvex optimization problems are studied in Chapter 3. In Chapter 4 we study continuous algorithms for minimization problems in the presence of computational errors.
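The two-step gradient projection iteration described above can be sketched in finite dimensions (the book's setting is a general Banach space; this toy Euclidean version, with bounded random perturbations standing in for the computational errors, is an illustrative assumption):

```python
import numpy as np

def gradient_projection(grad, project, x0, step, iters=300, err=1e-4, seed=0):
    """Gradient projection method x_{k+1} = P_C(x_k - step * grad(x_k)),
    with a small bounded error in each of the two steps per iteration."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Step 1: inexact gradient of the smooth objective.
        g = grad(x) + err * rng.uniform(-1, 1, size=x.shape)
        # Step 2: inexact projection onto the feasible set.
        x = project(x - step * g) + err * rng.uniform(-1, 1, size=x.shape)
    return x

# Minimize f(x) = 0.5 * ||A x - b||^2 over the box [-1, 1]^2.
# Unconstrained minimizer is (0.5, 2); the box clips it to (0.5, 1).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 2.0])
step = 1.0 / np.linalg.norm(A.T @ A, 2)  # 1/L, L = Lipschitz constant of grad f
x = gradient_projection(lambda x: A.T @ (A @ x - b),
                        lambda y: np.clip(y, -1.0, 1.0),
                        x0=np.zeros(2), step=step)
```

With the errors bounded by a small constant, the iterates remain in a correspondingly small neighborhood of the constrained minimizer, matching the qualitative conclusion quoted above for the Hilbert-space case.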




Convex Functions and Their Applications


Book Description

- Thorough introduction to an important area of mathematics
- Contains recent results
- Includes many exercises