Automated Machine Learning


Book Description

This open access book presents the first comprehensive overview of general methods in Automated Machine Learning (AutoML), collects descriptions of existing systems based on these methods, and discusses the first series of international challenges of AutoML systems. The recent success of commercial ML applications and the rapid growth of the field have created a high demand for off-the-shelf ML methods that can be used easily and without expert knowledge. However, many recent machine learning successes crucially rely on human experts, who manually select appropriate ML architectures (deep learning architectures or more traditional ML workflows) and their hyperparameters. To overcome this problem, the field of AutoML targets the progressive automation of machine learning, based on principles from optimization and machine learning itself. This book serves as a point of entry into this quickly developing field for researchers and advanced students alike, and as a reference for practitioners aiming to use AutoML in their work.










Evolutionary Deep Neural Architecture Search: Fundamentals, Methods, and Recent Advances


Book Description

This book systematically presents the fundamentals, methods, and recent advances of evolutionary deep neural architecture search, chapter by chapter, giving target readers enough detail to learn the subject from scratch. In particular, the methods chapters are devoted to the architecture search of unsupervised and supervised deep neural networks. The main audience is people who would like to use deep neural networks but have little or no expertise in manually designing optimal deep architectures. This includes researchers developing novel evolutionary deep architecture search methods for general tasks, students who want to study evolutionary deep neural architecture search and pursue related research in the future, and practitioners from computer vision, natural language processing, and other fields in which deep neural networks have been used successfully and extensively.







Data-Driven Evolutionary Optimization


Book Description

Intended for researchers and practitioners alike, this book covers carefully selected yet broad topics in optimization, machine learning, and metaheuristics. Written by world-leading academic researchers with extensive experience in industrial applications, this self-contained book is the first of its kind to provide comprehensive background knowledge, practical guidelines, and state-of-the-art techniques. New algorithms are carefully explained and further elaborated with pseudocode or flowcharts, and fully working source code is made freely available. This is followed by a presentation of a variety of data-driven single- and multi-objective optimization algorithms that seamlessly integrate modern machine learning, such as deep learning and transfer learning, with evolutionary and swarm optimization algorithms. Applications of data-driven optimization, ranging from aerodynamic design and the optimization of industrial processes to deep neural architecture search, are included.







Evolutionary Multi-objective Bi-level Optimization for Efficient Deep Neural Network Architecture Design


Book Description

Deep convolutional neural networks (CNNs) are the backbones of deep learning (DL) paradigms for numerous vision tasks, including object recognition, detection, and segmentation. Early advancements in CNN architectures were driven primarily by human expertise and elaborate manual design. Recently, neural architecture search (NAS) was proposed with the aim of automating the network design process and generating task-dependent architectures. While existing approaches have achieved competitive performance, they remain impractical for real-world deployment for three reasons: (1) the generated architectures are optimized solely for predictive performance, making them inefficient in their use of hardware resources, e.g. energy consumption, latency, and memory size; (2) in most approaches, the search process requires vast computational resources; (3) most existing approaches require one complete search for each hardware or deployment specification. In this dissertation, we propose an efficient evolutionary NAS algorithm to address the aforementioned limitations. In particular, we first introduce Pareto optimization to NAS, so that a diverse set of architectures trading off multiple objectives is obtained simultaneously in a single run. We then improve the algorithm's search efficiency through surrogate models. Finally, we integrate a transfer learning scheme into the algorithm that allows a new task to leverage previous search efforts, further improving both the performance of the obtained architectures and the search efficiency. The proposed algorithm thus enables an automated and streamlined process for efficiently generating task-specific custom neural network models that are competitive under multiple objectives.
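To make the Pareto-optimization idea above concrete, here is a minimal, self-contained sketch of selecting the non-dominated set of candidate architectures under two competing objectives (validation error and latency). The architecture names, objective values, and helper functions are purely illustrative assumptions, not taken from the dissertation:

```python
# Minimal sketch of Pareto (non-dominated) filtering for multi-objective NAS.
# All names and numbers are hypothetical, for illustration only.

def dominates(a, b):
    """True if objective vector a dominates b: no worse in every objective
    and strictly better in at least one. Both objectives are minimized."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of {name: (error, latency)}."""
    return {
        name: objs
        for name, objs in candidates.items()
        if not any(dominates(other, objs)
                   for other_name, other in candidates.items()
                   if other_name != name)
    }

# Hypothetical candidate architectures: (validation error, latency in ms).
archs = {
    "net_a": (0.08, 12.0),  # accurate but slow
    "net_b": (0.12, 5.0),   # fast but less accurate
    "net_c": (0.10, 8.0),   # a balanced trade-off
    "net_d": (0.13, 9.0),   # dominated by net_c (worse on both objectives)
}

print(sorted(pareto_front(archs)))  # prints ['net_a', 'net_b', 'net_c']
```

In a single search run, an evolutionary NAS algorithm would repeatedly apply a selection like this to its population, so the surviving architectures span the accuracy/efficiency trade-off rather than optimizing one objective alone.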