Bayesian Optimization in Action


Book Description

Bayesian optimization helps pinpoint the best configuration for your machine learning models with speed and accuracy. Put its advanced techniques into practice with this hands-on guide.

In Bayesian Optimization in Action you will learn how to:
• Train Gaussian processes on both sparse and large data sets
• Combine Gaussian processes with deep neural networks to make them flexible and expressive
• Find the most successful strategies for hyperparameter tuning
• Navigate a search space and identify high-performing regions
• Apply Bayesian optimization to cost-constrained, multi-objective, and preference optimization
• Implement Bayesian optimization with PyTorch, GPyTorch, and BoTorch

Bayesian Optimization in Action shows you how to optimize hyperparameter tuning, A/B testing, and other aspects of the machine learning process by applying cutting-edge Bayesian techniques. Using clear language, illustrations, and concrete examples, this book proves that Bayesian optimization doesn’t have to be difficult! You’ll get in-depth insights into how Bayesian optimization works and learn how to implement it with cutting-edge Python libraries. The book’s easy-to-reuse code samples let you hit the ground running by plugging them straight into your own projects. Forewords by Luis Serrano and David Sweet.

About the technology
In machine learning, optimization is about achieving the best predictions—shortest delivery routes, perfect price points, most accurate recommendations—in the fewest number of steps. Bayesian optimization uses the mathematics of probability to fine-tune ML functions, algorithms, and hyperparameters efficiently when traditional methods are too slow or expensive.

About the book
Bayesian Optimization in Action teaches you how to create efficient machine learning processes using a Bayesian approach. In it, you’ll explore practical techniques for training on large datasets, hyperparameter tuning, and navigating complex search spaces. This engaging book includes illustrations and fun examples like perfecting coffee sweetness, predicting weather, and even debunking psychic claims. You’ll learn how to navigate multi-objective scenarios, account for decision costs, and tackle pairwise comparisons.

What's inside
• Gaussian processes for sparse and large datasets
• Strategies for hyperparameter tuning
• Identifying high-performing regions
• Examples in PyTorch, GPyTorch, and BoTorch

About the reader
For machine learning practitioners who are confident in math and statistics.

About the author
Quan Nguyen is a research assistant at Washington University in St. Louis. He writes for the Python Software Foundation and has authored several books on Python programming.

Table of Contents
1 Introduction to Bayesian optimization
2 Gaussian processes as distributions over functions
3 Customizing a Gaussian process with the mean and covariance functions
4 Refining the best result with improvement-based policies
5 Exploring the search space with bandit-style policies
6 Leveraging information theory with entropy-based policies
7 Maximizing throughput with batch optimization
8 Satisfying extra constraints with constrained optimization
9 Balancing utility and cost with multifidelity optimization
10 Learning from pairwise comparisons with preference optimization
11 Optimizing multiple objectives at the same time
12 Scaling Gaussian processes to large datasets
13 Combining Gaussian processes with neural networks
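The libraries named above (PyTorch, GPyTorch, and BoTorch) fit together in a short loop: fit a Gaussian process surrogate, maximize an acquisition function, evaluate the objective, repeat. Here is a minimal sketch of that loop; the toy objective, bounds, and settings are illustrative assumptions rather than code from the book, and a recent BoTorch release is assumed.

    # Minimal, illustrative Bayesian optimization loop with GPyTorch/BoTorch.
    # The objective, bounds, and settings are toy assumptions, not the book's code.
    import torch
    from botorch.models import SingleTaskGP
    from botorch.fit import fit_gpytorch_mll
    from botorch.acquisition import ExpectedImprovement
    from botorch.optim import optimize_acqf
    from gpytorch.mlls import ExactMarginalLogLikelihood

    def objective(x: torch.Tensor) -> torch.Tensor:
        # Toy 1-D function to maximize (stand-in for an expensive experiment).
        return torch.sin(3 * x) + 0.5 * x

    bounds = torch.tensor([[0.0], [2.0]], dtype=torch.double)  # search space [0, 2]
    train_x = torch.rand(5, 1, dtype=torch.double) * 2         # initial random designs
    train_y = objective(train_x)

    for _ in range(10):
        # 1. Fit a Gaussian process surrogate to the observations so far.
        gp = SingleTaskGP(train_x, train_y)
        mll = ExactMarginalLogLikelihood(gp.likelihood, gp)
        fit_gpytorch_mll(mll)

        # 2. Maximize the expected improvement acquisition function over the bounds.
        ei = ExpectedImprovement(gp, best_f=train_y.max())
        candidate, _ = optimize_acqf(ei, bounds=bounds, q=1, num_restarts=5, raw_samples=64)

        # 3. Evaluate the objective at the suggested point and grow the dataset.
        train_x = torch.cat([train_x, candidate])
        train_y = torch.cat([train_y, objective(candidate)])

    print("best observed value:", train_y.max().item())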




Experimentation for Engineers


Book Description

Optimize the performance of your systems with practical experiments used by engineers in the world’s most competitive industries.

In Experimentation for Engineers: From A/B testing to Bayesian optimization you will learn how to:
• Design, run, and analyze an A/B test
• Break the “feedback loops” caused by periodic retraining of ML models
• Increase experimentation rate with multi-armed bandits
• Tune multiple parameters experimentally with Bayesian optimization
• Clearly define business metrics used for decision-making
• Identify and avoid the common pitfalls of experimentation

Experimentation for Engineers: From A/B testing to Bayesian optimization is a toolbox of techniques for evaluating new features and fine-tuning parameters. You’ll start with a deep dive into methods like A/B testing, and then graduate to advanced techniques used to measure performance in industries such as finance and social media. Learn how to evaluate the changes you make to your system and ensure that your testing doesn’t undermine revenue or other business metrics. By the time you’re done, you’ll be able to seamlessly deploy experiments in production while avoiding common pitfalls.

About the technology
Does my software really work? Did my changes make things better or worse? Should I trade features for performance? Experimentation is the only way to answer questions like these. This unique book reveals sophisticated experimentation practices developed and proven in the world’s most competitive industries that will help you enhance machine learning systems, software applications, and quantitative trading solutions.

About the book
Experimentation for Engineers: From A/B testing to Bayesian optimization delivers a toolbox of processes for optimizing software systems. You’ll start by learning the limits of A/B testing, and then graduate to advanced experimentation strategies that take advantage of machine learning and probabilistic methods. The skills you’ll master in this practical guide will help you minimize the costs of experimentation and quickly reveal which approaches and features deliver the best business results.

What's inside
• Design, run, and analyze an A/B test
• Break the “feedback loops” caused by periodic retraining of ML models
• Increase experimentation rate with multi-armed bandits
• Tune multiple parameters experimentally with Bayesian optimization

About the reader
For ML and software engineers looking to extract the most value from their systems. Examples in Python and NumPy.

About the author
David Sweet has worked as a quantitative trader at GETCO and a machine learning engineer at Instagram. He teaches in the AI and Data Science master's programs at Yeshiva University.

Table of Contents
1 Optimizing systems by experiment
2 A/B testing: Evaluating a modification to your system
3 Multi-armed bandits: Maximizing business metrics while experimenting
4 Response surface methodology: Optimizing continuous parameters
5 Contextual bandits: Making targeted decisions
6 Bayesian optimization: Automating experimental optimization
7 Managing business metrics
8 Practical considerations
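The book's examples are in Python and NumPy; as a flavor of the A/B testing material it starts from, here is a minimal sketch of a two-proportion z-test on conversion rates (SciPy is used here for the normal tail probability). The counts and the two-sided test setup are illustrative assumptions, not taken from the book.

    # Minimal two-proportion z-test for an A/B test on conversion rates.
    # The counts below are made-up illustration data.
    import numpy as np
    from scipy.stats import norm

    conversions = np.array([412, 459])   # conversions in arms A and B
    visitors = np.array([10000, 10000])  # visitors exposed to each arm

    rates = conversions / visitors
    pooled = conversions.sum() / visitors.sum()
    se = np.sqrt(pooled * (1 - pooled) * (1 / visitors[0] + 1 / visitors[1]))

    z = (rates[1] - rates[0]) / se       # standardized lift of B over A
    p_value = 2 * norm.sf(abs(z))        # two-sided p-value

    print(f"lift: {rates[1] - rates[0]:.4f}, z = {z:.2f}, p = {p_value:.4f}")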




Bayesian Optimization


Book Description

A comprehensive introduction to Bayesian optimization that starts from scratch and carefully develops all the key ideas along the way.




Surrogates


Book Description

Computer simulation experiments are essential to modern scientific discovery, whether in physics, chemistry, biology, epidemiology, ecology, or engineering. Surrogates are meta-models of computer simulations, which are used to solve mathematical models too intricate to be worked by hand. Gaussian process (GP) regression is a supremely flexible tool for the analysis of computer simulation experiments. This book presents an applied introduction to GP regression for modelling and optimization of computer simulation experiments.

Features:
• Emphasis on methods, applications, and reproducibility.
• R code is integrated throughout for application of the methods.
• Includes more than 200 full colour figures.
• Includes many exercises to supplement understanding, with separate solutions available from the author.
• Supported by a website with full code available to reproduce all methods and examples.

The book is primarily designed as a textbook for postgraduate students from mathematics, statistics, computer science, and engineering who are studying GP regression. Given the breadth of examples, it could also be used by researchers from these fields, as well as from economics, life science, and social science.
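The book's code is in R; purely to illustrate the GP-regression-as-surrogate workflow it covers, here is a minimal sketch in Python with scikit-learn. The toy simulator, design size, and kernel settings are assumptions made for this example, not material from the book.

    # Minimal GP surrogate for a toy "computer simulation": fit on a few runs,
    # then predict (with uncertainty) at untried inputs. Illustrative only.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    def simulator(x):
        # Stand-in for an expensive computer model.
        return np.sin(2 * np.pi * x).ravel()

    rng = np.random.default_rng(0)
    X_train = rng.uniform(0, 1, size=(8, 1))     # a small design of simulation runs
    y_train = simulator(X_train)

    kernel = ConstantKernel(1.0) * RBF(length_scale=0.2)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(X_train, y_train)

    X_new = np.linspace(0, 1, 5).reshape(-1, 1)
    mean, std = gp.predict(X_new, return_std=True)  # surrogate prediction + uncertainty
    for x, m, s in zip(X_new.ravel(), mean, std):
        print(f"x = {x:.2f}: predicted {m:+.3f} ± {s:.3f}")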




Learning to Learn


Book Description

Over the past three decades or so, research on machine learning and data mining has led to a wide variety of algorithms that learn general functions from experience. As machine learning matures, it has begun to make the successful transition from academic research to practical applications. Generic techniques such as decision trees and artificial neural networks, for example, are now being used in various commercial and industrial settings.

Learning to Learn is an exciting new research direction within machine learning. Like traditional machine-learning algorithms, the methods described in Learning to Learn induce general functions from experience. However, the book investigates algorithms that can change the way they generalize, i.e., practice the task of learning itself, and improve on it.

To illustrate the utility of learning to learn, it is worthwhile comparing machine learning with human learning. Humans encounter a continual stream of learning tasks. They do not just learn concepts or motor skills; they also learn bias, i.e., they learn how to generalize. As a result, humans are often able to generalize correctly from extremely few examples; often a single example is enough to learn something new.

A deeper understanding of computer programs that improve their ability to learn can have a large practical impact on the field of machine learning and beyond. In recent years, the field has made significant progress towards a theory of learning to learn, along with practical new algorithms, some of which have led to impressive results in real-world applications. Learning to Learn provides a survey of some of the most exciting new research approaches, written by leading researchers in the field. Its objective is to investigate the utility and feasibility of computer programs that can learn how to learn, from both a practical and a theoretical point of view.




Designing Deep Learning Systems


Book Description

A vital guide to building the platforms and systems that bring deep learning models to production.

In Designing Deep Learning Systems you will learn how to:
• Transfer your software development skills to deep learning systems
• Recognize and solve common engineering challenges for deep learning systems
• Understand the deep learning development cycle
• Automate training for models in TensorFlow and PyTorch
• Optimize dataset management, training, model serving, and hyperparameter tuning
• Pick the right open-source project for your platform

Deep learning systems are the components and infrastructure essential to supporting a deep learning model in a production environment. Written especially for software engineers with minimal knowledge of deep learning’s design requirements, Designing Deep Learning Systems is full of hands-on examples that will help you transfer your software development skills to creating these deep learning platforms. You’ll learn how to build automated and scalable services for core tasks like dataset management, model training/serving, and hyperparameter tuning. This book is the perfect way to step into an exciting—and lucrative—career as a deep learning engineer.

About the technology
To be practically usable, a deep learning model must be built into a software platform. As a software engineer, you need a deep understanding of deep learning to create such a system. This book gives you that depth.

About the book
Designing Deep Learning Systems: A software engineer's guide teaches you everything you need to design and implement a production-ready deep learning platform. First, it presents the big picture of a deep learning system from the developer’s perspective, including its major components and how they are connected. Then, it carefully guides you through the engineering methods you’ll need to build your own maintainable, efficient, and scalable deep learning platforms.

What's inside
• The deep learning development cycle
• Automate training in TensorFlow and PyTorch
• Dataset management, model serving, and hyperparameter tuning
• A hands-on deep learning lab

About the reader
For software developers and engineering-minded data scientists. Examples in Java and Python.

About the author
Chi Wang is a principal software developer in the Salesforce Einstein group. Donald Szeto was the co-founder and CTO of PredictionIO.

Table of Contents
1 An introduction to deep learning systems
2 Dataset management service
3 Model training service
4 Distributed training
5 Hyperparameter optimization service
6 Model serving design
7 Model serving in practice
8 Metadata and artifact store
9 Workflow orchestration
10 Path to production
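To hint at what one of the services above automates, here is a minimal sketch of the random-search loop at the heart of many hyperparameter optimization services. The search space, trial budget, and the train_and_evaluate stub are hypothetical placeholders, not code from the book; a real service would schedule these trials across workers and record results in a metadata store.

    # Minimal random-search loop of the kind a hyperparameter optimization
    # service schedules across many workers. All names are illustrative placeholders.
    import math
    import random

    SEARCH_SPACE = {
        "learning_rate": (1e-5, 1e-1),    # sampled log-uniformly
        "batch_size": [16, 32, 64, 128],  # sampled uniformly from choices
        "dropout": (0.0, 0.5),            # sampled uniformly
    }

    def sample_config():
        lo, hi = SEARCH_SPACE["learning_rate"]
        return {
            "learning_rate": 10 ** random.uniform(math.log10(lo), math.log10(hi)),
            "batch_size": random.choice(SEARCH_SPACE["batch_size"]),
            "dropout": random.uniform(*SEARCH_SPACE["dropout"]),
        }

    def train_and_evaluate(config):
        # Stand-in for launching a training job and reporting validation accuracy.
        return random.random()

    best_config, best_score = None, float("-inf")
    for trial in range(20):               # trial budget a real service would manage
        config = sample_config()
        score = train_and_evaluate(config)
        if score > best_score:
            best_config, best_score = config, score

    print("best trial:", best_config, "score:", round(best_score, 3))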




Advanced Python Programming


Book Description

Create distributed applications with clever design patterns to solve complex problems

Key Features
• Set up and run distributed algorithms on a cluster using Dask and PySpark
• Master skills to accurately implement concurrency in your code
• Gain practical experience of Python design patterns with real-world examples

Book Description
This Learning Path shows you how to leverage the power of both native and third-party Python libraries for building robust and responsive applications. You will learn about profilers and reactive programming, concurrency and parallelism, as well as tools for making your apps quick and efficient. You will discover how to write code for parallel architectures using TensorFlow and Theano, and use a cluster of computers for large-scale computations using technologies such as Dask and PySpark. With the knowledge of how Python design patterns work, you will be able to clone objects, secure interfaces, dynamically choose algorithms, and accomplish much more in high performance computing. By the end of this Learning Path, you will have the skills and confidence to build engaging models that quickly offer efficient solutions to your problems.

This Learning Path includes content from the following Packt products:
• Python High Performance - Second Edition by Gabriele Lanaro
• Mastering Concurrency in Python by Quan Nguyen
• Mastering Python Design Patterns by Sakis Kasampalis

What you will learn
• Use NumPy and pandas to import and manipulate datasets
• Achieve native performance with Cython and Numba
• Write asynchronous code using asyncio and RxPY
• Design highly scalable programs with application scaffolding
• Explore abstract methods to maintain data consistency
• Clone objects using the prototype pattern
• Use the adapter pattern to make incompatible interfaces compatible
• Employ the strategy pattern to dynamically choose an algorithm

Who this book is for
This Learning Path is specially designed for Python developers who want to build high-performance applications and learn about single-core and multi-core programming, distributed concurrency, and Python design patterns. Some experience with the Python programming language will help you get the most out of this Learning Path.
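Since the description above mentions employing the strategy pattern to dynamically choose an algorithm, here is a minimal sketch of that pattern in Python. The sorting strategies and data are illustrative assumptions, not code from the Learning Path.

    # Minimal strategy pattern: select a sorting strategy at runtime.
    # The strategies and data here are illustrative placeholders.
    from typing import Callable, List

    SortStrategy = Callable[[List[int]], List[int]]

    def quick_sort(data: List[int]) -> List[int]:
        if len(data) <= 1:
            return data
        pivot, rest = data[0], data[1:]
        return (quick_sort([x for x in rest if x < pivot])
                + [pivot]
                + quick_sort([x for x in rest if x >= pivot]))

    def insertion_sort(data: List[int]) -> List[int]:
        result: List[int] = []
        for value in data:
            pos = len([x for x in result if x <= value])
            result.insert(pos, value)
        return result

    def sort_data(data: List[int], strategy: SortStrategy) -> List[int]:
        # The caller picks the algorithm; sort_data does not care which one.
        return strategy(data)

    data = [5, 2, 9, 1, 7]
    # Choose the strategy dynamically, e.g. based on input size.
    strategy = insertion_sort if len(data) < 10 else quick_sort
    print(sort_data(data, strategy))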




Gaussian Processes for Machine Learning


Book Description

A comprehensive and self-contained introduction to Gaussian processes (GPs), which provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of their theoretical and practical aspects. The treatment is targeted at researchers and students in machine learning and applied statistics.

The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed from both a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support vector machines, neural networks, splines, regularization networks, relevance vector machines, and others. Theoretical issues, including learning curves and the PAC-Bayesian framework, are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.
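The regression chapters of the book center on the GP posterior predictive distribution; in standard notation (our summary of a textbook-standard result, not a quotation from the book), for training inputs X, noisy targets y, and a test input x_*:

    \begin{aligned}
    p(f_* \mid X, y, x_*) &= \mathcal{N}\!\left(\bar{f}_*,\; \operatorname{var}(f_*)\right), \\
    \bar{f}_* &= k_*^\top \left(K + \sigma_n^2 I\right)^{-1} y, \\
    \operatorname{var}(f_*) &= k(x_*, x_*) - k_*^\top \left(K + \sigma_n^2 I\right)^{-1} k_*,
    \end{aligned}

where K is the covariance matrix of the training inputs under the chosen kernel, k_* is the vector of covariances between x_* and the training inputs, and \sigma_n^2 is the observation noise variance.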




Bayesian Methods for Hackers


Book Description

Master Bayesian Inference through Practical Examples and Computation–Without Advanced Mathematical Analysis

Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making the subject inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power.

Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples and intuitive explanations that have been refined after extensive user feedback. You’ll learn how to use the Markov Chain Monte Carlo algorithm, choose appropriate sample sizes and priors, work with loss functions, and apply Bayesian inference in domains ranging from finance to marketing. Once you’ve mastered these techniques, you’ll constantly turn to this guide for the working PyMC code you need to jumpstart future projects.

Coverage includes
• Learning the Bayesian “state of mind” and its practical implications
• Understanding how computers perform Bayesian inference
• Using the PyMC Python library to program Bayesian analyses
• Building and debugging models with PyMC
• Testing your model’s “goodness of fit”
• Opening the “black box” of the Markov Chain Monte Carlo algorithm to see how and why it works
• Leveraging the power of the “Law of Large Numbers”
• Mastering key concepts, such as clustering, convergence, autocorrelation, and thinning
• Using loss functions to measure an estimate’s weaknesses based on your goals and desired outcomes
• Selecting appropriate priors and understanding how their influence changes with dataset size
• Overcoming the “exploration versus exploitation” dilemma: deciding when “pretty good” is good enough
• Using Bayesian inference to improve A/B testing
• Solving data science problems when only small amounts of data are available

Cameron Davidson-Pilon has worked in many areas of applied mathematics, from the evolutionary dynamics of genes and diseases to stochastic modeling of financial prices. His contributions to the open source community include lifelines, an implementation of survival analysis in Python. Educated at the University of Waterloo and at the Independent University of Moscow, he currently works with the online commerce leader Shopify.
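To give a flavor of the probabilistic programming approach described above, here is a minimal PyMC sketch that infers a conversion rate from binary data, in the spirit of the book's A/B testing coverage. It assumes a recent PyMC release (the book's editions target earlier versions of the library), and the data are fabricated for illustration.

    # Minimal Bayesian inference with PyMC: estimate a conversion rate.
    # Assumes a recent PyMC release; the data below are fabricated for illustration.
    import numpy as np
    import pymc as pm

    rng = np.random.default_rng(0)
    observations = rng.binomial(n=1, p=0.05, size=1500)  # simulated click/no-click data

    with pm.Model():
        p = pm.Beta("p", alpha=1, beta=1)                 # uniform prior on the rate
        pm.Bernoulli("clicks", p=p, observed=observations)
        idata = pm.sample(1000, tune=1000, chains=2)      # MCMC posterior samples

    posterior_p = idata.posterior["p"].values.ravel()
    print("posterior mean:", posterior_p.mean())
    print("94% interval:", np.quantile(posterior_p, [0.03, 0.97]))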