A Primer on Physical-Layer Network Coding


Book Description

The concept of physical-layer network coding (PNC) was proposed in 2006 for application in wireless networks. Since then it has developed into a subfield of communications and networking with a wide following. This book is a primer on PNC. It is the outcome of a set of lecture notes for a course for beginning graduate students at The Chinese University of Hong Kong. The target audience is expected to have some prior background knowledge in communication theory and wireless communications, but not working knowledge at the research level. Indeed, a goal of this book/course is to allow the reader to gain a deeper appreciation of the various nuances of wireless communications and networking by focusing on problems arising from the study of PNC. Specifically, we introduce the tools and techniques needed to solve problems in PNC, and many of these tools and techniques are drawn from the more general disciplines of signal processing, communications, and networking: PNC is used as a pivot to learn about the fundamentals of signal processing techniques and wireless communications in general. We feel that such a problem-centric approach will give the reader a more in-depth understanding of these disciplines and allow him/her to see first-hand how the techniques of these disciplines can be applied to solve real research problems. As a primer, this book does not cover many advanced materials related to PNC. PNC is an active research field and many new results will no doubt be forthcoming in the near future. We believe that this book will provide a good contextual framework for the interpretation of these advanced results should the reader decide to probe further into the field of PNC.




Communications in Interference Limited Networks


Book Description

This book offers means of handling interference, a central problem in the operation of wireless networks. It investigates centralized and decentralized methods to avoid and handle interference, as well as approaches that resolve interference constructively. The latter type of approach tries to solve the joint detection and estimation problem for several data streams that share a common medium. In fact, an exciting insight into the operation of networks is that it may be beneficial, in terms of overall throughput, to actively create and manage interference. Thus, when handled properly, "mixing" of data in networks becomes a useful operational tool rather than the nuisance it has traditionally been treated as. With mobile, robust, ubiquitous, reliable, and instantaneous communication being a driving and enabling factor of an information-centric economy, the understanding, mitigation, and exploitation of interference in networks must be seen as a centrally important task.




Network Connectivity


Book Description

Networks naturally appear in many high-impact domains, ranging from social network analysis to disease dissemination studies to infrastructure system design. Within network studies, network connectivity plays an important role in a myriad of applications. The diversity of application areas has spurred numerous connectivity measures, each designed for specific tasks. Depending on the complexity of a connectivity measure, the computational cost of calculating the connectivity score can vary significantly. Moreover, the complexity of the connectivity measure largely determines the hardness of connectivity optimization, a fundamental problem in network connectivity studies. This book presents a thorough study of network connectivity, including its concepts, computation, and optimization. Specifically, a unified connectivity measure model is introduced to unveil the commonality among existing connectivity measures. For the connectivity computation aspect, the authors introduce connectivity tracking problems and present several effective connectivity inference frameworks under different network settings. Taking the connectivity optimization perspective, the book analyzes the problem theoretically and introduces an approximation framework to effectively optimize network connectivity. Lastly, the book discusses the new research frontiers and directions to explore for network connectivity studies. This book is an accessible introduction to the study of connectivity in complex networks. It is essential reading for advanced undergraduates, Ph.D. students, as well as researchers and practitioners who are interested in graph mining, data mining, and machine learning.
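
To make the notion of a connectivity score concrete, the following minimal sketch (in Python) computes one widely used measure, the leading eigenvalue of the adjacency matrix; this particular measure and the toy graphs are illustrative choices, not necessarily the unified model developed in the book.

    import numpy as np

    def leading_eigenvalue(adj):
        """Connectivity score of an undirected graph: the largest
        eigenvalue of its (symmetric) adjacency matrix."""
        return float(np.max(np.linalg.eigvalsh(np.asarray(adj, dtype=float))))

    # Toy example: a 4-node path graph vs. a 4-node cycle.
    path = [[0, 1, 0, 0],
            [1, 0, 1, 0],
            [0, 1, 0, 1],
            [0, 0, 1, 0]]
    cycle = [[0, 1, 0, 1],
             [1, 0, 1, 0],
             [0, 1, 0, 1],
             [1, 0, 1, 0]]
    print(leading_eigenvalue(path))   # ~1.618
    print(leading_eigenvalue(cycle))  # 2.0; the better-connected graph scores higher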




Communication Networks


Book Description

This book results from many years of teaching an upper division course on communication networks in the EECS department at the University of California, Berkeley. It is motivated by the perceived need for an easily accessible textbook that puts emphasis on the core concepts behind current and next generation networks. After an overview of how today's Internet works and a discussion of the main principles behind its architecture, we discuss the key ideas behind Ethernet, WiFi networks, routing, internetworking, and TCP. This is followed by a brief discussion of mathematical models that provide insight into the operation of network protocols. Next, the main ideas behind the new generation of wireless networks based on LTE and the notion of QoS are presented. A concise discussion of the physical layer technologies underlying various networks is also included. To make the book as self-contained as possible, brief discussions of probability and Markov chain concepts are included in the appendices. Finally, a sampling of topics is presented that may have significant influence on the future evolution of networks, including overlay networks such as content delivery and peer-to-peer networks, sensor networks, distributed algorithms, Byzantine agreement, source compression, SDN and NFV, and the Internet of Things.




Analytical Methods for Network Congestion Control


Book Description

The congestion control mechanism has been responsible for maintaining stability as the Internet scaled up by many orders of magnitude in size, speed, traffic volume, coverage, and complexity over the last three decades. In this book, we develop a coherent theory of congestion control from the ground up to help understand and design these algorithms. We model network traffic as fluids that flow from sources to destinations and model congestion control algorithms as feedback dynamical systems. We show that the model is well defined, characterize its equilibrium points, and prove their stability. We will use several real protocols for illustration, but the emphasis will be on various mathematical techniques for algorithm analysis. Specifically, we are interested in four questions: 1. How are congestion control algorithms modeled? 2. Are the models well defined? 3. How are the equilibrium points of a congestion control model characterized? 4. How is the stability of these equilibrium points analyzed? For each topic, we first present the analytical tools, drawn from convex optimization, control and dynamical systems (including the Lyapunov and Nyquist stability theorems), and projection and contraction theorems. We then apply these basic tools to congestion control algorithms and rigorously prove their equilibrium and stability properties. A notable feature of this book is the careful treatment of projected dynamics, which introduces discontinuity into our differential equations. Even though our development is carried out in the context of congestion control, the set of system-theoretic tools employed, and the process of understanding a physical system, building mathematical models, and analyzing these models for insights, have a much wider applicability than to congestion control.
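
As a flavor of the fluid-model viewpoint, the following minimal sketch simulates a primal-style congestion control model for several sources sharing a single link; the price function, gains, and parameter values are illustrative assumptions rather than any specific algorithm analyzed in the book.

    import numpy as np

    # Illustrative primal-style fluid model for N sources sharing one link
    # of capacity c:  dx_i/dt = k * (w_i - x_i * p(y)),  y = sum_i x_i,
    # with an assumed price function p(y) = max(y - c, 0) / c.
    # At equilibrium, x_i * p(y*) = w_i for every source.
    k, c = 0.5, 10.0
    w = np.array([1.0, 2.0, 3.0])           # willingness-to-pay of each source
    x = np.array([0.1, 0.1, 0.1])           # initial sending rates
    dt = 0.01
    for _ in range(20000):                  # forward-Euler integration of the ODE
        y = x.sum()
        price = max(y - c, 0.0) / c
        x = np.maximum(x + dt * k * (w - x * price), 0.0)
    print("rates:", x, "total:", x.sum())   # rates settle near the equilibrium point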




Diffusion Source Localization in Large Networks


Book Description

Diffusion processes in large networks have been used to model many real-world phenomena, including how rumors spread on the Internet, epidemics among human beings, emotional contagion through social networks, and even gene regulatory processes. Fundamental estimation principles and efficient algorithms for locating diffusion sources can answer a wide range of important questions, such as identifying the source of a widely spread rumor on online social networks. This book provides an overview of recent progress on source localization in large networks, focusing on theoretical principles and fundamental limits. The book covers both discrete-time diffusion models and continuous-time diffusion models. For discrete-time diffusion models, the book focuses on the Jordan infection center; for continuous-time diffusion models, it focuses on the rumor center. Most theoretical results on source localization are based on these two types of estimators or their variants. This book also includes algorithms that leverage partial-time information for source localization and a brief discussion of interesting unresolved problems in this area.
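
For a concrete sense of the discrete-time estimator, the sketch below computes the Jordan infection center of a toy graph, i.e., the node minimizing the maximum shortest-path distance to the observed infected nodes; the brute-force BFS implementation and the example graph are purely illustrative.

    from collections import deque

    def bfs_dist(graph, src):
        """Shortest-path (hop) distances from src in an unweighted graph."""
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in graph[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist

    def jordan_infection_center(graph, infected):
        """Node minimizing the maximum distance to the observed infected nodes."""
        best, best_ecc = None, float("inf")
        for u in graph:
            d = bfs_dist(graph, u)
            ecc = max(d.get(v, float("inf")) for v in infected)
            if ecc < best_ecc:
                best, best_ecc = u, ecc
        return best, best_ecc

    # Toy example: a small tree; nodes 3, 4, 5 are observed to be infected.
    g = {0: [1, 2], 1: [0, 3, 4], 2: [0, 5], 3: [1], 4: [1], 5: [2]}
    print(jordan_infection_center(g, {3, 4, 5}))   # -> (0, 2) for this toy graph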




Modeling and Optimization in Software-Defined Networks


Book Description

This book provides a quick reference and insights into modeling and optimization of software-defined networks (SDNs). It covers various algorithms and approaches that have been developed for optimizations related to the control plane, the considerable research related to data plane optimization, and topics that have significant potential for research and advances to the state of the art in SDN. Over the past ten years, network programmability has transitioned from research concepts to more mainstream technology through the advent of technologies such as service chaining, virtual network functions, and programmable data planes. However, the rapid development of SDN technologies has been the key driver behind this evolution. The logically centralized abstraction of network states enabled by SDN facilitates programmability and the use of sophisticated optimization and control algorithms for enhancing network performance, policy management, and security. Furthermore, the centralized aggregation of network telemetry facilitates the use of data-driven, machine learning-based methods. To fully unleash the power of this new SDN paradigm, though, various architectural design, deployment, and operations questions need to be addressed. Associated with these are various modeling, resource allocation, and optimization opportunities. The book covers these opportunities and associated challenges, which represent a "call to arms" for the SDN community to develop new modeling and optimization methods that will complement or improve on the current norms.




Multi-Armed Bandits


Book Description

Multi-armed bandit problems pertain to optimal sequential decision making and learning in unknown environments. Since the first bandit problem was posed by Thompson in 1933 in the context of clinical trials, bandit problems have enjoyed lasting attention from multiple research communities and have found a wide range of applications across diverse domains. This book covers classic results and recent developments on both Bayesian and frequentist bandit problems. We start in Chapter 1 with a brief overview of the history of bandit problems, contrasting the two schools of approaches, Bayesian and frequentist, and highlighting foundational results and key applications. Chapters 2 and 4 cover, respectively, the canonical Bayesian and frequentist bandit models. In Chapters 3 and 5, we discuss major variants of the canonical bandit models that lead to new directions, bring in new techniques, and broaden the applications of this classical problem. In Chapter 6, we present several representative application examples in communication networks and social-economic systems, aiming to illuminate the connections between the Bayesian and the frequentist formulations of bandit problems and how structural results pertaining to one may be leveraged to obtain solutions under the other.
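
As a small illustration of the Bayesian school, the following sketch runs Thompson sampling on a Bernoulli bandit with Beta priors; the arm means, horizon, and prior choice are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    true_means = np.array([0.3, 0.5, 0.7])        # unknown to the learner
    n_arms, horizon = len(true_means), 5000
    alpha = np.ones(n_arms)                       # Beta(1, 1) priors on each arm's mean
    beta = np.ones(n_arms)
    pulls = np.zeros(n_arms, dtype=int)

    for _ in range(horizon):
        theta = rng.beta(alpha, beta)             # sample one belief per arm
        arm = int(np.argmax(theta))               # play the arm that looks best
        reward = rng.random() < true_means[arm]   # Bernoulli reward
        alpha[arm] += reward                      # posterior update
        beta[arm] += 1 - reward
        pulls[arm] += 1

    print(pulls)    # most pulls should concentrate on the best arm (index 2)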




Poisson Line Cox Process


Book Description

This book provides a comprehensive treatment of the Poisson line Cox process (PLCP) and its applications to vehicular networks. The PLCP is constructed by placing points on each line of a Poisson line process (PLP) as per an independent Poisson point process (PPP). For vehicular applications, one can imagine the layout of the road network as a PLP and the vehicles on the roads as the points of the PLCP. First, a brief historical account of the evolution of the theory of PLP is provided to familiarize readers with the seminal contributions in this area. In order to provide a self-contained treatment of this topic, the construction and key fundamental properties of both PLP and PLCP are discussed in detail. The rest of the book is devoted to the applications of these models to a variety of wireless networks, including vehicular communication networks and localization networks. Specifically, modeling the locations of vehicular nodes and roadside units (RSUs) using PLCP, the signal-to-interference-plus-noise ratio (SINR)-based coverage analysis is presented for both ad hoc and cellular network models. For a similar setting, the load on the cellular macro base stations (MBSs) and RSUs in a vehicular network is also characterized analytically. For the localization networks, PLP is used to model blockages, which is shown to facilitate the characterization of asymptotic blind spot probability in a localization application. Finally, the path distance characteristics for a special case of PLCP are analyzed, which can be leveraged to answer critical questions in the areas of transportation networks and urban planning. The book is concluded with concrete suggestions on future directions of research. Based largely on the original research of the authors, this is the first book that specifically focuses on the self-contained mathematical treatment of the PLCP. The ideal audience of this book is graduate students as well as researchers in academia and industry who are familiar with probability theory, have some exposure to point processes, and are interested in the field of stochastic geometry and vehicular networks. Given the diverse backgrounds of the potential readers, the focus has been on providing an accessible and pedagogical treatment of this topic by consciously avoiding the measure theoretic details without compromising mathematical rigor.
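
The construction described above can be simulated directly. The sketch below generates a PLCP inside a disc of radius R using the standard (theta, rho) line parameterization; the intensity values and their normalization are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    R = 10.0            # radius of the observation disc
    lam_line = 0.3      # intensity of the generating PPP on the (theta, rho) strip (assumed)
    lam_point = 2.0     # linear density of points (e.g., vehicles) on each line (assumed)

    # Lines hitting the disc are those with |rho| < R; under this parameterization
    # their count is Poisson with mean lam_line * pi * 2R.
    n_lines = rng.poisson(lam_line * np.pi * 2 * R)
    theta = rng.uniform(0.0, np.pi, n_lines)     # angle of the normal to each line
    rho = rng.uniform(-R, R, n_lines)            # signed distance of each line from the origin

    points = []
    for th, r in zip(theta, rho):
        half_chord = np.sqrt(R**2 - r**2)        # half of the line's chord inside the disc
        n_pts = rng.poisson(lam_point * 2 * half_chord)
        t = rng.uniform(-half_chord, half_chord, n_pts)   # 1D PPP along the line
        x = r * np.cos(th) - t * np.sin(th)
        y = r * np.sin(th) + t * np.cos(th)
        points.append(np.column_stack([x, y]))

    plcp = np.vstack(points) if points else np.empty((0, 2))
    print(n_lines, "lines,", len(plcp), "PLCP points in the disc")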




Age of Information


Book Description

Information usually has the highest value when it is fresh. For example, real-time knowledge about the location, orientation, and speed of motor vehicles is imperative in autonomous driving, and access to timely information about stock prices and interest rate movements is essential for developing trading strategies on the stock market. The Age of Information (AoI) concept, together with its recent extensions, provides a means of quantifying the freshness of information and an opportunity to improve the performance of real-time systems and networks. Recent research advances on AoI suggest that many well-known design principles of traditional data networks (e.g., providing high throughput and low delay) need to be re-examined to enhance information freshness in rapidly emerging real-time applications. This book provides a suite of analytical tools and insightful results on the generation of information-update packets at the source nodes and the design of network protocols that forward the packets to their destinations. The book also points out interesting connections between the AoI concept and information theory, signal processing, and control theory, which are worthy of future investigation.
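
As a concrete illustration of the AoI metric, the sketch below evaluates the instantaneous age, i.e., the current time minus the generation time of the freshest update delivered so far, over a toy sample path of update packets; the timestamps are illustrative.

    import numpy as np

    def age_of_information(gen_times, delivery_times, t_grid):
        """Instantaneous AoI: at time t, age = t - (generation time of the
        freshest update delivered by time t)."""
        ages = np.full_like(t_grid, np.nan, dtype=float)
        for i, t in enumerate(t_grid):
            delivered = [g for g, d in zip(gen_times, delivery_times) if d <= t]
            if delivered:
                ages[i] = t - max(delivered)
        return ages

    # Illustrative sample path: updates generated at times gen, delivered at dlv.
    gen = [0.0, 1.0, 2.5, 4.0]
    dlv = [0.5, 1.8, 3.0, 4.4]
    t = np.linspace(0.5, 6.0, 551)
    age = age_of_information(gen, dlv, t)
    # On a uniform grid, the sample mean approximates the time-average AoI
    # of the familiar sawtooth age process.
    print("time-average AoI ~", age.mean())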