Subsampling


Book Description

Since Efron's profound paper on the bootstrap, an enormous amount of effort has been spent on the development of the bootstrap, jackknife, and other resampling methods. The primary goal of these computer-intensive methods has been to provide statistical tools that work in complex situations without imposing unrealistic or unverifiable assumptions about the data generating mechanism. This book sets out to lay some of the foundations for subsampling methodology and related methods.
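The basic subsampling recipe the book builds on (compute a statistic on many small subsamples drawn without replacement, then rescale its spread back to the full sample size) can be sketched in a few lines. This is an illustrative toy for the sample mean, not the book's general theory:

```python
import random
import statistics

def subsample_std_error(data, b, n_subsamples=500, seed=0):
    """Estimate the standard error of the sample mean by the subsampling
    recipe: compute the statistic on many size-b subsamples drawn without
    replacement, then rescale its spread from size b back to size n."""
    rng = random.Random(seed)
    n = len(data)
    means = [statistics.mean(rng.sample(data, b)) for _ in range(n_subsamples)]
    # A mean of b observations varies like 1/sqrt(b); rescale to 1/sqrt(n).
    return statistics.stdev(means) * (b / n) ** 0.5

# Sanity check against the classical formula s / sqrt(n) on synthetic data.
rng = random.Random(1)
data = [rng.gauss(0, 1) for _ in range(400)]
print(subsample_std_error(data, b=40))
print(statistics.stdev(data) / len(data) ** 0.5)
```

The two printed values should be close; the appeal of the subsampling version is that it needs no formula for the statistic's variance, only the ability to recompute the statistic on subsamples.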




Evaluating the effect of within-household subsampling on the precision of crime victimization rates


Book Description

The decision to select a subsample of eligible members of a sampled household is influenced by a number of factors including burden on the household, data quality, cost, and the sampling variance of survey estimates. Design effects quantify the influence of a complex sampling design on the variance of survey estimates. Selecting a subsample of eligible persons within a sampled household can have counteracting impacts on design effects. On one hand, subsampling increases the design effects attributable to unequal weighting. On the other hand, subsampling could reduce the design effects attributable to clustering because the potential intra-household correlation among respondents in the same household may be reduced or eliminated. If the reduction in correlation is greater than the increase caused by unequal weighting, subsampling can achieve the same sampling variance as selecting all eligible household members, with less cost and burden. We present the results of a simulation study that evaluates the design effects associated with subsampling household members on personal victimization rates based on the 2008 National Crime Victimization Survey, which selected all persons 12 and older in a sampled household.
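The trade-off described above can be illustrated numerically. A common approximation multiplies Kish's unequal-weighting design effect by a clustering design effect of 1 + (m - 1)ρ; the household sizes, weights, and intra-household correlation ρ below are hypothetical, not figures from the study:

```python
def deff_weighting(weights):
    """Kish's design effect from unequal weighting: n * sum(w^2) / (sum w)^2."""
    n = len(weights)
    return n * sum(w * w for w in weights) / sum(weights) ** 2

def deff_clustering(avg_respondents_per_household, rho):
    """Design effect from clustering: 1 + (m - 1) * rho."""
    return 1 + (avg_respondents_per_household - 1) * rho

rho = 0.3  # hypothetical intra-household correlation

# Interview all eligible members: equal weights, full clustering (m = 2.5).
all_members = deff_weighting([1.0] * 100) * deff_clustering(2.5, rho)

# Subsample one member per household: weights proportional to household
# size (inverse selection probability), but no within-household clustering.
hh_sizes = [1, 2, 2, 3, 4] * 20
weights = [float(s) for s in hh_sizes]
one_member = deff_weighting(weights) * deff_clustering(1.0, rho)

print(round(all_members, 3), round(one_member, 3))
```

In this hypothetical configuration the penalty from unequal weighting (about 1.18) is smaller than the clustering penalty it eliminates (1.45), so one-per-household subsampling lowers the variance; with a weaker intra-household correlation the comparison can flip, which is exactly what the simulation study evaluates.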




Digital Subsampling Phase Lock Techniques for Frequency Synthesis and Polar Transmission


Book Description

This book explains the concepts behind fractional subsampling-based frequency synthesis, which is reshaping the state of the art in low-noise LO generation. It covers advanced material, giving clear guidance for the development of background-calibrated systems capable of spur-free synthesis and wideband phase modulation. It further extends these concepts into the field of subsampling polar transmission, where the newly developed architecture enables unprecedented spectral-efficiency levels, which the upcoming generation of wireless standards will unquestionably require.




Subsampling GPS Receiver Front-end


Book Description

Research in RFIC design has recently shifted towards direct-conversion and subsampling architectures as alternatives to the conventional super-heterodyne architecture. Bandpass sampling architectures, also called subsampling architectures, exhibit several advantages over super-heterodyne architectures; notably, their complexity is significantly lower since no phase-locked loop is required. A direct consequence is that downconversion from RF to IF can be achieved with significant power savings compared to the super-heterodyne architecture. Another significant benefit of such architectures is the capability to process multiple signals in parallel. The ability to simultaneously handle multiple carriers makes subsampling architectures particularly well suited for GNSS applications, as downconversion of multiple frequency bands is required in GNSS environments. With the advent of the new civilian GPS signals, L2C and L5, and the onset of the new Galileo signal, a receiver that can process multiple signals without added complexity is highly desirable. An integrated proof-of-concept subsampling GPS receiver front-end in 130 nm BiCMOS is presented in this dissertation. The receiver achieves a noise figure of less than 3.8 dB, the lowest noise figure recorded for a subsampling-based receiver.
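The downconversion mechanism at the heart of such a front-end is aliasing: a sample clock well below the carrier frequency folds the RF band down to a low IF. A sketch with a hypothetical 200 MHz sample clock (the actual clock and IF plan of the presented receiver are not specified here):

```python
def alias_frequency(f_carrier, f_sample):
    """IF at which an RF carrier lands after bandpass (sub-Nyquist) sampling:
    the carrier folds to f_carrier mod f_sample, reflected into [0, fs/2]."""
    r = f_carrier % f_sample
    return r if r <= f_sample / 2 else f_sample - r

# With a single (hypothetical) 200 MHz sample clock, the GPS L1 and L5
# carriers alias to nearby, non-overlapping IFs in MHz:
print(alias_frequency(1575.42, 200.0))   # L1 lands near 24.58 MHz
print(alias_frequency(1176.45, 200.0))   # L5 lands near 23.55 MHz
```

This is why one subsampling path can serve multiple GNSS bands at once. A valid bandpass-sampling plan additionally requires the sample rate to be at least twice the signal bandwidth and the band not to straddle a multiple of fs/2.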




A First Course in the Design of Experiments


Book Description

Most texts on experimental design fall into one of two distinct categories. There are theoretical works with few applications and minimal discussion on design, and there are methods books with limited or no discussion of the underlying theory. Furthermore, most of these tend either to treat the analysis of each design separately, with little attempt to unify procedures, or to integrate the analysis of the various designs into one general technique. A First Course in the Design of Experiments: A Linear Models Approach stands apart. It presents theory and methods, emphasizes both the design selection for an experiment and the analysis of data, and integrates the analysis for the various designs with the general theory for linear models. The authors begin with a general introduction, then lead students through the theoretical results, the various design models, and the analytical concepts that will enable them to analyze virtually any design. Rife with examples and exercises, the text also encourages using computers to analyze data. The authors use the SAS software package throughout the book, but also demonstrate how any regression program can be used for analysis. With its balanced presentation of theory, methods, and applications and its highly readable style, A First Course in the Design of Experiments proves ideal as a text for a beginning graduate or upper-level undergraduate course in the design and analysis of experiments.
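The book's unifying viewpoint, that any designed experiment can be analyzed with an ordinary regression program, can be illustrated with a one-way layout coded as indicator variables (hypothetical data; plain least squares via the normal equations):

```python
def solve(A, b):
    """Gauss-Jordan elimination for a small linear system A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))   # partial pivoting
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

# One-way design with treatments A, B, C; B and C coded as indicators,
# so the intercept is the A mean and the slopes are treatment contrasts.
y = [10.0, 11.0, 9.0, 14.0, 15.0, 13.0, 20.0, 21.0, 19.0]
X = [[1, 0, 0]] * 3 + [[1, 1, 0]] * 3 + [[1, 0, 1]] * 3

# Least squares via the normal equations: (X'X) beta = X'y.
XtX = [[sum(row[i] * row[j] for row in X) for j in range(3)] for i in range(3)]
Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(3)]
beta = solve(XtX, Xty)
print(beta)   # [A mean, B minus A effect, C minus A effect]
```

The fitted coefficients reproduce the treatment means exactly, which is the sense in which ANOVA for a designed experiment is just a linear model fit.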




Fishery Bulletin


Book Description




Outlier Ensembles


Book Description

This book discusses a variety of methods for outlier ensembles and organizes them by the specific principles with which accuracy improvements are achieved. In addition, it covers the techniques with which such methods can be made more effective. A formal classification of these methods is provided, and the circumstances in which they work well are examined. The authors cover how outlier ensembles relate (both theoretically and practically) to the ensemble techniques used commonly for other data mining problems like classification. The similarities and (subtle) differences in the ensemble techniques for the classification and outlier detection problems are explored. These subtle differences do impact the design of ensemble algorithms for the latter problem. This book can be used for courses in data mining and related curricula. Many illustrative examples and exercises are provided in order to facilitate classroom teaching. Familiarity with the outlier detection problem, as well as with the generic problem of ensemble analysis in classification, is assumed. This is because many of the ensemble methods discussed in this book are adaptations from their counterparts in the classification domain. Some techniques explained in this book, such as wagging, randomized feature weighting, and geometric subsampling, provide new insights that are not available elsewhere. Also included is an analysis of the performance of various types of base detectors and their relative effectiveness. The book is valuable for researchers and practitioners who want to leverage ensemble methods for optimal algorithmic design.
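The core idea of a subsampling-based outlier ensemble (score every point against several random subsamples of the data and average the scores) can be sketched as follows; the k-NN distance detector and all parameters here are illustrative, not the book's specific algorithms:

```python
import random

def knn_score(point, reference, k=3):
    """Outlier score of `point`: distance to its k-th nearest neighbour
    among the reference points (larger means more outlying)."""
    dists = sorted(abs(point - r) for r in reference if r is not point)
    return dists[min(k, len(dists)) - 1]

def ensemble_scores(data, n_members=10, subsample_size=10, seed=0):
    """Average the k-NN outlier score over ensemble members, each of which
    uses an independent random subsample of the data as its reference set."""
    rng = random.Random(seed)
    totals = [0.0] * len(data)
    for _ in range(n_members):
        ref = rng.sample(data, subsample_size)
        for i, x in enumerate(data):
            totals[i] += knn_score(x, ref)
    return [t / n_members for t in totals]

rng = random.Random(42)
data = [rng.gauss(0.0, 0.1) for _ in range(20)] + [5.0]  # last point is planted
scores = ensemble_scores(data)
print(max(range(len(data)), key=scores.__getitem__))     # index of top outlier
```

Averaging over subsamples reduces the variance of any single unstable detector, which is one of the accuracy-improvement principles the book formalizes.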




Multimedia Systems, Standards, and Networks


Book Description

Describes the ITU-T H.323 and H.324 systems, H.263 video coding, and MPEG-4 standards, systems, and coding; IP and ATM networks; multimedia search and retrieval; image retrieval in digital libraries; and the status and direction of MPEG-7.




XGBoost With Python


Book Description

XGBoost is the dominant technique for predictive modeling on structured (tabular) data. The gradient boosting algorithm is the top technique on a wide range of predictive modeling problems, and XGBoost is its fastest implementation. When asked, the best machine learning competitors in the world recommend using XGBoost. In this ebook, you will learn exactly how to get started with XGBoost and bring it to your own machine learning projects.
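The gradient boosting idea that XGBoost implements (repeatedly fit a weak learner to the current residuals and add it with a shrinkage factor) can be sketched for squared-error regression with decision stumps. This toy is for intuition only and is not XGBoost's actual implementation:

```python
def fit_stump(x, y):
    """Return the best single-split regression stump (squared error)."""
    best = None
    for split in sorted(set(x))[:-1]:   # splitting above the max is useless
        left = [yi for xi, yi in zip(x, y) if xi <= split]
        right = [yi for xi, yi in zip(x, y) if xi > split]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((yi - (lm if xi <= split else rm)) ** 2
                  for xi, yi in zip(x, y))
        if best is None or err < best[0]:
            best = (err, split, lm, rm)
    _, split, lm, rm = best
    return lambda v, s=split, a=lm, b=rm: a if v <= s else b

def gradient_boost(x, y, n_rounds=50, lr=0.3):
    """Gradient boosting for squared error: each round fits a stump to the
    residuals of the current model and adds it with shrinkage lr."""
    base = sum(y) / len(y)
    pred = [base] * len(x)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + lr * stump(xi) for pi, xi in zip(pred, x)]
    return lambda v: base + lr * sum(s(v) for s in stumps)

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.1, 0.9, 3.0, 3.1, 2.9]
model = gradient_boost(x, y)
print(round(model(2), 2), round(model(5), 2))  # near the left/right group levels
```

XGBoost layers tree learners, second-order gradients, regularization, and a highly optimized implementation on top of this same additive-residual-fitting loop.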