Modern Methods for Robust Regression


Book Description

Offering an in-depth treatment of robust and resistant regression, this volume takes an applied approach and offers readers empirical examples to illustrate key concepts.




Robust Regression


Book Description

Robust Regression: Analysis and Applications characterizes robust estimators in terms of how much they weight each observation; discusses generalized properties of Lp-estimators; includes an algorithm for identifying outliers using the least absolute value criterion in regression modeling; reviews redescending M-estimators; studies L1 linear regression; proposes the best linear unbiased estimators for fixed parameters and random errors in the mixed linear model; summarizes known properties of L1 estimators for time series analysis; examines ordinary least squares, latent root regression, and a robust regression weighting scheme; and evaluates results from five different robust ridge regression estimators.
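
The least absolute value (L1) criterion mentioned in the description replaces the sum of squared residuals with the sum of absolute residuals, which limits the pull of any single large error on the fitted line. A minimal sketch of an L1 line fit, minimizing the absolute-residual sum with scipy; the synthetic data and starting values are illustrative assumptions, not material from the book:

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative data with one gross outlier (assumed example, not from the book).
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 50)
    y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=x.size)
    y[-1] += 20.0  # contaminate a single observation

    def sum_abs_residuals(beta):
        # L1 criterion: sum of |y - (b0 + b1 * x)|
        return np.abs(y - (beta[0] + beta[1] * x)).sum()

    # Start from the OLS solution and minimize the L1 objective directly.
    ols = np.polyfit(x, y, 1)[::-1]          # [intercept, slope]
    l1 = minimize(sum_abs_residuals, ols, method="Nelder-Mead").x

    print("OLS:", ols, " L1:", l1)  # the L1 fit is pulled far less by the outlier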




Robust Methods in Regression Analysis – Theory and Application


Book Description

Diploma Thesis from the year 2006 in the subject Mathematics - Statistics, grade: 1.3, European University Viadrina Frankfurt (Oder) (Wirtschaftswissenschaftliche Fakultät), language: English, abstract: Regression analysis is an important statistical tool for many applications. The most frequently used approach to regression analysis is the method of Ordinary Least Squares. But this method is vulnerable to outliers; even a single outlier can spoil the estimation completely. How can this vulnerability be described by theoretical concepts, and are there alternatives? This thesis gives an overview of such concepts and alternative approaches. The three fundamental approaches to robustness (qualitative, infinitesimal, and quantitative robustness) are introduced and applied to different estimators. The estimators under study are measures of location, scale, and regression. The robustness approaches are important for the theoretical judgement of particular estimators, but also for the development of alternatives to classical estimators. This thesis focuses on the robustness performance of estimators when outliers occur within the data set. Measures of location and scale provide necessary stepping stones into the topic of regression analysis. In particular, the median and trimming approaches are found to produce very robust results. These results are used in regression analysis to find alternatives to the method of Ordinary Least Squares, whose vulnerability can be overcome by applying the methods of Least Median of Squares or Least Trimmed Squares. Different outlier diagnostic tools are introduced to improve the poor efficiency of these regression techniques. Furthermore, the thesis presents a simulation of several regression techniques in different situations, focusing in particular on how the regression estimates change when outliers occur in the data. The theoretically derived results as well as the simulation results lead to the recommendation of the method of Reweighted Least Squares: applying this method to problems of regression analysis provides outlier-resistant and efficient estimates.
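
The Reweighted Least Squares recommendation amounts to a two-stage procedure: obtain a high-breakdown fit such as Least Trimmed Squares, flag observations with large standardized residuals against a robust scale, then refit Ordinary Least Squares on the retained points. A rough sketch under those assumptions; the 75% coverage, the 2.5 cutoff, and the data are illustrative choices, not values from the thesis:

    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative data: a straight line with a few gross outliers (assumed example).
    x = np.linspace(0, 10, 60)
    y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=x.size)
    y[:5] += 25.0                      # contaminate five observations
    X = np.column_stack([np.ones_like(x), x])

    def lts_fit(X, y, coverage=0.75, n_trials=500):
        """Crude Least Trimmed Squares: fit many random p-point subsets and keep
        the candidate whose 'coverage' fraction of smallest squared residuals is best."""
        n, p = X.shape
        h = int(coverage * n)
        best, best_score = None, np.inf
        for _ in range(n_trials):
            idx = rng.choice(n, size=p, replace=False)
            beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
            trimmed = np.sort((y - X @ beta) ** 2)[:h]
            if trimmed.sum() < best_score:
                best, best_score = beta, trimmed.sum()
        return best

    # Stage 1: robust (LTS) fit and a robust residual scale.
    beta_lts = lts_fit(X, y)
    resid = y - X @ beta_lts
    scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # MAD-based scale

    # Stage 2: drop points with |standardized residual| > 2.5 and refit OLS.
    keep = np.abs(resid / scale) <= 2.5
    beta_rls, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    print("LTS:", beta_lts, " Reweighted LS:", beta_rls)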







Robust Regression and Outlier Detection


Book Description

WILEY-INTERSCIENCE PAPERBACK SERIES

The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists.

"The writing style is clear and informal, and much of the discussion is oriented to application. In short, the book is a keeper." –Mathematical Geology

"I would highly recommend the addition of this book to the libraries of both students and professionals. It is a useful textbook for the graduate student, because it emphasizes both the philosophy and practice of robustness in regression settings, and it provides excellent examples of precise, logical proofs of theorems. . . . Even for those who are familiar with robustness, the book will be a good reference because it consolidates the research in high-breakdown affine equivariant estimators and includes an extensive bibliography in robust regression, outlier diagnostics, and related methods. The aim of this book, the authors tell us, is 'to make robust regression available for everyday statistical practice.' Rousseeuw and Leroy have included all of the necessary ingredients to make this happen." –Journal of the American Statistical Association




Robust Estimation and Testing


Book Description

An introduction to the theory and methods of robust statistics, providing students with practical methods for carrying out robust procedures in a variety of statistical contexts and explaining the advantages of these procedures. In addition, the text develops techniques and concepts likely to be useful in the future analysis of new statistical models and procedures. Emphasizing the concepts of breakdown point and influence function of an estimator, it demonstrates the technique of expressing an estimator as a descriptive measure from which its influence function can be derived and then used to explore the efficiency and robustness properties of the estimator. Mathematical techniques are complemented by computational algorithms and Minitab macros for finding bootstrap and influence function estimates of standard errors of the estimators, robust confidence intervals, robust regression estimates, and their standard errors. Includes examples and problems.
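
The influence function emphasized here has a finite-sample counterpart, the sensitivity curve, obtained by adding one observation at a varying position to a fixed sample and tracking how far the estimate moves. A small sketch comparing the mean (unbounded influence) with the median (bounded influence); the sample and the contamination grid are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(2)
    sample = rng.normal(size=40)          # a fixed "clean" sample (illustrative)
    grid = np.linspace(-10, 10, 201)      # positions of the added observation

    def sensitivity_curve(estimator, sample, grid):
        """Finite-sample influence: (n+1) * (T(sample + {z}) - T(sample)) for each z."""
        n = sample.size
        base = estimator(sample)
        return np.array([(n + 1) * (estimator(np.append(sample, z)) - base)
                         for z in grid])

    sc_mean = sensitivity_curve(np.mean, sample, grid)
    sc_median = sensitivity_curve(np.median, sample, grid)

    # The mean's curve grows without bound as z moves away (unbounded influence),
    # while the median's curve levels off (bounded influence).
    print(sc_mean[[0, -1]], sc_median[[0, -1]])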




Robust Statistical Procedures


Book Description

Here is a brief, well-organized, and easy-to-follow introduction and overview of robust statistics. Huber focuses primarily on the important and clearly understood case of distribution robustness, where the shape of the true underlying distribution deviates slightly from the assumed model (usually the Gaussian law). An additional chapter on recent developments in robustness has been added and the reference list has been expanded and updated from the 1977 edition.
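
Distribution robustness of the kind Huber studies is commonly illustrated with his location M-estimator, which gives observations full weight when their residuals are small and downweights them beyond a tuning constant. A minimal sketch via iteratively reweighted averaging; the constant 1.345 is a conventional choice and the contaminated sample is an illustrative assumption, not an example from the book:

    import numpy as np

    def huber_location(x, c=1.345, tol=1e-8, max_iter=100):
        """Huber M-estimate of location via iteratively reweighted averaging.
        Residuals within c*scale get weight 1; larger ones are downweighted."""
        mu = np.median(x)
        scale = 1.4826 * np.median(np.abs(x - np.median(x)))  # MAD-based scale
        for _ in range(max_iter):
            r = (x - mu) / scale
            w = np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))  # Huber weights
            mu_new = np.average(x, weights=w)
            if abs(mu_new - mu) < tol:
                break
            mu = mu_new
        return mu

    # Gaussian sample with a slice of gross contamination (illustrative).
    rng = np.random.default_rng(3)
    x = np.concatenate([rng.normal(size=95), rng.normal(loc=8.0, size=5)])
    print(np.mean(x), huber_location(x))   # the M-estimate stays much closer to 0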




Tests for Differences Between Least Squares and Robust Regression Parameter Estimates and Related Topics


Book Description

At the present time there is no well-accepted test for comparing least squares and robust linear regression coefficient estimates. To fill this gap we propose and demonstrate the efficacy of two Wald-like statistical tests for this purpose, using for robust regression the class of MM-estimators. The tests are designed to detect significant differences between least squares and robust estimates due both to the inefficiency of least squares under fat-tailed non-normality and to the significantly larger biases of least squares relative to robust regression coefficient estimators under bias-inducing distributions. The asymptotic normality of the test statistics is established, and the finite-sample level and power of the tests are evaluated by Monte Carlo, with the latter yielding promising results. The first part of our research focuses on the least squares and robust regression slope estimators, both of which are consistent under skewed error distributions. A second part of the research focuses on intercept estimation, in which case there is a need to adjust for some bias in the robust MM-intercept estimator under skewed error distributions. An interesting by-product of our research is that use of the slowly redescending Tukey bisquare loss function leads to better test performance than the rapidly redescending min-max bias optimal loss function.
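
The broad shape of such a Wald-like comparison can be sketched by standardizing the difference between the least squares and robust slope estimates. The sketch below is only an illustrative stand-in, not the test proposed in this work: it uses statsmodels' Tukey bisquare M-estimator rather than an MM-estimator, a bootstrap standard error in place of the derived asymptotic variance, and assumed synthetic data with fat-tailed errors:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)

    # Illustrative data with fat-tailed (t with 3 df) errors, not taken from the paper.
    n = 200
    x = rng.normal(size=n)
    y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=n)
    X = sm.add_constant(x)

    def slope_difference(X, y):
        """OLS slope minus robust (Tukey bisquare M-estimator) slope."""
        ols = sm.OLS(y, X).fit().params[1]
        rob = sm.RLM(y, X, M=sm.robust.norms.TukeyBiweight()).fit().params[1]
        return ols - rob

    # Bootstrap the standard error of the difference (illustrative stand-in for
    # the asymptotic variance a formal Wald-type test would use).
    diffs = []
    for _ in range(200):
        idx = rng.integers(0, n, size=n)
        diffs.append(slope_difference(X[idx], y[idx]))

    d = slope_difference(X, y)
    z = d / np.std(diffs, ddof=1)
    print("difference:", d, " Wald-like z:", z)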