Accepted version, 351.4Kb
Publicly accessible from 2017-02-01
Jung, Y., Lee, S. P., & Hu, J. (2016). Robust regression for highly corrupted response by shifting outliers. Statistical Modelling, 16(1), 1–23. http://doi.org/10.1177/1471082X15624040
Permanent Research Commons link: http://hdl.handle.net/10289/9931
Outlying observations are often discarded at the cost of degrees of freedom, or downweighted via robust loss functions (e.g., Huber's loss) to reduce their undesirable impact on data analysis. In this article, we treat the outlying status of each observation as a parameter and propose a penalization method to automatically adjust the outliers. The proposed method shifts the outliers towards the fitted values while preserving the non-outlying observations. We also develop a generally applicable iterative algorithm to estimate the model parameters and demonstrate its connection with maximum-likelihood-based estimation in the case of least squares. We establish the asymptotic properties of the resulting parameter estimators under the condition that the proportion of outliers does not vanish as the sample size increases. We apply the proposed outlier adjustment method to ordinary least squares and a lasso-type penalization procedure, and demonstrate its empirical value via numerical studies. Furthermore, we study the applicability of the proposed method to two robust estimators, Huber's robust estimator and the Huberized lasso, and demonstrate that it noticeably improves model fit in the presence of extremely large outliers.
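The outlier-shifting idea described in the abstract can be illustrated with a mean-shift regression model, y = Xβ + γ + ε, where a sparsity penalty on the shift vector γ leaves most entries at zero and lets the nonzero entries pull corrupted responses back towards their fitted values. The sketch below is an illustrative assumption on my part (an L1 penalty on γ solved by alternating least squares and soft-thresholding), not necessarily the authors' exact formulation or algorithm; the function name `shift_outliers` and the tuning value `lam` are hypothetical.

```python
import numpy as np

def shift_outliers(X, y, lam, n_iter=100, tol=1e-8):
    """Illustrative sketch (not the paper's exact method):
    minimize 0.5 * ||y - X b - g||^2 + lam * ||g||_1 over (b, g)
    by alternating two exact block updates:
      (1) OLS for b on the shift-adjusted response y - g;
      (2) soft-thresholding of the residuals to update the shifts g."""
    n, _ = X.shape
    g = np.zeros(n)
    for _ in range(n_iter):
        # Block 1: ordinary least squares on the adjusted response.
        b, *_ = np.linalg.lstsq(X, y - g, rcond=None)
        # Block 2: soft-threshold residuals; small residuals give g_i = 0,
        # so non-outlying observations are left unchanged.
        r = y - X @ b
        g_new = np.sign(r) * np.maximum(np.abs(r) - lam, 0.0)
        if np.max(np.abs(g_new - g)) < tol:
            g = g_new
            break
        g = g_new
    return b, g

# Toy example with a few grossly corrupted responses.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=100)
y[:5] += 50.0  # corrupt five responses

b_hat, g_hat = shift_outliers(X, y, lam=1.0)
```

In this toy run the estimated shifts are nonzero only for the five corrupted observations, and the coefficient estimates stay close to the clean-data values, whereas plain OLS on the corrupted response would be badly biased. The choice of `lam` plays the usual penalty-tuning role: too small and clean observations get shifted; too large and outliers are missed.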
This is an author’s accepted version of an article published in the journal: Statistical Modelling. © 2016 Sage.