A comparison among estimators for linear regression methods
VISTOCCO, Domenico
2015-01-01
Abstract
Koenker & Bassett, 1978 introduce the quantile regression estimator, which provides a more complete view of the effects a set of explanatory variables exerts on the response, not only on average but at different points of the conditional distribution: the conditional quantiles. The core of quantile regression is the use of an asymmetric check function that moves the regression line above or below the conditional median, making it possible to consider location, scale and shape effects in the study of a statistical relationship. Quantile regression is increasingly implemented due to the variety of its possible applications, as evidenced by the growing number of related papers in recent years. For further details see Koenker, 2005, Hao & Naiman, 2007 and Davino et al., 2013. A computationally simpler alternative to quantile regression, introduced by Newey & Powell, 1987, is expectile regression. Expectiles allow a regression model to be analysed at various points of the conditional distribution through the introduction of an asymmetric weighting system. Analogously to the asymmetric check function of the quantile regression estimator, this weighting system moves the regression line, as estimated by ordinary least squares, above or below the line passing through the conditional mean. Compared to quantile regression, expectile regression is computationally convenient while still allowing the complete conditional distribution of the response to be characterized. The main characteristic of expectiles is the adoption of the L2 norm, which makes the expectile estimator less robust than the quantile regression estimator. The class of robust regression estimators, the M-estimators (Huber, 1981), computes the regression at the conditional mean while curbing the impact of outliers on the estimated coefficients. Once again, this estimator uses a weighting system to detect and bound outliers while estimating the regression coefficients. Breckling & Chambers, 1988 propose merging the M-estimators and the expectile approach. Even though both methods are implemented within the least squares framework, robustness is ensured by the introduction of a weighting system to control the outlying observations. The asymmetric weighting system of expectiles is combined with weights bounding outliers, making it possible to compute a robust regression away from the conditional mean. Along with the above estimators, it is worth mentioning modal linear regression (Kemp & Santos Silva, 2012, Yao & Li, 2014). Here the focus is on modeling the conditional mode of the response variable, and the approach is well suited to situations where conditional distributions are highly skewed: exploiting the features of the mode, modal regression proves robust to outliers, in particular to heavy-tailed conditional error distributions.
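For reference, the loss functions underlying the estimators discussed in the abstract can be sketched in their standard textbook form; these formulations are not taken from the paper itself. The quantile regression estimator, the expectile estimator and the Huber M-estimator minimise, respectively, an asymmetrically weighted absolute loss (the check function), an asymmetrically weighted squared loss, and a squared loss truncated to a linear loss beyond a tuning constant k:

\[
\hat{\beta}(\tau) = \arg\min_{\beta} \sum_{i=1}^{n} \rho_{\tau}\!\big(y_i - x_i^{\top}\beta\big),
\qquad
\rho_{\tau}(u) = u\,\big(\tau - \mathbb{1}\{u < 0\}\big), \quad \tau \in (0,1),
\]
\[
\hat{\beta}(\theta) = \arg\min_{\beta} \sum_{i=1}^{n} \big|\theta - \mathbb{1}\{y_i - x_i^{\top}\beta < 0\}\big|\,\big(y_i - x_i^{\top}\beta\big)^{2}, \quad \theta \in (0,1),
\]
\[
\rho_{k}(u) =
\begin{cases}
u^{2}/2, & |u| \le k,\\[2pt]
k\,|u| - k^{2}/2, & |u| > k.
\end{cases}
\]

For \(\tau = \theta = 0.5\) the first two losses reduce to (half) the absolute and the squared error, recovering the conditional median and the conditional mean respectively; the tuning constant \(k\) of the Huber loss governs the trade-off between efficiency at the Gaussian model and robustness to outliers.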