Abstract
The ordinary least squares (OLS) method estimates the regression parameters of a linear regression model efficiently under the classical assumptions. When the model contains outliers, however, the OLS estimator becomes imprecise. Multicollinearity is another problem that degrades its performance. This study proposes the Robust Jackknife Kibria-Lukman (RJKL) estimator, based on the M-estimator, to handle multicollinearity and outliers simultaneously. We establish the superiority of the proposed estimator over existing estimators through theoretical comparisons and Monte Carlo simulations, and further evaluate it on real-world data. In all cases, the proposed estimator outperforms the existing estimators.
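To make the idea concrete, the following is a minimal sketch, not the paper's exact RJKL estimator: it fits a Huber M-estimate by iteratively reweighted least squares (for outliers) and then applies the Kibria-Lukman ridge-type transform (for multicollinearity). The jackknife bias-correction step of the full RJKL estimator is omitted, and the function names, the fixed biasing parameter `k`, and the Huber tuning constant `c = 1.345` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def huber_weights(r, c=1.345):
    """Huber weights: 1 for small residuals, c/|r| beyond the cutoff c."""
    a = np.abs(r)
    w = np.ones_like(a)
    w[a > c] = c / a[a > c]
    return w

def m_estimate(X, y, iters=50, tol=1e-10):
    """Huber M-estimator via iteratively reweighted least squares (IRLS)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS start
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale (MAD)
        if s == 0:
            s = 1.0
        w = huber_weights(r / s)
        beta_new = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

def kl_m_estimator(X, y, k=0.1):
    """Kibria-Lukman-type transform of the robust M-estimate (no jackknife):
    beta_KL = (X'X + kI)^{-1} (X'X - kI) beta_M, with k >= 0 the biasing
    parameter.  k = 0 recovers the plain M-estimate."""
    beta_m = m_estimate(X, y)
    G = X.T @ X
    I = np.eye(G.shape[0])
    return np.linalg.solve(G + k * I, (G - k * I) @ beta_m)
```

The design choice mirrors the abstract: the IRLS step downweights outlying observations, while the ridge-type shrinkage `(X'X + kI)^{-1}(X'X - kI)` stabilizes the estimate when `X'X` is nearly singular due to multicollinearity.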
Publisher
Nigerian Society of Physical Sciences
Subject
General Physics and Astronomy, General Mathematics, General Chemistry
Cited by
5 articles.