ESTIMATORS FOR HANDLING MULTICOLLINEARITY PROBLEMS IN LINEAR REGRESSION MODEL WITH NORMALLY AND UNIFORMLY DISTRIBUTED REGRESSORS

Authors

  • A.H. BELLO Department of Statistics, School of Physical Sciences, Federal University of Technology, Akure

Keywords:

Multicollinearity, Autocorrelation, Estimator, Regressors, TSP (Time Series Processor)

Abstract

The assumptions of the classical linear regression model are hardly satisfied in real-life situations. Violation of the
assumption of independent explanatory variables and of independent error terms in the linear regression model leads to the
problems of multicollinearity and autocorrelation, respectively. Estimators to handle each problem have been separately
developed by different authors. In practice, however, these two problems do co-exist, but estimators to handle them jointly
are rare. Consequently, this research proposed and validated two estimators, the Feasible Ordinary Ridge Estimator (FORE)
and the Feasible Generalized Ridge Estimator (FGRE), to handle the problems of multicollinearity and autocorrelation
separately. The existing and proposed estimators were categorized into five (5) groups, namely: One-Stage Estimators
(OSE), Two-Stage Estimators (TSE), Feasible Generalized Least Squares Estimators (FGLSE), Two-Process
Estimators (TPE) and Modified Ridge Estimators (MRE). Monte Carlo experiments were conducted one thousand
(1000) times on a linear regression model exhibiting different degrees of multicollinearity (0.4, 0.6, 0.8, 0.95
and 0.99), with both normally and uniformly distributed regressors, and autocorrelation, at six sample sizes (n = 10, 20, 30, 50, 100 and 250).
In this study the autocorrelation is set to zero (ρ = 0). Finite sampling properties of the estimators, namely Bias (BAS), Mean Absolute Error (MAE),
Variance (VAR) and, most importantly, Mean Square Error (MSE), were evaluated, examined and
compared at each specified level of multicollinearity, autocorrelation and sample size. This was done by ranking
the estimators on the basis of their performance according to the criteria so as to determine the best estimator. Results
of the investigation when multicollinearity alone was in the model revealed that the best estimator is in the category
of One-Stage Estimators (OSE). With normally distributed regressors, the best estimator is generally the existing
estimator OREKBAY, except under the bias criterion; in this instance, the estimators FGLSE, ML and CORC are the
best. Also, with uniformly distributed regressors, the best estimator under all criteria is the existing
estimator OREKBAY, except under the bias criterion; in this instance, the OLSE and FGLSE-ML are the best.
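The Monte Carlo design described above — generating regressors with a controlled degree of multicollinearity, estimating coefficients repeatedly, and comparing estimators by MSE — can be sketched in a minimal form. The sketch below is illustrative only: it compares plain OLS against an ordinary ridge estimator with a fixed (assumed) biasing constant k, using a standard device for inducing a target pairwise correlation among regressors; it does not implement the paper's FORE, FGRE, or OREKBAY estimators, and the parameter values are assumptions for the example.

```python
import numpy as np

def make_collinear_X(n, p, rho, rng):
    """Generate n x p regressors whose pairwise correlation is rho^2,
    via x_ij = sqrt(1 - rho^2) * z_ij + rho * z_i,p+1 (a common
    device in ridge-regression Monte Carlo studies)."""
    Z = rng.standard_normal((n, p + 1))
    return np.sqrt(1.0 - rho**2) * Z[:, :p] + rho * Z[:, [p]]

def mc_compare(n=30, p=3, rho=0.95, k=0.5, reps=1000, seed=0):
    """Average squared estimation error (empirical MSE) of OLS vs
    ridge over `reps` replications; k is an assumed ridge constant."""
    rng = np.random.default_rng(seed)
    beta = np.ones(p)                       # true coefficients
    se_ols = se_ridge = 0.0
    for _ in range(reps):
        X = make_collinear_X(n, p, rho, rng)
        y = X @ beta + rng.standard_normal(n)
        XtX, Xty = X.T @ X, X.T @ y
        b_ols = np.linalg.solve(XtX, Xty)                    # OLS
        b_ridge = np.linalg.solve(XtX + k * np.eye(p), Xty)  # ridge
        se_ols += np.sum((b_ols - beta) ** 2)
        se_ridge += np.sum((b_ridge - beta) ** 2)
    return se_ols / reps, se_ridge / reps

mse_ols, mse_ridge = mc_compare()
```

With a high degree of collinearity (rho = 0.95 here), the ridge estimator typically attains a smaller empirical MSE than OLS, which is the trade-off (accepting bias to reduce variance) that motivates the ridge-type estimators compared in the study.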


Published

2022-06-15

Section

Articles