Detecting and Diagnosing Multicollinearity


Multicollinearity is a problem that arises when there is very high correlation between predictors. It inflates the variance of the coefficient estimates, which leads to unreliable results.



Common causes of multicollinearity:

1. Incorrectly coded dummy variables (the dummy variable trap).
2. One predictor is calculated from other variables, and both the derived variable and its inputs are included in the model.
3. Duplicated variables: two variables with different names that contain the same values.
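The first cause can be demonstrated in a few lines of base R. This is a minimal sketch with made-up data: building 0/1 dummies for every level of a factor makes them sum to 1 in every row, which exactly duplicates the intercept column.

```r
# Cause 1 sketch: the "dummy variable trap".
set.seed(42)
df <- data.frame(y = rnorm(6), grp = factor(rep(c("a", "b", "c"), 2)))

# R's default coding drops one level, so the design matrix has full rank:
X_ok <- model.matrix(~ grp, df)

# Hand-built dummies for ALL three levels, kept alongside an intercept:
X_bad <- cbind(intercept = 1,
               sapply(levels(df$grp), function(l) as.numeric(df$grp == l)))

qr(X_ok)$rank == ncol(X_ok)     # TRUE: full rank
qr(X_bad)$rank < ncol(X_bad)    # TRUE: rank-deficient, perfectly collinear
```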



Consequences of multicollinearity:

1. Regression coefficients cannot be estimated precisely; their standard errors become too high.
2. The magnitudes of the regression coefficients change substantially from one sample to another.

In the presence of multicollinearity, the confidence intervals of the coefficients become very wide, and in most cases the
null hypothesis will not be rejected.
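The first consequence is easy to see on simulated data. This is a sketch under an assumed setup (x2 is x1 plus tiny noise, so the two are almost perfectly correlated): adding the near-duplicate predictor blows up the standard error of the original coefficient.

```r
# Consequence sketch: a collinear predictor inflates standard errors.
set.seed(1)
n  <- 200
x1 <- rnorm(n)
x2 <- x1 + rnorm(n, sd = 0.05)          # nearly identical to x1
y  <- 2 * x1 + rnorm(n)

se_alone <- summary(lm(y ~ x1))$coefficients["x1", "Std. Error"]
se_both  <- summary(lm(y ~ x1 + x2))$coefficients["x1", "Std. Error"]
se_both / se_alone                      # many times larger than 1
```

With wider standard errors, the t-statistics shrink, which is why individually significant predictors can turn insignificant.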


1. First, it can be detected with the help of the Variance Inflation Factor (VIF). If the VIF is greater than 10, there is
a multicollinearity problem. If you want to be more conservative, the benchmark for VIF can be lowered to 4.
2. Second, individual predictors are not significant even though the overall model is.
3. Third, when running the model on two different samples, you find significant changes in the coefficient values.
4. Fourth, adding or dropping one predictor produces big changes in the other coefficients.
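The fourth sign can be checked directly in R. A minimal sketch on the built-in mtcars data (the same model used in the next section): drop one predictor and compare the remaining estimates.

```r
# Sign 4 sketch: dropping a collinear predictor shifts the other estimates.
full    <- lm(mpg ~ disp + hp + wt + qsec, data = mtcars)
reduced <- lm(mpg ~ hp + wt + qsec,        data = mtcars)
keep    <- c("hp", "wt", "qsec")
cbind(full = coef(full)[keep], reduced = coef(reduced)[keep])
```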

How to handle Multicollinearity in R


Variance inflation factors measure how much the variances of the estimated coefficients are inflated by the collinearity that exists among the predictors. For a given predictor, VIF = 1 / (1 - R^2), where R^2 comes from regressing that predictor on all the other predictors.


# The tolerance/VIF table below matches the output of ols_vif_tol()
# from the olsrr package.
library(olsrr)

model <- lm(mpg ~ disp + hp + wt + qsec, data = mtcars)
ols_vif_tol(model)
## # A tibble: 4 x 3
##   Variables Tolerance   VIF
##   <chr>         <dbl> <dbl>
## 1 disp          0.125  7.99
## 2 hp            0.194  5.17
## 3 wt            0.145  6.92
## 4 qsec          0.319  3.13
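The values above can be reproduced by hand from the definition VIF = 1 / (1 - R^2), using an auxiliary regression of one predictor on the others:

```r
# VIF for disp by hand: regress disp on the remaining predictors.
aux      <- lm(disp ~ hp + wt + qsec, data = mtcars)
vif_disp <- 1 / (1 - summary(aux)$r.squared)
round(vif_disp, 2)   # matches the disp row of the table above
```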


If any of the VIF values above is greater than 10, the corresponding predictor should either be removed or used cautiously.
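As a sketch of the removal remedy: suppose you decide disp (the highest VIF above, and over the conservative cutoff of 4) has to go. The helper below, `vif_hand()`, is a hand-rolled function (not from any package) that computes each predictor's VIF from its auxiliary regression, so you can re-check after refitting.

```r
# Remedy sketch: drop the highest-VIF predictor (disp) and re-check.
# vif_hand() computes VIF = 1 / (1 - R^2) per predictor in base R.
vif_hand <- function(fit) {
  X <- model.matrix(fit)[, -1, drop = FALSE]   # predictors, no intercept
  sapply(colnames(X), function(v) {
    r2 <- summary(lm(X[, v] ~ X[, colnames(X) != v]))$r.squared
    1 / (1 - r2)
  })
}

reduced <- lm(mpg ~ hp + wt + qsec, data = mtcars)
vif_hand(reduced)    # every remaining VIF shrinks once disp is removed
```

Dropping a predictor can never increase the other predictors' VIFs, because each auxiliary regression loses a regressor and its R^2 can only fall.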

