The collinearity problem in regression arises when at least one linear function of the independent variables is very nearly equal to zero. (Technically, a set of vectors is collinear when a linear function is exactly equal to zero. In general discussions of the collinearity problem, the term “collinear” is often used to apply to linear functions that are only approximately zero. This convention is followed in this text.) This near-singularity may arise in several ways.

When assessing bivariate correlations, two issues should be considered. First, correlations of even .70 (which represents “shared” variance of 50%) can impact both the explanation and estimation of the regression results. Moreover, even lower correlations can have an impact if the correlation between the two independent variables is greater than either independent variable’s correlation with the dependent measure (e.g., the situation in our earlier example of the reversal of signs).

• The suggested cutoff for the tolerance value is .10 (or a corresponding VIF of 10.0), which corresponds to a multiple correlation of .95 with the other independent variables. When values at this level are encountered, multicollinearity problems are almost certain. However, problems are likely at much lower levels as well. For example, a VIF of 5.3 corresponds to a multiple correlation of .9 between one independent variable and all other independent variables. Even a VIF of 3.0 represents a multiple correlation of .82, which would be considered high if it were between a dependent and an independent variable.
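The correspondence between a VIF value and the multiple correlation quoted above follows directly from the definition VIF = 1 / (1 − R²), so R = √(1 − 1/VIF). A minimal sketch (the function name is illustrative, not from the text):

```python
import math

def vif_to_multiple_r(vif):
    """Multiple correlation R of one regressor with the others, given its VIF.

    Since VIF = 1 / (1 - R^2), solving for R gives R = sqrt(1 - 1/VIF).
    """
    return math.sqrt(1.0 - 1.0 / vif)

# Reproduce the correspondences cited in the text.
for vif in (10.0, 5.3, 3.0):
    print(f"VIF {vif:>4}: multiple correlation ~ {vif_to_multiple_r(vif):.2f}")
# VIF 10.0 -> .95, VIF 5.3 -> .90, VIF 3.0 -> .82
```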

In summary:

**Tolerance**: the proportion of a regressor’s variance not accounted for by the other regressors in the model. Low tolerance values are an indicator of multicollinearity.

**Variance Inflation Factor (VIF)**: the reciprocal of the tolerance. Large VIF values are an indicator of multicollinearity.
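These two diagnostics can be computed directly from the data: regress each independent variable on all the others, take 1 − R² as the tolerance, and invert it for the VIF. A sketch using plain NumPy least squares on synthetic data (the variable names and collinearity structure are illustrative assumptions):

```python
import numpy as np

def tolerance_and_vif(X):
    """For each column of X, regress it on all other columns;
    return a list of (tolerance, VIF) pairs, where
    tolerance = 1 - R^2 and VIF = 1 / tolerance."""
    n, k = X.shape
    results = []
    for j in range(k):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        # Include an intercept so R^2 is measured about the mean.
        A = np.column_stack([np.ones(n), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r_squared = 1.0 - resid.var() / y.var()
        tol = 1.0 - r_squared
        results.append((tol, 1.0 / tol))
    return results

# Illustrative data: x2 is deliberately collinear with x1, x3 is not.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.3 * rng.normal(size=200)
x3 = rng.normal(size=200)
X = np.column_stack([x1, x2, x3])

for name, (tol, vif) in zip(["x1", "x2", "x3"], tolerance_and_vif(X)):
    print(f"{name}: tolerance={tol:.3f}, VIF={vif:.2f}")
```

The collinear pair (x1, x2) should show low tolerance and a VIF well above the cutoffs discussed earlier, while x3 should show a VIF near 1.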