For insight into the construction of this matrix, consider the typical least squares regression formulation. Here, the response (Y) is a linear function of the predictors (x's) plus error (ε):
Y = β₀ + β₁x₁ + β₂x₂ + … + βₚxₚ + ε
Each row of the data table contains a response value and values for the p predictors. For each observation, the predictor values are considered fixed. However, the response value is considered to be a realization of a random variable.
With the predictor values considered fixed, for any set of Y values the coefficients β₀, β₁, …, βₚ can be estimated. In general, different sets of Y values lead to different estimates of the coefficients. The Correlation of Estimates option calculates the theoretical correlation of these parameter estimates. (For technical details, see Details of Custom Test Example.)
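In matrix terms, the covariance of the estimates is proportional to (X′X)⁻¹, where X is the design matrix with a leading column of ones for the intercept; the unknown error variance cancels when that matrix is rescaled to a correlation matrix, so the correlation of estimates depends only on the predictors. The following Python sketch is not part of the JMP documentation and uses simulated data purely for illustration:

```python
import numpy as np

def correlation_of_estimates(X):
    """Theoretical correlation matrix of the OLS parameter estimates.

    X is the n x (p+1) design matrix (first column of ones for the
    intercept). Cov(beta_hat) = sigma^2 * (X'X)^-1, and sigma^2 cancels
    when the covariance matrix is rescaled to a correlation matrix.
    """
    c = np.linalg.inv(X.T @ X)          # proportional to Cov(beta_hat)
    d = np.sqrt(np.diag(c))             # standard deviations, up to sigma
    return c / np.outer(d, d)           # correlation matrix of the estimates

# Simulated example: two nearly collinear predictors plus an intercept column.
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = x1 + 0.1 * rng.normal(size=50)     # almost a copy of x1
X = np.column_stack([np.ones(50), x1, x2])

# The estimates for x1 and x2 come out strongly negatively correlated,
# reflecting the near collinearity between the two predictors.
print(np.round(correlation_of_estimates(X), 4))
```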
A large correlation (positive or negative) between two estimates suggests that a collinear relationship might exist between the two corresponding predictors. Note, though, that you need to interpret these correlations with caution (Belsley et al. 1980, pp. 92–94, 185). Also, rescaling a predictor in a way that shifts its mean changes the correlation between its parameter estimate and the estimate of the intercept.
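The effect of a mean shift can be seen directly: moving a predictor's values away from zero drives the correlation between its estimate and the intercept estimate toward ±1. A minimal sketch, again with simulated data rather than anything from the documentation:

```python
import numpy as np

def corr_of_estimates(X):
    # Correlation matrix of the OLS estimates, proportional to (X'X)^-1.
    c = np.linalg.inv(X.T @ X)
    d = np.sqrt(np.diag(c))
    return c / np.outer(d, d)

rng = np.random.default_rng(1)
x = rng.normal(size=40)

X_centered = np.column_stack([np.ones(40), x])          # predictor mean near 0
X_shifted  = np.column_stack([np.ones(40), x + 100.0])  # same spread, mean shifted to ~100

# Correlation between the intercept estimate and the slope estimate:
print(round(corr_of_estimates(X_centered)[0, 1], 4))    # small in magnitude
print(round(corr_of_estimates(X_shifted)[0, 1], 4))     # very close to -1
```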
1.
2. Select Analyze > Fit Model.
3.
4. Select Total Population, Median School Years, Total Employment, and Professional Services and click Add.
5.
6. Click Run.
7. From the Response red triangle menu, select Estimates > Correlation of Estimates.
Figure 3.41 Correlation of Estimates Report
The report (Figure 3.41) shows a high negative correlation between the parameter estimates for the Intercept and Median School Years (–0.9818). A high negative correlation also exists between the estimates for Total Population and Total Employment (–0.9746).
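Because the correlation of estimates depends only on the design matrix, the values in the report can in principle be reproduced outside JMP from the four predictors added in step 4. The Python sketch below assumes those columns have been exported to a hypothetical file named socioeconomic.csv; the file name, and the export itself, are not part of the walkthrough above.

```python
import numpy as np
import pandas as pd

# Hypothetical export of the data table used in the steps above.
df = pd.read_csv("socioeconomic.csv")
predictors = ["Total Population", "Median School Years",
              "Total Employment", "Professional Services"]

# Design matrix: intercept column plus the four added predictors.
X = np.column_stack([np.ones(len(df)), df[predictors].to_numpy()])

c = np.linalg.inv(X.T @ X)                  # proportional to Cov(beta_hat)
d = np.sqrt(np.diag(c))
corr = pd.DataFrame(c / np.outer(d, d),
                    index=["Intercept"] + predictors,
                    columns=["Intercept"] + predictors)
print(corr.round(4))                        # counterpart of the report in Figure 3.41
```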