Centering in linear regression is one of those things that we learn almost as a ritual whenever we are dealing with interactions. We are taught time and time again that centering is done because it decreases multicollinearity, and that multicollinearity is something bad in itself. In my opinion, centering plays an important role in the interpretation of OLS multiple regression results when interactions are present, but I dunno about the multicollinearity issue. The thing is that high intercorrelations among your predictors (your “Xs” so to speak) make it difficult to find the inverse of $X'X$, which is the essential part of getting the regression coefficients. But that was a thing like YEARS ago! Nowadays you can find the inverse of a matrix pretty much anywhere, even online! However, we still emphasize centering as a way to deal with multicollinearity and not so much as an interpretational device (which is how I think it should be taught).
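Just to make that last point concrete, here's a minimal sketch in Python/NumPy (simulated data, made-up coefficients and seed) putting the old-school "invert $X'X$" step next to what any modern least-squares routine gives you:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated toy data: two correlated predictors plus an interaction.
n = 1_000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
y = 1 + 2 * x1 - x2 + 0.5 * x1 * x2 + rng.normal(size=n)

# Design matrix: intercept, main effects, product term.
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])

# The historically "hard part": inverting X'X in the normal equations.
beta_inv = np.linalg.inv(X.T @ X) @ X.T @ y

# What you'd actually run today: a numerically stabler solver.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(beta_inv, beta_lstsq))  # True: same OLS estimates
```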
Anyhoo, the point here is that I’d like to show what happens to the correlation between a product term and its constituents when an interaction term is included in the model. Let’s take the following regression model as an example:

$$y = \beta_0 + \beta_1X_1 + \beta_2X_2 + \beta_3X_1X_2 + \epsilon$$
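Before the math, a quick simulation sketch of the interpretation point (everything here is made up: the seed, the coefficients, the non-zero means): centering is just a reparametrization of the same model, so the fitted values don't move at all; what changes is that each main-effect coefficient becomes the effect of that predictor at the mean of the other one.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(5, 2, size=n)    # deliberately non-zero means
x2 = rng.normal(10, 3, size=n)
y = 1 + 2 * x1 - x2 + 0.5 * x1 * x2 + rng.normal(size=n)

def fit(a, b):
    X = np.column_stack([np.ones(n), a, b, a * b])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ beta, beta

yhat_raw, b_raw = fit(x1, x2)
yhat_cen, b_cen = fit(x1 - x1.mean(), x2 - x2.mean())

print(np.allclose(yhat_raw, yhat_cen))  # True: identical fit
print(b_raw)  # main effects are "effect when the other X is 0"
print(b_cen)  # main effects are "effect at the mean of the other X"
```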
Because $X_1$ and $X_2$ are kind of arbitrarily selected, what we are going to derive works regardless of whether you’re doing $\mathrm{Cov}(X_1X_2, X_1)$ or $\mathrm{Cov}(X_1X_2, X_2)$. I am gonna do $\mathrm{Cov}(X_1X_2, X_1)$. In any case, we first need to derive the elements of $\mathrm{Cov}(X_1X_2, X_1)$ in terms of expectations of random variables, variances and whatnot. Remember that the key issue here is $E[X_1] = E[X_2] = 0$ under centering. You’ll see how this comes into place when we do the whole thing:

$$\begin{aligned}
\mathrm{Cov}(X_1X_2, X_1) &= E[X_1X_2X_1] - E[X_1X_2]E[X_1] \\
&= E[X_1^2X_2] - E[X_1X_2]E[X_1] \\
&= \mathrm{Cov}(X_1^2, X_2) + E[X_1^2]E[X_2] - \big(\mathrm{Cov}(X_1, X_2) + E[X_1]E[X_2]\big)E[X_1] \\
&= \mathrm{Cov}(X_1^2, X_2) + \mathrm{Var}(X_1)E[X_2] - \mathrm{Cov}(X_1, X_2)E[X_1]
\end{aligned}$$
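If you don't feel like trusting my algebra, the identity in that last line holds exactly for empirical moments too, so a couple of lines of NumPy (with an arbitrary, deliberately skewed and uncentered fake dataset) can check it:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000
x1 = rng.gamma(2.0, 1.5, size=n) + 1.0      # skewed, non-zero mean on purpose
x2 = 0.4 * x1 + rng.normal(2.0, 1.0, size=n)

def cov(a, b):
    # Population-style (1/n) covariance, matching np.var's default.
    return np.mean(a * b) - np.mean(a) * np.mean(b)

lhs = cov(x1 * x2, x1)
rhs = cov(x1**2, x2) + np.var(x1) * np.mean(x2) - cov(x1, x2) * np.mean(x1)
print(lhs, rhs)  # equal up to floating-point error
```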
This last expression is very similar to what appears on page 264 of the Cohen et al. blue regression textbook. It is not exactly the same, though, because they started their derivation from another place. But you can see how I could transform mine into theirs (for instance, there is a $\mathrm{Cov}(X_1^2, X_2)$ term from which I could get a version of theirs), but my point here is not to reproduce the formulas from the textbook. The point here is to show that, under centering, $E[X_1] = E[X_2] = 0$, which leaves $\mathrm{Cov}(X_1X_2, X_1) = \mathrm{Cov}(X_1^2, X_2) = E[X_1X_1X_2]$. The reason why I am making explicit the product $E[X_1X_1X_2]$ is to show that whatever correlation is left between the product and its constituent terms depends exclusively on the 3rd moment of the distributions. For any symmetric distribution (like the normal distribution) this moment is zero, and then the whole covariance between the interaction and its main effects is zero as well.
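A quick simulation makes that third-moment dependence visible (simulated data again; the distributions and sample size are arbitrary choices of mine): with centered normals the product term decorrelates from its constituents, but with centered-yet-skewed variables it doesn't.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Symmetric case: centered normals, third central moment is zero.
z1 = rng.normal(size=n)
x2 = 0.5 * z1 + np.sqrt(1 - 0.5**2) * rng.normal(size=n)
print(np.corrcoef(z1 * x2, z1)[0, 1])   # ~ 0

# Skewed case: centered exponentials have third central moment = 2,
# so centering alone does NOT kill the correlation here.
s1 = rng.exponential(size=n) - 1.0
s2 = 0.5 * s1 + (rng.exponential(size=n) - 1.0)
print(np.corrcoef(s1 * s2, s1)[0, 1])   # clearly non-zero
```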
Let’s take the case of the normal distribution, which is very easy and it’s also the one assumed throughout Cohen et al. and many other regression textbooks. I’ll show you why, in that case, the whole thing works. Consider $(X_1, X_2)$ following a bivariate normal distribution such that:

$$\begin{pmatrix} X_1 \\ X_2 \end{pmatrix} \sim \mathcal{N}\left( \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix} \right)$$
Then for $Z_1$ and $Z_2$ both independent and standard normal, we can define $X_1$ and $X_2$ as:

$$\begin{aligned}
X_1 &= \mu_1 + \sigma_1 Z_1 \\
X_2 &= \mu_2 + \sigma_2\left(\rho Z_1 + \sqrt{1-\rho^2}\,Z_2\right)
\end{aligned}$$
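(If that construction looks unfamiliar, a few lines of NumPy can confirm it produces the intended means, standard deviations, and correlation; the parameter values below are arbitrary:)

```python
import numpy as np

rng = np.random.default_rng(11)
n = 500_000
mu1, mu2, s1, s2, rho = 2.0, -1.0, 1.5, 0.8, 0.6   # arbitrary example values

z1 = rng.normal(size=n)
z2 = rng.normal(size=n)

# Correlate X2 with X1 through the shared Z1 component.
x1 = mu1 + s1 * z1
x2 = mu2 + s2 * (rho * z1 + np.sqrt(1 - rho**2) * z2)

print(x1.mean(), x1.std(), x2.mean(), x2.std())  # ~ mu1, s1, mu2, s2
print(np.corrcoef(x1, x2)[0, 1])                 # ~ rho
```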
Now take:

$$X_1^2 = \left(\mu_1 + \sigma_1 Z_1\right)^2$$

And multiply both terms:

$$X_1^2X_2 = \left(\mu_1 + \sigma_1 Z_1\right)^2\left(\mu_2 + \sigma_2\left(\rho Z_1 + \sqrt{1-\rho^2}\,Z_2\right)\right)$$

Now, that looks boring to expand, but the good thing is that I’m working with centered variables in this specific case, so $\mu_1 = \mu_2 = 0$ and:

$$E[X_1^2X_2] = E\left[\sigma_1^2Z_1^2\,\sigma_2\left(\rho Z_1 + \sqrt{1-\rho^2}\,Z_2\right)\right] = \sigma_1^2\sigma_2\,\rho\,E[Z_1^3] + \sigma_1^2\sigma_2\sqrt{1-\rho^2}\,E[Z_1^2Z_2]$$
Notice that, by construction, $Z_1$ and $Z_2$ are each independent, standard normal variables, so we can express the product as $E[Z_1^2Z_2] = E[Z_1^2]E[Z_2]$ and write $E[Z_1^3]$ as $E[Z^3]$, because $Z_1$ is really just some generic standard normal variable that is being raised to the cubic power.
Now, we know that $E[Z^3] = 0$ for the case of the normal distribution (and $E[Z_2] = 0$ as well, of course), so:

$$\mathrm{Cov}(X_1X_2, X_1) = E[X_1^2X_2] = \sigma_1^2\sigma_2\,\rho\,(0) + \sigma_1^2\sigma_2\sqrt{1-\rho^2}\,(1)(0) = 0$$
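In case you'd rather have a machine grind through those expectations than take my word for it, here's a small symbolic check with SymPy (same centered construction as above, with symbolic $\rho$, $\sigma_1$, $\sigma_2$):

```python
from sympy import expand, sqrt, symbols
from sympy.stats import E, Normal

rho, s1, s2 = symbols('rho sigma_1 sigma_2', positive=True)
Z1 = Normal('Z1', 0, 1)
Z2 = Normal('Z2', 0, 1)

# Centered construction: mu1 = mu2 = 0.
X1 = s1 * Z1
X2 = s2 * (rho * Z1 + sqrt(1 - rho**2) * Z2)

print(E(Z1**3))                  # 0: third moment of a standard normal
print(E(Z1**2 * Z2))             # 0: independence splits the expectation
print(E(expand(X1**2 * X2)))     # 0: the whole covariance term vanishes
```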
So now you know what centering does to the correlation between a product term and its constituents, and why under normality (or really, under any symmetric distribution) you would expect that correlation to be 0.
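And one last Monte Carlo sketch to close the loop (arbitrary means, variances, and $\rho = 0.5$): with raw, non-zero-mean bivariate normal data the product term is strongly correlated with its constituents, and centering wipes that correlation out.

```python
import numpy as np

rng = np.random.default_rng(2024)
n = 100_000

# Bivariate normal with non-zero means and rho = 0.5.
z1, z2 = rng.normal(size=(2, n))
x1 = 5.0 + 2.0 * z1
x2 = 10.0 + 3.0 * (0.5 * z1 + np.sqrt(1 - 0.5**2) * z2)

print(np.corrcoef(x1 * x2, x1)[0, 1])   # large for the raw variables

c1, c2 = x1 - x1.mean(), x2 - x2.mean()
print(np.corrcoef(c1 * c2, c1)[0, 1])   # ~ 0 once centered
```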