The equicorrelation matrix is the $n \times n$ matrix $\Sigma$ where the entries on the diagonal are all equal to $1$ and all off-diagonal entries are equal to some parameter $\rho$ which lies in $[-1, 1]$. If we were to write out the matrix, it would look something like this:

$$\Sigma = \begin{pmatrix} 1 & \rho & \cdots & \rho \\ \rho & 1 & \cdots & \rho \\ \vdots & \vdots & \ddots & \vdots \\ \rho & \rho & \cdots & 1 \end{pmatrix}.$$

Alternatively, we can write it as $\Sigma = \rho \mathbf{1}\mathbf{1}^T + (1 - \rho) I$, where $\mathbf{1} \in \mathbb{R}^n$ is the column vector with all entries being 1 and $I$ is the $n \times n$ identity matrix.
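To make the $\rho \mathbf{1}\mathbf{1}^T + (1 - \rho) I$ form concrete, here is a minimal NumPy sketch (the helper name `equicorrelation` is my own, not from the post):

```python
import numpy as np

def equicorrelation(n, rho):
    """Build the n x n equicorrelation matrix as rho * 11^T + (1 - rho) * I."""
    ones = np.ones((n, 1))
    return rho * (ones @ ones.T) + (1 - rho) * np.eye(n)

# Example: 4 x 4 equicorrelation matrix with rho = 0.3
print(equicorrelation(4, 0.3))
```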
Here are some useful properties of the equicorrelation matrix:
- It is a Toeplitz matrix, and hence has the properties that all Toeplitz matrices have (see e.g. this link).
- It has two eigenvalues. The first eigenvalue is $1 + (n-1)\rho$, with associated eigenvector $v_1 = \mathbf{1}$. The second eigenvalue is $1 - \rho$, with multiplicity $n - 1$ and associated eigenvectors $v_2, \dots, v_n$, where the entries of $v_j$ are
$$(v_j)_i = \begin{cases} 1 & \text{if } i = 1, \\ -1 & \text{if } i = j, \\ 0 & \text{otherwise.} \end{cases}$$
This can be verified directly by doing the matrix multiplication.
- Its determinant is $\det \Sigma = (1 - \rho)^{n-1} \left[ 1 + (n-1)\rho \right]$. This is because the determinant of a square matrix is equal to the product of its eigenvalues.
- $\Sigma$ is positive definite if and only if $-\frac{1}{n-1} < \rho < 1$. A sketch of the proof can be found in the answer here. It boils down to proving some properties of the determinant expression in the previous point.
- Its inverse is $\Sigma^{-1} = \dfrac{1}{1 - \rho} \left[ I - \dfrac{\rho}{1 + (n-1)\rho} \mathbf{1}\mathbf{1}^T \right]$. This can be verified directly by matrix multiplication. It can also be derived using the Sherman-Morrison formula. (These properties are checked numerically in the sketch after this list.)
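As a sanity check, the eigenvalues, determinant, positive-definiteness threshold and inverse formula above can all be verified numerically. A small sketch, with $n = 5$ and $\rho = 0.3$ chosen arbitrarily for illustration:

```python
import numpy as np

n, rho = 5, 0.3
ones = np.ones((n, 1))
Sigma = rho * (ones @ ones.T) + (1 - rho) * np.eye(n)

# Eigenvalues: 1 + (n-1)*rho once, and 1 - rho with multiplicity n - 1.
expected = sorted([1 + (n - 1) * rho] + [1 - rho] * (n - 1))
print(np.allclose(np.linalg.eigvalsh(Sigma), expected))                  # True

# Determinant: (1 - rho)^(n-1) * (1 + (n-1)*rho).
print(np.isclose(np.linalg.det(Sigma),
                 (1 - rho) ** (n - 1) * (1 + (n - 1) * rho)))            # True

# Inverse: (1 / (1 - rho)) * (I - rho / (1 + (n-1)*rho) * 11^T).
Sigma_inv = (np.eye(n) - rho / (1 + (n - 1) * rho) * (ones @ ones.T)) / (1 - rho)
print(np.allclose(np.linalg.inv(Sigma), Sigma_inv))                      # True

# Taking rho just below -1/(n-1) breaks positive definiteness.
rho_bad = -1 / (n - 1) - 0.01
Sigma_bad = rho_bad * (ones @ ones.T) + (1 - rho_bad) * np.eye(n)
print(np.min(np.linalg.eigvalsh(Sigma_bad)) < 0)                         # True
```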
Hi, I think there is a typo in the inverse of $\Sigma$: the $\mathbf{1}\mathbf{1}^T$ should be in the numerator, not in the denominator.
Thanks for spotting that! I’ve amended the formula.
What about the covariance matrix version of this correlation matrix, i.e. with diagonal elements $\sigma_i^2$ and off-diagonal entries $\rho \sigma_i \sigma_j$? Can we deduce some of its properties? Thanks!!
If $E$ is the equicorrelation matrix and $S$ is the diagonal matrix with the $\sigma_i$'s on the diagonal, then the covariance matrix is simply $SES$. With this formula we should be able to derive analogs of the properties above.
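A minimal sketch of that $SES$ construction (the variable names `E`, `S`, `sigmas` are mine, and the standard deviations are arbitrary example values):

```python
import numpy as np

n, rho = 4, 0.5
ones = np.ones((n, 1))
E = rho * (ones @ ones.T) + (1 - rho) * np.eye(n)   # equicorrelation matrix

sigmas = np.array([1.0, 2.0, 0.5, 3.0])             # example standard deviations
S = np.diag(sigmas)
cov = S @ E @ S

print(np.allclose(np.diag(cov), sigmas ** 2))                  # diagonal is sigma_i^2
print(np.isclose(cov[0, 1], rho * sigmas[0] * sigmas[1]))      # off-diagonal is rho * sigma_i * sigma_j
```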