I learnt about **Isserlis’ theorem** (also known as **Wick’s probability theorem**) at a talk today. The theorem comes from a paper from 1918, which is listed as Reference 1 below. In the words of Reference 2, the theorem

> … allows [us] to express the expectation of a monomial in an arbitrary number of components of a zero mean Gaussian vector in terms of the entries of its covariance matrix only.

We introduce some notation (as in Reference 2) to state the theorem succinctly. Let $I = \{i_1, \ldots, i_k\}$ be a set of integers such that $1 \le i_j \le d$ for all $j$. The $i_j$ need **not** be distinct. For any vector $x \in \mathbb{R}^d$, denote

$$x_I = \prod_{j=1}^k x_{i_j},$$

with the convention that $x_\emptyset = 1$. Let $P_k$ denote the set of all pairings of $\{1, \ldots, k\}$, i.e. partitions of $\{1, \ldots, k\}$ into disjoint pairs. For a pairing $\sigma \in P_k$, let $\sigma(1), \ldots, \sigma(k)$ denote an ordering of the indices $1, \ldots, k$ such that the pairs in $\sigma$ are $\{\sigma(1), \sigma(2)\}, \ldots, \{\sigma(k-1), \sigma(k)\}$.

(As an example, if $k = 4$, one possible pairing is $\sigma = \{\{1, 3\}, \{2, 4\}\}$. For this pairing, a possible choice of $(\sigma(1), \ldots, \sigma(4))$ is $(1, 3, 2, 4)$, with $\{\sigma(1), \sigma(2)\} = \{1, 3\}$ and $\{\sigma(3), \sigma(4)\} = \{2, 4\}$.)
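
To make the notion of a pairing concrete, here is a minimal Python sketch that enumerates all pairings of a list recursively (the function name `pairings` is my own choice, not from either reference):

```python
def pairings(items):
    # Partition `items` into disjoint pairs, yielding each pairing once.
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    # Pair the first element with each possible partner, then recurse
    # on the remaining elements.
    for i, partner in enumerate(rest):
        remaining = rest[:i] + rest[i + 1:]
        for sub in pairings(remaining):
            yield [(first, partner)] + sub

# For k = 4 there are exactly 3 pairings, as in the example above:
for p in pairings([1, 2, 3, 4]):
    print(p)
# [(1, 2), (3, 4)]
# [(1, 3), (2, 4)]
# [(1, 4), (2, 3)]
```

In general, $|P_{2n}| = 1 \cdot 3 \cdot 5 \cdots (2n-1)$, since the first element has $2n-1$ possible partners, the next unpaired element has $2n-3$, and so on.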

We are now ready to state the theorem:

**Theorem (Isserlis’ theorem):** Let $I = \{i_1, \ldots, i_k\}$ be a set of integers such that $1 \le i_j \le d$ for all $j$, and let $Z = (Z_1, \ldots, Z_d)$ be a Gaussian vector with zero mean. If $k$ is even, then

$$\mathbb{E}[Z_I] = \sum_{\sigma \in P_k} \prod_{j=1}^{k/2} \mathbb{E}\left[ Z_{i_{\sigma(2j-1)}} Z_{i_{\sigma(2j)}} \right].$$

If $k$ is odd, then $\mathbb{E}[Z_I] = 0$.
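
The right-hand side can be computed directly from a covariance matrix, which makes a handy sanity check. The sketch below (function names are mine, not from the references) evaluates the sum over pairings for $Z \sim N(0, \Sigma)$, using 0-based indices:

```python
import numpy as np

def pairings(items):
    # All partitions of `items` into disjoint pairs.
    if not items:
        yield []
        return
    first, rest = items[0], items[1:]
    for i, partner in enumerate(rest):
        for sub in pairings(rest[:i] + rest[i + 1:]):
            yield [(first, partner)] + sub

def isserlis_moment(Sigma, idx):
    # E[Z_{i_1} ... Z_{i_k}] for Z ~ N(0, Sigma) via Isserlis' theorem.
    # Odd-degree monomials have zero expectation.
    if len(idx) % 2 == 1:
        return 0.0
    # Sum over pairings of the product of covariance entries,
    # E[Z_a Z_b] = Sigma[a, b].
    return sum(
        np.prod([Sigma[a, b] for a, b in pairing])
        for pairing in pairings(list(idx))
    )

Sigma = np.array([[1.0, 0.5], [0.5, 1.0]])  # example covariance (assumption)
print(isserlis_moment(Sigma, [0, 0, 1, 1]))  # 1*1 + 2*(0.5)^2 = 1.5
```

Note that an empty index list correctly returns 1, matching the convention $x_\emptyset = 1$.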

Here are some special cases of Isserlis’ theorem to demonstrate how to interpret the equation above. If $k = 4$ and $i_j = j$ for $j = 1, \ldots, 4$, there are 3 possible pairings, giving us

$$\mathbb{E}[Z_1 Z_2 Z_3 Z_4] = \mathbb{E}[Z_1 Z_2]\,\mathbb{E}[Z_3 Z_4] + \mathbb{E}[Z_1 Z_3]\,\mathbb{E}[Z_2 Z_4] + \mathbb{E}[Z_1 Z_4]\,\mathbb{E}[Z_2 Z_3].$$

If we take $i_j = 1$ for $j = 1, \ldots, 4$, there are still 3 possible pairings, and we get

$$\mathbb{E}[Z_1^4] = 3 \left( \mathbb{E}[Z_1^2] \right)^2.$$

This tells us that the 4th moment of a mean-zero one-dimensional Gaussian random variable is 3 times the square of its 2nd moment.

As a final example, if we take $k = 4$ and $(i_1, i_2, i_3, i_4) = (1, 1, 2, 2)$, we still have 3 possible pairings, and we get

$$\mathbb{E}[Z_1^2 Z_2^2] = \mathbb{E}[Z_1^2]\,\mathbb{E}[Z_2^2] + 2 \left( \mathbb{E}[Z_1 Z_2] \right)^2.$$

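This last identity is easy to check by simulation. The sketch below (the covariance values are my own arbitrary choice) compares a Monte Carlo estimate of $\mathbb{E}[Z_1^2 Z_2^2]$ against the closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[1.0, 0.7], [0.7, 2.0]])  # arbitrary covariance (assumption)
Z = rng.multivariate_normal(np.zeros(2), Sigma, size=500_000)

# Monte Carlo estimate of E[Z_1^2 Z_2^2] ...
mc = np.mean(Z[:, 0] ** 2 * Z[:, 1] ** 2)
# ... versus the Isserlis closed form E[Z_1^2] E[Z_2^2] + 2 (E[Z_1 Z_2])^2.
closed_form = Sigma[0, 0] * Sigma[1, 1] + 2 * Sigma[0, 1] ** 2  # = 2.98
print(mc, closed_form)  # the two values should agree to roughly two decimals
```
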
References:

- Isserlis, L. (1918) On a formula for the product-moment coefficient of any order of a normal frequency distribution in any number of variables.
- Vignat, C. (2011) A generalized Isserlis theorem for location mixtures of Gaussian random vectors.
