Assume that you have a random number generator which gives you $U \sim \text{Unif}[0,1]$ (i.e. the uniform distribution on the interval $[0,1]$) and a cumulative distribution function $F$. Assume for simplicity that $F$ is strictly increasing, so that $F^{-1}$ is well-defined as a functional inverse. Inverse transform sampling allows us to use this set-up to draw samples from $F$: if $U \sim \text{Unif}[0,1]$, then $F^{-1}(U) \sim F$. Thus, we can take $X = F^{-1}(U)$ to be our sample.

(If $F$ is not strictly increasing, we can take $F^{-1}(u) = \inf \{ x : F(x) \geq u \}$.)
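As a concrete illustration, here is a minimal sketch in Python. The choice of distribution is my own: the Exponential(1) distribution, for which $F(x) = 1 - e^{-x}$ and so $F^{-1}(u) = -\log(1 - u)$ in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Exponential(1) as an example: F(x) = 1 - exp(-x), so F^{-1}(u) = -log(1 - u).
def F_inv(u):
    return -np.log(1.0 - u)

u = rng.uniform(size=100_000)  # U ~ Unif[0, 1]
x = F_inv(u)                   # X = F^{-1}(U) ~ F

# Sanity check: an Exponential(1) variable has mean 1,
# so the sample mean should be close to 1.
print(x.mean())
```

Any distribution with a computable inverse CDF works the same way; for distributions in `scipy.stats`, the inverse CDF is exposed as the `ppf` method.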

Did you know that we can modify this slightly to draw samples $X \sim F$ conditional on $X > a$, where $a \in \mathbb{R}$? Instead of taking $F^{-1}(U)$ as the sample, take $F^{-1}\big(F(a) + U(1 - F(a))\big)$ instead.

Here is the proof: Let $G$ be the CDF of $X$ given $X > a$. Then $G(x) = 0$ if $x \leq a$, and for $x > a$,

$$G(x) = \mathbb{P}[X \leq x \mid X > a] = \frac{F(x) - F(a)}{1 - F(a)}.$$

Now, for any $u \in (0, 1)$,

$$G^{-1}(u) = F^{-1}\big(F(a) + u(1 - F(a))\big),$$

which means that $F^{-1}\big(F(a) + U(1 - F(a))\big) = G^{-1}(U)$ has the same distribution as $X \mid X > a$, i.e. $G^{-1}(U) \sim G$ (by an application of inverse transform sampling).
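The truncated recipe above can be sketched in the same way. This is my own illustration, again using the Exponential(1) distribution with an assumed truncation point $a = 2$; by memorylessness of the exponential, $X \mid X > a$ is distributed as $a$ plus a fresh Exponential(1), so its mean should be near $a + 1$.

```python
import numpy as np

rng = np.random.default_rng(0)
a = 2.0  # truncation point (arbitrary choice for this example)

# Exponential(1): F(x) = 1 - exp(-x), F^{-1}(u) = -log(1 - u).
def F(x):
    return 1.0 - np.exp(-x)

def F_inv(u):
    return -np.log(1.0 - u)

u = rng.uniform(size=100_000)            # U ~ Unif[0, 1]
x = F_inv(F(a) + u * (1.0 - F(a)))       # samples from X | X > a

# Every sample should exceed a, and by memorylessness the
# sample mean should be close to a + 1.
print(x.min(), x.mean())
```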


Hi! I’m also a second-year PhD student at Stanford, working mainly on physics and information theory. While the proof of inverse transform sampling is nice, I’m not sure I’ve ever had to draw samples from a CDF conditioned on the relevant r.v. being greater than some value — could you give an example of where this is useful?


Hi! Sampling from truncated distributions comes up often in Gibbs sampling. It can also come up in some importance sampling applications. This paper by Prof Art Owen contains an example of this: https://arxiv.org/pdf/1710.06965.pdf

