WeiYa's Work Yard

A dog, who fell into the ocean of statistics, tries to write down his ideas and notes to save himself.

The Applications of Monte Carlo

Posted on September 07, 2017

Monte Carlo Integration

In order to calculate the integral

$$I = \int_D g(\mathbf x)\, d\mathbf x\,,$$

we often adopt the Monte Carlo technique:

  1. Firstly, compute the volume $V$ of the region $D$.
  2. Secondly, sample $\mathbf x^{(1)}, \mathbf x^{(2)},\ldots, \mathbf x^{(m)}$ uniformly from the region $D$.
  3. Finally, approximate the integral by

$$\hat I = \frac{V}{m}\sum_{i=1}^m g(\mathbf x^{(i)})\,.$$

According to the law of large numbers, we have

$$\hat I \rightarrow I \quad \text{almost surely as } m\rightarrow\infty\,,$$

and from the central limit theorem,

$$\sqrt{m}\,(\hat I - I) \rightarrow N(0, V^2\sigma^2) \quad \text{in distribution}\,,$$

where $\sigma^2 = \mathrm{var}(g(\mathbf x))$ under the uniform distribution on $D$.
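A minimal sketch of the three steps above in Python, using the unit disk as an illustrative region $D$ (the function name `mc_integrate`, the rejection-from-a-square sampler, and the test integrand $g(x,y)=x^2+y^2$ are my own choices for the example, not from the original post):

```python
import math
import random

def mc_integrate(g, m=100_000, seed=0):
    """Estimate I = integral of g over the unit disk D by plain Monte Carlo.

    Uniform samples on D are drawn by rejection from the bounding square
    [-1, 1]^2; the volume (area) of D is V = pi.
    """
    rng = random.Random(seed)
    V = math.pi
    total = 0.0
    n = 0
    while n < m:
        x = rng.uniform(-1.0, 1.0)
        y = rng.uniform(-1.0, 1.0)
        if x * x + y * y <= 1.0:   # accepted point is uniform on the disk
            total += g(x, y)
            n += 1
    return V * total / m           # I_hat = (V / m) * sum g(x^(i))

# For g(x, y) = x^2 + y^2 the exact value is pi/2
estimate = mc_integrate(lambda x, y: x * x + y * y)
print(estimate)
```

With $m = 10^5$ samples the CLT above predicts a standard error of roughly $V\sigma/\sqrt m \approx 0.003$ for this integrand, so the printed estimate should be close to $\pi/2 \approx 1.5708$.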

Importance Sampling

It is worth noting that the above sampling is uniform. However, it is often difficult to produce uniform random samples in an arbitrary region $D$. To overcome this problem, we can adopt importance sampling, in which one generates random samples $\mathbf x^{(1)}, \mathbf x^{(2)},\ldots, \mathbf x^{(m)}$ from a nonuniform distribution $\pi(\mathbf x)$ that puts more probability mass on the important parts of the state space $D$. Then one can estimate the integral $I$ as

$$\hat I = \frac{1}{m}\sum_{i=1}^m \frac{g(\mathbf x^{(i)})}{\pi(\mathbf x^{(i)})}\,.$$
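A small Python sketch of this weighted estimator (the example integral $\int_0^\infty x^2 e^{-x}\,dx = 2$ and the choice of an $\mathrm{Exp}(1)$ proposal $\pi$ are my own illustrative assumptions):

```python
import math
import random

def importance_sampling(g, sample_pi, pi_pdf, m=200_000, seed=0):
    """Estimate I = integral of g by drawing x ~ pi and averaging g(x)/pi(x)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(m):
        x = sample_pi(rng)
        total += g(x) / pi_pdf(x)
    return total / m

# Example: I = \int_0^infty x^2 e^{-x} dx = 2, with proposal pi = Exp(1),
# so the importance weight g(x)/pi(x) simplifies to x^2.
g = lambda x: x * x * math.exp(-x)
sample = lambda rng: rng.expovariate(1.0)   # draw from Exp(1)
pdf = lambda x: math.exp(-x)                # Exp(1) density on [0, infty)
estimate = importance_sampling(g, sample, pdf)
print(estimate)
```

Here the proposal already concentrates mass where $g$ is nonnegligible, which is exactly the point of choosing $\pi$ well.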

Optimization Problem

Suppose we need to find the minimum of a target function $h(\mathbf x)$. The problem is equivalent to finding the maximum of another function, $q_T(\mathbf x)=\exp(-h(\mathbf x)/T)$ (for any $T>0$).

In the case when $q_T(\mathbf x)$ is integrable for all $T>0$, we can make up a family of probability distributions

$$\pi_T(\mathbf x) = \frac{\exp(-h(\mathbf x)/T)}{\int \exp(-h(\mathbf x)/T)\, d\mathbf x}\,.$$

If we can sample from $\pi_T(\mathbf x)$ when $T$ is sufficiently small, the resulting random draws will most likely be located in the vicinity of the global minimum of $h(\mathbf x)$. This consideration is the basis of the well-known simulated annealing algorithm and is also key to the tempering techniques for designing more efficient Monte Carlo algorithms.
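The idea above can be sketched as a simple simulated annealing loop: run a Metropolis sampler targeting $\pi_T \propto \exp(-h(\mathbf x)/T)$ while gradually lowering $T$. The double-well test function, geometric cooling schedule, and all tuning constants below are my own illustrative choices:

```python
import math
import random

def simulated_annealing(h, x0=0.0, T0=5.0, cooling=0.999,
                        steps=20_000, step_size=0.5, seed=0):
    """Minimize h by Metropolis moves targeting pi_T(x) ~ exp(-h(x)/T),
    with T lowered geometrically toward 0."""
    rng = random.Random(seed)
    x, T = x0, T0
    best_x, best_h = x, h(x)
    for _ in range(steps):
        y = x + rng.gauss(0.0, step_size)   # propose a local move
        # accept with probability min(1, exp(-(h(y) - h(x)) / T))
        if h(y) <= h(x) or rng.random() < math.exp(-(h(y) - h(x)) / T):
            x = y
            if h(x) < best_h:
                best_x, best_h = x, h(x)
        T *= cooling                        # cool: pi_T concentrates near the minimum
    return best_x, best_h

# Double-well example: a local minimum near x = 2 (h about 2) and
# the global minimum near x = -2 (h about -2).
h = lambda x: (x * x - 4.0) ** 2 + x
best_x, best_h = simulated_annealing(h)
print(best_x, best_h)
```

At high $T$ the chain crosses the barrier between the two wells easily; as $T \to 0$ the draws settle into a neighborhood of a minimum, which is exactly the behavior the paragraph above describes.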

References

Liu, Jun S. Monte Carlo strategies in scientific computing. Springer Science & Business Media, 2008.


Published in categories Monte Carlo