# Exponential Twisting in Importance Sampling

##### Posted on Sep 18, 2019
Tags: Importance Sampling

Consider the problem of estimating

$$V = E[G(X)] = \int_{\IR^N} G(x) f(x)\,dx\,,$$

where $X$ is a random vector in $\IR^N$ with probability density $f$, and $G$ is a function from $\IR^N$ to $\IR$.

For a Gaussian vector, the original density $f(x)$ is

$$f(x) = \frac{1}{(2\pi)^{N/2}} \exp\Big(-\frac{1}{2}\sum_{i=1}^N x_i^2\Big)\,,$$

where $x = (x_1,\ldots,x_N)$.

A common approach is to add a drift vector $\mu=(\mu_1,\ldots,\mu_N)$; the corresponding new density is

$$f_\mu(x) = \frac{1}{(2\pi)^{N/2}} \exp\Big(-\frac{1}{2}\sum_{i=1}^N (x_i-\mu_i)^2\Big)\,.$$
The key problem is then how to achieve the minimum

$$\min_\mu E_\mu\big[G^2(X)W_\mu^2(X)\big]\,,\qquad W_\mu(x) = \frac{f(x)}{f_\mu(x)}\,,$$

since

$$\mathrm{Var}_\mu\big[G(X)W_\mu(X)\big] = E_\mu\big[G^2(X)W_\mu^2(X)\big] - V^2\,.$$
For a cumulative distribution function $F$ on $\IR^N$, define the cumulant generating function of $F$ as

$$\psi(\vartheta) = \log \int_{\IR^N} e^{\phi_\vartheta(x)}\,dF(x)\,,$$

where $\phi_\vartheta(x)$ is called the exponential twisting function, provided $\phi_\vartheta(x)$ is chosen so that $\psi(\vartheta) < \infty$.

For each twisting function $\phi_\vartheta(x)$, set

$$F_{\phi_\vartheta}(x) = \int_{(-\infty,\,x]} e^{\phi_\vartheta(y) - \psi(\vartheta)}\,dF(y)\,.$$
It can be shown that $F_{\phi_\vartheta}$ is a probability distribution and that $\{F_{\phi_\vartheta},\vartheta\}$ forms an exponential family of distributions. The transformation from $F$ to $F_{\phi_\vartheta}$ is called exponential twisting, or an exponential change of measure.

If $F$ has a density $f$, then $F_{\phi_\vartheta}$ has density

$$f_{\phi_\vartheta}(x) = e^{\phi_\vartheta(x) - \psi(\vartheta)}\, f(x)\,.$$

Here the weight function is $W(x)=e^{-\phi_\vartheta(x)+\psi(\vartheta)}$, so $f(x)=W(x)\,f_{\phi_\vartheta}(x)$.
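A minimal numerical sketch of the factorization $f(x) = W(x)\,f_{\phi_\vartheta}(x)$, assuming $F$ is standard normal on $\IR$ with the linear twist $\phi_\vartheta(x) = \vartheta x$ (so $\psi(\vartheta) = \vartheta^2/2$, and the twisted density is that of $N(\vartheta, 1)$):

```python
import math

THETA = 1.3
PSI = THETA**2 / 2  # psi(theta) = log E[exp(theta * X)] = theta^2/2 for N(0, 1)

def f(x):
    """Standard normal density (the original f)."""
    return math.exp(-x**2 / 2) / math.sqrt(2 * math.pi)

def f_twisted(x):
    """Twisted density exp(phi(x) - psi) * f(x); here it is the N(theta, 1) density."""
    return math.exp(THETA * x - PSI) * f(x)

def W(x):
    """Weight W(x) = exp(-phi(x) + psi), so that f(x) = W(x) * f_twisted(x)."""
    return math.exp(-THETA * x + PSI)

# The factorization f = W * f_twisted holds pointwise:
for x in (-1.0, 0.0, 0.7, 2.5):
    assert abs(f(x) - W(x) * f_twisted(x)) < 1e-15
```

For this linear twist the exponential change of measure is exactly the mean shift from earlier: twisting by $\vartheta x$ turns $N(0,1)$ into $N(\vartheta,1)$.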

The aim is to compute the expectation $V=E[G(X)]$. Using the exponential twisting function $\phi_\vartheta(x)$, we have

$$V = E[G(X)] = E_{\phi_\vartheta}\big[G(X)W(X)\big]\,,$$

and the second moment of $G(X)W(X)$ under $P_{\phi_\vartheta}$ is

$$E_{\phi_\vartheta}\big[G^2(X)W^2(X)\big] = E\big[G^2(X)W(X)\big]\,.$$
By the Cauchy-Schwarz inequality (using $E[W^{-1}(X)] = E[e^{\phi_\vartheta(X)-\psi(\vartheta)}] = 1$),

$$E\big[G^2(X)W(X)\big] \,\ge\, \big(E[G(X)]\big)^2 = V^2\,.$$

It follows that

$$\mathrm{Var}_{\phi_\vartheta}\big[G(X)W(X)\big] = E\big[G^2(X)W(X)\big] - V^2 \,\ge\, 0\,,$$

with equality when $G(X)W(X)$ is constant.
To achieve the optimal (zero-variance) situation, we would take

$$\phi_\vartheta(x) = \log G(x)\,,$$

but in this case $\psi(\vartheta)=\log E[G(X)] = \log V$ cannot be computed analytically, since $V$ is exactly the unknown quantity we want to estimate.
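One case where the optimum *is* computable in closed form, chosen purely for illustration (not from the post): $G(x)=e^x$ with $X \sim N(0,1)$, so $V = E[e^X] = e^{1/2}$. The optimal twist $\phi(x) = \log G(x) = x$ gives $\psi = \log V = 1/2$ and the twisted law $N(1,1)$, and every sample then contributes exactly $V$:

```python
import math
import random

random.seed(0)

# Hypothetical closed-form example: G(x) = exp(x), X ~ N(0, 1), so
# V = E[exp(X)] = exp(1/2).  The optimal twist phi(x) = log G(x) = x
# gives psi = log V = 1/2, and f_phi is the N(1, 1) density.
V = math.exp(0.5)
psi = 0.5

# Sample from the twisted law and form G(X) * W(X) = e^x * e^{-x + psi}.
samples = [random.gauss(1.0, 1.0) for _ in range(1_000)]
estimates = [math.exp(x) * math.exp(-x + psi) for x in samples]

# Every single-sample estimate equals V (up to rounding): zero variance.
```

This is the sense in which the optimal twist is "zero variance": the randomness cancels exactly, which is only possible because $V$ was already known.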

Applying a Taylor expansion to $\log G(X)$ around a point $X_0=(X_{10},\ldots,X_{N0})$,

$$\log G(X) = \log G(X_0) + \sum_{i=1}^N \frac{\partial \log G(X_0)}{\partial x_i}(X_i - X_{i0}) + \frac{1}{2}\sum_{i=1}^N\sum_{j=1}^N \frac{\partial^2 \log G(X_0)}{\partial x_i\,\partial x_j}(X_i - X_{i0})(X_j - X_{j0}) + o(\rho^2)\,,$$

where $\rho=\sqrt{\sum_{i=1}^N(X_i-X_{i0})^2}$.

Write

$$\log G(x) \approx \beta_0 + \sum_{i=1}^N \beta_i x_i + \sum_{i \le j} \beta_{ij}\, x_i x_j\,.$$

In order to avoid overfitting and to simplify the least-squares regression, remove the cross terms:

$$\log G(x) \approx \beta_0 + \sum_{i=1}^N \beta_i x_i + \sum_{i=1}^N \beta_{ii}\, x_i^2\,.$$
Published in categories Memo