How to obtain the moment bound from the importance sampling identity?

  • #1
WMDhamnekar
TL;DR Summary
Let ##m(t) = E[X^t]##. The moment bound states that for a > 0, ##P\{ X \geq a \}\leq m(t)a^{-t}## for all ##t > 0##. How would you prove this result using the importance sampling identity?
Let ##X## be a non-negative random variable and let a > 0. We want to bound the probability ##P\{X \geq a\}## in terms of the moments of X.
- Define a function ##h(x) = \mathbb{1}\{x \geq a\}##, where ##\mathbb{1}\{\cdot\}## is the indicator function that returns 1 if the argument is true and 0 otherwise. Then, we have ##P\{X \geq a\} = E_f[h(X)]##, where ##E_f## denotes the expected value with respect to the pdf of X.
- Choose another random variable Y with probability density function (pdf) ##f_Y(y)## such that ##f_Y(y) > 0## whenever ##f_X(y) > 0##, where ##f_X(x)## is the pdf of X. This is called the importance distribution. Define the importance weight as ##w(x) = f_X(x)/f_Y(x)##.
- Apply the importance sampling identity to write ##E_f[h(X)] = E_{f_Y}[h(Y)\,w(Y)]##, where the expectation on the right-hand side is taken with respect to the pdf of Y.
Now, how do I proceed further? Can we use Jensen's inequality here?
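As a quick numerical sanity check of this setup (the concrete choices below are illustrative assumptions only, not part of the problem: ##X \sim## Exponential(1), importance density ##f_Y## an Exponential with rate 1/2, and a = 3), the direct Monte Carlo estimate of ##P\{X \geq a\}## and the importance-weighted one should agree:

```python
# Sanity check of E_f[h(X)] = E_{f_Y}[h(Y) w(Y)] for h(x) = 1{x >= a}.
# Illustrative assumptions: X ~ Exponential(1), f_Y = Exponential with rate 1/2, a = 3.
import numpy as np

rng = np.random.default_rng(0)
a, n = 3.0, 200_000

# Direct Monte Carlo estimate of P{X >= a} = E_f[h(X)]
x = rng.exponential(scale=1.0, size=n)        # samples from f_X
direct = np.mean(x >= a)

# Importance sampling: draw Y from f_Y and weight by w(y) = f_X(y) / f_Y(y)
y = rng.exponential(scale=2.0, size=n)        # samples from f_Y (mean 2, i.e. rate 1/2)
w = np.exp(-y) / (0.5 * np.exp(-0.5 * y))     # importance weights
weighted = np.mean((y >= a) * w)

print(direct, weighted, np.exp(-a))           # all three should be close to e^{-3} ≈ 0.0498
```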
 
  • #2
My Answer:
The importance sampling identity states that for any measurable function f and random variable X with probability density function p, the expected value of f(X) can be expressed as:

##E[f(X)] = \int f(x)\, p(x)\, dx = \int f(x)\, \frac{p(x)}{q(x)}\, q(x)\, dx = E_q\!\left[f(Y)\,\frac{p(Y)}{q(Y)}\right],##

where q is another probability density function that we choose (with ##q(x) > 0## wherever ##f(x)\,p(x) \neq 0##) and Y denotes a random variable with density q. This identity allows us to estimate E[f(X)] by sampling from q instead of p.
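In code, the right-hand side is just a weighted Monte Carlo average over draws from q. A minimal generic sketch, assuming NumPy; the helper name and its arguments are hypothetical, chosen purely for illustration:

```python
# Generic sketch of the identity as an estimator: E[f(X)] is approximated by
# averaging f(Y) * p(Y)/q(Y) over draws Y from the chosen density q.
import numpy as np

def importance_estimate(f, p, q, sample_q, n=100_000, rng=None):
    """Estimate E_p[f(X)] by sampling from the chosen density q (hypothetical helper)."""
    rng = rng if rng is not None else np.random.default_rng()
    y = sample_q(rng, n)                  # draws Y_1, ..., Y_n from q
    return np.mean(f(y) * p(y) / q(y))    # importance-weighted sample mean
```

With f taken to be the indicator of ##\{x \geq a\}##, this returns an importance-sampling estimate of ##P\{X \geq a\}##.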

Now, let's move on to proving the moment bound using the importance sampling identity. We want to show that for any a > 0 and any t > 0, ##P\{X \geq a\} \leq m(t)\,a^{-t}##. Since X is non-negative and t > 0, the event ##\{X \geq a\}## is the same as ##\{X^t \geq a^t\}##, so this also bounds ##P\{X^t \geq a^t\}##.

To do this, we will choose the function ##f(x) = \mathbb{I}(x \geq a)##, where ##\mathbb{I}## is the indicator function that takes the value 1 if the condition inside the parentheses is true, and 0 otherwise, so that ##P\{X \geq a\} = E[f(X)]##. The trick is the choice of importance density: take q to be the tilted density

##q(x) = \frac{x^t\, p(x)}{m(t)},##

which is a genuine probability density because ##m(t) = \int x^t p(x)\, dx## (if ##m(t) = \infty## the bound is trivial, so assume ##m(t) < \infty##). With this choice the importance weight is ##\frac{p(x)}{q(x)} = m(t)\, x^{-t}##, and ##q(x) > 0## wherever ##\mathbb{I}(x \geq a)\, p(x) > 0##, since a > 0. Applying the importance sampling identity, we have:

##P\{X \geq a\} = \int \mathbb{I}(x \geq a)\, p(x)\, dx = \int \mathbb{I}(x \geq a)\, \frac{p(x)}{q(x)}\, q(x)\, dx = m(t) \int_{x \geq a} x^{-t}\, q(x)\, dx = m(t)\, E_q\!\left[\mathbb{I}(Y \geq a)\, Y^{-t}\right],##

where Y is a random variable with density q.
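Before finishing the bound, here is a quick numerical check of this last identity. The concrete choices are illustrative assumptions only: take ##X \sim## Exponential(1) and t = 2, so that ##m(2) = E[X^2] = 2## and the tilted density ##q(x) = x^2 e^{-x}/2## is a Gamma(3, 1) density:

```python
# Check that P{X >= a} = m(t) * E_q[1{Y >= a} * Y^(-t)] for the tilted density q.
# Illustrative assumptions: X ~ Exponential(1), t = 2, so m(2) = 2 and q = Gamma(3, 1).
import numpy as np

rng = np.random.default_rng(1)
a, t, n = 3.0, 2, 500_000
m_t = 2.0                                        # E[X^2] for Exponential(1)

y = rng.gamma(shape=t + 1, scale=1.0, size=n)    # draws from the tilted density q
estimate = m_t * np.mean((y >= a) * y**(-t))

print(estimate, np.exp(-a))                      # both ≈ P{X >= a} = e^{-3} ≈ 0.0498
print(m_t * a**(-t))                             # the moment bound m(t)/a^t ≈ 0.222
```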

Now comes the final step. On the event ##\{Y \geq a\}## we have ##Y^{-t} \leq a^{-t}##, because t > 0, so

##\mathbb{I}(Y \geq a)\, Y^{-t} \;\leq\; a^{-t}\, \mathbb{I}(Y \geq a) \;\leq\; a^{-t}.##

Taking expectations with respect to q in the identity above therefore gives:

##P\{X \geq a\} = m(t)\, E_q\!\left[\mathbb{I}(Y \geq a)\, Y^{-t}\right] \;\leq\; m(t)\, a^{-t}\, E_q\!\left[\mathbb{I}(Y \geq a)\right] \;\leq\; m(t)\, a^{-t}.##

And there you have it! We have proved the moment bound using the importance sampling identity: for any a > 0 and any t > 0, ##P\{X \geq a\} = P\{X^t \geq a^t\} \leq m(t)\, a^{-t}##. Notice that neither Jensen's inequality nor a separate appeal to Markov's inequality was needed; the bound comes entirely from the choice of the tilted importance density q.
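Since the bound holds for every t > 0, it can also be tightened by minimizing the right-hand side over t. A small illustrative check, assuming ##X \sim## Exponential(1) (for which ##m(t) = \Gamma(t+1)##) and a = 3:

```python
# The bound P{X >= a} <= m(t) / a^t holds for every t > 0, so minimize over t.
# Illustrative assumption: X ~ Exponential(1), for which m(t) = Gamma(t + 1).
from math import exp, gamma

a = 3.0
true_p = exp(-a)                                 # exact P{X >= a} for Exponential(1)
for t in (0.5, 1.0, 2.0, 2.5, 3.0, 4.0):
    bound = gamma(t + 1) / a**t                  # m(t) * a^{-t}
    print(f"t = {t}: bound = {bound:.4f}  (true P = {true_p:.4f})")
```

Here every bound is at least the true probability ##e^{-3} \approx 0.0498##, and the smallest bound (about 0.21) occurs between t = 2 and t = 3.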

I think this answer now looks correct. Doesn't it?
 
