Question of the Day

More questions? here!

Consider an infinite sequence of independent Bernoulli trials $X_1, X_2, \ldots$ with $\mathbb{P}(X_i=1)=1/4^i$. Show:

  1. $\mathbb{P}(\sum_{i=n}^{\infty}X_i > 0) \le \frac{1}{3\times 4^{n-1}}$.
  2. Use the previous result to show $\mathbb{P}(\sum_{i=1}^{\infty}X_i < n) \ge 1-\frac{1}{3\times 4^{n-1}}$.
  3. Find $\mathbb{P}(\sum_{i=1}^{\infty}X_i < \infty)$.
  4. Show that $\mathbb{P}(\sum_{i=1}^{\infty}X_i = 0) < e^{-1/3}$.

Harvard Biostatistics Qualifying Exam (Jan 2017)
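
Before looking at the solution, here is a minimal Monte Carlo sketch in Python, assuming the infinite sequence can be truncated at $N=20$ terms (beyond that the success probabilities $1/4^i$ are negligible); it gives a feel for how small $\sum_i X_i$ typically is.

```python
import numpy as np

# Monte Carlo sketch: truncate the infinite Bernoulli sequence at N terms.
# Assumption: since P(X_i = 1) = 1/4^i decays geometrically, ignoring the
# terms beyond N changes the simulated sum with only tiny probability.
rng = np.random.default_rng(0)
N = 20            # truncation point (assumption: later terms are negligible)
reps = 200_000    # number of simulated sequences

p = 1.0 / 4.0 ** np.arange(1, N + 1)            # success probabilities 1/4, 1/16, ...
X = rng.random((reps, N)) < p                   # each row is one truncated sequence
S = X.sum(axis=1)                               # S approximates sum_{i>=1} X_i

print("estimated P(S = 0):", (S == 0).mean())   # exact value is prod(1 - 1/4^i), about 0.689
print("estimated P(S < 2):", (S < 2).mean())    # part 2 bound with n=2: at least 1 - 1/12, about 0.917
print("largest S observed:", S.max())           # the sum is finite in every replication
```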

Solution

  1. By Markov's inequality, we have \begin{align} LHS &= \mathbb{P}(\sum_{i=n}^{\infty}X_i \ge 1) \le \mathbb{E}(\sum_{i=n}^{\infty}X_i) = \sum_{i=n}^{\infty} \frac{1}{4^i} \\ &= \lim_{m\rightarrow \infty} \frac{1}{4^n} \frac{1-(1/4)^{m-n+1}}{1-(1/4)} = \frac{1}{3\times 4^{n-1}} = RHS. \end{align}
  2. It follows from the inclusion of events $$\Big\{\sum_{i=n}^{\infty}X_i = 0\Big\} \subseteq \Big\{\sum_{i=1}^{\infty}X_i < n\Big\},$$ which holds because on the left-hand event $\sum_{i=1}^{\infty}X_i = \sum_{i=1}^{n-1}X_i \le n-1 < n$. Hence $\mathbb{P}(\sum_{i=1}^{\infty}X_i < n) \ge \mathbb{P}(\sum_{i=n}^{\infty}X_i = 0) = 1 - \mathbb{P}(\sum_{i=n}^{\infty}X_i > 0) \ge 1-\frac{1}{3\times 4^{n-1}}$ by part 1.
  3. Since $\{\sum_{i=1}^{\infty}X_i < n\} \uparrow \{\sum_{i=1}^{\infty}X_i < \infty\}$, sending $n$ to $\infty$ in part 2 and using continuity from below, we obtain that the probability is 1. Alternatively, we can use the Borel–Cantelli lemma. Since $\sum_i \mathbb{P}(X_i=1) = \sum_i 1/4^i = 1/3 < \infty$, we have $\mathbb{P}(\{X_i=1 \; i.o.\}) = 0$. It follows that there exists $\Omega_0$ with $\mathbb{P}(\Omega_0)=1$ such that for every $\omega \in \Omega_0$ there is an $M = M(\omega)$ with $X_i(\omega)=0$ for all $i \ge M$, implying $\sum_{i=1}^{\infty}X_i(\omega) = \sum_{i=1}^{M-1}X_i(\omega) < \infty$.
  4. By independence, $\mathbb{P}(\sum_{i=1}^{\infty}X_i = 0) = \prod_{i=1}^\infty \mathbb{P}(X_i=0)$. Using the fact that $1-x < e^{-x}$ for $x>0$, we have \begin{align} LHS &= \prod_{i=1}^\infty \mathbb{P}(X_i=0) = \prod_{i=1}^{\infty} \Big( 1-\frac{1}{4^i} \Big) \\ &< \prod_{i=1}^{\infty} e^{-1/4^i} = e^{-\sum_{i=1}^\infty 1/4^i} = e^{-1/3} = RHS. \end{align} A numerical sanity check of the quantities in parts 1 and 4 is sketched after this list.
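
The closed-form quantities in parts 1 and 4 can also be evaluated directly. Below is a minimal sketch, assuming the infinite tail sum and infinite product can be truncated after roughly 60 terms (the remaining terms are far below floating-point precision).

```python
import math

# Part 1: the tail sum sum_{i>=n} 1/4^i equals the claimed bound 1/(3 * 4^(n-1)),
# so Markov's inequality gives exactly that bound.
for n in range(1, 6):
    tail = sum(1.0 / 4 ** i for i in range(n, n + 60))   # 60 terms capture essentially the whole tail
    bound = 1.0 / (3 * 4 ** (n - 1))
    print(f"n={n}: tail sum = {tail:.10f}, bound = {bound:.10f}")

# Part 4: P(sum X_i = 0) = prod_{i>=1} (1 - 1/4^i), compared with e^{-1/3}.
prob_zero = 1.0
for i in range(1, 60):
    prob_zero *= 1.0 - 1.0 / 4 ** i
print("P(sum = 0) ~", prob_zero)           # about 0.6885
print("exp(-1/3)  ~", math.exp(-1 / 3))    # about 0.7165, strictly larger, as part 4 shows
```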
