Profiling a random variable
E(X) and Var(X) can be viewed as a snapshot of a random variable: E(X) roughly describes the central tendency of X, while Var(X) reflects how spread out X is. Going one step further, Chebyshev's inequality below shows how these two numbers control the tail probabilities of X.
Chebyshev’s Inequality
Let X be a random variable with mean \( E(X)= \mu \) and variance \( Var(X)=\sigma^2 \). Then for any \( c > 0 \),
$$ P(|X-\mu| \geq c) \leq \frac{Var(X)}{c^2}.$$
ref: Textbook: 13.2, page 183
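For intuition, one standard derivation (a sketch; the textbook's argument may differ) applies Markov's inequality to the nonnegative random variable \( (X-\mu)^2 \):
$$ P(|X-\mu| \geq c) = P\big((X-\mu)^2 \geq c^2\big) \leq \frac{E\big[(X-\mu)^2\big]}{c^2} = \frac{\sigma^2}{c^2}. $$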
Remark
- Chebyshev’s bound is universal: it holds for any random variable with finite mean and variance. As a consequence, it may not be sharp for a particular distribution, and it is useless when \( c \leq \sigma \), since the bound then exceeds 1.
- Example: Let \( X \sim N(\mu, \sigma^2) \) and take \( c = \sigma/2, \sigma, 2\sigma \). Compute
$$ P(|X-\mu| \geq c) $$
and compare with Chebyshev’s bound; a numerical check is sketched after this list.
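A minimal numerical check of the example, assuming SciPy is available (not part of the course materials); it compares the exact normal tail probability with the Chebyshev bound for \( c = \sigma/2, \sigma, 2\sigma \):

```python
from scipy.stats import norm

sigma = 1.0  # the comparison is scale-free, so taking sigma = 1 loses no generality
for k in (0.5, 1.0, 2.0):            # c = sigma/2, sigma, 2*sigma
    c = k * sigma
    exact = 2 * norm.sf(c / sigma)   # P(|X - mu| >= c) for X ~ N(mu, sigma^2)
    cheby = sigma**2 / c**2          # Chebyshev bound; exceeds 1 when c < sigma
    print(f"c = {k}*sigma: exact = {exact:.4f}, Chebyshev bound = {cheby:.4f}")
```

The exact probabilities (about 0.617, 0.317, 0.046) sit well below the bounds (4, 1, 0.25), illustrating that Chebyshev’s inequality is far from sharp for the normal distribution.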
Probability Integral formula
Let X be a random variable with continuous cdf F. Then \( F(X) \sim U(0,1). \)
Hint of proof. You may start by assuming F has an inverse function \( F^{-1} \) to get an idea of what is going on. However, the PIF holds without this assumption; continuity of F is enough.
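As a quick illustration of the PIF (a sketch assuming NumPy/SciPy; the exponential distribution here is just an arbitrary continuous example), draw a sample with a known continuous cdf F, apply F, and check that the result looks uniform:

```python
import numpy as np
from scipy.stats import expon, kstest

rng = np.random.default_rng(0)
x = expon.rvs(size=10_000, random_state=rng)  # sample X with continuous cdf F = expon.cdf
u = expon.cdf(x)                              # transform: U = F(X)

# If F(X) ~ U(0,1), a KS test against the uniform cdf should not reject.
print(kstest(u, "uniform"))
```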
Homework 6
Textbook Sec 7.6: 7.2, 7.4, 7.8, 7.9, 7.15, 7.17; Sec 13.6: 13.1. To be discussed in class on 12/06.