CONTINUOUS RANDOM VARIABLES
1. Continuous random variables

Definition 1.1. A random variable X is said to be continuous if its distribution function can be written as

P[X \le x] = \int_{-\infty}^{x} f_X(u)\,du,

for some integrable f_X : \mathbb{R} \to [0, \infty), which is called the density function of X.
Remark 1.1. The intuition behind f_X is that P[X \in [x, x + dx]] \approx f_X(x)\,dx.

Remark 1.2. If F is differentiable then f_X can be chosen to be the derivative of F.

Lemma 1.1. If X has a density function f, then
(1) \int_{-\infty}^{\infty} f(x)\,dx = 1,
(2) P[X = x] = 0 for any x,
(3) P[a \le X \le b] = \int_{a}^{b} f(x)\,dx.
More generally, for nice sets B \subseteq \mathbb{R},

P[X \in B] = \int_{B} f(x)\,dx.
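As a quick numerical sanity check of Lemma 1.1, here is a minimal sketch assuming numpy and scipy are available; the exponential density with rate λ = 2 is an arbitrary illustrative choice, not part of the lemma:

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0  # rate of an illustrative exponential density

def f(x):
    # density lam * exp(-lam * x) on [0, inf), 0 elsewhere
    return lam * np.exp(-lam * x)

# (1): the density integrates to 1 (f vanishes on (-inf, 0))
total, _ = quad(f, 0, np.inf)
print(total)  # ~1.0

# (3): P[a <= X <= b] is the integral of f over [a, b]
a, b = 0.5, 1.5
prob, _ = quad(f, a, b)
print(prob)  # ~ exp(-1) - exp(-3) ≈ 0.3181
```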
Remark 1.3. The nice sets we are talking about are called Borel sets. They include intervals, unions of intervals, and essentially all sets over which you could actually compute the integral.

2. Expectation

Definition 2.1. We define the expectation of a continuous random variable to be

E(X) = \int_{-\infty}^{\infty} x f(x)\,dx.
Proposition 2.1. For X a positive continuous random variable,

E(X) = \int_{0}^{\infty} (1 - F(x))\,dx.
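The two formulas above can be compared numerically. A minimal sketch, again using an exponential with rate λ = 2 as an assumed example (its mean should be 1/λ = 0.5):

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)  # density on [0, inf)
F = lambda x: 1.0 - np.exp(-lam * x)  # distribution function

mean_from_density, _ = quad(lambda x: x * f(x), 0, np.inf)  # Definition 2.1
mean_from_tail, _ = quad(lambda x: 1.0 - F(x), 0, np.inf)   # Proposition 2.1
print(mean_from_density, mean_from_tail)  # both ~0.5 = 1/lam
```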
If X and g(X) are continuous random variables then

E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx.
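This formula can be sanity-checked against a simulation. The sketch below (the choice g(x) = x² with X uniform on [0, 1], and the sample size, are illustrative assumptions) compares the integral with a Monte Carlo average:

```python
import numpy as np
from scipy.integrate import quad

g = lambda x: x ** 2
f_X = lambda x: 1.0  # density of X uniform on [0, 1]

integral, _ = quad(lambda x: g(x) * f_X(x), 0.0, 1.0)

rng = np.random.default_rng(0)
samples = rng.uniform(0.0, 1.0, size=1_000_000)
monte_carlo = g(samples).mean()

print(integral, monte_carlo)  # both ~1/3
```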
3. Important continuous random variables

Uniform distribution: X is uniform on [a, b] if

F(x) = \begin{cases} 0 & \text{if } x \le a, \\ (x - a)/(b - a) & \text{if } a < x \le b, \\ 1 & \text{if } x > b. \end{cases}
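As a small illustration (assuming scipy; the endpoints a = 1, b = 3 are arbitrary), the piecewise formula matches the library's uniform distribution function:

```python
from scipy.stats import uniform

a, b = 1.0, 3.0
X = uniform(loc=a, scale=b - a)  # uniform on [a, b]

def F(x):
    # the piecewise formula above
    if x <= a:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return 1.0

for x in (0.0, 1.5, 2.0, 3.5):
    print(x, F(x), X.cdf(x))  # the two columns agree
```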
Exponential distribution: X is exponential with parameter λ > 0 if

F(x) = 1 - e^{-\lambda x}, \quad x \ge 0.
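Following Remark 1.2, the density of an exponential can be recovered by differentiating F. A rough finite-difference sketch (λ = 2 and the step size h are assumptions made for illustration):

```python
import numpy as np

lam, h = 2.0, 1e-6
F = lambda x: 1.0 - np.exp(-lam * x)

for x in (0.1, 1.0, 2.0):
    derivative = (F(x + h) - F(x - h)) / (2 * h)  # numerical F'(x)
    density = lam * np.exp(-lam * x)              # lam * exp(-lam * x)
    print(x, derivative, density)  # columns agree to ~6 digits
```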
Normal distribution: X has a normal (or Gaussian) distribution of mean µ and variance σ² > 0 if it has a density function given by

f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x - \mu)^2}{2\sigma^2} \right), \quad x \in \mathbb{R}.

This distribution is very important and is denoted N(µ, σ²).

Proposition 3.1. Let X be a random variable with N(µ, σ²) distribution. For any a ≠ 0 and b ∈ R, we have that aX + b has distribution N(aµ + b, a²σ²). (A numerical sketch of this proposition appears after Definition 4.1 below.)

4. Dependence

Definition 4.1. The joint distribution function F : R² → [0, 1] of X and Y is given by

F(x, y) = P(X \le x \text{ and } Y \le y).
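Returning to Proposition 3.1, here is a minimal simulation sketch; the values µ = 1, σ = 2, a = 3, b = −1 and the sample size are assumptions made for illustration. It compares the empirical mean and variance of aX + b with aµ + b and a²σ²:

```python
import numpy as np

mu, sigma = 1.0, 2.0
a, b = 3.0, -1.0

rng = np.random.default_rng(0)
X = rng.normal(mu, sigma, size=1_000_000)
Y = a * X + b

print(Y.mean(), a * mu + b)          # ~2.0 in both cases
print(Y.var(), a ** 2 * sigma ** 2)  # ~36.0 in both cases
```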
Definition 4.2. The random variables X and Y are jointly continuous with joint density function f : R² → [0, ∞) if

F(x, y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f(u, v)\,du\,dv, \quad \text{for } x, y \in \mathbb{R}.
Remark 4.1. If F is sufficiently differentiable, e.g. C², then

f(x, y) = \frac{\partial^2}{\partial x\,\partial y} F(x, y).
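A finite-difference sketch of this remark, where the joint distribution F(x, y) = (1 − e^{−x})(1 − e^{−y}) of two independent standard exponentials is an assumed example and the step h is arbitrary:

```python
import numpy as np

h = 1e-4
F = lambda x, y: (1.0 - np.exp(-x)) * (1.0 - np.exp(-y))
f = lambda x, y: np.exp(-x) * np.exp(-y)  # the corresponding joint density

x, y = 0.7, 1.3
# central finite-difference approximation of the mixed partial of F
mixed = (F(x + h, y + h) - F(x + h, y - h)
         - F(x - h, y + h) + F(x - h, y - h)) / (4 * h * h)
print(mixed, f(x, y))  # the two values agree closely
```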
Definition 4.3. When X and Y are defined jointly, we call the marginal distribution functions of X and Y

F_X(x) = P[X \le x] = \lim_{y \to \infty} F(x, y) \quad \text{and} \quad F_Y(y) = P[Y \le y] = \lim_{x \to \infty} F(x, y).
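Numerically, the limit can be approximated by evaluating F(x, y) at a large y. A sketch under the same assumed joint distribution of two independent standard exponentials used above:

```python
import numpy as np

F = lambda x, y: (1.0 - np.exp(-x)) * (1.0 - np.exp(-y))
F_X = lambda x: 1.0 - np.exp(-x)  # known marginal of a standard exponential

x = 0.8
for y in (1.0, 5.0, 30.0):
    print(y, F(x, y))  # approaches F_X(x) as y grows
print(F_X(x))          # ~0.5507
```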
5. Properties

Theorem 5.1. If g : R² → R is nice,

E[g(X, Y)] = \int_{\mathbb{R}^2} g(x, y) f(x, y)\,dx\,dy.
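As in the one-dimensional case, this can be checked against simulation. A sketch assuming g(x, y) = xy and X, Y independent uniforms on [0, 1], so that the joint density is 1 on the unit square:

```python
import numpy as np
from scipy.integrate import dblquad

g = lambda x, y: x * y
f = lambda x, y: 1.0  # joint density, uniform on the unit square

# dblquad integrates func(y, x) with x in [0, 1] and y in [0, 1]
integral, _ = dblquad(lambda y, x: g(x, y) * f(x, y), 0, 1, 0, 1)

rng = np.random.default_rng(0)
xs = rng.uniform(size=1_000_000)
ys = rng.uniform(size=1_000_000)
print(integral, g(xs, ys).mean())  # both ~0.25
```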
Proposition 5.1. E[aX + bY] = aE[X] + bE[Y].

Proposition 5.2. If X and Y have joint density function f then Z = X + Y has density function

f_Z(z) = \int_{-\infty}^{\infty} f(x, z - x)\,dx.
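A sketch of Proposition 5.2 for an assumed example: X and Y independent uniforms on [0, 1], whose sum is known to have the triangular density min(z, 2 − z) on [0, 2]:

```python
from scipy.integrate import quad

def f(x, y):
    # joint density of two independent uniforms on [0, 1]
    return 1.0 if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def f_Z(z):
    # convolution formula from Proposition 5.2; the integrand is
    # nonzero only for x in [max(0, z - 1), min(1, z)]
    lo, hi = max(0.0, z - 1.0), min(1.0, z)
    if lo >= hi:
        return 0.0
    value, _ = quad(lambda x: f(x, z - x), lo, hi)
    return value

for z in (0.25, 0.5, 1.0, 1.5):
    print(z, f_Z(z), min(z, 2.0 - z))  # numeric and exact values agree
```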
Proposition 5.3. Two continuous random variables X and Y are independent (i.e. P(X \le x \text{ and } Y \le y) = P(X \le x) P(Y \le y) for all x, y) if, and only if,

f_{X,Y}(x, y) = f_X(x) f_Y(y).
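To see the criterion fail, consider the assumed example of a pair (X, Y) uniform on the triangle {0 ≤ x ≤ y ≤ 1}, with joint density f_{X,Y}(x, y) = 2 there. The sketch computes the marginal densities by integrating out the other variable and checks the product against the joint density:

```python
from scipy.integrate import quad

def f_XY(x, y):
    # joint density: uniform on the triangle 0 <= x <= y <= 1
    return 2.0 if 0.0 <= x <= y <= 1.0 else 0.0

def f_X(x):
    value, _ = quad(lambda y: f_XY(x, y), x, 1.0)
    return value  # equals 2 * (1 - x)

def f_Y(y):
    value, _ = quad(lambda x: f_XY(x, y), 0.0, y)
    return value  # equals 2 * y

x, y = 0.2, 0.6
print(f_XY(x, y))       # 2.0
print(f_X(x) * f_Y(y))  # 1.6 * 1.2 = 1.92 != 2.0, so X and Y are dependent
```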