
Jointly Distributed Random Variables, Expectation, and Properties: Study Notes

Study Guide - Smart Notes

Tailored notes based on your materials, expanded with key definitions, examples, and context.

Jointly Distributed Random Variables

Marginal and Joint Distributions

When dealing with two or more random variables, their joint distribution describes the probability structure of all variables together. The joint probability mass function (for discrete variables) or joint probability density function (for continuous variables) provides the probability or density for all combinations of values.

  • Marginal Distribution: The probability distribution of a subset of the variables, obtained by summing (discrete) or integrating (continuous) the joint distribution over the other variables.

  • Example (Continuous): If $f(x, y)$ is the joint density of $X$ and $Y$, then the marginal density of $X$ is $f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy$.

  • Example (Discrete): If $p(x, y)$ is the joint mass function, then $p_X(x) = \sum_y p(x, y)$.
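A marginal pmf can be computed mechanically by summing the joint pmf over the other variable. A minimal sketch in Python, using a small made-up joint pmf:

```python
# Made-up joint pmf of (X, Y), stored as (x, y) -> probability.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def marginal(joint_pmf, axis):
    """Sum the joint pmf over the other variable (axis=0 gives p_X, axis=1 gives p_Y)."""
    pmf = {}
    for pair, p in joint_pmf.items():
        pmf[pair[axis]] = pmf.get(pair[axis], 0.0) + p
    return pmf

p_X = marginal(joint, 0)  # {0: 0.30, 1: 0.70}
p_Y = marginal(joint, 1)  # {0: 0.40, 1: 0.60}
```

Note that each marginal still sums to 1, since the joint pmf does.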

Independence

Random variables $X$ and $Y$ are independent if the joint distribution factors into the product of their marginals:

  • For all sets $A$ and $B$, $P(X \in A,\, Y \in B) = P(X \in A)\, P(Y \in B)$.

  • For continuous variables: $f(x, y) = f_X(x)\, f_Y(y)$.

  • For discrete variables: $p(x, y) = p_X(x)\, p_Y(y)$.

Conditional Distributions

  • Discrete: $p_{X|Y}(x \mid y) = \dfrac{p(x, y)}{p_Y(y)}$, defined when $p_Y(y) > 0$.

  • Continuous: $f_{X|Y}(x \mid y) = \dfrac{f(x, y)}{f_Y(y)}$, defined when $f_Y(y) > 0$.
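In the discrete case, conditioning is just division by the marginal. A short numerical sketch using a hypothetical joint pmf:

```python
# Made-up joint pmf; conditional pmf of X given Y = y is p(x, y) / p_Y(y).
joint = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}

def conditional_pmf_x_given_y(joint_pmf, y):
    p_y = sum(p for (_, yy), p in joint_pmf.items() if yy == y)  # marginal p_Y(y)
    return {x: p / p_y for (x, yy), p in joint_pmf.items() if yy == y}

cond = conditional_pmf_x_given_y(joint, 1)  # {0: 1/3, 1: 2/3}
```

The resulting conditional pmf sums to 1 by construction.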

Order Statistics

The order statistics of a sample $X_1, \ldots, X_n$ are the sorted values $X_{(1)} \le X_{(2)} \le \cdots \le X_{(n)}$.

  • If $X_1, \ldots, X_n$ are independent and identically distributed (i.i.d.) with density $f$, the joint density of the order statistics is $f_{X_{(1)}, \ldots, X_{(n)}}(x_1, \ldots, x_n) = n!\, f(x_1) \cdots f(x_n)$ for $x_1 < x_2 < \cdots < x_n$.
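For intuition, order statistics are just the sorted sample. A known fact for i.i.d. uniform(0,1) variables is $E[X_{(k)}] = k/(n+1)$, which a Monte Carlo sketch can check (the choice $n = 5$ here is arbitrary):

```python
import random

random.seed(0)

n, trials = 5, 20000
sums = [0.0] * n
for _ in range(trials):
    order_stats = sorted(random.random() for _ in range(n))  # X_(1) <= ... <= X_(n)
    for k in range(n):
        sums[k] += order_stats[k]

means = [s / trials for s in sums]
# For i.i.d. uniform(0,1), E[X_(k)] = k / (n + 1): roughly 1/6, 2/6, ..., 5/6.
```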

Exchangeable Random Variables

A sequence $X_1, \ldots, X_n$ is exchangeable if the joint distribution is invariant under permutations:

  • $(X_{i_1}, \ldots, X_{i_n})$ has the same joint distribution as $(X_1, \ldots, X_n)$ for any permutation $(i_1, \ldots, i_n)$ of $(1, \ldots, n)$.

  • For discrete variables, $p(x_1, \ldots, x_n)$ is a symmetric function of $(x_1, \ldots, x_n)$.

Example: When drawing balls without replacement from an urn, the indicator variables for drawing a special ball are exchangeable.
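One consequence of exchangeability is that the chance the $i$th draw is special is the same for every position $i$. A simulation sketch with a made-up urn of 3 special and 7 ordinary balls:

```python
import random

random.seed(1)

# Made-up urn: 3 special ("S") and 7 ordinary ("O") balls, drawn without replacement.
trials = 20000
hits = [0] * 10  # hits[i] counts how often the (i+1)th draw was special
for _ in range(trials):
    urn = ["S"] * 3 + ["O"] * 7
    random.shuffle(urn)  # a uniformly random draw order
    for i, ball in enumerate(urn):
        if ball == "S":
            hits[i] += 1

probs = [h / trials for h in hits]  # each entry should be near 3/10
```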

Key Properties and Examples

  • Sum of Independent Exponential Random Variables: The sum of $n$ independent exponential($\lambda$) random variables is a gamma random variable with parameters $n$ and $\lambda$.

  • Polya's Urn Model: In this model, the sequence of draws is exchangeable; if the urn starts with $r$ red and $b$ blue balls, the probability that the $i$th ball drawn is red is always $\frac{r}{r+b}$, regardless of $i$.

  • Order Statistics of Uniform Variables: The spacings between order statistics of i.i.d. uniform(0,1) random variables are exchangeable.
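The gamma-sum fact above can be spot-checked by Monte Carlo: a gamma($n$, $\lambda$) variable has mean $n/\lambda$ and variance $n/\lambda^2$. A sketch with the made-up choices $n = 4$, $\lambda = 2$:

```python
import random
import statistics

random.seed(2)

# Made-up parameters: sum of n = 4 independent exponential(lam = 2) variables.
n, lam, trials = 4, 2.0, 20000
sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]

mean = statistics.fmean(sums)      # gamma(n, lam) mean: n / lam = 2.0
var = statistics.pvariance(sums)   # gamma(n, lam) variance: n / lam**2 = 1.0
```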

Expectation and Its Properties

Definition of Expectation

  • Discrete: $E[X] = \sum_x x\, p(x)$.

  • Continuous: $E[X] = \int_{-\infty}^{\infty} x\, f(x)\, dx$.

  • If $Y = g(X)$, then $E[Y] = \sum_x g(x)\, p(x)$ in the discrete case and $E[Y] = \int_{-\infty}^{\infty} g(x)\, f(x)\, dx$ in the continuous case.

Expectation of Sums

  • Linearity: $E[X + Y] = E[X] + E[Y]$ (holds for any random variables with finite expectations).

  • For $n$ random variables: $E[X_1 + \cdots + X_n] = E[X_1] + \cdots + E[X_n]$.

  • Sample Mean: For i.i.d. $X_1, \ldots, X_n$ with mean $\mu$, $E[\bar{X}] = \mu$, where $\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i$.

Indicator Variables and Counting

  • Let $I_A$ be the indicator of event $A$ ($I_A = 1$ if $A$ occurs, $0$ otherwise); then $E[I_A] = P(A)$.

  • For $X = \sum_{i=1}^{n} I_{A_i}$, the number of the events $A_1, \ldots, A_n$ that occur, $E[X] = \sum_{i=1}^{n} P(A_i)$.

  • Boole's Inequality: $P\left(\bigcup_{i=1}^{n} A_i\right) \le \sum_{i=1}^{n} P(A_i)$.
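Indicator variables make expected counts easy even when the events are dependent. A classic illustration is the matching problem: $n$ people take hats at random, and by indicators the expected number of own-hat matches is $n \cdot \frac{1}{n} = 1$ for every $n$. A simulation sketch:

```python
import random

random.seed(3)

# Matching problem: n people take hats uniformly at random; count own-hat matches.
# Indicators give E[matches] = n * (1/n) = 1 regardless of n.
n, trials = 10, 20000
total = 0
for _ in range(trials):
    hats = list(range(n))
    random.shuffle(hats)  # hats[i] is the hat person i receives
    total += sum(1 for i in range(n) if hats[i] == i)

avg_matches = total / trials  # should be close to 1
```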

Expectation of a Random Sum

  • If $N$ is a random variable (the number of terms) and $X_1, X_2, \ldots$ are i.i.d. and independent of $N$, then $E\left[\sum_{i=1}^{N} X_i\right] = E[N]\, E[X]$.
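The random-sum identity can be checked numerically. In the sketch below, $N$ is a fair die roll and the $X_i$ are uniform(0,1) (both made-up choices), so $E\left[\sum X_i\right] = E[N]\, E[X] = 3.5 \times 0.5 = 1.75$:

```python
import random

random.seed(4)

# Made-up setup: N is a fair die roll (E[N] = 3.5), X_i ~ uniform(0,1) (E[X] = 0.5),
# and N is independent of the X_i.
trials = 20000
total = 0.0
for _ in range(trials):
    n_terms = random.randint(1, 6)
    total += sum(random.random() for _ in range(n_terms))

avg = total / trials  # E[sum] = E[N] * E[X] = 3.5 * 0.5 = 1.75
```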

Conditional Expectation

  • Definition (Discrete): $E[X \mid Y = y] = \sum_x x\, p_{X|Y}(x \mid y)$.

  • Definition (Continuous): $E[X \mid Y = y] = \int_{-\infty}^{\infty} x\, f_{X|Y}(x \mid y)\, dx$.

  • Law of Total Expectation: $E[X] = E[E[X \mid Y]]$.

  • Law of Total Probability: $P(A) = \sum_y P(A \mid Y = y)\, P(Y = y)$ (with the sum replaced by an integral against $f_Y$ in the continuous case).

Conditional Variance

  • Definition: $Var(X \mid Y) = E\left[(X - E[X \mid Y])^2 \mid Y\right]$.

  • Law of Total Variance: $Var(X) = E[Var(X \mid Y)] + Var(E[X \mid Y])$.
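Both laws can be verified on a small made-up mixture: $Y$ is a fair coin, $X \mid Y = 0$ is uniform(0,1), and $X \mid Y = 1$ is uniform(0,3), giving $E[X] = 0.5 \cdot 0.5 + 0.5 \cdot 1.5 = 1$ and $Var(X) = E[Var(X \mid Y)] + Var(E[X \mid Y]) = \frac{5}{12} + \frac{1}{4} = \frac{2}{3}$:

```python
import random
import statistics

random.seed(5)

# Made-up mixture: Y is a fair coin; X | Y=0 ~ uniform(0,1), X | Y=1 ~ uniform(0,3).
trials = 40000
xs = []
for _ in range(trials):
    y = random.randint(0, 1)
    xs.append(random.uniform(0, 1) if y == 0 else random.uniform(0, 3))

mean_x = statistics.fmean(xs)     # E[X] = E[E[X|Y]] = 0.5*0.5 + 0.5*1.5 = 1.0
var_x = statistics.pvariance(xs)  # Var(X) = E[Var(X|Y)] + Var(E[X|Y]) = 5/12 + 1/4
```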

Prediction and Best Predictors

  • The function $g$ that minimizes $E[(Y - g(X))^2]$ is $g(X) = E[Y \mid X]$.

  • The best linear predictor of $Y$ given $X$ is $a + bX$, where:

$b = \dfrac{Cov(X, Y)}{Var(X)} = \rho\, \dfrac{\sigma_Y}{\sigma_X}$, $\quad a = E[Y] - b\, E[X]$.
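The slope and intercept of the best linear predictor can be recovered from simulated data. The linear model below ($Y = 2X + 1$ plus noise) is a made-up example, so the fitted slope and intercept should land near 2 and 1:

```python
import random
import statistics

random.seed(6)

# Made-up linear model: Y = 2X + 1 + noise, so the best linear predictor
# has slope b ~ 2 and intercept a ~ 1.
n = 20000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [2 * x + 1 + random.gauss(0, 0.5) for x in xs]

mx, my = statistics.fmean(xs), statistics.fmean(ys)
cov = statistics.fmean((x - mx) * (y - my) for x, y in zip(xs, ys))
var_x = statistics.pvariance(xs)

b = cov / var_x   # slope: Cov(X, Y) / Var(X)
a = my - b * mx   # intercept: E[Y] - b E[X]
```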

Covariance, Variance of Sums, and Correlation

Covariance

  • Definition: $Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]\, E[Y]$.

  • If $X$ and $Y$ are independent, $Cov(X, Y) = 0$ (but the converse is not always true).

  • Properties: $Cov(X, X) = Var(X)$; $Cov(X, Y) = Cov(Y, X)$; $Cov(aX, Y) = a\, Cov(X, Y)$; and $Cov\left(\sum_i X_i, \sum_j Y_j\right) = \sum_i \sum_j Cov(X_i, Y_j)$.

Variance of Sums

  • $Var(X_1 + \cdots + X_n) = \sum_{i=1}^{n} Var(X_i) + 2 \sum_{i < j} Cov(X_i, X_j)$.

  • If $X_1, \ldots, X_n$ are independent, $Var(X_1 + \cdots + X_n) = \sum_{i=1}^{n} Var(X_i)$.
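The covariance term matters whenever the summands are dependent. In the made-up construction below, $X_1 = X$ and $X_2 = X + Z$ with $X$, $Z$ independent standard normals, so $Var(X_1 + X_2) = 1 + 2 + 2 \cdot 1 = 5$:

```python
import random
import statistics

random.seed(10)

# Made-up dependent pair: X1 = X, X2 = X + Z with X, Z independent standard normals.
# Var(X1 + X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2) = 1 + 2 + 2*1 = 5.
trials = 40000
sums = []
for _ in range(trials):
    x = random.gauss(0, 1)
    z = random.gauss(0, 1)
    sums.append(x + (x + z))  # X1 + X2 = 2X + Z

var_sum = statistics.pvariance(sums)  # should be near 5
```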

Correlation

  • Definition: $\rho(X, Y) = \dfrac{Cov(X, Y)}{\sqrt{Var(X)\, Var(Y)}}$, with $-1 \le \rho(X, Y) \le 1$.

  • $\rho(X, Y) = 0$ means $X$ and $Y$ are uncorrelated (but not necessarily independent).
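A standard counterexample: if $X$ is uniform($-1$, $1$) and $Y = X^2$, then $Cov(X, Y) = E[X^3] = 0$, yet $Y$ is a deterministic function of $X$. A quick numerical check:

```python
import random
import statistics

random.seed(7)

# Counterexample: X ~ uniform(-1, 1) and Y = X**2 are uncorrelated but dependent.
n = 40000
xs = [random.uniform(-1, 1) for _ in range(n)]
ys = [x * x for x in xs]

mx, my = statistics.fmean(xs), statistics.fmean(ys)
cov = statistics.fmean((x - mx) * (y - my) for x, y in zip(xs, ys))
corr = cov / (statistics.pstdev(xs) * statistics.pstdev(ys))  # near 0
```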

Moment Generating Functions (MGFs)

Definition and Properties

  • Definition: $M_X(t) = E[e^{tX}]$.

  • For discrete $X$: $M_X(t) = \sum_x e^{tx}\, p(x)$.

  • For continuous $X$: $M_X(t) = \int_{-\infty}^{\infty} e^{tx}\, f(x)\, dx$.

  • Moments: The $n$th moment is $E[X^n] = M_X^{(n)}(0)$ (the $n$th derivative at $t = 0$).

  • MGF of Sums: If $X$ and $Y$ are independent, $M_{X+Y}(t) = M_X(t)\, M_Y(t)$.

  • Uniqueness: The MGF (if it exists in a neighborhood of $t = 0$) uniquely determines the distribution.

Common MGFs

  • Binomial($n, p$): $M(t) = (p e^t + 1 - p)^n$.

  • Poisson($\lambda$): $M(t) = e^{\lambda (e^t - 1)}$.

  • Exponential($\lambda$): $M(t) = \dfrac{\lambda}{\lambda - t}$ for $t < \lambda$.

  • Normal($\mu, \sigma^2$): $M(t) = e^{\mu t + \sigma^2 t^2 / 2}$.
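An MGF is just an expectation, so it can be estimated by averaging $e^{tX}$ over samples. A sketch comparing the empirical value to the exponential($\lambda$) formula $\lambda / (\lambda - t)$, with the made-up choices $\lambda = 2$, $t = 0.5$:

```python
import math
import random
import statistics

random.seed(8)

# Made-up choices: exponential(lam = 2) samples, MGF evaluated at t = 0.5 < lam.
lam, t, trials = 2.0, 0.5, 40000
samples = [random.expovariate(lam) for _ in range(trials)]

empirical_mgf = statistics.fmean(math.exp(t * x) for x in samples)
exact_mgf = lam / (lam - t)  # lambda / (lambda - t) = 2 / 1.5 = 4/3
```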

Multivariate Normal Distribution

Definition and Properties

  • A vector $(X_1, \ldots, X_n)$ is multivariate normal if each $X_i$ is a linear combination of independent standard normal variables plus a constant.

  • The joint MGF is: $M(t_1, \ldots, t_n) = \exp\left(\sum_{i=1}^{n} t_i\, E[X_i] + \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} t_i t_j\, Cov(X_i, X_j)\right)$.

  • The joint distribution is completely determined by the means and covariances.

  • Sum of Independent Normals: The sum of independent normal random variables is normal, with mean and variance equal to the sum of the means and variances, respectively.

Sample Mean and Variance (Normal Case)

  • If $X_1, \ldots, X_n$ are i.i.d. normal($\mu, \sigma^2$):

    • The sample mean $\bar{X}$ and sample variance $S^2$ are independent.

    • $(n - 1) S^2 / \sigma^2$ has a chi-squared distribution with $n - 1$ degrees of freedom.
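The chi-squared fact can be spot-checked by simulation: with $n = 5$, the statistic $(n-1)S^2/\sigma^2$ should have the chi-squared(4) mean $4$ and variance $2 \cdot 4 = 8$ (the normal parameters below are made up):

```python
import random
import statistics

random.seed(9)

# Made-up normal parameters; with n = 5, (n-1) * S^2 / sigma^2 ~ chi-squared(4).
n, mu, sigma, trials = 5, 10.0, 2.0, 20000
stats = []
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    s2 = statistics.variance(xs)           # sample variance (divides by n - 1)
    stats.append((n - 1) * s2 / sigma**2)

mean_stat = statistics.fmean(stats)      # chi-squared(4) mean: 4
var_stat = statistics.pvariance(stats)   # chi-squared(4) variance: 2 * 4 = 8
```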

Summary Table: Key Formulas and Properties

  • Marginal (Continuous): $f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy$

  • Marginal (Discrete): $p_X(x) = \sum_y p(x, y)$

  • Independence: $f(x, y) = f_X(x)\, f_Y(y)$ or $p(x, y) = p_X(x)\, p_Y(y)$

  • Conditional Expectation: $E[X \mid Y = y] = \sum_x x\, p_{X|Y}(x \mid y)$ (discrete) or $\int x\, f_{X|Y}(x \mid y)\, dx$ (continuous)

  • Law of Total Expectation: $E[X] = E[E[X \mid Y]]$

  • Covariance: $Cov(X, Y) = E[XY] - E[X]\, E[Y]$

  • Variance of Sum: $Var\left(\sum_i X_i\right) = \sum_i Var(X_i) + 2 \sum_{i < j} Cov(X_i, X_j)$

  • Correlation: $\rho(X, Y) = Cov(X, Y) / \sqrt{Var(X)\, Var(Y)}$

  • MGF (Sum of Independent): $M_{X+Y}(t) = M_X(t)\, M_Y(t)$

  • Best Linear Predictor: $a + bX$, where $b = Cov(X, Y) / Var(X)$ and $a = E[Y] - b\, E[X]$

Additional info:

  • Many of the examples and exercises in the source material illustrate the application of these concepts to real-world problems, such as the coupon collector's problem, random walks, and the analysis of algorithms (e.g., quick-sort).

  • For more advanced or theoretical results (e.g., Stieltjes integrals, general definition of expectation), the notes provide a foundation for understanding expectation beyond the discrete and continuous cases.
