
Variance and Standard Deviation: Definitions, Differences, and Applications

Study Guide - Smart Notes

Tailored notes based on your materials, expanded with key definitions, examples, and context.

Variance and Standard Deviation

Introduction

Variance and standard deviation are two fundamental measures of statistical dispersion, describing how data points in a set differ from the mean. Understanding their definitions, calculations, and differences is essential for analyzing data variability in statistics.

Definitions

  • Variance: The statistical measure of how far the numbers in a data set are spread out from their average (mean). It quantifies the average squared deviation of each data point from the mean.

  • Standard Deviation: The measure of dispersion of values in a data set relative to their mean. It represents the absolute variability of the dispersion and is the square root of the variance.

Formulas

  • Variance (S²):

    S² = Σ(xᵢ − x̄)² / n

    Where:

    • xᵢ = value in the data set

    • x̄ = mean of the data set

    • n = number of values

  • Standard Deviation (S):

    S = √(S²) = √( Σ(xᵢ − x̄)² / n )

    Where:

    • S = standard deviation

    • S² = variance
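The two formulas above can be sketched directly in Python. This is a minimal illustration of the population versions of the formulas (dividing by n, as written above); the function names are chosen here for clarity and are not part of the notes.

```python
import math

def variance(data):
    """Population variance: the average squared deviation from the mean."""
    n = len(data)
    mean = sum(data) / n
    return sum((x - mean) ** 2 for x in data) / n

def std_dev(data):
    """Standard deviation: the square root of the variance."""
    return math.sqrt(variance(data))
```

Note that dividing by n − 1 instead of n gives the sample variance, which is often used when the data are a sample rather than the full population.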

Key Differences Between Variance and Standard Deviation

| Variance | Standard Deviation |
| --- | --- |
| Statistical measure of how far the numbers in a data set are spread from their average. | Measure of dispersion of values in a data set relative to their mean. |
| Helps determine the size of the data spread. | Measures the absolute variability of the dispersion. |
| Calculated by taking the average of the squared deviations of each value from the mean. | Calculated by taking the square root of the variance. |
| Expressed in squared units (e.g., if data is in meters, variance is in meters squared). | Expressed in the same units as the original data (e.g., meters). |
| A key input for asset allocation in investment portfolios. | Used as a measure of market and security volatility in finance. |

Further Explanation

  • Standard deviation describes how spread out a group of values is from the mean; it is computed as the square root of the variance.

  • Variance measures the average degree to which each point differs from the mean—the average of the squared differences.

  • Variance uses squared deviations, which gives outliers more weight than data points closer to the mean.

  • The unit of standard deviation is the same as the units of the original data, while the units of variance are the squared units.

Example

  • Suppose a data set contains the values: 2, 4, 4, 4, 5, 5, 7, 9.

    • Mean (x̄): (2 + 4 + 4 + 4 + 5 + 5 + 7 + 9) / 8 = 40 / 8 = 5

    • Variance (S²): [(2−5)² + (4−5)² + (4−5)² + (4−5)² + (5−5)² + (5−5)² + (7−5)² + (9−5)²] / 8 = 32 / 8 = 4

    • Standard Deviation (S): √4 = 2
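The worked example can be checked with Python's standard `statistics` module, which provides `pvariance` and `pstdev` for the population variance and population standard deviation used here:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]

print(statistics.mean(data))       # mean = 5
print(statistics.pvariance(data))  # population variance = 4
print(statistics.pstdev(data))     # population standard deviation = 2
```

The module also offers `variance` and `stdev` for the sample versions, which divide by n − 1 and would give slightly larger values for this data set.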

Applications

  • Variance is used in portfolio theory to assess asset allocation and risk.

  • Standard deviation is widely used in finance to measure market volatility and risk.
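As a sketch of the volatility application, the standard deviation of a series of returns can serve as a simple volatility measure: a larger standard deviation means the returns swing more widely around their mean. The return figures below are hypothetical, chosen only for illustration.

```python
import statistics

# Hypothetical daily returns for two assets (illustrative values only).
asset_a = [0.01, -0.02, 0.015, 0.005, -0.01]
asset_b = [0.05, -0.06, 0.04, -0.03, 0.07]

vol_a = statistics.pstdev(asset_a)
vol_b = statistics.pstdev(asset_b)

# The asset whose returns have the larger standard deviation is the more volatile one.
print(f"Asset A volatility: {vol_a:.4f}")
print(f"Asset B volatility: {vol_b:.4f}")
```

Here asset B's returns are more dispersed around their mean, so its standard deviation (volatility) comes out higher than asset A's.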

Additional info: Variance and standard deviation are foundational concepts in statistics and finance, and understanding their differences is crucial for interpreting data spread and risk.
