Discrete Random Variables

If the random variable $X$ can take on a finite or countably infinite number of values $x_1, x_2, \ldots$ with corresponding probabilities $P(X=x_i) = p_i$, the expected value is defined as:

$$E[X] = \sum_{i} x_i p_i$$
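
As a concrete illustration, here is a minimal Python sketch that computes this sum; the values and probabilities (a fair six-sided die) are assumed purely for the example:

```python
# Expected value of a discrete random variable: E[X] = sum_i x_i * p_i.
# Example data (assumed for illustration): a fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

expected_value = sum(x * p for x, p in zip(values, probs))
print(expected_value)  # 3.5
```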

Continuous Random Variables

If the random variable $X$ is continuous with a probability density function $f(x)$, the expected value is defined as:

$$E[X] = \int_{-\infty}^{\infty} x f(x)\, dx$$
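
A short numerical sketch of this integral, assuming the standard normal density as the example $f(x)$ and using `scipy.integrate.quad` for the quadrature:

```python
import math
from scipy.integrate import quad

# Example density (assumed for illustration): the standard normal pdf.
def f(x):
    return math.exp(-x**2 / 2) / math.sqrt(2 * math.pi)

# E[X] = integral of x * f(x) dx over the real line.
expected_value, _ = quad(lambda x: x * f(x), -math.inf, math.inf)
print(expected_value)  # approximately 0
```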