## Table of Contents

- Random Variable Definition
- Types of Random Variable
- What is a Discrete Random Variable?
- What is a Continuous Random Variable?
- Lesson Summary

Understand what a random variable is and why it is used. Learn about the types of random variables and see examples of random variables from everyday life.
Updated: 12/10/2021


A random variable, also known as a stochastic variable, is a collection of possible outcomes and their corresponding probabilities. In practical use, a **random variable** can be intuitively understood as a variable that may take on different values randomly but whose value is not yet known.

More specifically, a random variable is defined as a set of possible outcomes, called a **sample space**, together with a **probability distribution function** that assigns specific outcomes, or groups of outcomes, to numbers between 0 and 1 that represent probabilities.

The outcome can represent an event that will happen in the future, like the result of rolling a six-sided die. In this example, the sample space is the set of integers from 1 to 6, with each integer corresponding to one side of the die. For a fair die, the probability of each of these outcomes is 1/6.

A random variable does not necessarily need to represent something that will happen in the future. A random variable can also represent a quantity that already exists but for which the precise value is unknown. For example, in a doctor's office, the *systolic blood pressure of the next patient to be treated* could be seen as a random variable. The patient has some particular systolic blood pressure, but it is not known precisely until it is measured.

Consider the example of rolling a six-sided die. The sample space **S** is a finite set of six integers:

{eq}S_\text{dice roll} = \{1,2,3,4,5,6\} {/eq}
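As a quick sketch (in Python, not part of the original lesson), the fair die can be simulated to check that each outcome in this sample space occurs with relative frequency close to 1/6:

```python
import random

# Simulate many rolls of a fair six-sided die. The sample space is
# S = {1, 2, 3, 4, 5, 6}, and for a fair die each outcome should
# occur with relative frequency near 1/6 ≈ 0.1667.
random.seed(0)  # fixed seed so the simulation is repeatable

rolls = [random.randint(1, 6) for _ in range(100_000)]
for face in range(1, 7):
    print(face, rolls.count(face) / len(rolls))
```

With 100,000 rolls, each printed frequency lands close to 1/6, illustrating how the probability distribution function assigns the same probability to every outcome of a fair die.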

In the blood pressure example above, the sample space is the set of nonnegative real numbers because blood pressure is measured as a single real number and cannot be negative:

{eq}S_\text{blood pressure} = \{x \in \mathbb{R} \mid x\geq 0\} {/eq}

Finally, consider flipping a coin repeatedly until it first comes up heads. The random variable representing *the number of coin flips required to get heads* has a sample space that is all of the positive integers (the natural numbers):

{eq}S_\text{flip coin until heads} = \{x \mid x \in \mathbb{N} \} = \{1,2,3, \ldots \} {/eq}

There are two types of random variables: **discrete** random variables and **continuous** random variables. Random variables are classified as discrete or continuous based on whether the sample space is **countable** or **uncountable**.

Discrete and continuous random variables differ in that, for a discrete random variable, each outcome in the sample space has an associated probability, while for a continuous random variable, each outcome has a **probability density** instead, and probabilities are assigned to *ranges* of outcomes.

A discrete random variable is defined as a random variable for which the sample space is countable. A countable sample space is one that has either a finite number of outcomes, like rolling a six-sided die, or a **countably infinite** number of outcomes. An infinite sample space is countably infinite when it's possible to assign a natural number (a positive integer) to each outcome.

In the example above, where a coin is repeatedly flipped until it comes up heads, the sample space for the number of flips required is countably infinite, and therefore this random variable is classified as discrete according to the definition of a discrete random variable.

For a discrete random variable, every outcome in the sample space has an associated probability, and the random variable as a whole can be described using a probability distribution function in the form of a histogram.

The probability distribution function P gives the specific probabilities of the different outcomes. The probability that a person gets heads on the first coin flip is 1/2, so P(1) = 1/2.

The probability that it takes two coin flips to get the first heads is equal to the probability of getting tails on the first flip *and* getting heads on the second; that is, the probability is {eq}\frac{1}{2} \times \frac{1}{2} = \frac{1}{4}{/eq}. Likewise, the probability that they get the first heads on the {eq}n^{\text{th}} {/eq} coin flip is {eq}\frac{1}{2^n} {/eq}. Note that the sum of all of the probabilities in the probability distribution function is always 1.
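The formula {eq}P(n) = \frac{1}{2^n} {/eq} and the claim that the probabilities sum to 1 can be checked directly (a small Python sketch, not part of the original lesson):

```python
# The probability that the first heads appears on flip n is 1 / 2**n.
# Summing these for n = 1 .. 30 should give a total extremely close to 1,
# consistent with a valid probability distribution function.
probs = [1 / 2**n for n in range(1, 31)]
print(probs[:3])   # first three probabilities: 1/2, 1/4, 1/8
print(sum(probs))  # partial sum, very close to 1
```

The partial sum differs from 1 only by {eq}\frac{1}{2^{30}} {/eq}, the probability mass left in the infinite tail of the distribution.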

A continuous random variable is defined as a random variable for which the sample space is **uncountable**. Usually, this means that the random variable can take on values from a range of real numbers. One example could be a person's *systolic blood pressure*. This is measured as a positive real number, and a typical value is approximately 120 mmHg.

For another example, consider throwing a dart at a circular dartboard. The *distance of the dart from the center of the dartboard* is a continuous random variable because it could be any real number between 0, if it were to hit the center exactly, and the radius of the dartboard, if it were to hit the very edge.

Suppose a dart is thrown at a dartboard with a radius of 1 meter, and it lands at some point on the board. The random variable representing the distance from the center must be some number between 0 and 1 meters. Therefore, the sample space for this random variable example is the interval of the real number line {eq}S = [0,1] {/eq}.

For a continuous random variable like this, individual outcomes in the sample space don't have probabilities, but rather probability *densities*. The continuous random variable is represented by a **probability density function** with a continuous domain instead of a probability distribution function with a discrete domain.

With a continuous random variable, probabilities are associated with *ranges* of the sample space, called **events**, instead of individual points in the sample space. Probabilities are determined by integrating the probability density function over a range. The integral of the probability density function over the entire sample space is always 1, similar to how all of the probabilities in a discrete probability distribution function always sum to 1.

In this random variable example, to find the probability that the dart lands within 0.2 meters of the center of the target, denoted P(x < 0.2), integrate the probability density function {eq}f(x) = -2x+2 {/eq} over the range {eq}[0,0.2] {/eq}:

{eq}P(x < 0.2) = \int_0^{0.2} \left ( -2x + 2 \right ) dx = 0.36 {/eq}
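This integral can be checked numerically. The sketch below (in Python; the helper names `f` and `integrate` are illustrative, not from the lesson) uses a simple midpoint rule, which is exact for a linear density like this one up to floating-point rounding:

```python
# Numerically integrate the dartboard density f(x) = -2x + 2 using the
# midpoint rule: sum f at the midpoint of each small subinterval times
# the subinterval width.
def f(x: float) -> float:
    return -2 * x + 2

def integrate(a: float, b: float, steps: int = 10_000) -> float:
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

print(integrate(0.0, 0.2))  # P(x < 0.2), should be ≈ 0.36
print(integrate(0.0, 1.0))  # integral over the whole sample space, ≈ 1
```

The second integral confirms that this density integrates to 1 over the entire sample space {eq}[0,1] {/eq}, as any valid probability density function must.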

This means there is a probability of 0.36, or a 36% chance, that the dart will land within this range.

Events can also be collections of ranges. In general, in a continuous sample space, an event is an open subset of the sample space or any countable union or intersection of open subsets, along with their complements. This means that ranges can be combined using logical operators like OR, AND, and NOT to yield other events. For example, the chance of the dart landing within 0.1 meters OR more than 0.9 meters from the center is an event with an associated probability. It can be represented as {eq}P(x < 0.1 \text{ OR } x > 0.9) {/eq}, and has a probability of

{eq}P(x < 0.1) + P(x > 0.9) = \int_0^{0.1} \left (-2x + 2 \right ) dx + \int_{0.9}^1 \left ( -2x + 2 \right ) dx = 0.2 {/eq}
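The compound-event calculation follows the same pattern: integrate the density over each range and add the results. A short Python sketch (the helper names are illustrative, not from the lesson):

```python
# The compound event "x < 0.1 OR x > 0.9" for the dartboard density
# f(x) = -2x + 2: its probability is the sum of the integrals over the
# two disjoint ranges [0, 0.1] and [0.9, 1].
def f(x: float) -> float:
    return -2 * x + 2

def integrate(a: float, b: float, steps: int = 10_000) -> float:
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

p = integrate(0.0, 0.1) + integrate(0.9, 1.0)
print(p)  # ≈ 0.19 + 0.01 = 0.2
```

Note how unevenly the probability splits: because the density is highest near the center, the inner ring contributes 0.19 of the total 0.2, and the outer ring only 0.01.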

Random variables link the possible outcomes of some random event with probabilities. A random variable can be thought of as a function whose domain includes all possible outcomes of the random event, called a **sample space**. In the case of a discrete random variable, the **probability distribution function** maps specific outcomes to probabilities between 0 and 1. For a continuous random variable, the probability distribution function maps certain *subsets* of outcomes to probabilities. These subsets are referred to as **events** and can be specific ranges of the sample space or combinations of ranges. For a continuous random variable, the individual outcomes have **probability densities** instead of probabilities, and the probability of an event is found by integrating the probability density function over that event.


## Frequently Asked Questions

A random variable is a function that associates certain outcomes or sets of outcomes with probabilities. Random variables are classified as **discrete** or **continuous** depending on the set of possible outcomes or **sample space**.

A variable is a random variable when it is meant to represent the outcome of some random event. Usually, it is denoted by a capital letter, like X or Y.
