What is an expected value? How can you tell how many times you should expect a coin to land on heads out of several flips? This lesson will show you the answers to both questions!
Random Processes and Variables
We are all familiar with processes that involve uncertainty. One example is flipping a coin, in which there is an inherent uncertainty in whether it will land on heads or tails, and the best we can do is assign probabilities to all of the possible outcomes.
In this lesson, you will learn about continuous probability distributions from a theoretical perspective as well as how to find expected values.
Mathematically, we can treat such processes by defining random variables and probability distribution functions. A random variable is a variable that describes all of the possible outcomes of a random process.
There are two types of random variables - discrete and continuous. Discrete random variables involve processes in which the total number of possible outcomes is countable. For example, flipping a coin twice and counting the number of times it lands on heads can be characterized by a discrete random variable. We can describe the possible outcomes using a random variable, X, which takes on the values 0, 1, or 2.
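As a minimal sketch (the dictionary representation here is illustrative, not from the lesson), a discrete random variable's countable outcomes can be listed together with their probabilities:

```python
# Discrete random variable: number of heads in two fair coin flips.
# Because the outcomes are countable, we can simply list them.
X = {0: 0.25, 1: 0.50, 2: 0.25}

# The probabilities over all possible outcomes must sum to 1.
assert abs(sum(X.values()) - 1.0) < 1e-9
print(sorted(X))  # the countable outcomes: [0, 1, 2]
```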
In contrast, continuous random variables involve processes in which the total number of possible outcomes is not countable. An example would be the time it takes for a random athlete to run one mile. Since, at least in theory, the time can be measured to an infinite precision, there is an uncountable number of possible time measurements. For instance, we can measure the time to be 6.2 minutes or 6.23 minutes or 6.234 minutes, and so on.
Therefore, we define a continuous random variable, X, associated with the time measurements for a one-mile run. Assuming that the fastest time cannot be less than 3.5 minutes and the slowest time cannot exceed 11 minutes, we define the random variable over an interval: X can take on any value between 3.5 and 11 minutes.
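To illustrate the contrast (this sampling snippet is an illustrative sketch, not part of the lesson), a continuous random variable's outcomes can only be sampled from its interval, never exhaustively listed:

```python
import random

# Continuous random variable: one-mile run time, restricted to [3.5, 11] minutes.
# Any real number in this interval is a possible outcome, so the outcomes
# cannot be counted; we can only sample or describe them.
LOW, HIGH = 3.5, 11.0

sample = random.uniform(LOW, HIGH)  # one possible time measurement
assert LOW <= sample <= HIGH
print(round(sample, 3))  # e.g. 6.234 -- precision is limited only by measurement
```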
Now, let's use this random variable to develop the theory behind continuous probability distributions.
Continuous Probability Distributions
Since any time measurement for the one-mile run can have infinite precision, the curve of the probability distribution function is continuous. This means that for any measurement x₁, there exists a corresponding value f(x₁). The function f(x) is also referred to as the probability density function, a continuous probability distribution function. One example is the Gaussian distribution:
Continuous probability distributions can have many other shapes, with the Gaussian being just one example. As long as we can map any value x₁ to a corresponding f(x₁), the probability distribution is continuous.
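For concreteness, a sketch of the Gaussian density (the parameter defaults below are the standard choices of mean 0 and standard deviation 1) shows how every x maps to a density value f(x):

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Gaussian probability density: every x maps to a density f(x) >= 0."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

# Any x has a corresponding density value, so the distribution is continuous.
print(round(gaussian_pdf(0.0), 4))  # peak of the standard Gaussian, 0.3989
```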
Expected Value Computation
The expected value, which can be thought of as the outcome that we should expect on average, is computed using the following formula for discrete probability distributions:

E(x) = Σ xₖ · P(xₖ)

Inside the summation to calculate the expected value, E(x), we have the values that the random variable can take on, denoted by xₖ, that are multiplied by their corresponding probabilities, denoted by P(xₖ).
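This formula can be written as a small function (the six-sided die below is a hypothetical example, not from the lesson):

```python
def expected_value(distribution):
    """E(x) = sum of x_k * P(x_k) over all outcomes of a discrete distribution."""
    return sum(x * p for x, p in distribution.items())

# Hypothetical example: a fair six-sided die, each face with probability 1/6.
die = {face: 1 / 6 for face in range(1, 7)}
print(round(expected_value(die), 6))  # 3.5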
Let's go back to the earlier example of flipping a coin twice and counting the number of times it lands on heads. We can compute the expected value as follows.
After two coin flips, the possible numbers of times the coin lands on heads are 0, 1, and 2. The four flip sequences (TT, TH, HT, HH) are equally likely:

There is one possibility for the coin to land on tails both times, so P(0) = 1/4;

two possibilities for the coin to land on heads in exactly one of the flips, so P(1) = 2/4 = 1/2;

and one possibility for the coin to land on heads both times, so P(2) = 1/4.
Plugging these numbers into our formula from before, we get E(x) = 0 · (1/4) + 1 · (1/2) + 2 · (1/4) = 1.
The result implies that if we were to flip a coin twice, we should expect it to land on heads in one of the two flips.
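The same arithmetic can be checked with a short snippet, using the probabilities derived above:

```python
# Number of heads in two fair coin flips: P(0) = 1/4, P(1) = 1/2, P(2) = 1/4.
heads = {0: 0.25, 1: 0.50, 2: 0.25}

# E(x) = 0*(1/4) + 1*(1/2) + 2*(1/4) = 1
e = sum(x * p for x, p in heads.items())
print(e)  # 1.0
```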
The expected value, E(x), of a continuous probability distribution can be calculated using the following formula:

E(x) = ∫ x · f(x) dx

where the integral is taken over the interval on which f(x) is non-zero.
Knowing this, let's look at another example. Assume that the probability density function, f(x), is equal to x/12 on the interval from 1 to 5, and zero elsewhere. (For f(x) to be a valid density, it must integrate to 1 over this interval, which x/12 does.)

We can then compute the expected value as follows:

E(x) = ∫₁⁵ x · (x/12) dx = [x³/36] evaluated from 1 to 5 = (125 − 1)/36 = 31/9 ≈ 3.44

Reassuringly, the answer lies inside the interval from 1 to 5, as an expected value must.
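As a sanity check (the midpoint-rule integrator below is an illustrative sketch; f(x) = x/12 is used because a valid density on the interval from 1 to 5 must integrate to 1), we can approximate both the normalization and the expected value numerically:

```python
def f(x):
    # Normalized density on [1, 5]: the integral of x/12 over [1, 5] is 1.
    return x / 12.0

def integrate(g, a, b, n=10_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# f integrates to 1, so it is a valid probability density on [1, 5].
assert abs(integrate(f, 1.0, 5.0) - 1.0) < 1e-9

# E(x) = integral of x * f(x) dx over [1, 5] = 31/9
e = integrate(lambda x: x * f(x), 1.0, 5.0)
print(round(e, 3))  # ~3.444, which lies inside [1, 5]
```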
Lesson Summary

We discussed the meaning of discrete and continuous random variables, with a focus on the theory behind continuous probability distributions.
A random variable is a variable that describes all of the possible outcomes of a random process. Random variables can be of two types - discrete and continuous.
A discrete random variable is associated with a process that has a countable number of possible outcomes, such as in a coin flip, while a continuous random variable involves processes that have an uncountable number of possible outcomes, such as height and weight measurements.
In addition, the probability density function is a continuous probability distribution function. We have also learned how to compute expected values for both types of random variables. The expected value can be thought of as the outcome that we should expect on average.
Calculating the expected value involves summation for discrete random variables and integration for continuous random variables. You should now feel comfortable doing similar problems by yourself.
Vocabulary & Definitions
Random variable: A discrete or continuous variable that describes all of the possible outcomes of a random process.

Probability density function: A continuous function, f(x), that describes the probability distribution of a continuous random variable.

Expected value: The outcome that we should expect on average over many repetitions of a random process.
After viewing this lesson, you should be able to do the following:

Define discrete and continuous random variables

Explain what a probability density function is

Compute the expected value of a discrete or continuous probability distribution