Developing Continuous Probability Distributions Theoretically & Finding Expected Values

Lesson Transcript
Instructor: Artem Cheprasov
What is an expected value? How can you tell how many times you should expect a coin to land on heads out of several flips? This lesson will show you the answers to both questions!

Random Processes and Variables

We are all familiar with processes that involve uncertainty. One example is flipping a coin, in which there is an inherent uncertainty in whether it will land on heads or tails, and the best we can do is assign probabilities to all of the possible outcomes.

In this lesson, you will learn about continuous probability distributions from a theoretical perspective as well as how to find expected values.

Mathematically, we can treat such processes by defining random variables and the probability distribution function. A random variable is a variable that describes all of the possible outcomes of a random process.

There are two types of random variables - discrete and continuous. Discrete random variables involve processes in which the total number of possible outcomes is countable. For example, flipping a coin can be characterized by a discrete random variable. Suppose we flip a coin twice and ask about how many times it will land on heads. We can describe the possible outcomes using a random variable, X:


X = {0, 1, 2}, the number of times the coin lands on heads
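To make this concrete, here is a minimal Python sketch (not from the lesson) that enumerates every sequence of two flips and counts the heads in each, recovering the values X can take on.

from itertools import product

# Enumerate every sequence of two coin flips and map each to its number of heads.
outcomes = list(product("HT", repeat=2))            # ('H','H'), ('H','T'), ('T','H'), ('T','T')
heads_count = {seq: seq.count("H") for seq in outcomes}

for seq, x in heads_count.items():
    print("".join(seq), "-> X =", x)

# The set of values the discrete random variable X can take on:
print(sorted(set(heads_count.values())))            # [0, 1, 2]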


In contrast, continuous random variables involve processes in which the total number of possible outcomes is not countable. An example would be the time it takes for a random athlete to run one mile. Since, at least in theory, the time can be measured to infinite precision, there is an uncountable number of possible time measurements. For instance, we can measure the time to be 6.2 minutes or 6.23 minutes or 6.234 minutes, and so on.

Therefore, we define a continuous random variable, X, associated with the time measurements for a one-mile run. Assuming that the fastest time cannot be less than 3.5 minutes and the longest time cannot exceed 11 minutes, we define the random variable as an interval:


X = [3.5, 11], the time in minutes for a one-mile run
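As a rough illustration (not from the lesson), the Python sketch below draws a few sample mile times from this interval; the uniform distribution is only a placeholder assumption, since the lesson does not say how the times are actually distributed.

import random

# Placeholder assumption: mile times spread uniformly over [3.5, 11] minutes.
random.seed(0)
times = [random.uniform(3.5, 11.0) for _ in range(3)]

# Each draw is a real number, so it could in principle be measured to any precision.
print([round(t, 4) for t in times])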


Now, let's use this random variable to develop the theory behind continuous probability distributions.

Continuous Probability Distributions

Since any time measurement for the one-mile run can have infinite precision, the curve of the probability distribution function will be continuous. This means that for any measurement x sub 1, there exists a corresponding value f(x sub 1). The function f(x) is also referred to as the probability density function, a continuous probability distribution function. One example is the Gaussian distribution:


(Graph: the bell-shaped curve of a Gaussian probability density function f(x))


Continuous probability distributions can have many other shapes, with the Gaussian being just one example. As long as we can map any value x sub 1 to a corresponding f(x sub 1), the probability distribution is continuous.
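As a rough illustration of mapping any x sub 1 to a corresponding f(x sub 1), here is a minimal Python sketch (not from the lesson) that evaluates a Gaussian density; the mean of 7 minutes and standard deviation of 1.5 minutes are made-up values chosen only for illustration.

import math

def gaussian_pdf(x, mu=7.0, sigma=1.5):
    # Gaussian probability density function; mu and sigma are illustrative values.
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Every measurement x_1 maps to a corresponding density f(x_1).
for x1 in (5.0, 7.0, 9.0):
    print(f"f({x1}) = {gaussian_pdf(x1):.4f}")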

Expected Value Computation

The expected value, which can be thought of as the outcome that we should expect on average, is computed using the following formula for discrete probability distributions:


E(X) = Σ x sub k · P(x sub k), summed over all possible values x sub k


Inside the summation to calculate the expected value, E(X), we have the values that the random variable can take on, denoted by x sub k, multiplied by their corresponding probabilities, denoted by P(x sub k).
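As a small illustration of this formula, here is a minimal Python sketch (not from the lesson) that multiplies each value by its probability and sums the products; the fair six-sided die is just an example chosen for illustration.

def expected_value(values, probabilities):
    # E(X) = sum over k of x_k * P(x_k); the probabilities should sum to 1.
    return sum(x * p for x, p in zip(values, probabilities))

# Example: a fair six-sided die has expected value 3.5.
print(expected_value([1, 2, 3, 4, 5, 6], [1/6] * 6))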

Let's go back to the earlier example of flipping a coin twice and counting the number of times it lands on heads. We can compute the expected value as follows.

After two coin flips, the possible outcomes of the coin landing on heads are 0, 1, and 2:


TT gives 0 heads; TH and HT each give 1 head; HH gives 2 heads


There is one possibility for the coin to land on tails both times,


TT, so P(X = 0) = 1/4


two possibilities for the coin to land on heads in one of the flips,


TH and HT, so P(X = 1) = 2/4 = 1/2
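Assuming the remaining case follows the same pattern (a single sequence, HH, gives two heads, so P(X = 2) = 1/4), here is a minimal Python sketch (not from the lesson) that applies the expected-value formula to this example.

# Possible numbers of heads after two flips and their probabilities.
values = [0, 1, 2]
probabilities = [1/4, 2/4, 1/4]   # P(X = 0), P(X = 1), P(X = 2); P(X = 2) assumed as above

# E(X) = sum over k of x_k * P(x_k)
expected = sum(x * p for x, p in zip(values, probabilities))
print(expected)                   # 1.0 -- on average, one head in two flips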

