Throughput vs. Latency

Instructor: Lonny Meinecke

Lonny teaches psychology classes at King University, and has a bachelor's degree in IT and a doctorate in psychology.

The actual speed of data on a network can be described by words like 'throughput' and 'latency.' In this lesson, we will define both of these words, and see how they help fulfill the demand for data on a network.

It's About Demand

The speed of data on a network can be described using many words you might not be familiar with. Two of those words are throughput and latency. Have you heard of these? Can you describe them? We will do that right now.

A good way to understand both throughput and latency is to think about the demand for data on your network. It wouldn't be much of a network if nothing happened - something has to happen to make using it worthwhile.

What happens most of the time is data goes from point A to point B, or there's some other sort of exchange. Somebody issues a demand for data, and somebody else sends that data. But things don't always go exactly as planned, so we have words to describe how they actually happen.


[Figure: Demand for data]

Demand Results in Throughput

First, let's talk about throughput. When you issue a demand for data, you get a certain amount of data in a certain amount of time. How much you actually get is called your 'throughput.' For example, you may demand two megabits per minute but end up with just one megabit per minute. That actual one megabit per minute is your throughput (you demanded more, but that didn't quite happen). You don't want less than you asked for, do you? But you often get less anyway, and we call the amount actually delivered your throughput.
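To make that concrete, here is a small Python sketch. The function name and the numbers are purely illustrative, not part of any real networking library; throughput is simply the data actually delivered divided by the time it took.

```python
# Illustrative sketch: throughput is the data actually delivered
# divided by the elapsed time, regardless of what was demanded.

def throughput_mbit_per_min(megabits_delivered, minutes_elapsed):
    """Actual rate achieved, in megabits per minute."""
    return megabits_delivered / minutes_elapsed

# You demanded 2 Mbit per minute, but only 1 Mbit arrived in that minute:
demanded = 2.0
actual = throughput_mbit_per_min(1.0, 1.0)
print(actual)             # 1.0 -- your throughput
print(actual < demanded)  # True -- less than you asked for
```

The point of the sketch is that the demanded rate never appears in the calculation: throughput describes only what actually happened.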


[Figure: Throughput metaphor]

Demand Results in Latency

Now let's talk about latency. When you issue a demand for data, a certain amount of time elapses before you get what you asked for. Ideally, no time at all would pass. In the real world, though, the moment you ask for data and the moment you receive it are never exactly the same. That delay is what we call latency. You don't want a delay, do you? But some delay is unavoidable, and we call it latency.
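Latency can be measured the same way a stopwatch works: note the time when you ask, note the time when the answer arrives, and subtract. Here is a hedged Python sketch; `simulated_request` is a made-up stand-in for a real network call, with the delay faked by `time.sleep`.

```python
import time

# Illustrative sketch: latency is the delay between issuing a request
# and receiving the response. simulated_request is a hypothetical
# stand-in for a real network call.

def simulated_request():
    time.sleep(0.05)  # pretend the network took about 50 ms to respond
    return "data"

start = time.monotonic()
simulated_request()
latency_seconds = time.monotonic() - start
print(f"latency: {latency_seconds * 1000:.0f} ms")
```

Note the use of `time.monotonic()` rather than `time.time()`: a monotonic clock cannot jump backward, which makes it the right tool for measuring elapsed time.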


[Figure: Latency metaphor]

Why Throughput and Latency Matter

Generally speaking, throughput is a measurable thing. We can figure out how much data actually makes it from one point to another on a network in a given amount of time. Bear in mind, though, that network conditions fluctuate, so even throughput is a bit fuzzy; mostly, we think of it as an average rate of data delivered in response to demand. It can never be more than your bandwidth (the maximum theoretical rate of your connection). Throughput matters because we need some tangible idea of how much data we can expect to move from point A to point B in order to coordinate traffic on the network.
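The two ideas in that paragraph, averaging fluctuating measurements and staying at or below bandwidth, can be shown in a few lines. The link capacity and the sample values below are hypothetical numbers chosen for illustration.

```python
# Sketch: observed throughput fluctuates, so we report an average,
# and that average can never exceed the link's bandwidth
# (its theoretical maximum rate).

bandwidth_mbps = 100.0                    # hypothetical link capacity
samples_mbps = [62.0, 71.5, 58.3, 66.2]   # hypothetical measurements

average_throughput = sum(samples_mbps) / len(samples_mbps)
print(average_throughput)                    # 64.5
print(average_throughput <= bandwidth_mbps)  # True
```

In practice, the gap between the 100 Mbps bandwidth and the roughly 64 Mbps average is exactly the kind of thing network planners watch: bandwidth is the ceiling, throughput is reality.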
