Throughput vs. Latency

Instructor: Lonny Meinecke

Lonny was once a software programmer (video game industry). He now teaches psychology at King University. He has a bachelor's in IT and a PhD in psychology.

The actual speed of data on a network can be described by words like 'throughput' and 'latency.' In this lesson, we will define both of these words, and see how they help fulfill the demand for data on a network.

It's About Demand

The speed of data on a network can be described using many words you might not be familiar with. Two of those words are throughput and latency. Have you heard of these? Can you describe them? We will do that right now.

A good way to understand both throughput and latency is to think about the demand for data on your network. It wouldn't be much of a network if nothing happened - something has to happen to make using it worthwhile.

What happens most of the time is data goes from point A to point B, or there's some other sort of exchange. Somebody issues a demand for data, and somebody else sends that data. But things don't always go exactly as planned, so we have words to describe how they actually happen.

Demand for Data

Demand Results in Throughput

First, let's talk about throughput. When you issue a demand for data, you get a certain amount of data in a certain amount of time. How much you actually get per unit of time is your 'throughput.' For example, you may demand two megabits per minute and end up with just one megabit per minute. That actual one megabit per minute is your throughput; your demand was higher, but the network couldn't quite keep up. You don't want less than you asked for, but that is often what you get, and we call that final amount throughput.
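The arithmetic behind that example is simple: throughput is the amount of data that actually arrived divided by the time it took. Here is a minimal sketch using the hypothetical numbers from the paragraph above (they are illustrative, not measurements from a real network):

```python
# Hypothetical figures from the example: we demanded 2 megabits
# per minute, but only 1 megabit actually arrived in that minute.
demanded_mbits = 2.0
received_mbits = 1.0
elapsed_minutes = 1.0

# Throughput is what actually arrived per unit of time,
# regardless of how much was demanded.
throughput = received_mbits / elapsed_minutes

print(throughput)  # 1.0 megabit per minute, not the 2.0 we asked for
```

Note that the demanded amount never enters the calculation: throughput describes what happened, not what was requested.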

Throughput metaphor

Demand Results in Latency

Now let's talk about latency. When you issue a demand for data, a certain amount of time elapses before you get what you asked for. Ideally, no time at all would pass. In the real world, though, the moment you ask for data and the moment you receive it are never exactly the same. That delay is what we call latency. You don't want a delay, but you get one anyway, and we call it latency.
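Because latency is just the gap between asking and receiving, you can measure it by recording a timestamp before and after a request. This sketch uses a stand-in function (`fetch_data` is hypothetical; it sleeps to simulate a network delay rather than making a real request):

```python
import time

def fetch_data():
    # Stand-in for a real network request; the sleep simulates
    # the delay between demand and delivery.
    time.sleep(0.05)
    return b"response"

request_time = time.monotonic()  # when we asked
fetch_data()
receive_time = time.monotonic()  # when we got the data

# Latency is the elapsed time between the two moments.
latency_seconds = receive_time - request_time
```

A monotonic clock is used here rather than wall-clock time, since wall-clock time can jump (e.g. from clock adjustments) and give a misleading interval.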

Latency metaphor

Why Throughput and Latency Matter

Generally speaking, throughput is measurable: we can figure out how much data actually makes it from one point to another on a network in a given amount of time. Bear in mind, though, that network conditions fluctuate, so even throughput is a bit fuzzy. Mostly, we think of throughput as the average rate at which demanded data actually arrives. It can never be more than your bandwidth (the maximum theoretical data rate of your connection). Throughput matters because we need a tangible idea of how much data we can expect to move from point A to point B in order to coordinate traffic on the network.
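The idea that throughput fluctuates but averages out below the bandwidth ceiling can be sketched with a few hypothetical per-minute samples (the numbers are made up for illustration):

```python
# Hypothetical samples of data actually delivered each minute (megabits).
samples_mbits = [0.8, 1.2, 0.9, 1.1, 1.0]

# Theoretical maximum rate of the link (also hypothetical).
bandwidth_mbits_per_min = 2.0

# Throughput is usually reported as an average over many samples.
average_throughput = sum(samples_mbits) / len(samples_mbits)

# Individual samples vary, but throughput can never exceed bandwidth.
assert average_throughput <= bandwidth_mbits_per_min
```

Individual minutes came in above and below one megabit, but the average settles at one megabit per minute, well under the two-megabit bandwidth ceiling.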
