Using the Magnitude Scale to Compare Star Brightness

Lesson Transcript
Instructor: Artem Cheprasov

Artem has a doctor of veterinary medicine degree.

The magnitude scale is used in astronomy to measure and compare star brightness. Learn about the magnitude scale and its components of apparent brightness and visual magnitude, then practice comparing brightness with a quick calculation example. Updated: 10/29/2021

The Magnitude Scale

One of the last great astronomers of the ancient world, and among the most famous, was Claudius Ptolemaeus. You know him better as Ptolemy. A genius of a man, he worked in Alexandria, the capital of all things science back in his day.

He developed a system for the motions of the planets, oriented maps using the cardinal directions, and invented the concepts of latitude and longitude.

He also did one other thing, the subject of this lesson. He described the brightness of the stars he observed using what's called the magnitude system, or magnitude scale, an astronomical brightness scale.

Whether he completely came up with this idea on his own or it was simply a summary of ideas of astronomers that came before him (such as those of Hipparchus) is best left for a history lesson to debate. Instead, this lesson will describe what this magnitude scale is and how astronomers use it today.


Apparent Brightness & Visual Magnitude

The magnitude system Ptolemy apparently invented was based on apparent brightness, how bright a star appears to be to an observer. This property depends on the observer's distance from the star. The sun appears fainter when viewed from Uranus than when viewed from Earth, just as a flashlight looks fainter from far away than from up close, even though its true brightness never changes.

Anyway, ancient astronomers divided the stars they observed into six classes. The brightest stars were known as first-magnitude stars and the faintest stars were known as sixth-magnitude stars. Note what this means: the larger the magnitude, the fainter the star.

Using modern telescopes and detectors, today's astronomers can estimate magnitude far more precisely, so much so that magnitudes like 3.34 or 0.04 exist. In fact, some stars are so bright that the current magnitude scale extends into negative numbers. The scale has expanded in the other direction as well: stars beyond about the sixth magnitude are too faint to see with the unaided eye, but we can see them with telescopes.

Today, each magnitude is more formally referred to as apparent visual magnitude, the apparent brightness of a star as seen from Earth, expressed using the magnitude scale. Since apparent visual magnitude depends on the human eye, it only includes the visible form of light. Infrared or ultraviolet light, which we can't see naturally, is excluded from any measurements, even though some stars emit a lot of these types of light.

A Quick Calculation

The standard magnitude scale can tell us how much brighter one star is compared to another if we know their individual magnitudes.
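The transcript's worked example is cut off here, but the relation behind it is the standard one on the modern magnitude scale: a difference of 5 magnitudes corresponds to a brightness ratio of exactly 100, so each magnitude step is a factor of about 2.512. A minimal sketch of that calculation (the function name is illustrative, not from the lesson):

```python
def brightness_ratio(mag_faint: float, mag_bright: float) -> float:
    """Return how many times brighter the lower-magnitude star appears.

    On the modern magnitude scale, a 5-magnitude difference equals a
    brightness ratio of exactly 100, so one magnitude is a factor of
    100 ** (1/5), roughly 2.512.
    """
    return 100 ** ((mag_faint - mag_bright) / 5)


# A first-magnitude star vs. a sixth-magnitude star: 5 magnitudes apart,
# so the brighter star appears exactly 100 times brighter.
print(brightness_ratio(6, 1))  # → 100.0

# Adjacent magnitudes differ by a factor of about 2.512.
print(round(brightness_ratio(2, 1), 3))  # → 2.512
```

The same formula works for fractional and negative magnitudes, which is why the modern scale handles values like 0.04 or -1.46 without any special cases.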
