Using the Magnitude Scale to Compare Star Brightness
The Magnitude Scale
One of the last great astronomers of the ancient world, and one of its most famous, was Claudius Ptolemaeus. You know him better as Ptolemy. A genius of a man, he worked in Alexandria, the scientific capital of his day.
He developed a system for predicting the motions of the planets, oriented maps using the cardinal directions, and systematized the use of latitude and longitude.
He also did one other thing, and it's what this lesson is all about: he described the brightness of the stars he observed using what's called the magnitude system, or magnitude scale, an astronomical brightness scale.
Whether he came up with this idea entirely on his own or simply summarized the ideas of astronomers who came before him (such as Hipparchus) is a question best left to a history lesson. Instead, this lesson will describe what the magnitude scale is and how astronomers use it today.
Apparent Brightness & Visual Magnitude
The magnitude system Ptolemy apparently invented was based on apparent brightness, how bright a star appears to be to an observer. This property depends on the observer's distance from the star. The sun appears fainter when viewed from Uranus than when viewed from Earth, just as a flashlight appears fainter when you're far away from it than when you're close to it, even though its true brightness never changes.
Ancient astronomers divided the stars they observed into six classes. The brightest stars were known as first-magnitude stars, and the faintest were known as sixth-magnitude stars. In other words, the larger the magnitude, the fainter the star.
Using modern telescopes and detectors, today's astronomers can estimate magnitudes much more precisely, so values like 3.34 or 0.04 exist. In fact, some stars are so bright that the current magnitude scale extends into negative numbers. The scale has expanded in the other direction as well: stars fainter than about sixth magnitude are too faint to see with our unaided eyes, but we can see them with telescopes.
Today, this kind of magnitude is more formally referred to as apparent visual magnitude, the apparent brightness of a star as seen from Earth, expressed using the magnitude scale. Since apparent visual magnitude is based on the human eye, it only includes visible light. Infrared and ultraviolet light, which we can't see naturally, are excluded from the measurement even though some stars emit a lot of these types of light.
A Quick Calculation
The magnitude scale can tell us how much brighter one star is than another if we know their individual magnitudes.
Astronomers have figured out that an increase of 5 magnitudes corresponds to a 100-fold decrease in brightness. Thus, an increase of one magnitude corresponds to a decrease in brightness by a factor of the fifth root of 100, which is about 2.512. This means, for example, that a fourth-magnitude star is about 2.512 times brighter than a fifth-magnitude star.
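Written out as arithmetic, the relationship between the five-magnitude step and the one-magnitude step is simply:

$$\sqrt[5]{100} = 100^{1/5} \approx 2.512, \qquad 2.512^{5} \approx 100$$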
$$\frac{b_1}{b_2} = 2.512^{(m_2 - m_1)}$$
In this way, you can figure out how the brightness of one star compares to another if you know their magnitudes. Just use the equation shown above. Here, b stands for the apparent brightness of a star and m stands for its magnitude; the subscripts 1 and 2 label the two stars being compared.
If we know that star one has a magnitude of 8 and star two has a magnitude of 3, we subtract 8 from 3 to get -5; -5 then becomes the exponent on 2.512. Using a calculator, 2.512 raised to the power of -5 comes out to about 0.01. This means that star one is only 1% as bright as star two. That is, star one is really faint, and you'll need a telescope to see it.
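If you'd like to check this kind of calculation yourself, here is a minimal sketch in Python of the equation above; the function name `brightness_ratio` and the way the result is printed are illustrative choices, not part of the lesson.

```python
# Minimal sketch of the relative-brightness calculation described above.
# The function name is an illustrative choice, not from the lesson.

def brightness_ratio(m1: float, m2: float) -> float:
    """Return b1 / b2: how bright star one is relative to star two,
    given their apparent visual magnitudes m1 and m2."""
    # Each magnitude step is a factor of about 2.512 in brightness;
    # 100 ** ((m2 - m1) / 5) would give the exact ratio.
    return 2.512 ** (m2 - m1)

# Worked example from the lesson: star one at magnitude 8, star two at magnitude 3.
ratio = brightness_ratio(8, 3)
print(f"Star one is about {ratio:.2f} times as bright as star two.")  # ~0.01, i.e. 1%
```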
Lesson Summary
The magnitude scale is an astronomical brightness scale. It's based on the principle of apparent brightness, how bright a star appears to be to an observer. This property depends on the distance the observer is from the star. Even though a star may be super bright intrinsically, it may be so far away from Earth that it appears really faint.
The apparent brightness of a star as seen from Earth, expressed using the magnitude scale, is known as apparent visual magnitude. The larger the magnitude, the fainter the star. You must remember that this magnitude scale only takes visible light into account.
Learning Outcomes
After this lesson, you should be able to:
- Define apparent brightness
- Explain what the magnitude scale is
- Describe the relationship between apparent brightness and visual magnitude
- Identify the equation to calculate relative brightness