Apparent Magnitude

13.1 - Understand the astronomical magnitude scale and how apparent magnitude relates to the brightness of stars as viewed from Earth
Hipparchus, a Greek astronomer, devised a method of measuring the brightness of stars.
The brightest stars were assigned an apparent magnitude of 1; the faintest stars visible to the naked eye, an apparent magnitude of 6.
The system has been extended in a few important ways since the invention of the telescope and the space telescope, neither of which the ancient Greeks had access to. We can now see much fainter stars with binoculars, telescopes and even the Hubble Space Telescope.
A few stars, planets and of course our own Sun have been assigned magnitudes brighter than 1, which on this scale means lower, including negative, values. Sirius appears at about −1, Venus at −4, a full Moon at about −13 and the Sun at about −27.
Each difference of one magnitude represents a brightness ratio of 2.512. For the exam we can round this to 2.5.
So a star at magnitude 3 is 2.5 times brighter than a star at magnitude 4.
The same star would be about 15.6 times brighter than a star at apparent magnitude 6, because the three-magnitude difference gives a ratio of 2.5 × 2.5 × 2.5 ≈ 15.6.
The difference between a star at magnitude 1 and one at magnitude 6 is five magnitudes, a brightness ratio of 2.5 to the power of 5 (2.5 × 2.5 × 2.5 × 2.5 × 2.5), which is approximately 100. (Using the exact value 2.512, the ratio is exactly 100, which is how the scale is defined.)
In calculations apparent magnitude is represented by m.
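The rule above can be sketched in a few lines of Python. The function name `brightness_ratio` is my own label, not from the notes; it simply raises the base (2.5 for exam purposes, 2.512 exactly) to the power of the magnitude difference:

```python
def brightness_ratio(m_faint, m_bright, base=2.5):
    """How many times brighter the star of magnitude m_bright appears
    than the star of magnitude m_faint (lower magnitude = brighter).
    The exam approximation base=2.5 is used; the exact base is 2.512,
    i.e. 100 ** (1 / 5)."""
    return base ** (m_faint - m_bright)

print(brightness_ratio(4, 3))             # one magnitude -> 2.5
print(brightness_ratio(6, 3))             # three magnitudes -> 15.625
print(brightness_ratio(6, 1))             # five magnitudes -> 97.66 (about 100)
print(brightness_ratio(6, 1, base=2.512)) # exact base -> very close to 100
```

Note that with the rounded base the five-magnitude ratio comes out at about 97.7 rather than exactly 100; the exact base 2.512 exists precisely so that five magnitudes correspond to a factor of 100.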
Summary of scale of magnitudes (ratios calculated with the rounded base of 2.5):

| Difference in magnitude | Brightness ratio |
|---|---|
| 1 | 2.5 |
| 2 | 6.3 |
| 3 | 15.6 |
| 4 | 39 |
| 5 | ~100 |
Mix & Match

| Apparent magnitude | Object |
|---|---|
| −3 | Jupiter (max.), Mars (max.) |
| 0 | Vega, Saturn (max.) |
| 6 | Typical limit of naked eye |
| 10 | Typical limit of 7×50 binoculars |
| 13 | 3C 273 quasar |
|  | Limit of 4.5–6 in (11–15 cm) telescopes |
|  | Limit of 8–10 in (20–25 cm) telescopes |
| 27 | Visible light limit of 8 m telescopes |
| 32 | Visible light limit of HST |