Apparent Magnitude

- Use the scale of apparent magnitude
- Demonstrate an understanding of the apparent magnitude scale and how it relates to observed brightness of stars

Hipparchus, a Greek astronomer, devised a method of measuring the brightness of stars.

On his scale, the brightest stars were said to have an apparent magnitude of 1, and the faintest stars visible to the naked eye an apparent magnitude of 6.

There have been a few important changes to this system since the invention of the telescope and, later, the space telescope, neither of which the ancient Greeks had. We can now see much fainter stars with binoculars, telescopes and even the Hubble Space Telescope, so the scale extends well beyond magnitude 6.

A few stars, the planets and of course our own Sun appear brighter than magnitude 1, so they are given values below 1, including negative values. Sirius appears at about -1.5, Venus at about -4, the full Moon at about -13 and the Sun at about -27.

Each difference of one magnitude represents a brightness ratio of 2.512. For the exam we can round this to 2.5.

So a star at magnitude 3 is 2.5 times brighter than a star at magnitude 4.

The same star would appear about 15.6 times brighter than a star at apparent magnitude 6, because there is a difference of three magnitudes: 2.5 x 2.5 x 2.5 = 15.6.

The difference in brightness between a star at magnitude 1 and one at magnitude 6 is 2.5 to the power of 5 (2.5 x 2.5 x 2.5 x 2.5 x 2.5), which is taken as 100. Using the full value of 2.512, a difference of five magnitudes corresponds to a brightness ratio of exactly 100.

In calculations apparent magnitude is represented by m.
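
As an illustration only (not part of the syllabus), the short Python sketch below shows how these brightness ratios can be worked out from two apparent magnitudes; the function name brightness_ratio is just a made-up example for this page.

```python
def brightness_ratio(m_faint, m_bright, base=2.512):
    """How many times brighter a star of magnitude m_bright appears
    than a star of magnitude m_faint (lower magnitude = brighter)."""
    return base ** (m_faint - m_bright)

print(round(brightness_ratio(4, 3), 2))  # one magnitude difference: ~2.51 (taken as 2.5)
print(round(brightness_ratio(6, 3), 2))  # three magnitudes: ~15.85 (about 15.6 with 2.5)
print(round(brightness_ratio(6, 1), 2))  # five magnitudes: ~100
```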

Questions

You can practise some apparent magnitude calculations on the Magnitude Calculations page.

 

Summary of the Scale of Magnitudes

| Difference in magnitude | Brightness ratio (approx.) |
|---|---|
| 1 | 2.5 |
| 2 | 6.25 |
| 3 | 16 |
| 4 | 40 |
| 5 | 100 |
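
As a quick check (again, just an illustrative sketch, not exam material), the table can be reproduced by repeated multiplication with the rounded ratio of 2.5 per magnitude:

```python
# Rebuild the summary table using the rounded ratio of 2.5 per magnitude
for diff in range(1, 6):
    print(diff, round(2.5 ** diff, 2))
# Output: 1 2.5, 2 6.25, 3 15.63, 4 39.06, 5 97.66
# The table rounds 15.63 to 16 and 39.06 to 40; using the full value
# of 2.512 per magnitude, five magnitudes give a ratio of exactly 100.
```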