Magnitude: How Bright Is That Star?
The magnitude system is astronomy’s brightness scale — and it runs backwards. Brighter objects have lower magnitudes. The Sun is $-26.7$, the full Moon is $-12.7$, and the faintest stars visible to the naked eye are about $+6$.
The ancient scale
Hipparchus (~150 BCE) ranked stars into six classes: the brightest were “first magnitude”; the faintest visible were “sixth magnitude.” In 1856, Norman Pogson formalised this: a difference of 5 magnitudes corresponds to exactly a factor of 100 in brightness.
$$\frac{F_1}{F_2} = 100^{(m_2 - m_1)/5} = 10^{0.4(m_2 - m_1)}$$
One magnitude step therefore corresponds to a brightness factor of $100^{1/5} \approx 2.512$. The scale is logarithmic, and our eyes perceive brightness roughly logarithmically, which is why the ancient system works so well.
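To make the arithmetic concrete, here is a minimal Python sketch of the Pogson relation (the function name `flux_ratio` is just for illustration):

```python
def flux_ratio(m1: float, m2: float) -> float:
    """Return the brightness ratio F1/F2 for apparent magnitudes m1 and m2."""
    return 10 ** (0.4 * (m2 - m1))

print(flux_ratio(1.0, 2.0))  # one magnitude step: ~2.5119
print(flux_ratio(1.0, 6.0))  # five steps: exactly 100.0
```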
Apparent vs absolute magnitude
Apparent magnitude ($m$) is how bright something looks from Earth. It depends on both intrinsic brightness and distance.
Absolute magnitude ($M$) is how intrinsically bright something is, defined as the apparent magnitude it would have at a standard distance of 10 parsecs (32.6 light-years). The Sun’s apparent magnitude is $-26.7$, but its absolute magnitude is only $+4.83$: a perfectly ordinary star that happens to be very close.
The distance modulus connects them:
$$m - M = 5 \log_{10}\left(\frac{d}{10}\right)$$
where $d$ is the distance in parsecs. This single equation is one of the most-used tools in astronomy: measure $m$ (with a telescope), determine $M$ (from the star’s type), and you get $d$.
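A small sketch of how that works in practice, assuming illustrative function names (this is not a standard library API):

```python
import math

def distance_pc(m: float, M: float) -> float:
    """Distance in parsecs from the distance modulus m - M = 5*log10(d/10)."""
    return 10 ** ((m - M) / 5 + 1)

def absolute_mag(m: float, d_pc: float) -> float:
    """Absolute magnitude from apparent magnitude and distance in parsecs."""
    return m - 5 * math.log10(d_pc / 10)

# A Sun-like star (M = +4.83) observed at m = +9.83 must sit at 100 pc,
# since m - M = 5 means d = 10^(5/5 + 1) = 100.
print(distance_pc(9.83, 4.83))          # 100.0

# Sanity check with the Sun itself: 1 AU is about 1/206265 pc.
print(absolute_mag(-26.7, 1 / 206265))  # ~ +4.87, near the quoted +4.83
```

The small offset in the last line comes from rounding the Sun’s apparent magnitude to $-26.7$; with the more precise $-26.74$ the formula recovers $+4.83$.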
Some reference points
| Object | Apparent mag | Note |
|---|---|---|
| Sun | $-26.7$ | 400,000× brighter than full Moon |
| Full Moon | $-12.7$ | |
| Venus (max) | $-4.6$ | Brightest planet |
| Sirius | $-1.5$ | Brightest star |
| Naked eye limit | $\sim +6$ | ~9,000 stars visible over the whole sky |
| Hubble limit | $\sim +31$ | ~10 billion× fainter than naked eye |
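The ratios in the Note column follow directly from the Pogson relation. A quick check, repeating the illustrative `flux_ratio` so it runs standalone:

```python
def flux_ratio(m1, m2):
    """Return F1/F2 for apparent magnitudes m1 and m2."""
    return 10 ** (0.4 * (m2 - m1))

print(round(flux_ratio(-26.7, -12.7)))  # Sun vs full Moon: ~398,000 (the "400,000x")
print(flux_ratio(6, 31))                # naked-eye limit vs Hubble: 1e10 (10 billion x)
```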