Stargazing For Dummies

Astronomers measure a star’s brightness using something called the magnitude scale, where each star – or planet or faint fuzzy – has a brightness value called its magnitude. When looking up at stars from the Earth – which is what you’ll be doing – astronomers measure how bright the star appears to them. This is the star’s apparent magnitude. Because you can’t tell how far away a star is just by looking at it, you don’t know how intrinsically bright it is – what its absolute magnitude is.

The brighter an object is, the lower the magnitude number. For example, the North Star has an apparent magnitude of 2.0, whereas Betelgeuse in the constellation of Orion has an apparent magnitude of 0.4, which makes Betelgeuse brighter than the North Star.

Your eye can only see stars brighter than (that is, with a lower magnitude than) magnitude 6.5, and only if you’ve got great eyesight and perfect sky conditions. Very bright objects (like the bright stars Sirius, Canopus and Arcturus, the planets Venus and Jupiter, or the Moon) have negative magnitudes.

The magnitude scale is not a linear scale. For every point up the magnitude scale, an object gets around two-and-a-half times dimmer (2.512 to be more exact). So a star of magnitude 1 is 100 times brighter than a star of magnitude 6 (that is, there’s a difference of five magnitude points, and so a difference of 2.512 x 2.512 x 2.512 x 2.512 x 2.512 = 100 in brightness).
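
If you’d like to check that arithmetic yourself, here’s a short Python sketch that turns a magnitude difference into a brightness ratio using the rule that five magnitude points equal a factor of 100. The function name brightness_ratio is just for illustration – it isn’t part of any astronomy library.

# How many times brighter one object appears than another, given their
# apparent magnitudes. Five magnitude points are defined as a factor of
# 100 in brightness, so one point is 100 ** (1 / 5), roughly 2.512.
def brightness_ratio(fainter_magnitude, brighter_magnitude):
    return 100 ** ((fainter_magnitude - brighter_magnitude) / 5)

# Example: Betelgeuse (magnitude 0.4) versus the North Star (magnitude 2.0)
print(brightness_ratio(2.0, 0.4))   # about 4.4 times brighter

# Example: a magnitude 1 star versus a magnitude 6 star
print(brightness_ratio(6.0, 1.0))   # exactly 100 times brighter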

About the book author:

Steve Owens is a freelance science writer and presenter with a passion for astronomy. He has been the recipient of the 'Campaign for Dark Skies' Award for Dark Sky Preservation, and he was nominated for the Arthur C. Clarke Award for public science engagement.
