
How are Stars Rated for their Brightness




Starlight Star Bright

How bright a star appears in the sky is commonly referred to as its apparent magnitude. More than 2,000 years ago, during the Hellenistic period, the Greek astronomer Hipparchus devised a system for classifying stars by their brightness. In this original magnitude scale, Hipparchus assigned the brightest stars a value of 1 and the faintest stars barely visible to the unaided eye a value of 6. In the modern calibration of the scale, that difference of 5 magnitudes corresponds to exactly 100 times in brightness; in other words, a 1st-magnitude star is 100 times brighter than a 6th-magnitude star. Think of it as a foot race with 6 competitors: the runner who breaks the tape in first place is the fastest, while the one who comes in 6th, or last, is the slowest. Similarly, the brightest stars come in "first place" on the magnitude scale, and each higher magnitude number is fainter than the last.
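The arithmetic behind the scale can be sketched in a few lines of Python. This is an illustrative snippet, not something from the original article; the function name is made up for the example.

```python
def brightness_ratio(m_faint: float, m_bright: float) -> float:
    """Ratio of apparent brightness between two stars, given their
    magnitudes. A 5-magnitude difference is a factor of exactly 100
    in the modern calibration of the scale."""
    return 100 ** ((m_faint - m_bright) / 5)

# A 1st-magnitude star vs. a 6th-magnitude star:
print(brightness_ratio(6, 1))   # 100.0 — exactly 100 times brighter

# One whole magnitude step:
print(brightness_ratio(2, 1))   # ≈ 2.512 times brighter
```

Note that because the scale runs "backwards," the fainter (larger) magnitude goes in the first argument to get a ratio greater than 1.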

Each step of one whole magnitude corresponds to a brightness difference of about 2.5 times (the scale is logarithmic). This rough system was formalized in 1856 by Norman Robert Pogson, who defined a 5-magnitude difference as a factor of exactly 100, making each magnitude step a factor of about 2.512. In the modern system the scale is no longer limited to 6 steps, or to visible light: it spans from the Sun, far into the negative range at -26.73, down to objects recorded by the Hubble Space Telescope at around 30th magnitude.

Many believe that the North Star, Polaris, is the brightest star in the night sky as seen from Earth, but it is not even among the top 20 brightest stars. That distinction belongs to the "Dog Star," Sirius (yes, like the satellite radio company). Many people also assume that the closest stars appear brightest while the faintest stars are much farther away. On the contrary, most of the nearest stars are too faint to be seen with the unaided eye, while some of the brightest stars are very distant. This is because stars differ enormously in intrinsic luminosity (total energy output), which depends mainly on their size and temperature.

According to the inverse square law for light, a star's apparent brightness is inversely proportional to the square of its distance, so a given star moved twice as far away would appear one quarter as bright. The absolute magnitude scale measures how bright a star would appear if it were placed at a standard fixed distance of 10 parsecs (about 32.6 light years).
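The inverse square law and the standard absolute-magnitude formula, M = m - 5 log10(d / 10 pc), can be sketched as follows. The Sirius figures in the usage example (apparent magnitude -1.46, distance about 2.64 parsecs) are standard published values, not numbers from the article itself.

```python
import math

def apparent_brightness_factor(distance_ratio: float) -> float:
    """Inverse square law: moving a star distance_ratio times farther
    away dims it by a factor of distance_ratio squared."""
    return 1 / distance_ratio ** 2

def absolute_magnitude(m: float, distance_pc: float) -> float:
    """Apparent magnitude the star would have at the standard
    distance of 10 parsecs."""
    return m - 5 * math.log10(distance_pc / 10)

# Doubling the distance makes a star appear one quarter as bright:
print(apparent_brightness_factor(2))                # 0.25

# Sirius: apparent magnitude -1.46 at roughly 2.64 parsecs
print(round(absolute_magnitude(-1.46, 2.64), 2))    # ≈ 1.43
```

Sirius illustrates the article's point: it owes much of its brilliance to being nearby, and at 10 parsecs it would be an ordinary-looking star of roughly magnitude 1.4.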

Stellar magnitudes can be measured to a small fraction of a magnitude using photoelectric photometry. Magnitude values are complicated by the fact that sources emit light across different wavelengths, so it is important to know the bandpass used in the light detector. Also, the line of sight to a star passes through more atmosphere near the horizon than straight overhead, so to standardize stellar magnitudes their brightness is quoted as it would appear if there were no atmosphere.
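The atmospheric correction mentioned above is commonly done, to first order, with the plane-parallel airmass approximation X = sec(z) and the relation m0 = m - k*X, where k is an extinction coefficient in magnitudes per airmass. A minimal sketch, assuming a representative k of 0.2 (roughly typical for visible light at a good site; the value and function names here are the author's illustration, not from the article):

```python
import math

def airmass(zenith_angle_deg: float) -> float:
    """Plane-parallel approximation: airmass X = sec(z).
    Reasonable away from the horizon (z below ~60 degrees)."""
    return 1 / math.cos(math.radians(zenith_angle_deg))

def outside_atmosphere_magnitude(m_observed: float,
                                 zenith_angle_deg: float,
                                 k: float = 0.2) -> float:
    """First-order extinction correction m0 = m - k*X.
    k = 0.2 mag/airmass is an assumed representative value."""
    return m_observed - k * airmass(zenith_angle_deg)

# A star observed at magnitude 5.0, 30 degrees from the zenith:
print(round(outside_atmosphere_magnitude(5.0, 30), 2))   # ≈ 4.77
```

The same star observed lower in the sky has a larger airmass, so its corrected magnitude moves further from the raw observed value; this is why standardized catalogs quote extinction-corrected brightnesses.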

More about this author: Kevin Manning
