Are the brightest stars low magnitude or high?
Apparent magnitude is a measure of the brightness of a celestial object as seen from Earth. The lower the number, the brighter the object. Negative numbers indicate extreme brightness. The full moon has an apparent magnitude of -12.6; the sun’s is -26.8. Without a telescope we can see objects only as faint as about 6th magnitude. Apparent magnitude is abbreviated m. This system of rating the brightness of celestial objects was developed by the Greek astronomer Hipparchus in the 2nd century B.C.
Absolute magnitude is a measure of the inherent brightness of a celestial object. This scale is defined as the apparent magnitude a star would have if it were seen from a distance of 32.6 light-years (10 parsecs). The lower the number, the brighter the object. Negative numbers indicate extreme brightness. Absolute magnitude is abbreviated M.
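The definition above can be sketched in a few lines of Python using the standard distance-modulus formula, M = m - 5 log10(d / 10 pc). The Sirius figures used in the example (m = -1.46, distance ≈ 2.64 parsecs) are illustrative values not taken from this article:

```python
import math

def absolute_magnitude(apparent_mag: float, distance_pc: float) -> float:
    """Convert apparent magnitude to absolute magnitude.

    Uses the distance modulus: M = m - 5 * log10(d / 10),
    where d is the distance in parsecs. At exactly 10 parsecs,
    apparent and absolute magnitude are equal by definition.
    """
    return apparent_mag - 5 * math.log10(distance_pc / 10)

# Example (illustrative values): Sirius has apparent magnitude -1.46
# and lies about 2.64 parsecs away, giving an absolute magnitude
# of roughly +1.4 -- much fainter-looking from 10 parsecs.
sirius_M = absolute_magnitude(-1.46, 2.64)
```

Note that a star at exactly 10 parsecs passes through the formula unchanged, which is just the definition restated.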
Astronomers use a special term to talk about the brightness of stars. The term is “magnitude”. The magnitude scale was invented by the ancient Greeks in the 2nd century B.C. The Greeks put the stars they could see into six groups. They put the brightest stars into group 1, and called them magnitude 1 stars. Stars that they could barely see were put into group 6. So, in the magnitude scale, bright stars have lower numbers.
A star that is one magnitude number lower than another star is about 2.512 times brighter (2.512 is the fifth root of 100). A magnitude 3 star is about 2.512 times brighter than a magnitude 4 star. A magnitude 4 star is about 2.512 times brighter than a magnitude 5 star.
A star that is five magnitude numbers lower than another star is exactly 100 times brighter. A magnitude 1 star is 100 times brighter than a magnitude 6 star.
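The two rules above boil down to one formula: a difference of Δm magnitudes corresponds to a brightness ratio of 100^(Δm/5). A minimal sketch (the function name is my own, not from the article):

```python
def brightness_ratio(mag_a: float, mag_b: float) -> float:
    """How many times brighter object A is than object B.

    A difference of 5 magnitudes is defined as exactly a factor
    of 100, so each single magnitude step is 100 ** (1/5),
    about 2.512.
    """
    return 100 ** ((mag_b - mag_a) / 5)

one_step = brightness_ratio(3, 4)   # about 2.512
five_steps = brightness_ratio(1, 6) # exactly 100
```

Because the scale is logarithmic, ratios multiply as magnitudes add: five one-magnitude steps of ~2.512 each compound to exactly 100.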