The apparent magnitude of an object in outer space is how bright it appears from Earth, taking into account the effect of the Earth’s atmosphere. A brighter object has a lower magnitude than a dimmer one. The apparent magnitude scale is logarithmic, so a star of magnitude one is around two and a half times brighter than a star of magnitude two. Apparent magnitude is a commonly used measurement in astronomy, as it allows direct comparison of the relative brightness of two objects.
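The logarithmic scale above can be sketched in a few lines of Python. Each step of one magnitude corresponds to a brightness factor of about 2.512 (the fifth root of 100), so the ratio between two objects follows directly from their magnitude difference:

```python
def brightness_ratio(m1, m2):
    """How many times brighter an object of magnitude m1 appears
    than one of magnitude m2 (lower magnitude = brighter).
    A difference of 5 magnitudes is defined as a factor of 100,
    so one magnitude corresponds to 100**(1/5) ~ 2.512."""
    return 10 ** (0.4 * (m2 - m1))

# A magnitude-1 star compared with a magnitude-2 star:
print(brightness_ratio(1, 2))  # about 2.512
```

This is why the article says "around two and a half times brighter": the exact factor is 100 raised to the one-fifth power.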
For historical reasons, visual apparent magnitude uses a scale on which the lower the value, the brighter the object. When stars were first categorized, a star of magnitude one was placed in the brightest category, while a star in category six was the faintest the human eye could see. Since then, telescopes have made it possible to see far more distant and dimmer stars; the Hubble Space Telescope, for example, can detect objects as faint as magnitude 31.5.
A star's apparent brightness depends on its intrinsic luminosity as well as its distance from Earth. The light received from a star follows an inverse square law, which means that if the distance is doubled, the received power drops to a quarter. For this reason, apparent magnitude alone can provide only limited information about an object unless other variables are known.
While apparent magnitude is the brightness of a celestial object as seen from Earth, absolute magnitude measures the object's intrinsic brightness: by convention, the apparent magnitude it would have at a standard distance of 10 parsecs. In many situations, absolute magnitude is more useful than apparent magnitude, since it takes the object's distance into account. Both the apparent brightness and the distance of a star or other object must be known before its absolute magnitude can be calculated.
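The standard conversion from apparent to absolute magnitude uses the distance in parsecs: M = m − 5·log₁₀(d) + 5. A minimal sketch, using Sirius's commonly quoted apparent magnitude of −1.47 and distance of roughly 2.64 parsecs as the example inputs:

```python
import math

def absolute_magnitude(apparent_mag, distance_pc):
    """Absolute magnitude: the apparent magnitude an object would
    have at the standard distance of 10 parsecs.
    M = m - 5 * log10(d_pc) + 5"""
    return apparent_mag - 5 * math.log10(distance_pc) + 5

# Sirius: apparent magnitude -1.47 at about 2.64 parsecs
print(round(absolute_magnitude(-1.47, 2.64), 2))  # about 1.42
```

Note that although Sirius is the brightest star in the night sky as seen from Earth, its absolute magnitude of roughly +1.4 is unremarkable; its apparent brilliance is largely due to its proximity.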
An important consideration when measuring magnitude is the frequency of the light being measured. All light-measuring instruments are sensitive over a particular range of wavelengths, so the apparent brightness in one waveband may differ from that in another. To account for this, any measurement of apparent magnitude must include details of how it was obtained.
Some examples include the maximum brightness of Venus, which is -4.1; Sirius, the brightest star in the sky, which has a value of -1.47; and the maximum brightness of Pluto, which is 13.65. The sun has an observed magnitude of -26.7, making it the brightest object in the sky. By comparison, the full moon only has a magnitude of -12.6.
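The figures above can be compared directly with the magnitude-ratio formula. For instance, the difference between the sun (−26.7) and the full moon (−12.6) is 14.1 magnitudes, which works out to a brightness ratio of roughly 400,000:

```python
def brightness_ratio(m1, m2):
    """How many times brighter an object of magnitude m1 appears
    than one of magnitude m2 (lower magnitude = brighter)."""
    return 10 ** (0.4 * (m2 - m1))

# Sun (-26.7) versus full moon (-12.6):
print(brightness_ratio(-26.7, -12.6))  # roughly 400,000

# Sirius (-1.47) versus Pluto at its brightest (13.65):
print(brightness_ratio(-1.47, 13.65))
```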