- published: 13 Jul 2012
The apparent magnitude (m) of a celestial body is a measure of its brightness as seen by an observer on Earth, adjusted to the value it would have in the absence of the atmosphere. The brighter the object appears, the lower the value of its magnitude.
The scale now used to indicate magnitude originates in the Hellenistic practice of dividing stars visible to the naked eye into six magnitudes. The brightest stars were said to be of first magnitude (m = 1), while the faintest were of sixth magnitude (m = 6), the limit of human visual perception (without the aid of a telescope). Each grade of magnitude was considered twice the brightness of the following grade (a logarithmic scale). This somewhat crude method of indicating the brightness of stars was popularized by Ptolemy in his Almagest, and is generally believed to originate with Hipparchus. This original system did not measure the magnitude of the Sun. (For a more detailed discussion of the history of the magnitude system, see Magnitude.)
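The modern scale keeps the Hellenistic convention (brighter means a lower number) but fixes the step size precisely: a difference of five magnitudes is defined as a brightness factor of exactly 100, so each magnitude corresponds to a factor of 100^(1/5) ≈ 2.512. As a minimal sketch, the flux ratio between two objects can be computed from their apparent magnitudes like this (the function name `flux_ratio` is illustrative, not from the text):

```python
def flux_ratio(m1, m2):
    """Brightness (flux) ratio F1/F2 for apparent magnitudes m1 and m2,
    using the Pogson relation: m1 - m2 = -2.5 * log10(F1 / F2)."""
    return 10 ** (-0.4 * (m1 - m2))

# A first-magnitude star vs. a sixth-magnitude star (the naked-eye limit):
print(flux_ratio(1.0, 6.0))  # 100.0 -- five magnitudes = a factor of 100
# A single magnitude step:
print(flux_ratio(1.0, 2.0))  # ~2.512
```

Note that the ancient "twice as bright per grade" rule was only approximate; the modern definition replaces it with the exact logarithmic step shown above.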