Chapter 17 Analyzing Starlight

17.1 The Brightness of Stars

Learning Objectives

By the end of this section, you will be able to:

  • Explain the difference between luminosity and apparent brightness
  • Understand how astronomers specify brightness with magnitudes

Luminosity

Perhaps the most important characteristic of a star is its luminosity—the total amount of energy at all wavelengths that it emits per second. Earlier, we saw that the Sun puts out a tremendous amount of energy every second. (And there are stars far more luminous than the Sun out there.) To make the comparison among stars easy, astronomers express the luminosity of other stars in terms of the Sun’s luminosity. For example, the luminosity of Sirius is about 25 times that of the Sun. We use the symbol LSun to denote the Sun’s luminosity; hence, that of Sirius can be written as 25 LSun. In a later chapter, we will see that if we can measure how much energy a star emits and we also know its mass, then we can calculate how long it can continue to shine before it exhausts its nuclear energy and begins to die.

Apparent Brightness

Astronomers are careful to distinguish between the luminosity of the star (the total energy output) and the amount of energy that happens to reach our eyes or a telescope on Earth. Stars are democratic in how they produce radiation; they emit the same amount of energy in every direction in space. Consequently, only a minuscule fraction of the energy given off by a star actually reaches an observer on Earth. We call the amount of a star’s energy that reaches a given area (say, one square meter) each second here on Earth its apparent brightness. If you look at the night sky, you see a wide range of apparent brightnesses among the stars. Most stars, in fact, are so dim that you need a telescope to detect them.

If all stars were the same luminosity—if they were like standard bulbs with the same light output—we could use the difference in their apparent brightnesses to tell us something we very much want to know: how far away they are. Imagine you are in a big concert hall or ballroom that is dark except for a few dozen 25-watt bulbs placed in fixtures around the walls. Since they are all 25-watt bulbs, their luminosity (energy output) is the same. But from where you are standing in one corner, they do not have the same apparent brightness. Those close to you appear brighter (more of their light reaches your eye), whereas those far away appear dimmer (their light has spread out more before reaching you). In this way, you can tell which bulbs are closest to you. In the same way, if all the stars had the same luminosity, we could immediately infer that the brightest-appearing stars were close by and the dimmest-appearing ones were far away.

To pin down this idea more precisely, recall from the Radiation and Spectra chapter that we know exactly how light fades with increasing distance. The energy we receive is inversely proportional to the square of the distance. If, for example, we have two stars of the same luminosity and one is twice as far away as the other, it will look four times dimmer than the closer one. If it is three times farther away, it will look nine (three squared) times dimmer, and so forth.
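
If you would like to see the inverse-square rule in numbers, here is a minimal Python sketch (our own illustration; the function name is invented for this example) that compares two stars of equal luminosity at different distances.

def dimming_factor(distance_ratio):
    """How many times dimmer a star appears if it is `distance_ratio`
    times farther away than an otherwise identical star (inverse-square law)."""
    return distance_ratio ** 2

# A star twice as far away looks 4 times dimmer; three times as far, 9 times dimmer.
for ratio in (2, 3, 10):
    print(f"{ratio} times farther away -> {dimming_factor(ratio)} times dimmer")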

Alas, the stars do not all have the same luminosity. (Actually, we are pretty glad about that because having many different types of stars makes the universe a much more interesting place.) But this means that if a star looks dim in the sky, we cannot tell whether it appears dim because it has a low luminosity but is relatively nearby, or because it has a high luminosity but is very far away. To measure the luminosities of stars, we must first compensate for the dimming effects of distance on light, and to do that, we must know how far away they are. Distance is among the most difficult of all astronomical measurements. We will return to how it is determined after we have learned more about the stars. For now, we will describe how astronomers specify the apparent brightness of stars.

The Magnitude Scale

The process of measuring the apparent brightness of stars is called photometry (from the Greek photo meaning “light” and –metry meaning “to measure”). As we saw in Observing the Sky: The Birth of Astronomy, astronomical photometry began with Hipparchus. Around 150 B.C.E., he erected an observatory on the island of Rhodes in the Mediterranean. There he prepared a catalog of nearly 1000 stars that included not only their positions but also estimates of their apparent brightnesses.

Hipparchus did not have a telescope or any instrument that could measure apparent brightness accurately, so he simply made estimates with his eyes. He sorted the stars into six brightness categories, each of which he called a magnitude. He referred to the brightest stars in his catalog as first-magnitude stars, whereas those so faint he could barely see them were sixth-magnitude stars. During the nineteenth century, astronomers attempted to make the scale more precise by establishing exactly how much the apparent brightness of a sixth-magnitude star differs from that of a first-magnitude star. Measurements showed that we receive about 100 times more light from a first-magnitude star than from a sixth-magnitude star. Based on this measurement, astronomers then defined an accurate magnitude system in which a difference of five magnitudes corresponds exactly to a brightness ratio of 100:1. In addition, the magnitudes of stars are decimalized; for example, a star isn’t just a “second-magnitude star,” it has a magnitude of 2.0 (or 2.1, 2.3, and so forth).

So what number is it that, when multiplied together five times, gives you this factor of 100? Play with your calculator and see if you can get it. The answer turns out to be about 2.5, which is the fifth root of 100. This means that a magnitude 1.0 star and a magnitude 2.0 star differ in brightness by a factor of about 2.5. Likewise, we receive about 2.5 times as much light from a magnitude 2.0 star as from a magnitude 3.0 star. What about the difference between a magnitude 1.0 star and a magnitude 3.0 star? Since the difference is a factor of about 2.5 for each “step” of magnitude, the total difference in brightness is about 2.5 × 2.5 = 6.25 times.

Here are a few rules of thumb that might help those new to this system. If two stars differ by 0.75 magnitudes, they differ by a factor of about 2 in brightness. If they are 2.5 magnitudes apart, they differ in brightness by a factor of 10, and a 4-magnitude difference corresponds to a difference in brightness of a factor of about 40.

You might be saying to yourself at this point, “Why do astronomers continue to use this complicated system from more than 2000 years ago?” That’s an excellent question and, as we shall discuss, astronomers today can use other ways of expressing how bright a star looks. But because this system is still used in many books, star charts, and computer apps, we felt we had to introduce students to it (even though we were very tempted to leave it out).
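
If you want to check the factor of about 2.5 per magnitude and the rules of thumb above, the short Python sketch below (our own illustration, not part of the original text) turns any magnitude difference into a brightness factor using the defining ratio of 100:1 for five magnitudes.

# One magnitude step corresponds to the fifth root of 100, about 2.512.
MAG_STEP = 100 ** (1 / 5)

def brightness_factor(delta_mag):
    """Brightness ratio corresponding to a magnitude difference of `delta_mag`."""
    return MAG_STEP ** delta_mag

for dm in (0.75, 1.0, 2.0, 2.5, 4.0, 5.0):
    print(f"A difference of {dm} magnitudes is a brightness factor of {brightness_factor(dm):.1f}")

Running this confirms the rules of thumb: 0.75 magnitudes gives a factor of about 2, 2.5 magnitudes gives exactly 10, and 4 magnitudes gives about 40. Note that with the exact step of 2.512, a two-magnitude difference works out to about 6.3 rather than the 6.25 obtained from the rounded value of 2.5.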

The brightest stars, those that were traditionally referred to as first-magnitude stars, actually turned out (when measured accurately) not to be identical in brightness. For example, the brightest star in the sky, Sirius, sends us about 10 times as much light as the average first-magnitude star. On the modern magnitude scale, Sirius has been assigned a magnitude of −1.5. Other objects in the sky can appear even brighter. Venus at its brightest is of magnitude −4.4, while the Sun has a magnitude of −26.8. Figure 1 shows the range of observed magnitudes from the brightest to the faintest, along with the actual magnitudes of several well-known objects. The important fact to remember when using magnitudes is that the system goes backward: the larger the magnitude, the fainter the object you are observing.

Figure 1. Apparent Magnitudes of Well-Known Objects. The scale runs from about −30 (brightest) to +35 (faintest); the larger the magnitude, the fainter the object. Approximate values: the Sun at −26, the Moon at −13, Venus at its brightest at −4.5, Jupiter and Mars at −3, Sirius at −1.5, Alpha Centauri at 0, Betelgeuse at +0.5, Polaris at +2, the faintest stars visible to the unaided eye at +6, Barnard’s Star at +9, the faintest objects visible with binoculars at +10, the limit of a 1-meter telescope at +19, a 4-meter telescope at +26, the Hale telescope at +27, and the Hubble and Keck telescopes at about +30. The faintest magnitudes that can be detected by the unaided eye, binoculars, and large telescopes are shown.

The Magnitude Equation
Even scientists can’t calculate fifth roots in their heads, so astronomers have summarized the above discussion in an equation to help calculate the difference in brightness for stars with different magnitudes. If m1 and m2 are the magnitudes of two stars, then we can calculate the ratio of their brightness (b2/b1) using this equation:

$$m_1 - m_2 = 2.5 \log\left(\frac{b_2}{b_1}\right) \qquad \text{or} \qquad \frac{b_2}{b_1} \approx 2.5^{\,m_1 - m_2}$$

Here is an exact way to write this equation, using 100 raised to the power 0.2 (about 2.512) in place of the rounded value 2.5:

$$\frac{b_2}{b_1} = \left(100^{0.2}\right)^{m_1 - m_2}$$
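
In Python, the two forms of the magnitude equation might look like the sketch below (a minimal illustration; the function names are our own). One function converts a magnitude difference into a brightness ratio, and the other inverts the relation.

import math

def brightness_ratio(m1, m2):
    """Brightness ratio b2/b1 for two stars with magnitudes m1 and m2.
    If star 1 is fainter (larger magnitude), the ratio is greater than 1."""
    return (100 ** 0.2) ** (m1 - m2)

def magnitude_difference(b2_over_b1):
    """Magnitude difference m1 - m2 corresponding to a brightness ratio b2/b1."""
    return 2.5 * math.log10(b2_over_b1)

# The worked example that follows: a dim star (m1 = 8.5) compared with Sirius (m2 = -1.5).
print(f"{brightness_ratio(8.5, -1.5):.0f}")   # 10000, so Sirius appears 10,000 times brighter
print(f"{magnitude_difference(10000):.1f}")   # 10.0, recovering the magnitude difference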

Let’s do a real example, just to show how this works. Imagine that an astronomer has discovered something special about a dim star (magnitude 8.5), and she wants to tell her students how much dimmer the star is than Sirius. Star 1 in the equation will be our dim star and star 2 will be Sirius.

Solution
Remember, Sirius has a magnitude of −1.5. In that case:

$$\frac{b_2}{b_1} = \left(100^{0.2}\right)^{8.5-(-1.5)} = \left(100^{0.2}\right)^{10} = 100^{2} = 100 \times 100 = 10{,}000$$

Check Your Learning
It is a common misconception that Polaris (magnitude 2.0) is the brightest star in the sky, but, as we saw, that distinction actually belongs to Sirius (magnitude −1.5). How does Sirius’ apparent brightness compare to that of Polaris?

Answer:

$$\frac{b_{\text{Sirius}}}{b_{\text{Polaris}}} = \left(100^{0.2}\right)^{2.0-(-1.5)} = \left(100^{0.2}\right)^{3.5} = 100^{0.7} \approx 25$$

(Hint: If you only have a basic calculator, you may wonder how to take 100 to the 0.7th power. But this is something you can ask Google to do. Google now accepts mathematical questions and will answer them. So try it for yourself. Ask Google, “What is 100 to the 0.7th power?”)

Our calculation shows that Sirius’ apparent brightness is 25 times greater than Polaris’ apparent brightness.

 

Other Units of Brightness

Although the magnitude scale is still used for visual astronomy, it is not used at all in newer branches of the field. In radio astronomy, for example, no equivalent of the magnitude system has been defined. Rather, radio astronomers measure the amount of energy being collected each second by each square meter of a radio telescope and express the brightness of each source in terms of, for example, watts per square meter.

Similarly, most researchers in the fields of infrared, X-ray, and gamma-ray astronomy use energy per area per second rather than magnitudes to express the results of their measurements. Nevertheless, astronomers in all fields are careful to distinguish between the luminosity of the source (even when that luminosity is all in X-rays) and the amount of energy that happens to reach us on Earth. After all, the luminosity is a really important characteristic that tells us a lot about the object in question, whereas the energy that reaches Earth is an accident of cosmic geography.
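
The two quantities are tied together by the inverse-square law discussed earlier: a source with luminosity L, radiating equally in all directions, delivers L/(4πd²) watts to each square meter at distance d. As a rough illustration (the numbers below are approximate values we supply here, not measurements from this chapter), the Python sketch recovers the familiar energy flux of sunlight at Earth from the Sun’s total power output.

import math

SUN_LUMINOSITY_W = 3.83e26   # approximate total power output of the Sun, in watts
EARTH_DISTANCE_M = 1.496e11  # average Earth-Sun distance (one astronomical unit), in meters

def apparent_brightness(luminosity_w, distance_m):
    """Energy flux in watts per square meter received at distance_m from a source
    that radiates luminosity_w equally in all directions."""
    return luminosity_w / (4 * math.pi * distance_m ** 2)

# Prints roughly 1360 W per square meter, the energy flux of sunlight at the top of Earth's atmosphere.
print(f"{apparent_brightness(SUN_LUMINOSITY_W, EARTH_DISTANCE_M):.0f} W per square meter")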

To make comparisons among stars easy, in this text we avoid the use of magnitudes as much as possible and express the luminosity of other stars in terms of the Sun’s luminosity. For example, the luminosity of Sirius is 25 times that of the Sun. We use the symbol LSun to denote the Sun’s luminosity; hence, that of Sirius can be written as 25 LSun.

Key Concepts and Summary

The total energy emitted per second by a star is called its luminosity. How bright a star looks from the perspective of Earth is its apparent brightness. The apparent brightness of a star depends on both its luminosity and its distance from Earth. Thus, the determination of apparent brightness and measurement of the distance to a star provide enough information to calculate its luminosity. The apparent brightnesses of stars are often expressed in terms of magnitudes, which is an old system based on how human vision interprets relative light intensity.

Glossary

apparent brightness
a measure of the amount of light received by Earth from a star or other object—that is, how bright an object appears in the sky, as contrasted with its luminosity
luminosity
the rate at which a star or other object emits electromagnetic energy into space; the total power output of an object
magnitude
an older system of measuring the amount of light we receive from a star or other luminous object; the larger the magnitude, the less radiation we receive from the object

License


BCIT Astronomy XXXX: YYYY Copyright © 2017 by OpenStax is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
