Supernovae and Einstein's "Greatest Blunder"

View from the Hubble Space Telescope: Supernova 1994D (lower left) in a Virgo cluster galaxy about 50 million light years from Earth.
The exploding stars called supernovae are, in a sense, very rare phenomena. Supernovae appear when old, extremely dense stars called white dwarfs (having the mass of the sun compacted into a sphere the size of the earth) reach a critical mass and explode in a thermonuclear firestorm as bright as a billion stars. Statistically, an astronomer might hope to see one in the Milky Way every century or so. But these fireballs don't appear on schedule. No one has actually observed a supernova in our galaxy since Johannes Kepler did in 1604; the Danish astronomer Tycho Brahe had spotted one in 1572.
Until the 1920s, astronomers believed that the Milky Way was the entire universe. Then, using more powerful telescopes, Edwin Hubble established distances to other spiral nebulae and showed that they were separate galaxies like our own Milky Way. The vast reaches of intergalactic space greatly multiply one's chances of finding supernovae, which turn out to be not so rare after all. Using computer-controlled telescopes and digital cameras, astronomers now can scan several thousand galaxies per night. By comparing digital images of the same patch of sky, they can uncover about 10 supernova candidates per observing session.
In this way, professor of astronomy Robert Kirshner '70 and his colleagues have been able to study scores of very distant supernovae. Intriguingly, the supernovae have turned out to be considerably dimmer--about 25 percent fainter--than expected. This suggests, Kirshner says, that "their light has to travel farther from the explosion to the telescope," indicating that space is larger than previously supposed, and that cosmic expansion has been accelerating recently. These findings hold such significance that Science magazine designated them the "scientific breakthrough of the year" for 1998.
Using the "redshift"--the way the spectral profile of light shifts toward the red end of the spectrum as its source moves away from us--Hubble showed in 1929 that the velocities of receding galaxies increase with their distance. From these data, scientists concluded that we live in an expanding universe. "Hubble's constant"--an index of the steady pace of cosmic expansion--suggests that the universe is about 15 billion years old.
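For the curious reader, that age estimate is a back-of-the-envelope calculation: if the expansion rate has always been what it is today, the universe's age is simply one over the Hubble constant. A minimal sketch in Python, with an illustrative value of the constant in the range astronomers then favored:

```python
# Naive age estimate: assume the expansion rate has been constant,
# so age = 1 / H0.  The value of H0 here is illustrative only.
H0 = 65.0                  # Hubble constant, km/s per megaparsec (assumed)
KM_PER_MPC = 3.086e19      # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7

age_seconds = KM_PER_MPC / H0            # 1 / H0, expressed in seconds
age_years = age_seconds / SECONDS_PER_YEAR
print(f"Hubble time: {age_years / 1e9:.1f} billion years")  # ~15.0
```

This simple division is exactly what the next paragraph questions: if the expansion rate has not been constant, extrapolating today's pace backward gives the wrong age.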
But what if cosmic expansion has not been constant? Perhaps, billions of years ago, things moved apart much more rapidly, and we are currently witnessing a slowed-down universe. "It's like clocking a marathon runner over the last five miles of the course," says Kirshner. "He might be running more slowly toward the end. So if you extrapolated that pace to 26 miles, you'd overestimate his actual time for the course." Conversely, cosmic expansion may actually be accelerating--as Kirshner's supernovae studies suggest.
To interpret their data, Kirshner's group had to account for the fact that supernovae are not all created equal: some, in fact, are three times brighter than others. Reasoning from brightness alone, one could conclude that a bright supernova is much nearer than it actually is. Luckily, the explosions have somewhat predictable life cycles. They generally last one to two months, and light from bigger, brighter supernovae dims more slowly than that from lesser examples. Using time plots and spectra from these light curves, Kirshner's graduate student Adam Riess, Ph.D. '96, working with professor of astronomy and of physics William Press '69, found a way to calculate (within 12 percent) a supernova's inherent brightness, using solar luminosities as units. "Now we have a relationship between the shape of the light curve and the star's intrinsic brightness," says Kirshner. The typical "Type Ia supernova" that Kirshner studies is four billion times as bright as the sun.
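A toy calculation (not the team's actual analysis) shows why a standard candle matters: once a supernova's intrinsic brightness is known, the inverse-square law converts observed flux into distance, and a 25 percent shortfall in flux implies the light traveled roughly 15 percent farther than expected. The numbers below are illustrative:

```python
import math

# Toy inverse-square-law sketch: observed flux f = L / (4*pi*d^2),
# so with a known luminosity L, distance d = sqrt(L / (4*pi*f)).
def distance_from_flux(luminosity, flux):
    return math.sqrt(luminosity / (4 * math.pi * flux))

expected_flux = 1.0                    # arbitrary units
observed_flux = 0.75 * expected_flux   # "about 25 percent fainter"

d_expected = distance_from_flux(1.0, expected_flux)
d_observed = distance_from_flux(1.0, observed_flux)
print(f"Light traveled {100 * (d_observed / d_expected - 1):.0f}% farther")  # ~15%
```

That extra distance, multiplied across billions of light years, is what points to an expansion that has been speeding up.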
This "standard candle" adds meaning to the discovery of unexpectedly dim supernovae. Scientists had expected that cosmic expansion would slow down due to the gravitational attraction of matter in the universe, which could offset the original energetic impetus of the Big Bang. The effect would resemble an earthly rocket launched exactly at its "escape velocity"--able neither to escape the planet's gravitational field nor to fall back to earth. But calculations of such an expansion would make the universe only 10 billion years old, and some of the oldest stellar groupings in the Milky Way, the globular clusters, are known to be 12 billion to 14 billion years old--making them older than the universe itself, a clear impossibility. Kirshner and other astronomers have measured the density of matter in the universe by taking a census of galaxies; these data indicate that matter is spread far too thinly--perhaps reaching only 30 percent of the critical level--to support the escape-velocity scenario.
Enter Albert Einstein, whose general theory of relativity, published in 1916, advanced a theory of gravity requiring the universe either to expand or to contract. The Dutch astronomer Willem de Sitter told Einstein that observational data showed it was doing neither, but was instead in stasis. To account for this, Einstein added to his field equations a "cosmological constant"--a "vacuum energy" that stretches space--whose repulsive effect could offset gravitational attraction and thus account for a static universe.
Only a few years later, Hubble's discoveries demolished the notion of a static cosmos, and Einstein later referred to the cosmological constant as his "greatest blunder." Not so fast, Albert. Such a factor could explain an accelerating expansion of the universe, along with those unexpectedly dim supernovae. If so, Kirshner's research might show Einstein's "greatest blunder" to be yet another triumph of the great physicist's imagination.
~ Craig Lambert