22 April 2009

Dark forces

Cosmos Magazine
Is the ultimate fate of our universe dependent on an utterly inexplicable form of dark energy?
Composition of the universe: According to the latest observational evidence, ordinary matter – including stars and planets – makes up only a tiny fraction of the universe (5%). The rest is the elusive dark matter (25%) and dark energy (70%).

Here’s a question for you. What is the stuff that makes up the majority of the universe? Stars and galaxies? Hydrogen gas? Planets like ours? Nope. It’s none of them. Here’s a hint: we have absolutely no idea.

For more than 75 years, we’ve known the cosmos has been running away from us, caught up in an ongoing expansion, with galactic clusters edging away from each other like so many raisins in an inflating cosmological cupcake.

That’s hard enough to wrap one’s head around. But out in the remote reaches of the universe – beyond the familiar constellations and our local group of galaxies – cosmologists now tell us they see hints that the universe is undergoing a violent tug of war between gravity and a mysterious dark energy.

How did we come to this startling conclusion? It all began in 1915 when Albert Einstein, then at the University of Berlin, came up with his theory of general relativity.

His equations pointed to the fact that the natural state of the universe was one of flux: either expanding or contracting. But since that had yet to be confirmed observationally, he tried to make his equations fit the data at hand.

As a result, his ‘cosmological constant’ was born in 1917: a repulsive ‘negative pressure’ added to his equations to counteract the attractive pull of gravity, so that they would fit with the assumed static universe of the day.

However, by 1929, Edwin Hubble’s groundbreaking observations at California’s Mount Wilson Observatory had shown that the wavelength of light from distant galaxies is shifted further into the red portion of the spectrum compared with the light from nearer galaxies.

Even more astounding: there appeared to be a strong linear relationship between the distance and shift into the red.

The implication was staggering. If the light had been stretched, or ‘redshifted’, through the Doppler effect – the same distorting effect that causes an ambulance siren to rise in pitch when approaching or drop as it recedes – it could only mean one thing: everything in the universe was hurtling away from our humble Milky Way, and the further away it was, the faster it fled.
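The arithmetic behind that redshift-to-distance reasoning is simple enough to sketch. The snippet below is a minimal illustration, assuming a fiducial Hubble constant of 70 km/s/Mpc and the small-redshift Doppler approximation v ≈ cz; the hydrogen-alpha wavelengths are illustrative values, not figures from the article.

```python
# Redshift -> recession velocity -> distance, via Hubble's linear law.
# Assumed values: H0 = 70 km/s/Mpc (a modern round figure) and the
# small-z Doppler approximation v = c*z, valid only for z << 1.

C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per megaparsec (assumed)

def redshift(observed_nm: float, rest_nm: float) -> float:
    """Fractional stretch of the wavelength: z = (observed - rest) / rest."""
    return (observed_nm - rest_nm) / rest_nm

def recession_velocity_km_s(z: float) -> float:
    """Doppler approximation for small redshifts: v = c * z."""
    return C_KM_S * z

def hubble_distance_mpc(v_km_s: float) -> float:
    """Hubble's law: the further away, the faster it flees -- d = v / H0."""
    return v_km_s / H0

# Example: a hydrogen-alpha line (rest 656.3 nm) observed at 663.0 nm.
z = redshift(663.0, 656.3)        # ~0.0102
v = recession_velocity_km_s(z)    # ~3,060 km/s
d = hubble_distance_mpc(v)        # ~44 megaparsecs
```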

Either there was something uniquely repulsive – physically or figuratively – about our home, or the entire universe was undergoing explosive expansion.

Thus, in a 1932 joint paper published in the Proceedings of the National Academy of Sciences, Einstein and Dutch astronomer Willem de Sitter noted that since the universe had been observed to be expanding, a cosmological constant was no longer necessary.

Furthermore, other cosmologists noted that if the universe was getting bigger, this implied it had been smaller in the past. If you went back far enough, the entire universe would have originated from a very small point: the Big Bang. The big question was whether the expansion would continue indefinitely, or whether it would slow or possibly reverse, leading to a ‘Big Crunch’.

While Einstein quickly dropped the cosmological constant from his equations and brushed it aside, Sir Arthur Eddington did not. Eddington, a professor of astronomy at the University of Cambridge in England, was prescient enough to observe in his 1934 lectures that, should Einstein’s “cosmical repulsion” get the upper hand, the universe would go on expanding forever.

However, Eddington lacked the means to prove Einstein’s earlier ideas about a cosmological constant. Consequently, the concept was relegated to the fringes, occasionally rearing its head at cosmological conferences through the years, but largely cast aside as what Einstein reportedly called his “greatest blunder”.

Part of the problem was the lack of a reliable cosmological yardstick – a kind of extragalactic standard candle of known intrinsic brightness that could be used to measure the distances to faraway galaxies.

For example, if we know how luminous a star is in absolute terms, and measure how bright it appears to us, we can perform a simple calculation to determine its distance.
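That simple calculation is the distance modulus, sketched below in Python. The Type Ia peak absolute magnitude of -19.3 is an assumed fiducial value for illustration, not a figure quoted in the article.

```python
def distance_parsecs(apparent_mag: float, absolute_mag: float) -> float:
    """Invert the distance modulus relation m - M = 5 * log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# Example: Type Ia supernovae peak near absolute magnitude -19.3 (an
# assumed fiducial value). One observed at apparent magnitude +15.7
# must then lie at:
d_pc = distance_parsecs(15.7, -19.3)   # 10^((15.7 + 19.3 + 5)/5) = 10^8 pc
d_mpc = d_pc / 1e6                     # = 100 megaparsecs
```

Note that brightness falls with the square of distance, which is why a standard candle's apparent magnitude alone pins down how far away it is.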

Astronomers needed these standard candles to measure the distance of far-off highly redshifted galaxies, thus enabling them to accurately pinpoint the rate of expansion of the universe. But the discovery of these standard candles wasn’t to come for another three decades.

Meanwhile, in 1965, Arno Penzias and Robert Wilson, working at Bell Labs in the U.S., serendipitously tuned in to a strangely pervasive radiation coming from all directions in space.

This ‘cosmic microwave background’ (CMB) radiation turned out to be the faint remaining echo of the Big Bang, which is why they could detect it wherever they pointed their antenna.

However, while the CMB gave us strong evidence that the Big Bang actually happened, it remained a mystery how such a colossal explosion could result in the smooth and relatively homogeneous universe we see today.

One would expect an explosion of space, time and energy such as the Big Bang to end up with huge clumps of matter in some areas, and vast tracts of nothingness in others.

Fifteen years after Penzias and Wilson’s discovery, in 1980, Alan Guth of the Massachusetts Institute of Technology suggested that the universe went through a period of inflation – an incredibly rapid expansion lasting only a tiny fraction of a second in the moments after the Big Bang.

Between inflation and the CMB, cosmologists had a surprisingly detailed picture of the earliest days of our universe.

And they had pretty good inklings about the make-up and evolutionary history of the local universe. What they lacked was a thorough and detailed account of how the universe had expanded throughout its long history and, to put such an account together, they would need reliable standard candles to probe the evolution of the universe in between.

Then, in 1968 – three years after Penzias and Wilson’s discovery – astronomers finally found their intergalactic yardsticks. Charles Kowal proposed that Type Ia supernovae – which were known to explode with a very predictable brightness – would make perfect standard candles, and could yield distances accurate to within 10 per cent.

But what the astronomers found when they looked for these telltale supernovae was totally unexpected. In our basic model of an inflationary Big Bang, astronomers expected to see some evidence to indicate that the universe is decelerating. However, observations of Type Ia supernovae made in the late 1990s by two independent teams indicated that instead of decelerating, at certain epochs in its history, the cosmos was actually accelerating.

In 1988, Saul Perlmutter, now a professor of physics at the University of California, Berkeley, led a team called the Supernova Cosmology Project that quietly began supernovae observations on the 3.9-metre Anglo-Australian Telescope at Siding Spring Observatory near Coonabarabran in New South Wales.

A few years later, American-born researcher Brian Schmidt, then of Harvard University and now an astronomer at the Australian National University in Canberra, led a team called the High-Z Supernova Search. Schmidt and colleagues began taking data using Chile’s four-metre Blanco telescope in early 1995.

“I had just moved to Australia,” says Schmidt, “and had just had a child four months earlier. Foolishly, I tried to manage our Chile search from Australia. Back then the Internet was almost nonexistent between Australia and Chile and the software wasn’t working.”

Despite these communication difficulties, Schmidt’s team was able to scan a patch of sky larger than the full Moon every five minutes, with enough sensitivity to find Type Ia supernovae halfway across the universe.

They surveyed a million galaxies each night, and found dozens of supernovae. Within two years they had found 14 usable objects from all over the sky.

In 1997, Adam Riess, a key member of Schmidt’s team, was a postdoctoral researcher at the University of California, Berkeley, and had done his thesis on measuring distances to Type Ia supernovae. He spent much of his time there analysing the team’s data and applying a method that took into account the dimming effect of cosmic dust.

“Towards the end of 1997, I was asking the computer to tell me how much mass was in the universe from how much the universe was decelerating,” says Riess, now a professor of astronomy at Johns Hopkins University in Baltimore, USA. “The computer was reporting back negative mass, which was the computer’s way of saying the universe is accelerating. Negative mass doesn’t exist,” he says.

“Then I introduced Einstein’s cosmological constant to the data and it was a great fit. The data preferred a universe with a 70 per cent cosmological constant,” he says.

Instead of being thrilled that Einstein’s much-maligned cosmological constant had filled a needed gap in the equations, Riess and Schmidt began looking for where they must have gone wrong. But after painstaking checking, they could reach no conclusion other than that the expansion of the universe was accelerating.

Meanwhile, by March of 1998, Perlmutter’s team had found more than 75 Type Ia supernovae of their own at high redshift, and were also coming to the conclusion that something was strange about their data.

“We submitted our paper to the Astronomical Journal in March 1998,” says Schmidt, “and [Perlmutter's team] submitted theirs to the Astrophysical Journal in September 1998. We came to similar conclusions, and learned that 72 per cent of the universe is made up of something we didn’t expect to be there.”

Riess, the lead author of the pivotal 1998 paper, soon left for the Space Telescope Science Institute and created his own team called the Higher-Z Supernova Search. There he was able to use the Hubble Space Telescope to look for an elusive epoch of cosmic deceleration.

Aside from observing acceleration in the rate of expansion of the universe at 6.5 billion years after the Big Bang, Riess and colleagues also found a so-called ‘subdominant dark energy’ only five billion years after the Big Bang.

Riess likens this subdominant dark energy to putting your foot on the accelerator in your car, while gravity is like having your other foot on the brake – at the same time. “Whichever you’re pressing harder is going to determine whether the car accelerates or decelerates,” says Riess. “We see the universe decelerating, but we can tell that dark energy is still present.”

Cosmologists saw evidence of this subdominant dark energy beginning to push against gravity at around five billion years after the Big Bang. At that time the universe was still decelerating, following the rapid expansion caused by inflation immediately after the Big Bang.

Then, somewhere around nine billion years after the Big Bang, there’s a tipping (or coasting) point where there’s neither acceleration nor deceleration.

Riess says his recent data shows evidence for full-fledged dark energy acceleration only three billion years ago; although other papers cite a dark-energy driven universe at times ranging from five to seven billion years ago. Perlmutter, Riess and Schmidt were awarded the US$1 million 2006 Shaw prize for their work in measuring cosmic acceleration.

However, while the supernovae measurements appear sound, theorists are still debating whether dark energy really is the repulsive ‘negative pressure’ originally proposed by Einstein.

Theoretical physicists have had a field day, suggesting that the observed effect acting against gravity might be caused by anything from gravitational leakage into extra dimensions to faulty cosmological geometry. Some believe gravity may not be constant after all, but may actually vary from one corner of the universe to another.

Robert Kirshner, an observational astronomer at the Harvard-Smithsonian Center for Astrophysics, and doctoral advisor to Riess and Schmidt, says the controversy is not over whether the universe is accelerating, but over how to interpret the data.

“The simplest idea is that dark energy is constant,” says Kirshner, “And as time goes by, the density of matter gets lower, so the effect of the dark energy becomes more noticeable. But it could be more complicated and could turn on and off and do wacky things. The balance between dark matter and dark energy has been changing over time.”

Dark matter is another cosmological enigma. Astronomers have long observed that galaxies are often clumped together in clusters. In 1933, Swiss astronomer Fritz Zwicky observed the motions of these clusters, and calculated the mass necessary to explain their movements.

The figures he came up with were perturbing, to say the least. He calculated that the combined gravity of all the observed galaxies was insufficient to keep them clumped together.

Other observations of individual galaxies also showed unexpected results. The stars in the outer reaches of many galaxies were moving faster than expected. Something must have been tugging them towards the centre of their host galaxy lest they be flung out into deep space.

The solution: there must exist an unseen form of matter, which exerts a gravitational pull, helping keep galaxies together. Furthermore, this matter must have many times more mass than the billions of stars within the galaxies themselves.

Future observations will allow cosmologists to determine the history of dark energy and dark matter over time. Kirshner says that as the universe expands and becomes less dense, dark matter is losing its grip, but in its simplest theoretical form, dark energy has stayed constant. So, he says, there’s a tipping point between one side winning over the other.
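Kirshner’s tipping point can be estimated with one line of algebra: matter density dilutes with the cube of the expansion factor, while a constant dark energy does not, so the two balance where (1+z)³ equals the ratio of dark energy to matter today. The sketch below uses the article’s rough 70/30 split; the exact redshift depends on which cosmological parameters one assumes.

```python
# Matter (dark plus ordinary) dilutes as the universe expands:
# rho_m(z) = rho_m0 * (1+z)^3. In its simplest form, dark energy
# stays constant. Using the article's rough present-day split:
OMEGA_MATTER = 0.30        # dark matter (25%) + ordinary matter (5%)
OMEGA_LAMBDA = 0.70        # dark energy

# The densities balance when (1+z)^3 = OMEGA_LAMBDA / OMEGA_MATTER:
z_eq = (OMEGA_LAMBDA / OMEGA_MATTER) ** (1 / 3) - 1   # ~0.33
```

A redshift of about 0.33 corresponds to a few billion years ago, broadly consistent with the timeline the supernova surveys describe.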

However, some theorists believe dark energy-induced acceleration may be a dynamic process that switches on and off over time. But what could cause dark energy to suddenly switch off after billions of years remains a matter of speculation.

Then there’s the ‘cosmological constant problem’. Cosmologists have attempted to predict the value of the cosmological constant, which represents the strength of dark energy, using quantum field theory – the framework that applies quantum mechanics to fields and the particles that arise from them.

However, the value resulting from such raw calculations is around 120 orders of magnitude larger than what is currently observed. So why is dark energy so much smaller than expected? There are, as yet, no satisfactory answers.
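The size of that mismatch can be sketched by comparing the naive quantum-field-theory estimate – one Planck energy per Planck volume – with the observed dark energy density, taken here as 70 per cent of the critical density for an assumed Hubble constant of 70 km/s/Mpc (a round modern value, not from the article):

```python
import math

G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8             # speed of light, m/s
HBAR = 1.055e-34        # reduced Planck constant, J*s
H0 = 70e3 / 3.086e22    # assumed Hubble constant, converted to 1/s

# Naive vacuum energy density: one Planck energy per Planck volume.
planck_energy = math.sqrt(HBAR * C**5 / G)     # ~2e9 J
planck_length = math.sqrt(HBAR * G / C**3)     # ~1.6e-35 m
rho_qft = planck_energy / planck_length**3     # J per cubic metre

# Observed: ~70% of the critical energy density 3 H0^2 c^2 / (8 pi G).
rho_observed = 0.7 * 3 * H0**2 * C**2 / (8 * math.pi * G)

orders = math.log10(rho_qft / rho_observed)    # ~123 -- 'around 120 orders'
```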

Some, such as theoretical physicist and Nobel laureate Frank Wilczek, have even proposed that at some point along the line, photons of visible light may be morphing from electromagnetic radiation into hypothetical subatomic particles known as axions, and back again.

If so, axions might not behave or interact the same way as ordinary photons, and thus may not interact with the photon detectors so vital to current observational astronomy. If that were the case, they would make objects like distant supernovae look inherently fainter than they actually are, thus confounding our calculations of distance.

Finally, some theoretical physicists are explaining the observations with the idea that the universe may not be as homogeneous or isotropic as is currently believed. We may just happen to live in the centre of a large under-dense void in a universe that is much larger than we can see from Earth.

As a result, the observed rate of expansion would differ in the surrounding regions of the universe that are more dense, playing havoc with our figures. We might simply be interpreting supernovae outside our under-dense void as accelerating because locally our void is expanding at a faster rate than the rest of the universe.

“This is the fun time,” says Perlmutter. “We are just now getting new sets of data. If we don’t figure it out, we can despair later on.”

If dark energy is truly a cosmological phenomenon that has its roots in the fabric of the universe and physics as we understand them, then as Kirshner points out, it should show up in Earth-based laboratories. To date, particle physicists have been unable to test dark energy experimentally here on Earth.

The reasons why are reminiscent of an earlier era, when astronomers wrangled with another problem in physics: the speed of light. From observations within the Solar System, they were able to determine that the speed of light was indeed finite. However, it was not measured on Earth until French physicist Armand Fizeau’s ground-based experiments of 1849.

Christian Beck, a mathematical physicist at Queen Mary College, University of London, and colleagues have proposed a way that dark energy might be measured in the laboratory. The idea is that dark energy could be produced by very low frequency electromagnetic vacuum fluctuations, which may be detectable using current laboratory superconductors.

Three such European experiments are already in the works with first results expected in 2008.

Space agency NASA and the U.S. Department of Energy are taking a more conventional route by funding a joint dark energy mission. They have selected three proposals to compete for a US$600 million dark energy space mission, which would launch around 2013. All would deliver more supernovae measurements, with distance precision better than the eight or nine per cent accuracy of current ground-based observations.

Destiny, the Dark Energy Space Telescope, would detect and observe more than 3,000 supernovae over its two-year primary mission. SNAP, the Supernova/Acceleration Probe, is an ambitious proposal involving a spectrograph and both visible and infrared cameras. And ADEPT, the Advanced Dark Energy Physics Telescope, would conduct a large galaxy redshift survey, as well as map the three-dimensional positions of 100 million galaxies.

And for those who look at cosmology and physics as abstractions – esoteric sciences whose significance is hard to reconcile with a world labouring under pressing problems – we should remember that we appear to live in a privileged epoch in the history of the universe.

At this particular juncture of space-time in our small part of the universe, we’re in a remarkable position to determine the expansion history of the cosmos. That’s because we can observe it. In the future, we may not be so lucky, since dark energy is accelerating the expansion of the universe at an exponential rate.

In a decelerating universe, nothing would disappear; the longer we wait, the more we would see. An accelerating universe has the opposite effect. In 100 billion years, dark energy’s accelerating effect will render the CMB completely undetectable, and with it will go the observational basis of modern cosmology. By then, we may look upon a virtually empty void rather than a star- and galaxy-filled sky.

“In about two trillion years, the redshift will be so great that every galaxy outside our local group of galaxies will have completely disappeared from view,” says Lawrence Krauss, a cosmologist at Case Western Reserve University in Ohio.

“Future intelligent observers may derive a completely incorrect view of the universe. We appear to live in a special time. But by the same token, that suggests it’s possible to rigorously infer wrong things about the universe because of things one can’t observe,” he says.

Meanwhile, Brian Schmidt is in two minds about the chances of settling the dark energy issue before he retires from research. “By retirement,” says Schmidt, “I’d hope to be in a position that we know what dark energy is and we’re working on something else. But I expect to still be scratching my head, wondering what dark energy is and working on something else anyway.”

Bruce Dorminey is a science journalist based in the U.S. and the author of Distant Wanderers: The Search for Planets Beyond the Solar System.
