The idea that some gases in the atmosphere trap heat was first proposed in 1824 by Joseph Fourier. In 1859, experiments by John Tyndall identified the most important heat-trapping gases as water vapour and carbon dioxide. Half a century later, in 1896, Svante Arrhenius calculated that doubling the level of carbon dioxide would lead to a temperature increase of 5 to 6°C.
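Arrhenius's key insight was that warming scales roughly with the logarithm of CO2 concentration, so each doubling adds about the same temperature rise. A minimal sketch of that relationship, using an illustrative sensitivity of 5.5°C per doubling to match his 5 to 6°C estimate (the function name and the exact figure are our assumptions, not his published method):

```python
import math

def warming(co2_ratio, sensitivity_per_doubling=5.5):
    """Estimated temperature rise (degrees C) for a given ratio of CO2
    concentration to its starting level, assuming the response is
    logarithmic, as Arrhenius argued. The 5.5 C-per-doubling figure is
    illustrative only, chosen to match his 5-6 C estimate."""
    return sensitivity_per_doubling * math.log2(co2_ratio)

print(warming(2.0))  # doubling CO2 -> 5.5
print(warming(1.5))  # a 50% rise gives proportionally less warming
```

On this logarithmic view, halving emissions does not halve the warming from a given rise in concentration, which is part of why the relationship surprised Arrhenius's contemporaries.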
Arrhenius, though, was not only the first to predict that human-induced global warming might be a problem; he was also the first to come up with arguments for not worrying about it. His calculations suggested it would take 3000 years of industrial emissions at 1896 levels to double CO2, and that even this would not happen because the oceans would absorb most of the extra CO2.
A few years later, Knut Ångström came up with another objection. His test-tube experiments suggested that adding more CO2 to the atmosphere would not trap any more heat.
CO2 absorbs only certain frequencies of infrared radiation (heat), and Ångström’s results suggested the levels in the atmosphere were already high enough to absorb 100% of the radiation at these frequencies. The experiments also appeared to show that the frequencies absorbed by CO2 largely overlapped with those absorbed by water, another reason to think that adding more CO2 to the atmosphere would make no difference.
For most scientists, this was the nail in the coffin of the global warming hypothesis, which was consigned to the history books for decades. But in the 1930s, physicists started to realise that high in the atmosphere, where pressure and temperature are much lower, the absorption properties of gases change. Ångström’s experiments had been done at sea level and room temperature.
With better equipment, the blurry absorption bands seen by Ångström also resolved into fine lines. It turned out that there was not nearly as much overlap between the infrared absorption spectra of water vapour and CO2 as thought. In 1956, calculations by Gilbert Plass proved that Ångström had got it wrong: adding more and more CO2 to the atmosphere would trap more and more heat.
Around the same time, a detailed study of ocean chemistry by Roger Revelle showed that the seas would not soak up nearly as much extra CO2 as had been assumed. Revelle hired Charles Keeling to start measuring atmospheric CO2 levels, resulting in the famous Keeling curve showing how the CO2 level has climbed since 1958.
So by the 1950s, it was starting to become clear that human activity was causing CO2 levels to rise and that this rise would reduce the loss of heat into space. The implication seemed clear: provided all the other factors affecting the climate did not change, the Earth would warm.
It is worth stressing that this conclusion depends only on atmospheric physics and oceanic chemistry. It does not rely on the study of past climate.
But it was a conclusion scientists were reluctant to draw. There were many uncertainties and complexities involved. And at the time other climate factors had changed: the world had cooled slightly after 1940, due mainly to sulphate emissions. It was only in the late 1970s that a handful of scientists, including Wallace Broecker, began to warn that global warming was an imminent problem.
Why does this history matter? Because it shows that scientists, now united in agreement, were once the greatest climate change sceptics.
There is still plenty of scientific debate about just how much the world will warm, an issue complicated by the many feedback factors involved. There is also plenty of debate about specific effects, such as how tropical cyclone activity will change.
One by one, however, all the arguments against the central global warming hypothesis have been shown to be wrong. In fact, the strongest arguments were demolished long before most people alive today were born.
Yet there are still a handful of scientists, and far more non-scientists, who refuse to accept the idea of global warming. I am continually amazed by the way some individuals treat the idea of global warming with such extreme scepticism, yet uncritically seize upon anything that seems to challenge it, no matter how dubious.
For example, the remarkable correlation between CO2 levels and temperature going back 600,000 years is dismissed because of a few mismatches, but the fleeting correlation between cosmic ray intensity and low-altitude cloud cover, which broke down after less than two decades, is hailed as absolute proof that cosmic rays affect the climate.
You can find dozens of such arguments on the net, all purporting to prove that global warming is not real or will not be a problem. Most are based on misunderstandings or distortions of the science. Some are deliberate attempts to deceive.
Many of these arguments still find their way into mainstream media. Some are even parroted by politicians such as US Senator James Inhofe.
To non-scientists, these arguments can sound very persuasive. Why do CO2 levels only start to rise hundreds of years after the start of interglacial periods, for instance?
So for those who are perhaps confused by all the conflicting claims and want to find out what’s really going on, New Scientist has put together a special on climate myths.
Why? Because time is running out. We need to be debating how to achieve the drastic cuts in CO2 emissions that are required to reduce our impact on the climate, not wasting time endlessly rehashing a debate that was largely settled half a century ago.