Solar Power: The Real Force Behind Climate Change?

Global-warming alarmists ignore the awesome power of the Sun’s influence on our climate.
Ed Hiserodt

Carbon dioxide, that little molecule composed of one carbon atom and two oxygen atoms, is a modern-day environmental criminal. Its crime: trapping the Sun’s heat in the Earth’s atmosphere, potentially causing catastrophic global warming that will destroy all life on Earth. It’s so bad that the U.S. Environmental Protection Agency officially declared it a pollutant that endangers public health in 2009. Humans, especially those living in wealthy Western countries, are said to be co-conspirators in the climate-change crime, since they release those criminal molecules into the atmosphere every day via industrial activity.

At least, that’s the story we’ve all been told for the past few decades. But before we charge anyone (or anything) with destroying the Earth’s environment, it would be wise to do a little more detective work and gather all the available evidence. In the following article, we’ll do just that. Spoiler alert: We’ll find that CO2 and human industrial activity have been falsely accused. The evidence points to the Sun, not CO2, as the likely main driver of climate change, and to climate change itself as a perfectly natural phenomenon.

Before we get into the details of how solar activity helps drive the Earth’s climate, let’s take some time to briefly look at the “CO2 drives climate” argument and examine why it’s wrong.

Falsely Accused

The increase in atmospheric CO2 concentration over the last century and a half, from slightly under 300 parts per million (ppm), or 0.03 percent of the Earth’s atmosphere, to around 400 ppm, or 0.04 percent, is blamed for the observed warming of the Earth’s average temperature over that time period. Further, the fact that human industrial activity also increased dramatically over that same period seems to justify claims that the increase in CO2 levels is caused by humans. Ergo, humans are causing climate change.

“Climate scientists” often throw around terms such as CO2 forcing, sensitivity, positive feedback loop, tipping point, and runaway greenhouse when attempting to explain the supposedly disastrous effects of CO2 on our planet. Their argument goes something like this: Atmospheric CO2 concentration increases because of human industrial emissions; this is known as “CO2 forcing.” Because of the climate’s high “sensitivity” to CO2, this “forcing” will cause the Earth to warm, owing to the greenhouse effect. The warming will in turn change the Earth’s surface, melting ice sheets and the like, which will lower the Earth’s albedo (i.e., cause less sunlight to be reflected back into space), which will further warm the Earth in a sort of vicious cycle of global warming. This cycle is known as a “positive feedback loop.” Once the atmospheric CO2 concentration reaches a certain level, the Earth’s climate will have reached a “tipping point” where, owing to positive feedback loops, a “runaway greenhouse” becomes inevitable, meaning the planet will essentially be destroyed, or at least made uninhabitable, and nothing can be done to stop it. Climate activists such as Bill McKibben, founder of 350.org, say that we should keep atmospheric CO2 levels at 350 ppm or lower in order to avoid any untoward effects. Since we’re already at 400 ppm, it sounds like we’re in big trouble: humans must make immediate and drastic cuts to CO2 emissions to stave off certain doom, and we might already be too late.
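
To see how that chain of reasoning is usually put into numbers, here is a minimal toy calculation of my own, not any model from the IPCC or from the article: an initial warming is amplified by a feedback fraction f, the total response is the initial warming divided by (1 - f), and the series only “runs away” as f approaches 1.

```python
# Toy illustration of an amplifying ("positive") feedback loop.
# Assumptions (mine, not the article's): an initial warming dT0 of 1.0 C,
# and a feedback fraction f, meaning each round of warming triggers
# f times as much additional warming as the round before it.

def total_warming(dT0: float, f: float) -> float:
    """Sum of the geometric series dT0 * (1 + f + f^2 + ...) = dT0 / (1 - f)."""
    if f >= 1.0:
        return float("inf")  # the "runaway" case: the series never converges
    return dT0 / (1.0 - f)

for f in (0.0, 0.3, 0.6, 0.9, 1.0):
    print(f"feedback fraction {f:.1f} -> total warming {total_warming(1.0, f):.2f} C")
```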

Well, not so fast. First of all, the warming effect of increased CO2 concentrations is logarithmic, not linear. What does this mean? The best analogy is that of blankets on a cold winter night: The first one is a godsend, the second stops the shivering, and the third brings peaceful comfort. But by the time you pile on the ninth or 10th blanket, the effect is imperceptible. Likewise with CO2: While the Earth’s temperature could be expected to rise by about one degree Celsius if pre-industrial CO2 levels were doubled (i.e., from roughly 300 ppm to 600 ppm), adding another 300 ppm on top of that would produce considerably less additional warming; it would take a further doubling, to 1,200 ppm, to add the next degree. Evidence for this can be seen in temperature reconstructions done by paleoclimatologists. Using a commonly accepted age for the Earth of 4.6 billion years, paleoclimatologists’ reconstructions of past climates place CO2 levels between 1,000 and 2,000 ppm for much of the Earth’s history, with levels sometimes reaching as high as 4,000 ppm. So was the planet fried to a crisp? No! Reconstructions place mean surface temperatures, at most, at 7° Celsius higher than today’s levels. So much for runaway global warming caused by CO2 crossing some soon-to-be-reached “tipping point.”
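
To make the blanket analogy concrete, here is a back-of-the-envelope sketch, using the roughly 300 ppm baseline and the one-degree-per-doubling figure assumed in the paragraph above, that shows how a logarithmic response flattens out even at the high CO2 levels of the geologic past:

```python
import math

# Back-of-the-envelope sketch of a logarithmic CO2 response.
# Assumptions (taken from the paragraph above, not from measurement):
# about 1.0 C of warming per doubling of CO2 over a ~300 ppm baseline.

BASELINE_PPM = 300.0          # pre-industrial level cited in the article
WARMING_PER_DOUBLING_C = 1.0  # the article's assumed per-doubling figure

def warming(ppm: float) -> float:
    """Warming relative to the baseline, growing with log2 of concentration."""
    return WARMING_PER_DOUBLING_C * math.log2(ppm / BASELINE_PPM)

for ppm in (300, 400, 600, 1200, 2400, 4000):
    print(f"{ppm:>5} ppm -> {warming(ppm):+.2f} C relative to baseline")
```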

What’s more, there’s not even a strong correlation in more recent times between temperatures and CO2 levels. Most data reveal that temperature drives CO2 levels, not the other way around. Case in point: In the late 1800s, the Earth started coming out of a cold period known as the Little Ice Age that began around 1500. Atmospheric CO2 concentrations did not start increasing until after this warming trend began, and didn’t really take off until the mid-20th century. So how can warming cause CO2 levels to increase, rather than the other way around? Put simply, as the Earth warms, the oceans warm too. Water holds its temperature far better than air, so the oceans warm more slowly. And warm water holds less dissolved CO2, so as the oceans finally warm, they off-gas CO2 into the atmosphere.
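
The off-gassing mechanism rests on the fact that CO2 is less soluble in warm water. The sketch below is a rough illustration using the standard van 't Hoff temperature correction to Henry’s law; the numerical constants are commonly cited literature values that I am assuming here, not figures from the article:

```python
import math

# Rough sketch of CO2 solubility in water versus temperature, using the
# van 't Hoff temperature dependence of the Henry's-law constant.
# Assumed literature values (not from the article): kH ~ 0.034 mol/(L*atm)
# at 25 C, with a temperature coefficient of ~2400 K for CO2.

KH_25C = 0.034        # mol/(L*atm) at 25 C
VANT_HOFF_C = 2400.0  # K, approximate value for CO2
T_REF_K = 298.15      # 25 C in kelvins

def henry_constant(temp_c: float) -> float:
    """Henry's-law constant (dissolved CO2 per atm of CO2) at temp_c degrees C."""
    temp_k = temp_c + 273.15
    return KH_25C * math.exp(VANT_HOFF_C * (1.0 / temp_k - 1.0 / T_REF_K))

# Colder water holds noticeably more CO2; warming water releases it.
for t in (5, 10, 15, 20, 25):
    print(f"{t:>2} C -> {henry_constant(t):.4f} mol/(L*atm)")
```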

Understanding this solves a major problem: Computer models used by “climate scientists” to predict climate assume CO2 is the driver of temperature increases, but the observed temperatures are consistently lower than what the models predicted. Increases in CO2 levels have not caused anywhere near the amount of warming that was predicted. The September 2017 edition of the prestigious journal Nature Geoscience contained an admission that, as the London Times put it, “We Were Wrong, Climate Scientists Concede.” According to the Times, “The world has warmed more slowly than had been predicted by computer models, which were ‘on the hot side’ and overstated the impact of emissions on average temperature, research has found.”

It doesn’t seem very logical to assume that a weak greenhouse gas (CO2) is the main driver of climate change. What, then, are some other ingredients in this complex stew we call “climate”?

Here Comes the Sun

Suggest to climate alarmists that the Sun may in fact be a primary driver of climate change, and you will likely get the dismissive retort: “We’ve already considered that and found it insignificant.” Actually, they are right, in a way. In a universe where there are variable stars that pulse at thousands of times per second, our Sun is a model of constancy. Solar output varies by only about 0.15 percent. This small change in intensity is indeed too small to account for climatic changes — at least directly. But our Sun has other ways to shed its light on climate.
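
To put that 0.15 percent in perspective, a quick back-of-the-envelope calculation (assuming a commonly cited average solar irradiance of about 1,361 watts per square meter and a planetary albedo of roughly 0.3, neither of which appears in the article) shows how small the direct change in absorbed sunlight is:

```python
# Back-of-the-envelope estimate of the direct change in absorbed sunlight
# from a 0.15 percent swing in solar output. Assumed round numbers (mine,
# not the article's): total solar irradiance ~1361 W/m^2, albedo ~0.3.

TSI = 1361.0        # W/m^2 at the top of the atmosphere
ALBEDO = 0.3        # fraction of sunlight reflected back to space
VARIATION = 0.0015  # the 0.15 percent variation cited above

delta_tsi = TSI * VARIATION                  # change at the top of the atmosphere
absorbed = delta_tsi / 4.0 * (1.0 - ALBEDO)  # averaged over the sphere, minus reflection

print(f"Change in irradiance: {delta_tsi:.2f} W/m^2")
print(f"Globally averaged absorbed change: {absorbed:.2f} W/m^2")
```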

Roughly every 11 years, the Sun’s magnetic polarity reverses, and the Sun goes through a cycle of increasing and decreasing sunspot activity, solar flares, and solar radiation. These solar cycles are numbered, with the 1755-1766 cycle labeled Cycle 1; we are currently in Cycle 24. Throughout history there have also been longer stretches in which a series of 11-year cycles sees a relatively “inactive” Sun, with few sunspots even at the peaks of those cycles. It is during these times that the Earth experiences a cooler climate. During periods of high solar activity, with many sunspots, average global temperatures tend to be warmer.

One such inactive stretch ran from the early 1600s to approximately 1715, during which astronomers recorded a phenomenal dearth of sunspots. Research in the 1800s by German astronomer Gustav Spörer revealed that during a 28-year span (1672-1699) fewer than 50 sunspots were recorded. By comparison, we would typically see 40,000 to 50,000 sunspots over a similar period today. Spörer was a contemporary of British astronomer Walter Maunder, who identified the period from 1645 to 1715 now known as the Maunder Minimum. We know little about the solar cycles preceding that period, since such cycles had not yet been recognized by observers, but some reconstructions indicate that the Maunder Minimum followed several periods of decreasing activity. Incidentally, this trend of decreasing solar activity, “bottoming out” with the Maunder Minimum, fits very neatly with the aforementioned Little Ice Age. The Dalton Minimum (circa 1790-1830), named after English meteorologist John Dalton, occurred as temperatures recovered from the Little Ice Age, and appears to have slowed the return to earlier norms.

There is no lack of researchers presenting studies on the similarity between today’s waning solar magnetic flux and periods of low sunspot activity such as those that accompanied the Little Ice Age. Since the 1970s, Valentina Zharkova, a mathematics professor at Great Britain’s Northumbria University, has been studying solar cycles. In 2015 she presented to the National Astronomy Meeting in Wales a new model of solar cycles, which she and her colleagues report reproduces past cycles with remarkable accuracy. According to the Royal Astronomical Society, “Predictions from the model suggest that solar activity will fall by 60 percent during the 2030s to conditions last seen during the ‘mini ice age’ that began in 1645.” In fact, the decline in solar activity from Cycle 22 to the current Cycle 24 is typical of conditions leading into a “global cooling” phase.

This article appears in the January 22, 2018, issue of The New American.