Category Archives: AGW

Make The Climate Change Radicals Walk The Walk, Not Just Talk The Talk


Glenn Reynolds, a University of Tennessee law professor who frequently contributes opinion columns to USA Today, posted "Ban AC for DC" there, with the subtitle "If our rulers think global warming is a crisis, let them be a good example for the rest of us."


Reynolds says:

“In this, I’m inspired by Rep. Lamar Smith, R-Tex., who noticed something peculiar recently. It seems that EPA Administrator Gina McCarthy, who spends a lot of time telling Americans that they need to drive less, fly less, and in general reduce their consumption of fossil fuels, also flies home to see her family in Boston “almost every weekend“; the head of the Clean Air Division, Janet McCabe, does the same, but she heads to Indianapolis. In air mileage alone, the Daily Caller News Foundation estimates that McCarthy surpasses the carbon footprint of an ordinary American.

Smith has introduced a bill that wouldn’t target the EPA honchos’ personal travel, though: It provides, simply, that “None of the funds made available by this Act may be used to pay the cost of any officer or employee of the Environmental Protection Agency for official travel by airplane.”

This makes sense to me. We’re constantly told by the administration that “climate change” is a bigger threat than terrorism.  And as even President Obama has noted, there’s a great power in setting an example: “We can’t drive our SUVs and eat as much as we want and keep our homes on 72 degrees at all times … and then just expect that other countries are going to say OK.”

Reynolds thinks expanding Representative Smith's proposed legislation would be useful, as he notes in the following:

  1. Extend Smith’s bill to cover the entire federal government. We have Skype now, and Facetime. There’s no reason to fly to meetings. I’d let the President keep Air Force One for official travel, but subject to a requirement that absolutely no campaign activity or fundraisers take place on any trips in which the president travels officially.
  2. Obama makes a great point about setting the thermostat at 72 degrees. We should ban air conditioning in federal buildings. We won two world wars without air conditioning our federal employees. Nothing in their performance over the last 50 or 60 years suggests that A/C has improved things. Besides, The Washington Post informs us that A/C is sexist, and that Europeans think it’s stupid.
  3. In fact, we should probably ban air conditioning in the entire District of Columbia, to ensure that members of Congress, etc. won’t congregate in lobbyists’ air-conditioned offices.
  4. Speaking of which, members of Congress shouldn’t be allowed to fly home on the weekends. Not only does this produce halfhearted attention to their jobs — the so-called “Tuesday to Thursday Club” — but, again, it produces too much of a carbon footprint. Even if they pay for the travel out of campaign funds, instead of their own budgets, they need to set an example for the rest of us — and for those skeptical foreigners that Obama mentioned.

Reynolds takes a swipe at Leonardo DiCaprio as well. And what about Michelle Obama's vacations!

The full posting is a good read (somewhat tongue-in-cheek in parts).

Do you think the warmers really believe in this catastrophic global warming stuff? It does not look like it. I think it demonstrates that they are using the issue to increase the size of the government through regulations (and thus their power). That was the motivation of the founders of this movement.

cbdakota

Barbers In Danger Of Being Replaced By Computerization


I am preparing a blog on some interesting speculation and study on the effect that robots and computers will have on employment. A study by Frey and Osborne titled "The Future of Employment: How Susceptible Are Jobs to Computerisation?" evaluates some 700+ job categories as to the likelihood they will be replaced by computers and/or robots. The authors have a table that ranks occupations according to their probability of being eliminated by computerization. For example, in the no. 4 spot as one of the least likely to be computerized is Mental Health and Substance Abuse Social Workers, with a probability of 0.0031. Chemical Engineers are ranked 77 with a probability of 0.017. Jobs that are most certain to be replaced by computerization are those at the end of the authors' ranking. For example, Tax Preparers are ranked 695 with a probability of 0.99, which is as close to certain as one can get.
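
The structure of the authors' table can be sketched as a small sorted list. The four occupations and probabilities below are just the figures quoted above; the code itself is purely an illustration of the ranking, not anything taken from the study:

```python
# The four (rank, occupation, probability) rows quoted above from
# Frey & Osborne's table; the list itself is only an illustration.
occupations = [
    (4,   "Mental Health and Substance Abuse Social Workers", 0.0031),
    (77,  "Chemical Engineers",                               0.017),
    (439, "Barbers",                                          0.80),
    (695, "Tax Preparers",                                    0.99),
]

# Sort from least to most likely to be computerized, as the study's table does.
by_risk = sorted(occupations, key=lambda row: row[2])
for rank, name, prob in by_risk:
    print(f"{rank:4d}  {name:<50s}  {prob:.4f}")
```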

What caught my eye were Barbers, ranked 439 with a probability of 0.80, which puts them well into the high-risk range. So I thought I would look into this a little further, because my local barber is a friend. I found the following research underway that may well one day replace him.

cbdakota

Hysteria and DDT


The Pacific Research Institute has released a video entitled "Hysteria's History: Why is Alarmism so Dangerous?-Part 4". The video's purpose is "to expose people to the historical progression of environmental alarmism that has often resulted in poor and contradictory policy proposals." This video discusses the ban on DDT. DDT had been very successful in nearly eradicating malaria around the world. Following the ban, malaria deaths skyrocketed.

Spraying DDT indoors, where it can keep mosquitoes in check with no effect on the local wildlife, is a responsible action. But it took skeptics pointing this out over and over again before the green movement reined in their efforts to keep DDT banned.

Hopefully we skeptics, by showing that the catastrophic global warming movement is more hysteria than facts, will eventually cause the greens to concede in this case as well.

cbdakota

UK Scientist Doubts Decarbonization by 2050 Is Possible. Thinks Other Unfunded Threats Are More Compelling.


M J Kelly of the Electrical Engineering Division, Department of Engineering, University of Cambridge, has written "Lessons from Technology Development for Energy and Sustainability", posted on Cambridge Journals Online.

The following is the Abstract from his posting where he sets up the quandary that faces the organizations wishing to decarbonize the planet by 2050.

There are lessons from recent history of technology introductions which should not be forgotten when considering alternative energy technologies for carbon dioxide emission reductions.

The growth of the ecological footprint of a human population about to increase from 7B now to 9B in 2050 raises serious concerns about how to live both more efficiently and with less permanent impacts on the finite world. One present focus is the future of our climate, where the level of concern has prompted actions across the world in mitigation of the emissions of CO2. An examination of successful and failed introductions of technology over the last 200 years generates several lessons that should be kept in mind as we proceed to 80% decarbonize the world economy by 2050. I will argue that all the actions taken together until now to reduce our emissions of carbon dioxide will not achieve a serious reduction, and in some cases, they will actually make matters worse. In practice, the scale and the different specific engineering challenges of the decarbonization project are without precedent in human history. This means that any new technology introductions need to be able to meet the huge implied capabilities. An altogether more sophisticated public debate is urgently needed on appropriate actions that (i) considers the full range of threats to humanity, and (ii) weighs more carefully both the upsides and downsides of taking any action, and of not taking that action.

 

M J Kelly discusses this issue at length in his posting and I suggest you read it in its entirety. This posting will look at the conclusions and some suggestions Kelly derives from examining the current programs to reduce CO2. He is not optimistic that decarbonization has much of a chance of accomplishing what the greens want. In fact he thinks the money could be better spent addressing more immediate threats than those posed by the so-called catastrophic global warming. Here he summarizes his thoughts:

It is surely time to review the current direction of the decarbonization project which can be assumed to start in about 1990, the reference point from which carbon dioxide emission reductions are measured. No serious inroads have been made into the lion’s share of energy that is fossil fuel based. Some moves represent total madness. The closure of all but one of the aluminium smelters that used gas-fired electricity in the UK (because of rising electricity costs from the green tariffs that are over and above any global background fossil fuel energy costs) reduces our nation’s carbon dioxide emissions. 62 However, the aluminium is now imported from China where it is made with more primitive coal-based sources of energy, making the global problem of emissions worse! While the UK prides itself in reducing indigenous carbon dioxide emissions by 20% since 1990, the attribution of carbon emissions by end use shows a 20% increase over the same period.

Interestingly, he talks about the UK exporting manufacturing to other nations in order to reduce CO2 emissions. The goods then come back to the UK from less efficient factories, and the attributed CO2 results in an increase in the UK's net emissions.

It is also clear that we must de-risk all energy infrastructure projects over the  next two decades. While the level of uncertainty remains high, the ‘insurance policy’ justification of urgent large-scale intervention is untenable, and we do not pay premiums if we would go bankrupt as a consequence. Certain things we do not insure against, such as a potential future mega-tsunami, 64 or a supervolcano, 65 or indeed a meteor strike, even though there have been over 20 of these since 2000 with the local power of the Hiroshima bomb! 66 Using a significant fraction of the global GDP to possibly capture the benefits of a possibly less troublesome future climate leaves more urgent actions not undertaken.

Two important points remain. The first is that there is no alternative to business as usual carrying on, with one caveat expressed in the following paragraph. Since energy use has a cost, it is normal business practice to minimize energy use, by increasing energy efficiency (see especially the recent improvement in automobile performance), 67 using less resource material and more effective recycling. These drivers have become more intense in recent years, but they were always there for a business trying to remain competitive.

The second is that, over the next two decades, the single place where the greatest impact on carbon dioxide emissions can be achieved is in the area of personal behaviour. Its potential dwarfs that of new technology interventions. Within the EU over the last 40 years there has been a notable change in public attitudes and behaviour in such diverse arenas as drinking and driving, smoking in public confined spaces, and driving without a seatbelt. If society’s attitude to the profligate consumption of any materials and resources including any forms of fuel and electricity was to regard this as deeply antisocial, it has been estimated we could live something like our present standard of living on half the energy consumption we use today in the developed world. 68 This would mean fewer miles travelled, fewer material possessions, shorter supply chains, and less use of the internet. While there is no public appetite to follow this path, the short term technology fix path is no panacea.

Over the last 200 years, fossil fuels have provided the route out of grinding poverty for many people in the world (but still less than half of all people) and Fig. 1 shows that this trend is certain to continue for at least the next 20 years based on the technologies of scale that are available today. A rapid decarbonization is simply impossible over the next 20 years unless the trend of a growing number who succeed to improve their lot is stalled by rich and middle class people downgrading their own standard of living. The current backlash against subsidies for renewable energy systems in the UK, EU and USA is a sign that all is not well with current renewable energy systems in meeting the aspirations of humanity.

Figure 1. (a) The 40% growth of global energy consumption since 1995 and the projected 40% growth until 2035, with most of the growth between 1995 and 2035 being provided by fossil fuels,21 and (b) the cause of this growth is the rise in the number of people living in the middle class, as described in the text.22

 

Finally, humanity is owed a serious investigation of how we have gone so far with the decarbonization project without a serious challenge in terms of engineering reality. Have the engineers been supine and lacking in courage to challenge the orthodoxy? Or have their warnings been too gentle and dismissed or not heard? Science and politicians can take too much comfort from undoubted engineering successes over the last 200 years. When the sums at stake are on the scale of 1–10% of the world’s GDP, this is a serious business.

cbdakota

*M.J. Kelly (2016). Lessons from technology development for energy and sustainability. MRS Energy & Sustainability, 3, E3, doi:10.1557/mre.2016.3.

Making It Criminal To Be A Skeptic—The First Amendment Is Under Siege


Senator Whitehouse (D-RI) is calling for RICO investigations of skeptics and fossil fuel companies. California legislators wrote a bill allowing for the prosecution of climate change dissent; fortunately it died this past Thursday. Seventeen state attorneys general are investigating Exxon. Calls to silence skeptical views are becoming more frequent. A number of major US newspapers are prohibiting discussion of skeptical views. This theme parallels the Social Justice Warriors' efforts to impose their view of politically correct, and thus allowable, speech. The First Amendment to the Constitution is under siege by the media and the government itself. The Amendment was designed to prevent the government from squashing dissenting views, and it is often considered the media's first line of defense against government crackdowns such as are common in socialist, communist and dictatorial governments (e.g. Venezuela, China and Iran).

From an earlier Climate Change Sanity blog:

"Climate science acts like it is fighting a holy war. There are only those who are just and those who must be silenced and stopped at all costs. Anyone who mounts reasonable logical, empirical, or skeptical challenges to the orthodoxy must be ruined, not by counterfactual evidence, but by vicious attack."

Obviously the warmers are not winning the hearts and minds of free people. One reason for this is that the disinformation primarily comes from the warmers. The predictions of catastrophe are many and they have not come true. And you do not need to be a climate scientist to understand how the warmers continue to get it wrong. The mainstream media is complicit in the distribution of this disinformation.

 

Look at these postings where you can get some idea of how poor their predictions are:

CAGW Predictions –Zombie And Others

Quotes from the Founders Of the Global Warming Movement

More Green Predictions Are Way Off Base

5 IPCC Assessments Don’t Show Correlation Of Temperature and Severe Weather

How Reliable Are Climate Models?

And some stories of manipulation of data to get the results they want:

Can We Trust the EPA Secret Science

Doctor Brown and Temperature Tampering

Research Papers Show IPCC Climate Sensitivities Are Too High

And Bjorn Lomborg shows how just a fraction of the money wasted on these erroneous green studies could really make a difference in people's lives:

Bjorn Lomborg Says Global Warming Poor Place to Spend Money

These are just a few of the postings on Climate Change Sanity that show why you need to be a skeptic.

And please contact your legislators and tell them to protect the public from those who want to take away our First Amendment rights.

cbdakota

Nuclear Energy Is The Energy Source Of The Future–So Why Is It Dying Now?


A posting by Michael Shellenberger, "Clean Energy is on the Decline — Here's Why, and What We Can Do About It", discusses the demise of nuclear power plants. He notes that while low natural gas prices have undercut the economics of nuclear plants, the real problem they face is bias against nukes. Many state regulations refuse to class nukes as "renewable" energy, so they are not subsidized as solar and wind energy are. These same state regulations require that a mix of solar and wind generated energy be part of the mix sold by utilities, but they specifically exclude nuclear power from the required mix. Why, he asks, does nuclear, an energy source that emits no carbon dioxide (CO2), get excluded? Further, nukes are base-load plants, meaning that once put on-line they produce power whether or not the sun shines or the wind blows. As an added benefit, nukes produce enormous amounts of power while occupying very little space.

Shellenberger says:

“Consider that in the U.S., utilities have either closed or announced premature closures of seven plants in three years. At least eight more are at risk of early closure in the next two years. In 2011, Germany announced it would close all of its nuclear plants. Swedish utility Vattenfall announced late last year that it would be forced to close several reactors prematurely.”

The irony of this, in Germany for example, is that the nukes are being replaced by brown coal fueled power plants. Brown coal is probably the biggest emitter of CO2 per kWh of any conventional power source.

“Everywhere the underlying reason is the same: anti-nuclear forces, in tandem with rent-seeking economic interests, have captured government policies. On one extreme lies Germany, which decided to speed up the closure of its nuclear plants following Fukushima. In Sweden the government imposed a special tax on nuclear. In the U.S., solar and wind receive 140 and 17 times higher levels of subsidy than nuclear. And states across the nation have enacted Renewable Portfolio Standards, RPS, that mandate rising wind and solar, and that exclude nuclear.”

Continue reading

Indian, Japanese and Chinese Scientists Publish Research That Predicts Little Ice Age or Maunder Minimum Coming Soon.


The Times of India posted "Sunspots point to looming 'little ice age'", quoting scientists and astronomers from the Physical Research Laboratory in India and their counterparts in China and Japan, who have fresh evidence that Earth may be heading for another "little ice age" or maybe even another Maunder Minimum.

Their findings are very similar to those of our scientists. They report that:

“….our blazing sun has been eerily turning quiet and growing less active over the last two decades.”

Continue reading

Solar Cycle 24 Activity Report- Mid-May 2016


Solar Cycle 24 activity has peaked and it is on its way to a minimum. Of course the exact date of the minimum and the start of Solar Cycle 25 is not known. Because a normal cycle's life is nominally 11 years, the start of Cycle 25 should be in 2019. The Sunspot number (Wolf number, 30-day average) for April was 38! The chart below shows the rather sharp drop from March's 54.9.

The black line (Ri) is the 30-day average; if you look carefully you will see it is the sum of the green line (Rsouth) and the red line (Rnorth). The dashed blue line is the "official" Sunspot number. It is a smoothed count: a 13-month weighted average of the monthly Sunspot counts divided by 12, in which the oldest and newest monthly counts are given half weight and the other months their full count. It is always 6 months behind the most recent month. This is the way it has been done for many years; hence there is a historical record to use for comparing Solar Cycles.

May is showing an up-tick in the Sunspot count. This up and down, mostly down, will continue for several years.

Activity chart. Note the change in the 30-day Wolf Sunspot number; it's about 50 now.

The final chart is a comparison of Solar Cycle 23 and Solar Cycle 24:

Solar irradiance was said to have dropped more than usual. It will have to fall off for months before it is likely to be an event to consider "very interesting".

cbdakota

All charts by Solen.info/solar

The smoothed count is a 13-month averaged sunspot count, using this Belgian formula:
Rs= (0.5 Rm-6 + Rm-5 + Rm-4 + Rm-3 + Rm-2 + Rm-1 + Rm + Rm+1 + Rm+2 + Rm+3 + Rm+4 + Rm+5 + 0.5 Rm+6 ) / 12

Rs = smoothed monthly sunspot count
Rm = One month’s actual sunspot count

The “-6” through “+6” appended to each Rm is the number of months before or after the month whose smoothed count is being calculated. The beginning and ending months in the formula are only given half the value of the others.
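
As a sanity check, the formula above translates directly into a few lines of Python. The monthly counts in the example are made up purely for illustration:

```python
def smoothed_count(rm, m):
    """13-month smoothed sunspot count centered on month m (Belgian formula).

    rm is a list of monthly mean sunspot counts.  The first and last months
    of the 13-month window get half weight, and the weighted sum is divided
    by 12, exactly as in the formula above.
    """
    window = rm[m - 6 : m + 7]              # months m-6 .. m+6
    return (0.5 * window[0] + sum(window[1:12]) + 0.5 * window[12]) / 12

# Example with made-up monthly counts (illustration only):
monthly = [40, 42, 38, 35, 50, 44, 38, 41, 39, 36, 45, 43, 40]
print(round(smoothed_count(monthly, 6), 1))  # → 40.9
```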

 

Great Lakes’ Weather Becomes Global Warming And Then Reverses


Warmers are eager to see global warming in any weather event that seems to have a negative effect. We have all seen their contradictory claims that global warming causes dangerous warming and dangerous cold, dangerous floods and dangerous droughts, etc.

Another one of their variable weather happenings that was declared proof of dangerous global warming has come unglued. A Breitbart posting, "Great Lakes Go From 'Climate Change-Induced' Low Water Levels To Record Highs In Three Years", illustrates this point:

Between 2010 and 2013, residents of the states surrounding the Great Lakes were told that climate change was permanently altering their environment and that the record low water levels being recorded in the lakes might be the new normal. But now, only three years later, news reports are worried about beach erosion, and about the "implications for the environment and the economy," because the lakes have rebounded to record high levels of water.

This week, throughout the Chicago media landscape, as well as in reports in Michigan and Wisconsin, stories about a loss of swimming areas on public beaches are filling airwaves and newspaper pages. Residents and city officials are warning citizens that water levels in Lake Michigan and the other lakes are so high that the shallow swimming areas have been reduced as the water rises. Reports also express worry over beach erosion and fears that the rising water is a danger to other infrastructure like roads.

In Chicago, for instance, one report notes that water levels have risen a whopping four feet since 2013 and that the new water is "swallowing up beaches."

The Chicago Tribune reports that the northern suburb of Evanston is losing beachfront property. “All our beaches are shrinking,” Evanston parks director Lawrence Hemingway said.  For its part, Chicago’s Fox affiliate worries that the city’s lakeshore bike path is being destroyed by the higher water levels. The Detroit Free Press also noted that the high water is erasing beaches and the water is at highs not seen since the 1990s.

These reports are starkly different from those of the 2011 to 2013 timeframe. Then the news reports were warning that the

"lakes were irreversibly shrinking and that climate change was desolating both commerce and the environment."

In 2013, for instance, Chicago’s Public Television WTTW bemoaned a “dramatic” change in the climate that was warming the lakes, lowering water levels, and threatening to destroy commerce and the environment.

The local PBS story also went national as the PBS Newshour ran stories on the environmental disaster the lakes were experiencing. In 2012 National Geographic sonorously warned that the “climate-related trend” was on the verge of laying waste to the region. Crain’s Detroit was also writing in 2013 that communities living on the edges of the region’s monumental bodies of water were going to have to “adapt” to the new normal of climate change.

Naturally, far left sources were absolutely sure that global warming was drying up the lakes. In 2013 the far left website Think Progress worried its readers with claims that climate change was "damaging" the lakes. The Natural Resources Defense Council even contemplated lawsuits to prevent cities on the lakes from tapping into them as a source of water.

Still, it is amazing to see the difference in coverage. Today, with water levels hitting record highs, news reporters and city officials worry over their loss of beachfront property and not a word is mentioned of climate change. Yet only three years ago the same officials and news reporters were sure that climate change was here to stay and we’d better get used to the shrunken Great Lakes.

What a difference a few years makes.

I will bet that in a year or two, all the stories about low water levels being a definite indicator of catastrophic global warming will be forgotten and the narrative will be that the rising waters are a definite indicator of catastrophic global warming.

cbdakota

Denying The Climate Catastrophe: 5a. Argument For Attributing Past Warming To Man (Warren Meyer Essay)


This posting is a continuation of the Warren Meyer essay debunking the climate catastrophe theory. Here he takes the reader through the warmists' reasoning of why CO2 emitted from fossil fuels will result in a climate catastrophe.

cbdakota

Having established that the Earth has warmed over the past century or so (though with some dispute over how much), we turn to the more interesting — and certainly more difficult — question of finding causes for past warming.  Specifically, for the global warming debate, we would like to know how much of the warming was due to natural variations and how much was man-made.   Obviously this is hard to do, because no one has two thermometers that show the temperature with and without man’s influence.

I like to begin each chapter with the IPCC's official position, but this is a bit hard in this case because they use a lot of soft words rather than exact numbers.  They don't say 0.5 of the 0.8C is due to man, or anything so specific.   They use phrases like "much of the warming" to describe man's effect.  However, it is safe to say that most advocates of catastrophic man-made global warming theory will claim that most or all of the last century's warming is due to man, and that is how we have put it in our framework below:

 

click to enlarge

 

By the way, the "and more" is not a typo — there are a number of folks who will argue that the world would have actually cooled without manmade CO2 and thus manmade CO2 has contributed more than the total measured warming.  This actually turns out to be an important argument, since the totality of past warming is not enough to be consistent with high sensitivity, high feedback warming forecasts.  But we will return to this in part C of this chapter.

Past, Mostly Abandoned Arguments for Attribution to Man

There have been and still are many different approaches to the attributions problem.  In a moment, we will discuss the current preferred approach.  However, it is worth reviewing two other approaches that have mostly been abandoned but which had a lot of currency in the media for some time, in part because both were in Al Gore’s film An Inconvenient Truth.

Before we get into them, I want to take a step back and briefly discuss what is called paleo-climatology, which is essentially the study of past climate before the time when we had measurement instruments and systematic record-keeping for weather.   Because we don’t have direct measurements, say, of the temperature in the year 1352, scientists must look for some alternate measure, called a “proxy,”  that might be correlated with a certain climate variable and thus useful in estimating past climate metrics.   For example, one might look at the width of tree rings, and hypothesize that varying widths in different years might correlate to temperature or precipitation in those years.  Most proxies take advantage of such annual layering, as we have in tree rings.

One such methodology uses ice cores.  Ice in certain places like Antarctica and Greenland is laid down in annual layers.  By taking a core sample, characteristics of the ice can be measured at different layers and matched to approximate years.  CO2 concentrations can actually be measured in air bubbles in the ice, and atmospheric temperatures at the time the ice was laid down can be estimated from certain oxygen isotope ratios in the ice.  The result is that one can plot a chart going back hundreds of thousands of years that estimates atmospheric CO2 and temperature.  Al Gore showed this chart in his movie, in a really cool presentation where the chart wrapped around three screens:

click to enlarge

As Gore points out, this looks to be a smoking gun for attribution of temperature changes to CO2.  From this chart, temperature and CO2 concentrations appear to be moving in lockstep.  From this, CO2 doesn’t seem to be a driver of temperatures, it seems to be THE driver, which is why Gore often called it the global thermostat.

But there turned out to be a problem, which is why this analysis no longer is treated as a smoking gun, at least for the attribution issue.  Over time, scientists got better at taking finer and finer cuts of the ice cores, and what they found is that when they looked on a tighter scale, the temperature was rising (in the black spikes of the chart) on average 800 years before the CO2 levels (in red) rose.

This obviously throws a monkey wrench in the causality argument.  Rising CO2 can hardly be the cause of rising temperatures if the CO2 levels are rising after temperatures.

It is now mostly thought that what this chart represents is the liberation of dissolved CO2 from oceans as temperatures rise.  Oceans have a lot of dissolved CO2, and as the oceans get hotter, they will give up some of this CO2 to the atmosphere.
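
The lead-lag analysis described above boils down to sliding one series against the other and seeing which offset lines up best. Here is a minimal sketch of that idea, using synthetic data in which "CO2" is deliberately constructed to trail "temperature" by 8 samples; nothing here is real ice-core data:

```python
import math

def best_lag(temp, co2, max_lag):
    """Return the offset (in samples) at which co2 best correlates with temp.

    A positive result means co2 changes *after* temp, i.e. temperature leads.
    Plain Pearson correlation over the overlapping region at each offset.
    """
    def corr(lag):
        if lag >= 0:
            a, b = temp[: len(temp) - lag], co2[lag:]
        else:
            a, b = temp[-lag:], co2[: len(co2) + lag]
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        da = sum((x - ma) ** 2 for x in a) ** 0.5
        db = sum((y - mb) ** 2 for y in b) ** 0.5
        return num / (da * db) if da and db else 0.0

    return max(range(-max_lag, max_lag + 1), key=corr)

# Synthetic series: "co2" is "temp" delayed by 8 samples.
temp = [math.sin(i / 10) for i in range(200)]
co2 = [0.0] * 8 + temp[:-8]
print(best_lag(temp, co2, 20))  # → 8
```

On real ice-core records the same approach recovers the roughly 800-year lead of temperature over CO2 that the finer core samples revealed.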

The second outdated attribution analysis we will discuss is perhaps the most famous:  The Hockey Stick.  Based on a research paper by Michael Mann when he was still a grad student, it was made famous in Al Gore’s movie as well as numerous other press articles.  It became the poster child, for a few years, of the global warming movement.

So what is it?  Like the ice core chart, it is a proxy analysis attempting to reconstruct temperature history, in this case over the last 1000 years or so.  Mann originally used tree rings, though in later versions he has added other proxies, such as from organic matter laid down in sediment layers.

Before the Mann hockey stick, scientists (and the IPCC) believed the temperature history of the last 1000 years looked something like this:

 

click to enlarge

Generally accepted history had a warm period from about 1100-1300 called the Medieval Warm Period which was warmer than it is today, with a cold period in the 17th and 18th centuries called the “Little Ice Age”.  Temperature increases since the little ice age could in part be thought of as a recovery from this colder period.  Strong anecdotal evidence existed from European sources supporting the existence of both the Medieval Warm Period and the Little Ice Age.  For example, I have taken several history courses on the high Middle Ages and every single professor has described the warm period from 1100-1300 as creating a demographic boom which defined the era (yes, warmth was a good thing back then).  In fact, many will point to the famines in the early 14th century that resulted from the end of this warm period as having weakened the population and set the stage for the Black Death.

However, this sort of natural variation before the age where man burned substantial amounts of fossil fuels created something of a problem for catastrophic man-made global warming theory.  How does one convince the population of catastrophe if current warming is within the limits of natural variation?  Doesn’t this push the default attribution of warming towards natural factors and away from man?

The answer came from Michael Mann (now Dr. Mann, though the original analysis was produced before he finished grad school).  It has been dubbed the hockey stick for its shape:

click to enlarge

The reconstructed temperatures are shown in blue, and gone are the Medieval Warm Period and the Little Ice Age, which Mann argued were local to Europe and not global phenomena.  The story that emerged from this chart is that before industrialization, global temperatures were virtually flat, oscillating within a very narrow band of a few tenths of a degree.  However, since 1900, something entirely new seems to be happening, breaking the historical pattern.  From this chart, it looks like modern man has perhaps changed the climate.  This shape, with the long flat historical trend and the sharp uptick at the end, is why it gets the name “hockey stick.”

Oceans of ink and electrons have been spilled over the last 10+ years around the hockey stick, including numerous published books.  In general, except for a few hard-core paleoclimatologists and perhaps Dr. Mann himself, most folks have moved on from the hockey stick as a useful argument in the attribution debate.  After all, even if the chart is correct, it provides only indirect evidence of the effect of man-made CO2.

Here are a few of the critiques:

  • Note that the real visual impact of the hockey stick comes from the orange data on the far right; the blue data alone doesn't form much of a hockey stick.  But the orange data is from an entirely different source, in fact an entirely different measurement technology: the blue data is from tree rings, and the orange is from thermometers.  Dr. Mann bristles at the accusation that he “grafted” one data set onto the other, but by drawing the chart this way, that is exactly what he did, at least visually.  Why does this matter?  Well, we have to be very careful with inflections in data that occur exactly at the point where we change measurement technologies; otherwise we are left with the suspicion that the change in slope is due to differences in the measurement technology rather than in the underlying phenomenon being measured.
  • In fact, well after this chart was published, we discovered that Mann and others like Keith Briffa actually truncated the tree-ring temperature reconstructions (the blue line) early.  Note that the blue data ends around 1950.  Why?  Well, it turns out that many tree-ring reconstructions showed temperatures declining after 1950.  Does this mean that thermometers were wrong?  No, but it does provide good evidence that the trees are not accurately tracking current temperature increases, and so probably did not accurately portray temperatures in the past.
  • If one looks at the graphs of all of Mann’s individual proxy series that are averaged into this chart, astonishingly few actually look like hockey sticks.  So how do they average into one?  McIntyre and McKitrick in 2005 showed that Mann used some highly unusual and unprecedented-to-all-but-himself statistical methods that could create hockey sticks out of thin air.  The duo fed random data into Mann’s algorithm and got hockey sticks.
  • At the end of the day, most of the hockey stick (again due to Mann’s averaging methods) was due to samples from just a handful of bristlecone pine trees in one spot in California, trees whose growth is likely driven by a number of non-temperature factors like precipitation levels and atmospheric CO2 fertilization.   Without these few trees, most of the hockey stick disappears.  In later years he added in non-tree-ring series, but the results still often relied on just a few series, including the Tiljander sediments, where Mann essentially flipped the data upside down to get the results he wanted.  Taking out the bristlecone pines and the abused Tiljander series made the hockey stick go away again.

There have been plenty of other efforts at proxy series that continue to show the Medieval Warm Period and Little Ice Age as we know them from the historical record:

click to enlarge
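The McIntyre and McKitrick critique described above, that short-centered principal components can mint hockey sticks out of trendless noise, can be sketched in a few lines.  This is a toy illustration, not Mann's actual algorithm: the series count, series length, calibration-window length, and AR(1) coefficient below are all assumptions chosen for demonstration.

```python
import numpy as np

# Toy sketch of the "short centering" critique; all parameters are
# illustrative assumptions, not the values from the original study.
rng = np.random.default_rng(42)
n_series, n_years, calib = 70, 581, 79   # proxies, years, calibration window

# Trendless red noise (AR(1)) stand-ins for tree-ring proxy series
proxies = np.zeros((n_series, n_years))
for t in range(1, n_years):
    proxies[:, t] = 0.9 * proxies[:, t - 1] + rng.normal(size=n_series)

# "Short centering": subtract the mean of only the final calibration
# window, rather than the full-series mean a conventional PCA would use
centered = proxies - proxies[:, -calib:].mean(axis=1, keepdims=True)

# Leading principal component in time; series whose random drift sets
# them apart from their calibration-period mean dominate this component
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = vt[0]
```

In McIntyre and McKitrick's experiments, repeating this kind of procedure over many random draws routinely produced hockey-stick-shaped leading components even though the inputs contained no signal at all.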

As an aside, Mann’s hockey stick was always problematic for supporters of catastrophic man-made global warming theory for another reason.  The hockey stick implies that the world’s temperatures are, in the absence of man, almost dead-flat stable.   But this is hardly consistent with the basic hypothesis, discussed earlier, that the climate is dominated by strong positive feedbacks that take small temperature variations and multiply them many times.   If Mann’s hockey stick is correct, it could also be taken as evidence against the high climate sensitivities demanded by the catastrophe theory.
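The tension described above follows directly from the standard feedback arithmetic: a no-feedback temperature response is multiplied by 1 / (1 - f), where f is the net feedback fraction.  The function below is a generic sketch of that relationship; the example numbers are chosen purely for illustration.

```python
def amplified_response(delta_t0, f):
    """Standard feedback multiplier: total response = base / (1 - f).

    delta_t0 -- temperature response before feedbacks, in degrees C
    f        -- net feedback fraction (0 < f < 1 is net positive)
    """
    return delta_t0 / (1.0 - f)

# With the strong positive feedback catastrophe theory requires
# (f around 0.67, an illustrative value), a 1C natural wiggle becomes
# roughly a 3C swing, which is hard to square with a temperature
# history as flat as the hockey stick's shaft.
high_feedback = amplified_response(1.0, 0.67)  # about 3.0
no_feedback = amplified_response(1.0, 0.0)     # exactly 1.0
```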

 

The Current Lead Argument for Attribution of Past Warming to Man

So we are still left wondering: how do climate scientists attribute past warming to man?  Well, to begin, they tend to focus on the period after 1940, when large-scale fossil fuel combustion really began in earnest.   Temperatures have risen since 1940, but in fact nearly all of this rise occurred in the 20-year period from 1978 to 1998:

click to enlarge

To be fair, and better understand the thinking at the time, let’s put ourselves in the shoes of scientists around the turn of the century and throw out what we know happened after that date.  Scientists then would have been looking at this picture:

click to enlarge

Sitting in the year 2000, the recent warming rate might have looked dire: nearly 2C per century…

click to enlarge

Or possibly worse if we were on an accelerating course…

click to enlarge
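The "nearly 2C per century" figure is just a linear trend extrapolated by a factor of five, from 20 years out to 100.  With made-up anomaly values standing in for the actual series in the charts above, the arithmetic looks like this:

```python
import numpy as np

# Synthetic stand-in: a steady 0.02 C/yr rise over 1978-1998
# (invented values, not the actual temperature record)
years = np.arange(1978, 1999)
anomalies = 0.02 * (years - 1978)

# Ordinary least-squares linear trend, then scale to a century
slope_per_year = np.polyfit(years, anomalies, 1)[0]
trend_per_century = slope_per_year * 100.0  # 2.0 C per century
```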

 

Scientists began to develop a hypothesis that this temperature rise was occurring too rapidly to be natural, that it had to be at least partially man-made.  I have always thought this a slightly odd conclusion, since the slope over this 20-year period looks almost identical to the slope centered around the 1930s, which was very unlikely to have had much human influence.

click to enlarge

Nevertheless, the hypothesis that the 1978-1998 temperature rise was too fast to be natural gained great currency.  But how does one prove it?

What scientists did was to build computer models to simulate the climate.  They then ran these models twice.  The first time they ran them with only natural factors, or at least only the natural factors they knew about or were able to model (they left a lot out, but we will get to that in time).  These models were not able to produce the 1978-1998 warming rates.  Then, they re-ran the models with man-made CO2, and particularly with a high climate sensitivity to CO2 based on the high feedback assumptions we discussed in an earlier chapter.   With these models, they were able to recreate the 1978-1998 temperature rise.   As Dr. Richard Lindzen of MIT described the process:

What was done, was to take a large number of models that could not reasonably simulate known patterns of natural behavior (such as ENSO, the Pacific Decadal Oscillation, the Atlantic Multidecadal Oscillation), claim that such models nonetheless accurately depicted natural internal climate variability, and use the fact that these models could not replicate the warming episode from the mid seventies through the mid nineties, to argue that forcing was necessary and that the forcing must have been due to man.

Another way to put this argument is: “we can’t think of anything natural that could be causing this warming, so by default it must be man-made.”  With various increases in sophistication, this remains the lead argument in favor of attributing past warming to man.
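The residual logic just described can be caricatured with synthetic numbers: run a "natural-only" model, note the unexplained residual, then add an anthropogenic term tuned to close it.  Everything below is invented for illustration; actual attribution studies use full climate models, not arithmetic like this.

```python
import numpy as np

years = np.arange(1940, 1999)
# Pretend "observed" record: flat to 1978, then 0.02 C/yr (synthetic)
observed = np.where(years < 1978, 0.0, 0.02 * (years - 1978))

# "Natural-only" run: the natural forcings modeled here happen to
# produce no late-century trend at all
natural_only = np.zeros_like(observed)

# "Natural + anthropogenic" run: add a CO2 term scaled by an assumed
# high sensitivity so that it reproduces the 1978-1998 rise
anthropogenic = np.maximum(0.0, 0.02 * (years - 1978))
with_anthro = natural_only + anthropogenic

# Natural-only leaves a large unexplained residual; adding the CO2
# term closes it, which is then read as attribution to man
resid_natural = np.abs(observed - natural_only).max()   # 0.4
resid_anthro = np.abs(observed - with_anthro).max()     # 0.0
```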

In part B of this chapter, we will discuss what natural factors were left out of these models, and I will take my own shot at a simple attribution analysis.