Category Archives: Climate Models

U.S. Chamber of Commerce to Pres. Obama – How to Create Jobs


The U.S. Chamber of Commerce sent a letter to President Obama and Congress on creating jobs. The letter’s purpose is stated as follows:

OPEN LETTER TO CONGRESS AND THE PRESIDENT OF THE UNITED STATES

The most immediate priority facing our nation is to create jobs for the 25 million Americans who are unemployed, underemployed, or have simply given up looking for work.

To create jobs, we must enact policies that promote and sustain stronger economic growth. We must also address extraordinary fiscal and competitive challenges that are smothering growth and driving away jobs. At the same time, there are specific steps Congress and the administration can take right now to spur faster job growth in America’s private sector without adding to the deficit.

The letter has a number of sections. I have picked out one of them that relates to the Climate Change Sanity blog’s theme:

2. PRODUCE MORE AMERICAN ENERGY

Let American energy workers and businesses responsibly develop all sources of domestic energy immediately. This will not only create jobs but will generate new government revenues, protect our energy security, and release us from the grip of some unfriendly governments.

  • Open offshore resources. Almost 190,000 new jobs could be created by 2013 if permitting in the Gulf of Mexico for offshore development returned to pre-moratorium levels. In Alaska, opening up energy production off the coast would create 54,700 jobs.

  • Expand access on federal lands. By expanding oil and gas exploration on federal lands, we could create 530,000 jobs, reduce imports by 44% by 2025, and increase government revenues by $206 billion.

  • Promote development of natural gas. Expanding the development of the nation’s massive shale gas deposits would create hundreds of thousands of jobs and help bring manufacturing back to the United States, especially in the chemicals and steel industries. By 2020, natural gas production in Western Pennsylvania alone could create 116,000 new jobs, generate more than $2 billion in government revenues, and add $20 billion to the region’s economy.

  • Approve the Keystone XL pipeline. Construction of the Keystone XL oil pipeline connecting Canada to U.S. refineries in Texas would support 250,000 jobs, boost investment in the United States by $20 billion, and generate government revenues totaling $585 million.

Well said, and certainly in line with yesterday’s posting; see here.

The other letter sections are as follows and worth reading:

  • Expand Trade and Global Commerce
  • Speed Up Infrastructure Projects
  • Welcome Tourists and Business Visitors to the U.S.
  • Speed Up Permits and Provide Regulatory Certainty and Relief
  • Pass Tax Incentives That Create Jobs While Increasing Revenues

cbdakota


One Billion Motor Vehicles And Peak Oil


Oilprice.com noted that in August, Wards Auto published a story saying that the world motor vehicle count now stands at 1 billion.  The U.S. still has the largest registration, at about 240 million.  In the Oilprice.com blog, the author considers what 1 billion vehicles, and the likelihood of even more being added in the next 25 years, might mean. It is interesting reading.  He seems to favor governmental intervention on the demand side as supplies of fossil fuels tighten (read: Peak Oil).  He says:

It is highly unlikely that there will be anything approaching 240 million registered vehicles in the U.S. 25 years from now. From the vantage point of 2011, it seems probable that many will not be able to afford to own and operate personal motor vehicles of the size and types we have today.

He thinks that the newly mandated CAFE standard is just what we need and that we will have to abandon six-passenger cars and other large vehicles. He says:

  In the U.S. we are now facing standards requiring that cars achieve an average of 54.5 MPG 15 years from now. First will come all sorts of weight reductions, such as eliminating spare tires, and adding more plastic and aluminum parts. Engines will become more efficient and car bodies will become more aerodynamic.  Although these changes will be costly, it does not take much arithmetic to conclude that if energy costs are three or four times higher than they are today then mileage will become the key factor by which motor vehicles are judged.

Detractors of these new mileage standards are usually people who have little grasp of, or prefer not to think about, where real energy costs are going to be 15 years from now. They point out that the cost of the advanced materials required to build low-weight, high-mileage vehicles will be so great that it will push cars beyond what many, if not most, can afford.
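The quoted claim that “it does not take much arithmetic” to see mileage becoming the deciding factor is easy to check. Here is a back-of-the-envelope sketch; the annual mileage, fuel price, and 25 MPG baseline are my own assumed figures, not numbers from the article.

```python
# Back-of-the-envelope fuel-cost sketch. MILES_PER_YEAR, CURRENT_PRICE and
# MPG_TODAY are assumptions for illustration; only 54.5 MPG comes from the text.
MILES_PER_YEAR = 12_000
CURRENT_PRICE = 3.50   # $/gallon, assumed
MPG_TODAY = 25.0       # assumed fleet-average mileage today
MPG_CAFE = 54.5        # the mandated CAFE target cited in the quote

def annual_fuel_cost(miles, mpg, price_per_gallon):
    """Gallons burned per year times price per gallon."""
    return miles / mpg * price_per_gallon

# If energy costs become "three or four times higher", as the quote posits:
for multiple in (3, 4):
    old = annual_fuel_cost(MILES_PER_YEAR, MPG_TODAY, CURRENT_PRICE * multiple)
    new = annual_fuel_cost(MILES_PER_YEAR, MPG_CAFE, CURRENT_PRICE * multiple)
    print(f"{multiple}x fuel price: ${old:,.0f}/yr at 25 MPG vs ${new:,.0f}/yr at 54.5 MPG")
```

At a quadrupled price the assumed driver saves several thousand dollars a year by moving to the 54.5 MPG vehicle, which is the author’s point about mileage dominating purchase decisions.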

Due to governmental interference, the U.S. is facing an artificial Peak Oil problem.  This artificial Peak Oil problem is really part of the Peak Energy problem that governmental interference is causing.  We have a lot of fossil fuels.  The U.S. has the largest reserve of fossil fuels in the world.  It is likely that North America could become energy independent.  Yes, no more propping up Venezuela or other countries that don’t have our best interests in mind.  And what a break for our balance of payments.  Becoming completely energy independent might possibly be the wrong thing to do, because the price of crude oil could fall below our production cost as the U.S. brings on more production capacity.  I don’t want the government to dictate how much crude we should produce or purchase.  Let the market decide whether we produce or buy.

Peak Oil will come sometime, but not in the near future.  What the U.S. is facing is an ideological, artificial Peak Oil problem.  The Obama administration gives money to “renewable fuels” programs and tells us that we must do this to reduce the purchase of foreign crude.  How the government thinks it can do this with renewable fuels is beyond comprehension.  Renewable fuels are now neither economic nor reliable enough to do that.  In fact, the electrical grid people who distribute the nation’s electricity have found it necessary to have fossil-fuel-powered back-up capacity equal to the wind or solar capacity.  The renewables can’t be scheduled, meaning their supply is too erratic to provide steady voltage and current.  The wind slows down or stops, or the sun goes behind clouds, and the former balance of supply and demand goes south. They have to have something as a backup to keep the lights on.  The renewables advocates’ second argument is that fossil fuels should not be used because combustion results in CO2.  The fossil-fuel back-up capacity blows that argument.  See here and here to read about the folly of renewable fuels.

The radical environmentalists fight every attempt to develop our resources.  Oil in Alaska, offshore oil, oil in the Bakken field, nuclear power, low-cost coal, etc.  It doesn’t matter; they are against it.  They use global warming, polar bears, darter fish, left-handed ground squirrels (I guess I made that one up) and one of my favorites, the Houston toad.  According to some reports, only 300 Houston toads remain in the world, and they have been placed on the endangered species list.  “A world without the Houston toad ... is not a world we can physically live in,” says Paul Crump, a reptile and amphibian keeper at the Houston Zoo who works with the small brown toads.  Who knew?  The world is on the way to a collapse. A more dangerous issue than the Osama bin Laden threat, so let’s get the SEAL teams to see nothing bad happens to those warty little buggers. (SARC)

Fracking and the oil pipeline from Canada are the causes du jour for the radical environmental crowd.  It is patently clear that they will only be satisfied when this country is reduced to third-world status.  And our government supports their activities through the EPA and other departments.  God bless Michele Bachmann and her vow to eliminate the EPA if she is elected President.  If she is not, she should be given the job of EPA Administrator.

We will run out of economically recoverable oil some day.  Same for natural gas, iron ore, etc.  But the many forecasts made by experts about when the oil peak would occur have always been vastly overstated.

We quoted the Oilprice.com author saying that in 15 years the price will be three to four times higher than today.  It could happen, but only if we just sit back and let it happen.  For a more realistic assessment of the Peak Oil tipping point, let’s look at what has been said in a WardsAuto.com posting titled “Oil’s Price Always Comes Down.”

Five years ago, I believed in the Peak Oil theory. It postulated that global oil production would peak in 2006, and the following shortage would send prices skyrocketing. Sure enough, in 2008 a barrel of oil shot up to $150.

But less than 12 months later, oil plummeted to less than $40 a barrel. Yes, the price now is back up to $100, but I no longer believe in Peak Oil. Here’s why:

Brazil recently discovered massive oil reserves off its coast that match or beat Saudi Arabia’s. Brazil will start tapping those reserves before this decade is out. In Iraq, infrastructure is being put in place to increase oil production six or seven times greater than today, potentially making it the largest oil producer in the world.

And in the U.S., a new drilling technique called hydraulic fracturing is the mother of all game changers. (My emphasis)  Texas wildcatters figured out a way to easily extract natural gas and oil from shale. Using high-pressure water and sand, they fracture the shale, releasing trapped gas. As a result, the U.S. has added 100 years of natural gas use (at current rates), and the price of natural gas has fallen to nearly half of its 2008 peak.

Hydraulic fracturing, or fracking as it’s also called, is controversial. Some environmentalists have seized on it as the next great danger to the planet. A documentary called “Gasland” probably will win an Academy Award for hysterically pointing out the dangers of fracking.

Of course, “Gasland” approaches its topic with the impartiality and evenhandedness of pseudo-documentaries such as “Roger and Me” and “Who Killed The Electric Car?” So far, fracking has been done mostly in the U.S., but it soon will spread to the rest of the world. (My emphasis) Before this decade is out, we are going to see vast increases in the amount of oil and natural gas available. And this will have enormous implications for the auto industry and policy planners.

Closing out, it is a good time to call for a lesson from “Minnesotans 4 Global Warming”:

http://www.youtube.com/watch?v=nWiKvNDTjB4&feature=player_embedded

cbdakota

Will There Be Global Famine in 2050?


A report authored by Dr. Craig Idso titled “Estimates of Global Food Production in the Year 2050” asks the question, “Will we produce enough to adequately feed the world?” Idso says that researchers are estimating that global food production must increase by 70 to 100% to adequately feed 9 billion people in 2050.

Idso deals with this question by focusing on the world as a whole and also on subgroups such as Europe, North America, Africa, etc. To do this, he compares forecast crop growth resulting from higher atmospheric CO2 and improved agricultural technology against the increased demand for food resulting from forecast population growth.

The data used by Idso are sourced from the UN’s Food and Agriculture Organization (FAO) to quantify the food crops, the UN’s IPCC Fourth Assessment Report (IPCC FAR) for an estimate of the atmospheric CO2 concentration in 2050, and the UN’s medium variant population projections for the year 2050.  Additionally, he used the Plant Growth Database of CO2 Science to define the effect of increased atmospheric CO2 on plant growth.  Finally, he works out the food production gains that will come from the “techno-intel effect”.  This “effect” is the advancement in agricultural technology and scientific research that expands our knowledge or intelligence base, e.g., the Green Revolution, GM seed work, etc.

Crops

The FAO database lists 169 crops.  Idso uses 45 of those crops in his work, as these 45 account for 95% of world food production.  To provide greater understanding, the table below lists the top 5 crops, which together provide more than 55% of the world’s crop food sources.

Specific Crop     % of Total Production
Sugar Cane        21.240
Maize (corn)      10.283
Rice, paddy        9.441
Wheat              9.372
Potatoes           4.871

Top 5 Crops (FAO Database for World Food Production, 2009)

The specific crops vary in their ranking from subgroup to subgroup.

Population

The UN forecasts a 2050 world population of nominally 9 billion.  Idso adds:

Another concern with respect to future population is whether or not the use of medium variant data from the United Nations is too conservative. Indeed, the medium variant population estimate for the year 2050 has recently been revised upward from 8.9 to 9.2 billion persons. A more realistic estimate of future population may be to use the constant fertility variant, which is weighted more heavily on current population trends and which foresees a global population of 11 billion in 2050.

Atmospheric CO2

Based on the Intergovernmental Panel on Climate Change’s best median estimate of this number (derived from the A1B scenario, ISAMS, in the IPCC’s Fourth Assessment Report, see http://www.ipcc-data.org/ancilliary/tar-isam.txt), we find that we could expect an increase in atmospheric CO2 concentration of 145 ppm between 2009 and 2050.

Enhanced Growth Via Greater Atmospheric CO2 Content

In my last posting I discussed the results of a vast number of trials done to quantify the effects of increased atmospheric CO2 content.  Click here for more detail, but I have lifted a table from that posting which gives the reader a feel for the enhanced growth that results from increased levels of atmospheric CO2.

PLANT         NO. OF STUDIES    DRY WEIGHT INCREASE % (arithmetic mean)
Corn                20                21.3
Rice               188                35.8
Soy Beans          179                46.5
Wheat              235                32.1
Sugar Cane          11                34.0

Effect of Atmospheric CO2 Increased 300 ppm Over Ambient
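The table reports responses to a 300 ppm enrichment, while the IPCC-based estimate quoted above is a 145 ppm rise by 2050. As a rough illustration, the tabled responses can be scaled linearly to 145 ppm; note this simple proportional scaling is my own simplifying assumption, not necessarily the interpolation Idso’s report uses.

```python
# Rough linear scaling of the tabled 300 ppm growth responses down to the
# ~145 ppm CO2 rise forecast for 2009-2050. The linear scaling is an
# illustrative assumption, not Idso's actual method.
RESPONSE_AT_300PPM = {   # % dry-weight increase, from the table above
    "Corn": 21.3,
    "Rice": 35.8,
    "Soy Beans": 46.5,
    "Wheat": 32.1,
    "Sugar Cane": 34.0,
}
FORECAST_RISE_PPM = 145  # IPCC A1B-based estimate cited earlier

scaled = {crop: pct * FORECAST_RISE_PPM / 300
          for crop, pct in RESPONSE_AT_300PPM.items()}
for crop, pct in scaled.items():
    print(f"{crop}: ~{pct:.1f}% growth enhancement by 2050")
```

Under this assumption rice, for example, would see roughly a 17% enhancement from CO2 alone, which is in the neighborhood of the 17% aerial-fertilization contribution Idso assigns later in the report.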


Techno-intel

Providing seeds that can adapt to growing conditions, or seeds imbued with resistance to fungus or insects, is accomplished by techniques such as mutagenesis and genetic engineering.

Wiki says this about Norman Borlaug, considered the Father of the Green Revolution:

During the mid-20th century, Borlaug led the introduction of these high-yielding varieties combined with modern agricultural production techniques to Mexico, Pakistan, and India. As a result, Mexico became a net exporter of wheat by 1963. Between 1965 and 1970, wheat yields nearly doubled in Pakistan and India, greatly improving the food security in those nations.[4] These collective increases in yield have been labeled the Green Revolution, and Borlaug is often credited with saving over a billion people worldwide from starvation.[5] He was awarded the Nobel Peace Prize in 1970 in recognition of his contributions to world peace through increasing food supply.

From Wiki: 

Genetically modified foods (GM foods or GMO foods) are foods derived from genetically modified organisms, (GMOs). Genetically modified organisms have had specific changes introduced into their DNA by genetic engineering techniques. These techniques are much more precise[1] than mutagenesis (mutation breeding) where an organism is exposed to radiation or chemicals to create a non-specific but stable change.

Idso estimates that the likely increase in food production from 2009 to 2050 will be about 51.5%: 34.5% from techno-intel and the remaining 17% from CO2 aerial fertilization. Note that 51.5% is substantially less than the 70 to 100% increase believed by many experts to be needed.
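The arithmetic of the gap is worth laying out explicitly. The sketch below uses only the percentages quoted above (and follows the report’s additive combination of the two contributions):

```python
# Sketch of the gap between Idso's projected supply growth and the growth
# many experts say is needed. All percentages are from the report as
# summarized above; the additive combination mirrors Idso's 34.5 + 17 = 51.5.
techno_intel = 34.5        # % gain from technology/intelligence advances
co2_fertilization = 17.0   # % gain from aerial CO2 fertilization
projected = techno_intel + co2_fertilization

needed_low, needed_high = 70.0, 100.0   # % increase experts estimate is needed
shortfall = (needed_low - projected, needed_high - projected)

print(f"Projected increase: {projected:.1f}%")
print(f"Shortfall vs. expert estimates: {shortfall[0]:.1f} to {shortfall[1]:.1f} percentage points")
```

So even with both effects counted, production falls 18.5 to 48.5 percentage points short of the experts’ range, which is the crux of Idso’s conclusion below.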

Conclusions

Idso tables the results for the world, the regions and the sub-regions.  The calculated food supply has two cases.  Case 1 assumes that the increase in food supply is due only to techno-intel.  Case 2 assumes that the increase is a result of both techno-intel AND CO2 aerial fertilization.

World food supplies in 2050 will not be secure in Case 1, nor in the enhanced Case 2.  Among the regions, Europe has a secure food source in Case 1 as well as Case 2. This can be explained by the expectation that Europe is the only region where the population declines.  Africa, Asia, North America, Oceania and South America do not have secure food supplies in Case 1.  In Case 2, Africa, Asia and Oceania still do not have food security.  North America and South America get a “maybe” in Case 2 as regards their food security.

Idso sums it up this way:

It is clear from the results obtained above that a global food security crisis is indeed looming on the horizon. If population projections and estimates of the amounts of additional food needed to feed the rising population of the planet prove correct, humanity will still fall short of being able to adequately feed the 9.1 billion persons expected to be inhabiting the Earth in the year 2050, even utilizing all yield-enhancing benefits associated with technological and intelligence advancements plus the aerial fertilization effect of Earth’s rising atmospheric CO2 content.

So what can be done to deal with the projected food production shortfall? Based on the results described above, there are only three possible avenues to achieving food security in the future: (1) greater gains must be achieved in the techno-intel sector than presently forecasted, (2) benefits from atmospheric CO2 enrichment must be increased, or (3) world population growth must be slowed to reach a lesser value by 2050.

Abstracting Dr. Idso’s report is a perilous undertaking.  The report is 43 pages, and this posting is about one-tenth that size.  Such reduction can introduce errors or poor assumptions that are not in the full report; those can only be chalked up to me.

Idso’s report indicates that the world will be better served by having a goodly supply of atmospheric CO2 that can provide aerial fertilization of the world’s food crops.  As a skeptic of the CO2 theory of runaway global warming, I can comfortably support the idea that atmospheric CO2 has more benefits than drawbacks.  Further, GM crops are a major benefit.  I like this comment by Borlaug regarding critics of his work:

“some of the environmental lobbyists of the Western nations are the salt of the earth, but many of them are elitists. They’ve never experienced the physical sensation of hunger. They do their lobbying from comfortable office suites in Washington or Brussels…If they lived just one month amid the misery of the developing world, as I have for fifty years, they’d be crying out for tractors and fertilizer and irrigation canals and be outraged that fashionable elitists back home were trying to deny them these things”.[54]

cbdakota

Should You Worry About CO2 in Our Atmosphere?


Should you worry about CO2 in our atmosphere?

Atmospheric carbon dioxide (CO2) is the basis for nearly all life on our planet.  Plants need at least 150 ppm of atmospheric CO2 to grow.  Plants are the source of food for all animals.  There would be no T-bone steaks were it not for plants.

That would seem to answer the question, unless you are one of the radicals who believe that to save the Earth, all humans must die.

But there is more.  Many scientists believe that global famine has been avoided by the increase in atmospheric CO2 from a pre-industrial level of about 270 ppm to the current level of about 390 ppm.  Before examining why scientists think CO2 increases can help avoid famine, let’s look at this VIDEO:

The levels of CO2 used in that video are outside normal considerations.  But much more modest increases in atmospheric CO2 are beneficial, too.  (So you can make the connection between the video and perhaps your own experience: cowpeas are an important food legume crop in the semi-arid tropics of Asia, southern Europe, and Central and South America.  In the southern U.S., cowpeas are called black-eyed peas.)

The CO2 Science Plant Growth Data Base has an impressive compilation of peer-reviewed scientific studies that report the growth responses of plants to atmospheric CO2 enrichment.  Click here to see all the plants studied.

The following table lists a selected group of plants and the dry weight response to a 300ppm CO2 increase over ambient.

PLANT         NO. OF STUDIES    DRY WEIGHT INCREASE % (arithmetic mean)
Corn                20                21.3
Rice               188                35.8
Soy Beans          179                46.5
Wheat              235                32.1
Sugar Cane          11                34.0

Effect of Atmospheric CO2 Increased 300 ppm Over Ambient

The tables also provide response data on photosynthesis (net CO2 exchange rate).

Greater amounts of CO2 not only increase the quantity of growth; the quality of the plant is not significantly altered either.  Some studies have suggested that protein levels are reduced, but other studies have indicated that protein levels are increased.  Other factors, such as ozone (O3), soil nitrogen and sulfur dioxide (SO2), affect the outcome both positively and negatively.  Click here for more discussion of the quality of the plants tested.

It is hard to argue with all this data, and with the common-sense notion that warmer weather, more CO2 and more rainfall will produce bigger crop yields, and that the increase in crop yields will be beneficial in view of the forecast increase in the world’s population.  We all know that it surely will continue to warm, as it is part of a natural cycle.  We need to worry when the cycle reverses and the temperatures begin to drop.  Surely someone is yelling at his computer display right now, shouting about the droughts that are going to occur when man-made global warming really kicks in.  OK, but for every warmer who says the world will become a desert, there is another talking about the vast rainfall that is and will continue to occur.  It is some kind of an unhealthy theory when every weather or climate event, snow, heat, drought, wind, no wind, rising temperatures, dropping temperatures, sea level rise, sea level drop, you name it, is caused by CO2.

More on CO2 and famine in the next blog, where growth enhancement using forecast changes in atmospheric CO2 will be examined.

cbdakota

Solar Cycle 24 Continues to Underperform the Early Projections


Cycle 24’s sunspot count continues to underperform early forecasts.  Chief forecaster David Hathaway, Ph.D., Heliospheric Team Leader at NASA Marshall Space Flight Center, Huntsville, Alabama, has repeatedly revised his forecast of the maximum monthly average sunspot count.  A look at the lowering of the NASA forecast over the years:

Before I pile it on Dr. Hathaway too heavily, most of the experts were as wrong as he was.  One of the few who accurately forecast Cycle 24 was Dr. Leif Svalgaard of the Helioseismic and Magnetic team of Stanford’s Solar Dynamics Observatory (SDO).  Svalgaard predicted 75 as the maximum number in 2004 and has since revised it downward to 72.

Shown below are the recorded Cycle 24 monthly average sunspot numbers through July 2011, versus the current NOAA forecast of 90.

Before you begin to question my grasp of consistency in numbers, please be aware that the sunspot number is measured several different ways.  The folks in this business recognize this and have put together a team to try to bring about uniformity.  One of the more obvious questions is: are we counting more sunspots now because we have much better optics?  The following abstract from this program lays out some of the problems.

The Sunspot number (SSN) record (1610-present) is the primary time sequence of solar and solar-terrestrial physics, with application to studies of the solar dynamo, space weather, and climate change. Contrary to common perception, and despite its importance, the international sunspot number (as well as the alternative widely-used group SSN) series is inhomogeneous and in need of calibration. We trace the evolution of the sunspot record and show that significant discontinuities arose in ~1885 (resulting in a ~50% step in the group SSN) and again when Waldmeier took over from Brunner in 1945 (~20% step in Zürich SSN). We follow Wolf and show how the daily range of geomagnetic activity can be used to maintain the sunspot calibration and use this technique to obtain a revised, homogeneous, and single sunspot series from 1835-2011.

Where do we go from here?

  • Find and digitize as many 19th-century geomagnetic hourly values as possible
  • Determine improved adjustment factors based on the above and on a model of the ionosphere
  • Co-operate with agencies producing sunspot numbers to harmonize their efforts in order to produce an adjusted and accepted sunspot record that can form a firm basis for solar-terrestrial relations, e.g. reconstructions of solar activity important for climate and environmental changes
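The calibration the abstract describes amounts to multiplying segments of the historical series by correction factors to remove the ~1885 (~50%) and 1945 (~20%) steps. Here is a toy sketch of that idea; the data values, the direction of the corrections, and the factors are all made up for illustration and are not the team’s actual calibration.

```python
# Toy homogenization sketch: multiply segments of a sunspot-number series by
# correction factors to remove step discontinuities at known break years.
# Data and factors are illustrative only.
def homogenize(series, breaks):
    """series: {year: ssn}. breaks: [(last_year_of_segment, factor)], sorted
    ascending by year; years after the last break are left unscaled."""
    out = {}
    for year, ssn in series.items():
        factor = 1.0
        for last_year, f in breaks:
            if year <= last_year:
                factor = f
                break
        out[year] = ssn * factor
    return out

raw = {1880: 30.0, 1900: 25.0, 1950: 80.0}   # made-up group-SSN values
# Hypothetical corrections: scale pre-1885 counts by 1.5, 1886-1945 by 1.2.
breaks = [(1885, 1.5), (1945, 1.2)]
print(homogenize(raw, breaks))
```

The real effort, of course, derives its factors from geomagnetic daily-range data rather than assuming them, but the segment-wise rescaling structure is the same.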

To learn more about sunspot counting click here.

As magnetic fields are what drive sunspots, here is a current look at the Sun’s north and south magnetic fields:

Chart courtesy of the Wilcox Solar Observatory

It is obvious that Cycle 24 is different from Cycles 21, 22 and 23 that preceded it.  Cycle 24’s magnetic field strength is much weaker, and its angle of approach to the X axis (the 0 microtesla line) is very much shallower than those of the earlier cycles. Click here, here and here for further discussion of the magnetic field.

How about a look ahead to Cycle 25 and beyond?  I don’t know enough to have much confidence in this forecast by Ed Fix, but David Archibald seems to think it is viable.  Click here for more info:

The green line is the solar cycle record from 1914 to 2010, with alternate cycles reversed. Solar Cycles 19 to 23 are annotated. The red line is the model output, from which the lengths of individual solar cycles in the mid-21st Century can be calculated.

Mr. Fix has Cycle 25 duplicating the current Cycle 24.  Cycles 26 and 27 are about half again as large as 24 and 25. Even so, his forecasts for cycles through 28 are all considerably less active than Cycles 18 through 23.  Does this mean an extended global cooling?

cbdakota

AGW Computer “Fails” Resource


Following several brief comments about another AGW scientist owning up to the weakness of the computer models is a site that lists failed AGW climate computer model projections.  Remember, it is these computer projections upon which the entire rationale for the manmade global warming theory rests.

Kevin Trenberth is Distinguished Senior Scientist in the Climate Analysis Section of the National Center for Atmospheric Research.  Trenberth has been a lead author for IPCC global warming reports. He is also one of the Climategate gang.  In one of the hacked emails he sent to his compatriots, he said:  “The fact is that we can’t account for the lack of warming at the moment, and it is a travesty that we can’t. The CERES data published in the August BAMS 09 supplement on 2008 shows there should be even more warming: but the data are surely wrong. Our observing system is inadequate.”  (My emphasis)  He later explained that what he really meant is that the globe is still heating up but nobody can figure out where the heat is going.  Recently, Dr. Spencer and Dr. Braswell seem to have explained this.  See here for their paper, “On the Misdiagnosis of Climate Feedbacks from Variations in Earth’s Radiant Energy Balance,” by Roy W. Spencer and William D. Braswell.

So we are talking about a major leader in the AGW theory crowd.  He recently published a paper in Climate Research.  In that paper, according to CO2 Science:

…..(he) compares the projections of state-of-the-art climate models with what is known about the real world with respect to extreme meteorological events related to atmospheric moisture, such as precipitation and various types of storm systems, as well as subsequent extreme consequences such as droughts, floods and wind damage. So what does he find?

The C3 blog summarizes that paper as follows:

Specifically, Trenberth takes issue with the climate models’ inadequacies in regards to precipitation. Such as:

  • “…all models contain large errors in precipitation simulations, both in terms of mean fields and their annual cycle, as well as their characteristics: the intensity, frequency, and duration of precipitation…”
  • “…relates to poor depiction of transient tropical disturbances, including easterly waves, Madden-Julian Oscillations, tropical storms, and hurricanes…”
  • “…confidence in model results for changes in extremes is tempered by the large scatter among the extremes in modeling today’s climate, especially in the tropics and subtropics…”
  • “…it appears that many, perhaps all, global climate and numerical weather prediction models and even many high-resolution regional models have a premature onset of convection and overly frequent precipitation with insufficient intensity…”
  • “…model-simulated precipitation occurs prematurely and too often, and with insufficient intensity, resulting in recycling that is too large…”
  • “…a lifetime of moisture in the atmosphere that is too short, which affects runoff and soil moisture…”
  • and finally, he has a NSS moment: “…major challenges remain to improve model simulations of the hydrological cycle.”

OK, we skeptics were pretty sure that was the case.  But remember, that group still wants us to bet the future on their models.

I want to lead you to a treasure trove of AGW computer model “fails”.  If you click HERE you will get a listing of computer model failures.

Here are some of the recent titles:

NASA Research Reveals Antarctica Ice Sheet Melt Just A Fraction of Climate Model Predictions

New Research: Experts Determine German Flooding Has Not Increased From Global Warming As Predicted

IPCC Prediction That Global Warming Would Cause More Wildfires Proves To Be Wrong

Last Week Had The Global Warming Alarmists Admit To Zero Warming Since 1998, Now An Admission That Models Don’t Work

Since 1990, IPCC’s Climate Predictions Have Been Wrong – Billions Wasted On Worthless Fortunetelling

A Spectacular Failure: Latest HadCrut & NASA Temperatures Significantly Below IPCC Climate Model Predictions

Hansen’s Global Climate Model In Total Fail: Predicted Ocean Heat Goes Missing

Look at the other links that take you to more good information.

cbdakota

Skeptics Ahead on Science but Lag on Politics/Media


The Skeptics are winning the science battle but are still running behind in the political /media arena.  What can we do to help?

SCIENCE

Scientists are abandoning the man-made global warming (AGW) theory in increasing numbers.  They are recognizing the obvious: the skeptics’ case rests on observation, whereas the AGWers’ theory rests on computer projections.

Observationally Based Science versus Computer Projections

Amazingly, the AGWers will often say that the facts are wrong because their computers come up with different answers.  This is most recently illustrated by the recent reports on global sea level. From a WattsUpWithThat posting:

A few months ago a widely-publicized article by Houston and Dean was published in the Journal of Coastal Research (and on your site), noting that although sea level is rising, the tide gauge data do not show any increased rate of rise (acceleration) for the 20th and early 21st centuries.  This is augmented by a recent paper authored by an Australian scientist as well.

Houston and Dean (2011) considered only tide-gauge records with lengths greater than 60 years, noting that shorter record lengths are “corrupted” by decadal fluctuations.

Rahmstorf and Vermeer (RV) had previously reported on sea level change using their computer-aided program, which provided results different from those of Houston and Dean.  RV attacked the Houston and Dean paper.  Houston and Dean responded to the RV criticism by saying:

RV link sea-level rise with temperature using a simple linear relationship with two free variables of opposite signs that allow them to “fit” any smooth data set. However, they are curve fitting, not modeling physics, so the approach cannot be used to predict future sea level.

A recent workshop of the Intergovernmental Panel on Climate Change (IPCC, 2010) considered the semi-empirical approaches of Rahmstorf (2007), Vermeer and Rahmstorf (2009), and others and concluded, “No physically-based information is contained in such models …” (p. 2) and “The physical basis for the large estimates from these semi-empirical models is therefore currently lacking” (p. 2). Other recent studies show a slowing or reversal of sea-level rise.
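Houston and Dean's point about curve fitting is easy to demonstrate. The little Python sketch below (using made-up numbers, not actual tide-gauge data) fits a two-free-parameter line to a smooth, decelerating series; the fit hugs the data over the record but says nothing reliable about the future, because no physics constrains it:

```python
import numpy as np

# Hypothetical smooth "observations": a gently decelerating series
# standing in for a smooth sea-level record (made-up numbers, not
# real tide-gauge data).
t = np.linspace(0, 60, 61)            # years of record
obs = 2.0 * t - 0.01 * t**2           # mm of rise, rate slowly decreasing

# A two-free-parameter fit (slope and intercept) hugs any smooth
# series over the fitting window...
slope, intercept = np.polyfit(t, obs, 1)
in_sample_err = np.max(np.abs((slope * t + intercept) - obs))

# ...but extrapolated 50 years past the record it diverges from the
# curve that actually generated the data, because nothing physical
# constrains the fit.
t_future = 110.0
fit_future = slope * t_future + intercept
true_future = 2.0 * t_future - 0.01 * t_future**2
print(in_sample_err, fit_future - true_future)
```

The in-sample error stays within a few millimeters while the extrapolation misses by tens of millimeters, which is the Houston and Dean objection in miniature: fitting the past is not modeling the physics.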

The AGWers Are Getting Desperate

For some 12 years, global temperatures have not shown any discernible upward trend to match the increasing amounts of atmospheric CO2.   At last the AGWers know the reason: it's volcanoes, or it might be emissions from China's coal-based power plants.   Certainly we can be grateful that the computers have resolved (well, sort of) this issue.  So coal-based power plants actually make global temperatures cooler, and all along we have been told just the opposite.

For more information read this link, and if you want a laugh, read this link.

POLITICS AND THE MEDIA

The Need For an Informed Public

For the nearly 10 years that I have been actively involved in discussions and reporting on global warming, I have always believed that the science was on the skeptics' side.   In a status review of global warming prepared for some State Senators in 2003, I stated that being right about the science would probably not be enough to win this struggle with the AGWers.  For example, the taxing and regulating authority that would stem from enacting Cap and Trade legislation will drive the politicians.   I think the beginning of the end of AGW-driven legislation will come when the public begins experiencing the pain of the resulting financial burden.   But are we going to be reduced to third-world status as a nation before we can turn the ship of state around?

How can we avoid this national destruction on the altar of the watermelon (green on the outside, red on the inside) movement?

The first principle should be that the people who will be asked to pay for these green programs are fully informed of the consequences of the regulations or legislation being enacted.  That is not happening now.

LEGISLATION

Let’s remember that the House of Representatives in 2009 passed legislation imposing Cap and Trade on fossil fuel use.  The bill was over a thousand pages long.  The Democrat leadership pushed this massive attempt to bring the nation’s energy under the control of the Government without anyone fully understanding what was in the bill.  The committee chairmen said they did not know!   In an attempt to mollify the unhappy conservatives, they agreed to have the bill read.  So those clowns hired a speed-reader.  I believe a legislative rule should be enforced that no bill can be voted upon until a minimum of a week’s worth of legislative sessions has passed after the proposed law is published, unless a ¾ vote in favor of suspending the rule is passed.   This would not impose a significant burden upon the members.  The objective would be to raise their constituents’ understanding, and the legislators should not be afraid of doing that.    Fortunately, as you know, the Senate failed to pass companion Cap and Trade legislation, and thus it was never enacted.

REGULATION

Regulations for Cap and Trade are being written by the EPA.  Yes, the EPA is writing regulations for legislation that could not get approval in Congress.  Part of the blame for this rests with five of the nine members of the Supreme Court.

  Massachusetts v. Environmental Protection Agency, 549 U.S. 497 (2007), is a U.S. Supreme Court case, decided 5-4, in which twelve states and several cities of the United States brought suit against the United States Environmental Protection Agency (EPA) to force that federal agency to regulate carbon dioxide and other greenhouse gases as pollutants.

Despite the knowledge that this legislation could not get passed in Congress, and despite the fact that CO2 was excluded from the Clean Air Act, the Supremes gave the EPA the authority to determine whether CO2 is a threat and, if so, to write regulations to control it.

The EPA used the 2007 IPCC global warming report as its science basis.  The EPA asked for comments on its study and then ignored any response that said the AGW science was badly flawed.  People within the EPA who expressed doubt were told to be quiet.  The EPA found CO2 to be a threat and began writing regulations.  These regulations are vast and growing.

There is a little irony here in that the environmentalists want all sources of CO2 to be regulated.  The EPA does not want to do that because of the enormity of the ensuing burden.   Every furnace exhausts CO2; every hospital, every mall, almost everything that makes our nation go would have to be monitored and reported.  The regulation overload would quickly result in demands for changes.  In fact, I believe the EPA worries that it would result in legislation taking CO2 out of the Clean Air Act again.

Here again, the straightforward thing would be for your representatives to inform you of what the impact will be.  Congress should limit the damage the Supreme Court and the Executive (EPA) branches do when they usurp the Legislature's prerogatives, by passing legislation that restores the balance of powers.

Any other suggestions?

THE MEDIA

We all value the freedom of the press as guaranteed in the US Constitution. However, the media, by and large, are supportive of BIG GOVERNMENT over more individual freedom and responsibility. So they practice a form of soft censorship themselves by reporting only one side of the story.   One would expect better of them.  Although their domination of “what is fit to print” has been somewhat weakened by the ubiquitous Internet, the media remain the primary source of news and information for most citizens of the US.  If our citizens would watch less American Idol and pay more attention to what the politicians are doing, it would have a salutary effect on their personal well-being and the nation's well-being.

Surely some part of their misguided reporting of climate science is because they are not trained scientifically.  They are apparently too lazy or too intimidated to research the issues.  A science reporter from a newspaper in my area obviously has no curiosity, or no understanding of what a millimeter is.  He reported on the danger of calving Antarctic ice that would raise sea level several millimeters per year.  Recently he did a fairly straightforward report on the transfer of the State Climatologist title from one PhD to another.  The one surrendering the title is a notable skeptic and a frequent co-author of papers with other notables such as Willie Soon and Sallie Baliunas.   At the end of the report about the transfer, the reporter took a cheap shot at the skeptic, saying the skeptic was known to be a member of a group that was part of another group that once received money from Exxon.  If the reporter had any level of curiosity, he could have found out that most of Exxon's grant money goes to groups working on alternative energy.  And if the reporter believes funding by an advocate of a particular position is wrong, why not report on the monies granted by Greenpeace or the World Wildlife Fund to AGW scientists and groups?  Moreover, the grants of money to AGW groups swamp the piddling amount the skeptics receive.  These grants are supplied by governments and NGOs, and they total into the billions.  See these reports for further information about the distribution of monies: here & here & here.

One suggestion is that you keep up with the skeptic blogs like WUWT, Ice Cap, Climate Depot, Heartland, Climate Audit, Science, etc., and, I hope, Climate Change Sanity, and spread the information widely.

Also write to the newspapers.  Tell them when they are off base.  Suggest things they should look into.

If you have some thoughts on all of this, let me know.

cbdakota

Climate Models Not Ready For Prime Time


The preceding posting, Climate Modelers are Wizard of Oz's Spawn, noted that the backcasting used to prove the models was not scientifically viable or honest. I worked in systems operations in manufacturing facilities where solutions to problems were proposed and then tested to see if they worked in the real world.   The technique of backcasting to fit an experience curve has been around for a long time. When the model seemed to match history, the “solution” resulting from that model was employed going forward.  Sometimes it worked and sometimes it did not. In the real world, you have to test, test and retest your premises to assess your confidence in the rightness of the solution.  Proving your solutions is not the standard in global warming climate modeling as far as I can tell.  And in my view the global climate dynamic is vastly greater than any of the problems we were solving in the operating facilities, so the likelihood of obtaining a high degree of certainty is problematic.

Let's look at a summary of a recent posting that lists 10 issues demonstrating that the models are not ready for prime time. It is from The Hockey Schtick blog, where more detail is provided; the full posting can be read by clicking here.

1.            The IPCC admits climate models have not been verified by empirical observations to assess confidence.

2.            The IPCC admits it is not clear which tests are critical for verifying and assessing confidence in the models.

3.            Of 16 identified climate forcings, the IPCC admits only two have a high level of understanding. Most of the others are said to have a low level of understanding.

4.            Even the two identified as having a high level of understanding (greenhouse gases and positive feedback) are actually not well understood, with empirical satellite data showing the sensitivity to a doubling of CO2, with feedback, is only about 0.7°C, a factor of 4 less than in the IPCC climate models.

5.            Climate models falsely assume “back-radiation” from greenhouse gases can heat the oceans. In fact, IR radiation can only penetrate the surface a few microns, with all the energy used in the phase change of evaporation, which in fact cools the oceans.

6.            UV radiation is capable of penetrating the ocean to a depth of several meters, yet the IPCC models ignore UV.

7.            The IPCC is not certain whether clouds have a net cooling or warming effect, even though it has been shown empirically that clouds are many times more important than greenhouse gases.

8.            Ocean oscillations can have huge effects on climate, and these are not incorporated into the models.

9.            The traditional climate models fail to properly reconstruct the correct amplitude of climate oscillations that have a clear solar/astronomical signature.

10.            Climate models continue to greatly exaggerate sensitivity to CO2, by 67%. Despite the climate modelers having admitted this, they are unwilling or unable to tweak the models to match observed temperatures.
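Item 4 is really just arithmetic once you adopt the usual simplification that warming scales with the logarithm of CO2 concentration. The sketch below uses the standard form ΔT = S · log2(C/C0), which is a simplification, not any particular model; the 0.7°C and roughly 3°C sensitivity figures are the ones implied by the list above.

```python
import math

# Projected warming for a rise in CO2 is set almost entirely by the
# assumed sensitivity per doubling. Delta_T = S * log2(C/C0) is the
# standard logarithmic simplification, not any particular GCM.
def warming(sensitivity_per_doubling, c0_ppm, c_ppm):
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

# CO2 rising from 280 ppm to 560 ppm is exactly one doubling:
empirical = warming(0.7, 280, 560)   # satellite-derived figure cited above
ipcc_like = warming(3.0, 280, 560)   # typical model figure
print(empirical, ipcc_like, ipcc_like / empirical)
```

The two sensitivities give about 0.7°C versus 3.0°C for the same doubling, a ratio of a little over 4, which is the "factor of 4" gap the list refers to.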

cbdakota

Climate Modelers are Wizard of Oz’s Spawn


If you look closely, it is not demonstrated science but climate models that are the basis for all the forecasts that catastrophe will result from manmade global warming.  The models, cited by the IPCC in their reports, supposedly demonstrated that the global temperatures recorded from 1978 to 1998 could only have occurred because of additional atmospheric CO2 from the increased use of fossil fuels.  Thus we are to believe that they have modeled the atmosphere so well that when the models look to the future, they must give accurate projections.

But we know that these same models do not forecast worth a damn.  How can it be that models which all showed agreement with the past don't get the future right? And perhaps more importantly, why do the future forecasts not agree with one another?  That mystery is explained by Warren Meyer in his 9 June 2011 posting in Forbes:

When I looked at historic temperature and CO2 levels, it was impossible for me to see how they could be in any way consistent with the high climate sensitivities that were coming out of the IPCC models.  

My skepticism was increased when several skeptics pointed out a problem that should have been obvious.  The ten or twelve IPCC climate models all had very different climate sensitivities — how, if they have different climate sensitivities, do they all nearly exactly model past temperatures?  If each embodies a correct model of the climate, and each has different climate sensitivity, only one (at most) should replicate observed data.  But they all do. 

The answer to this paradox came in a 2007 study by climate modeler Jeffrey Kiehl. To understand his findings, we need to understand a bit of background on aerosols. Aerosols are man-made pollutants, mainly combustion products, which are thought to have the effect of cooling the Earth’s climate.

What Kiehl demonstrated was that these aerosols are likely the answer to my old question about how models with high sensitivities are able to accurately model historic temperatures.  When simulating history, scientists add aerosols to their high-sensitivity models in sufficient quantities to cool them to match historic temperatures.  Then, since such aerosols are much easier to eliminate as combustion products than is CO2, they assume these aerosols go away in the future, allowing their models to produce enormous amounts of future warming.

Specifically, when he looked at the climate models used by the IPCC, Kiehl found they all used very different assumptions for aerosol cooling and, most significantly, he found that each of these varying assumptions were exactly what was required to combine with that model’s unique sensitivity assumptions to reproduce historical temperatures.  In my terminology, aerosol cooling was the plug variable.

The problem, of course, is that matching history is merely a test of the model — the ultimate goal is to accurately model the future, and arbitrarily plugging variable values to match history is merely gaming the test, not improving accuracy.

This is why, when run forward, these models seldom do a very credible job predicting the future.  None, for example, predicted the flattening of temperatures over the last decade.  And when we look at the results of these models, or at least their antecedents, from twenty years ago, they are nothing short of awful.  NASA’s James Hansen famously made a presentation to Congress in 1988 showing his model runs for the future, all of which show 2011 temperatures well above what we actually measure today.

Meyer adds that: “Rather than real science, the climate models are in some sense an elaborate methodology for disguising our uncertainty.  They take guesses at the front-end and spit them out at the back-end with three-decimal precision.  In this sense, the models are closer in function to the light and sound show the Wizard of Oz uses to make himself seem more impressive, and that he uses to hide from the audience his shortcomings.”

So there we have it: the modelers jigger the system with enough adjustable variables that the predetermined assumptions, such as the positive feedback that multiplies the CO2 effect by 3 or 4, can be offset when doing the backcast, and then they drop the jiggering (in this case, aerosols) for the future forecasts.
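Here is a toy sketch of that jiggering (not any actual GCM, just the arithmetic of the plug variable): two models with very different sensitivities each get an aerosol-cooling term sized to reproduce the same observed historic warming, so both backcast perfectly, yet once the aerosols are assumed away their forecasts diverge wildly.

```python
# A toy illustration of the "plug variable" (not any actual GCM):
# each model's aerosol cooling is chosen to be exactly whatever is
# needed for the backcast to match the observed historic warming.
def tuned_model(sensitivity, observed=0.8, historic_doublings=0.5):
    """sensitivity: assumed warming (C) per doubling of CO2."""
    co2_warming = sensitivity * historic_doublings
    aerosol_cooling = co2_warming - observed      # the plug variable
    hindcast = co2_warming - aerosol_cooling      # matches history by design
    projection = sensitivity * 1.0                # one future doubling, aerosols gone
    return hindcast, projection

for s in (1.5, 4.5):
    hindcast, projection = tuned_model(s)
    print(f"sensitivity {s}: hindcast {hindcast:.2f} C, projection {projection:.2f} C")
```

Both models "pass" the backcast test with the identical 0.8°C of historic warming, yet their forward projections differ by a factor of three, which is Kiehl's finding in miniature: matching history this way is gaming the test, not validating the physics.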

cbdakota

Solar Cycle 24-A Game Changer Revisited


On the 14th of June, at the AAS conference in Las Cruces, a group of scientists from the National Solar Observatory (NSO) suggested that the familiar sunspot cycle may be shutting down.   They observed that the spots were fading (weaker), that the current Cycle 24 was showing fewer spots, and that Cycle 25 was behind the normal schedule in its formation.

Sunspots have been recorded for hundreds of years, and they are a very visible proxy for solar activity.  Solar activity is also visible in the numbers and strength of flares and coronal mass ejections (CMEs).  The solar cycle is nominally about 11 years in duration.  It begins with a relatively quiet sun, followed by a ramping up of sunspots and other activity that peaks about halfway through the cycle.   At that point the Sun's north and south magnetic poles “flip,” and sunspots and other activity ramp back down to a relatively quiet Sun.

Dr. Frank Hill of the NSO explains that he and his team are using “helioseismology to measure sun-wide oscillations of the solar surface”.   Sound waves of extremely low frequency that emanate from deep within the Sun induce up-and-down oscillations in the Sun's outer gas layer. Measurements of these surface motions can be used to make maps of solar surface velocity, called Dopplergrams, from which physical conditions such as temperature, composition, and the interior magnetic field can be inferred.  Dr. Hill reported on “a jet-stream-like flow within the sun” that they have been monitoring since 1995 using helioseismology.

The stream, which is coincident with the sunspots, is an east-west zonal flow about 4,000 miles beneath the Sun's surface.   The following figure presented at the Conference is illustrative of what Hill and his team have discovered.

The annotated chart's yellow and red bands trace the solar jet streams.  The black contours denote sunspot activity.   Cycle 24 (the current cycle) streams can be seen beginning about 1998-1999 at about 60° latitude north and south.  These streams converge toward the equator.  At about 22°, sunspot activity begins.  Ultimately the streams reach the equator at a time of solar maximum.  See Cycle 24 and the Butterfly Diagram for more on this.

The stream that began at the 60° latitude splits with part of it going toward the poles and the other part toward the equator.

Note that the Cycle 23 stream heading for the equator was more active when it reached approximately 22° than Cycle 24 is now, and that its angle of approach to the equator was steeper than that currently occurring in Cycle 24.   Dr. Hill reports that it took 3 years for Cycle 24 to cover a ten-degree range that took only 2 years for Cycle 23.   Thus Cycle 24 is “slower” than Cycle 23.

The detection of this magnetic jet stream was first made by instrumentation on the SOHO satellite, launched December 2, 1995, using a Michelson Doppler Imager.  It was succeeded by the Helioseismic and Magnetic Imager (HMI), launched in February 2010 on the Solar Dynamics Observatory satellite.  The HMI is said to be many times more sensitive, and it reports almost continuously.    The unit uses a 16 million-pixel camera configured to show blue where the Sun's oscillations are moving the surface toward the HMI camera and red where they move it away.  The satellite is in orbit about 22,000 miles above the Earth's surface.

Richard Altrock, manager of the Air Force's coronal research program, has observed that the remnants of the magnetic jet stream go poleward as far as 85°, where they die.

Returning to the figure, it can be noted that the Cycle 24 magnetic jet stream was forming in the 1998-2000 timeframe.   Noted on the figure is “Cycle 25??? 2019? 2030?”. Dr. Hill points out that the magnetic jet stream for Cycle 25 should have been forming already, but there is no sign of it yet.  The press release regarding this situation suggests that Cycle 25 will be greatly reduced or may not happen at all.

Later, I will post on the work by Matt Penn and William Livingston that shows a weakening trend in the strength of the sunspots.

So what do we make of this?  Because of the satellite programs underway, primarily in the US and Europe, we are probably doubling our knowledge of the Sun every few years.  But we still don't know much about the Sun.  Reading the postings on this topic leads me to believe that the solar experts are not of one mind on the idea that this means the climate is about to get much cooler.
My bias is to say that we are going to see years of global cooling.  I say that based upon the reconstructed history of the Maunder and other minimums.  The only good thing I believe can come from a period of cooling is to put a stake in the heart of the corrupt science that is the AGW theory.  I am not sure we can say with any certainty that more CO2 in the atmosphere, and perhaps more naturally caused global warming, is a bad thing. Who is to say that 2 or 3 more degrees would be bad?  Only the models, in their ignorance, are sure of this.  But extended cold could cause a lot of starvation.  Let's hope this does not happen.

So, stay tuned.

cbdakota