The Paris Agreement is Failing


Germany and Poland, among others, are not meeting their self-imposed commitments on CO2 emissions reductions. In fact, the Paris Agreement bookkeepers show that almost no one is meeting their commitments. Let's look at the graphic they have developed to show the status:

The chart shows how the key nations and national groups are performing with respect to meeting their self-imposed commitments for CO2 emission reductions. The Paris Agreement objective is to hold the global temperature rise to 1.5°C. These initial commitments are not enough to do that, but they were planned as a start, with the nations and national groups accomplishing further reductions as time passes. That may be problematic if they cannot even meet the "easy to accomplish" initial commitments.

I hope you can read the chart, but just in case you can't, it is constructed as follows:

Across the top are six performance categories:

· Role Model

· 1.5°C Paris Agreement Compatible

· 2°C Compatible

· Insufficient

· Highly Insufficient

· Critically Insufficient

No one has made the Role Model category.

Morocco and The Gambia are 1.5°C Paris Agreement Compatible. Are you beginning to see why I say the Agreement is failing, if only these two inconsequential nations (with respect to emissions) make the grade?

The 2°C Compatible category has Bhutan, Costa Rica, Ethiopia, India, and the Philippines. Well, India is a major emitter, but what they tell the Paris group and where India's leaders appear to be taking the country are very different.

Insufficient category nations are Australia, Brazil, the EU, Kazakhstan, Mexico, New Zealand, Norway, Peru and Switzerland. The EU is a major emitter, but the others are not.  

Highly Insufficient nations are Argentina, Canada, Chile, China, Indonesia, Japan, Singapore, South Africa, South Korea, and the UAE. China is already the world's No. 1 CO2 emitter, and it does not plan to stop increasing its emissions until 2030.

And now for the Critically Insufficient: the Russian Federation, Saudi Arabia, Turkey, the USA, and Ukraine.

The US has reduced its emissions as a result of the ongoing changeover from coal to natural gas. That changeover is driven by economics, not by government edict.

There is another surprise waiting for those who think all is well with the Paris Agreement. Starting next year, a $100 billion fund is to be created by the "developed nations." The money is then available to the less developed nations to accomplish CO2 reductions in their own countries. Each year, another $100 billion is to be deposited into this fund by those same developed nations, perhaps forever.

$10 billion was to be deposited into this fund over the past five years. Then-President Obama chipped in $2 billion. Last I looked, the fund had not reached the expected $10 billion. These countries can't come up with $8 billion in five years. Make a guess whether they are going to contribute another $100 billion to that fund next year. And the year after that, and the year after that.

You might think that the writers of the Paris Agreement wanted to make a joke, to see if anyone would catch it, by making this arrangement: China, the world's largest emitter of CO2 and the second-largest economy in the world, is part of the group that can draw on that fund, not contribute to it. Unfortunately, it is not a joke.

cbdakota

Renewables Are Better At Creating Jobs Than At Creating Energy


The American Experiment blog posted "Energy Industry There to Produce Energy, not Jobs," written by John Phelan. The author begins by quoting Gregg Mast of Clean Energy Economy Minnesota, who boasts about clean-energy jobs growth. Mast says:

“The fact is, the number of clean-energy jobs has grown every year since the release of the first Clean Jobs Midwest-Minnesota report in 2016, and these good-paying jobs have been added at a faster pace than the statewide average.”

 

Countering Gregg Mast's boast, Phelan responds by saying:

“This might sound like great news, but there is something missing from this celebration. It is something vital. Indeed, from an economic point of view, it is the most vital thing of all: How much energy are these workers actually producing?  Increasing productivity — the ratio of outputs produced to inputs used — is key to economic growth and raising living standards”.

So, how productive are these new clean-energy workers? How much energy does each produce? Sadly, the answer seems to be "not much." According to data on electric-power generation by primary energy source from the Energy Information Administration, and figures for employment in each sector from the U.S. Energy and Employment Report, we can see that, in 2017, the 412 workers employed in Minnesota's natural-gas sector produced an average of 16,281 megawatt hours of electricity each. For coal, the figure was 13,230 megawatt hours produced for each of the 1,722 workers employed in the state.

But for renewable wind and solar, the numbers are far less encouraging. In terms of megawatt hours produced per worker, Minnesota’s wind sector came in a somewhat distant third. Each of the 1,966 workers here generated an average of just 5,665 megawatt hours in 2017. This was just 43 percent of the amount of electricity a Minnesota coal worker produced annually and 35 percent of that produced by a natural-gas worker.

For solar, the numbers are even worse. In 2017, each of Minnesota’s 3,800 solar-energy workers produced an average of just 157 megawatt hours. This was just 1.2 percent of the energy produced by a coal worker and only 1 percent of that which a natural-gas worker produced.
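Those per-worker percentages can be checked directly from the figures quoted above. A quick sketch (the megawatt-hour and ratio values are the ones cited in the post):

```python
# 2017 Minnesota figures quoted above: MWh generated per worker, by sector
mwh_per_worker = {
    "natural_gas": 16_281,
    "coal": 13_230,
    "wind": 5_665,
    "solar": 157,
}

# Productivity of the renewable sectors relative to the fossil sectors
for renewable in ("wind", "solar"):
    for fossil in ("coal", "natural_gas"):
        ratio = mwh_per_worker[renewable] / mwh_per_worker[fossil]
        print(f"{renewable} vs {fossil}: {ratio:.1%}")
```

Running this reproduces the ratios in the text: wind at roughly 43% of coal and 35% of natural gas, solar at roughly 1.2% and 1%.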

The chart below illustrates the above:

In terms of that vital ratio of outputs (energy generated) to inputs (number of workers), wind energy is a low-productivity sector compared to natural gas and coal. Solar is even worse. Piling more inputs into these sectors when they could be more productive in other sectors lowers productivity and economic welfare. This is certainly not something to be celebrated — from an economic point of view, at least.

Mast and Clean Energy Economy Minnesota need to remember that the point of an energy industry is to generate energy, not to generate jobs.

A response by supporters of wind and solar is that there are workers out there insulating homes. How many of solar's 3,800 jobs are insulating homes?

cbdakota

A Little Perspective on 2 Degrees Centigrade


The alarmists are telling us that the rise in the global average temperature must not exceed two degrees Centigrade. I think a little perspective is in order. The 2°C is an anomaly, a change measured from a baseline. Here is what it would look like on your thermometer:

cbdakota

Predicting Solar Cycle 25


Prior to the advent of SC 24, there were many predictions of its level of activity. Most of the predictions were for a repeat of SC 23. Leif Svalgaard predicted a major change in its level of activity: about half as active as most of the predictors expected, and we now know that he was right. Svalgaard's method used the solar polar field strength to make the prediction.

I know many of you know all about the solar polar fields, but for those who do not, let me review what the following chart tells us.

The X axis is time, beginning 7 Dec 1976 and extending to 2 Feb 2019, showing part or all of SCs 21, 22, 23, and 24. Solar Cycle 24 began in January 2008 and is forecast to end in late 2019 or early 2020. SC 24 was at maximum activity during April 2014, with a smoothed sunspot number of 111. The maximum typically occurs when the south magnetic field and the north magnetic field reverse positions. In the chart above, the red (south) line crosses the zero field-strength line on the Y axis going south, and the blue (north) line crosses zero going north. These fields continue toward the poles, where they begin producing sunspots at high solar latitudes. The fields then begin to move back toward the zero line, and the new SC 25 will begin.

To make the prediction, one uses the black line, the north field strength minus the south field strength. The time to make the prediction is when the black line is furthest from the zero line. This occurred in the spring of 2004, so SC 24 was predicted to be small. If you look at the black line in, say, January 2018, it is about the same distance from the zero line, so the gurus are saying SC 25 will be about the same size as SC 24.
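As a toy sketch of that procedure, using made-up field values (the real inputs are the polar-field series plotted in the chart, and the actual method involves more than this):

```python
# Toy illustration of the polar-field precursor idea described above.
# north[] and south[] are hypothetical polar field strengths sampled
# near solar minimum; they are NOT real observatory data.
north = [0.4, 0.6, 0.7, 0.8, 0.7]
south = [-0.3, -0.5, -0.6, -0.7, -0.6]

# The predictor is the largest separation of the "black line" (N - S) from zero.
diff = [n - s for n, s in zip(north, south)]
predictor = max(abs(d) for d in diff)

# A weak polar field at minimum (small predictor) implies a weak next cycle;
# comparable predictor values before two cycles imply similar cycle sizes.
print(f"peak |N - S| polar field: {predictor:.2f}")
```

The point is only that the forecast quantity is the peak magnitude of the north-minus-south field curve, read off near the time the fields are strongest.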

If Leif Svalgaard reads this, he would probably say I have oversimplified the procedure and do not have it exactly right.  So be it. 

The National Weather Service posted the following, titled "Solar experts predict the Sun's activity in Solar Cycle 25 to be below average, similar to Solar Cycle 24":

April 5, 2019 – Scientists charged with predicting the Sun’s activity for the next 11-year solar cycle say that it’s likely to be weak, much like the current one. The current solar cycle, Cycle 24, is declining and predicted to reach solar minimum – the period when the Sun is least active – late in 2019 or 2020.

Solar Cycle 25 Prediction Panel experts said Solar Cycle 25 may have a slow start, but is anticipated to peak with solar maximum occurring between 2023 and 2026, and a sunspot range of 95 to 130. This is well below the average number of sunspots, which typically ranges from 140 to 220 sunspots per solar cycle. The panel has high confidence that the coming cycle should break the trend of weakening solar activity seen over the past four cycles.

“We expect Solar Cycle 25 will be very similar to Cycle 24: another fairly weak cycle, preceded by a long, deep minimum,” said panel co-chair Lisa Upton, Ph.D., solar physicist with Space Systems Research Corp. “The expectation that Cycle 25 will be comparable in size to Cycle 24   means that the steady decline in solar cycle amplitude, seen from cycles 21-24, has come to an end and that there is no indication that we are currently approaching a Maunder-type minimum in solar activity.”

The experts' chart below shows the 24 SCs, with the maximum sunspot number (see Y axis) and the time each maximum occurred. The experts have also plotted SC 25 on the chart:

 

The experts are, in effect, also predicting SC 26 when they say: "The expectation that SC 25 will be comparable in size to SC 24 means that the steady decline in solar cycle amplitude, seen from cycles 21-24, has come to an end and that there is no indication that we are currently approaching a Maunder-type minimum in solar activity."

I wonder if that is wishful thinking.

cbdakota

Solar Cycle 24 Is Nearing Completion


Some time ago, I blogged a brief monthly report on the activity of the Sun. I have the urge to do that again, so here goes.

Solar Cycle (SC) 24 has just about run its course. It is forecast to give way to SC 25 in late 2019 or early 2020, and when it does, that is called the SC 24 minimum.

Sunspots are a proxy for solar activity. The chart below shows the average number of sunspots in each month. The blue dashed line is a 13-month averaged sunspot count; it is the official sunspot number. (The formula for the count is shown at the end of this posting.) The official number of sunspots peaked in April 2014, so that is when the solar maximum happened.

The chart below illustrates how recent SCs compare to SC 24:

 

All three of the preceding SCs were much more active than SC24.

As a side note, SCs last on average 11 years, or, said another way, 132 months. At one time it was believed that if an SC was over before 11 years, it was generally an active SC; if it ran more than 11 years, less active.

 

The chart below shows the 24 SCs and the chart maker's attempt at an SC 25. The X axis is in years, from 1749 to an estimated 2040. The Y axis is sunspots.

One can see that SCs 18, 19, 21, 22, and 23 represent a very active sun. The maker of the chart calls this the "modern warm period." Looking back, the chart maker has noted the times of the "Dalton Minimum" and the "little ice age." These periods of low solar activity coincide with periods of low global temperatures. Perhaps you can see why many scientists are forecasting that global temperatures will soon be dropping. One can also speculate that the global warming we have experienced may be a product of the past 60 years of a very active sun. One more reading of the chart might suggest that we are due for a period of low solar activity, and thus a drop in global temperatures. The chart maker's projection of SC 25 as the lowest in recorded history is very likely wrong. However, the batting average of the predictors of future SCs is not too stellar, so who knows.

Throughout the recent past, claims were made that the global temperature was going to drop because SC 24 was relatively inactive. I do not think that the temperature dropped. I believe I read one article claiming that SC 24 was the reason the increased CO2 in the atmosphere did not raise the temperature as much as it should have. I don't believe that one.

Was SC 24 a uniquely quiet SC? I think so.

The sunspot activity of the cycles in comparison: the numbers in the diagram are obtained by summing the monthly differences between the observed SSN and the mean (blue in Fig. 1) up to the current cycle month, 125. (I am not sure whom to credit for this chart, but I got it from Prof. Fritz Vahrenholt and Frank Bosse, who write the Die kalte Sonne blog.)

This shows that, at just about 10-1/2 years, SC 24 has had 4,464 fewer sunspots than the average SC. It also shows that SCs 5 and 6 had the fewest sunspots; those two SCs are coincident with the Dalton Minimum. SCs 12, 13, 14, 15, and 16 were way low on sunspots, and they coincided with the little ice age.

It is clear that the sun was much less active, as demonstrated by the sunspot record. I expected a clear sign of a cooling global temperature trend by the end of the cycle, which we have not yet seen. Some think we have it, but I do not see what I expected in the UAH satellite global temperature readings. The temperature has declined since the last El Niño, but it has not dropped back to the temperature before that El Niño.

Next a look at Solar Cycle 25.

cbdakota

Sunspot Counting–Wolf Number

The smoothed count is a 13-month averaged sunspot count, computed with the formula used by the official sunspot-number center in Belgium:
Rs= (0.5 Rm-6 + Rm-5 + Rm-4 + Rm-3 + Rm-2 + Rm-1 + Rm + Rm+1 + Rm+2 + Rm+3 + Rm+4 + Rm+5 + 0.5 Rm+6 ) / 12
Rs = smoothed monthly sunspot count
Rm = One month’s actual sunspot count
The “-6” through “+6” appended to each Rm is the number of months before or after the month whose smoothed count is being calculated. The beginning and ending months in the formula are only given half the value of the others.*
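As a sketch, the formula translates directly into a few lines of Python (the function name is mine; `monthly` is a list of monthly mean counts):

```python
def smoothed_ssn(monthly, m):
    """13-month smoothed sunspot number centered on month index m.

    The endpoint months (m-6 and m+6) get half weight, the eleven
    months in between get full weight, and the sum is divided by 12,
    exactly as in the formula above.
    """
    window = monthly[m - 6:m + 7]
    if len(window) != 13:
        raise ValueError("need 6 months of data on each side of m")
    return (0.5 * window[0] + sum(window[1:12]) + 0.5 * window[12]) / 12
```

For a constant series the smoothed value equals the constant, which is a quick sanity check that the weights sum to 12.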

 

Some Thoughts on How Mills Managed the Report


I hope you enjoyed Mark Mills' report "New Energy Economy: An Exercise in Magical Thinking," which I have serialized on my blog. If you have not seen it, you can click here to begin the 10 parts. He does an excellent job of demonstrating why the Paris Agreement is unworkable, and, of course, so is the even less believable Green New Deal. And he did it without once entering into the argument over whether CO2 is a serious threat to the globe.

Mills barely mentioned nuclear generation in the report, aside from several small insertions. It is possible that fossil-fuel-powered electrical production might be employed in the future to handle the swings in demand, with nuclear as the backbone of power generation. Currently, the catastrophist greens reject any use of nukes. So he did not need nukes to make his argument.

cbdakota

New Energy Economy: An Exercise in Magical Thinking Part 10 Energy Revolutions Are Still Beyond The Horizon


This is the final part of the serialization of Mark Mills' report New Energy Economy: An Exercise in Magical Thinking.

=======================================================================

Energy Revolutions Are Still Beyond the Horizon

 When the world’s poorest 4 billion people increase their energy use to just 15% of the per-capita level of developed economies, global energy consumption will rise by the equivalent of adding an entire United States’ worth of demand.92 In the face of such projections, there are proposals that governments should constrain demand, and even ban certain energy-consuming behaviors. One academic article proposed that the “sale of energy-hungry versions of a device or an application could be forbidden on the market, and the limitations could become gradually stricter from year to year, to stimulate energy-saving product lines.”93 Others have offered proposals to “reduce dependency on energy” by restricting the sizes of infrastructures or requiring the use of mass transit or car pools.94

The issue here is not only that poorer people will inevitably want to—and will be able to—live more like wealthier people but that new inventions continually create new demands for energy. The invention of the aircraft means that every $1 billion in new jets produced leads to some $5 billion in aviation fuel consumed over two decades to operate them. Similarly, every $1 billion in data centers built will consume $7 billion in electricity over the same period.95 The world is buying both at the rate of about $100 billion a year.96

The inexorable march of technology progress for things that use energy creates the seductive idea that something radically new is also inevitable in ways to produce energy. But sometimes, the old or established technology is the optimal solution and nearly immune to disruption. We still use stone, bricks, and concrete, all of which date to antiquity. We do so because they’re optimal, not “old.” So are the wheel, water pipes, electric wires … the list is long. Hydrocarbons are, so far, optimal ways to power most of what society needs and wants.

More than a decade ago, Google focused its vaunted engineering talent on a project called “RE<C,” seeking to develop renewable energy cheaper than coal. After the project was canceled in 2014, Google’s lead engineers wrote: “Incremental improvements to existing [energy] technologies aren’t enough; we need something truly disruptive. … We don’t have the answers.”97 Those engineers rediscovered the kinds of physics and scale realities highlighted in this paper.

An energy revolution will come only from the pursuit of basic sciences. Or, as Bill Gates has phrased it, the challenge calls for scientific “miracles.”98 These will emerge from basic research, not from subsidies for yesterday’s technologies. The Internet didn’t emerge from subsidizing the dial-up phone, or the transistor from subsidizing vacuum tubes, or the automobile from subsidizing railroads.

However, 95% of private-sector R&D spending and the majority of government R&D is directed at “development” and not basic research.99 If policymakers want a revolution in energy tech, the single most important action would be to radically refocus and expand support for basic scientific research.

Hydrocarbons—oil, natural gas, and coal—are the world’s principal energy resource today and will continue to be so in the foreseeable future. Wind turbines, solar arrays, and batteries, meanwhile, constitute a small source of energy, and physics dictates that they will remain so. Meanwhile, there is simply no possibility that the world is undergoing—or can undergo—a near-term transition to a “new energy economy.”

================================================================

 I know it was a lot of reading, but Mills does a marvelous job of making his thoughts easily understandable and convincing.

Mills’ entire report can be downloaded by clicking here. 

The pages of numbered references are found by clicking “to read more”.

cbdakota

Continue reading

New Energy Economy: An Exercise in Magical Thinking Part 9 Digitalization Won't Uberize the Energy Sector.


Continuing the serialization of Mark Mills’ report New Energy Economy: An Exercise in Magical Thinking.  This part is Digitalization Won’t Uberize the Energy Sector.

=================================================

Digitalization Won’t Uberize the Energy Sector     

Digital tools are already improving and can further improve all manner of efficiencies across entire swaths of the economy, and it is reasonable to expect that software will yet bring significant improvements in both the underlying efficiency of wind/solar/battery machines and in the efficiency of how such machines are integrated into infrastructures. Silicon logic has improved, for example, the control and thus the fuel efficiency of combustion engines, and it is doing the same for wind turbines. Similarly, software epitomized by Uber has shown that optimizing the efficiency of using expensive transportation assets lowers costs. Uberizing all manner of capital assets is inevitable. Uberizing the electric grid without hydrocarbons is another matter entirely.

The peak demand problem that software can’t fix

In the energy world, one of the most vexing problems is in optimally matching electricity supply and demand (Figure 6). Here the data show that society and the electricity-consuming services that people like are generating a growing gap between peaks and valleys of demand. The net effect for a hydrocarbon-free grid will be to increase the need for batteries to meet those peaks.

 

All this has relevance for encouraging EVs. In terms of managing the inconvenient cyclical nature of demand, shifting transportation fuel use from oil to the grid will make peak management far more challenging. People tend to refuel when it’s convenient; that’s easy to accommodate with oil, given the ease of storage. EV refueling will exacerbate the already-episodic nature of grid demand.

To ameliorate this problem, one proposal is to encourage or even require off-peak EV fueling.85 The jury is out on just how popular that will be or whether it will even be tolerated.

 

Although kilowatt-hours and cars—key targets in the new energy economy prescriptions—constitute only 60% of the energy economy, global demand for both is centuries away from saturation. Green enthusiasts make extravagant claims about the effect of Uber-like options and self-driving cars. However, the data show that the economic efficiencies from Uberizing have so far increased the use of cars and peak urban congestion.86 Similarly, many analysts now see autonomous vehicles amplifying, not dampening, that effect.87

That’s because people, and thus markets, are focused on economic efficiency and not on energy efficiency. The former can be associated with reducing energy use; but it is also, and more often, associated with increased energy demand. Cars use more energy per mile than a horse, but the former offers enormous gains in economic efficiency. Computers, similarly, use far more energy than pencil-and-paper.

Uberizing improves energy efficiencies but increases demand

Every energy conversion in our universe entails built-in inefficiencies—converting heat to propulsion, carbohydrates to motion, photons to electrons, electrons to data, and so forth. All entail a certain energy cost, or waste, that can be reduced but never eliminated. But, in no small irony, history shows—as economists have often noted—that improvements in efficiency lead to increased, not decreased, energy consumption.

If at the dawn of the modern era, affordable steam engines had remained as inefficient as those first invented, they would never have proliferated, nor would the attendant economic gains and the associated rise in coal demand have happened. We see the same thing with modern combustion engines. Today’s aircraft, for example, are three times as energy-efficient as the first commercial passenger jets in the 1950s.88 That didn’t reduce fuel use but propelled air traffic to soar and, with it, a fourfold rise in jet fuel burned.89

Similarly, it was the astounding gains in computing’s energy efficiency that drove the meteoric rise in data traffic on the Internet—which resulted in far more energy used by computing. Global computing and communications, all told, now consumes the energy equivalent of 3 billion barrels of oil per year, more energy than global aviation.90

 The purpose of improving efficiency in the real world, as opposed to the policy world, is to reduce the cost of enjoying the benefits from an energy-consuming engine or machine. So long as people and businesses want more of the benefits, declining cost leads to increased demand that, on average, outstrips any “savings” from the efficiency gains. Figure 7 shows how this efficiency effect has played out for computing and air travel.91

 

Of course, the growth in demand for a specific product or service can subside in a (wealthy) society when limits are hit: the amount of food a person can eat, the miles per day an individual is willing to drive, the number of refrigerators or lightbulbs per household, etc. But a world of 8 billion people is a long way from reaching any such limits.

The macro picture of the relationship between efficiency and world energy demand is clear (Figure 8). Technology has continually improved society’s energy efficiency. But far from ending global energy growth, efficiency has enabled it. The improvements in cost and efficiency brought about through digital technologies will accelerate, not end, that trend.

 

 

 

 

 

=====================================================

The serialization of Mark Mills’ report concludes with the next part titled Energy Revolutions Are Still Beyond the Horizon.

cbdakota

New Energy Economy: An Exercise in Magical Thinking Part 8 Sliding Down the Renewable Asymptote.


Continuing serialization of Mark Mills’ report New Energy Economy: An Exercise in Magical Thinking.

This is part 8.

=================================================== 

Sliding Down the Renewable Asymptote  

Forecasts for a continual rapid decline in costs for wind/solar/batteries are inspired by the gains that those technologies have already experienced. The first two decades of commercialization, after the 1980s, saw a 10-fold reduction in costs. But the path for improvements now follows what mathematicians call an asymptote; or, put in economic terms, improvements are subject to a law of diminishing returns where every incremental gain yields less progress than in the past (Figure 4).

 

 

 

This is a normal phenomenon in all physical systems. Throughout history, engineers have achieved big gains in the early years of a technology’s development, whether wind or gas turbines, steam or sailing ships, internal combustion or photovoltaic cells. Over time, engineers manage to approach nature’s limits. Bragging rights for gains in efficiency—or speed, or other equivalent metrics such as energy density (power per unit of weight or volume) then shrink from double-digit percentages to fractional percentage changes. Whether it’s solar, wind tech, or aircraft turbines, the gains in performance are now all measured in single-digit percentage gains. Such progress is economically meaningful but is not revolutionary.

The physics-constrained limits of energy systems are unequivocal. Solar arrays can’t convert more photons than those that arrive from the sun. Wind turbines can’t extract more energy than exists in the kinetic flows of moving air. Batteries are bound by the physical chemistry of the molecules chosen. Similarly, no matter how much better jet engines become, an A380 will never fly to the moon. An oil-burning engine can’t produce more energy than what is contained in the physical chemistry of hydrocarbons.

Combustion engines have what’s called a Carnot Efficiency Limit, which is anchored in the temperature of combustion and the energy available in the fuel. The limits are long established and well understood. In theory, at a high enough temperature, 80% of the chemical energy that exists in the fuel can be turned into power.74 Using today’s high-temperature materials, the best hydrocarbon engines convert about 50%–60% to power. There’s still room to improve but nothing like the 10-fold to nearly hundredfold revolutionary advances achieved in the first couple of decades after their invention. Wind/solar technologies are now on the same place of that asymptotic technology curve.
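As an illustration, the report's 80% ceiling follows directly from the standard Carnot formula. The 1,500 K and 300 K temperatures below are assumed round numbers for the sake of the example, not values taken from the report:

```python
def carnot_limit(t_hot_k, t_cold_k):
    """Maximum fraction of heat convertible to work between a hot
    reservoir and a cold reservoir, both in Kelvin."""
    return 1.0 - t_cold_k / t_hot_k

# With combustion at 1500 K and heat rejected at 300 K,
# the limit is 1 - 300/1500 = 0.80, the ~80% ceiling cited above.
print(carnot_limit(1500.0, 300.0))
```

Real engines sit well below this bound (the 50%–60% figure quoted for the best hydrocarbon engines), because materials cannot survive arbitrarily high combustion temperatures and other losses intrude.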

For wind, the boundary is called the Betz Limit, which dictates how much of the kinetic energy in air a blade can capture; that limit is about 60%.75 Capturing all the kinetic energy would mean, by definition, no air movement and thus nothing to capture. There needs to be wind for the turbine to turn. Modern turbines already exceed 45% conversion.76 That leaves some real gains to be made but, as with combustion engines, nothing revolutionary.77 Another 10-fold improvement is not possible.
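The "about 60%" bound is the classic Betz coefficient, 16/27. A minimal sketch of how it caps turbine output (the power formula and the sample air density, rotor area, and wind speed are standard textbook values, not figures from the report):

```python
# Betz limit: maximum fraction of the wind's kinetic energy an ideal
# rotor can extract. Derived value is 16/27, roughly 59.3%.
BETZ_LIMIT = 16 / 27

def betz_power(air_density, rotor_area, wind_speed, cp=BETZ_LIMIT):
    """Power in watts extracted by a rotor: cp * (1/2) * rho * A * v^3.

    cp defaults to the Betz limit; real turbines achieve cp of
    roughly 0.45, below the bound, as the text notes.
    """
    return cp * 0.5 * air_density * rotor_area * wind_speed ** 3

# Ideal rotor, 100 m^2 swept area, sea-level air, 10 m/s wind
print(betz_power(1.225, 100.0, 10.0))
```

Since modern machines already convert over 45% against a ~59% ceiling, the remaining headroom is less than a factor of 1.5, which is why no tenfold gain is available.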

For silicon photovoltaic (PV) cells, the physics boundary is called the Shockley-Queisser Limit: a maximum of about 33% of incoming photons are converted into electrons. State-of-the-art commercial PVs achieve just over 26% conversion efficiency—in other words, near the boundary. While researchers keep unearthing new non-silicon options that offer tantalizing performance improvements, all have similar physics boundaries, and none is remotely close to manufacturability at all—never mind at low costs.78 There are no 10-fold gains left.79

Future advances in wind turbine and solar economics are now centered on incremental engineering improvements: economies of scale in making turbines enormous, taller than the Washington Monument, and similarly massive, square-mile utility-scale solar arrays. For both technologies, all the underlying key components—concrete, steel, and fiberglass for wind; and silicon, copper, and glass for solar—are all already in mass production and well down asymptotic cost curves in their own domains.

While there are no surprising gains in economies of scale available in the supply chain, that doesn’t mean that costs are immune to improvements. In fact, all manufacturing processes experience continual improvements in production efficiency as volumes rise. This experience curve is called Wright’s Law. (That “law” was first documented in 1936, as it related then to the challenge of manufacturing aircraft at costs that markets could tolerate. Analogously, while aviation took off and created a big, worldwide transportation industry, it didn’t eliminate automobiles, or the need for ships.) Experience leading to lower incremental costs is to be expected; but, again, that’s not the kind of revolutionary improvement that could make a new energy economy even remotely plausible.

As for modern batteries, there are still promising options for significant improvements in their underlying physical chemistry. New non-lithium materials in research labs offer as much as a 200% and even 300% gain in inherent performance.80 Such gains nevertheless don’t constitute the kinds of 10-fold or hundredfold advances in the early days of combustion chemistry.81 Prospective improvements will still leave batteries miles away from the real competition: petroleum.

There are no subsidies and no engineering from Silicon Valley or elsewhere that can close the physics-centric gap in energy densities between batteries and oil (Figure 5). The energy stored per pound is the critical metric for vehicles and, especially, aircraft. The maximum potential energy contained in oil molecules is about 1,500% greater, pound for pound, than the maximum in lithium chemistry.82 That’s why the aircraft and rockets are powered by hydrocarbons. And that’s why a 20% improvement in oil propulsion (eminently feasible) is more valuable than a 200% improvement in batteries (still difficult).

 Finally, when it comes to limits, it is relevant to note that the technologies that unlocked shale oil and gas are still in the early days of engineering development, unlike the older technologies of wind, solar, and batteries. Tenfold gains are still possible in terms of how much energy can be extracted by a rig from shale rock before approaching physics limits.83 That fact helps explain why shale oil and gas have added 2,000% more to U.S. energy production over the past decade than have wind and solar combined.84

==================================================

Next up is Part 9, Digitalization Won’t Uberize the Energy Sector.

cbdakota
New Energy Economy: An Exercise in Magical Thinking Part 7 Moore’s Law Misapplied.


Continuing the serialization of Mark Mills’ report New Energy Economy: An Exercise in Magical Thinking. This is Part 7, Moore’s Law Misapplied. Moore is best known for his prediction that the number of transistors in a dense integrated circuit would double every two years. But Mills points out that this trajectory does not apply to renewable energy.

=====================================================

Moore’s Law Misapplied 

Faced with all the realities outlined above regarding green technologies, new energy economy enthusiasts nevertheless believe that true breakthroughs are yet to come and are even inevitable. That’s because, so it is claimed, energy tech will follow the same trajectory as that seen in recent decades with computing and communications. The world will yet see the equivalent of an Amazon or “Apple of clean energy.”70

This idea is seductive because of the astounding advances in silicon technologies that so few forecasters anticipated decades ago. It is an idea that renders moot any cautions that wind/solar/batteries are too expensive today—such caution is seen as foolish and shortsighted, analogous to asserting, circa 1980, that the average citizen would never be able to afford a computer. Or saying, in 1984 (the year the world’s first cell phone was released), that a billion people would one day own a cell phone, when it cost $9,000 (in today’s dollars) and was a two-pound “brick” with a 30-minute talk time.

Today’s smartphones are not only far cheaper; they are far more powerful than a room-size IBM mainframe from 30 years ago. That transformation arose from engineers inexorably shrinking the size and energy appetite of transistors, and consequently increasing their number per chip roughly twofold every two years—the “Moore’s Law” trend, named for Intel cofounder Gordon Moore.

The compound effect of that kind of progress has indeed caused a revolution. Over the past 60 years, Moore’s Law has seen the efficiency of how logic engines use energy improve by over a billionfold.71 But a similar transformation in how energy is produced or stored isn’t just unlikely; it can’t happen with the physics we know today.
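The “billionfold” figure follows directly from compounding: doubling every two years over 60 years means 30 doublings. A quick check of the arithmetic:

```python
years = 60
doublings = years // 2             # Moore's Law: one doubling every two years
improvement = 2 ** doublings       # compound effect of 30 doublings
print(f"{doublings} doublings -> {improvement:,}x")
# 2**30 is a bit over one billion, matching "over a billionfold".
```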

In the world of people, cars, planes, and large-scale industrial systems, increasing speed or carrying capacity causes hardware to expand, not shrink. The energy needed to move a ton of people, heat a ton of steel or silicon, or grow a ton of food is determined by properties of nature whose boundaries are set by laws of gravity, inertia, friction, mass, and thermodynamics.

If combustion engines, for example, could achieve the kind of scaling efficiency that computers have since 1971—the year the first widely used integrated circuit was introduced by Intel—a car engine would generate a thousandfold more horsepower and shrink to the size of an ant.72 With such an engine, a car could actually fly, very fast.

If photovoltaics scaled by Moore’s Law, a single postage-stamp-size solar array would power the Empire State Building. If batteries scaled by Moore’s Law, a battery the size of a book, costing three cents, could power an A380 to Asia.

But only in the world of comic books does the physics of propulsion or energy production work like that. In our universe, power scales the other way.

An ant-size engine—which has been built—produces roughly 100,000 times less power than a Prius. An ant-size solar PV array (also feasible) produces a thousandfold less energy than an ant’s biological muscles. The energy equivalent of the aviation fuel actually used by an aircraft flying to Asia would take $60 million worth of Tesla-type batteries weighing five times more than that aircraft.73

The challenge in storing and processing information using the smallest possible amount of energy is distinct from the challenge of producing energy, or of moving or reshaping physical objects. The two domains entail different laws of physics.

The world of logic is rooted in simply knowing and storing the fact of the binary state of a switch—i.e., whether it is on or off. Logic engines don’t produce physical action but are designed to manipulate the idea of the numbers zero and one. Unlike engines that carry people, logic engines can use software to do things such as compress information through clever mathematics and thus reduce energy use. No comparable compression options exist in the world of humans and hardware.

Of course, wind turbines, solar cells, and batteries will continue to improve significantly in cost and performance; so will drilling rigs and combustion turbines (a subject taken up next). And, of course, Silicon Valley information technology will bring important, even dramatic, efficiency gains in the production and management of energy and physical goods (a prospect also taken up below). But the outcomes won’t be as miraculous as the invention of the integrated circuit, or the discovery of petroleum or nuclear fission.

====================================================

Upcoming is Part 8, Sliding Down the Renewable Asymptote.

cbdakota