Friday, September 27, 2024


A basic flaw in IPCC science

Detailed research is underway that threatens to undermine the foundations of the climate science promoted by the IPCC since its First Assessment Report in 1990. The research re-examines the rural and urban temperature records in the Northern Hemisphere that underpin the IPCC’s estimates of global warming since 1850. The research team is led by Dr Willie Soon (a Malaysian-born solar astrophysicist associated with the Smithsonian Institution for many years) and two highly qualified Irish academics, Dr Michael Connolly and his son Dr Ronan Connolly, who together have formed the climate research group CERES-SCIENCE. Their findings will be a challenge for the IPCC 7th Assessment Report, due to be released in 2029, as they strike at the very foundations of IPCC science.

The climate warming trend published by the IPCC is a continually updated graph based on the records of Northern Hemisphere land surface temperature stations dating from the mid-19th Century. The latest IPCC report, published in 2021, uses data for the period 1850-2018. The IPCC’s reliance on Northern Hemisphere land surface temperature records is not in question and is justifiable: they provide the best database for this period, while the Southern Hemisphere land records are far less extensive and are sparse for the 19th and early 20th Centuries. It is generally agreed that urban temperature data run significantly warmer than rural data from the same region because of an urban warming bias: concrete and bitumen absorb solar radiation by day and re-radiate it at night, producing higher night-time temperatures in cities than in the nearby countryside. The IPCC acknowledges this warming bias but maintains its effect is only about 10 per cent and therefore does not significantly distort its published global warming trend lines.

Since 2018, Dr Soon and his partners have analysed the data from rural and urban temperature recording stations in China, the USA, the Arctic, and Ireland. The number of stations with reliable temperature records in these areas grew from very few in the mid-19th Century to around 4,000 in the 1970s before falling to around 2,000 by the 1990s. The rural stations with good records peaked at 400 and presently number around 200. The analysis of each station must account for any variation in its exposure to the Sun due to changes in location, shadowing from the construction of nearby buildings, or nearby vegetation growth. The rural stations pose a further complication: over time many are encroached upon by nearby cities, so their data must be shifted at certain dates from the rural database to either an intermediate database or a fully urban one. An accurate analysis of each station’s records is therefore a time-consuming task.

This new analysis of 4,000 temperature recording stations in China, the USA, the Arctic, and Ireland shows a warming trend of 0.89ºC per century in the urban stations, roughly 1.6 times the warming trend of 0.55ºC per century in the rural stations. This difference is far larger than the 10 per cent divergence between urban and rural stations claimed in the IPCC reports, and the researchers attribute the IPCC’s low figure to a flaw in its methodology. The IPCC uses a technique called homogenisation that averages the rural and urban temperatures in a particular region. Because over 75 per cent of the records fed into this homogenisation are from urban stations, the method distorts the rural records: a procedure meant to statistically identify and correct biases in the raw data in effect blends urban warming into the rural dataset, overriding the actual values recorded at each rural station. In contrast, Dr Soon and his coworkers avoided homogenisation, so the temperature trends they identify for each rural region are not distorted by readings from nearby urban stations.
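To make that blending effect concrete, here is a minimal illustrative sketch (in Python), using the trend figures quoted above and the claim that over 75 per cent of the homogenised records are urban; the simple weighted average is a toy model of the blending, not the IPCC’s actual homogenisation algorithm.

```python
# Toy illustration of urban blending: a regional average dominated by
# urban stations pulls the blended trend toward the urban value.
# Trend figures are those quoted above (degC per century); the 75/25
# station mix follows the claim that over 75 per cent of homogenised
# records are urban. This is NOT the IPCC's actual algorithm.

URBAN_TREND = 0.89          # degC per century, urban stations
RURAL_TREND = 0.55          # degC per century, rural stations

urban_share = 0.75          # assumed fraction of urban stations
blended = urban_share * URBAN_TREND + (1 - urban_share) * RURAL_TREND

print(f"Blended 'regional' trend: {blended:.2f} degC per century")
# -> roughly 0.80 degC per century, far closer to the urban trend
#    than to the rural one
```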

The rural temperature trend measured by this new research is 0.55ºC per century, indicating the Earth has warmed about 0.9ºC since 1850. In contrast, the urban trend of 0.89ºC per century indicates a much higher warming of about 1.5ºC since 1850. The researchers conclude that a distorted, urban-inflated warming trend has been used by the IPCC to quantify the warming of the whole Earth since 1850. The exaggeration matters because the urban temperature database used by the IPCC represents only 3-4 per cent of the Earth’s land surface area, less than 2 per cent of the Earth’s total surface area. Dr Willie Soon and his research team are currently analysing the meta-history of 800 European temperature recording stations; when that work is done, their research will rest on a very substantial database of Northern Hemisphere rural and urban temperature records from China, the USA, the Arctic, Ireland, and Europe.
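Those headline totals follow directly from the trends and the length of the 1850-2018 record; a quick arithmetic cross-check:

```python
# Cross-check: trend (degC/century) times record length (centuries).
centuries = (2018 - 1850) / 100.0        # 1.68 centuries of record

for label, trend in [("rural", 0.55), ("urban", 0.89)]:
    print(f"{label}: {trend} x {centuries} = {trend * centuries:.2f} degC")
# rural: 0.55 x 1.68 = 0.92 degC  (the ~0.9 degC quoted above)
# urban: 0.89 x 1.68 = 1.50 degC  (the ~1.5 degC quoted above)
```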

This new research has unveiled another flaw in the IPCC’s temperature narrative, as the trend lines in the researchers’ revised datasets differ from those published by the IPCC. For example, the rural records show a marked warming trend in the 1930s and 1940s where the IPCC dataset shows only slight warming. The most significant difference is a marked cooling period in the rural dataset for the 1960s and 1970s that is almost absent from the IPCC’s urban-dominated dataset. This latter divergence upsets the common narrative that rising carbon dioxide levels control modern warming trends. If carbon dioxide is the driver of modern warming, how can carbon dioxide rising at only 1.7 parts per million per decade coincide with a distinct warming period in the 1930s and 1940s, while the far faster rise of 10.63 parts per million per decade coincides with a distinct cooling period in the 1960s and 1970s? The research of Willie Soon and his coworkers thus challenges not only the high global warming trends specified in IPCC reports, but also the theory that rising carbon dioxide levels explain modern warming trends, a linchpin of IPCC science for the last 25 years.

Willie Soon and his coworkers maintain that climate scientists need to consider other possible explanations for recent global warming, and they point to the Sun. The IPCC maintains that variations in Total Solar Irradiance (TSI) occur over eons rather than over shorter periods such as the last few centuries, and for that reason it points to changes in greenhouse gases as the most obvious explanation for global warming since 1850. In contrast, Willie Soon and his coworkers maintain there can be short-term changes in solar activity, citing, for example, the period of virtually no sunspot activity (the Maunder Minimum) that coincided with the Little Ice Age in the 17th Century. They also point out that there is still no agreed average figure for TSI despite 30 years of measurements taken by various satellites, and they contend that research in this area is not settled.

The CERES-SCIENCE research project pioneered by Dr Willie Soon and the father-and-son Connolly team has questioned the validity of the high global warming trends for the period from 1850 to the present that the IPCC has published since its first report in 1990. The research also queries the IPCC narrative that rising greenhouse gas concentrations, particularly carbon dioxide, are the primary driver of global warming since 1850, a narrative that has been the foundation of IPCC climate science for more than three decades. It will be interesting to see how the IPCC’s 7th Assessment Report in 2029 treats this new research that questions the very basis of the IPCC’s climate science.

**************************************************

You Are Being Fooled If You Think Wind Energy Is Cheap

The late great P.J. O’Rourke once wrote, in The Atlantic in April 2002, that: “Beyond a certain point complexity is fraud…. when someone creates a system in which you can’t tell whether or not you’re being fooled, you’re being fooled.”

Which brings us to wind energy and its complicated contractual arrangements with modern electricity grids.

It’s not just a simple matter of bidding on contracts and supplying power when needed.

No, it’s become a mare’s nest of renewables mandates, portfolio standards, feed-in-tariffs, first-to-the-grid rules, dispatch, curtailment, hype, blame and losses that somehow no one saw coming.

And yes, you’re being fooled.

Debates over alternative energy, including its alleged cost advantages, often take place at a very high, almost abstract level. The general macro rule is that aggressive climate policies cause rising energy costs and often deindustrialization as well, not to mention brown- and blackouts, and this generalization is borne out across a wide range of jurisdictions.

When someone does go down into the weeds, as Parker Gallant frequently does in the Canadian province of Ontario, it’s amazing what muck he finds there.

For instance, he recently wrote that:

“on Sunday, September 8th, a somewhat cool summer day, Ontario’s peak electricity demand only reached a low 15,567 MW at Hour 20 which could have been principally supplied by nuclear and hydro baseload power (with a sprinkling of natural gas generation) but those IWT (industrial wind turbines) were humming!

At that hour those IWT generated 1,661 MWh but it really wasn’t needed; however, due to their ‘first-to-the-grid’ rights, IESO were obliged to either take their power or curtail it.”

To “curtail” means to tell the turbine owners not to generate and then pay them a guaranteed price far above the market rate anyway; in this case $120/MWh when the going rate was $26.60/MWh.

So IESO, the “Independent Electricity System Operator” (and “Independent” here means it’s a government agency that isn’t answerable to anyone including the politicians we elect) bought the power and sold it at a massive loss.
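A back-of-envelope sketch of the loss implied by those figures, assuming (our simplification, not Gallant’s) that the full hour’s wind output was bought at the contract price and resold at the market rate:

```python
# Implied loss for Hour 20 on September 8th, from the figures quoted
# above. Assumes all 1,661 MWh was bought at the contract price and
# resold at market; in reality some output may have been curtailed.

wind_output_mwh = 1_661     # IWT generation at Hour 20
contract_price = 120.00     # $/MWh guaranteed to wind generators
market_price = 26.60        # $/MWh going rate at the time

loss = wind_output_mwh * (contract_price - market_price)
print(f"Implied loss for that single hour: ${loss:,.0f}")   # ~$155,000
```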

And that triumph of wasting money and power is just the tip of the cashberg here.

As, we predict with some confidence, you would find where you live as well, especially if the green zealots designed your power system or tried to. It need not have been done with that express purpose; as with much else in human affairs, particularly governmental ones, complexity can arise from incompetence, including incompetence in managing growing systems.

But if those who operate and work within a system tolerate a degree of complexity over many years so great that not even they can really understand what’s happening, and the public is utterly incapable of grasping it and imposing accountability, they are guilty of perpetuating it out of self-interest, zealotry or both.

Power systems are generally complicated for necessary reasons as well as others. But when it comes to alternative energy, the impulse to make it look as good as possible pushes insiders in the direction of distortions and distractions that, if anyone else were doing it for a cause they did not share, would surely cause them to shake their heads.

To take one trivial example, we frequently hear that more and more of a system’s power needs are being met by wind and solar every day. But of course if a power system is required to purchase those kinds of power first, then the fact that it buys a lot of its energy from them doesn’t prove that they are “winning” or “cheaper” or anything else.

The massive conventional backup system sits there, perfectly capable of powering the grid, often at a lower price, and is forbidden by law or regulation from doing so.

In the case of Ontario, the IESO is forced to buy extremely expensive alternative energy and then dump much cheaper conventional power onto the market, which the green crowd calls a triumph. As Gallant went on, showing just how baroque that provincial system is and how broke it’s making residents (and using more exclamation marks than we would advise):

“if we fast forward to September 10th, 11th and 12th, IESO reported much higher demand of 17,734 MW on the 10th, 18,522 MW on the 11th and 19,583 MW on the 12th… how did those IWT perform?

As it turns out, those 4,900 MW of IWT capacity were pretty well absent during the peak hours generating only 308 MWh on the 10th, 287 MWh on the 11th and 128 MWh on the 12th!

So, totaling their performance over the peak hours during those three days they operated at a miserly 4.8 percent of their capacity despite those ‘first-to-the-grid’ rights they enjoy! For the full 72 hours of those three days their total generation was 27,706 MWh which was only 7.8 percent of their rated capacity!”

Brutal. And it gets worse, because, luckily, the province has natural gas backups:

“Over the three days those natural gas plants ramped up and down as needed and provided the grid with 276,747 MWh or ten (10) times what those IWT generated.”
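The percentages and the ten-to-one ratio in those quotes are easy to reproduce from the figures Gallant gives; a minimal sketch of the arithmetic (small rounding differences aside):

```python
# Reproducing the capacity factors and gas-to-wind ratio quoted above.
# Capacity factor = actual generation / (capacity x hours).

CAPACITY_MW = 4_900                        # installed IWT capacity

peak_gen_mwh = 308 + 287 + 128             # peak-hour output, Sept 10-12
print(f"Peak-hour CF: {peak_gen_mwh / (CAPACITY_MW * 3):.1%}")   # ~4.9%

full_gen_mwh = 27_706                      # IWT output over the 72 hours
print(f"72-hour CF: {full_gen_mwh / (CAPACITY_MW * 72):.1%}")    # ~7.9%

gas_gen_mwh = 276_747                      # gas output, same 72 hours
print(f"Gas vs wind: {gas_gen_mwh / full_gen_mwh:.1f}x")         # ~10.0x
```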

This kind of thing makes for depressing reading as well as a headache. In another post, he cites a new report by the Macdonald-Laurier Institute that calculates the real cost of IWT over a decade and puts a major dent in claims that wind and solar are cheap.

Especially given, for instance, that from 2020-23 alone the province paid just one (also government-owned) entity, Ontario Power Generation or OPG, to “spill” some 6.6 TWh (yes, that’s terawatt-hours) of generation from hydro dams.

Now the power in question mostly came from hydro… but it was surplus to the grid only because of the mandated preference for wind, which was used instead, making the hydro power redundant; and that wind costs far more.

So even once you net out the sale proceeds, the real cost of that “spill” exceeds $1 billion but in ways that, as you’ve doubtless noticed, are extremely hard to understand.

Or not so hard after all, in that on the macro scale they drive up the cost of government and everything else. The system haemorrhages millions a day, and it adds up to a sum you just can’t make vanish. As the report’s executive summary observes:

“Hoping to jump-start wind generation, Premier Dalton McGuinty’s government established high wind prices, fixed for 20 years, which averaged $151/MWh over the 2020–23 period.

As the sector grew, so did the fiscal liability of those contracts. Multi-billion-dollar government subsidies started in 2017 and will total $7.3 billion for the current fiscal year (Ontario 2024a), equivalent to 0.65 percent of provincial GDP (Ontario 2024b).”

Not of the provincial budget. Of the entire provincial economy. It is fully three percent of the provincial budget which, at $214.5 billion, has reached a record high. Accounting jiggery-pokery just can’t make that kind of excessive cost vanish, even if it can make it hard to understand and fix.
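Both the billion-dollar spill figure and these shares can be roughly cross-checked from the numbers already quoted; a back-of-envelope sketch, with the 6.6 TWh spill valued at the report’s $151/MWh average wind contract price (our valuation assumption, not the report’s exact accounting):

```python
# Back-of-envelope cross-checks using figures quoted above.

# 1. The hydro "spill": 6.6 TWh of displaced generation, valued at the
#    $151/MWh average wind contract price (our assumption).
spill_mwh = 6.6 * 1_000_000                # 1 TWh = 1,000,000 MWh
print(f"Spill at $151/MWh: ${spill_mwh * 151 / 1e9:.2f} billion")  # ~$1.0B

# 2. The subsidy shares: $7.3B against a $214.5B budget and against
#    the 0.65 per cent of GDP figure.
subsidy_bn, budget_bn, gdp_share = 7.3, 214.5, 0.0065
print(f"Share of budget: {subsidy_bn / budget_bn:.1%}")        # ~3.4%
print(f"Implied GDP: ${subsidy_bn / gdp_share:,.0f} billion")  # ~$1,123B
```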

The MLI report ends up saying the real value of IWT generation is $46/MWh, less than a third of what they’re paid to produce it. And we’re meant to believe it’s a bargain?

Yup. An article in The Hub by the “Director for Ontario at Clean Prosperity” burbles that “The economic case for new clean electricity is now undeniable.”

And claims that:

“anyone still suggesting that [Ontario’s nominally conservative but actually rudderless] Premier Doug Ford has flip-flopped on renewable energy may not be keeping up with the pace of technological change.”

Because see batteries or something. Even that piece concedes that:

“Ontario’s prior missteps with clean electricity are well known. The previous government’s Green Energy Act offered renewable energy project developers guaranteed premium prices for their power that far exceeded fair market value.

It threatened to undermine energy affordability in the province. The Ford government quickly repealed the Green Energy Act when it was elected in 2018 and until recently was reluctant to risk driving up energy prices by making new investments in renewables.

Six years later, we’re in a new economic and technological reality. The cost of batteries has plummeted by 80 percent in the last decade. Since the repeal of the Green Energy Act, the cost of wind power has fallen 40 percent and the cost of solar roughly 30 percent.”

In just six years? Really? And nobody else noticed, since governments everywhere are still massively subsidizing the stuff?

The article propagandizes that:

“Ontario’s Independent Electricity System Operator (IESO) now expects to pay less than half as much for new renewable energy generation than in the mid-2000s.

More cost declines are forecast. The investment case for zero-carbon energy will only get stronger.”

But when’s the last time IESO’s proclaimed expectations were fulfilled, or were even comprehensible given the hideous complexity of Ontario’s power system? Who can even really tell?

At that point citizens are being had, by design or by fortuitous happenstance from a certain point of view.

As James Madison (though some attribute the essay to Alexander Hamilton) wrote in Federalist No. 62 in 1788:

“It will be of little avail to the people, that the laws are made by men of their own choice, if the laws be so voluminous that they cannot be read, or so incoherent that they cannot be understood; if they be repealed or revised before they are promulgated, or undergo such incessant changes that no man, who knows what the law is to-day, can guess what it will be to-morrow.”

If you cannot tell whether your power system really is delivering more cheaply because of wind, solar, geothermal or sunbeams from cucumbers, it is of little avail that you are being asked to applaud it and can, in principle, vote for a number of different people who will perpetrate it and its complexity regardless.

*************************************************

It’s Time to Follow the Navy’s 50-Year Safety Record of Nuclear Power Generation

By Ronald Stein, P.E.

Delivery of affordable, abundant, reliable, clean, and emissions-free electricity to customers is essential to modern quality of life. Achieving this is threatened by a vulnerable grid and by the intermittency of wind and solar generation. To meet the coming power supply crisis driven by the demands of data centers and AI, it’s time to stimulate conversations about electricity generation that meets the needs of the end users.

The nuclear power systems developed for the Navy have functioned well for five decades. All commissioned U.S. Navy submarines and supercarriers built since 1975 are nuclear-powered, and the other military services are now getting on board. If such a profoundly reliable and resilient system for generating emissions-free electricity that is continuous and uninterruptible can be extended to the commercial power market, a variety of suppliers could compete for the business of the end user, greatly reducing electricity prices.

Today, about 440 nuclear power reactors are in operation in 32 countries and Taiwan, with 62 new reactors under construction. As of August 1, 2023, the United States had 54 nuclear power plants with 93 operating commercial nuclear reactors in 28 states, generating about 20% of the country’s electricity. Nuclear power has the competitive advantage of being the only baseload source that can deliver the desired expansion of clean electricity to end users: emissions-free, continuous, and uninterruptible.

As of May 2024, 214 nuclear reactors had been permanently shut down worldwide, with the United States recording the largest number of shutdowns at 41 units. Twelve U.S. nuclear power reactors have permanently closed since 2012 alone. We have also seen the shutdown of nuclear plants in California and, more recently, New York that were perfectly viable and profitable.

Another seven U.S. reactor retirements have been announced through 2025, with a total generating capacity of 7,109 MW (roughly 7% of U.S. nuclear capacity).

However, announced retirements have not always occurred as planned: 16 reactors previously announced for permanent closure have continued operating thanks to state interventions that provide them with additional revenue sources. Those 16 reactors in 6 states represent 15,734 MW of generating capacity (16% of total U.S. nuclear capacity). Recent studies have identified many other U.S. reactors as being “at risk” of shutdown for economic reasons, although their closures have not been announced.
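As a consistency check, the two percentages quoted above imply roughly the same total U.S. nuclear capacity; a quick sketch:

```python
# Consistency check on the capacity shares quoted above.

retiring_mw, retiring_share = 7_109, 0.07    # announced retirements, "roughly 7%"
saved_mw, saved_share = 15_734, 0.16         # reactors kept open, "16%"

print(f"Implied total (retirements): {retiring_mw / retiring_share:,.0f} MW")
print(f"Implied total (saved units): {saved_mw / saved_share:,.0f} MW")
# -> ~101,600 MW and ~98,300 MW respectively, both in the ballpark of
#    the roughly 95-100 GW of operating U.S. nuclear capacity.
```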

Next-generation reactors are the Small Modular Reactor (SMR) and the Fast Breeder Reactor (FBR). SMRs are new types of reactors that produce Slightly Used Nuclear Fuel (SUNF) but do NOT recycle it. While SMR deployment is beneficial for various applications and will be part of future electricity mixes, SMRs are very different from FBRs, which use fast neutrons to breed more nuclear fuel than they consume while generating power, dramatically enhancing the efficiency of energy resources.

Commercialization of nuclear power for the generation of electricity that is emissions-free, continuous, and uninterruptible seems more practical than ever before.

The introduction of intermittent power has disrupted the “on-demand” delivery system: sun and wind patterns force the utility to ramp its steadier “baseload” plants up and down to accommodate them. This has increased the chances of power blackouts and brownouts when weather conditions are not ideal. The reason for these changes is the desire to migrate power production to “clean electricity.”

Coal and natural gas can supply continuous, uninterruptible baseload power that can be adjusted as demand changes; however, they are considered “dirty electricity.” Since baseload power is essential for a constant supply of electricity, the additional baseload sources are likely to be nuclear power plants, which are not “dirty electricity” producers.

Consistent and resilient power delivery is a national security issue and a quality-of-life issue. People and economies have grown so dependent on electricity that they no longer have alternative ways to provide heat, light, food preservation, and air conditioning in the event of a power outage. Economical electricity must therefore be delivered to people 100% of the time, or serious disruptions to their lives and the economy will follow, including loss of life in certain medical situations.

The largest impediment to this goal appears to be the mainstream media and the climate-NGO-industrial-environmental complex, which oppose nuclear power while championing massive taxpayer subsidies for renewables, together with the political push to eliminate nuclear power from the market in the United States. They also encourage massive unnecessary government regulation, thus increasing the price of nuclear power.

The nuclear power production industry has the best industrial safety record of any electricity-producing industry. The fear that most needs attention is therefore the one surrounding spent nuclear fuel, commonly referred to as “nuclear waste.” The solution lies in educating heads of state, the mainstream media, and policymakers by extending the concept of recycling to the unspent energy in used nuclear fuels, an approach that can convince people that the “nuclear waste” issue is being dealt with, that the cost of power is competitive, and that the production of nuclear power is safe.

Recycling Slightly Used Nuclear Fuel (SUNF) in a Fast Breeder Reactor (FBR) provides all these remedies in a way that is competitive and publicly acceptable.

The advantages to recycling used nuclear fuel in Fast Breeder Reactors are many:

It provides a solution to the disposition of the stockpile of Slightly Used Nuclear Fuel (SUNF).

Current inventories of SUNF provide an essentially unlimited supply of domestic fuel.

The fuel material is already mined, so the energy produced is much closer to 100% clean, and further environmental degradation from mining operations is not required.

The public would be more receptive to nuclear power because “waste” is being used as “fuel,” reducing the retention of unspent fuels and diminishing perceived risks.

The design is “intrinsically safe,” meaning the reactor can cool itself sufficiently in the event of an accident without human intervention.

The current stockpile of SUNF has a value of $10 trillion when the electric power it produces is sold at 1 cent per kWh (see the sketch after this list).

Process heat can be used for industrial purposes such as hydrogen production, freshwater production, and synthetic fuel production.
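To see the scale that $10 trillion figure implies, here is a rough unit conversion (our arithmetic only; it says nothing about the actual energy content of the stockpile):

```python
# What $10 trillion of electricity sold at 1 cent/kWh implies.

value_usd = 10e12                 # $10 trillion
price_per_kwh = 0.01              # 1 cent per kWh

implied_twh = value_usd / price_per_kwh / 1e9   # 1 TWh = 1e9 kWh
print(f"Implied generation: {implied_twh:,.0f} TWh")
# -> 1,000,000 TWh, on the order of 250 years of total U.S.
#    electricity generation (~4,000 TWh per year).
```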
Rather than pursue wind and solar renewables that require huge land footprints and huge taxpayer subsidies, and even then generate electricity only occasionally, it’s time to focus our technology resources on the nuclear power industry, which has the best industrial safety record among all industries and a track record of producing the cheapest unsubsidized electricity.

Specifically, we should focus technology on commercializing emissions-free electricity that is continuous and uninterruptible, to support the exponential growth of power demands from data centers, AI, airports, hospitals, telemetry, and the military. A great primer on definitions and companies engaged in the Small Modular Reactor (SMR) and Fast Breeder Reactor (FBR) space is An Introduction to Advanced Nuclear Reactors.

For a brief primer on the electricity generation marketplace, please view the 1-hour video with Chris Powers and Robert Bryce at Power Hungry, as they discuss energy, politics, nuclear, and fossil fuels.

The main growth in electric power usage is coming from new data centers housing AI technologies. Over the next few decades, 50% of additional electric power is expected to be needed just for AI, yet data centers CANNOT run on occasional electricity from wind and solar. It’s time to stimulate conversations about electricity generation that meets the demands of the end users.

************************************************

New York tries to power its economy on hubris

It’s a pity New York cannot power its economy on hubris, but state officials this week gave it a try.

The “Future Energy Economy Summit” convened in Syracuse by Gov. Hochul brought together many of the same groups and characters who have shaped New York’s current energy policy — and somehow expected something different out of them.

Helming the summit was Richard Kauffman, New York’s “energy czar,” who for almost a decade has played a major role both in setting New York energy policy and working with companies in the energy sector.

Kauffman is hardly unique in this respect: If anything, he’s the embodiment of New York’s energy policy, which for decades has increasingly seen the political class imposing its preferences — sometimes quite suddenly, and sometimes quite lucratively — with little regard for how their whims translated into higher costs or reduced reliability for families and businesses.

Albany turned seemingly on a dime in 2016, going from encouraging the use of natural gas to blockading projects to bring natural gas to New York or New England.

The switch left the Northeast more reliant on older power plants and dirtier fuels while preventing customers from getting cleaner-burning fuel.

State officials soon after bullied Indian Point, a profitable nuclear plant that was generating close to a quarter of downstate’s electricity, into shutting down.

Then they pressed ahead with costly plans to build wind turbines off Long Island, allowing the tiny roster of eligible developers to essentially name their price — because the rushed process left few companies able to bid.

These policies pushed up both costs and emissions, but they brought political windfalls: Environmentalists celebrated Indian Point’s closure, and inefficient construction unions cheered when the state effectively banned non-unionized construction firms from working on the offshore projects.

And Hamptonites breathed a sigh of relief as the federal government — at New York’s urging — quietly excluded their viewshed from the areas eligible for offshore wind development.

Albany took things to a more extreme level in 2019 when Gov. Andrew Cuomo signed the Climate Leadership and Community Protection Act.

The law went far beyond setting targets for reducing greenhouse gas emissions and increasing renewable energy generation.

It imposed arbitrary, mandatory levels for solar, offshore wind and energy storage projects — while excluding more reliable hydroelectric and nuclear generation from that same public funding.

Each of these constraints made it more expensive, and less practical, for the state to hit its own greenhouse gas goals.

***************************************

All my main blogs below:

http://jonjayray.com/covidwatch.html (COVID WATCH)

http://dissectleft.blogspot.com (DISSECTING LEFTISM)

https://westpsychol.blogspot.com (POLITICAL CORRECTNESS WATCH -- new site)

https://john-ray.blogspot.com/ (FOOD & HEALTH SKEPTIC -- revived)

https://australian-politics.blogspot.com (AUSTRALIAN POLITICS)

http://snorphty.blogspot.com (TONGUE-TIED)

https://immigwatch.blogspot.com (IMMIGRATION WATCH)

http://jonjayray.com/select.html (SELECT POSTS)

http://jonjayray.com/short/short.html (Subject index to my blog posts)

***********************************************
