Tuesday, May 28, 2024


Despite Alarmist Claims, US Hasn’t Seen An EF5 Tornado In 11 Years

An EF5 tornado is one of the most catastrophic weather events on Earth.

Monstrous twisters of this magnitude can destroy entire neighborhoods in the blink of an eye, grow to be more than a mile wide and pack winds over 200 mph — stronger than any Category 5 hurricane on record across the Atlantic basin.

On May 20, 2013, an extremely powerful tornado destroyed a huge part of Moore, Okla. Eleven years later, it remains the most recent tornado to be rated EF5, the strongest possible rating on the Enhanced Fujita Scale.

The 11-year gap is the longest since official U.S. records began in 1950.

Before the Moore tornado, the blockbuster tornado season in 2011 led to the confirmation of five EF5 twisters, including the Joplin, Missouri, EF5 that killed 161 people. A total of 50 tornadoes have been rated F5/EF5 since records began in the United States in 1950.

Because most weather instruments can’t survive a direct hit from a tornado, the Enhanced Fujita (EF) scale estimates tornado strength indirectly: NWS staff survey the damage left behind and rate it against standardized damage indicators.
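For reference, those damage indicators are translated into estimated wind speeds, and the estimated ranges map directly onto the ratings. The short Python sketch below is purely illustrative; the thresholds are the EF scale’s published estimated 3-second-gust ranges in mph, and the helper function name is invented for the example.

# Illustrative only: the EF scale's published estimated wind ranges (3-second gust, mph).
# Real ratings are assigned from surveyed damage indicators, not from measured winds.
EF_SCALE_MPH = [
    ("EF0", 65, 85),
    ("EF1", 86, 110),
    ("EF2", 111, 135),
    ("EF3", 136, 165),
    ("EF4", 166, 200),
    ("EF5", 201, float("inf")),
]

def ef_rating_from_estimated_wind(mph: float) -> str:
    """Hypothetical helper: return the EF rating whose estimated wind range contains `mph`."""
    for rating, low, high in EF_SCALE_MPH:
        if low <= mph <= high:
            return rating
    return "below EF0"  # estimated gusts under 65 mph are not rated

print(ef_rating_from_estimated_wind(210))  # winds over 200 mph, as at Moore in 2013 -> "EF5"

Because the rating comes from damage rather than from anemometers or radar, a violent tornado over open country can end up rated well below its true intensity, which is the crux of the El Reno case discussed below.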

Meteorologist Bob Henson said in 2023 that the current EF5 “drought” is hard to explain since damage estimates can be subjective.

Damage to a “well-constructed building” is the most common factor that helps the National Weather Service (NWS) confirm an EF5, yet many homes in the U.S. do not meet that criterion.

Henson quotes Tanya Brown-Giammanco, director of Disaster & Failure Studies at the National Institute of Standards and Technology, who says that many houses are missing key features to be considered wind resistant, disqualifying them from being used to determine if a twister reached EF5 status.

The Enhanced Fujita system is not likely to change from a ground-damage-based scale, Henson says, but new standards may be implemented to improve rural damage assessments based on damage to wind turbines, irrigation systems, farm silos, churches, and passenger vehicles.

The National Windstorm Impact Reduction Program at NIST is developing these standards, which would have to be adopted by NOAA’s Storm Prediction Center to change the Enhanced Fujita Scale.

Radar data, by definition measured above ground, cannot be used to rate tornadoes on the EF scale. This precedent was reaffirmed by the El Reno tornado on May 31, 2013, which tracked just south of El Reno, Oklahoma. At peak strength, Doppler radar measured winds over 300 mph.

The National Weather Service initially rated El Reno as an EF5, but subsequent damage investigations were unable to find damage indicators above EF3 since it largely tracked over open fields.

Because of the damage found, the El Reno tornado, despite being the largest twister ever recorded at 2.6 miles wide, was confirmed as an EF3.

************************************************

What the IPCC Says about Drought

ROGER PIELKE JR.

Last week, I testified before the Senate Committee on the Budget in a hearing titled, Droughts, Dollars, and Decisions: Water Scarcity in a Changing Climate.1 The hearing was the 18th in the Committee’s series on climate change this Congress, prompting the Wall Street Journal to suggest “the old-fashioned idea that the Budget Committee ought to focus on the budget.”

The hearing could easily have been held by the Senate Agriculture Committee, and indeed, almost all the questions from senators to the witnesses came from Budget Committee members who are also on the Agriculture Committee.

I was invited to testify on what the IPCC (Intergovernmental Panel on Climate Change) says about drought — and I focused my testimony on the findings of the IPCC Sixth Assessment (AR6) Working Group 1 (WG1). I always appreciate the opportunity to testify before Congress, and I thank Senators Whitehouse and Grassley for the invitation.2

Aside from my testimony, there was essentially no discussion of climate change — it was mostly about local farming and urban water management, both crucially important. All the witnesses were excellent, and the Senators asked some worthwhile questions.

Some tidbits I took away from the other witnesses:

Despite a large increase in population, Southern California has cut its water consumption by about half since the 1970s.

Almost all of the world’s carrot seeds are produced in the high desert of Oregon.

Despite variability and changes in U.S. climate, agricultural productivity has continued to increase, with no end in sight.

My testimony focused on summarizing what the IPCC AR6 Working Group 1 said about drought, with a focus on the United States (which the IPCC assesses as part of North America).

Low confidence (2 in 10) in detection of changes in drought across the U.S., with the exception of increasing “agricultural and ecological drought” in Western North America at medium confidence (5 in 10).

No ability to express any confidence in how drought may change from a 1995 to 2014 baseline under future temperature changes of >1.5C from that baseline (Note: a 1.5C change from that recent baseline is about the same as a 2.5C change from preindustrial, which is similar to a “current policies” baseline and well below a SSP2-4.5 scenario).3

In fact, the IPCC has not achieved detection of trends in drought anywhere in the world at a level consistent with the IPCC’s threshold for detection (i.e., at least very high confidence or 9 in 10). The IPCC has detected an increase in hydrological drought in the Mediterranean and North East South America with high confidence (8 in 10) but has, respectively, only medium confidence and low confidence in attribution in those two regions (5 in 10 and 2 in 10).

The IPCC does conclude with high confidence that human-caused climate change affects the hydrological cycle, and thus drought. However, achieving detection and attribution of trends in the IPCC’s various definitions of drought — both observed and projected — in the context of significant internal variability remains a challenge.

Don’t take it from me. Here is what the IPCC AR6 concluded:

"There is low confidence in the emergence of drought frequency in observations, for any type of drought, in all regions. Even though significant drought trends are observed in several regions with at least medium confidence (Sections 11.6 and 12.4), agricultural and ecological drought indices have interannual variability that dominates trends, as can be seen from their time series (medium confidence)"

In fact, published studies are lacking that explore when signals of projected changes in drought might emerge from the background of internal climate variability, under the IPCC’s framework for detection and attribution:

"Studies of the emergence of drought with systematic comparisons between trends and variability of indices are lacking, precluding a comprehensive assessment of future drought emergence."

Given the closing “jaws of the snake” due to the growing recognition of the implausibility of extreme climate scenarios, it will be interesting to see what future “time of emergence” studies say about projected changes in drought. I’ll have a post dedicated to this neglected topic in the coming weeks.

As I said at the hearing, it is easy to perform anecdotal attribution of any weather and climate event that happens anywhere on the planet (Turbulence! Home runs! Migraines!). The IPCC tells us that reality is just a bit more complicated.

*******************************************

Yes, Aratina Solar Project Will Cut Down Iconic Joshua Trees in Southern California

image from https://pbs.twimg.com/media/GMwJhgqa4AAdZBW?format=jpg

In 2023, the U.S. Fish and Wildlife Service determined that Joshua Trees are not endangered. They concluded they are unlikely to be significantly threatened in the next 50 years.

However, a study conducted in 2013 revealed that Joshua trees are experiencing a halt in reproduction across approximately half of their range within Joshua Tree National Park. As temperatures rise and conditions become drier, it is anticipated that the available habitat for Joshua trees will diminish significantly. By the end of the century, as much as 90 percent of the Joshua Tree habitat could vanish due to these environmental changes. You read that right, a reduction of 90 percent.

So, in the name of our Solar Agenda, to protect the world from greenhouse gases, let’s destroy another species and wait until it is too late to save them. I am not anti-solar. I am anti-destroying a species for the sake of an agenda.

The trouble is that the “climate crisis” narrative has given those who wish to impose a particular green energy source on us the moral authority to do so.

This particular saga began in 2021 when the Kern County Board of Supervisors approved the Aratina Solar Project despite residents’ objections.

Despite comments and concerns from residents in Boron and Desert Lake, the Kern County Board of Supervisors approved a solar farm project that will include five different sites in the East Kern County area; the board voted on the approval at its October 12th meeting in Bakersfield.

8-Minute Solar Energy’s Aratina Solar Center would provide 250 megawatts of power (enough to power 93,000 homes) to a pair of community choice organizations that contract electricity on behalf of residential customers in the Monterey Bay and Silicon Valley areas of California.

The agreements represent a new and growing market for a company that’s integrating large photovoltaic solar arrays with battery installations to provide sun power 24 hours a day at prices low enough to compete with natural gas fired power plants.

In fact, Avantus (formerly 8-Minute Solar Energy) admits on its website that it has every intention of chopping down the Joshua Trees:

The kicker…they are destroying the trees to save the trees from “climate change.”

"Avantus is working to preserve native Mojave plants like Joshua Trees while also preserving California’s ability to achieve its clean energy goals – and the economic and climate benefits that come with them. While trees will be impacted during project construction, vastly more Joshua Trees are being threatened by climate change caused by rising greenhouse gas emissions, which the Aratina solar project directly addresses."

If California had just not gutted our nuclear power capabilities, had looked at next-generation options, and hadn’t ludicrously decided “net zero” was a sensible and responsible goal, perhaps the Joshua Trees and the wildlife and habitat dependent on them would not now be threatened.

**********************************************

Australia: Deck was stacked as CSIRO estimated the cost of nuclear power

The cost of nuclear energy is twice the cost of renewables, so sayeth the Commonwealth Scientific and Industrial Research Organisation. But why is the CSIRO in the non-scientific game of providing assumption-driven estimates of the cost of generating electricity in different ways?

On the face of it, it looks like a bit of buck-passing by the Australian Energy Market Operator, which enlisted the assistance of the CSIRO some years ago. This is a task for engineers, economists and accountants – not scientists.

Modelling is not science, and ­estimating costs is also not science. By rights, the CSIRO should have declined the request. Its reputation has been markedly sullied.

Let’s consider the latest version of the CSIRO’s GenCost report. As with all modelling, it’s a case of garbage in, garbage out. The assumptions in it range from the plausible to the absolutely ridiculous.

The most glaring errors in the report are the assumptions about the upfront costs of nuclear plants, their rates of utilisation and their lifespans. The assumption on the capacity of wind power is also laughable and the assumed life­spans of both wind and solar are too long.

It looks suspiciously like a tail-wagging-the-dog exercise: how to ensure that nuclear power looks extraordinarily expensive compared with the preferred renewable energy option of the federal and state governments.

The fact that Australia is the only country among the world’s 20 largest economies not to have nuclear power didn’t seem to awaken the curiosity of the CSIRO team. Should we assume that all their governments are simply stupid for persisting with such an expensive form of generation?

And how could it be the case that a very large number of countries are now aggressively in­vesting in more zero-emissions nuclear plants?

Indeed, our main ally, the US, has a target of tripling the amount of nuclear power by 2050.

The international figures are clear: countries with high wind and solar shares in their generation of electricity actually have relatively high electricity prices. They include Germany, Britain, Spain, Denmark and Italy, as well as the states of California and South Australia. By contrast, those countries with very low renewable shares have the cheapest electricity: Russia, United Arab Emirates, Korea and India.

It is worth pausing here to briefly outline the methodology of the GenCost report. It uses the levelised cost of electricity, or LCOE, as the key metric – a measure that takes into account both the upfront cost of installation and the expected lifetime of the asset. The cost of fuel is added, which is zero for wind and solar but material for other means of generation.

The capacity factors of different means of generation are then taken into account. They should vary between 25 and 33 per cent for wind and solar but the GenCost report has onshore wind at 48 per cent and offshore wind at 52 per cent, which are both clearly errors. The capacity factor for nuclear should be in the 90s but in one scenario, the CSIRO puts the figure at 53 per cent, another clanger.

But the key is this: the LCOE is the wrong measure to use. What is required is a system-wide LCOE because of the inherent intermittency of wind and solar and the inviolable objective of 24/7 power. When the wind blows and the sun shines, the cost of generating electricity by these means is very low. But because the wind doesn’t blow all the time and the sun sets, ­expensive back-up (or firming) is required.

This back-up must be added to the cost of both wind and solar. And account must be taken of both extended wind droughts and cloudy periods – short-duration batteries will simply be inadequate. In practical terms, the option of long-­duration, affordable batteries simply doesn’t exist and affordable pumped hydro is not possible in this country.
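To make that arithmetic concrete, here is a minimal LCOE sketch in Python. The formula (annualised capital plus fixed costs, divided by annual generation, plus fuel) is the standard one, but every input below is an illustrative placeholder rather than an actual GenCost figure; the discount rate, the solar capital cost and the $40/MWh firming adder in particular are assumptions made only for the example.

# Minimal, illustrative LCOE sketch. All inputs are placeholders, not GenCost figures.

def lcoe(capital_per_kw, lifetime_years, capacity_factor,
         discount_rate=0.06, fixed_om_per_kw_yr=0.0, fuel_per_mwh=0.0):
    """Levelised cost of electricity ($/MWh) for one kilowatt of installed capacity."""
    # Capital recovery factor: annualises the upfront cost over the asset's lifetime.
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years
           / ((1 + discount_rate) ** lifetime_years - 1))
    annual_cost = capital_per_kw * crf + fixed_om_per_kw_yr
    annual_mwh = 8760 * capacity_factor / 1000.0  # MWh generated per kW per year
    return annual_cost / annual_mwh + fuel_per_mwh

# Nuclear at the $8,700/kW figure discussed below: a 60-year life at 90% utilisation
# versus a 30-year life at 53% utilisation (the lifespan and capacity-factor
# assumptions criticised in this piece).
print(round(lcoe(8700, 60, 0.90), 1))
print(round(lcoe(8700, 30, 0.53), 1))

# Solar at a 25% capacity factor (assumed $1,200/kW capital), with and without a
# notional $40/MWh firming adder standing in for backup and storage costs.
solar = lcoe(1200, 25, 0.25)
print(round(solar, 1), round(solar + 40.0, 1))

Halving the assumed capacity factor and lifetime roughly doubles the resulting figure, which is the whole point: the comparison is driven by the input assumptions, not by the formula.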

Last year’s GenCost report was a major hit job on the highly prospective small modular reactors, which are still being developed, although Canada is further down this path than other countries.

By choosing just one pilot scheme in Utah that was subsequently abandoned, the report effectively costed the worst-case scenario. It’s hard to avoid the conclusion that this was quite deliberate. This time, the decision was made to include tried and tested large-scale nuclear plants in the comparison of generating costs. The upfront costs of building nuclear plants are very substantial, and the plants can also take some years to complete. There are also quite a few examples of cost blowouts and delays – in Finland and the UK, for example.

The GenCost report uses the relatively successful example of Korea’s nuclear program to estimate the expected capital cost of a large-scale plant. The figure is put at $8700 per kilowatt, which sounds reasonable enough. But the figure is then arbitrarily doubled because it would be the “first-of-a-kind” in Australia. It is simply asserted that “FOAK premiums of up to 100 per cent cannot be ruled out”.

This is absurd. After all, Australia would be importing the expertise from experienced players were nuclear plants to be built here. And as the nuclear energy industry enjoys a significant renaissance around the world, the number of companies and the depth of talent involved are increasing markedly. By the time Australia is in a position to consent to nuclear plants, it is inconceivable that the FOAK premium would amount to a full doubling of costs. This assumption makes a substantial difference to the final results.
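The arithmetic of the assertion itself is trivial, and again the snippet below is purely illustrative:

# The asserted "up to 100 per cent" first-of-a-kind premium simply doubles the
# capital-cost input before any LCOE calculation is done.
korea_capital_per_kw = 8_700            # figure GenCost derives from Korea's program
foak_premium = 1.00                     # the asserted FOAK premium (100 per cent)
assumed_capital_per_kw = korea_capital_per_kw * (1 + foak_premium)
print(assumed_capital_per_kw)           # 17400 per kW once the premium is applied

Feeding $17,400 rather than $8,700 into the LCOE sketch above roughly doubles the capital component of the result, which is why this single assumption matters so much.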

Stung by the criticism that previous GenCost reports failed to take into account the cost of transmission needed to get renewable energy to the grid, this latest version makes some effort to do so. But instead of focusing on the entire cost of transmission, which feeds into retail prices, only the cost of additional transmission is included in the analysis. Again this is a bias in favour of renewable energy.

Of course, one of the advantages of nuclear plants is that they can be located where transmission lines already exist; the transmission investment thereby avoided should by rights be credited as reducing the cost of nuclear.

They can also last more than 80 years, even though the GenCost report bizarrely gives them a lifespan of 30 years. Solar and wind are assumed to last 25 years, which is far too long.

Of course, no serious investors would take much notice of the GenCost report or any of the other selective pieces of analysis put out by various government departments. Their analysis would be based on carefully derived figures subject to sensitivity analysis. The key now is for both the federal and state government bans on nuclear power to be lifted so the potential investors can sharpen their pencils and get to work.

***************************************

My other blogs. Main ones below

http://dissectleft.blogspot.com (DISSECTING LEFTISM )

http://edwatch.blogspot.com (EDUCATION WATCH)

http://pcwatch.blogspot.com (POLITICAL CORRECTNESS WATCH)

http://australian-politics.blogspot.com (AUSTRALIAN POLITICS)

http://snorphty.blogspot.com/ (TONGUE-TIED)

https://immigwatch.blogspot.com (IMMIGRATION WATCH)

https://awesternheart.blogspot.com (THE PSYCHOLOGIST)

http://jonjayray.com/blogall.html More blogs

*****************************************
