Monday, March 29, 2021

Economically destructive cap and trade for HFCs is here

This is a bit complicated so bear with me. To begin with, Biden’s avalanche of climate scare executive orders included one telling the State Department to prepare the Kigali Amendment to the Montreal Protocol on Substances that Deplete the Ozone Layer for submission to the Senate, for ratification.

The Kigali Amendment has nothing to do with ozone depletion (a fanciful tale in itself), quite the contrary in fact. The 1987 Montreal Protocol mandated the phase out of CFCs, the primary refrigerant and aerosol propellant at the time. CFCs were globally replaced with HFCs, at great expense and bother.

The 2016 Kigali Amendment now mandates the phase out of HFCs. I am not making this up.

HFCs do not threaten the ozone layer, so they have nothing to do with the Montreal Protocol. But the Protocol community decided to do what is called “mission creep”. They crept over to global warming, where HFCs are considered a problem. They have what is called a high “global warming potential” or GWP, so they too have to go.

Reportedly Obama and Kerry played a big role in creeping the Protocol, but they never submitted it for Senate ratification, knowing it would never get past the Republicans.

But now, to steal a great line: A funny thing happened on the way to the Senate.

Remember the giant Omnibus Appropriations Act passed last December? It funded the federal government and the Covid stimulus to the tune of $2.3 trillion. It also, as usual, included some riders that probably could not pass by themselves.

Well on page 1074 we find the “American Innovation and Manufacturing Act” or AIM. Incredibly, AIM includes the entire Kigali Amendment. (Still not making it up.) Not by name, mind you, but all the HFC phaseout rules and timetable, pretty much word for word. I wonder if the Senators who voted for this addition to the Omnibus knew they were letting Kigali in the back door?

Simply put, this is a cap and trade system for HFCs, with a cap that declines over time to the point where almost no HFCs are allowed in America.

So, for example, AIM and Kigali establish the same cap. Here it is almost funny. Kigali was negotiated in 2016, so the cap is mostly based on how much HFC was made and imported in 2011-13. Back then this was recent supply data. AIM uses the same dates, even though the present HFC supply situation may be very different. The only explanation for AIM using 8-to-10-year-old data to establish the cap today is that it is Kigali by the back door.

So the Senate will be asked to ratify something that is already law. I hope they refuse but maybe it is not worth the filibuster. To paraphrase another great line, the greens don’t need no stinking Senate. They got Kigali in the back door already. Note that China, the world’s biggest producer of HFCs, has not ratified Kigali or implemented it by law.

In any case we now have before us a looming declining-cap and trade system for HFCs. EPA is moving quickly and is expected to propose the regulations for this system shortly. There are some big potential problems.

Keep in mind that HFCs are used in vast quantities in applications like these:

Air conditioning in cars, homes and big buildings

Refrigerators, freezers and chillers

Aerosol sprays

Electric power transformers

Heat pumps

Structural foam

Fire suppression

Note too that the leading candidates for HFC replacement are HFOs, which presently have some serious problems. For example, they can be flammable, and they do not last all that long. In fact, one reason they do not have a high GWP is that they self-destruct quickly in air.

By far the biggest problem with AIM is that outdated cap. Technically it is called a “baseline” which sounds friendlier than a cap. But the HFC allowances EPA will distribute, in ever decreasing amounts, are from the baseline, so it is the cap.

In fact there are two big problems here. First of all we really have very little information on HFC production and import 8 to 10 years ago. EPA recently acknowledged this, again in an almost funny way.

They issued what is called a “Notice of data availability” or NODA. Normally an agency issues a NODA to announce the availability of new data, hence the name. But EPA’s NODA says they do not have good data about HFCs in 2011-13 and asks for suggestions. So this is really an EPA “Notice of data unavailability” or NODU.

Second it looks like HFC use today is much greater than it was 8-10 years ago. It is crucial that the AIM baseline developed by EPA be accurate, and especially that EPA’s estimates are not significantly lower than reality. The baseline determines the allocations of allowances and these must be adequate, lest there be a severe shortage of HFCs.

To take an extreme example, suppose the EPA consumption baseline is just half of what is needed for business as usual. In that case the initial, modest 10% reduction, which takes effect immediately, becomes a destructive 55% cut. If EPA is low by just 20%, the 10% reduction still balloons to a 28% cut. Such a shortage of allowable HFCs could wreak havoc with certain industries and important products.
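For readers who want to check the arithmetic, here is a quick sketch (the function name is mine, purely for illustration):

```python
def effective_cut(true_demand, baseline, mandated_reduction):
    """Actual supply shortfall when allowances are issued from an
    underestimated baseline rather than from real demand."""
    allowances = baseline * (1 - mandated_reduction)
    return 1 - allowances / true_demand

# Baseline matches real demand: the mandated 10% cut is just 10%.
print(round(effective_cut(100, 100, 0.10), 2))  # 0.1

# Baseline only half of real demand: the 10% cut becomes a 55% cut.
print(round(effective_cut(100, 50, 0.10), 2))   # 0.55

# Baseline 20% low: the 10% cut balloons to 28%.
print(round(effective_cut(100, 80, 0.10), 2))   # 0.28
```

The point generalizes: whatever percentage the baseline falls short of reality compounds directly with the mandated reduction.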

Then there is the problem of how EPA can even find all of the companies that use HFCs or import HFC containing products, much less how they can allocate the increasingly limited allowances to them. With SO2 allowances it was relatively easy because we knew where the big coal-fired power plants were and they did not change over time.

In contrast, HFC use and importation is a highly dynamic situation. For just one small example, roughly half of the cars sold in America are imported and all of them contain HFCs. In the case of aerosol sprays the allowance allocation problem is mind boggling.

There are other major problems, some not in Kigali, but this is enough to make the point. We are looking at a cap and trade phaseout of a ubiquitous harmless chemical, all in the name of climate change.

AIM is climate craziness personified, a prescription for disaster, especially economically destructive shortages.


The Social Cost of Carbon and Climate Sensitivity Is Model Manipulation at Its Finest

The “social cost of carbon” is a calculation that the Biden administration is looking to use to justify stringent regulation of carbon dioxide emissions.

Missouri Attorney General Eric Schmitt—joined by Arkansas, Arizona, Indiana, Kansas, Montana, Nebraska, Ohio, Oklahoma, South Carolina, Tennessee, and Utah—has now filed a lawsuit, arguing that the use of this metric in policy would constitute an overreach of government power.

Carbon dioxide is an odorless gas that is the basis for almost all plant life on earth. Of course, carbon dioxide emissions have slightly warmed the surface temperature of the planet, but on the other hand, access to plentiful energy has been accompanied by a doubling of life expectancy in the developed world and an elevenfold growth in personal wealth, as noted in the 2016 book “Lukewarming.”

A recent commentary that one of us (Kevin Dayaratna) published, titled “Why the Social Cost of Carbon Is the Most Useless Number You’ve Never Heard Of,” presented years of research on the topic conducted at The Heritage Foundation’s Center for Data Analysis.

He noted how easy it is to artificially raise the social cost of carbon by manipulating various assumptions in the calculation, including the discount rate, which is essentially an estimate of how much money invested today in, say, the stock market will grow in the future.

The Office of Management and Budget has recommended calculating things like the social cost of carbon with discount rates of 3%, 5%, and 7%. Obviously, at the higher rates, the social cost of carbon becomes pretty low. Using a 3% discount rate, the Biden administration hiked the social cost of carbon up to $51 per ton, a significant increase from the Trump administration’s $7 per ton.
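The mechanics here are ordinary compound discounting: a dollar of climate damage decades from now is worth far less today at a higher discount rate. A rough sketch (the figures are illustrative only, not output from the actual integrated assessment models):

```python
def present_value(future_damage, rate, years):
    """Discount a future dollar amount back to today's value."""
    return future_damage / (1 + rate) ** years

# $100 of damage 50 years out, at OMB's three suggested rates:
for rate in (0.03, 0.05, 0.07):
    print(f"{rate:.0%}: ${present_value(100, rate, 50):.2f}")
# 3%: $22.81
# 5%: $8.72
# 7%: $3.39
```

The same future damage shrinks nearly sevenfold between the 3% and 7% rates, which is why the choice of discount rate drives the social cost of carbon so strongly.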

Even that might not do for the Biden administration, which could rely upon the recent arguments made by University of Chicago economist Michael Greenstone, who said that the discount rate should be 2% or lower.

Additionally, in order to determine the social cost of carbon, we need to have a good idea of how much the planet’s surface will warm under various policy scenarios.

To calculate this level, scientists have for decades used computer models to find the “sensitivity” of climate to an arbitrary amount of carbon dioxide emissions. This sensitivity is usually the calculated warming, in terms of temperature, for a doubling of atmospheric carbon dioxide.

Here is a dirty little secret that few are aware of: All those horrifying future temperature changes that grace the front pages of papers of record aren’t really the predicted warming above today’s level. Instead, they are the difference between two models of climate change.

The “base climate” isn’t the observed global temperature at a given point in time. Instead, it is what a computer model simulates temperatures to be prior to any significant changes in carbon dioxide.

Reality need not apply to these calculations. And there are sometimes very big differences between the base models and reality, especially in the high latitudes of both hemispheres, and over the world’s vast tropics.

The usual procedure is then to instantaneously quadruple carbon dioxide and let the model spin up to an equilibrium climate. Then—hold onto your hat—that number is divided by two, taking advantage of the fact that warming increases linearly with each successive doubling of carbon dioxide (that is, logarithmically with concentration), something that has been known for a long time. The final figure is called the equilibrium climate sensitivity to doubled carbon dioxide.
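The division works because equilibrium warming is proportional to the number of CO2 doublings, so a quadrupling is exactly two doublings. A minimal sketch, using an illustrative sensitivity of 3 C per doubling:

```python
import math

def warming(sensitivity, concentration_ratio):
    """Equilibrium warming for a given CO2 concentration ratio,
    assuming warming is linear per doubling (logarithmic in CO2)."""
    return sensitivity * math.log2(concentration_ratio)

S = 3.0  # illustrative sensitivity, deg C per doubling of CO2

quadrupled = warming(S, 4.0)  # quadrupling = two doublings
print(quadrupled)             # 6.0
print(quadrupled / 2)         # 3.0 -- recovers the per-doubling sensitivity
```

Halving the quadrupled-CO2 warming therefore recovers the per-doubling figure exactly, under the logarithmic assumption.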

With regard to the equilibrium climate sensitivity, climate science is very odd: The more money we spend studying it, the more uncertain our forecasts become.

This fact is becoming increasingly obvious as a new suite of models is emerging that will be incorporated in the next climate science report from the U.N.’s Intergovernmental Panel on Climate Change, to be released next year.

The range of the equilibrium climate sensitivity has seen no real narrowing since a 1979 National Academy of Sciences report, “Carbon Dioxide and Climate: A Scientific Assessment,” chaired by Jule Charney of the Massachusetts Institute of Technology.

The “Charney Sensitivity,” as it came to be called, was 1.5-4.5 C for the lower atmospheric warming that would be caused by a doubling of carbon dioxide.

Subsequent assessments, such as some of the serial “scientific assessments” of the Intergovernmental Panel on Climate Change, gave the same range, or something very close.

Periodically, the World Climate Research Programme runs what it calls “coupled model intercomparison projects.” The penultimate one, used in the 2013 Intergovernmental Panel on Climate Change assessment, contained 32 families of models with a sensitivity range of 2.1-4.7 C and a mean value of 3.4 C—i.e., warmer lower and mean values than Charney.

Nevertheless, the Intergovernmental Panel on Climate Change rounded this range back to the good old 1.5-4.5 C, because there was some skepticism about the warmer models.

Despite these differences between various base climate models and the doubled carbon dioxide calculation, reality-based calculations of the equilibrium climate sensitivity by other researchers yield much lower sensitivities, between 1.4 and 1.7 C.

The new coupled model intercomparison project model suite, on the other hand, displays an even larger range of sensitivity, beyond what has been observed. The range across the models currently available (which is most of them) is 1.8-5.6 C, with an estimated mean of 4 C—and this is likely what the Biden administration will use to determine the social cost of carbon.

So, sadly, the new coupled model intercomparison project models are worse than the older ones.

A 2017 study shows that, with one exception, the older coupled model intercomparison project models made large systematic errors over the entire globe’s tropics. The exception was a Russian model, which also had the lowest sensitivity of all, at 2.05 C.

Last year, researchers examined the new coupled model intercomparison project model suite, and what they found was not good:

Rather than being resolved, the problem has become worse, since now every member of the CMIP6 generation of climate models exhibits an upward bias in the entire global troposphere as well as in the tropics.

A very recent paper just published in Geophysical Research Letters indicates that it may be that new estimates of the enhancements of clouds by human aerosol emissions are the problem. Interestingly, the model that has the least cloud interaction is the revised Russian model, and its sensitivity is down to 1.8 C, but it nonetheless still overpredicts observed global warming.

When it became apparent that the new models were predicting even more warming than their predecessors, Paul Voosen, the climate correspondent at Science magazine, interviewed a number of climate scientists and found that the new, “improved” rendition of the cloud-aerosol interaction is causing real problems, either completely eliminating any warming in the 20th century or producing far too much.

One of the scientists involved, Andrew Gettelman, told Voosen that “it took us a year to work that out,” proving yet again that climate scientists modify their models to give what French modeler Frederic Hourdin called an “anticipated acceptable result.”

Acceptable to whom? Hourdin’s landmark paper clearly indicates that it is scientists, not objective science, who subjectively decide how much warming looks right.

The implications of the systematic problems with coupled model intercomparison project models and other manipulated models for the social cost of carbon may be big: The Biden administration will rely on these models to beef up the social cost of carbon as well.

In fact, the Obama administration did just that, using an outdated equilibrium climate sensitivity distribution that was not grounded in reality and that inflated its social cost of carbon estimates.

Indeed, peer-reviewed research conducted by Kevin Dayaratna, Pat Michaels, Ross McKitrick, and David Kreutzer in two separate journals has illustrated that under reasonable and realistic assumptions for climate sensitivity, alongside other assumptions, the social cost of carbon may effectively be zero or even negative.

It is now apparent that the reason for using the social cost of carbon to begin with is very simple: to be able to control the energy, agricultural, and industrial sectors of the economy, which will result in big costs for ordinary Americans with little to no climate benefit in return.

So altogether, we have one manipulated class of models—models determining climate sensitivity—likely being used as a basis for manipulating the social cost of carbon. The consequences for the social cost of carbon’s uncertainty are profound.

As a result, the public should be very cautious about accepting new calculations of the social cost of carbon. Although the social cost of carbon is based on an interesting class of statistical models, its use in policy should also serve as a case study of model manipulation at its finest.


Canada: Supreme Court Rules Mandatory Carbon Price Constitutional

The Supreme Court of Canada, in a pivotal victory for Prime Minister Justin Trudeau’s climate policy, has ruled that the government’s decision to mandate a national carbon price to reduce greenhouse gas emissions is constitutional.

In a split 6-3 decision issued Thursday morning, Canada’s highest court ruled in favor of the nation’s federal government following a hotly contested legal battle over the government’s decision to impose a minimum fuel charge on all distributors and producers of carbon-based fuel in the country. The move, approved by parliament in 2018, received immediate pushback from a number of Canada’s provinces, which claimed the decision was a blatant overreach by the government and argued that such decisions fall solely under their provincial authority.

These challenges resulted in a lengthy court battle, numerous appeals and conflicting rulings before ultimately landing at the feet of the Supreme Court, which has now definitively found that the decision to issue the carbon price was legal.

Chief Justice Richard Wagner, writing for the majority, said that at the heart of the legal battle rests the stark reality that climate change is a very real threat to the safety and wellbeing of humanity.

“Climate change is real,” Wagner wrote in Thursday’s ruling. “It is caused by greenhouse gas emissions resulting from human activities, and it poses a grave threat to humanity’s future. The only way to address the threat of climate change is to reduce greenhouse gas emissions.”

Because the threat of climate change is so severe, Wagner says, the gravity of the problem gives the government authority to act under the “peace, order and good government” clause of the Canadian Constitution. The clause, commonly referred to as the POGG clause, is rarely successfully cited as the basis for governmental action but does nonetheless give federal leaders the authority to act on issues that relate to the entire nation.

Wagner says this is where Trudeau’s carbon price prevails. While the provinces — namely the more conservative or oil-centric Alberta, Ontario and Saskatchewan — claim that managing natural resources to combat climate change is something they can do independently, POGG can be invoked when the provinces are clearly unable to come together and fix the problem themselves.

The chief justice says that allowing provinces to handle this on their own would only hold Canada back from combating climate change as a collective nation. Even if the majority of provinces were able to coordinate their efforts, it would only take a small number of provinces unwilling to impose a minimum carbon price to undermine the actions of the rest of the country.

The ruling states that only through a national and unified approach, in which all provinces are bound to playing by the same carbon pricing rules, does Canada have a chance at successfully reducing its carbon footprint.

The judge notes that provinces are still allowed to regulate themselves when it comes to their carbon pricing systems and can still choose their own regulatory frameworks when it comes to emissions standards. All they have to do is comply with the minimum standards laid out by the federal government or else they risk being slapped with an increased carbon tax.

Wagner also notes that carbon pricing works. The justice writes that there is a “broad consensus among international bodies” that setting these minimum prices can significantly cut back on greenhouse emissions from carbon, the idea being that the more carbon costs, the less people will actually use it.

The ruling’s mention of an international consensus regarding the effectiveness of carbon pricing methods could have some possible implications for other nations that are considering similar measures. Implementing a carbon pricing system in the United States, for instance, has often been cited as a crucial step in overhauling America’s energy policies, but has so far failed to get off the ground.

Thursday’s ruling officially gives Trudeau’s climate policy the ability to forge ahead with imposing minimum carbon prices, which will continue to rise throughout the next decade to further discourage carbon use. The minimum price is currently set at $30 per metric ton of emissions (about 1.1 U.S. tons), and the government says it will continue to raise the minimum over the next few years until it hits $170 per metric ton by 2030.


We should learn what lessons from Fukushima?

Lesson #1: People died from forced evacuations, not from radiation

Dr. Kelvin Kemm

A decade has passed since the Great East Japan Earthquake, and the name Fukushima is etched into history. But few people know the truth of what happened. The phrase, “the lessons learned from Fukushima,” is well-known. But how do people implement them, if they don’t know what happened, or what lessons they should actually learn?

It was after lunch on 11 March 2011 that a giant earthquake occurred 72 kilometers (45 miles) off the Oshika Peninsula in Japan. It registered magnitude 9.0, making it the largest ’quake ever recorded in Japan. The undersea ground movement, over 30 km (18 miles) beneath the ocean’s surface, lifted up a huge volume of water, like an immense moving hill. Meanwhile, the ground shockwave travelled toward the land at high speed. It struck Japan and shook the ground for six terrifying minutes.

The shock wave travelled under 11 nuclear reactors, including two separate Fukushima complexes: Fukushima-Daini and Fukushima-Daiichi. (Daiichi means ‘Number One’ and Daini ‘Number Two’.) All 11 reactors shut down, as they were designed to do, and no doubt all the reactor operators breathed a great sigh of relief. It was premature.

The mound of sea water was still traveling. As the water “hill” entered shallow water, nearer the land, it was lifted up into a towering wave as high as 40 meters (130 feet!) in places. Then, some 50 minutes after the earthquake, the tsunami struck the Fukushima-Daiichi nuclear power station. Some kilometres away, when water struck the Fukushima-Daini nuclear power station, it was “only” 9 m (30 ft) high, which was not as devastating as at Daiichi. Daini did not make it into the news.

The water jumped the protective sea walls at Fukushima-Daiichi. The sighs of relief from a half hour before turned into concern and dread. Over at the Fukushima-Daini power station, 12 km (7 mi) to the south, water also caused damage to machinery, but the reactors were not harmed. There was no risk of radiation release, so the Daini power station was of no interest to the international media. Daini was safely brought to “cold shutdown” after two days.

As a result, over the past decade, any reference to “Fukushima” has meant only the Daiichi power station and not the other one.

The devastating tsunami swept up to 10 km (6 mi) inland in places, washing away buildings, roads, and telecommunication and power lines. Over 15,000 people were killed, mainly by drowning.

Although all the nuclear reactors had shut down to a state known as “hot shutdown,” the reactors were still very hot and needed residual cooling for many hours after the urgent fast shutdown. People instinctively know not to put their hands on the engine block of a car right after it has been switched off. Nuclear reactors are the same and need to cool down until they reach the safe state known as “cold shutdown.”

A nuclear reactor has pumps that send water through the reactor until it cools. But the Fukushima electrical pumps failed, because the tsunami had washed away the incoming electrical lines. So the reactor system automatically switched to diesel-driven generators to keep the cooling pumps going; but the water had washed away the diesel fuel supply, meaning the diesels worked for only a short while. Then it switched to emergency batteries; but the batteries were never designed to last for days, and could supply emergency power for only about eight hours.

The hot fuel could not be cooled, and over the next three or four days the fuel in three reactors melted, much like a candle melts.

The world media watched, and broadcast the blow-by-blow action. Japanese authorities started to panic under the international spotlight. The un-circulating cooling water was boiling off inside the reactors, exposing the hot fuel to steam; the chemical reaction between the two produced hydrogen gas. As the steam pressure rose, the engineers decided to open valves to release the pressure. That worked as planned, but it released the hydrogen as well.

Hydrogen, being light, rose up to the roof, where the ventilation system was not working, because there was no electricity. After a while some stray spark ignited the hydrogen which exploded, blowing the lightweight roof off the building right in front of the world’s TV cameras. The Fukushima news just became much more dramatic. Authorities were desperate to show the world some positive action.

They progressively ordered the evacuation of 160,000 people living around the Fukushima neighbourhood. That was a mistake. As days and weeks passed, it materialized that not one single person was killed by nuclear radiation. Not one single person was even injured by nuclear radiation, either. Even today, a decade later, there is still no sign of any longer-term radiation harm to any person or animal. Sadly, however, people did die during the forced evacuation.

So one of the lessons learned from Fukushima is that a fleet of nuclear reactors can be struck by the largest earthquake and tsunami ever recorded, and nobody gets harmed by nuclear radiation.

Another lesson learned is that an evacuation order issued too hastily did harm and kill people.

World Nuclear Association Director-General Dr. Sama Bilbao y León said: “The rapidly implemented and protracted evacuation has resulted in well-documented significant negative social and health impacts. In total, the evacuation is thought to have been responsible for more than 2,000 premature deaths among the 160,000 who were evacuated. The rapid evacuation of the frail elderly, as well as those requiring hospital care, had a near-immediate toll.” [emphasis added]

She added: “When facing future scenarios concerning public health and safety, whatever the event, it is important that authorities take an all-hazards approach. There are risks involved in all human activities, not just nuclear power generation. Actions taken to mitigate a situation should not result in worse impacts than the original events. This is particularly important when managing the response to incidents at nuclear facilities – where fear of radiation may lead to an overly conservative assessment and a lack of perspective for relative risks.”

Thus, a decade later, we can contemplate the cumulative lessons learned. Above all, they are that nuclear power is far safer than anyone had thought. Even when dreaded core meltdowns occurred, and although reactors were wrecked, resulting in a financial disaster for the owners, no people were harmed by radiation.

We also learned that, for local residents, it would have been far safer to stay indoors in a house than to join the forced evacuation. We also learned that governments and authorities must listen to the nuclear professionals, and not overreact, even though the television news cameras look awfully close.

Fukushima certainly produced some valuable lessons. Governments, news media and the public need to learn the correct lessons from them.

Dr Kelvin Kemm is a nuclear physicist and is CEO of Stratek Business Strategy Consultants, a project management company based in Pretoria. He conducts business strategy development and project planning in a wide variety of fields for diverse clients. Contact him at





Thursday, March 25, 2021

The Story About Offshore Oil Drilling Environmentalists DESPERATELY Don't Want You to Hear

“Louisiana officials say the state’s oil and gas industry is in danger. This comes after President Joe Biden cancelled a March oil lease sale in the Gulf of Mexico. Nearly 80 million acres of available leases would have been sold this week. The damage to Louisiana’s (and the nation’s) oil and gas companies started in January when President Biden signed an executive order banning all new oil and gas leases on public land and waters for 60 days," reports KLFY.

Now for the cheering:

“Cancelling this huge offshore Gulf oil auction helps protect our climate and life on Earth. President Biden understands the urgent need to keep this oil in the ground…This is a great step toward phasing out all offshore drilling and bringing environmental justice to the Gulf Coast and Alaska. We need to help restore coastal communities and marine life," the Center for Biological Diversity said in a statement.

And speaking of “marine life”: if bona-fide science has crowned “Global Warmists” with 10-foot dunce caps, then over half a century of scientific evidence has crowned anti-offshore drilling activists with 50-foot dunce caps. That offshore oil drilling, far from being an environmental disaster, is empirically an environmental bonanza has been pounded home with a vengeance in study after study. The science, you might say, is settled. To wit:

According to the Energy Information Administration, "Gulf of Mexico federal offshore oil production accounts for 17% of total U.S. crude oil production." Yet with over 3,000 of the more than 4,000 offshore oil production platforms in the Gulf of Mexico off her coast, Louisiana provides almost a third of North America’s commercial fisheries.

A study by LSU’s sea grant college showed that 70 percent of Louisiana’s offshore fishing trips target these structures. “Oil platforms as artificial reefs support fish densities 10 to 1000 times that of adjacent sand and mud bottom, and almost always exceed fish densities found at both adjacent artificial reefs of other types and natural hard bottom,” revealed a study by Dr. Bob Shipp, professor at the Marine Sciences department of the University of South Alabama in Mobile, Alabama. “Evidence indicates that massive areas of the northwestern Gulf of Mexico were essentially empty of Red Snapper stocks for the first hundred years of the fishery. Subsequently, areas in the western Gulf have become the major source of red snapper, concurrent with the appearance of thousands of petroleum platforms.”

In brief, “villainous” Big Oil produces marine life at rates that puts to shame “wondrous” Earth Goddess Gaia. “The fish Biomass around an offshore oil platform is ten times greater per unit area than for natural coral reefs,” also found Dr. Charles Wilson of LSU’s Department of Oceanography and Coastal Science (emphasis added). "Ten to thirty thousand adult fish live around an oil production platform in an area half the size of a football field.”

But you’re very conveniently “forgetting” the infamous BP oil spill! comes the retort from Environmentalist Whackos.

Glad you mentioned that. Because only one year after the infamous spill, the FDA’s Gulf Coast Seafood Laboratory, the National Oceanic and Atmospheric Administration’s National Seafood Inspection Laboratory, the Louisiana Department of Wildlife and Fisheries, the Louisiana Department of Health and Hospitals, along with similar agencies from neighboring Gulf coast states, have methodically and repeatedly tested Gulf seafood for cancer-causing “polycyclic aromatic hydrocarbons.”

“Not a single sample [for oil or dispersant] has come anywhere close to levels of concern,” reported Olivia Watkins, executive media advisor for the Louisiana Department of Wildlife and Fisheries.

“All of the samples have been 100-fold or even 1,000-fold below all of these levels,” reported Bob Dickey, director of the FDA’s Gulf Coast Seafood Laboratory. “Nothing ever came close to these levels.”

That this proliferation of seafood in the Gulf of Mexico came because – rather than in spite – of the oil production rattled many environmental cages and provoked a legion of scoffers.

Amongst the scoffers were some Travel Channel producers, fashionably greenish in their views. They read these claims in a book by yours truly—"The Helldiver’s Rodeo”—that Publishers Weekly hailed as “highly-entertaining!” (Ted Nugent’s blurb certainly didn’t help against their scoffing!)

The book describes an undersea panorama that (if true) could make an interesting show for the network, they concluded, while still scoffing. They scoffed as we rode in from the airport. They scoffed over raw oysters, grilled redfish and seafood gumbo that night. More scoffing through the Hurricanes at Pat O’Brien’s. They scoffed even while suiting up in dive gear and checking the cameras as we tied up to an oil platform 20 miles in the Gulf off the southeast Louisiana coast.

But they came out of the water bug-eyed and indeed produced and broadcast a Travel Channel program showcasing a panorama that turned on its head every environmental superstition against offshore oil drilling. Schools of fish filled the water column from top to bottom – from 6-inch blennies to 12-foot sharks. Fish by the thousands. Fish by the ton.

The cameras were going crazy. Do I focus on the shoals of barracuda? Or that cloud of jacks? On the immense schools of snapper below, or on the fleet of tarpon above? How ’bout this – whoa – hammerhead! We had some close-ups, too, of coral and sponges, the very things disappearing off the pampered reefs of Florida, a state that bans offshore oil drilling. Off Louisiana they sprout in colorful profusion from the huge steel beams, acres of them. You’d never guess this was part of that unsightly structure above. The panorama of marine life around an offshore oil platform staggers anyone who puts on goggles and takes a peek, even (especially!) the most worldly scuba divers. Here’s a video peek at this seafood bonanza.


Toyota Warns (Again) About Electrifying All Autos. Is Anyone Listening?

Depending on how and when you count, Japan’s Toyota is the world’s largest automaker. According to Wheels, Toyota and Volkswagen vie for the title, each taking the crown from the other as the market moves. That’s despite Volkswagen’s inherent advantage of sporting 12 brands to Toyota’s four: Audi, Lamborghini, Porsche, Bugatti, and Bentley are all part of the Volkswagen brand family.

GM, America’s largest automaker, is about half Toyota’s size thanks to its 2009 bankruptcy and restructuring. Toyota is also a major car manufacturer in the United States: in 2016 it made about 81% of the cars it sold in the U.S. right here, in its nearly half a dozen American plants. If you’re driving a Tundra, RAV4, Camry, or Corolla, it was probably American-made in a red state. Toyota was among the first to introduce gas-electric hybrid cars into the market, with the Prius twenty years ago. It hasn’t been afraid to change the car game.

All of this is to point out that Toyota understands both the car market and the infrastructure that supports it perhaps better than any other manufacturer on the planet. It hasn’t grown its footprint through acquisitions, as Volkswagen has, and it hasn’t undergone bankruptcy and bailout as GM has. Toyota has grown by building reliable cars for decades.

When Toyota offers an opinion on the car market, it’s probably worth listening to. This week, Toyota reiterated an opinion it has offered before. That opinion is straightforward: The world is not yet ready to support a fully electric auto fleet.

Toyota’s head of energy and environmental research Robert Wimmer testified before the Senate this week, and said: “If we are to make dramatic progress in electrification, it will require overcoming tremendous challenges, including refueling infrastructure, battery availability, consumer acceptance, and affordability.”

Wimmer’s remarks come on the heels of GM’s announcement that it will phase out all gasoline internal combustion engines (ICE) by 2035. Other manufacturers, including Mini, have followed suit with similar announcements.

Tellingly, both Toyota and Honda have so far declined to make any such promises. Honda is the world’s largest engine manufacturer when you take into account the boat, motorcycle, lawnmower, and other engines it makes outside the auto market. Honda competes in those markets with Briggs & Stratton, and faces the increasing electrification of lawnmowers, weed trimmers, and the like.

Wimmer noted that while manufacturers have announced ambitious goals, just 2% of the world’s cars are electric at this point. For reasons of price, range, infrastructure, and affordability, buyers continue to choose ICE over electric, and that’s even when electric cars are often subsidized with tax breaks to bring price tags down.

The scale of the switch hasn’t even been introduced into the conversation in any systematic way yet. According to FinancesOnline, there are 289.5 million cars just on U.S. roads as of 2021. About 98 percent of them are gas-powered. Toyota’s RAV4 took the top spot for purchases in the U.S. market in 2019, with Honda’s CR-V in second. GM’s top seller, the Chevy Equinox, comes in at #4 behind the Nissan Rogue. This is in the U.S. market, mind. GM only has one entry in the top 15 in the U.S. Toyota and Honda dominate, with a handful each in the top 15.

Toyota warns that the grid and infrastructure simply aren’t there to support the electrification of the private car fleet. A 2017 U.S. government study found that we would need about 8,500 strategically placed charge stations to support a fleet of just 7 million electric cars. Seven million is about six times the current number of electric cars, but no one is talking about supporting just 7 million cars. We should be talking about powering about 300 million within the next 20 years, if all manufacturers follow GM and stop making ICE cars.
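The scale implied by those numbers can be checked with quick arithmetic. A minimal sketch, assuming the station-to-car ratio from the 2017 study simply holds constant (it almost certainly would not, but it gives a rough floor):

```python
# Back-of-envelope extrapolation of charging infrastructure needs,
# using the figures quoted above: ~8,500 stations per ~7 million EVs.
stations_in_study = 8_500
cars_in_study = 7_000_000

target_fleet = 300_000_000  # rough size of a fully electric U.S. fleet

# Assume the station-to-car ratio stays constant (a simplification:
# real needs depend on charger speed, home charging, and usage patterns).
stations_needed = stations_in_study * target_fleet / cars_in_study
print(f"Stations needed at the same ratio: {stations_needed:,.0f}")
```

That works out to well over 360,000 stations, roughly forty times what the study contemplated.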

Simply put, we’re gonna need a bigger energy boat to deal with connecting all those cars to the power grids. A LOT bigger.

But instead of building a bigger boat, we may be shrinking the boat we have now. The power outages in California and Texas — the largest U.S. states by population and by car ownership — exposed problems meeting power needs even at current usage levels. Increasing use of wind and solar, neither of which can be dispatched to meet demand, and both of which have proven unreliable in a crisis, has driven some coal and natural gas generators offline. Wind simply runs counter to needs: it generates too much power when we tend not to need it, and too little when we need more. The storage capacity to account for this doesn’t exist yet.

We will need much more generation capacity to power about 300 million cars if we’re all going to be forced to drive electric cars. Whether we’re charging them at home or charging them on the road, we will be charging them frequently. Every gas station you see on the roadside today will have to be wired to charge electric cars, and charge speeds will have to be greatly increased. Current technology enables charges in “as little as 30 minutes,” according to Kelley Blue Book. That best-case-scenario fast charging cannot be done on home power. It uses direct current and specialized systems. Charging at home on alternating current can take anywhere from a few hours to overnight to fill the battery, and will increase the home power bill.

That power, like all electricity in the United States, comes from generators using natural gas, petroleum, coal, nuclear, wind, solar, or hydroelectric power according to the U.S. Energy Information Administration. I left out biomass because, despite Austin, Texas’ experiment with purchasing a biomass plant to help power the city, biomass is proving to be irrelevant in the grand energy scheme thus far. Austin didn’t even turn on its biomass plant during the recent freeze.

Half an hour is an unacceptably long time to spend at an electron pump. It’s about 5 to 10 times longer than a current trip to the gas pump tends to take when pumps can push 4 to 5 gallons into your tank per minute. That’s for consumer cars, not big rigs that have much larger tanks. Imagine the lines that would form at the pump, every day, all the time, if a single charge time isn’t reduced by 70 to 80 percent. We can expect improvements, but those won’t come without cost. Nothing does. There is no free lunch. Electrifying the auto fleet will require a massive overhaul of the power grid and an enormous increase in power generation. Elon Musk recently said we might need double the amount of power we’re currently generating if we go electric. He’s not saying this from a position of opposing electric cars. His Tesla dominates that market and he presumably wants to sell even more of them.
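The pump-versus-plug comparison above can be made concrete. A rough sketch, with the tank size an assumed illustrative figure rather than any particular car’s:

```python
# Rough comparison of refueling throughput, using the figures above.
gas_flow_gal_per_min = 4.5   # pumps push 4 to 5 gallons per minute
tank_size_gal = 13           # assumed typical consumer-car tank
gas_fill_minutes = tank_size_gal / gas_flow_gal_per_min

fast_charge_minutes = 30     # best-case DC fast charge quoted above

ratio = fast_charge_minutes / gas_fill_minutes
print(f"Gas fill: {gas_fill_minutes:.1f} min; a fast charge is ~{ratio:.0f}x longer")
```

On these assumptions a gas fill takes about three minutes, so even best-case fast charging ties up a “pump” for roughly ten times as long, which is where the queuing problem comes from.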

Toyota has publicly warned about this twice, while its smaller rival GM is pushing to go electric. GM may be virtue signaling to win favor with those in power in California and Washington and in the media. Toyota’s addressing reality and its record is evidence that it deserves to be heard.

Toyota isn’t saying none of this can be done, by the way. It’s just saying that so far, the conversation isn’t anywhere near serious enough to get things done.


Nations Aren’t Acting as If Climate Change Poses Existential Crisis

Political leaders and media personalities are fond of saying climate change poses an existential threat to humans and the planet. The weight of scientific evidence doesn’t support this oft-made claim.

Despite what is reported almost daily by the mainstream media, data from the United Nations Intergovernmental Panel on Climate Change (IPCC) and the U.S. National Oceanic and Atmospheric Administration (NOAA) show no increase in extreme weather events as the earth has modestly warmed over the past 150 years. Indeed, IPCC and NOAA data show the number of extreme cold spells, drought, floods, heatwaves, hurricanes, tornadoes, and wildfires have all either declined modestly or remained relatively stable since the late 1870s.

Despite these irrefutable facts, leaders from nations around the world have signed multiple international agreements, the latest being the 2015 Paris climate agreement, intended to avert a supposed pending climate disaster.

However, their actions do not match their words. The U.N. recently reported that the same political leaders who publicly signed the Paris Climate Agreement, committing their countries to restrict emissions, have enacted domestic policies that actually increase emissions.

As of February 26, the U.N. says only 75 of the more than 190 countries that have ratified the Paris Climate Agreement have tendered firm commitments and detailed plans to cut emissions, despite having committed to deliver those plans by 2020. Adding insult to injury, the U.N. says “the level of ambition communicated through these NDCs indicates that changes in these countries’ total emissions would be small, less than -1%, in 2030 compared to 2010 … [whereas the] IPCC, by contrast, has indicated that emission reduction ranges to meet the 1.5°C temperature goal should be around -45% in 2030 compared to 2010.”

Large emitters and small, the reality is that nations are putting poverty reduction and economic growth (rightly, in my opinion) ahead of climate action.

Let’s look at a few examples.

India is the world’s third-largest greenhouse gas emitter. Under the Paris Agreement, India did not pledge to cut its emissions; rather, it said it would reduce its emissions intensity (emissions per unit of GDP). The problem is that even if India exceeds its carbon intensity reduction goals, its total emissions will still have increased substantially. As a result, the U.N. observes, “with current energy targets and policies, emissions are projected to keep increasing (by 24-25 percent above 2019 levels in 2030) and show no signs of peaking, in particular due to the lack of a policy to transition away from coal. Such an increase of emissions is not consistent with the Paris Agreement.”

Indeed, 70 percent of India’s electric power is generated by burning coal. And India’s most recent estimate is that by 2030 coal use for energy will increase by 40 percent.

The news is even worse out of China, the world’s biggest emitter. China, which is responsible for approximately 25 percent of the world’s emissions, vaguely indicated it expected carbon dioxide emissions to peak by 2030. Peak at what level?

The Chinese Communist Party recently released its five-year plan for economic development, and it contained no reduction in coal use. It would be surprising if it did. In recent years, China has built dozens of new coal-fueled power plants, with hundreds more in various stages of construction, development, and planning. China also intends to build coal-fueled power plants in Africa, throughout Asia, and in the Middle East.

Simultaneously, China is disincentivizing new construction of wind and solar facilities, which the National Energy Administration (NEA) referred to as “unreliables.”

Even Argentina, a relatively small emitter, will have trouble squaring its development goals with its Paris climate commitment. At a recent conference, President Alberto Fernandez said Argentina’s goal was to reach net-zero greenhouse gas emissions by 2050. Meanwhile, back home, far from the limelight of international climate conferences, Fernandez announced the government was doubling down on fossil fuels, saying, “Today we are relaunching the oil and gas economy,” starting with $5 billion in government subsidies, to develop its shale fields.

Political elites don’t really fear a climate apocalypse is in the offing. Rather, they are using the threat of the “climate change” hobgoblin to accrue ever more power and control over peoples’ lives. For politicians, this is what the climate scare is and always has been about.

H.L. Mencken once famously quipped, “The urge to save humanity is almost always a false front for the urge to rule.” Nowhere is this truer than in the push to save the world from climate change.


What Australians really think about climate change

Sampling, sampling, sampling. The revelation that only one in seven Australians treated climate change as their decisive voting issue is very encouraging, but ALL the figures below have to be taken with a large grain of salt.

The "sample" was derived from an online panel study and the biases of online studies are well-known, to say nothing of the inaccuracies in panel studies. Online samples tend to skew Left. So even the 7% is probably an overstimate

The journal article is "Australian voters’ attitudes to climate action and their social-political determinants", in PLOS ONE.

Just one in seven Australians considered climate change their decisive issue when voting in the 2019 federal election.

But some 80 per cent say action to reduce Australia’s greenhouse gas emissions is important, including almost 70 per cent of Coalition voters.

They are two key findings from new ANU research published today in the journal PLOS ONE, based on online and telephone surveys with more than 2000 Australian voters after the 2019 poll.

In the paper, researchers Dr Rebecca Colvin and Professor Frank Jotzo looked at some of the reasons why, in the “climate election,” the party that was offering the more “status quo” emissions policy was returned to government.

They found 52 per cent of survey respondents said climate change was a factor in how they voted in 2019, but it was the single biggest issue for just 13 per cent of voters – roughly one in seven people.

Asked whether this finding could be a source of hope or despair for supporters of climate action, Dr Colvin said both interpretations were possible.

“One way to look at it is that there isn’t a massive unbridgeable divide across the political spectrum on climate change,” she told News Corp. “There are lots of people who say they want to see action on climate change but they’re not determining their votes on it – but that broad base of social support is there.”




Wednesday, March 24, 2021

It takes big energy to back up wind and solar

Power system design can be extremely complex but there is one simple number that is painfully obvious. At least it is painful to the advocates of wind and solar power, which may be why we never hear about it. It is a big, bad number.

To my knowledge this big number has no name, but it should. Let’s call it the “minimum backup requirement” for wind and solar, or MBR. The minimum backup requirement is how much generating capacity a system must have to reliably produce power when wind and solar don’t.

For most places the magnitude of the MBR is simple to estimate. It is all of the juice needed on the hottest or coldest low-wind night. It is night, so there is no solar. Sustained wind is less than eight miles per hour, so there is no wind power. It is very hot or cold, so the need for power is very high.

In many places MBR will be close to the maximum power the system ever needs, because heat waves and cold spells are often low wind events. In heat waves it may be a bit hotter during the day but not that much. In cold spells it is often coldest at night.

Thus what is called “peak demand” is a good approximation for the minimum backup requirement. In other words, there has to be enough reliable generating capacity to provide all of the maximum power the system will ever need. For any public power system that is a very big number, as big as it gets in fact.

Actually it gets a bit bigger, because there also has to be a margin of safety, or what is called “reserve capacity”. This is to allow for something not working as it should. Fifteen percent is a typical reserve in American systems. This makes MBR something like 115% of peak demand.
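As a quick sketch of that sizing rule, take a hypothetical utility with a 20,000 MW peak (the figure is made up purely for illustration):

```python
# Sketch of the "minimum backup requirement" (MBR) sizing described above.
peak_demand_mw = 20_000   # hypothetical system peak demand, in megawatts
reserve_margin = 0.15     # typical U.S. reserve capacity

# On a hot or cold low-wind night, wind and solar contribute ~0,
# so reliable backup must cover peak demand plus the reserve margin.
mbr_mw = peak_demand_mw * (1 + reserve_margin)
print(f"MBR: {mbr_mw:,.0f} MW (about 115% of peak demand)")
```

That is, the backup fleet alone must be as large as the entire system ever needs to be, plus the safety margin.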

We often read about wind and solar being cheaper than coal, gas and nuclear power, but that does not include the MBR for wind and solar. What is relatively cheap for wind and solar is the cost to produce a unit of electricity. This is often called LCOE or the “levelized cost of energy”. But adding the reliable backup required to give people the power they need makes wind and solar very expensive.

In short the true cost of wind and solar is LCOE + MBR. This is the big cost you never hear about. But if every state goes to wind and solar then each one will have to have MBR for roughly its entire peak demand. That is an enormous amount of generating capacity.
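To see how the MBR cost attaches to the LCOE, here is a rough illustration. Every input is a hypothetical assumption (a nuclear-style backup capital cost, a simple 20-year amortization with no interest, no fuel or operating costs), not a sourced figure:

```python
# Illustrative only: how the cost of largely idle backup capacity
# inflates the effective cost of wind/solar energy.
lcoe_wind = 40.0             # $/MWh, the "cheap" generation cost quoted
backup_capex_per_kw = 6000.0 # $/kW to build zero-fossil (nuclear) backup, assumed
backup_mw = 1_000            # backup capacity carried per 1,000 MW of wind
years = 20                   # amortization period, ignoring interest
wind_capacity_factor = 0.35  # assumed fraction of the year wind produces

annual_wind_mwh = backup_mw * wind_capacity_factor * 8_760
annual_backup_cost = backup_capex_per_kw * backup_mw * 1_000 / years

true_cost = lcoe_wind + annual_backup_cost / annual_wind_mwh
print(f"LCOE alone: ${lcoe_wind:.0f}/MWh; with backup: ${true_cost:.0f}/MWh")
```

On these assumptions the backup adder is larger than the LCOE itself; a gas-fired backup fleet would be far cheaper to build, which is exactly the zero-fossil dilemma discussed below.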

Of course the cost of MBR depends on the generating technology. Storage is out because the cost is astronomical. Gas-fired generation might be best, but it is fossil fueled, as is coal. If one insists on zero fossil fuel, then nuclear is probably the only option. Operating nuclear plants as intermittent backup is stupid and expensive, but then so is ruling out fossil fuel generation altogether.

What is clearly ruled out is 100% renewables, because there would frequently be no electricity at all. That is unless geothermal could be made to work on an enormous scale, which would take many decades to develop.

It is clear that the Biden Administration’s goal of zero fossil fueled electricity by 2035 (without nuclear) is economically impossible because of the minimum backup requirements for wind and solar. You can’t get there from here.

One wonders why we have never heard of this obvious huge cost with wind and solar. The utilities I have looked at avoid it with a trick.

Dominion Energy, which supplies most of Virginia’s juice, is a good example. The Virginia Legislature passed a law saying that Dominion’s power generation had to be zero fossil fueled by 2045. Dominion developed a plan for how it would do this. Tucked away on page 119, the plan says Dominion will expand its capacity for importing power purchased from other utilities. This increase happens to be an amount equal to its peak demand.

The plan is to buy all the MBR juice from the neighbors! But if everyone is going wind and solar then no one will have juice to sell. In fact they will all be buying, which does not work. Note that the high pressure systems which cause low wind can be huge, covering a dozen or more states. For that matter, no one has that kind of excess generating capacity today.

To summarize, for every utility there will be times when there is zero wind and solar power combined with near peak demand. Meeting this huge need is the minimum backup requirement. The huge cost of meeting this requirement is part of the cost of wind and solar power. MBR makes wind and solar extremely expensive.

The simple question to ask the Biden Administration, the States and their power utilities is this: How will you provide power on hot or cold low wind nights?


When climate alarmism meets cancel culture


Across the world, politicians are now promising climate policies costing tens of trillions of dollars – money we don’t have and resources that are desperately needed elsewhere.

Yet, climate campaigners tell us, if we don’t spend everything on climate now, nothing else matters, because climate change threatens our very civilisation. As US President Joe Biden says: climate change is “an existential threat”.

Yes, climate change is a real problem. However, it is typically vastly exaggerated, and the resulting alarmism is exploited to justify the wasteful spending of trillions.

Pointing this out will get you cancelled. I should know, because I have personally been on the receiving end of this climate alarmism enforcement for years. Last week, I was scheduled to give a public lecture at Duke University in the US when a group of climate-politicised professors – some of whom write for the UN Climate Panel – publicly asked Duke to cancel my appearance.

One of my presentation points highlighted the latest full UN Climate Panel report, which estimates the total cost of climate change. It found that unmitigated climate change in half a century will reduce general welfare by the equivalent of lowering each person’s income by between 0.2 and 2 per cent. Given that the UN expects each person on the planet to be much better off by then – 363 per cent as wealthy as today – climate change might cause us to be only 356 per cent as rich. That is a problem, but certainly not the end of the world.
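The arithmetic behind those figures is simple enough to check directly:

```python
# Reproducing the welfare arithmetic in the paragraph above.
baseline_income = 363.0   # % of today's income, the UN's 50-year projection
worst_case_loss = 0.02    # 2% welfare reduction, the high end of the range

income_with_damage = baseline_income * (1 - worst_case_loss)
print(f"{income_with_damage:.0f}% as rich as today, versus 363% without damage")
```

Even at the worst end of the damage range, the projected loss is seven percentage points out of a 263-point gain.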

Why don’t most people know this? Because stories of catastrophe and human guilt garner more clicks and are better for weaponising political arguments. Unfortunately, we’re unlikely to make good decisions if we’re panicked.

The political forces looking to spend the climate trillions and the academia segment supplying the fear want to scrub the climate debate of anything but the scariest scenarios. They want an unwavering allegiance to vigorous spending on climate policy, no matter its effectiveness.

They insist on treating this issue as a moral binary choice instead of a realistic balancing of costs and effectiveness which would allow for our many other challenges to be heard as well.

Certainly, the professors at Duke didn’t want anyone to hear dissenting facts.

They tried to stop the lecture through outright lies, such as claiming that my funding comes from Exxon and the Koch brothers. These claims are categorically untrue. They also declared that I had been deemed scientifically dishonest, although the mock trial which originated that claim has been completely overturned and annulled because it contained no arguments.

More worryingly, they raged about how climate catastrophes are so terrible that we should not allow any more climate debate. Yet, their claims were almost uniformly untrue. They said that “much of the Australian continent” had been devoured in climate-induced fire. But we know from satellite measurements, published in Nature, that while the fires near population centres had severe impacts, the total land area burned was 4 per cent – one of the lowest-ever percentages, from an average this century of 6.2 per cent and last century of 10.1 per cent. Four per cent is not “much of the Australian continent”. Such claims are more like rantings from people who have been watching too much alarmist TV.

They claimed that “countless lives” are being lost to climate-related disasters worldwide. Yet, the International Disaster Database shows that in the 2010s, 18,357 people died each year from climate-related impacts such as floods, droughts, storms, wildfire and extreme temperatures. That is the lowest death count in the past century, a 96 per cent decline since the 1920s, despite a larger global population. And 2020 had an even lower death count at 8086.
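For what it’s worth, the 1920s baseline implied by those two figures can be backed out directly. This is a derivation from the numbers quoted above, not an independent statistic:

```python
# Backing out the implied 1920s death toll from the figures above.
deaths_2010s_per_year = 18_357
decline_fraction = 0.96   # "a 96 per cent decline since the 1920s"

implied_1920s_per_year = deaths_2010s_per_year / (1 - decline_fraction)
print(f"Implied 1920s average: ~{implied_1920s_per_year:,.0f} deaths/year")
```

That puts the implied 1920s toll near half a million climate-related deaths per year, against a far smaller global population.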

Yet, we can only know this when we’re allowed to hear facts presented. Thankfully, Duke University didn’t cave to their craven arguments, but the attempts to suppress free speech, facts and knowledge continue.

The easiest way to get societies to authorise the spending of tens of trillions we don’t have is to scare us. The academic and activist faction that sets the threatening tone in the climate conversation want dissent eliminated, leaving themselves the only ones authorised to tell you how scared you should be. To avoid wasting trillions, we should not let them.

Bjorn Lomborg is President of the Copenhagen Consensus and Visiting Fellow at the Hoover Institution, Stanford University. His latest book is False Alarm: How Climate Change Panic Costs us Trillions, Hurts the Poor, and Fails to Fix the Planet. The original version of this story appeared on the Fox News website.


The EPA Is Failing the American People

March 22 marks the annual observance of World Water Day, a campaign organized by the United Nations. The theme for 2021 is the importance of access to clean drinking water.

Despite what you may think, a lack of access to clean water is not just an issue in developing countries. It’s a problem in the United States, too.

According to the Centers for Disease Control and Prevention (CDC), 7.2 million Americans suffer from waterborne illnesses each year due to contaminated water supplies. At least 600,000 Americans are forced to make emergency room visits, and an estimated 6,630 individuals die from these illnesses. This creates an estimated $3.3 billion in annual health care costs.

Heavy metals, fecal bacteria, and radionuclides are some of the contaminants that can harm human health.

Are you at risk? The answer might prove difficult to determine.

The passage of the Safe Drinking Water Act tasked the Environmental Protection Agency (EPA) with overseeing and regulating America’s public water systems, giving the agency responsibility for efficiently and effectively maintaining safe drinking water.

The EPA’s Safe Drinking Water Information System tracks water contamination levels. The database allows the public to search in their area, view a historical list of water quality violations, and learn of potential long-term effects of each contaminant.

But despite many organizations and millions of Americans relying on this data, it has been almost three years since many of the records have been updated.

Some of the data also seems corrupted. The EPA’s data for Florida included data for St. Tammany Parish, which is in Louisiana.

Beginning in November, my organization attempted to reach the EPA to inquire about when the agency would provide updated data. The agency’s website had out-of-date information about personnel, too, and calls went unanswered. We still haven’t heard from them.

While the EPA moves at a glacial pace, some Americans have to deal with continually unreliable water.

Florencia Ramos, a farmworker and mother living in El Rancho, California, has been purchasing clean, bottled water for her family for over a decade. “If you don’t have clean water, you have to go get some,” she says. Her water, once contaminated with high levels of nitrates, is now polluted with disinfectant byproducts.

After several years of being failed, residents of the El Rancho community decided to take matters into their own hands by forming the AGUA Coalition, a regional organization dedicated to securing safe, clean water for surrounding communities. AGUA currently represents 26 impacted communities.

A Gallup Poll released in March 2017 found 63 percent of Americans are a “great deal” concerned about pollution of drinking water—the highest percentage since 2001. When given a list of six environmental threats, including air pollution and global warming, respondents overwhelmingly agreed that contamination of drinking water was their No. 1 worry.

In Texas alone, data shows over 22,000 incidents of nitrates exceeding the maximum contaminant level in tap water over the last 28 years.

In early 2020, it was discovered that high levels of toxic chemicals, also known as PFAS, were making residents of Westfield, Massachusetts, sick. Some municipal wells were taken offline while another had a $500,000 filtration system installed to bring the water quality back to government-set standards.

On this World Water Day, it’s important to remember that access to clean drinking water is a human right. The EPA cannot continue to fail the American people by allowing the unknown contaminants in tap water to remain a dangerous mystery.


Many Climate Crisis Claims Are Based on Manipulated Science

We are constantly being warned by activists, politicians, and some climate scientists that we face a climate crisis; that if humanity collectively doesn’t alter its lifestyle and consumption patterns now, the world will end in 10 years, 12 years, 50 years—pick your number.

This is a lie, and I suspect most of the people making these apocalyptic prophesies know it. For them, it’s the modern-day equivalent of Plato’s noble lie—lying to people to get them to act in ways they don’t realize are in their own best interest. Not coincidentally, those telling the lie profit from it in terms of influence, money, power, or all three.

This lie is not in fact noble, nor is it based upon sound science. Rather, it is perpetrated through the regular suppression of inconvenient scientific data: data altered, hidden, or scrubbed from journals and textbooks because it puts the lie to insupportable claims, made by politically connected climate scientists, that an anthropogenic climate apocalypse is in the offing.

The big lie is built on a faulty premise: that science can realistically trace the cause of the modest recent warming of the earth primarily to human greenhouse gas emissions, and that from this we can confidently predict what the world will look like 50, 100, and 300 years from now. Award-winning climate scientist Richard Lindzen, Ph.D., described the big lie this way:

“One problem with conveying our message is the difficulty people have in recognizing the absurdity of the alarmist climate message. They can’t believe that something so absurd could gain such universal acceptance. Consider the following situation. Your physician declares that your complete physical will consist in simply taking your temperature. This would immediately suggest something wrong with your physician. He further claims that if your temperature is 98.7F rather than 98.6F you must be put on life support. Now you know he is certifiably insane. The same situation for climate is considered ‘settled science.’”

Among the most egregious attempts to suppress inconvenient climate science came in 2001 when the United Nations Intergovernmental Panel on Climate Change (IPCC) tried to replace settled climate history with the “hockey stick” graph. The hockey stick dispensed with the long recognized medieval warm period from approximately 950 AD to 1250 AD, and the little ice age, which ran from approximately 1350 AD through 1850 AD. Its originators postulated global temperatures had been fairly stable over the past millennium, until the twentieth century when they began to rise sharply, due to an increase in carbon dioxide emissions. This fit the IPCC’s climate change narrative, so it embraced it as the truth. Ultimately, even the IPCC couldn’t defend the hockey stick temperature reconstruction and removed it from subsequent reports.

Then came Climategate, in which a treasure trove of inconvenient email exchanges between IPCC-connected climate scientists was hacked and leaked. These emails detailed the scientists hiding data indicating that the recent warming trend was not historically unusual, and censoring scientific research that undermined claims of apocalyptic warming.

The Surface Station Project exposed the dirty little secret that temperature readings from the vast majority of the ground-based temperature stations were compromised by urban growth, skewing temperature readings higher. Indeed, research found 89 percent of surface stations—nearly 9 of every 10—fail to meet the National Weather Service’s requirements that stations must be 30 meters (about 100 feet) or more away from an artificial source of heat.

Additional scientific misconduct comes in the form of temperature monitoring agencies “adjusting” raw recordings from unbiased, isolated temperature stations and reporting them in a way that makes past temperatures look colder, and recent temperatures warmer, than was actually measured. This produces an artificially steep temperature trend, making recent warming appear larger than it has been. In some instances, when these nefarious actions were exposed, the government agencies involved tried to scrub the official records of past temperatures. Fortunately, in the age of the internet, where data once posted is forever, these Orwellian attempts to rewrite climate history have largely failed.

When global warming went on a 15-year hiatus, with temperatures flat-lining despite a steady rise in carbon dioxide emissions, a team of climate researchers at the National Oceanic and Atmospheric Administration (NOAA) altered how ocean temperatures were measured. Voilà, like magic, the hiatus disappeared. As David Rose wrote for the Daily Mail, describing the incident “[NOAA researchers] took reliable readings from buoys but then ‘adjusted’ them upwards - using readings from seawater intakes on ships that act as weather stations … even though readings from the ships have long been known to be too hot.”

Most recently, some of the same characters that brought the world the “hockey stick” have now published a widely publicized paper that claims a long-recognized ocean circulation pattern, the Atlantic Multidecadal Oscillation (AMO), which impacts climate, never existed at all but was an artifact of volcanic pulses. Commenting on this paper, climate scientist Judith Curry, Ph.D., writes:

“Wow. In one fell swoop, the pesky problems of the ‘grand hiatus’ in the mid 20th century, debates over the attribution of 20th century warming and the role of multidecadal internal variability, and the difficulty of attributing the recent increase in Atlantic hurricane activity to AGW, all go away. Brilliant! Almost as ‘brilliant’ as the Hockey Stick.”

There is little doubt the earth is warming, but the list of breaches of the scientific method and ethics by researchers whose careers are intimately tied to the “truth” of climate alarmism provides more than enough reason to doubt the claim that the science is settled and the earth is doomed, absent giving government authoritarian control over all aspects of peoples’ lives.




Monday, March 22, 2021

States sue to block “social cost” of carbon

Twelve states have asked a Federal Court to keep federal agencies from using the so-called Social Cost of Greenhouse Gases to calculate the benefits of emission reduction regulations.

The new cost estimates, ordered by President Biden on day one, claim enormous distant future damages from today’s emissions of carbon dioxide, methane and nitrous oxide. Preventing these supposed damages could justify massive new regulations, but the States say this is illegal because Congress never authorized it.

The Social Cost of Carbon (SCC) has been around for some time. Obama introduced it as a policy measure, which Trump then canceled. Now Biden has brought it back and made it worse.

In a way the SCC epitomizes the craziness of the climate scare. The whole scare is based on outlandish doomsday computer models, and the SCC is arguably the most absurd of all.

The fundamental absurdity of the Social Cost of Carbon is that it goes out 300 years to get the supposed economic damages due to today’s minor emissions of carbon dioxide. That’s right, these computer models claim to know what the world’s economy will be for the next 300 years. The claim is absurd because technological and economic progress make the future world unknowable.

Consider what America was like 300 years ago. To begin with America did not yet exist. George Washington had yet to be born. European settlement of North America was confined almost entirely to a band a hundred miles or so along the east coast or up some big rivers.

Power was by hand, horse and waterwheel. Travel by horse and water. Yet to come were the steam engine, electricity, motors, cars, airplanes and a billion other inventions that changed the world in unimaginable ways.

The fact that today we have computers does not make our next 300 years any more predictable than the next 300 looked from the vantage of 1720. Given that the pace of technological change has increased, our future is probably even less knowable.

Since the 300-year prediction claims of the SCC are absurd, why do the alarmists make them? Because it is the only way they can get serious future damages out of the computer models. Even in these hot models, the annual adverse impact of our emissions is very small, but according to the models this small impact goes on for hundreds of years, so it adds up. That assumption is itself unproven.

Even worse, the damage is estimated as a fixed fraction of economic activity, so it increases every year as the economy grows. In 300 years the global economy grows tremendously, just as it did in the last 300 years. The near term damages are negligible. In some models the CO2 increase is actually beneficial in the short run.

This economic growth factor is an important part of the SCC scam, because it negates what is called the “discount rate”. In cost benefit analysis, future costs are discounted back to present value, sort of like reverse compound interest. The further into the future a given amount of damage is, the less it is worth today. The amount of this discounting is the discount rate, which is typically around 3 to 5% per year.

Normally this discounting makes distant future costs negligible. But SCC computer modeling gets around it by having the global economy grow faster than the discount rate.
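The mechanism can be illustrated with a minimal numeric sketch. The rates and dollar figures below are hypothetical, chosen only to show how the arithmetic works: when damages are a fixed fraction of a growing economy, each year’s discounted damage is proportional to ((1+g)/(1+r))^t, so if growth g exceeds the discount rate r, discounting no longer shrinks distant damages.

```python
# Illustrative sketch (not the actual SCC models): present value of a
# damage stream that is a fixed fraction of a growing economy.
# All numbers are hypothetical, chosen only to show the mechanism.

def present_value_of_damages(gdp0, damage_fraction, growth, discount, years):
    """Sum discounted damages over `years`, where each year's damage is
    a fixed fraction of that year's GDP."""
    total = 0.0
    for t in range(1, years + 1):
        gdp_t = gdp0 * (1 + growth) ** t          # economy compounds every year
        damage_t = damage_fraction * gdp_t        # damage tracks the economy
        total += damage_t / (1 + discount) ** t   # discount back to today
    return total

# Growth below the discount rate: distant damages shrink toward nothing.
low = present_value_of_damages(100.0, 0.01, growth=0.02, discount=0.05, years=300)

# Growth above the discount rate: the same damage fraction compounds instead.
high = present_value_of_damages(100.0, 0.01, growth=0.05, discount=0.03, years=300)

print(low, high)  # the second figure dwarfs the first
```

The only change between the two runs is which rate is larger, yet the 300-year total swings by orders of magnitude, which is why the modeling choice matters so much.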

They even claim to know the cumulative 300 year economic damages due to our emissions to an exact dollar per ton amount. I am not making this up. Given the scam it is no surprise that this amount corresponds to the amount of proposed carbon taxes. Moreover, it increases every year, just like the proposed taxes and carbon control regulations.

Biden widened the scam by adding the Social Cost of Methane (and Nitrous Oxide). Natural gas is mostly methane so this is part of the new war on gas. But cows are also a major source so now we have a war on cows. The Agriculture Department, which regulates cows, is included in the Social Cost executive order, so they are being sued along with EPA and a bunch of other regulatory agencies.

Skeptics argue that methane is a trivial greenhouse gas, especially because it competes with water vapor, which is by far the dominant GHG. Alarmists want us to stop eating beef and dairy products, and to kill natural gas as well, so methane gets a big Social Cost estimate.

In its way the States’ lawsuit extends well beyond the incredibly stupid Social Cost issue. Biden has issued a raft of potentially punitive climate EOs. Executive orders are only supposed to affect internal Executive Branch operations. In fact, every one contains language saying this, so they are supposedly not subject to judicial review the way public regulations are.

The States point out quite correctly that requiring rule making federal agencies to use Social Cost will have a tremendously adverse effect on the American people. They argue that under the Constitution only such actions specifically authorized by Congress can be ordered. Thus the suit is not against the EO or the supposed science but against the agencies acting on it without authorization.

Here is how the States put it in their Complaint to the Federal Court:

“This quintessentially legislative policy has enormous consequences for America’s economy and people. In theory, the Biden Administration’s calculation of “social costs” would justify imposing trillions of dollars in regulatory costs on the American economy every year to offset these supposed costs. In practice, President Biden’s order directs federal agencies to use this enormous figure to justify an equally enormous expansion of federal regulatory power that will intrude into every aspect of Americans’ lives—from their cars, to their refrigerators and homes, to their grocery and electric bills. If the Executive Order stands, it will inflict hundreds of billions or trillions of dollars of damage to the U.S. economy for decades to come. It will destroy jobs, stifle energy production, strangle America’s energy independence, suppress agriculture, deter innovation, and impoverish working families. It undermines the sovereignty of the States and tears at the fabric of liberty.”

If the Court rules in favor of the States it might blunt Biden’s climate executive order onslaught. However, the absurd Social Cost calculations were done by an Interagency Working Group and the Court might be reluctant to rule that the agencies cannot implement their supposed science without legislative authorization. Agencies often do their own science on the way to rule making.

How this extremely important case will turn out is far from clear. Stay tuned!


Recycling’s Economic Realities, Now and Tomorrow

Americans like to believe our resources are as unlimited as the possibilities for our future.

That may be true. Ideas like “peak oil” – which seemed on the verge of winning acceptance just a decade ago – have petered out as science and engineering have pointed the way to discoveries of energy deposits so rich the nation has become a net energy exporter.

More than that though, America has made tremendous gains by learning to use the resources we already produce more efficiently. These gains are a winner for everyone, producers and consumers alike.

This is certainly true where recycling is concerned. Technology has progressed to a point where certain items, such as plastics, can be used again and again and again. The technology exists – and it's affordable. Unfortunately, in many cases it is still cheaper to use virgin materials to make new products than to use recycled materials, so industry has not made the necessary transition.

When considering the costs, one must weigh not only the expense of producing something but also the expense of disposing of it. That is not an industry concern so much as it is a problem of the commons, that area in which no one person or group is specifically responsible because everyone is generally responsible.

Putting plastics in the ground forever after they’ve been used a single time is wasteful and inefficient. Recycling laws are supposed to cut down on that waste, but companies – many of which talk the talk but don’t walk the walk when it comes to being environmentally friendly – aren’t making the conversion to recycled materials in their production streams because of fluctuations in price and inconsistency in supply.

There is a solution. The introduction of minimum recycled content (MRC) standards is the kind of light-touch regulation that’s environmentally friendly and will create jobs as new industry develops without imposing confiscatory costs on consumers or producers.

Simply put, adding an MRC requirement keeps demand for recyclable materials economically viable by balancing it against our abundant supply of recyclables – at a manageable cost, while building resilience in recycling end markets.

In a March 2019 Gallup Poll, 65 percent of respondents said they agreed with the statement “protection of the environment should be given priority” over other concerns including cost. America is waking up to the need to do things differently. And no wonder. After rising for decades, U.S. recycling rates have plateaued, the U.S. Environmental Protection Agency says, because Americans are creating more trash. In 1990, the total generation of municipal solid waste was 208.3 million tons. By 2018, that amount had risen to 292.4 million tons, 23.7 million tons higher than the year before. This requires action.

To increase the demand for recycled materials in the manufacturing supply chain, California has already enacted legislation requiring minimum recycled content for plastic beverage containers. New Jersey is considering similar legislation with MRC standards for plastic beverage and rigid plastic containers.

The MRC approach is a step toward circularity, a concept many people believe is part of the economy of the future in developed parts of the world. In a circular economy, waste materials become inputs rather than outputs – a resource government and industry can embrace as a revenue source while protecting the environment. Additionally, the adoption of recycled materials as a stable source of predictable revenue may give city and county governments the ability to avoid economically and socially damaging tax increases that drive residents and businesses away.

The Manhattan Institute, a New York-based think tank that deals with urban issues from a free-market perspective, argues that rethinking municipal recycling will help local governments save money. Through the MRC approach, recycling programs can be transformed into an effort in which price signals ensure that taxpayer-diverted recyclables are beneficially reused again and again – the right step forward.

According to the Harris Poll, four in five Americans (80 percent) agree governments at all levels should prioritize the use of recyclable products/materials when making purchasing decisions. Recycling is demand-driven; thus, increasing the use of recycled content in manufacturing is critical to the success of recycling programs.

Recycling and composting also help reduce carbon footprints. Utilizing nearly 94 million tons of compostable materials in 2019 was the equivalent of taking 42 million cars off the road. On average, recycling one ton of materials saves three tons of carbon emissions. The MRC approach saves energy and reduces greenhouse gas emissions because manufacturers and packaging producers will have to use recycled materials to make new products.
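The car-equivalence arithmetic above can be sanity-checked with a back-of-envelope sketch. The per-car figure here is an assumption (roughly 4.6 metric tons of CO2 per typical passenger vehicle per year, a commonly cited EPA estimate); the three-to-one savings ratio comes from the text.

```python
# Back-of-envelope check of car-equivalence claims.
# Assumed: ~4.6 metric tons of CO2 per passenger vehicle per year
# (a commonly cited EPA estimate, not a figure from this article).
CO2_PER_CAR_TONS_PER_YEAR = 4.6
SAVINGS_PER_TON_RECYCLED = 3.0   # "one ton recycled saves three tons" (from the text)

def car_equivalents(tons_co2_avoided, per_car=CO2_PER_CAR_TONS_PER_YEAR):
    """Convert avoided CO2 emissions into 'cars taken off the road for a year'."""
    return tons_co2_avoided / per_car

# Example: emissions avoided by recycling 10 million tons of material.
avoided = 10_000_000 * SAVINGS_PER_TON_RECYCLED   # 30 million tons of CO2
print(round(car_equivalents(avoided) / 1e6, 1), "million car-years")
```

Running the same conversion against any published tonnage figure gives a quick plausibility check on headline equivalence claims.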


How to Think about Climate Change

By WILLIAM HAPPER (The Cyrus Fogg Brackett Professor Emeritus of Physics at Princeton University)

The best way to think about the frenzy over climate is to consider it a modern version of the medieval Crusades. You may remember that the motto of the crusaders was “Deus vult!”, “God wills it!” It is hard to pick a better virtue-signaling slogan than that. Most climate enthusiasts have not gone so far, but some actually claim that they are doing God’s work. After decades of propaganda, many Americans, perhaps including some of you here today, think there really is a climate emergency. Those who think that way, in many cases, mean very well. But they have been misled. As a scientist who actually knows a lot about climate (and I set up many of our climate research centers when I was at the Department of Energy in the early 1990s) I can assure you that there is no climate emergency. There will not be a climate emergency. Crusades have always ended badly. They have brought discredit to the supposed righteous cause. They have brought hardship and death to multitudes. Policies to address this phony climate emergency will cause great damage to American citizens and to their environment.

Climate frenzy has really been heating up recently. On February 4th Senator Bernie Sanders, Congresswoman Alexandria Ocasio-Cortez, and Congressman Earl Blumenauer introduced “legislation mandating the declaration of a national climate emergency. The National Climate Emergency Act directs the President of the United States to declare a national climate emergency and mobilize every resource at the country’s disposal to halt, reverse, mitigate and prepare for the consequences of this climate crisis.” (This is from Mr. Blumenauer’s website.) But this is utter nonsense. There is no climate crisis, and there will not be a climate crisis.

It gets worse when you get to the state levels, where there are fewer checks and balances. These are the remarks made last week by David Ismay, the Undersecretary for Climate Change in Massachusetts, to the Vermont Climate Council:

So let me say that again, 60% of our emissions that need to be reduced come from you, the person across the street, the senior on fixed income, right . . . there’s no bad guy left, at least in Massachusetts to point the finger at, to turn the screws on, and you know, to break their wills, so they stop emitting. That’s you. We have to break your will. Right, I can’t even say that publicly.

A few days later Mr. Ismay resigned and had he not, his governor would have fired him. But, that’s the way crusades are. This is really not a question of science. This is a question of a secular religion for some. It is a question of money for others. It is a question of power for others. But whatever it is, it is not science.

Part of the medieval crusades was against the supposed threat to the holy sites in Jerusalem. But a lot of it was against local enemies. The medieval Inquisition really did a job on the poor Cathars, on the Waldensians of southern France, and on the Bogomils in the Balkans. Climate fanatics don’t know or care any more about the science of climate than those medieval Inquisitors knew or cared about the teachings of Christ.

Just about everyone wants to live in a clean environment. I do, and I am sure everyone here does. This is a photograph of Shanghai, and that’s real air pollution. You can just barely see the Bottle Opener Building in the back through all the haze. Some of this is due to burning coal. But a bigger fraction is due to dust from the Gobi Desert. They have had this type of pollution in Shanghai since the days of Marco Polo and long before. Part of it is burning stubble of the rice fields, which is traditionally done before planting next year’s crop. This is real pollution. I would not want to live in a city like that. If there is anything to do that would make it better, I would certainly support that.

But, none of this has anything to do with CO2. CO2 is a gas you cannot see, smell or taste. So, hare-brained schemes to limit emissions of CO2, which is actually beneficial, as I will explain a little bit later, will only make it harder to get rid of real pollutants like what I just showed you in Shanghai.

So, let’s talk about CO2. Number one, it is not a pollutant at all. We breathe out lots of CO2. Many people are surprised to learn that they exhale a little more than two pounds of CO2 a day. You people in this room are putting out a lot of CO2. I actually brought a CO2 meter here which I am going to turn on. It takes about 30 seconds to warm up, but we will see what the levels are. Before I came down here, my wife Barbara and I measured the CO2 on our balcony, and it is about 400 parts per million outside this building. The meter is warming up now. I will show what the results are in a minute. But our breath is not that different from the output of a power plant. Power plants take in normal air, and they consume most of the oxygen by burning coal, or natural gas, or oil. The exhaust that comes out of the stack is mostly the nitrogen that was already there—a little bit of oxygen that was not used up, along with water vapor and CO2. Our breath is similar, except it has a lot more oxygen. So, you can give mouth-to-mouth resuscitation, but you couldn’t if your breath was like the power plant exhaust. Your breath contains about four percent CO2, six percent water. The power plant has a bit more CO2 and correspondingly less oxygen. But our breath is definitely not a pollutant. In fact, our breathing reflex is determined by CO2. It is not determined by oxygen. It is not a lack of oxygen; it is too much CO2 that makes you take another breath of air.

We have just seen how well solar farms and windmills work in Texas in last week’s cold spell. They never did work terribly well. We have to be grateful to Nature. She seems to have a sense of humor, and she has taught us a good lesson—I hope. People seem to be slow learners. A major problem with renewable energy sources (solar, wind) is that they take up a lot of space. I preferred this field when it was nice and green instead of weedy panels. It is quite weedy now. This was soon after it was built. The panels do not work at all at night. You need something else to provide electrical power at night. Solar panels do not work if it is a cloudy day. They do not work terribly well in the winter when the Sun is low. So, it is pure virtue signaling. Solar power makes no economic sense unless you are massively subsidized by the state and federal governments.


How to End Biden’s Fake Climate Apocalypse

If there’s no pushback against the Left, we’ll see a dramatic drop in our standard of living.

With the wave of executive orders and legislation coming from the Biden administration, and the cultural antics of his woke supporters, Biden’s war on fossil fuels has received insufficient attention. Yet energy is the lifeblood of our economy, and making traditional energy sources vastly more expensive is the single most destructive aspect of Biden’s policies. If this country does not successfully mobilize against these policies, the vast majority will experience a dramatic drop in their standard of living.

Supposedly the assault on fossil fuels — via regulation; cancellation of pipelines; concocting a huge, wholly imaginary “social cost of carbon”; taxes; and solar and wind mandates — is necessary to save the planet from imminent catastrophe produced by man-made global warming.

But genuine climate scientists, as we know from those who dare to speak up, are amazed and horrified. Richard Lindzen, long at the top of the field as a former professor of atmospheric sciences at MIT, laments that the situation gets sillier and sillier. He told the recent CPAC conference (his message was read by the Heartland Institute’s James Taylor):

One problem with conveying our message is the difficulty people have in recognizing the absurdity of the alarmist climate message. They can’t believe that something so absurd could gain such universal acceptance. Consider the following situation. Your physician declares that your complete physical will consist in simply taking your temperature. This would immediately suggest something wrong with your physician. He further claims that if your temperature is 98.7F rather than 98.6F you must be put on life support. Now you know he is certifiably insane. The same situation for climate is considered “settled science.”

So how did an absurd message gain such widespread acceptance? The answer is something people find hard to wrap their heads around: we aren’t dealing with science at all. We confront an apocalyptic movement, the kind of movement, recurring across time and space, that Richard Landes describes in Heaven on Earth: Varieties of the Millennial Experience. Its scientific veneer makes it credible to a modern audience. If today a charismatic leader cried, “Repent. Sacrifice your goods. The end of the earth is nigh,” at best he might attract a few dozen oddball followers. But when essentially the same message is clothed in the language of science, it sweeps the world.

In Roosters of the Apocalypse I point out the uncomfortable similarities between the global warming apocalypse and the apocalypse that led the Xhosa tribe (in today’s South Africa) in 1856 to destroy their economy, which was based on cattle as ours is on energy. Relying on the vision of a 15-year-old orphan girl, the Xhosa killed an estimated half million of their cattle, ceased planting crops, and destroyed their grain stores. In return the girl promised the Xhosa’s ancestors would drive out the British and bring an even greater abundance of cattle and grain. By the end of 1857 a third to a half of the population — between 30,000 and 50,000 souls — had starved to death.

Even the age of the “prophetic” girl suggests a modern parallel. Greta Thunberg didn’t start the global warming apocalypse, but she was 15 when she began spending her school days in front of the Swedish Parliament carrying a sign reading “School Strike for Climate,” heralding the international children’s crusade against global warming she would lead a year later.

In some ways the current apocalypse is surprising. Landes reports that to be successful, an apocalypse needs to bring elites on board, and elites tend to be a hard sell, especially when prophecies demand a society self-mutilate. But in this case not only have elites been won over with breathtaking ease, but they have proved more susceptible over time than the man in the street. A recent Gallup poll found only 3 percent of the public citing climate as a key concern.


Dissent is drowned out as educational, political, media, cultural, and business elites speak with one voice. Even fossil fuel companies have thrown in the towel. The American Petroleum Institute, the oil industry’s top lobbying group, is set to propose setting a price on carbon emissions. Children are being indoctrinated in global warming doctrine from kindergarten on, in humanities as well as science classes. My granddaughter, in sixth grade in a Manhattan public school, has a class in “Clifi” (Climate Fiction), where the children read stories on the dreadful aftermath of a climate apocalypse. Politicians at the state and local level pass mandates for expensive (and unreliable) renewables to replace fossil fuels at ever earlier dates. Even conservatives are caught up in the fever. At the most recent CPAC a group urged Republicans to “get in front” on the issue and outflank the Democrats.

What can be done to prevent the global warming locomotive from steamrolling over our economy? Thus far efforts have focused on countering global warming science with better science. The Chicago-based Heartland Institute has organized 13 international conferences since 2008. The media has all but blacked out coverage, so neither the conferences nor the steady stream of climate research the Institute publishes receive any notice. The CO2 Coalition, which emphasizes that CO2, far from being a pollutant, is a nutrient vital for life, is given similar short shrift. For example, although the coalition includes distinguished scientists, Wikipedia defines it as “a climate change alarmist denial advocacy organization,” whose claims “are disputed by the vast majority of climate scientists.”

There are also excellent websites, such as Climate Depot, offering space to scientific research casting doubt on apocalyptic claims. Marc Morano, who runs the site, had the distinction in 2009 of being chosen by news outlet Grist as one of only five “criminals against humanity, against planet Earth itself” and in 2012 of being named “Climate Change Misinformer” of the Year by Media Matters.

Pitting one scientific study against another hasn’t worked. That’s because most climate scientists are on the global warming grant gravy train, the public can’t follow the abstruse language of academic studies of climate, and the apocalypse is only superficially about climate anyway. Under the circumstances, a mass movement against this folly would seem to be the only way to get through to a larger public. If people understand the menace that global warming policies pose to their way of life, there should be a huge pool of followers. Texas might be a good place to start, given its recent unexpected stay in the freezing dark, and the stark failure of its wind turbines. One advantage of such a movement is that it would cross party lines. Democratic-voting union members stand to lose their well-paid jobs in fossil fuel industries, with workers in China cornering much lower-paid jobs in solar and wind (despite pie-in-the-sky promises by President Biden and newly appointed climateer-in-chief John Kerry).

The new movement could be titled “Lights On.” Participants should have fun. There was never a claim of “settled science” more ripe for ridicule. How about contests for college students rewarding those who can document the largest number of disproven prophecies of global warming doom (for example, the end of snow, no more Arctic glaciers, U.S. coasts under water, all with specified dates now long past)? In Breitbart, John Nolte recently claimed to have found 44 of them. There can be no shortage of candidates for an award of “False Prophet of the Year.” Or “Global Warming Hypocrite of the Year,” for which John Kerry would be an outstanding candidate with his private jet, yachts, multiple mansions, and cars. And what about an award to a prominent media figure for the most absurd claim for global warming causation? One of Lindzen’s favorites is the Syrian civil war.

And how about reviving the chronicle of Climategate, which almost wiped out faith in the apocalypse before the media buried the scandal? In 2009, a hacker downloaded candid emails among top climate scientists in England and the United States that bemoaned recalcitrant data, described the “tricks” (their term) used to coax the data, reported efforts to keep the views of dissenters out of reputable journals and UN reports, and boasted of deletion of data to make it unavailable to other researchers. “If science is on your side, why do you need to make it up?” would make a good bumper sticker or t-shirt slogan.

There could be a bumper sticker with comedian George Carlin’s line: “The Planet has been through a lot worse than us.” There could be t-shirts that proclaim, “Wind Is for Sailboats.” There should be songs and cartoons (many of these can already be found on the website

The movement can have fun, but it must also be serious: members will only back politicians prepared to fight to maintain our access to cheap, reliable energy. To the extent solar and wind can someday compete on an even playing field, without subsidies and mandates, they are welcome to the energy mix.

For the current apocalypse to come to an end, the notion that man-made global warming poses an existential threat must come to be seen as ridiculous. Otherwise the policies of shutting down our traditional energy supplies to stave off this absurd end of days will themselves become an existential threat.