Thursday, April 15, 2021

The world will break through the more ambitious Paris climate target of 1.5 degrees as soon as 2030 but may still avoid a more catastrophic 2 degrees of warming if governments act immediately to dramatically reduce emissions, according to a new report.

Just another prophecy based on guesswork and bound to be as wrong as all the ones before it

The Climate Council report, Aim High, Go Fast, is based on new data from the Intergovernmental Panel on Climate Change and echoes similar findings by the Australian Academy of Science issued last week, but has prompted a dissenting report from one prominent Australian climate scientist, Bill Hare.

It warns that the more ambitious Paris target of holding warming to 1.5 degrees above pre-industrial levels cannot be achieved without what it calls “significant overshoot” and “drawdown”. Drawdown refers to the possibility of using as yet non-existent large-scale carbon dioxide removal technology to help cool and stabilise the climate after overshooting the target.

In the report the Climate Council says that in view of Australia’s historical contribution to global warming, its high emissions and its natural advantages in renewable energy generation, the government should now aim to reduce emissions by 75 per cent below 2005 levels by 2030 and reach net zero by 2035.

So far the government has committed to reducing emissions by 26-28 per cent by 2030 and has set no net-zero target, but said it would prefer to reach that milestone earlier than 2050.

Prime Minister Scott Morrison is expected to face more pressure to commit to more ambitious actions at a climate summit to be hosted by United States President Joe Biden next week and during the lead-up to the next UN climate talks in Glasgow in November.

Asked if such an abrupt reduction was possible, one of the report’s authors, executive director of the Australian National University Climate Change Institute Will Steffen, cited the example of allied nations transforming their economies in five years to defeat the Axis powers in World War II.

“The point is, it’s going to be a tough decade, no doubt about it,” he said. “There’ll be some disruption soon, but it’ll be an exciting decade and it’ll set us up for a much brighter future after 2030.”

To reach such targets Professor Steffen said the government would need to immediately halt the expansion of coal and gas and plan to support affected communities as fossil fuels were phased out. Secondly, Australia would have to reach almost 100 per cent renewables in its energy system by 2030.

The report finds “multiple lines of evidence” that the world will break through 1.5 degrees: the increasing pace at which the world has been warming since 2016; new scientific understanding of the climate system’s sensitivity; and the increasing rate of sea-level rise. There is also an analysis of global greenhouse gas emissions, which are now in line with the highest of four scenarios considered in the fifth assessment of the Intergovernmental Panel on Climate Change, the UN’s lead climate change body.

“We now face a more dangerous future, with further risks and damages locked in,” says the report.

“We have reached the endgame and if we are to limit further disruption then we must dramatically step up the scale and pace of action. Inaction or delay in the face of so much evidence is in fact an active commitment to massive global climate disruption and damage.”

Professor Steffen said the impact of temperature rises did not go up in a linear fashion, and that 2 degrees of warming was far worse than 1.5 degrees.

“The issue here is that past inaction on climate change has cost us dearly. There is plenty of momentum in the climate system, it is like trying to turn a battleship around,” he said.

“The mantra I keep going back to is that every tenth of a degree matters.”

But Bill Hare, a lead author on the fourth IPCC assessment and founder of Climate Analytics, said he believed both the Climate Council and the Australian Academy of Science had found further evidence of the need for immediate and dramatic action. But he did not agree with the view that holding global temperature rises to 1.5 degrees was virtually impossible.

His dissenting report, co-authored by his colleague Dr Carl-Friedrich Schleussner, said it is not possible to draw conclusions on temperature rises over the short time periods used in the Climate Council report; that sea level rise is a lagging rather than leading indicator of climate change; and that the Climate Council had made mistakes in its interpretation of so-called carbon emissions budgets. They further question the report’s analysis of climate sensitivity.

“The evidence presented in the Climate Council of Australia report itself does not support their claim that 1.5°C will be exceeded,” they write.

Mr Hare told the Herald and The Age he believed the evidence of physics and economics showed that 1.5 degrees was still achievable and that the target itself was a critical policy tool supporting international efforts to tackle climate change.

“[The 1.5 degree target] has become mainstream in the global climate debate, it is why nations are talking about net zero by 2050 rather than 2070.”


NASA measures direct evidence humans are causing climate change

The so-called "direct evidence" is in fact a series of estimates, with all the frailties inherent in that. Viscount Monckton comments:

"The paper says that previously the Earth's energy imbalance was detected by models, and then says that it is now detected by "radiative kernels" - which are models. Same difference.

Actually, radiative imbalance is measured both by satellites and by the ARGO bathythermographs. The paper on which IPCC (2021) will chiefly rely, Von Schuckmann et al. (2020), finds the radiative imbalance - i.e., the fraction of the total forcing in recent decades that has not yet resulted in warming - to be 0.87 Watts per square meter. Of this, 70% is anthropogenic (Wu et al. 2019, Scafetta 2021), so that the anthropogenic contribution to the imbalance is 0.61 Watts per square meter - if anything, a little more than the 0.53 given in the paper now being spun by the Marxstream media.

One can work out equilibrium sensitivity to doubled CO2 directly from this imbalance, together with a few other items of data: it is equal to 0.7 x 1.04 x 3.52 / (3.2 - 0.7 x 0.87), or 1.0 K, not the almost 4 K imagined by official climatology."
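For readers who want to check the arithmetic in Monckton's comment, his two calculations can be reproduced in a few lines. The input figures (the 0.87 W/m² imbalance, the 70% anthropogenic fraction, and his constants 1.04, 3.52 and 3.2) are taken from the quote itself and are not independently verified here:

```python
# Reproducing the arithmetic from Monckton's comment (his values, unverified):
imbalance = 0.87         # W/m^2, total radiative imbalance (Von Schuckmann et al. 2020)
anthro_fraction = 0.7    # anthropogenic share (Wu et al. 2019, Scafetta 2021)

# Anthropogenic contribution to the imbalance
anthro_imbalance = anthro_fraction * imbalance
print(f"Anthropogenic imbalance: {anthro_imbalance:.2f} W/m^2")  # ~0.61

# His equilibrium-sensitivity formula, using his constants 1.04, 3.52 and 3.2
sensitivity = 0.7 * 1.04 * 3.52 / (3.2 - anthro_imbalance)
print(f"Equilibrium sensitivity: {sensitivity:.1f} K")  # ~1.0
```

The figures do multiply out as he states: 0.7 × 0.87 ≈ 0.61 W/m², and the sensitivity expression evaluates to roughly 1.0 K.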

It may come as a surprise, given the extensive body of evidence connecting humans to climate change, that directly-observed proof of the human impact on the climate had still eluded science. That is, until now.

In a first-of-its-kind study, NASA has calculated the individual driving forces of recent climate change through direct satellite observations. And consistent with what climate models have shown for decades, greenhouse gases and suspended pollution particles in the atmosphere, called aerosols, from the burning of fossil fuels are responsible for the lion's share of modern warming.

In other words, NASA has proven what is driving climate change through direct observations — a gold standard in scientific research....

What NASA has done in this study is to calculate, or quantify, the individual forcings measured from specialized satellite observations to determine how much each component warms or cools the atmosphere. To no one's surprise, what they have found is that the radiative forces, which computer models have indicated for decades were warming the Earth, match the changes they measure in observations.....

Specifically, this study has been able to calculate solid numbers for the changes in heat trapped in the Earth system from the individual contributors that influence heat transfer, like radiation, clouds and water vapor, for the period 2003-2019. The researchers did that by analyzing satellite observations and applying what they call "radiative kernels" to disentangle the various components controlling the transfer, absorption and emission of heat inside the Earth system and what is sent back out into space. Up to this point, satellite observations of Earth's radiation budget had only measured the sum total of radiation changes, not the individual components.
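The kernel method described above can be sketched with a toy example: each component gets a precomputed sensitivity (the "kernel", in W/m² per unit change), and multiplying kernel by observed anomaly gives that component's contribution to the total flux change. This is purely illustrative, not NASA's code, and every number below is invented for demonstration:

```python
# Toy radiative-kernel decomposition (all values hypothetical, for illustration):
kernels = {            # sensitivity of top-of-atmosphere flux to each component
    "water_vapor": 1.2,
    "clouds": 0.6,
    "surface_temp": -3.3,
}
anomalies = {          # observed change in each component over the study period
    "water_vapor": 0.4,
    "clouds": 0.1,
    "surface_temp": 0.2,
}

# Contribution of each component = kernel x anomaly; the sum reconstructs
# the total radiation change that earlier satellite records could only
# measure as a single aggregate number.
contributions = {name: kernels[name] * anomalies[name] for name in kernels}
total = sum(contributions.values())
for name, c in contributions.items():
    print(f"{name}: {c:+.2f} W/m^2")
print(f"total flux change: {total:+.2f} W/m^2")
```

The point of the technique is exactly this disentangling: the satellite measures only the total, while the kernels attribute it among components.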

Then there are also feedbacks in the climate system which account for a smaller but still important amount of warming. One example of this is the fact that as the atmosphere warms it can hold more water vapor, and that means it can trap more heat, further allowing for more water vapor to build up. This is a positive feedback which perpetuates warming.




Saturday, April 03, 2021

Stunning Israeli Discovery About Reducing Cancer Mortality

They have rediscovered radiation hormesis. We have known about it since the 1930s but no matter

Israeli researchers looked at radiation and cancer data for the entire United States and reached a stunning conclusion: people living in areas with higher background radiation actually have lower incidence of some key cancers, Ben-Gurion University (BGU) reported this week.

The earth always has a low level of normal background radiation that comes from the sun and cosmic rays as well as from terrestrial sources. The BGU research team analyzed massive amounts of data on radiation levels across the United States and their relationship to cancer rates in the American population.

The scientists noted that since the 1960s, the general thinking has been that any radiation is bad, and as a result, hundreds of billions of dollars are spent around the world to reduce radiation levels as much as possible.

“We examined whether background radiation impacts human longevity and cancer mortality. Our data covered the entire US population of the 3139 US counties, encompassing over 320 million people,” said the report, written by BGU professors Vadim Fraifeld and Marina Wolfson and Dr. Elroei David of the Nuclear Research Center.

Their findings were stunning and showed that the traditional thinking about background radiation appears to have been completely wrong.

“Exposure to a high background radiation displays clear beneficial health effects in humans,” the scientists reported in their study that was published in the medical journal Biogerontology.

They found that higher background radiation levels lead to lower levels of lung, pancreatic and colon cancers in men and women as well as lower rates of brain and bladder cancers in men.

With higher radiation levels, life expectancy increased.

The team used the U.S. Environmental Protection Agency’s radiation dose calculator, retrieved data about background radiation from the entire country, and compared it with cancer rate data and life expectancy.

At the same time, they noted that the higher background radiation levels produced no decrease in leukemia or cervical, breast or prostate cancer.

“All in all, it is reasonable to suggest that a radiation threshold does exist, yet it is higher than the upper limit of the natural background radiation levels in the U.S.,” the researchers wrote, concluding that it is time to revise the apparently outdated thinking that all radiation is bad.


Climate Change is 'Big One' of 'Emerging Risks,' Says Treasury Sec.

Democrats found another opportunity to tackle their pet project of climate change. On Wednesday, Sec. of Treasury Janet Yellen made her inaugural remarks as head of the Financial Stability Oversight Council.

In reporting on Sec. Yellen's remarks, Politico's Victoria Guida wrote, with added emphasis:

Treasury Secretary Janet Yellen on Wednesday called climate change “an existential threat” and the biggest emerging risk to the health of the U.S. financial system, pledging to marshal regulatory forces to guard against its harmful effects.

Yellen made the promise during her inaugural appearance as the head of the Financial Stability Oversight Council, a panel of top regulators tasked with policing Wall Street behavior that has the potential to crash the entire economy.

The council held its first public meeting under Yellen's leadership Wednesday and focused on climate for the first time since Congress established the body in 2010. The group includes the heads of the Federal Reserve and the Securities and Exchange Commission.

“We cannot only look back and learn the lessons of last year,” Yellen said at that meeting. “We must also look ahead, at emerging risks. Climate change is obviously the big one.”

Sec. Yellen didn't waste any time then. It doesn't come as too much of a surprise though. Earlier this month, the secretary met with Jubilee USA Network, as well as leaders of the Jewish faith, Reuters reported. According to the Treasury, "She noted that the Administration is committed to using the full power of the U.S. federal government to address climate change as part of the Build Back Better plan."

Not everyone was on board, however, especially when it comes to concerns with the FSOC. As Andrew Ackerman and Kate Davidson wrote in their reporting for the Wall Street Journal:

Conservative critics have for years attacked the FSOC as too political, opaque and overreaching. Michael Piwowar, a former Republican member of the Securities and Exchange Commission, described the council in a 2014 speech as a “Firing Squad on Capitalism,” over what he described as its lack of accountability.

“It will be up to Secretary Yellen to determine how transparent she wants the FSOC to be,” said Mr. Piwowar, now executive director of the Milken Institute Center for Financial Markets, in an interview Tuesday.

Pennsylvania Sen. Pat Toomey, the ranking Republican on the Senate Banking Committee, questioned the FSOC’s work on climate. “I remain concerned that FSOC members may seek to advance a progressive social agenda on global warming, which is beyond the scope of their respective missions and authorities,” he said in a statement Wednesday. “This effort is not grounded in science or economics, but is instead a self-fulfilling prophecy: claim there are future regulatory risks for carbon intensive industries, then use unelected, unaccountable financial regulators to impose regulatory costs on those activities.”

Earlier this month, Speaker of the House Nancy Pelosi (D-CA) raised eyebrows when she referenced "climate change" when it comes to the crisis at the border. As Beth reported:

According to Pelosi, illegal aliens are crossing the border in hopes of finding work. She cited corruption in Mexico and "climate change" in the Northern Triangle – El Salvador, Guatemala and Honduras – that makes farming almost impossible, as reasons the migrants make the trek to the United States.


EPA’s Totalitarian Frontal Assault on America

Biden-Harris Environmental Protection Agency Administrator Michael Regan is clearly on a mission. He has “bold aspirations, and a long to-do list,” says The Washington Post. But to succeed, the Post acknowledges, he must “help the EPA get its groove back.” As Regan put it, “We’ve got a lot of work to do, starting with rebuilding staff morale and getting all our staff back to feeling as if they matter, their voices matter.”

Regan says his job is “to restore the scientific integrity and the utilization of data, of facts, as we move forward, and make some very important decisions.” His second goal is to increase “cooperation” between the EPA and its “subordinate” state environmental agencies. EPA will dictate; states will fall in line.

A big step toward that goal was extending a Memorandum of Agreement between the EPA, the Environmental Council of the States, and the Association of State and Territorial Health Officials. According to Regan, “EPA is committed to building on the values of transparency, respect, and an open dialogue that are the cornerstone of a successful partnership with the states.” As EPA defines the terms.

An Obama-appointed federal judge has restored the EPA’s use of “secret science” in formulating regulations that businesses and industries must follow simply because the EPA says so – with “scientific evidence” that cannot be cross-examined. U.S. District Judge Brian Morris (in Great Falls, Montana) took just hours to vacate the Trump EPA rule that would have ended this Star Chamber-style rulemaking.

For decades, the EPA relied on unreviewable studies to impose draconian restrictions on businesses and industries, and thus on the U.S. economy. Trump wanted to bring true transparency to the process. Opponents claimed the Trump “secret science” rule would block the use of critical public health studies kept secret supposedly to protect the identities of trial participants – which of course was not the case.

As Trump EPA Administrator Andrew Wheeler explained, the “secret science” rule in no way blocked previous “secret” studies; rather, it created tiers in which preference is given to studies with public data. Peer reviewers looking at a study’s raw data did not need to know any of the subjects’ names, so no patient confidentiality was at risk. Moreover, in most cases, a review of basic methods, statistics and results is sufficient to determine whether they actually support the study’s conclusion. Wheeler also noted:

“Too often Congress shirks its responsibility and defers important decisions to regulatory agencies. These regulators then invoke science to justify their actions, often without letting the public study the underlying data. Part of transparency is making sure the public knows what the agency bases its decisions on.”

Now the Biden-Harris EPA has revived its old policy based on a failed 2015 (Obama-Biden) rulemaking that twists the Clean Air Act language in an effort to destroy auto racing as a sport in the USA.

The EPA claims modifying a vehicle previously certified for street driving for use as a competition-only racecar is unlawful even for vehicles that are trailered and never driven on public roads again. This policy seeks to end a 50-year-old American tradition. It has no precedent; even California exempts racing vehicles from regulation.

Not only does the EPA claim it is illegal to convert a vehicle for racing by modifying its emission system; it claims manufacturing, selling or installing race parts for such vehicles is likewise unlawful. It’s even said enforcement actions against high performance parts – including superchargers, tuners and exhaust systems – will now be a top priority.

This policy constitutes a direct assault on the nation’s 1,300 racetracks, tens of thousands of participants and vehicle owners, and millions of racing fans nationwide. It is also a death blow to retail sales of racing products, a $2 billion a year industry. The move appears to be part of the Obama-Biden-Harris EPA strategy to rid the planet (or at least the USA) of internal combustion engines by taking away the romance of the racecar.

To try to thwart this EPA power grab, the Specialty Equipment Market Association has filed an amicus curiae (friend of the court) brief in a lawsuit filed by Gear Box Z, Inc. challenging the racecar conversion ban. The industry group is also supporting the Recognizing the Protection of Motorsports Act (RPM Act), which reaffirms the legality of converting street vehicles into race-only vehicles and confirms the legitimacy of producing, marketing and installing racing equipment.

They should prevail. But with today’s courts, do even the most specific laws still matter?

A recent Wall Street Journal editorial says the Biden-Harris EPA has a secret plan to force massive CO2 emissions reductions under the Clean Air Act, using ozone as its vehicle of choice. “Plan B” is the fallback strategy to be implemented once it is clear that even the Democrat-controlled Congress will not enact economy-killing anti-fossil fuel legislation. The ultimate goal is total fossil fuel eradication.

Under Plan B, EPA will reset the National Ambient Air Quality Standard (NAAQS) for ground-level ozone to zero – way below natural levels that Mother Nature herself emits! The “science” is based on a questionable 2017 study from Harvard’s T.H. Chan School of Public Health, which claims there is no safe level of ozone in the atmosphere. How do you prosecute Mother Nature?

Plan B responds to the failure of the Obama-Biden Clean Power Plan, which was blocked by the U.S. Supreme Court. It reflects former EPA Administrator Lisa Jackson’s blunt admission that it is technically infeasible and even legally questionable to regulate CO2 as a “criteria pollutant” under the Clean Air Act.

The simple reasons are fundamental. CO2 is what humans and animals exhale. It is what plants inhale to support photosynthesis and produce the oxygen that most life on Planet Earth requires to exist. It does not cause asthma or other diseases. CO2 emissions generated in a locality cannot be measured reliably and certainly cannot be reduced within the 10-year timetable for criteria pollutants. CO2 is not a pollutant.

Using ozone and the NAAQS to regulate CO2 is reportedly the brainchild of Joe Goffman, whom the Biden-Harris Administration has installed as principal deputy assistant administrator for the EPA’s Office of Air and Radiation. Goffman, a chief architect of the Obama era Clean Power Plan, is known as EPA’s “law whisperer.” His specialty is “teaching old laws to do new tricks.”

Goffman’s plan was jump-started on January 19, 2021, when 16 Democratic state attorneys-general filed a legal challenge to the EPA’s recently reauthorized ozone NAAQS. Their one-paragraph sue-and-settle lawsuit claims the standards are “unlawful, arbitrary and capricious and therefore must be vacated.”

The Trump EPA in December 2020 had retained the ozone NAAQS at levels set in 2015 by the Obama-Biden Administration. That action marked only the second time since the 1970 Clean Air Act was enacted that EPA completed its ozone NAAQS review within the mandatory 5-year timeframe.

As the Wall Street Journal explains, Democratic AGs, green groups and top Biden environmental regulators intend to impose the Green New Deal on states through backdoor regulations, because they know they can’t get it through the front door of Congress, even this sycophantic Congress.

Under this nefarious scheme – which could be imposed this year without any “open dialogue” in Congress – every state would be forever out of compliance, director Steve Milloy emphasizes. It is simply impossible to eliminate natural background levels of ozone. But this action would give EPA effective and arbitrary control over the entire economy, especially fossil fuel use.

Giving unelected bureaucrats and a like-minded political cabal “effective and arbitrary control” over the entire U.S. economy creates a dictatorship of faceless and nameless totalitarians whose diktats the political class can claim they are powerless to upend. This is where America is headed, unless we stop these power-crazed autocrats.


Biden threatens American energy while Beijing is waiting in the wings

China’s economy is emerging from the COVID-19 pandemic better than ever while America’s economy looks to regain its footing after an entire year lost. Lockdowns decreased oil demand and forced shutdowns at many American refineries, costing thousands of jobs across the country. China’s refineries are outperforming expectations as the rest of the world looks to recover from the coronavirus crisis. China has overtaken the U.S. in crude oil imports, yet President Biden is attacking America’s energy sector. Biden’s energy policy threatens our nation’s small refinery workers and endangers the competitive edge the United States maintains over China.

President Biden’s attack on the American energy sector is an extreme governmental overreach threatening both good paying jobs and the free market. If these small refineries and their workers disappear, we will suffer skyrocketing energy prices and risk a near total collapse of our economy. Meanwhile China and its communist dictators gather strength and position themselves to replace America as the world’s leader.

American energy jobs are being systematically wiped out by President Biden and his allies. As a former public affairs chief of staff at the Department of Labor, I find it incredibly disheartening to see the executive office take such harmful actions against honest workers and blatantly undermine American economic interests. Biden shut down the Keystone XL pipeline and threw the region into a state of economic uncertainty. Eastern Montana stands to lose one of its largest taxpayers and nearly 4000 jobs. Biden ally Gretchen Whitmer’s assault on the Line 5 pipeline, a lifeline for refineries and jobs in Michigan and Ohio, is another example of harmful government overreach undercutting the American energy sector.

Refineries in Pennsylvania, Michigan, Ohio, and across the country are in desperate need of waivers to burdensome Renewable Fuel Standard (RFS) requirements, but President Biden is ignoring their pleas and instead supporting overly restrictive requirements for RFS waivers, taking the side of big government versus American workers. Between the shutdowns of critical infrastructure and denials of RFS waivers, America’s small refiners and their employees are being squeezed out of the global marketplace.

Despite Biden’s claim to be “tough” on China, his actions against American energy are playing right into the Communist Party’s hands. President Biden has an opportunity right now to begin to put to rest the lingering suspicion that he is bought and paid for by China by ending this interference in the American energy market, which directly benefits the very Chinese government that embarrassed the Biden administration in Anchorage, Alaska earlier this month. Biden’s policy reversals are bolstering Chinese efforts to surpass American refineries in crude oil processing. In fact, for several months in 2020, during the worst of the pandemic, China actually processed more crude oil than net inputs of crude oil to U.S. refineries.

These recent developments in the Chinese energy sector are cause for serious concern. They are indicative of a more aggressive China, one that no longer fears retaliatory actions from the United States. Recently, a senior Chinese foreign policy official claimed that the United States can no longer speak from a position of strength when criticizing China. If President Biden continues to attack domestic energy, the Chinese official’s claim will become reality.

If our leaders in Washington D.C. wish to maintain economic parity with China, then President Biden must cease his needless attacks on our energy economy. As an advocate for limited government and a free market, I worry that Biden’s actions will cause energy infrastructure to collapse, causing undue harm throughout the country and allowing China’s totalitarian, oppressive regime to capitalize on what would amount to little more than a US energy capitulation.




Thursday, April 01, 2021

Climate change to blame for early cherry blossom season in Japan

Like most of the rest of the world, Japan has been slowly warming for the last century or so. That could have some effect on cherry blossoming. Pretending that global warming is the only influence, or even the major influence, is however slipshod.

The obvious influence is urbanization. Urban centres are warmer, and more so as society becomes ever more energy intensive: the more people use air-conditioners in summer and heaters in winter, the greater the heat output into the urban environment. Much warmer cities, rather than the trivial increase in global warming, would be the major influence on cherry blossoming.

About 63 million people in a normal year flock to Japan to see its most famous flower in full bloom.

Cherry blossoms, or sakura, hit their peak bloom in April, when they paint the country’s parks and gardens with shades of pink and white and fuel a multimillion-dollar spring tourism boom.

But this year, in the ancient capital city of Kyoto, the cherry blossoms hit peak bloom too early, on March 26 – the earliest since the Japan Meteorological Agency began collecting data on the flowers 70 years ago.

Others say the bloom is even earlier than what’s been noted in diaries and poetry from Kyoto that date back hundreds of years, AP reported.

According to the 2021 data in Kyoto, the cherry blossoms reached peak bloom 10 days ahead of the 30-year average, and it was a similar story in other cities across Japan.

Scientists fear climate change is to blame.

“We can say it’s most likely because of the impact of the global warming,” Shunji Anbe from the Japan Meteorological Agency told AP, adding the trees were sensitive to temperature changes.

The average March temperature in Kyoto, a key destination for cherry blossoms, rose to 10.6C in 2020, up from 8.6C in 1953. This March, the average temperature was even higher, at 12.4C.

Of the 58 benchmark trees across Japan that are tracked by the agency, 40 reached their peak bloom before the start of April, with 14 blooming in record time, according to AP.

It normally takes about two weeks for the first bud to appear and all the blossoms to fall from the tree.

Benjamin Cook, a research scientist at Columbia University, told The Washington Post the cherry blossom peak bloom date had been relatively stable for about 1000 years, from the years 812 to 1800, before a sharp shift to earlier in spring.

“Since the 1800s, warming has led to a steady trend toward earlier flowering that continues to the present day,” he said.

“Some of this warming is due to climate change, but some is also likely from an enhanced heat island effect due to increased urbanisation of the environment over the last couple of centuries.”


Carbon War Pits Politics Against Reality

President Joe Biden is taking “aggressive action,” the White House recently announced, to ensure that the United States achieves a “carbon pollution-free power sector” by 2035 and a “net-zero economy” just 15 years later.

On the other side of the pond, Boris Johnson, Britain’s prime minister, has pledged that the United Kingdom will reduce its carbon emissions by 68% in the next nine years, while the European Parliament has voted to reduce emissions by 60%.

In an ideal world, this would be great news. In the real world—one in which renewable energy is still way too inefficient and dependent on subsidies from governments already drowning in debt—such objectives are unrealistic.

Global debt was expected to reach some $277 trillion, or about 365% of the globe’s entire gross domestic product (GDP), by the end of last year, according to the World Economic Forum. The U.S. fisc is equally out of whack. Data from the Federal Reserve Bank of St. Louis shows that U.S. debt has exceeded annual GDP every year since 2015, long before the multi-trillion-dollar COVID-19 “relief” packages were passed. From a financial perspective alone, therefore, the administration’s objectives invite skepticism.

They also invite skepticism from an energy perspective. Take the International Energy Agency (IEA), whose World Energy Outlook sets the objective of reducing carbon dioxide (CO2) emissions by 60% in 20 years.

To achieve that goal, the IEA aims for a 25% drop in energy demand. But according to a recent analysis by Goehring & Rozencwajg, an investment firm specializing in commodities, this is unlikely to happen. While energy demand has dropped by 10% over the last 20 years in wealthy, developed countries, the analysis found, it has increased by 65% in developing countries, which have been driving world economic growth in recent years.

The IEA also assumes that the amount of CO2 emitted per unit of energy will decline by half. This is also unlikely, since the developed countries that have been attempting to reduce emissions have achieved only about a 10% reduction. Even Germany, which gets nearly 40% of its electricity from renewables, has been unable to come close to this goal, achieving little better than the United States, where wind and solar account for less than 9% of generated power.
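The two IEA assumptions compound multiplicatively, which is worth making explicit: a 25% fall in energy demand combined with a halving of CO2 per unit of energy implies emissions of 0.75 × 0.5 = 37.5% of today's level, a 62.5% cut, covering the stated 60% objective. A minimal check of that arithmetic:

```python
# The IEA scenario's two assumptions multiply together
# (illustrative arithmetic only; percentages as cited in the text):
demand_factor = 1 - 0.25      # energy demand falls 25%
intensity_factor = 1 - 0.50   # CO2 emitted per unit of energy halves

emissions_factor = demand_factor * intensity_factor   # 0.375 of today's emissions
reduction = 1 - emissions_factor
print(f"Implied emissions cut: {reduction:.1%}")      # 62.5%
```

This is why the analysis focuses on whether each factor is plausible on its own: if either assumption falls short, the combined reduction falls short of the 60% target.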

To achieve President Biden’s goal of carbon-free electricity by 2035, the United States would need to build a staggering number of new nuclear plants (a much cleaner source of fuel than hydrocarbons) or even greater numbers of new solar and wind power installations.

Without such increases, Reason magazine science correspondent Ronald Bailey has calculated that it would take some 50 years for renewables to replace existing fossil-fuel based sources of electric power—at a cost of additional trillions of dollars in federal spending.

The lesson here is not that we need to opt for pollution or renounce our clean-energy ideals, but that we also need to consider current financial realities, the added costs such a shift would entail and the very real technological constraints.

“Human kind cannot bear very much reality,” wrote T.S. Eliot in his “Four Quartets.” He might have been thinking of politicians and climate change.


Prosperity frees people to protect the environment

True Environmentalists Should Prioritize Economic Prosperity

The COVID-19 pandemic and the accompanying lockdowns reduced global CO2 emissions by 7 percent last year. Some environmentalists, such as the University College London professor Mariana Mazzucato, have thus wondered about the feasibility of future “climate lockdowns … to tackle a climate emergency.” Yet even if we ignore the negative consequences of the lockdowns on broader health outcomes and human psychology, Mazzucato appears to fail to account for the well-known correlation between economic prosperity and environmental quality.

Lockdowns have contributed to around 100 million people, most of them living in the developing world, sliding back into extreme poverty. While they may have lowered the CO2 emissions in the short term, by increasing absolute poverty, the lockdowns may cause massive environmental destruction in the long term. Simply put, people can afford to care about the environment only when they have enough income to cover their basic needs. If their survival depends on killing an endangered animal or cutting down a rare tree, then so be it.

The Environmental Kuznets Curve (EKC) hypothesis posits that environmental damage increases in tandem with economic growth, but only until a certain level of income is reached. Once people are wealthy enough not to have to worry about day-to-day survival, environmental degradation stops, and ecosystems begin to recover. The environmental scientist Jesse H. Ausubel, for example, suggests that once a nation achieves a GDP per capita of $6,200 (in 2021 dollars), deforestation stops or afforestation occurs.
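The inverted-U shape the EKC hypothesis describes can be sketched with a toy function. This is a stylized illustration, not a fitted model; the $6,200 turning point is the only number taken from the text, and the functional form and units are arbitrary:

```python
import math

def ekc_damage(gdp_per_capita, turning_point=6200.0):
    """Stylized Environmental Kuznets Curve: environmental damage
    (arbitrary units) rises with log income up to a turning point,
    then declines as societies grow wealthy enough to conserve."""
    x = math.log(gdp_per_capita)
    p = math.log(turning_point)
    return 1.0 - (x - p) ** 2  # peaks exactly at the turning point

# Damage is highest near the threshold and lower on either side of it.
for income in (500, 6200, 50000):
    print(income, round(ekc_damage(income), 2))
```

The point of the sketch is only the shape: damage peaks around the threshold income and falls on both sides of it, which is what the deforestation and afforestation evidence below is said to show.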

In fact, forest coverage is growing in China, Russia, India, and Vietnam – all emerging economies that reached the $6,200-mark. The curve is even clearer in wealthy regions like North America and Europe – both of which have more trees today than they did a century ago. The UK, for example, has more than doubled its forest area in the last 100 years. Conversely, deforestation continues in poor African and Latin American countries. Scientists have found that the EKC holds true in all manner of environmental domains, including water pollution, carbon dioxide emissions, nitrogen, sulphur, and biodiversity.

While it is too early to gauge the impact of the lockdowns on forest coverage, the lockdowns have already wreaked havoc on endangered species and protected habitats in the developing world. In Kenya, the killing of giraffes has skyrocketed. Given that a tonne of giraffe meat is worth about $1,000 (i.e., almost seven months of the average Kenyan salary), it is unsurprising that desperate locals have resorted to slaughtering the endangered animal. Kenya’s Mara Elephant Project also recorded that illegal logging in the region peaked in the months following the first lockdown. In Botswana, government workers had to evacuate dozens of critically endangered black rhinos from the Okavango Delta after six of the animals were found dead in the wake of the lockdowns.

In Colombia, the poaching of endangered pumas and jaguars has also rapidly increased. In India, tiger numbers held steady for the last two decades as incomes increased. But since the lockdowns were imposed, various reports have highlighted an upsurge in tiger poaching and illegal hunting. Similarly, in India’s West Bengal region, where over a million jobs have been lost due to the lockdowns, the local authorities have reported the first-ever instance of illegal ivory poaching in the region. The problem of illegal poaching is exacerbated by the fact that park rangers in some countries have been left without work and income. The animals, in other words, have lost their human protectors.

The World Economic Forum recently acknowledged that the significant increase in bushmeat harvesting and wildlife trafficking in Africa “is directly linked to COVID-19-related lockdowns.” Similarly, the UK-based wildlife charity People’s Trust for Endangered Species has warned that “unintended consequences” of lockdowns could undo “decades of work” devoted to animal protection.

Fortunately for mother nature, as economies begin to recover from the government-mandated lockdowns, the number of people who rely on illegal activities will decrease, and biodiversity will slowly recover. However, the EKC and the wretched impact of lockdowns on poverty and biodiversity teach us an important lesson – true environmentalists should seek to prioritize economic growth, not lower it. Poverty-reducing policies, such as strong property rights, freedom to trade, lighter regulation, and fewer burdensome taxes, as shown annually in the Fraser Institute’s Economic Freedom of the World Report, remain some of the most reliable ways of raising economic prosperity for all.

In conclusion, poor people depend on mother nature to survive. Rich people, in contrast, can decouple themselves from the environment, protect wildlife for future generations, and return vast swathes of land to nature. Now, what environmentalist wouldn’t want that?


Another stupid prophecy about the reef

What the future temperature will be nobody knows. But the report below assumes a large rise. Even if that came to pass, it would not mean the end of the reef. Corals grow in wildly different temperatures -- from Iceland to the Persian Gulf. So we might expect some turnover of species but that is all

It's boring to have to point this out again but Australian corals have the greatest diversity in the Torres Strait, where the temperature is always HIGH. Corals THRIVE in high temperatures. Some species may not but there are plenty that do

A damning new report has painted a grim picture of Australia’s future, with one of the nation’s most renowned natural wonders set to suffer.

Up to 90 per cent of the world’s coral reefs are expected to vanish, even at low levels of warming, and there are grave fears for one of Australia’s most famous natural wonders. The outlook for the Great Barrier Reef is considered “very poor”, according to a new report by the Australian Academy of Science.

And climate change is a major driver.

At 1.5 degrees of warming, the world will lose between 70 and 90 per cent of coral reefs.

“Substantial losses in ocean productivity, ongoing ocean acidification, and the increasing deterioration of coastal systems such as mangroves and seagrasses are projected to occur if global warming exceeds 2C,” the harrowing report states.

Scientists said the target set by the Paris Climate Agreement of keeping global warming to 1.5C was “virtually impossible” as they painted a grim picture for Australia’s ecosystems.

It is more likely that global temperatures will soar by up to 3C. “Critical thresholds in many natural systems are likely to be exceeded as global warming of 1.5C above pre-industrial levels continues,” the report said.

“These impacts will increase as global warming reaches 2C and beyond, with iconic ecosystems such as the Great Barrier Reef and the World Heritage-listed Kakadu National Park being severely affected.

“At 3C of global warming, many of Australia’s ecological systems would be unrecognisable.”

A leading figure within the European Union has even sounded the alarm on the Great Barrier Reef.

The EU’s commissioner for environment, oceans and fisheries, Virginijus Sinkevičius, told Guardian Australia he feared for the natural wonder. “As long as we do not change our behaviours, things will not improve,” he said.

Global warming has already triggered mass bleaching events on the Great Barrier Reef that have destroyed at least half of the world’s largest reef system. It has also contributed to droughts and bushfires.

Professor Ove Hoegh-Guldberg, who chairs the expert panel that developed the report, said a rapid transition to net zero greenhouse gas emissions was required if the international community was to limit warming to well below 2C.

“Current international commitments to greenhouse gas emission reduction, if unchanged, would result in average global surface temperatures that are 3C above the pre-industrial period in the lifetimes of our children and grandchildren,” he said.

“The evidence presented in this risk-assessment report, which is based on peer-reviewed scientific literature, indicates that this would have serious consequences for Australia and the world.”

But scientists said it was possible for Australia to meet its climate goals.

Australian Academy of Science president John Shine said the new report suggested that while the planet was warming, science had solutions.

“Australia is well positioned to meet the climate change challenge by combining our scientific knowledge with economic opportunities associated with moves to net zero greenhouse gas emissions,” Professor Shine said.

The report makes 10 recommendations, including scaling up the development and implementation of next-generation zero greenhouse gas technologies and exploring how food production and supply systems can prepare for climate change.




Monday, March 29, 2021

Economically destructive cap and trade for HFCs is here

This is a bit complicated so bear with me. To begin with, Biden’s avalanche of climate scare executive orders included one telling the State Department to prepare the Kigali Amendment to the Montreal Protocol on Substances that Deplete the Ozone Layer for submission to the Senate, for ratification.

The Kigali Amendment has nothing to do with ozone depletion (a fanciful tale in itself); quite the contrary, in fact. The 1987 Montreal Protocol mandated the phase-out of CFCs, the primary refrigerant and aerosol propellant at the time. CFCs were globally replaced with HFCs, at great expense and bother.

The 2015 Kigali Amendment now mandates the phase-out of HFCs. I am not making this up.

HFCs do not threaten the ozone layer, so they have nothing to do with the Montreal Protocol. But the Protocol community decided to do what is called “mission creep”. They crept over to global warming, where HFCs are considered a problem. They have what is called a high “global warming potential” or GWP, so they too have to go.

Reportedly Obama and Kerry played a big role in creeping the Protocol, but they never submitted it for Senate ratification, knowing it would never get past the Republicans.

But now, to steal a great line: A funny thing happened on the way to the Senate.

Remember the giant Omnibus Appropriations Act passed in February? It funded the federal government and the Covid stimulus to the tune of $2.3 trillion. It also, as usual, included some riders that probably could not pass by themselves.

Well on page 1074 we find the “American Innovation and Manufacturing Act” or AIM. Incredibly, AIM includes the entire Kigali Amendment. (Still not making it up.) Not by name, mind you, but all the HFC phaseout rules and timetable, pretty much word for word. I wonder if the Senators who voted for this addition to the Omnibus knew they were letting Kigali in the back door?

Simply put, this is a cap and trade system for HFCs, with a cap that declines over time to the point where almost no HFCs are allowed in America.

So, for example, AIM and Kigali establish the same cap. Here it is almost funny. Kigali was formulated in 2015 so the cap is mostly based on how much HFC was made and imported in 2011-13. Back then this was recent supply data. AIM uses the same dates, even though the present HFC supply situation may be very different. The only explanation for AIM using 8 to 10 year old data to establish the cap today is that it is Kigali by the back door.

So the Senate will be asked to ratify something that is already law. I hope they refuse but maybe it is not worth the filibuster. To paraphrase another great line, the greens don’t need no stinking Senate. They got Kigali in the back door already. Note that China, the world’s biggest producer of HFCs, has not ratified Kigali or implemented it by law.

In any case we now have before us a looming declining-cap and trade system for HFCs. EPA is moving quickly and is expected to propose the regulations for this system shortly. There are some big potential problems.

Keep in mind that some of the many primary uses of HFCs, in vast quantities, are these:

Air conditioning in cars, homes and big buildings

Refrigerators, freezers and chillers

Aerosol sprays

Electric power transformers

Heat pumps

Structural foam

Fire suppression

Note too that the leading candidates for HFC replacement are HFOs, which presently have some serious problems. For example they can be flammable, and they do not last all that long. In fact one reason they do not have a high GWP is that they self-destruct quickly in air.

By far the biggest problem with AIM is that outdated cap. Technically it is called a “baseline,” which sounds friendlier than a cap. But the HFC allowances EPA will distribute, in ever-decreasing amounts, are drawn from the baseline, so it is the cap.

In fact there are two big problems here. First of all we really have very little information on HFC production and import 8 to 10 years ago. EPA recently acknowledged this, again in an almost funny way.

They issued what is called a “Notice of data availability” or NODA. Normally an agency issues a NODA to announce the availability of new data, hence the name. But EPA’s NODA says they do not have good data about HFCs in 2011-13 and asks for suggestions. So this is really an EPA “Notice of data unavailability” or NODU.

Second it looks like HFC use today is much greater than it was 8-10 years ago. It is crucial that the AIM baseline developed by EPA be accurate, and especially that EPA’s estimates are not significantly lower than reality. The baseline determines the allocations of allowances and these must be adequate, lest there be a severe shortage of HFCs.

To take an extreme example, suppose the EPA consumption baseline is just half of what is needed for business as usual. In that case the initial, modest 10% reduction, which is effective immediately, becomes a destructive 55% cut. If EPA is low by just 20%, the 10% reduction still balloons to 28%. Such a shortage of allowable HFCs could wreak havoc with certain industries and important products.
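The arithmetic behind these figures is easy to verify. A minimal sketch: only the 10% scheduled reduction and the two baseline-error scenarios come from the text; the function name is illustrative:

```python
def effective_cut(baseline_vs_demand, scheduled_cut=0.10):
    """If EPA's baseline is only `baseline_vs_demand` of true demand,
    allowances are (1 - scheduled_cut) * baseline, so the real cut
    measured against actual demand is correspondingly deeper."""
    allowed = (1 - scheduled_cut) * baseline_vs_demand
    return 1 - allowed

print(round(effective_cut(0.5), 2))  # baseline half of demand -> 0.55 (55% cut)
print(round(effective_cut(0.8), 2))  # baseline 20% low        -> 0.28 (28% cut)
```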

Then there is the problem of how EPA can even find all of the companies that use HFCs or import HFC containing products, much less how they can allocate the increasingly limited allowances to them. With SO2 allowances it was relatively easy because we knew where the big coal-fired power plants were and they did not change over time.

In contrast, HFC use and importation is a highly dynamic situation. For just one small example, roughly half of the cars sold in America are imported and all of them contain HFCs. In the case of aerosol sprays the allowance allocation problem is mind boggling.

There are other major problems, some not in Kigali, but this is enough to make the point. We are looking at a cap and trade phaseout of a ubiquitous harmless chemical, all in the name of climate change.

AIM is climate craziness personified, a prescription for disaster, especially economically destructive shortages.


The Social Cost of Carbon and Climate Sensitivity Is Model Manipulation at Its Finest

The “social cost of carbon” is a calculation that the Biden administration is looking to use to justify stringent regulation of carbon dioxide emissions.

Missouri Attorney General Eric Schmitt—joined by Arkansas, Arizona, Indiana, Kansas, Montana, Nebraska, Ohio, Oklahoma, South Carolina, Tennessee, and Utah—has now filed a lawsuit, arguing that the use of this metric in policy would constitute an overreach of government power.

Carbon dioxide is an odorless gas that is the basis for almost all plant life on earth. Of course, carbon dioxide emissions have slightly warmed the surface temperature of the planet, but on the other hand, access to plentiful energy has been accompanied by a doubling of life expectancy in the developed world and an elevenfold growth in personal wealth, as noted in the 2016 book “Lukewarming.”

A recent commentary that one of us (Kevin Dayaratna) published, titled “Why the Social Cost of Carbon Is the Most Useless Number You’ve Never Heard Of,” presented years of research on the topic conducted at The Heritage Foundation’s Center for Data Analysis.

He noted how easy it is to artificially raise the social cost of carbon by manipulating various assumptions in the calculation, including the discount rate, which is essentially an estimate of how much money invested today in, say, the stock market will grow in the future.

The Office of Management and Budget has recommended calculating things like the social cost of carbon with discount rates of 3%, 5%, and 7%. Obviously, at the higher rates, the social cost of carbon becomes pretty low. Using a 3% discount rate, the Biden administration hiked the social cost of carbon up to $51 per ton, a significant increase from the Trump administration’s $7 per ton.
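The leverage the discount rate exerts is standard present-value arithmetic. A hedged illustration: only the discount rates come from the text; the $1,000 damage figure and 50-year horizon are hypothetical placeholders:

```python
def present_value(future_cost, rate, years):
    # Standard discounting: PV = cost / (1 + r)^years. A higher rate
    # shrinks the present value of far-future climate damages, and
    # with it the social cost of carbon.
    return future_cost / (1 + rate) ** years

damage, horizon = 1000.0, 50  # hypothetical damage 50 years from now
for r in (0.02, 0.03, 0.05, 0.07):
    print(f"{r:.0%} discount rate -> ${present_value(damage, r, horizon):,.2f}")
```

Running this shows the same $1,000 of future damage is worth several times more today at 2% than at 7%, which is why the choice of rate dominates the final number.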

Even that might not do for the Biden administration, which could rely upon the recent arguments made by University of Chicago economist Michael Greenstone, who said that the discount rate should be 2% or lower.

Additionally, in order to determine the social cost of carbon, we need to have a good idea of how much the planet’s surface will warm under various policy scenarios.

To calculate this level, scientists have for decades used computer models to find the “sensitivity” of climate to an arbitrary amount of carbon dioxide emissions. This sensitivity is usually the calculated warming, in terms of temperature, for a doubling of atmospheric carbon dioxide.

Here is a dirty little secret that few are aware of: All those horrifying future temperature changes that grace the front pages of papers of record aren’t really the predicted warming above today’s level. Instead, they are the difference between two models of climate change.

The “base climate” isn’t the observed global temperature at a given point in time. Instead, it is what a computer model simulates temperatures to be prior to any significant changes in carbon dioxide.

Reality need not apply to these calculations. And there are sometimes very big differences between the base models and reality, especially in the high latitudes of both hemispheres, and over the world’s vast tropics.

The usual procedure is then to instantaneously quadruple carbon dioxide and let the model spin up to an equilibrium climate. Then—hold onto your hat—that number is divided by two, taking advantage of the fact that warming increases roughly linearly with each doubling of carbon dioxide (that is, logarithmically with its concentration), something that has been known for a long time. The final figure is called the equilibrium climate sensitivity to doubled carbon dioxide.
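The divide-by-two step follows directly from that logarithmic relationship, since a quadrupling is exactly two doublings. A sketch; the 3 C sensitivity is a placeholder, not a claim about any particular model:

```python
import math

def warming(conc_ratio, sensitivity):
    # Warming scales with the number of CO2 doublings:
    # dT = S * log2(C / C0), so a quadrupling is exactly two doublings.
    return sensitivity * math.log2(conc_ratio)

S = 3.0                  # placeholder sensitivity, C per doubling
dT_4x = warming(4.0, S)  # equilibrium warming after an instantaneous quadrupling
print(dT_4x / 2)         # halving the 4x result recovers the per-doubling value
```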

With regard to the equilibrium climate sensitivity, climate science is very odd: The more money we spend studying it, the more uncertain our forecasts become.

This fact is becoming increasingly obvious as a new suite of models is emerging that will be incorporated in the next climate science report from the U.N.’s Intergovernmental Panel on Climate Change, to be released next year.

There was no real narrowing of the range of the equilibrium climate sensitivity for decades after a 1979 National Academy of Sciences report, “Carbon Dioxide and Climate: A Scientific Assessment,” chaired by Jule Charney of the Massachusetts Institute of Technology.

The “Charney Sensitivity,” as it came to be called, was 1.5-4.5 C for the lower atmospheric warming that would be caused by a doubling of carbon dioxide.

Subsequent assessments, such as some of the serial “scientific assessments” of the Intergovernmental Panel on Climate Change, gave the same range, or something very close.

Periodically, the U.S. Department of Energy runs what it calls “coupled model intercomparison projects.” The penultimate one, used in the 2013 Intergovernmental Panel on Climate Change assessment, contained 32 families of models with a sensitivity range of 2.1-4.7 C, and a mean value of 3.4 C—i.e., warmer lower and mean values than Charney.

Nevertheless, the Intergovernmental Panel on Climate Change rounded this range back to the good old 1.5-4.5 C, because there was some skepticism about the warmer models.

Despite these differences between various base climate models and the doubled carbon dioxide calculation, reality-based calculations of the equilibrium climate sensitivity by other researchers yield much lower sensitivities, between 1.4 and 1.7 C.

The new coupled model intercomparison projects model suite, on the other hand, displays an even larger range of sensitivity beyond what has been observed. The range across the models currently available (which is most of them) is 1.8-5.6 C, with an estimated mean of 4 C, which is likely what the Biden administration will use to determine the social cost of carbon.

So, sadly, the new coupled model intercomparison project models are worse than the older ones.

A 2017 study shows that, with one exception, the older coupled model intercomparison project models made large systematic errors over the entire globe’s tropics. The exception was a Russian model, which also had the lowest sensitivity of all, at 2.05 C.

Last year, researchers examined the new coupled model intercomparison projects model suite, and what they found was not good:

“Rather than being resolved, the problem has become worse, since now every member of the CMIP6 generation of climate models exhibits an upward bias in the entire global troposphere as well as in the tropics.”

A very recent paper just published in Geophysical Research Letters indicates that it may be that new estimates of the enhancements of clouds by human aerosol emissions are the problem. Interestingly, the model that has the least cloud interaction is the revised Russian model, and its sensitivity is down to 1.8 C, but it nonetheless still overpredicts observed global warming.

When it became apparent that the new models were predicting even more warming than their predecessors, Paul Voosen, the climate correspondent at Science magazine, interviewed a number of climate scientists and found that the new, “improved” renditions of the cloud-aerosol interaction are causing real problems, either completely eliminating any warming in the 20th century or producing far too much.

One of the scientists involved, Andrew Gettelman, told Voosen that “it took us a year to work that out,” proving yet again that climate scientists modify their models to give what French modeler Frederic Hourdin called an “anticipated acceptable result.”

Acceptable to whom? Hourdin’s landmark paper clearly indicates that it is scientists, not objective science, who subjectively decide how much warming looks right.

The implications of the systematic problems with coupled model intercomparison project models and other manipulated models for the social cost of carbon may be big: The Biden administration will rely on these models to beef up the social cost of carbon as well.

In fact, the Obama administration had done so by using an outdated equilibrium climate sensitivity distribution that was not grounded in reality and that inflated its social cost of carbon estimates.

In fact, peer-reviewed research conducted by Kevin Dayaratna, Pat Michaels, Ross McKitrick, and David Kreutzer in two separate journals has illustrated that under reasonable and realistic assumptions for climate sensitivity, alongside other assumptions, the social cost of carbon may effectively be zero or even negative.

It is now apparent that the reason for using the social cost of carbon to begin with is very simple: to be able to control the energy, agricultural, and industrial sectors of the economy, which will result in big costs for ordinary Americans with little to no climate benefit in return.

So altogether, we have one manipulated class of models—models determining climate sensitivity—likely being used as a basis for manipulating the social cost of carbon. The consequences for the uncertainty of the social cost of carbon are profound.

As a result, the public should be very cautious about accepting new calculations of the social cost of carbon. Although the social cost of carbon is based on an interesting class of statistical models, its use in policy should also serve as a case study of model manipulation at its finest.


Canada: Supreme Court Rules Mandatory Carbon Price Constitutional

The Supreme Court of Canada, in a pivotal victory for Prime Minister Justin Trudeau’s climate policy, has ruled that the government’s decision to mandate a national carbon price to reduce greenhouse gas emissions is constitutional.

In a split 6-3 decision issued Thursday morning, Canada’s highest court ruled in favor of the nation’s federal government following a hotly contested legal battle over the government’s decision to impose a minimum fuel charge on all distributors and producers of carbon-based fuel in the country. The move, approved by parliament in 2018, received immediate pushback from a number of Canada’s provinces who claimed the decision was a blatant overreach on behalf of the government and argued that such decisions fall solely under their provincial authority.

These challenges resulted in a lengthy court battle, numerous appeals and conflicting rulings before ultimately landing at the feet of the Supreme Court, which has now definitively found that the decision to issue the carbon price was legal.

Chief Justice Richard Wagner, writing for the majority, said that at the heart of the legal battle rests the stark reality that climate change is a very real threat to the safety and wellbeing of humanity.

“Climate change is real,” Wagner wrote in Thursday’s ruling. “It is caused by greenhouse gas emissions resulting from human activities, and it poses a grave threat to humanity’s future. The only way to address the threat of climate change is to reduce greenhouse gas emissions.”

Because the threat of climate change is so severe, Wagner says, the gravity of the problem gives the government authority to act under the “peace, order and good government” clause of the Canadian Constitution. The clause, commonly referred to as the POGG clause, is rarely successfully cited as the basis for governmental action but does nonetheless give federal leaders the authority to act on issues that relate to the entire nation.

Wagner says this is where Trudeau’s carbon price prevails. While the provinces — namely the more conservative or oil-centric Alberta, Ontario and Saskatchewan areas — claim that managing natural resources to combat climate change is something they can do independently, POGG can be activated when there is a clear inability on behalf of all the provinces to come together and fix the problem themselves.

The chief justice says that allowing provinces to handle this on their own would only hold Canada back from combating climate change as a collective nation. Even if the majority of provinces were able to coordinate their efforts, it would only take a small number of provinces unwilling to impose a minimum carbon price to undermine the actions of the rest of the country.

The ruling states that only through a national and unified approach, in which all provinces are bound to play by the same carbon pricing rules, does Canada have a chance at successfully reducing its carbon footprint.

The judge notes that provinces are still allowed to regulate themselves when it comes to their carbon pricing systems and can still choose their own regulatory frameworks when it comes to emissions standards. All they have to do is comply with the minimum standards laid out by the federal government or else they risk being slapped with an increased carbon tax.

Wagner also notes that carbon pricing works. The justice writes that there is a “broad consensus among international bodies” that setting these minimum prices can significantly cut back on greenhouse emissions from carbon, with the idea being that the more carbon costs, the less people will actually use it.

The ruling’s mention of an international consensus regarding the effectiveness of carbon pricing methods could have some possible implications for other nations that are considering similar measures. Implementing a carbon pricing system in the United States, for instance, has often been cited as a crucial step in overhauling America’s energy policies, but has so far failed to get off the ground.

Thursday’s ruling officially gives Trudeau’s climate policy the ability to forge ahead with imposing minimum carbon prices, which will continue to rise throughout the next decade to further discourage carbon use. The minimum price is currently set at $30 per tonne (about 1.1 U.S. tons) of emissions, but the government says it will continue to raise the minimum over the next few years before it ultimately hits $170 per tonne by 2030.


We should learn what lessons from Fukushima?

Lesson #1: People died from forced evacuations, not from radiation

Dr. Kelvin Kemm

A decade has passed since the Great East Japan Earthquake, and the name Fukushima is etched into history. But few people know the truth of what happened. The phrase, “the lessons learned from Fukushima,” is well-known. But how do people implement them, if they don’t know what happened, or what lessons they should actually learn?

It was after lunch on 11 March 2011 that a giant earthquake occurred 72 kilometers (45 miles) off the Oshika Peninsula in Japan. It registered 9.0 on the Richter Scale, making it the largest ’quake ever recorded in Japan. The undersea ground movement, over 30 km (18 miles) beneath the ocean’s surface, lifted up a huge volume of water, like an immense moving hill. Meanwhile, the ground shockwave travelled toward the land at high speed. It struck Japan and shook the ground for six terrifying minutes.

The shock wave travelled under 11 nuclear reactors, including two separate Fukushima complexes: Fukushima-Daiichi and Fukushima-Daini. (Daiichi means ‘Complex 1’ and Daini ‘Complex 2’.) All 11 reactors shut down, as they were designed to do, and no doubt all the reactor operators breathed a great sigh of relief. It was premature.

The mound of sea water was still traveling. As the water “hill” entered shallow water, nearer the land, it was lifted up into a towering wave as high as 40 meters (130 feet!) in places. Then, some 50 minutes after the earthquake, the tsunami struck the Fukushima-Daiichi nuclear power station. Some kilometres away, when water struck the Fukushima-Daini nuclear power station, it was “only” 9 m (30 ft) high, which was not as devastating as at Daiichi. Daini did not make it into the news.

The water jumped the protective sea walls at Fukushima-Daiichi. The sighs of relief from a half hour before turned into concern and dread. Over at the Fukushima Daini power station, 12 km (7 mi) to the south, water also caused damage to machinery, but the reactors were not harmed. There was no risk of radiation release, so the Daini power station was of no interest to the international media. Daini was safely shut down to “cold shutdown” after two days.

As a result, over the past decade, any reference to “Fukushima” has meant only the Daiichi power station and not the other one.

The devastating tsunami swept up to 10 km (6 mi) inland in places, washing away buildings, roads, and telecommunication and power lines. Over 15,000 people were killed, mainly by drowning.

Although all the nuclear reactors had shut down to a state known as “hot shutdown,” the reactors were still very hot and needed residual cooling for many hours after the urgent fast shutdown. People instinctively know not to put their hands on the engine block of a car right after it has been switched off. Nuclear reactors are the same and need to cool down until they reach the safe state known as “cold shutdown.”

A nuclear reactor has pumps that circulate water through the reactor until it cools. But the Fukushima electrical pumps failed, because the tsunami had washed away the incoming electrical lines. The reactor system automatically switched to diesel-driven generators to keep the cooling pumps going, but the water had also washed away the diesel fuel supply, so the generators ran for only a short while. The system then switched to emergency batteries, but the batteries were never designed to last for days and could supply emergency power for only about eight hours.
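The layered backup chain described above (grid power, then on-site diesel generators, then batteries) can be sketched as a simple failover calculation. This is an illustrative sketch only; the durations below are assumptions for illustration, except the roughly eight-hour battery figure cited in the text.

```python
# Illustrative sketch of the layered backup power chain for reactor
# cooling pumps. Durations are assumed values for illustration only;
# the ~8-hour battery figure is the one cited in the text.
def cooling_power_hours(grid_ok, diesel_hours, battery_hours, hours_needed):
    """Return how many hours of cooling power are actually available."""
    if grid_ok:
        return hours_needed          # grid intact: cooling runs as long as needed
    # Grid lost: fall back to diesel generators, then emergency batteries.
    return min(diesel_hours + battery_hours, hours_needed)

# At Fukushima-Daiichi: grid lost, diesel fuel washed away (brief run),
# batteries lasted about 8 hours -- far short of the days required.
print(cooling_power_hours(grid_ok=False, diesel_hours=1, battery_hours=8,
                          hours_needed=96))  # -> 9
```

The point of the sketch is that each backup layer only buys time; once the last layer is exhausted before cooling is complete, the fuel overheats.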

The hot fuel could not be cooled, and over the next three or four days the fuel in three reactors melted, much like a candle melts.

The world media watched and broadcast the blow-by-blow action. Japanese authorities started to panic under the international spotlight. The cooling water, no longer circulating, was boiling off inside the reactors, exposing the hot fuel to steam; the chemical reaction between the two produced hydrogen gas. As the steam pressure rose, the engineers decided to open valves to release the pressure. That worked as planned, but it released the hydrogen as well.

Hydrogen, being light, rose to the roof, where the ventilation system was not working because there was no electricity. After a while some stray spark ignited the hydrogen, which exploded, blowing the lightweight roof off the building right in front of the world’s TV cameras. The Fukushima news just became much more dramatic. Authorities were desperate to show the world some positive action.

They progressively ordered the evacuation of 160,000 people living around the Fukushima neighbourhood. That was a mistake. As days and weeks passed, it emerged that not one single person had been killed by nuclear radiation. Not one single person had even been injured by nuclear radiation. Even today, a decade later, there is still no sign of any longer-term radiation harm to any person or animal. Sadly, however, people did die during the forced evacuation.

So one of the lessons learned from Fukushima is that a large nuclear power complex can be struck by the largest earthquake and tsunami ever recorded in Japan, and nobody is harmed by nuclear radiation.

Another lesson learned is that an evacuation order issued too hastily harmed and killed people.

World Nuclear Association Director-General Dr. Sama Bilbao y León said: “The rapidly implemented and protracted evacuation has resulted in well-documented significant negative social and health impacts. In total, the evacuation is thought to have been responsible for more than 2,000 premature deaths among the 160,000 who were evacuated. The rapid evacuation of the frail elderly, as well as those requiring hospital care, had a near-immediate toll.” [emphasis added]

She added: “When facing future scenarios concerning public health and safety, whatever the event, it is important that authorities take an all-hazards approach. There are risks involved in all human activities, not just nuclear power generation. Actions taken to mitigate a situation should not result in worse impacts than the original events. This is particularly important when managing the response to incidents at nuclear facilities – where fear of radiation may lead to an overly conservative assessment and a lack of perspective for relative risks.”

Thus, a decade later, we can contemplate the cumulative lessons learned. Above all, they are that nuclear power is far safer than anyone had thought. Even when dreaded core meltdowns occurred, and although reactors were wrecked, resulting in a financial disaster for the owners, no people were harmed by radiation.

We also learned that, for local residents, it would have been far safer to stay indoors in a house than to join the forced evacuation. We also learned that governments and authorities must listen to the nuclear professionals, and not overreact, even though the television news cameras look awfully close.

Fukushima certainly produced some valuable lessons. Governments, news media and the public need to learn the correct lessons from them.

Dr Kelvin Kemm is a nuclear physicist and is CEO of Stratek Business Strategy Consultants, a project management company based in Pretoria. He conducts business strategy development and project planning in a wide variety of fields for diverse clients.




Thursday, March 25, 2021

The Story About Offshore Oil Drilling Environmentalists DESPERATELY Don't Want You to Hear

“Louisiana officials say the state’s oil and gas industry is in danger. This comes after President Joe Biden cancelled a March oil lease sale in the Gulf of Mexico. Nearly 80 million acres of available leases would have been sold this week. The damage to Louisiana’s (and the nation’s) oil and gas companies started in January when President Biden signed an executive order banning all new oil and gas leases on public land and waters for 60 days," reports KLFY.

Now for the cheering:

“Cancelling this huge offshore Gulf oil auction helps protect our climate and life on Earth. President Biden understands the urgent need to keep this oil in the ground…This is a great step toward phasing out all offshore drilling and bringing environmental justice to the Gulf Coast and Alaska. We need to help restore coastal communities and marine life," the Center for Biological Diversity said in a statement.

And speaking of “marine life”: if bona-fide science has crowned “Global Warmists” with 10-foot dunce caps, then over half a century of scientific evidence has crowned anti-offshore-drilling activists with 50-foot dunce caps. Study after study has pounded home, with a vengeance, that offshore oil drilling, far from being an environmental disaster, is empirically an environmental bonanza. The science, you might say, is settled. To wit:

According to the Energy Information Administration, "Gulf of Mexico federal offshore oil production accounts for 17% of total U.S. crude oil production." Yet with over 3,000 of the 4,000-plus offshore oil production platforms in the Gulf of Mexico off its coast, Louisiana provides almost a third of North America’s commercial fisheries.

A study by LSU’s sea grant college showed that 70 percent of Louisiana’s offshore fishing trips target these structures. “Oil platforms as artificial reefs support fish densities 10 to 1000 times that of adjacent sand and mud bottom, and almost always exceed fish densities found at both adjacent artificial reefs of other types and natural hard bottom,” revealed a study by Dr. Bob Shipp, professor at the Marine Sciences department of the University of South Alabama in Mobile, Alabama. “Evidence indicates that massive areas of the northwestern Gulf of Mexico were essentially empty of Red Snapper stocks for the first hundred years of the fishery. Subsequently, areas in the western Gulf have become the major source of red snapper, concurrent with the appearance of thousands of petroleum platforms.”

In brief, “villainous” Big Oil produces marine life at rates that puts to shame “wondrous” Earth Goddess Gaia. “The fish Biomass around an offshore oil platform is ten times greater per unit area than for natural coral reefs,” also found Dr. Charles Wilson of LSU’s Department of Oceanography and Coastal Science (emphasis added). "Ten to thirty thousand adult fish live around an oil production platform in an area half the size of a football field.”

But you’re very conveniently “forgetting” the infamous BP oil spill! comes the retort from Environmentalist Whackos.

Glad you mentioned that. Because within a year of the infamous spill, the FDA’s Gulf Coast Seafood Laboratory, the National Oceanic and Atmospheric Administration’s National Seafood Inspection Laboratory, the Louisiana Department of Wildlife and Fisheries, the Louisiana Department of Health and Hospitals, and similar agencies from neighboring Gulf coast states had methodically and repeatedly tested Gulf seafood for cancer-causing polycyclic aromatic hydrocarbons.

“Not a single sample [for oil or dispersant] has come anywhere close to levels of concern,” reported Olivia Watkins, executive media advisor for the Louisiana Department of Wildlife and Fisheries.

“All of the samples have been 100-fold or even 1,000-fold below all of these levels,” reported Bob Dickey, director of the FDA’s Gulf Coast Seafood Laboratory. “Nothing ever came close to these levels.”

That this proliferation of seafood in the Gulf of Mexico came because of – rather than in spite of – the oil production rattled many environmental cages and provoked a legion of scoffers.

Amongst the scoffers were some Travel Channel producers, fashionably greenish in their views. They read these claims in a book by yours truly—"The Helldiver’s Rodeo”—that Publishers Weekly hailed as “highly-entertaining!” (Ted Nugent’s blurb certainly didn’t help against their scoffing!)

The book describes an undersea panorama that (if true) could make an interesting show for the network, they concluded, while still scoffing. They scoffed as we rode in from the airport. They scoffed over raw oysters, grilled redfish and seafood gumbo that night. More scoffing through the Hurricanes at Pat O’Brien’s. They scoffed even while suiting up in dive gear and checking the cameras as we tied up to an oil platform 20 miles in the Gulf off the southeast Louisiana coast.

But they came out of the water bug-eyed and indeed produced and broadcast a Travel Channel program showcasing a panorama that turned on its head every environmental superstition against offshore oil drilling. Schools of fish filled the water column from top to bottom – from 6-inch blennies to 12-foot sharks. Fish by the thousands. Fish by the ton.

The cameras were going crazy. Do I focus on the shoals of barracuda? Or that cloud of jacks? On the immense schools of snapper below, or on the fleet of tarpon above? How ’bout this – whoa – hammerhead! We had some close-ups, too, of coral and sponges, the very things disappearing off Florida’s pampered reefs—a state that bans offshore oil drilling. Off Louisiana, they sprout in colorful profusion from the huge steel beams —acres of them. You’d never guess this was part of that unsightly structure above. The panorama of marine life around an offshore oil platform staggers anyone who puts on goggles and takes a peek, even (especially!) the most worldly scuba divers. Here’s a video peek at this seafood bonanza.


Toyota Warns (Again) About Electrifying All Autos. Is Anyone Listening?

Depending on how and when you count, Japan’s Toyota is the world’s largest automaker. According to Wheels, Toyota and Volkswagen vie for the title, each taking the crown from the other as the market moves. That is despite Volkswagen’s inherent advantage of sporting 12 brands to Toyota’s four; Audi, Lamborghini, Porsche, Bugatti, and Bentley are all part of the Volkswagen brand family.

GM, America’s largest automaker, is about half Toyota’s size thanks to its 2009 bankruptcy and restructuring. Toyota is actually a major car manufacturer in the United States; in 2016 it made about 81% of the cars it sold in the U.S. right here in its nearly half a dozen American plants. If you’re driving a Tundra, RAV4, Camry, or Corolla it was probably American-made in a red state. Toyota was among the first to introduce gas-electric hybrid cars into the market, with the Prius twenty years ago. It hasn’t been afraid to change the car game.

All of this is to point out that Toyota understands both the car market and the infrastructure that supports it perhaps better than any other manufacturer on the planet. It hasn’t grown its footprint through acquisitions, as Volkswagen has, and it hasn’t undergone bankruptcy and bailout as GM has. Toyota has grown by building reliable cars for decades.

When Toyota offers an opinion on the car market, it’s probably worth listening to. This week, Toyota reiterated an opinion it has offered before. That opinion is straightforward: The world is not yet ready to support a fully electric auto fleet.

Toyota’s head of energy and environmental research Robert Wimmer testified before the Senate this week, and said: “If we are to make dramatic progress in electrification, it will require overcoming tremendous challenges, including refueling infrastructure, battery availability, consumer acceptance, and affordability.”

Wimmer’s remarks come on the heels of GM’s announcement that it will phase out all gas-powered internal combustion engine (ICE) vehicles by 2035. Other manufacturers, including Mini, have followed suit with similar announcements.

Tellingly, both Toyota and Honda have so far declined to make any such promises. Honda is the world’s largest engine manufacturer when you count the boat, motorcycle, lawnmower, and other engines it makes outside the auto market. In those markets Honda competes with Briggs & Stratton, and faces the same push toward electrification of lawnmowers, weed trimmers, and the like.

Wimmer noted that while manufacturers have announced ambitious goals, just 2% of the world’s cars are electric at this point. For price, range, infrastructure, affordability, and other reasons, buyers continue to choose ICE over electric, and that is even though electric vehicles are often subsidized with tax breaks to bring price tags down.

The scale of the switch hasn’t even been introduced into the conversation in any systematic way yet. According to FinancesOnline, there are 289.5 million cars just on U.S. roads as of 2021. About 98 percent of them are gas-powered. Toyota’s RAV4 took the top spot for purchases in the U.S. market in 2019, with Honda’s CR-V in second. GM’s top seller, the Chevy Equinox, comes in at #4 behind the Nissan Rogue. This is in the U.S. market, mind. GM only has one entry in the top 15 in the U.S. Toyota and Honda dominate, with a handful each in the top 15.

Toyota warns that the grid and infrastructure simply aren’t there to support the electrification of the private car fleet. A 2017 U.S. government study found that we would need about 8,500 strategically-placed charge stations to support a fleet of just 7 million electric cars. That’s about six times the current number of electric cars but no one is talking about supporting just 7 million cars. We should be talking about powering about 300 million within the next 20 years, if all manufacturers follow GM and stop making ICE cars.
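The scale implied by that study can be made concrete with back-of-the-envelope arithmetic. This is a naive linear extrapolation from the figures quoted above; it ignores siting, grid capacity, and usage patterns, and is only a rough scale check.

```python
# Naive linear extrapolation from the 2017 study's figure quoted above:
# ~8,500 charge stations supporting a fleet of 7 million electric cars.
stations_per_car = 8_500 / 7_000_000
fleet = 300_000_000                  # rough size of a fully electrified U.S. fleet
stations_needed = stations_per_car * fleet
print(round(stations_needed))        # -> 364286, i.e. over 40x the study's figure
```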

Simply put, we’re gonna need a bigger energy boat to deal with connecting all those cars to the power grids. A LOT bigger.

But instead of building a bigger boat, we may be shrinking the boat we have now. The power outages in California and Texas — the largest U.S. states by population and by car ownership — exposed issues with powering needs even at current usage levels. Increasing usage of wind and solar, neither of which can be throttled to meet demand, and both of which prove unreliable in crisis, has driven some coal and natural gas generators offline. Wind simply runs counter to needs — it generates too much power when we tend not to need it, and generates too little when we need more. The storage capacity to account for this doesn’t exist yet.

We will need much more generation capacity to power about 300 million cars if we’re all going to be forced to drive electric cars. Whether we’re charging them at home or on the road, we will be charging them frequently. Every gas station you see on the roadside today will have to be wired to charge electric cars, and charge speeds will have to be greatly increased. Current technology enables charges in “as little as 30 minutes,” according to Kelley Blue Book. That best-case-scenario fast charging cannot be done on home power; it uses direct current and specialized systems. Charging at home on alternating current can take from a few hours to overnight to fill the battery, and will increase the home power bill.

That power, like all electricity in the United States, comes from generators using natural gas, petroleum, coal, nuclear, wind, solar, or hydroelectric power according to the U.S. Energy Information Administration. I left out biomass because, despite Austin, Texas’ experiment with purchasing a biomass plant to help power the city, biomass is proving to be irrelevant in the grand energy scheme thus far. Austin didn’t even turn on its biomass plant during the recent freeze.

Half an hour is an unacceptably long time to spend at an electron pump. It’s about 5 to 10 times longer than a current trip to the gas pump tends to take when pumps can push 4 to 5 gallons into your tank per minute. That’s for consumer cars, not big rigs that have much larger tanks. Imagine the lines that would form at the pump, every day, all the time, if a single charge time isn’t reduced by 70 to 80 percent. We can expect improvements, but those won’t come without cost. Nothing does. There is no free lunch. Electrifying the auto fleet will require a massive overhaul of the power grid and an enormous increase in power generation. Elon Musk recently said we might need double the amount of power we’re currently generating if we go electric. He’s not saying this from a position of opposing electric cars. His Tesla dominates that market and he presumably wants to sell even more of them.
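The queueing concern can be put in rough numbers using the figures quoted above. The 12-gallon fill is an illustrative assumption for a typical consumer car:

```python
# Rough throughput comparison per refueling point, using the figures
# quoted above. The 12-gallon fill is an illustrative assumption.
gas_fill_minutes = 12 / 4.5               # ~2.7 minutes at 4.5 gallons/minute
fast_charge_minutes = 30                  # "as little as 30 minutes"
ratio = fast_charge_minutes / gas_fill_minutes
print(round(ratio, 2))                    # -> 11.25
```

On these assumptions, one gas pump moves as many cars per hour as about eleven best-case fast chargers, which is roughly the 70 to 80 percent charge-time reduction the text says would be needed to avoid lines.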

Toyota has publicly warned about this twice, while its smaller rival GM is pushing to go electric. GM may be virtue signaling to win favor with those in power in California and Washington and in the media. Toyota’s addressing reality and its record is evidence that it deserves to be heard.

Toyota isn’t saying none of this can be done, by the way. It’s just saying that so far, the conversation isn’t anywhere near serious enough to get things done.


Nations Aren’t Acting as If Climate Change Poses Existential Crisis

Political leaders and media personalities are fond of saying climate change poses an existential threat to humans and the planet. The weight of scientific evidence doesn’t support this oft-made claim.

Despite what is reported almost daily by the mainstream media, data from the United Nations Intergovernmental Panel on Climate Change (IPCC) and the U.S. National Oceanic and Atmospheric Administration (NOAA) show no increase in extreme weather events as the earth has modestly warmed over the past 150 years. Indeed, IPCC and NOAA data show the number of extreme cold spells, drought, floods, heatwaves, hurricanes, tornadoes, and wildfires have all either declined modestly or remained relatively stable since the late 1870s.

Despite these irrefutable facts, leaders from nations around the world have signed multiple international agreements, the latest being the 2015 Paris climate agreement, intended to avert a supposed pending climate disaster.

However, their actions do not match their words. The U.N. recently reported that the same political leaders who publicly signed the Paris Climate Agreement, committing their countries to restrict emissions, have enacted domestic policies that actually increase emissions.

As of February 26, the U.N. says only 75 of the more than 190 countries that have ratified the Paris Climate Agreement have tendered firm commitments and detailed plans to cut emissions, despite having committed to deliver those plans by 2020. Adding insult to injury, the U.N. says “the level of ambition communicated through these NDCs indicates that changes in these countries’ total emissions would be small, less than -1%, in 2030 compared to 2010 … [whereas the] IPCC, by contrast, has indicated that emission reduction ranges to meet the 1.5°C temperature goal should be around -45% in 2030 compared to 2010.”

Whether it is large emitters or small, the reality is nations are putting poverty reduction and economic growth (rightly in my opinion) ahead of climate action.

Let’s look at a few examples.

India is the world’s third-largest greenhouse gas emitter. Under the Paris Agreement, India did not pledge to cut its emissions, rather it said it would reduce emissions intensity (emissions as a percentage of GDP). The problem with this is, even if India exceeds its carbon intensity reduction goals, its total emissions will still have increased substantially. As a result, the U.N. observes, “with current energy targets and policies, emissions are projected to keep increasing (by 24-25 percent above 2019 levels in 2030) and show no signs of peaking, in particular due to the lack of a policy to transition away from coal. Such an increase of emissions is not consistent with the Paris Agreement.”
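The intensity-versus-absolute distinction is simple arithmetic: if GDP grows faster than emissions intensity falls, total emissions still rise. A short illustration with made-up numbers (these are assumptions for the sake of the arithmetic, not India’s actual figures):

```python
# Why falling emissions *intensity* can coexist with rising total emissions.
# The numbers below are illustrative assumptions, not India's actual figures.
gdp_growth_factor = 2.0        # the economy doubles over the period
intensity_reduction = 0.35     # emissions per unit of GDP fall by 35%
emissions_factor = gdp_growth_factor * (1 - intensity_reduction)
print(round(emissions_factor, 2))   # -> 1.3: total emissions still rise 30%
```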

Indeed, 70 percent of India’s electric power is generated by burning coal. And India’s most recent estimate is that by 2030 coal use for energy will increase by 40 percent.

The news is even worse out of China, the world’s biggest emitter. China, which is responsible for approximately 25 percent of the world’s emissions, vaguely indicated it expected carbon dioxide emissions to peak by 2030. Peak at what level?

The Chinese Communist Party recently released its five-year plan for economic development and it contained no reduction in coal use. It would be surprising if it did. In recent years, China has built dozens of new coal-fueled power plants, with hundreds more in various stages of construction, development, and planning. China intends to build coal-fueled power plants in Africa, throughout Asia, and the Middle East.

Simultaneously, China is disincentivizing new construction of wind and solar facilities, which the National Energy Administration (NEA) referred to as “unreliables.”

Even Argentina, a relatively small emitter, will have trouble squaring its development goals with its Paris climate commitment. At a recent conference, President Alberto Fernandez said Argentina’s goal was to reach net-zero greenhouse gas emissions by 2050. Meanwhile, back home, far from the limelight of international climate conferences, Fernandez announced the government was doubling down on fossil fuels, saying, “Today we are relaunching the oil and gas economy,” starting with $5 billion in government subsidies, to develop its shale fields.

Political elites don’t really fear a climate apocalypse is in the offing. Rather, they are using the threat of the “climate change” hobgoblin to accrue ever more power and control over peoples’ lives. For politicians, this is what the climate scare is and always has been about.

H.L. Mencken once famously quipped, “The urge to save humanity is almost always a false front for the urge to rule.” Nowhere is this truer than in the push to save the world from climate change.


What Australians really think about climate change

Sampling, sampling, sampling. The revelation that only one in seven Australians treated climate change as their decisive voting issue is very encouraging, but ALL the figures below have to be taken with a large grain of salt.

The "sample" was derived from an online panel study and the biases of online studies are well-known, to say nothing of the inaccuracies in panel studies. Online samples tend to skew Left. So even the 7% is probably an overstimate

The journal article is "Australian voters’ attitudes to climate action and their social-political determinants," in PLOS ONE.

Just one in seven Australians considered climate change their decisive issue when voting in the 2019 federal election.

But some 80 per cent say action to reduce Australia’s greenhouse gas emissions is important, including almost 70 per cent of Coalition voters.

They are two key findings from new ANU research published today in the journal PLOS ONE, based on online and telephone surveys with more than 2000 Australian voters after the 2019 poll.

In the paper, researchers Dr Rebecca Colvin and Professor Frank Jotzo looked at some of the reasons why, in the “climate election,” the party that was offering the more “status quo” emissions policy was returned to government.

They found 52 per cent of survey respondents said climate change was a factor in how they voted in 2019, but it was the single biggest issue for just 13 per cent of voters – or slightly more than one in seven people.

Asked whether this finding could be a source of hope or despair for supporters of climate action, Dr Colvin said both interpretations were possible.

“One way to look at it is that there isn’t a massive unbridgeable divide across the political spectrum on climate change,” she told News Corp. “There are lots of people who say they want to see action on climate change but they’re not determining their votes on it – but that broad base of social support is there.”