Wednesday, February 28, 2018

Unusual heat over the Arctic

It is well known that there is extensive and vigorous subsurface volcanism in the Arctic, particularly around the Gakkel and Lomonosov ridges, but that is never called on as an explanation of surface warming.  WHY NOT?  Arctic warming is irregular, just as volcanic activity is, and the Arctic warms by much larger amounts than the rest of the globe.  It is completely out of sync with global warming, which hardly exists.

Climate scientists are used to seeing the range of weather extremes stretched by global warming but few episodes appear as remarkable as this week's unusual heat over the Arctic.
Zack Labe, a researcher at the University of California at Irvine, said average daily temperatures above the northern latitude of 80 degrees have broken away from anything previously recorded in the past 60 years.

"To have zero degrees at the North Pole in February - it's just wrong," said Amelie Meyer, a researcher of ice-ocean interactions with the Norwegian Polar Institute. "It's quite worrying."
The so-called Polar Vortex - a zone of persistent low pressure that typically keeps high-latitude cold air separate from regions further south - has been weakening for decades.

In this instance, "a massive jet of warm air" is penetrating north, sending a cold burst southwards, said Dr Meyer, who has relocated to Hobart to do research on the southern hemisphere and is hosted by the Institute for Marine and Antarctic Studies.

"The anomalies are really extreme," Andrew King, a lecturer in climate science at the University of Melbourne, said. "It's a very interesting event."

Warm, moist air is penetrating much further north than it would normally at a time when the North Pole is in complete darkness.
Cape Morris Jesup, the world's most northerly land-based weather station, in Greenland, touched 6 degrees late on Saturday, about 35 degrees above normal for this time of year.

Robert Rohde, a Zurich-based scientist with Berkeley Earth, posted on Twitter that Cape Morris Jesup had already recorded 61 hours above freezing so far in 2018.

The previous record of such relative warmth was just 16 hours, recorded to the end of April in 2011. "Parts of Greenland are quite a bit warmer than most of Europe," Dr King said.

The cold snap will push temperatures moderately below freezing in London each day until Friday. However, cities such as Berlin will dive to as low as minus 12 degrees and Moscow to minus 24.

With a weak jetstream, surface winds are taking an unusual course - bringing snow from the east and prompting some commentators to dub the event the "Beast from the East".

"For Britain and Ireland, most weather systems would typically blow in from the west, but [on Tuesday] we will see a cold front cross Britain from the east," Dr King said.

 There is open water north of #Greenland where the thickest sea ice of the #Arctic used to be. It is not refreezing quickly because air temperatures are above zero confirmed by @dmidk's weather station #KapMorrisJesup. Wacky weather continues with scary strength and persistence.

Along with the unusual warmth over the Arctic, scientists are monitoring the retreat of sea ice in the Bering Sea.

The ice coverage in the region is now at levels previously seen only in May or June, Mr Labe posted on Twitter, citing data from the US National Snow and Ice Data Center.

While climate change itself is likely to have only exacerbated regional weather variability, the long-term shrinkage of sea ice has a reinforcing effect on global warming in a region already warming faster than anywhere else on the planet, Dr King said.
Ice reflects sunlight back to space. When it melts, the sea ice exposes more of the dark ocean beneath, which then absorbs that solar radiation, adding to the warming.

Sea ice coverage is currently at or close to record low levels at both the Arctic and Antarctic regions.

The impact of the relatively warm air in the Arctic could play out for months to come. Multi-year ice is likely to be thinner and more cracked, leading to a faster melt when spring arrives, Dr Meyer said.

While researchers had pegged 2050 as a possible year when the Arctic would become ice-free, this winter and the previous one - also unusually warm - had thrown those estimates out.
"It's going much faster than we thought," said Dr Meyer, who will begin work later this year at the ARC Centre of Excellence for Climate Extremes.


Our next energy and security crisis?

Importing 65% of US oil in 2005 vs 100% of many key minerals now (from China and Russia)

Paul Driessen

Oil and natural gas aren’t just fuels. They supply building blocks for pharmaceuticals; plastics in vehicle bodies, athletic helmets, and numerous other products; and complex composites in solar panels and wind turbine blades and nacelles. The USA was importing 65% of its petroleum in 2005, creating serious national security concerns. But fracking helped cut imports to 40% and the US now exports oil and gas.

Today’s vital raw materials foundation also includes exotic minerals like gallium, germanium, rare earth elements and platinum group metals. For the USA, they are “critical” because they are required in thousands of applications; they become “strategic” when we don’t produce them in the United States.

They are essential for computers, medical imaging and diagnostic devices, night vision goggles, GPS and communication systems, television display panels, smart phones, jet engines, light-emitting diodes, refinery catalysts and catalytic converters, wind turbines, solar panels, long-life batteries and countless other applications. In 1954, the USA imported 100% of just eight vital minerals; in 1984, only eleven.

Today, in this technology-dominated world, the United States imports up to 100% of 35 far more critical materials. Twenty of them come 100% from China, others from Russia, and others indirectly from places where child labor, worker safety, human rights and environmental standards are nonexistent.

The situation is untenable and unsustainable. Literally every sector of the US economy, the nation’s defense, its energy and employment base, its living standards – all are dependent on sources, supply chains and transportation routes that are vulnerable to disruption under multiple scenarios.

Recognizing this, President Trump recently issued an executive order stating that federal policies would henceforth focus on reducing these vulnerabilities, in part by requiring that government agencies coordinate in publishing an updated analysis of critical nonfuel minerals; ensuring that the private sector have electronic access to up-to-date information on potential US and other alternative sources; and finding safe and environmentally sound ways to find, mine, reprocess and recycle critical minerals – emphasizing sources that are less likely to come from unfriendly nations, less likely to face disruption.

The order also requires that agencies prepare a detailed report on long-term strategies for reducing US reliance on critical minerals, assessing recycling and reprocessing progress, creating accessible maps of potentially mineralized areas, supporting private sector mineral exploration, and streamlining regulatory and permitting processes for finding, producing and processing domestic sources of these minerals.

Incredibly, the last report on critical minerals and availability issues was written in 1973, the year the first mobile telephone call was made. That inexcusable 45 years of neglect by multiple administrations and congresses dates back to the era of “revolutionary” Selectric typewriters and includes the appearance of desktop computers in 1975 and the first PC in 1981. (That PC had a whopping 16 KB of memory!)

As former geologist, Navy SEAL and military commander – and now Secretary of the Interior – Ryan Zinke has observed, allowing our nation to become so heavily “reliant on foreign nations, including our competitors and adversaries,” for so many strategic minerals “is deeply troubling.”

It’s actually far worse than “troubling” or “neglectful.” It involved a concerted, irresponsible, ill-considered effort to place hundreds of millions of acres in wilderness, wilderness study and other highly restrictive land use categories – often with the very deliberate intention of making their mineral prospects off limits, before anyone could assess the areas’ critical, strategic and other mineral potential.

The 1964 Wilderness Act had contemplated the preservation of a few million or tens of millions of acres of wild and primitive areas and natural habitats. To ensure informed land use decisions and access to vital mineral resources, Congress included “special provisions” that allowed prospecting and other activities in potential and designated wilderness areas – and required surveys by the US Geological Survey “on a planned, recurring basis,” to gather information about mineral or other resources – if such activities are carried out “in a manner compatible with the preservation of the wilderness environment.”

In 1978, while hiking with him, I asked then Assistant Secretary of Agriculture Rupert Cutler how he could defend ignoring this clear statutory language and prohibiting all prospecting, surveys and other assessment work in wilderness and study areas. “I don’t think Congress should have enacted those provisions,” he replied, “so I’m not going to follow them.”

As of 1994, when geologist Courtland Lee and I prepared a detailed analysis, areas equal to Arizona, Colorado, Montana, New Mexico, Utah and Wyoming combined (427 million acres) were off limits to mineral exploration and development. The situation is far worse today – and because of processes unleashed by plate tectonic, volcanic and other geologic forces, these mountain, desert and other lands contain some of the most highly mineralized rock formations in North America, or even the entire world.

The deck was stacked: for wilderness, and against minerals and national security. This must not continue.

These areas must be surveyed and explored by government agencies and private sector companies. The needs of current and future generations are at stake. Failure to conduct systematic evaluations violates the most fundamental principles of national defense, national security and responsible government.

The Departments of Agriculture and the Interior should follow the special provisions of the Wilderness Act; abolish, modify or grant exceptions to existing motorized access restrictions; and ensure that areas are evaluated using airborne magnetic and other analytical equipment, assay gear carried in backpacks, truck-mounted and helicopter-borne drilling and coring rigs, and other sophisticated modern technologies.

This approach also complies with environmental and sustainability principles. It ensures that we can get vital strategic minerals from world class deposits on small tracts of land, instead of having to mine and process vast quantities of low quality ores. That protects most of our wild, scenic and wildlife areas – and modern techniques can then restore affected areas to natural conditions and high quality habitats.

Even ardent environmentalists should support this, because the renewable energy, high-tech future they want and promise depends on these minerals. For example, generating all US electricity (3.5 billion megawatt hours per year) from wind would require some 14 million 1.8 MW turbines, requiring some 8 billion tons of steel alloys and concrete, 2 million tons of neodymium, other rare earths, and vast amounts of cobalt, molybdenum and other minerals. Substituting photovoltaic solar panels for turbines would require arsenic, boron, cadmium, gallium, indium, molybdenum, selenium, silver, tellurium and titanium.

Backing up that electricity for seven windless or sunless days would require 700 million 100 kWh Tesla battery packs – and thus millions of tons of lithium, cobalt, manganese, nickel and cadmium.
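The battery-pack figure is back-of-the-envelope arithmetic that can be checked directly; a minimal sketch, assuming the article's inputs (3.5 billion MWh of annual US generation, seven days of backup, and 100 kWh per Tesla pack):

```python
# Back-of-the-envelope check of the seven-day battery-backup figure.
# Inputs are the article's round numbers, not independent data.

annual_generation_mwh = 3.5e9        # MWh generated per year (article's figure)
hours_per_year = 365 * 24            # 8,760 hours

# Average hourly demand if generation is spread evenly over the year
avg_demand_mwh_per_hour = annual_generation_mwh / hours_per_year

backup_hours = 7 * 24                # seven windless or sunless days
backup_energy_kwh = avg_demand_mwh_per_hour * backup_hours * 1000  # MWh -> kWh

packs_needed = backup_energy_kwh / 100   # assuming 100 kWh per pack

print(f"Average demand: {avg_demand_mwh_per_hour:,.0f} MWh per hour")
print(f"Packs needed:   {packs_needed / 1e6:,.0f} million")
```

Averaging annual generation over 8,760 hours gives roughly 400,000 MWh per hour; multiplying by 168 backup hours and dividing by 100 kWh per pack yields about 670 million packs, consistent with the article's rounded 700 million.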

Every generation of renewable energy, computer, communication and other high-tech equipment requires new materials in new quantities – and thus renewed exploration, mining and processing.

The United States is the only country that locks up its strategic mineral resources. No sane, responsible nation risks or forecloses its energy, technology, economic, employment, defense and sustainable future. So it will be fascinating to see which legislators, judges and pressure groups vilify the activities proposed in the Trump executive order, government minerals report and this article.

Those that try to block progress in these areas should be named and shamed (along with their financial supporters) – and their actions made key issues in election campaigns and social responsibility discussions. Perhaps they should be the first to get shut off from electricity, cars, computers, cell phones, medical care, social media and other modern benefits that depend on petroleum and critical minerals.

Let the Interior Department know your views on these vital issues. And maybe take a page from the Cutler-illegal immigrants playbook: Become a sanctuary county or state, simply ignore troublesome laws, regulations and court dictates – and just initiate your own exploration and mining programs.

Via email

The Weaponization of the EPA Is Over: An Exclusive Interview With Scott Pruitt

In his first year as administrator of the Environmental Protection Agency, Scott Pruitt has already transformed the agency in many ways. He spoke exclusively to The Daily Signal before addressing attendees at the Conservative Political Action Conference’s annual Reagan Dinner. An edited transcript of the interview is below.

Rob Bluey: You gave a speech at CPAC last year where you were just at the beginning of your tenure at the Environmental Protection Agency, and you outlined some of the things that you wanted to do. Here we are a year later, you’ve repealed, taken back, 22 regulations at a savings at $1 billion, a significant contribution to the U.S. economy, as President Donald Trump talked about in his speech. What does that mean?

Scott Pruitt: Busy year. And it was great to be at CPAC about two weeks after having been sworn in last year. I talked last year about “the future ain’t what it used to be,” that Yogi Berra quote I cited, about the change that was gonna take place at the agency, and I think we’ve been about that change the last year. Focusing on rule of law, restoring process and order, making sure that we engage in cooperative federalism as we regulate.

But the key to me is that weaponization of the agency that took place in the Obama administration, where the agency was used to pick winners and losers. Those days are over.


You know, to be in Pennsylvania as I was early in my term, shortly after the CPAC speech last year, and to spend time with miners in Pennsylvania underground. I was a thousand feet underground and 3 miles in, the first time that an administrator in history had done that, and I talked to those longwall miners in Pennsylvania and delivered the message from the president that the war on coal is over. That was a tremendous message for them; I saw the emotion on their faces.

Can you imagine, in the first instance, an agency of the federal government, a department of the U.S. government, declaring war on a sector of your economy? Where is that in the statute? Where does that authority exist? It doesn’t. And so to restore process and restore commitment to doing things the right way, I think we’ve seen tremendous success this past year.

Bluey: President Trump cited a number of examples that have come out of EPA in his speech to the CPAC attendees, and one of them was coal, another one was the Paris climate treaty. Talk about those two issues and your work with the president in terms of why you decided to take those actions in conjunction with him?

Pruitt: The president’s decision to exit the Paris accord—tremendously courageous. When you look at that decision, it put America first, which is what the president said in the Rose Garden in June.

What was decided in Paris under the past administration was not about carbon reduction. It was about penalties to our own economy because China and India, under that accord, didn’t have to take any steps to reduce CO2 until the year 2030. So, if it’s really about CO2 reduction, why do you let that happen?

“That weaponization of the agency that took place in the Obama administration—where the agency was used to pick winners and losers—those days are over.”

When you look at who’s led the world in CO2 reduction, it’s us. From the year 2000 to 2014, we reduced our CO2 footprint almost 20 percent through innovation and technology. So, we have nothing to be apologetic about as a country, and yet, the past administration went to Paris, hat in hand, and said, “Penalize our economy,” which is what happened with the Clean Power Plan.

The president saying no to that and putting America first was the tremendously courageous and right thing to do. I’m very excited about that decision. I know he talked about that in his speech and it was a wonderful decision he made, and I think great for the American people.

Overall, this regulatory reform agenda—this regulatory certainty that we’re about—is achieving good things for the environment, but it’s also achieving, as you say, good things for our economy. We can do both. And I think that’s what’s key.

President Donald Trump listens to EPA Administrator Scott Pruitt after announcing his decision that the United States will withdraw from the Paris climate agreement. (Photo: Kevin Lamarque/Reuters/Newscom)
Bluey: President Trump certainly cited deregulation as just as significant, I believe he said, as the tax cuts. We’ve seen some of the benefits for many American businesses, and certainly American workers as a result of that.

Pruitt: When you think about an EPA—armed, weaponized, if you will—with a rule like WOTUS, the Waters of the United States rule, that would take a puddle and turn it into a lake. It took land use decisions away from farmers and ranchers and landowners across this country, and people think it was just farming and ranching. It was the building of subdivisions. It was really all land use decisions.

I was in Utah last year meeting with some folks there that were building a subdivision, and there was an Army Corps of Engineers representative that was standing outside the subdivision with me, and he pointed to an ephemeral drainage ditch and he said, “Scott, that’s a water of the United States.” And I said, “Well, it’s not gonna be anymore.”

That’s exactly the kind of attitude that drove the past administration. It was all about power. It wasn’t about outcomes necessarily. It was about power and picking winners and losers, and we’re getting that corrected.

Bluey: That’s one thing I want to talk to you about because right now your agency is going across the country. You’re having hearings on the Clean Power Plan. You’re trying to get input from Americans, and not just Americans in Washington, D.C., and the Beltway, but places like Wyoming and Missouri and West Virginia. Why is that important to get out and hear from Americans about how government affects their lives?

Pruitt: Couple things: One, we’ve been to 30-plus states. And as we’ve met with stakeholders, farmers and ranchers, and those in the utility sector and the energy sector, landowners, representatives from the state’s governors, and DEQs from across the country, I think what we didn’t recognize over the last several years with the past administration is that those folks are partners. They care about outcomes.

“We shouldn’t start from the premise that those folks are adversaries or don’t care about clean air or clean water. We should start from the premise that they do, and work with them to achieve good outcomes.”

Think about those farmers and those ranchers. They’re our first conservationists. They’re our first environmentalists. I think of the young man, David, in Florida that I met about a month ago, 12 years old. I was speaking to a group of individuals in Florida. David was there with his dad, and his granddad was there. Now, think about what their greatest asset is: their land. And they’re teaching David how to cultivate and harvest and care for that land and act as a steward.

That’s the message we’re sending across the country.


Why are government scientists manipulating data on behalf of the Church of Environmental Radicalism?

In the 1970s it was called “Global Cooling.” When that didn’t happen, it was switched to “Global Warming.” After another failure, the Church of Environmentalism finally came up with a new phrase that was sure to catch all: “Climate Change.” This new phrase could not possibly be wrong, because if anything changes, it must be Climate Change. Now the church has gone even further than changing a name; it has resorted to changing the data to fit its narrative.

For anyone who has taken a high school science class, manipulating data to fit a hypothesis is not considered science. But that is where we find ourselves, and one of the U.S. government’s scientific organizations is in the crosshairs again. It seems the National Oceanic and Atmospheric Administration (NOAA) has once more been caught manipulating weather data to fit a narrative.

Paul Homewood was reviewing the NOAA data for the recent cold spell experienced in the Northeast. You may remember it was bitterly cold this past January with tales of animals freezing and falling from trees and sharks freezing in the ocean. But when Homewood looked at the data from NOAA, it didn’t seem to match what was observed. When Homewood got to the raw temperature data, he found it had been manipulated.

Homewood stated, “So at the three sites of Ithaca, Auburn and Geneva, we find that January 2018 was colder than January 1943 by 1.0, 1.7 and 1.3F respectively.” He continued, “Yet NOAA say that the division was 2.1F warmer last month. NOAA’s figure makes last month at least 3.1F warmer in comparison with 1943 than the actual station data warrants.”

Upon further investigation, Homewood found more data manipulation in the January 2014 figures. Homewood remarked, “on average the mean temperatures in Jan 2014 were 2.7F less than in 1943. Yet, according to NOAA, the difference was only 0.9F…Somehow, NOAA has adjusted past temperatures down, relatively, by 1.8F.”
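Both of Homewood's discrepancy figures are simple subtraction; a short sketch of the arithmetic, using only the numbers quoted above:

```python
# Reproducing Homewood's discrepancy figures from the quoted numbers.

# January 2018: station data show 2018 COLDER than 1943 (degrees F).
station_cooling_2018 = {"Ithaca": 1.0, "Auburn": 1.7, "Geneva": 1.3}
noaa_warming_2018 = 2.1  # NOAA: the division was 2.1F WARMER than Jan 1943

# Using the smallest station difference gives the most conservative gap,
# hence the "at least" in Homewood's statement:
gap_2018 = noaa_warming_2018 + min(station_cooling_2018.values())
print(f"Jan 2018: NOAA at least {gap_2018:.1f}F warmer than station data warrant")

# January 2014: stations averaged 2.7F colder than 1943; NOAA shows only 0.9F.
station_cooling_2014 = 2.7
noaa_cooling_2014 = 0.9
implied_adjustment = station_cooling_2014 - noaa_cooling_2014
print(f"Jan 2014: implied relative adjustment of {implied_adjustment:.1f}F")
```

The 3.1F figure is NOAA's claimed 2.1F warming plus the smallest station cooling (1.0F at Ithaca); the 1.8F figure is the gap between the stations' 2.7F and NOAA's 0.9F.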

This is not the first time NOAA has manipulated data to prove a hypothesis. In 2015, NOAA published the Karl study that reportedly showed there was no “climate change hiatus” between 1998 and 2013. During this time frame, the rate of global temperature growth slowed, throwing a wrench in every climate model. The Karl study adjusted the data to show the warming had not decreased.

John Bates, a retired NOAA climate scientist, blew the whistle on the study, accusing NOAA of “flagrant manipulation of scientific integrity guidelines.” He went on to hint the study was rushed to publication so it could have an impact on the 2015 Paris climate talks. You may remember the Paris Climate Agreement is an international agreement that does nothing for the environment. However, it does put a stranglehold on the U.S. economy, because the U.S. government was the only government likely to enforce the harsh regulations against its citizens.

The situation has gotten so bad Congress has gotten involved. For over two years the House Committee on Science, Space, and Technology has been fighting with NOAA to get to the bottom of the data manipulation. NOAA has decided it is not going to cooperate with Congress and has fought oversight through the entire process. If they did nothing wrong and are proud of their work, what are they hiding?

As people sit back and try to figure out why data manipulation matters to them, they must realize policymakers and government bureaucrats are making decisions based on the manipulated data. When policy is enacted based on biased data, grocery and fuel bills go up, electric and heating bills go up, and people are put out of work.

It is not enough to be wrong about almost every prediction since the 1970s. The Church of Environmentalism has taken to flat-out lying to reach its goals. Congress must continue to investigate NOAA and force the truth to come out. Scientists who manipulate data to fit a narrative are not scientists; they are committing fraud.


Polar bears are flourishing, making them phony icons, and false idols, for global warming alarmists

One powerful polar bear fact is slowly rising above the message of looming catastrophe repeated endlessly by the media: More than 15,000 polar bears have not disappeared since 2005. Although the extent of the summer sea ice after 2006 dropped abruptly to levels not expected until 2050, the predicted 67-per-cent decline in polar bear numbers simply didn’t happen. Rather, global polar bear numbers have been stable or slightly improved. The polar bear’s resilience should have meant the end of its use as a cherished icon of global warming doom, but it didn’t. The alarmism is not going away without a struggle.

Part of this struggle involves a scientific clash about transparency in polar bear science. My close examination of recent research has revealed that serious inconsistencies exist within the polar bear literature and between that literature and public statements made by some researchers. For example, Canadian polar bear biologist Ian Stirling learned in the 1970s that spring sea ice in the southern Beaufort Sea periodically gets so thick that seals depart, depriving local polar bears of their prey and causing their numbers to plummet. But that fact, documented in more than a dozen scientific papers, is not discussed today as part of polar bear ecology. In these days of politicized science, neither Stirling nor his colleagues mention in public the devastating effects of thick spring ice in the Beaufort Sea; instead, they imply in recent papers that the starving bears they witnessed are victims of reduced summer sea ice, which they argued depleted the bears’ prey. There are also strong indications that thick spring-ice conditions happened again in 2014–16, with the impacts on polar bears being similarly portrayed as effects of global warming.

The polar bear's resilience should have meant the end of its use as an icon of global warming doom

One reason that the 2007 predictions of future polar bear survival were so far off base is that the model developed by American biologist Steven Amstrup (now at Polar Bears International, an NGO) assumed any polar bear population decline would be caused by less summer ice, despite the Beaufort Sea experience. Moreover, Amstrup and fellow modelers were overly confident in their claim that summer ice was critical for the polar bear’s survival and they had little data on which to base their assumption that less summer ice would devastate the polar bears’ prey.

Consequently, many scientists were surprised when other researchers subsequently found that ringed and bearded seals (the primary prey of polar bears) north of the Bering Strait especially thrived with a longer open-water season, which is particularly conducive to fishing: These seals do most of their feeding in summer. More food for seals in summer means more fat seal pups for polar bears to eat the following spring, a result that’s probably true throughout the Arctic.

As long as polar bears have lots of baby seals to eat in spring, they get fat enough to survive even a longer-than-usual summer fast. And while it’s true that studies in some regions show polar bears are lighter in weight than they were in the 1980s, there is no evidence that more individuals are starving to death or becoming too thin to reproduce because of less summer ice.

Not all bears get enough to eat in the spring, of course. Starvation has always been the leading natural cause of death for polar bears, due to a number of factors including competition, injury, tooth decay and illness. Some cancers induce a muscle-wasting syndrome that leads to faster-than-usual weight loss. This is likely what happened to the emaciated Baffin Island bear captured on video in July 2017 and promoted by National Geographic late last year. The videographers claimed it showed what starvation due to sea-ice loss looked like — an implausible conclusion given the time of year, the isolated nature of the incident, and the fact that sea ice that year was no more reduced than previously.

That starving-bear video may have convinced a few more gullible people that only hundreds of polar bears are left in the world. But it also motivated others to locate the International Union for Conservation of Nature (IUCN) Red List report for 2015, which estimated global polar bear numbers at somewhere between 22,000 and 31,000 (about 26,000), up slightly from 20,000 to 25,000 (about 22,500) in 2005. Newer counts not included in the 2015 assessment potentially add another 2,500 or so to the total. This increase may not be statistically significant, but it is decidedly not the 67-per-cent decline that was predicted given the ice conditions that prevailed.

The failure of the 2007 polar bear survival model is a simple fact that explodes the myth that polar bears are on their way to extinction. Although starving-bear videos and scientifically insignificant research papers still make the news, they don’t alter the facts: Polar bears are thriving, making them phony icons, and false idols, for global warming alarmists.




Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here


Tuesday, February 27, 2018

NEW BOOK just out

The Politically Incorrect Guide to Climate Change by Marc Morano

It's a very comprehensive coverage of all the issues associated with the global warming theory

From the blurb:

Less freedom. More regulation. Higher costs. Make no mistake: those are the surefire consequences of the modern global warming campaign waged by political and cultural elites, who have long ago abandoned fact-based science for dramatic fearmongering in order to push increased central planning. The Politically Incorrect Guide to Climate Change gives a voice -- backed by statistics, real-life stories, and incontrovertible evidence -- to the millions of "deplorable" Americans skeptical about the multibillion dollar "climate change" complex, whose claims have time and time again been proven wrong.

The Russian encouragement and perhaps origin of the now discredited theory of "nuclear winter"

Matt Ridley

So, Russia does appear to interfere in western politics. The FBI has charged 13 Russians with trying to influence the last American presidential election, including the whimsical detail that one of them was to build a cage to hold an actor in prison clothes pretending to be Hillary Clinton.

Meanwhile, it emerges that the Czech secret service, under KGB direction, near the end of the Cold War had a codename (“COB”) for a Labour MP they had met and hoped to influence — presumably under the bizarre delusion that he might one day be in reach of power.

There is no evidence that Jeremy Corbyn was a spy, or of collusion by Trump campaign operatives with the Russians who are charged. Yet the alleged Russian operation in America was anti-Clinton and pro-Trump. It was also pro-Bernie Sanders and pro-Jill Stein, the Green candidate — who shares with Vladimir Putin a strong dislike of fracking.

The Keystone Cops aspects of these stories should not reassure. The interference by Russian agents in western politics during the Cold War was real and dangerous. A startling example from the history of science has recently been discussed in an important book about the origins of the environmental movement, Green Tyranny by Rupert Darwall.

In June 1982, the same month as demonstrations against the Nato build-up of cruise and Pershing missiles reached fever pitch in the West, a paper appeared in AMBIO, a journal of the Royal Swedish Academy of Sciences, authored by the Dutchman Paul Crutzen and the American John Birks. Crutzen would later share a Nobel prize for work on the ozone layer. The 1982 paper, entitled The Atmosphere after a Nuclear War: Twilight at Noon, argued that, should there be an exchange of nuclear weapons between Nato and the Soviet Union, forests and oil fields would ignite and the smoke of vast fires would cause bitter cold and mass famine: “The screening of sunlight by the fire-produced aerosol over extended periods during the growing season would eliminate much of the food production in the Northern Hemisphere.”

Alerted by environmental groups to the paper, Carl Sagan, astronomer turned television star, then convened a conference on the “nuclear winter” hypothesis in October 1983, supported by leading environmental and anti-war pressure groups from Friends of the Earth to the Audubon Society, Planned Parenthood to the Union of Concerned Scientists. Curiously, three Soviet officials joined the conference’s board and a satellite link from the Kremlin was provided.
In December 1983, two papers appeared in the prestigious journal Science, one on the physics that became known as TTAPS after the surnames of its authors, S being for Sagan; the other on the biology, whose authors included the famous biologists Paul Ehrlich and Stephen Jay Gould as well as Sagan. The conclusion of the second paper was extreme: “Global environmental changes sufficient to cause the extinction of a major fraction of the plant and animal species on Earth are likely. In that event, the possibility of the extinction of Homo sapiens cannot be excluded.”

Who started the scare and why? One possibility is that it was fake news from the beginning. When the high-ranking Russian spy Sergei Tretyakov defected in 2000, he said that the KGB was especially proud of the fact “it created the myth of nuclear winter”. He based this on what colleagues told him and on research he did at the Red Banner Institute, the Russian spy school.

The Kremlin was certainly spooked by Nato’s threat to deploy medium-range nuclear missiles in Europe if the Warsaw Pact refused to limit its deployment of such missiles. In Darwall’s version, based on Tretyakov, Yuri Andropov, head of the KGB, “ordered the Soviet Academy of Sciences to produce a doomsday report to incite more demonstrations in West Germany”. They applied some older work by a scientist named Kirill Kondratyev on the cooling effect of dust storms in the Karakum Desert to the impact of a nuclear exchange in Germany.

Tretyakov said: “I was told the Soviet scientists knew this theory was completely ridiculous. There were no legitimate facts to support it. But it was exactly what Andropov needed to cause terror in the West.” Andropov then supposedly ordered it to be fed to contacts in the western peace and green movement.

It certainly helped Soviet propaganda. From the Pope to the Campaign for Nuclear Disarmament to the non-aligned nations, calls for Nato’s nuclear strategy to be rethought because of the nuclear winter theory came thick and fast. A Russian newspaper used the nuclear winter to inveigh against “inhuman aspirations of the US imperialists, who are pushing the world towards nuclear catastrophe”. In his acceptance speech for the Nobel peace prize in 1985, the prominent Russian doctor Evgeny Chazov quoted the Nobel committee's citation: "a considerable service to mankind by spreading authoritative information and by creating an awareness of the catastrophic consequences of atomic warfare". The statement continued: "...this, in turn, contributes to an increase in the pressure of public opposition".

“Propagators of the nuclear winter thus acted as dupes in a disinformation exercise scripted by the KGB”, concludes Darwall. We can never be entirely certain of this because Tretyakov’s KGB colleagues may have been exaggerating their role and he is now dead. But that the KGB did its best to fan the flames is not in doubt.

It soon became apparent that the nuclear winter hypothesis was plain wrong. As the geophysicist Russell Seitz pointed out, “soot in the TTAPS simulation is not up there as an observed consequence of nuclear explosions but because the authors told a programmer to put it there”. He added: “The model dealt with such complications as geography, winds, sunrise, sunset and patchy clouds in a stunningly elegant manner — they were ignored.” The physicist Stephen Schneider concluded that “the global apocalyptic conclusions of the initial nuclear winter hypothesis can now be relegated to a vanishingly low level of probability”.

The physicists Freeman Dyson and Fred Singer, who would end up on the opposite side of the global-warming debate from Schneider and Seitz, calculated that any effects would be patchy and short-lived, and that while dry soot could generate cooling, any kind of dampness risked turning a nuclear smog into a warming factor and a short-lived one at that.

By 1986 the theory was effectively dead, and so it has remained. A nuclear war would have devastating consequences, but the impact on the climate would be the least of our worries.

The stakes were higher in the Cold War than today. The Soviet peace offensive secured the support of many western intellectuals and much of the media, and very nearly prevailed.


Delingpole: NOAA Caught Lying About Arctic Sea Ice

The Arctic is melting catastrophically! Sea ice levels are experiencing their most precipitous decline in 1500 years! Something must be done – and fast…

Well, so claims the National Oceanic and Atmospheric Administration (NOAA), and we know by now what that means, don’t we?

Yep: the Arctic sea ice is doing just fine. Yep: yet again, the NOAA is telling porkies.

As usual, Paul Homewood has got its number.

First, here’s what the NOAA is claiming, as relayed in a scaremongering piece at Vox:

The Arctic Ocean once froze reliably every year. Those days are over.

Arctic sea ice extent has been measured by satellites since the 1970s. And scientists can sample ice cores, permafrost records, and tree rings to make some assumptions about the sea ice extent going back 1,500 years. And when you put that all on a chart, well, it looks a little scary.

In December, NOAA released its latest annual Arctic Report Card, which analyzes the state of the frozen ocean at the top of our world. Overall, it’s not good.

“The Arctic is going through the most unprecedented transition in human history,” Jeremy Mathis, director of NOAA’s Arctic research program, said at a press conference. “This year’s observations confirm that the Arctic shows no signs of returning to the reliably frozen state it was in just a decade ago.”

Now, courtesy of Homewood, are the facts:

Sea ice in the Arctic is recovering after a period of decline:

Arctic sea ice is getting thicker:

Arctic temperatures now are no higher than in the 1930s and 1940s:

On longer timescales, there is nothing unusual about Arctic temperatures:

If you’re still worried that the Arctic is about to disappear, here are more papers confirming that Arctic sea ice is well within its normal range of variability.

One of them, Stein et al., argues that there is more Arctic sea ice now than there has been for most of the last 10,000 years:

If only liberals and greenies relied on media that give facts rather than narrative, eh?


Cheap energy forever: Permian’s mammoth cubes herald supersized shale future

‘Cube development,’ which taps multiple layers of shale all at once, could accelerate the U.S. shale boom and make the world swim in cheap and abundant energy for much of the next 250 years, as The GWPF reports.

In the scrublands of West Texas there’s an oil-drilling operation like few that have come before.

Encana Corp.’s RAB Davidson well pad is so mammoth, the explorer speaks of it in military terms, describing its efforts here as an occupation.

More than 1 million pounds of drilling rigs, bulldozers, tanker trucks and other equipment spread out over a dusty 16-acre expanse. As of November, the 19 wells here collectively pumped almost 20,000 barrels of crude per day, according to company reports.

Encana calls this “cube development,” and it may be the supersized future of U.S. fracking, says Gabriel Daoud, a JPMorgan Chase & Co. analyst who visited Davidson last year. The technique is designed to tap the multiple layers of petroleum-soaked rock here in Texas’s Permian shale basin all at once, rather than the one-or-two-well, one-layer-at-a-time approach of the past.

After a years-long land grab by explorers, “the Permian is graduating,” according to Daoud. “Now it’s all about entering manufacturing mode.”

With the new technique, Encana and other companies are pushing beyond the drilling patterns that dominated during the early, exploratory phases of the shale revolution. Now, operators are assembling projects with a dozen or more well bores that touch multiple underground layers of the Permian and other shale plays simultaneously, tapping the entire 3-D “cube” beneath a producer’s acreage.

The shift has been controversial, with some of the biggest names in oil shying away from the approach as too aggressive and expensive. But if proponents are right, the cube could accelerate a drilling boom that’s already helped push U.S. production past an historic 10 million barrels a day, rewriting the rules of global energy markets along the way.


Battery storage* in perspective - solving 1% of the problem

The energy world is fixated on the "huge" amounts of battery storage presently being installed to back up slowly-increasing levels of intermittent renewables generation. The feeling seems to be that as soon as enough batteries are installed to take care of daily supply/demand imbalances we will no longer need conventional dispatchable energy - solar + wind + storage will be able to do it all. Here I take another look at the realities of the situation using what I hope are some telling visual examples of what battery storage will actually do for us. As discussed in previous posts it will get us no closer to the vision of a 100% renewables-powered world than we are now.

*Note: "Battery storage" covers all storage technologies currently being considered, including thermal, compressed air, pumped hydro etc. Batteries are, however, the flavor of the moment and are expected to capture the largest share of the future energy storage market.

This post is all about the difference between pipe dreams and reality. Prof. Mark Jacobson of Stanford University et al. have just published a new study that responds to the critics of their earlier 2017 study. The new study is paywalled, but Stanford's press release describes the basic procedures used:

For the study, the researchers relied on two computational modeling programs. The first program predicted global weather patterns from 2050 to 2054. From this, they further predicted the amount of energy that could be produced from weather-related energy sources like onshore and offshore wind turbines, solar photovoltaics on rooftops and in power plants, concentrated solar power plants and solar thermal plants over time. These types of energy sources are variable and don't necessarily produce energy when demand is highest.

The group then combined data from the first model with a second model that incorporated energy produced by more stable sources of electricity, like geothermal power plants, tidal and wave devices, and hydroelectric power plants, and of heat, like geothermal reservoirs. The second model also included ways of storing energy when there was excess, such as in electricity, heat, cold and hydrogen storage. Further, the model included predictions of energy demand over time.

Scenarios based on the modeling data avoided blackouts at low cost in all 20 world regions for all five years examined and under three different storage scenarios.

What's the energy mix that leads to this happy ending in no fewer than 139 of the world's countries? The lead-in figure of Jacobson et al's 2017 report, reproduced below as Figure 1, tells us. Rounded off to the nearest percent it's 5% hydro + geothermal, 37% wind, 58% solar and not a kilowatt of nuclear.

In contrast to Jacobson et al, who compare this energy mix with computer-generated demand scenarios that foresee the replacement of fossil fuels with wind and solar somehow lowering demand by 42.5%, I have taken my usual approach of comparing an energy mix with real-life grid data, which raises the question of which real-life data to use. Well, Stanford University is in California, and I happen to have quite a lot of grid data from the California Independent System Operator (CAISO), so I used that. And California is also a good example to use because it's heavy into solar and battery storage, or at least would like to be.

So what’s the problem with energy storage in California? It’s widely perceived to be the now-famous California duck curve, which shows how rapidly increasing solar generation could within a few years increase afternoon ramp rates to the point where existing gas-fired and hydro balancing facilities are no longer able to handle them:

Figure 2: The California duck curve

But while this could indeed be a problem in the future it isn’t at the moment. We begin our analysis of the real-life CAISO grid data with Figure 3, which plots hourly generation against demand for three days in early March 2015. With the help of imports from surrounding states California had no difficulty matching generation to demand over this period, with most of the load-balancing handled by gas-fired generation:

Figure 3: Actual CAISO generation by source and demand (black), hourly data, March 3, 4 and 5, 2015

Figure 4 now shows what Figure 3 would have looked like with the Jacobson et al renewables generation mix (5% hydro+geothermal, 37% wind, 58% solar) in place. It looks more like the Shanghai skyline than a duck:

Figure 4: Generation and demand, hourly data, March 3, 4 and 5, 2015, Jacobson et al generation mix. Generation is scaled to match demand over the period.

In this case CAISO would have considerable difficulty balancing generation against daily demand, and since a) the imbalances are caused almost entirely by solar and b) when it’s dark in California it will be dark in the surrounding Western US states too there will be little or no surplus energy available. So balancing will have to be done by storing the daytime solar surpluses for re-use at night. How much storage would be needed over the three-day period considered? According to Figure 5, about 300 GWh, the equivalent of over 2,000 Big South Australian Batteries (BSABs):

This, of course, is not a real-life case. No sane grid operator, nor even the California state legislature, would allow imbalances and ramp rates of this magnitude to develop in the first place...
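The battery count quoted above is easy to sanity-check. A minimal sketch, assuming one "Big South Australian Battery" stores roughly 129 MWh (the Hornsdale Power Reserve's capacity as originally built; the figure is my assumption, not stated in the article):

```python
# Rough check of the storage arithmetic: how many BSAB-sized batteries
# would cover a 300 GWh daytime-to-night shift?
storage_needed_gwh = 300   # surplus solar energy to carry through the night
bsab_capacity_mwh = 129    # assumed capacity of one Big South Australian Battery

batteries = storage_needed_gwh * 1000 / bsab_capacity_mwh
print(f"{batteries:.0f} BSAB-sized batteries needed")  # well over 2,000
```

The exact count depends on the assumed per-battery capacity, but any plausible figure lands in the thousands, which is the article's point.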

As noted earlier this is not a real-life case, but should it wish to go 100% renewable California will clearly have a seasonal energy storage requirement which vastly exceeds its daily “duck curve” requirement. And what does California, which claims to be a world leader in energy storage, propose to do about it?

Well, in 2010 it passed an energy storage mandate, the wording in which (offpeak, peaking powerplants, peak load requirements) left little doubt that its basic intention was to flatten out the daily duck curve when more solar comes on line:

    SECTION 1.

    (b) Additional energy storage systems can optimize the use of significant additional amounts of variable, intermittent, and offpeak electrical generation from wind and solar energy
    (c) Expanded use of energy storage systems can (avoid or defer) the need for new fossil fuel-powered peaking powerplants
    (d) Expanded use of energy storage systems will reduce the use of electricity generated from fossil fuels to meet peak load requirements on days with high electricity demand

The mandate went on to confirm that this was indeed its intention by calling for 1.325 gigawatts of energy storage without specifying how many hours the gigawatts were to last for. Apparently this was unimportant. According to recent reports California is about to call for two gigawatts more “storage”, with gigawatt-hours again unspecified. It’s questionable whether California even understands what energy storage is.
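The power-versus-energy confusion matters: a mandate stated in gigawatts fixes the rate of delivery but says nothing about how long it can be sustained. A quick illustration (the durations here are purely hypothetical, chosen only to show the spread):

```python
# A storage mandate in GW fixes power, not energy. The same 1.325 GW
# of capacity stores very different amounts of energy depending on
# how many hours it can discharge for.
power_gw = 1.325
for hours in (0.25, 1, 4):
    energy_gwh = power_gw * hours
    print(f"{power_gw} GW for {hours} h -> {energy_gwh:.2f} GWh")
```

A 15-minute battery and a 4-hour battery both satisfy a "1.325 GW" mandate while storing energy that differs by a factor of 16.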

Now there’s no question that high levels of intermittent renewables generation will require fast-frequency-response capabilities to ensure grid stability during the day, but what is California doing about seasonal storage, which makes up 99% of its total storage problem?

Absolutely nothing. It has yet to recognize its existence.

And the same goes for everyone else, including the UK, where proposed revisions to the energy storage market concentrate almost entirely on “fast frequency response” (I remember reading somewhere that according to National Grid any storage exceeding 15 minutes in duration will be superfluous but can’t find the reference).

People may be wondering why I’ve been spending so much time recently writing about energy storage problems. Well, this is why. Go back to Figure 7 and imagine what it would look like with the little wiggles gone. To all intents and purposes it would look exactly the same. And these little wiggles are all the growing rush for battery storage is going to remove.

More HERE  (See the original for links, graphics etc.)





Monday, February 26, 2018

Climate Accord Nations Failing, Complaining and Buying Coal

The Paris climate accord, which the U.S. wisely vacated last year, accomplishes nothing in the way of meaningful environmental changes. One huge and inherent roadblock is that other nations are only halfheartedly and haphazardly invested. In fact, aside from the obvious fact that Barack Obama signed onto it unconstitutionally, that was one of conservatives’ biggest gripes against the Obama administration’s obsession with making the U.S. a captive of the agreement.

Not only will the accord fail to significantly alter future temperatures (realists rightfully doubt it will alter temperatures at all), but major pollution emitters other than the U.S. are far less inclined to clean up their act. The expectation of fecklessness by other nations wasn’t so much a prediction as an inevitability.

This week, a Washington Post story — “Countries made only modest climate-change promises in Paris. They’re falling short anyway.” — proves this is exactly the case. The article says the persistence of deforestation in Brazil and the development of new coal plants in nations like Turkey and Indonesia are a few major reasons the world is “struggling to hit the relatively modest goals set in Paris.” In Germany, “The country’s emissions actually rose slightly in 2015 and 2016 because of continued coal burning and emissions growth in the transportation sector.”

With 2030 acting as the embryonic deadline for emissions targets, environmentalists are hoping that nations step up and push hard over the next 12 years to fulfill their obligations. As the Post notes, “The emissions-cutting pledges that countries brought to the table in Paris were nowhere near sufficient to meet such goals, which world leaders acknowledged at the time. The plan was for nations to ramp up their ambition over time.”

However, it continues, “By 2020, countries are expected to actually ramp up the promises they made in Paris.” This is a pipe dream. These nations were never expected to actually keep their promises, much less take initiative by going the extra mile. What makes anyone think they’ll change their ways in a few years?

Foreign nations can certainly be criticized for expecting the Paris climate accord to actually accomplish anything, not to mention their hypocrisy on the matter. But it’s not unreasonable for nations to put their interests ahead of a fairy tale accord. For example, The Washington Times reports: “As France, Germany and Italy chastised President Trump for rejecting the Paris climate accord in June and mocked the U.S. for turning its back on the environment, their nations were busy importing record amounts of American coal.”

An additional 95 million short tons of coal were shipped out of the U.S. last year. According to the Times, “About 31 million short tons of that went to Asia, nearly double the amount from 2016. China alone imported 2.8 million short tons through September 2017 — a wild increase over the previous year’s 205,000. Total exports to Europe reached 40 million short tons — 13 million more than in 2016.”

The U.S., thanks to Donald Trump, is making its economic interests a top priority by producing and exporting more energy resources. Meanwhile, other nations that greatly need those resources are happily taking them off our hands. It’s a win-win that benefits each nation. The bottom line? While the results contradict the rhetoric we’re hearing from “environmental leaders,” they demonstrate why the Paris climate accord will never work.


Germany Had to Ground Its 'Green' Luftwaffe

Too much biodiesel in the fuel mix leaves our NATO ally grounded and way behind schedule.   

How’s that “green” fuel initiative by the United States military and our allies working out? Not too well. In fact, one NATO ally has seen its military readiness take a huge hit as a result of putting being “green” ahead of being ready for war.

According to a report by UK Defence Journal, the German Luftwaffe’s force of Tornado IDS strike aircraft has been grounded. The reason? Too much biodiesel in the fuel mix. As a result, these potent strike aircraft are out of action until their fuel tanks can be flushed, new-pilot training is now three months behind schedule, and the Germans may not be able to lead the NATO force slated to counter Russian aggression, the Very High Readiness Joint Task Force, next year.

Now, “green fuel” from various sources (anything from beef fat to plants) can be useful as a reserve in case of a disruption in the supply of oil. But when biofuel-blended jet fuel costs almost $30 a gallon, or can include only a small amount of biofuel because anything more would break the bank, using it regularly is pretty stupid.

But there’s another “green” fuel with no carbon footprint that could be very useful here. That’s nuclear power, and it’s already used on the aircraft carriers and submarines of the United States Navy. What you may not remember is that it also was once used to power the Navy’s nine cruisers.

Perhaps a good idea might be to develop two classes of nuclear escorts for the nuclear-powered carriers: One would be an aerospace-defense cruiser loaded with Mk 41 vertical-launch cells — something at least the size of the one-of-a-kind USS Long Beach (CGN 9). The other would be a general-purpose escort — think something like an updated California-class guided-missile cruiser (originally designated a guided-missile destroyer leader).

Doing this could be a start in helping to free up some of the fuel resources. In 1964, the Navy used Task Force One to sail around the world in about two months without refueling. That’s not a bad thing.


Britain and Europe must ban palm oil in biofuel to save forests, EU parliament told

If Britain and other European nations are to fulfil forest protection goals, they must ban the use of palm oil for biofuel and tighten oversight of supply chains, a delegation of forest peoples told parliamentarians this week.

The call for urgent, concrete action comes amid an increasingly heated diplomatic row over the issue between the EU and the governments of major palm-producing nations such as Indonesia, Malaysia and Costa Rica.

The European parliament voted last April to prohibit sales of biofuels made from vegetable oils by 2020 in order to meet its climate goals. This was followed by a related vote last month. Whether and how this might be implemented is now being considered by the European Commission and member states.

The pushback has been strong, particularly in south-east Asia, the origin of 90% of the world’s palm oil exports. Palm oil is used in hundreds of supermarket products and can also be blended with diesel to power engines, which is what the ban would halt.

Influential politicians in these countries, many of whom are closely linked to the industry, accuse the EU of trade protectionism, colonial thinking and undermining poverty reduction efforts. Malaysia’s plantations minister described the proposed ban as “crop apartheid.”

But indigenous and other communities who are negatively affected by the plantations urge the EU to push ahead with the ban and to go further by tightening other supply chain controls to prevent damage to their land, rights and environment.

Franky Samperante, a founder of the indigenous peoples’ organisation Pusaka, said the Indonesian government had granted concessions to more than 50 companies to open plantations on 1.2m hectares of land claimed by local communities. For him, any palm oil from this area should be considered a conflict product and prohibited from sale in Europe.

“There should be sanctions. If not, there is no point,” he said.

Samperante is part of a group of 14 forest peoples representatives from 11 nations in Asia, Africa and Latin America visiting Europe this week to lobby for a new action plan on sustainable supply chains.

The delegation proposed concrete steps, including for European nations to establish sustainable trade ombudsmen to look into reports of human rights and environmental violations, and for companies to adopt binding human rights policies rather than voluntary actions. Their call was supported by a coalition of environmental NGOs including the Forest Peoples Programme, Global Witness, Greenpeace, WWF and the Environmental Investigation Agency.

Tom Griffiths said lofty goals to protect forests were being undermined by a failure to protect the rights of those who live in them.

“There are so many pledges and commitments by companies and government that sound good on paper, but the reality on the ground is starkly different,” he said. “At the meetings this week, they are all saying close the gap.”

Their recommendations will be presented at a multilateral meeting in Paris in June, when the French president, Emmanuel Macron, is expected to launch his strategy for “deforestation-free trade”.


The RFS has bankrupted its first refinery, more to follow

By Printus LeBlanc

Americans for Limited Government has been warning of the impending bankruptcies in the petroleum refining industry for some time now. Well, the first canary, Philadelphia Energy Solutions (PES), has died, and the only question left is how many more will die before the Environmental Protection Agency (EPA) and Congress wake up.

In 2005, Congress passed, and President Bush signed, the Energy Policy Act of 2005. Among the many new regulations created in the legislation, the Renewable Fuel Standard (RFS) was born. The RFS mandated that a certain amount of renewable fuels, mostly corn ethanol, be blended with gasoline. The amount was 4 billion gallons in 2006, rising to 7.5 billion in 2012.

In 2007, the Energy Independence and Security Act of 2007 was passed. The bill increased the amount of renewable fuel to be blended. It required 9 billion gallons be blended in 2008 with an increase to 36 billion gallons in 2022. Of course, this made the subsidy-loving corn growers extremely happy. The federal government was now mandating that citizens purchase their product. And we wonder where they got the idea for the Obamacare individual mandate.

To track the renewable fuel usage, Renewable Identification Numbers (RIN) were created. A RIN is a string of numbers and letters used to identify each batch of biofuel produced. The RINs count towards the Renewable Volume Obligation (RVO), an amount designated to each refinery by the EPA. The RINs are the problem.

When the EPA instituted the program, it believed the costs would only be a few cents per RIN. As usual, when the federal government gets involved, costs spiral out of control. Wall Street speculators now routinely drive up the prices. In one seven-month period in 2013, the value of one RIN went from 7 cents to $1.43. The volatility is creating economic hardships for refiners across the nation and has caused the largest refinery on the East Coast to declare bankruptcy.

PES filed for Chapter 11 bankruptcy protection in January with an estimated $600 million in debts. The company owns the largest refinery on the East Coast, with the capability to refine 335,000 barrels per day. In the bankruptcy filing, PES stated the second-largest expenditure behind crude oil was RINs, spending $218 million on the imaginary numbers in 2017.

Mixing ethanol and gasoline is not as easy as it sounds. A refinery, like the PES facility, cannot combine the two ingredients at the plant. Because ethanol degrades the mixture over time, there is a relatively short shelf life once the two chemicals are mixed, around three months. For this reason, the ethanol is mixed in at the point of sale to the consumer.

This does not have an impact on refineries that own gas stations. Several of the larger companies like Exxon and Saudi Aramco also own gas stations across the country. What are small and medium-sized refiners to do, go out and spend millions to purchase gas stations?

Of course, King Corn couldn't care less about the companies going bankrupt, because refiners are forced to buy its product. All King Corn cares about is making sure the government mandate stays in place, regardless of the outcome for consumers or refiners.

However, there is a middle ground everyone can agree on. Allowing RINs attached to exported biofuels to be counted towards the RVO benefits almost everyone:

Refiners no longer must pay twice for RINs;

Corn producers still produce the same amount of corn, and gain greater access to overseas markets;

American exports increase.

EPA Administrator Pruitt must act quickly. The first canary in the ethanol corn maze is dead. The next one is likely to come from Delaware. It is time to reform the RFS and get the government out of picking winners and losers. The RINs system must be updated.


Rainfall’s Natural Variation Hides Climate Change Signal

New research from The Australian National University (ANU) and ARC Centre of Excellence for Climate System Science suggests natural rainfall variation is so great that it could take a human lifetime for significant climate signals to appear in regional or global rainfall measures.

Even exceptional droughts like those over the Murray Darling Basin (2000-2009) and California (2011-2017) fit within the natural variations in the long-term precipitation records, according to the statistical method used by the researchers.

This has significant implications for policymakers in the water resources, irrigation and agricultural industries.

“Our findings suggest that for most parts of the world, we won’t be able to recognize long-term or permanent changes in annual rainfall driven by climate change until they have already occurred and persisted for some time,” said  Professor Michael Roderick from the ANU Research School of Earth Sciences.

“This means those who make decisions around the construction of desalination plants or introduce new policies to conserve water resources will effectively be making these decisions blind.

“Conversely, if they wait and don’t act until the precipitation changes are recognized they will be acting too late. It puts policymakers in an invidious position.”

To get their results the researchers first tested the statistical approach on the 244-year-long observational record of precipitation at the Radcliffe Observatory in Oxford, UK. They compared rainfall changes over 30-year-intervals. They found any changes over each interval were indistinguishable from random or natural variation.

They then applied the same process to California, which has a record going back to 1895, and the Murray Darling Basin from 1901-2007. In both cases, the long dry periods seem to fit within expected variations.

Finally, they applied the process to reliable global records that extended from 1940-2009. Only 14 percent of the global landmass showed, with 90 percent confidence, increases or decreases in precipitation outside natural variation.
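The detection problem the researchers describe can be illustrated with a toy simulation. This is a hedged sketch, not the ANU authors' actual statistical method, and the rainfall mean, year-to-year variability and climate-change shift below are invented for illustration: it shows how often pure noise in two adjacent 30-year intervals produces a difference as large as a real, permanent shift.

```python
import random
import statistics

random.seed(42)

MEAN_MM = 650.0      # hypothetical long-term annual rainfall (mm)
SD_MM = 130.0        # hypothetical year-to-year standard deviation (~20%)
SHIFT_MM = 30.0      # hypothetical permanent climate-change shift (mm)

def interval_difference(shift):
    """Difference between the means of two adjacent 30-year intervals."""
    early = [random.gauss(MEAN_MM, SD_MM) for _ in range(30)]
    late = [random.gauss(MEAN_MM + shift, SD_MM) for _ in range(30)]
    return statistics.mean(late) - statistics.mean(early)

# How often does noise alone (shift = 0) exceed the "real" 30 mm shift?
trials = 5000
noise_only = sum(abs(interval_difference(0.0)) >= SHIFT_MM
                 for _ in range(trials))
print(f"Noise alone exceeds the {SHIFT_MM:.0f} mm shift in "
      f"{100 * noise_only / trials:.0f}% of trials")
```

With these made-up numbers, natural variability alone mimics the shift in a large fraction of trials, which is the essence of the "effectively hidden in the noise" result.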

Professor Graham Farquhar AO also from the ANU Research School of Biology said natural variation was so large in most regions that even if climate change was affecting rainfall, it was effectively hidden in the noise.

“We know that humans have already had a measurable influence on streamflows and groundwater levels through extraction and making significant changes to the landscape,” Professor Farquhar said.

“But the natural variability of precipitation found in this paper presents policymakers with a large known unknown that has to be factored into their estimates to effectively assess our long-term water resource needs.”




Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here


Sunday, February 25, 2018

World's coral reefs face new peril from beneath within decades (?)

This is just a new variation on an old fraud.  For the ocean to become more acidic it has to absorb more CO2 and thus produce carbonic acid (H2O + CO2 = H2CO3). And as CO2 levels rise, that might happen to some degree.

But according to Warmist theory, higher CO2 levels will bring higher temperatures. And higher ocean temperatures REDUCE the carrying capacity of the oceans for CO2. So CO2 will OUTGAS from the oceans under higher temperatures and the oceans will be LESS acidic.

So if the galoots below really believed in global warming they would welcome it as REDUCING the threat to corals.

So there is a small potential threat to corals from higher CO2 levels but it will only eventuate if there is NO global warming. Fun?
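The outgassing argument above rests on CO2 solubility falling as water warms. A minimal sketch using Henry's law with the standard van 't Hoff temperature correction; the constants are approximate textbook values for CO2 in water, not figures from this post:

```python
import math

KH0 = 0.034           # mol/(L*atm) at 25 C, textbook approximate value
T0 = 298.15           # reference temperature, K
VANT_HOFF_C = 2400.0  # K, approximate van 't Hoff coefficient for CO2

def henry_constant(temp_c):
    """Approximate Henry's-law constant for CO2 at temp_c (Celsius)."""
    t = temp_c + 273.15
    return KH0 * math.exp(VANT_HOFF_C * (1.0 / t - 1.0 / T0))

cold = henry_constant(15.0)
warm = henry_constant(20.0)
print(f"15 C: {cold:.4f} mol/(L*atm); 20 C: {warm:.4f} mol/(L*atm)")
```

At the same CO2 partial pressure, the warmer water dissolves measurably less gas, which is the qualitative point being made; whether outgassing or extra atmospheric CO2 dominates in practice is exactly what the two sides dispute.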

The world's coral reefs, already enduring multiple threats from bleaching to nutrient run-off from farming, also face another challenge - this time from below.

New research, published in the journal Science on Friday, has found the sediments on which many reefs are built are 10 times more sensitive to the acidifying oceans than the living corals themselves. Some reef bases are already dissolving.

The study used underwater chambers at four sites in the Pacific and Atlantic oceans, including Heron Island in the Great Barrier Reef, and applied modelling to extrapolate results for 22 reefs in three ocean basins.

As oceans turn more acidic, the corals themselves produce less of the calcium carbonate that forms their base. Instead of growing, the reef bases start to dissolve.

"The public is less aware of the threat of ocean acidification [than warming waters]," said Brendan Eyre, a professor of biogeochemistry at the Southern Cross University and the paper's lead author.

“Coral reef sediments around the world will trend towards dissolving when seawater reaches a tipping point in acidity - which is likely to occur well before the end of the century,” he said.

At risk will be coral reef ecosystems that support tourism, fisheries and the many other human activities, he said.

The ocean's acidity has increased about 30 per cent since the start of the industrial revolution, as seas absorb about one-third of the build-up of greenhouse gases in the atmosphere.
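The "30 per cent" figure refers to hydrogen-ion concentration, and because the pH scale is logarithmic it corresponds to a small-looking pH change. A quick back-of-envelope check (the ~8.2 pre-industrial surface pH is a commonly cited value assumed here, not taken from the article):

```python
import math

# pH = -log10([H+]), so a 30% rise in [H+] lowers pH by log10(1.3) ~ 0.11,
# roughly the widely quoted fall from ~8.2 to ~8.1 since pre-industrial times.
PREINDUSTRIAL_PH = 8.2                    # assumed starting point
h_plus_old = 10 ** -PREINDUSTRIAL_PH
h_plus_new = 1.30 * h_plus_old            # 30% more hydrogen ions
new_ph = -math.log10(h_plus_new)
print(f"pH drop: {PREINDUSTRIAL_PH - new_ph:.3f} -> new pH {new_ph:.2f}")
```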

“It is vital that we put pressure on governments globally to act in concert to lower carbon dioxide emissions as this is the only way we can stop the oceans acidifying and dissolving our reefs,” Professor Eyre said.

Rates of dissolving reef sediment will depend on their starting points, including their exposure to organic sediment. The Hawaiian reef studied is already showing signs of its sediment dissolving, with higher organic nutrient levels likely to be contributing, he said.

"Carbonate sediments in Hawaii are already net dissolving and will be strongly net dissolving by the end of the century," the paper said.

Living corals themselves appear to be able to resist the acidification process, with mechanisms and strategies to resist some of the impacts.

Still, the study said the transition of the dissolution of reef sediment "will result in the loss of material for building shallow reef habitats such as reef flats and lagoons, and associated coral cays". It is unknown if the reefs will face "catastrophic destruction" once the erosion begins, the paper said.

Over time, as coral bases begin to dissolve, they become more vulnerable to cyclones and other threats, Professor Eyre said.

Further study was needed, he said, to understand how reefs would be affected by rising temperatures, organic and nutrient levels, and more acidic waters in combination.

The impact of bleaching - such as the two mass events in the 2015-16 and 2016-17 summers on the Great Barrier Reef - would most likely accelerate the breakdown of reefs by "making more sediment and organic matter available for dissolution", the paper said.


Groupthink On Climate Change Ignores Inconvenient Facts

Christopher Booker

Since we’ve now been living with the global warming story for 30 years, it might seem hard to believe that science could now come up with anything that would enable us to see that story in a wholly new light.

But that is what I am suggesting in a new paper, just published in the UK by the Global Warming Policy Foundation, thanks to a book called Groupthink, written more than 40 years ago by a professor of psychology at Yale, Irving Janis.

What Janis did was to define scientifically just how what he called groupthink operates, according to three basic rules. And what my paper tries to show is the astonishing degree to which they explain so much that many have long found puzzling about the global warming story.

Janis’s first rule is that a group of people come to share a particular way of looking at the world which may seem hugely important to them but which turns out not to have been based on looking properly at all the evidence. It is therefore just a shared, untested belief.

Rule two is that, because they have shut their minds to any evidence which might contradict their belief, they like to insist that it is supported by a “consensus”. The one thing those caught up in groupthink cannot tolerate is that anyone should question it.

This leads on to the third rule, which is that they cannot properly debate the matter with those who disagree with their belief. Anyone holding a contrary view must simply be ignored, ridiculed and dismissed as not worth listening to.

What my paper does is look again at the entire global warming story in the light of Janis’s rules, and to show how consistently they explain so much of the way it has unfolded all the way through.

The alarm over man-made climate change was first exploded on the world in 1988 by a tiny group of scientists who had become convinced that, because both CO2 levels and global temperatures were rising, one must be the cause of the other. Unless something very drastic was done, they urged, the planet was heading for catastrophe.

In November that year, two of these fervent believers in what they called “human-induced climate change” were authorized to set up the UN’s Intergovernmental Panel on Climate Change, the IPCC. This would report to the world’s politicians on the basis of computer models programmed, according to their theory, to predict just how fast the world was likely to heat up over the next 100 years.

With startling speed, their theory was soon proclaimed as being supported by a scientific “consensus”, backed by governments, all the main scientific journals and institutions, environmental pressure groups and the media.

In fact right from the start, many scientists, like the eminent physicist Richard Lindzen of MIT, were highly skeptical, both of the theory itself and of those computer models. These, as Lindzen wrote, were so narrowly focused on CO2 that they were far too simplistic to allow for all the other natural factors which shape the earth’s climate.

But such dissenters were ignored. And for nearly 20 years the “consensus” rolled on, ever more extreme in its apocalyptic claims, with each new IPCC report scarier than the last. By 2006 Al Gore’s An Inconvenient Truth was outdoing them all.

Anyone daring to question the “consensus” was now being vilified as just an “anti-science denier”, no better than those crazies who deny the reality of the Nazi Holocaust.

Just then, however, the story was beginning to change. It was noted that, since the abnormally hot year of 1998, caused by a record El Nino, global temperatures had not risen at all. Those computer models had not predicted this.

Even more significant, thanks to the internet, expert science blogs were now appearing, able to show that not a single one of the claims from the “consensus” – vanishing Arctic ice, disappearing polar bears, unprecedented hurricanes, floods, droughts etc – was supported by the factual evidence.

By 2009, the “consensus” was facing considerable embarrassment, with the highly damaging Climategate emails between the little group of scientists at the heart of the IPCC, followed by the collapse in disarray of the great Copenhagen climate conference.

Then there was the spate of scandals surrounding the IPCC itself when it was revealed some of the scariest predictions of its latest report had not been based on proper science at all, but only on more hysterical claims by climate activists.

Finally, in Paris in 2015, came what I describe as the crux of the whole story. This was yet another great global conference to decide what the world must do to avert catastrophe.

Every nation had been asked in advance to submit its energy plans for the years up to 2030. The West, led by President Obama and the EU, dutifully pledged that it would be cutting its “carbon emissions” by up to 40 percent.

But from the rest of the world, a totally different story emerged. China, by now the world’s largest CO2 emitter, was planning to build so many new coal-fired power stations that by 2030 its emissions would have doubled. India, the third largest emitter, was planning to triple them. Altogether global emissions by 2030 were set to rise by a staggering 46 percent.

The rest of the world was just giving two fingers to the “consensus” and planning to carry on regardless. But not one Western leader mentioned this until 2017, when President Trump gave it as his reason for pulling the US out of that meaningless “Paris Accord”.

In effect, Trump was thus finally calling the bluff of the groupthink which for 30 years had driven the whole global warming scare. If other Western countries wanted to commit economic suicide, that was their affair. But the rest of the world was no longer taken in by it, and the US was now with them.


2 More New Papers Affirm There Is More Arctic Ice Coverage Today Than During The 1400s

Earlier this year, Stein et al., 2017 published a reconstruction of Arctic sea ice variations throughout the Holocene that appeared to establish that there is more Arctic sea ice now than for nearly all of the last 10,000 years.

The study region, the Chukchi Sea, was deemed representative of most of the Arctic, as the authors asserted that “the increase in sea ice extent during the late Holocene seems to be a circum-Arctic phenomenon as PIP25-based sea ice records from the Fram Strait, Laptev Sea, East Siberian Sea and Chukchi Sea  display a generally quite similar evolution, all coinciding with the decrease in solar radiation.”

The proxy data used to reconstruct Arctic-wide sea ice variations over the Holocene (PIP25) clearly show that modern sea ice extent has only modestly retreated relative to the heights reached during the Little Ice Age (the 17th and 18th centuries), and that from about 1400 A.D. back through the rest of the 10,000-year-long Holocene, Arctic sea ice extent was much lower than it is today.

In 2014, Dr. Qinghua Ding and colleagues published a consequential paper in the journal Nature contending that much of the warming trend in the Arctic since 1979 can be traced to “unforced natural variability” rather than anthropogenic forcing.

“A substantial portion of recent warming in the northeastern Canada and Greenland sector of the Arctic arises from unforced natural variability.”

Then, a few months ago, Dr. Ding and co-authors published another Nature paper (Ding et al., 2017) that extended  a natural attribution to trends in Arctic sea ice variability, concluding that as much as half of the decline in Arctic sea ice since 1979 is due to internal (natural) factors, further undermining the position that anthropogenic forcing dominates Arctic sea ice changes.

“Internal variability dominates the Arctic summer circulation trend and may be responsible for about 30–50% of the overall decline in September sea ice since 1979.”

Within the last month, two more papers have been published that further affirm the conclusion that modern Arctic sea ice extent has not changed significantly relative to even the last few centuries, nor has it fallen outside the range of natural variability.

1. Like Stein et al. (2017), Yamamoto et al., 2017 largely attribute Holocene sea ice concentration variations to solar forcing, and they assemble a reconstruction of sea ice trends for the region that once again clearly shows sea ice coverage is greater now than it has been for almost all of the Holocene.

“Millennial to multi-centennial variability in the quartz / feldspar ratio (the BG [Beaufort Gyre] circulation) is consistent with fluctuations in solar irradiance, suggesting that solar activity affected the BG [Beaufort Gyre] strength on these timescales. … The intensified BSI [Bering Strait in-flow] was associated with decrease in sea-ice concentrations and increase in marine production, as indicated by biomarker concentrations, suggesting a major influence of the BSI on sea-ice and biological conditions in the Chukchi Sea. Multi-century to millennial fluctuations, presumably controlled by solar activity, were also identified in a proxy-based BSI record characterized by the highest age resolution. … Proxy records consistent with solar forcing were reported from a number of paleoclimatic archives, such as Chinese stalagmites (Hu et al., 2008), Yukon lake sediments (Anderson et al., 2005), and ice cores (Fisher et al., 2008), as well as marine sediments in the northwestern Pacific (Sagawa et al., 2014) and the Chukchi Sea (Stein et al., 2017).”

2. In another new paper, Moffa-Sánchez and Hall, 2017  analyze subpolar temperature changes, glacier advances and declines, and sea ice variations in the Labrador Sea, North Atlantic, North Iceland, Alaska, Swedish Lapland, and Northwestern Europe region.

“Paleoceanographic reconstructions from a more northward location of the polar front on the North Iceland margin show centennial-scale cold events and marked increases in sea ice with similar timing to the cold events recorded in the eastern Labrador Sea.  … The records from the northernmost sites show a linear cooling trend perhaps driven by the Neoglacial decrease in summer insolation in the northern high latitudes and its effects on Arctic sea ice production. “

“Periods of increased influence of polar waters in the eastern Labrador Sea, reduced LSW  [Labrador Sea Water] formation and weaker subpolar gyre largely coincide with well-established cold periods recorded in glacier advances, tree-ring and pollen records in the circum-North Atlantic and northwest Europe [Dark Ages Cold Period, Little Ice Age]. … Conversely, periods of reduced influence of polar waters in the eastern Labrador Sea, stronger subpolar gyre and increase LSW [Labrador Sea Water] formation largely coincide with mild/warm periods in Europe namely the Roman Warm Period and the Medieval Climatic Anomaly.”

The authors find that while Arctic sea ice coverage was more advanced during the Little Ice Age, sea ice concentrations in the waters north of Iceland were far lower than now from about 500 years ago onward, especially during the centuries encompassing the Medieval Warm Period (or Medieval Climate Anomaly) and Roman Warm Period.

Glacier advance and retreat for the Alaska and Swedish Lapland regions also followed the climate trends associated with the Little Ice Age, Medieval Climate Anomaly, Dark Ages Cold Period, and Roman Warm Period.   During the earlier warm periods and for most of the last 3,000 years, glacier recession was more pronounced than it is now.

Moffa-Sánchez and Hall (2017) also report that sea surface temperatures north of Iceland were much warmer in the past than they are now.

Finally, the 10-150 m layer of the Labrador Sea  has also not undergone any net warming trend in the last 75 years.


Terence Corcoran: Polar bear battle in Toronto! It’s good science vs. climate do-gooders

Two events next week juxtapose two conflicting conclusions on the current health and future for polar bears. Behind the science, there’s also a juicy personal clash


Coming next Tuesday to Toronto’s swanky Yorkville district, it’s the 2018 Polar Bear Showdown, an international display of conflicting views on the state of polar-bear science. Are the great, charismatic creatures, all white, cuddly-looking and dangerous, caught in the death grip of climate change?

At one corner in Yorkville, in the ballroom of the upmarket Four Seasons Hotel, Polar Bears International (PBI) will stage a grand, $15,000-a-table gala to raise funds to protect the allegedly threatened Arctic species from the ravages of our addiction to fossil fuels. Sponsored by a klatch of corporate goody-two-shoes — a couple of Canadian banks, a major accounting outfit, The Globe and Mail — and filled with razzle-dazzle entertainment and good food, the purpose of the event is to mark International Polar Bear Day and draw attention to PBI’s science-based effort to sound a global polar-bear alarm.

At another corner, exactly one block away, in the Founders’ Room at the down-market Toronto Reference Library, the Global Warming Policy Foundation of London, England will launch a new report on the state of polar bears by Susan Crockford, adjunct professor at the University of Victoria. There will be no entertainment, and no food, but the science will be far superior.

As a science showdown, the Yorkville events juxtapose two conflicting conclusions on the current health and future prospects for polar bears amid climate change. Behind the science, there’s also a juicy personal clash.

The chief scientist at Polar Bears International is Steven Amstrup, adjunct professor at the University of Wyoming and a leading purveyor of the theory that climate change could exterminate polar bears from the Arctic regions. In recent months, Amstrup has launched direct attacks on Crockford and joined others in producing what can only be described as junk-science attempts to undermine her polar-bear research. In return, Crockford recently published a critique of Amstrup’s decades-long campaign to portray polar bears as an endangered species and establish them as the poster-species for climate change.

Crockford’s conclusion is that PBI’s chief scientist and prime motivational guide, whose biographic page contains a catalogue of polar-bear alarmism, spent more than a decade creating a media scare that drove many (including Al Gore) to believe in a threat that didn’t exist. As Crockford wrote in a posting on her blog last month: “Polar bear experts who falsely predicted that roughly 17,300 polar bears would be dead by now (given sea ice conditions since 2007) have realized their failure has not only kicked their own credibility to the curb, it has taken with it the reputations of their climate change colleagues.”

Crockford’s new paper is aimed at a wide audience of teachers, scientists, students, decision-makers and the general public. It should be required reading for attendees at the Polar Bear Day gala. An executive summary of the report, State of the Polar Bear Report 2017, says that global polar-bear numbers have been stable or have risen since 2005, despite lower summer sea ice levels: “Overly pessimistic media responses to recent polar bear issues have made heartbreaking news out of scientifically insignificant events.”

As of this writing, one of those insignificant heartbreaking events — the video of a lone and apparently starving polar bear — adorns PBI’s website and serves as part of the sales pitch for next Tuesday’s gala in Yorkville. The video went viral in December, but has since been widely criticized. As veteran British environment writer Fred Pearce wrote recently in New Scientist magazine: “Emaciated, it stumbled across a green Arctic landscape without a speck of snow or ice in sight … Media outlets seized on the video as an example of how climate change is killing its poster child. But behind the headlines is an awkward question: have climate change activists chosen the wrong mascot?”

Pearce notes that the theory of looming polar-bear extinction has proved wrong. With rising temperatures in the Arctic and less ice “the polar bear population should have crashed. It hasn’t. If anything, numbers are up compared with 10 years ago.” Population numbers are also up since 1973, when hunting bans were put in place. While Pearce still sees the bears at some risk from a variety of threats, current estimates suggest “the species is not at immediate risk of extinction.”

Another recent commentary makes a similar point. In a release summarizing a recent polar-bear conference in Fairbanks, Alaska, an organization funded by the Russian Geographical Society quotes a Russian conservation official, Yegor Vereshchagin, on the fate of polar bears in Russia’s Chukotka region, across the Bering Sea from Alaska. “Both scientific data and traditional knowledge prove that nothing threatens our bears. During spring counts of dens we often find female bears with three cubs, which proves that the population is in good shape and there is no danger of a decrease in the population.”

Surely the attendees, corporate sponsors and organizers of that big Yorkville gala would find it instructive to download Crockford’s paper when it is released by the Global Warming Policy Foundation next Tuesday, a few hours before their ritzy event. They will no doubt be thrilled by the good news. Maybe one of them will grab the mic that night and propose a toast: “Here’s to the polar bears, who are doing great!”


Deregulate Australian energy market and go back to coal

The catastrophic outcome of government energy market interventions is palpably clear. As the latest new regulatory body, the Energy Security Board, diplomatically puts it: “Fifteen years of climate policy instability … (have) left our energy system vulnerable to escalating prices while being both less reliable and secure.”

Australia has seen electricity prices double since 2015 and the once reliable supply is now suspect. From enjoying the world’s lowest cost electricity a decade ago, Australia now has among the most expensive.

The main cause has been subsidies and regulatory favours to renewable energy — chiefly wind — that have forced the closure of reliable coal-fired generators, particularly Northern in South Australia and Hazelwood in Victoria. Without these subsidies, costing about $5 billion a year, there would be no wind or solar. Not only are customers and taxpayers slugged with the subsidy costs but the outcome also has been to raise prices and reduce reliability.

A new Australian coal plant would produce electricity at about $50 a megawatt hour. A new wind farm can produce electricity, at best, at $110/MWh, and its present subsidy is about $85/MWh. Solar is about twice the cost of wind.

Fundamentally, the cost disadvantage of wind and solar stems from their low “energy density”. To get the equivalent energy from a standard 500MW coal generation unit requires 300 wind generators or 900,000 solar panels, and storage or back-up capacity is required to offset the inherent unreliability of energy sources dependent on the vagaries of the weather. Energy Minister Josh Frydenberg put the cost of this at $16/MWh, an optimistic estimate even with the government’s 23.5 per cent renewable target.
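The rough arithmetic behind the "energy density" claim can be checked by comparing annual energy output. All the capacity factors and unit sizes below are illustrative assumptions, not figures from the article; the 300-turbine and 900,000-panel counts quoted there depend entirely on what one assumes here.

```python
HOURS_PER_YEAR = 8760

def annual_mwh(capacity_mw, capacity_factor):
    """Energy delivered in a year by a generating unit of given capacity."""
    return capacity_mw * capacity_factor * HOURS_PER_YEAR

coal = annual_mwh(500, 0.85)            # one coal unit, ~85% availability
wind_turbine = annual_mwh(4.0, 0.40)    # one large 4 MW turbine, CF ~40%
solar_panel = annual_mwh(0.0004, 0.18)  # one 400 W panel, CF ~18%

print(f"Coal unit: {coal:,.0f} MWh/yr")
print(f"Wind turbines needed: {coal / wind_turbine:,.0f}")
print(f"Solar panels needed: {coal / solar_panel:,.0f}")
```

With these assumed numbers the turbine count lands near the article's figure of 300, while the panel count comes out far higher than 900,000; the gap mainly shows how sensitive such equivalences are to the unit sizes and capacity factors chosen.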

Wind farm entrepreneur Simon Holmes a Court recently argued on this page that the world is abandoning coal for electricity generation. Australia’s booming coal exports testify to the ludicrous nature of such statements. In fact, according to Greenpeace’s data, China has 300,000MW of new coal plant under way, increasing its capacity by a third; Japan has 20,000MW, which also would raise capacity by a third; while India has plans for an additional 148,000MW, adding 65 per cent to its capacity. Australian coal generating capacity is about 25,000MW.

The US has no new coal generators planned. This is partly a legacy of Barack Obama, who declared his policies would bankrupt any new coal generators, and partly because of the US boom in gas and oil production. Due to fracking, a technology largely banned in Australia, the US has gas at less than half the Australian price, making it cheaper than coal for new electricity generation.

Holmes a Court was correct in drawing attention to the costly failures of “carbon capture and storage”, the global propaganda arm for which is largely financed by the Australian government, and of high-energy, low-emissions coal power stations. These technologies reduce carbon dioxide emissions but involve add-on costs.

The Minerals Council of Australia, anxious to retain the support of BHP, has promoted low-emission technologies. For internal reasons, BHP supports renewables and opposes coal generation in Australia, notwithstanding its dependence on international coal sales and cheap energy generally. The firm’s promotion of renewable energy collided with reality in the form of high fuel costs for its Olympic Dam mine in wind-dependent South Australia. It also took a $137 million hit from the 2016 wind-induced collapse of SA’s power system.

Many firms support renewable policies out of self-interest. Revenue from subsidies is itself valuable and, in addition, coal generators, as Origin Energy’s half-year results last week showed, are earning huge profits from the doubled wholesale price. Others are conscripted to support renewables for PR reasons, as part of what German political scientist Elisabeth Noelle-Neumann has called a “spiral of silence”, where a loud and confident group is perceived to be majority opinion, leading others to acquiesce in much of its message.

The ESB has been tasked with creating an electricity market blueprint that marries lower carbon dioxide emissions with lower costs and greater reliability. This is an impossible task and would require massive new regulatory interventions.

The ESB’s proposals envisage creating a market combining emissions and energy in which every retailer and generator would need to participate. They would add new dimensions of complexity to electricity supply, bringing a further proliferation of administrative resources within the bureaucracy and the industry.

Envisaging such further controls as bringing improved efficiency represents a triumph of hope over experience. We can restore our latent competitiveness in cheap energy only by abandoning all the intrusions and distortions that are in place. Donald Trump has achieved success from such an approach and we may have to await full recognition of this before our politicians adopt similar deregulatory policies.



