Thursday, March 22, 2018

When will the US feel the heat of global warming? For the Great Plains, natural variability will dominate until late this century

Even Warmists are noting that the USA hasn't warmed significantly

By increasing the energy stored in our atmosphere, climate change is expected to generate more severe storms and heat waves. Severe storms and heat waves, however, also happen naturally. As a result, it's tough to figure out whether any given event is a product of climate change.

A corollary to that is that detecting a signal of climate change using weather events is a serious challenge. Are three nor'easters in quick succession, as the East Coast is now experiencing, a sign of a changing climate? Or is it simply a matter of natural variability?

A team of researchers has now looked at heat waves in the US, trying to determine when a warming-driven signal will stand out above the natural variability. And the answer is that it depends. In the West, the answer is "soon," with climate-driven heat waves becoming the majority in the 2020s. But for the Great Plains, the researchers show that a specific weather pattern will push back the appearance of a warming signal until the 2070s.

Finding the heat

The study, performed by researchers at three different institutions in Florida, focuses on what they term the Time of Emergence, which they define as the point when "the signal of anthropogenic climate change will emerge against the background natural variability." For this work, they focused on heat waves, which they defined as an extended period of time with temperatures 5 degrees Celsius or more above the typical temperature.
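
To make that definition concrete, here is a minimal sketch (my own illustration, not the authors' code) that flags heat waves in a daily temperature series wherever temperatures run at least 5 degrees Celsius above the local norm; the three-day minimum used here for "extended period" is an assumption for the example, not the study's definition.

```python
def find_heat_waves(daily_temps, climatology, threshold=5.0, min_days=3):
    """Return (start, end) index pairs marking runs of days whose temperature
    sits at least `threshold` degrees C above the climatological norm for at
    least `min_days` consecutive days.  The 3-day minimum is an illustrative
    stand-in for the study's "extended period", not its actual definition."""
    hot = [t - c >= threshold for t, c in zip(daily_temps, climatology)]
    events, start = [], None
    for i, flag in enumerate(hot):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_days:
                events.append((start, i - 1))
            start = None
    if start is not None and len(hot) - start >= min_days:
        events.append((start, len(hot) - 1))
    return events

# Ten days of temperatures against a flat 21 C norm: two qualifying hot spells.
temps = [20, 21, 27, 27.5, 28, 22, 21, 26, 27, 26.5]
norm = [21.0] * 10
print(find_heat_waves(temps, norm))   # -> [(2, 4), (7, 9)]
```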

They started out by analyzing historic events, using the temperature records from 1920-2000. They found that there were regions where heat waves tended to cluster. These included the West Coast, Southern Great Plains, Northern Great Plains, and the Great Lakes (they found eight in total). While many of these regions partially overlap, a heat wave that affected one of them typically did not affect any of the others, suggesting they were driven by independent weather patterns. The four mentioned above affect the largest portions of the US population and so were chosen for further analysis.

From here, the analysis is pretty straightforward. The authors used a collection of climate models to examine the frequency of heat waves for the remainder of the present century (2020-2100) under a high emissions scenario. While nations have committed to reducing emissions, this scenario would reflect a continuation of our current trajectory. The frequency of extreme events that showed up in the models was then compared to their frequency in the historic record.

For areas like the Great Lakes and the US West, the results were about what you'd expect: with continued climate change, both the frequency and severity of heat waves went up. For the Great Plains, this was also true, but the effect was much more moderate and emerged only gradually.

To quantify this difference, the authors developed a simple measure: the year in which half of the heat waves wouldn't have qualified as heat waves if it weren't for the influence of climate change. For the US West, that point was crossed in 2028. The West was followed by the Great Lakes, which crossed the threshold a decade later in 2037. But the Great Plains were on a completely different schedule. In the Northern Plains, the 50-percent threshold wasn't crossed until 2056, while the Southern Plains didn't have a clear signal of climate change until 2074.
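
To make that measure concrete, here is a hedged sketch of the idea (my own illustration, not the authors' method): for each model year, ask how many of the simulated heat waves would still have cleared the heat-wave threshold if the model's mean warming were subtracted back out, and report the first year in which at least half would not. The subtraction approach, variable names, and toy numbers are assumptions for illustration.

```python
def emergence_year(years, event_anomalies, mean_warming, threshold=5.0):
    """Illustrative version of the 50-percent measure.  For each model year,
    `event_anomalies[year]` lists the peak anomalies (deg C above the
    1920-2000 norm) of that year's simulated heat waves, and
    `mean_warming[year]` is the model's average warming for that year.
    An event counts as climate-driven if it would fall back below the
    heat-wave threshold once the mean warming is subtracted.  Returns the
    first year in which at least half the events are climate-driven."""
    for year in years:
        events = event_anomalies.get(year, [])
        if not events:
            continue
        climate_driven = sum(a - mean_warming[year] < threshold for a in events)
        if climate_driven / len(events) >= 0.5:
            return year
    return None

# Toy example with assumed numbers: warming grows over time, so more and more
# events would not have cleared the 5-degree bar without it.
years = range(2020, 2100)
warming = {y: 0.03 * (y - 2000) for y in years}   # hypothetical warming trend
events = {y: [5.2, 5.8, 7.5] for y in years}      # hypothetical heat-wave peaks
print(emergence_year(years, events, warming))     # -> 2027 in this toy setup
```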

Explaining the Plains

So why is internal variability so significant in the Great Plains? The researchers suggest two potential causes of these regional differences. One is a difference in the flow of air across the continental US, something that may be changing with our warming climate. If the prevailing winds become more erratic, then it's possible that they would bring cooler air across the Plains more often. The alternative is soil moisture. This takes up heat from the air and ground as it evaporates, which would counteract some of the heating caused by greenhouse gases.

For the West Coast, the two appear to be related. Our warming climate is expected to produce wind patterns that reduce the frequency of storms and thus lower the amount of moisture in the soil. This, in turn, would reduce evaporation, leading to enhanced heat—which may explain why the climate signal appears there earliest.

For the Great Plains, however, the researchers identified a specific weather pattern that prevails during the summer months, called the Great Plains Low-Level Jet (LLJ). The LLJ draws moist air up from the Gulf of Mexico, allowing it to fall as rain over the Plains. The evaporation of that moisture would then offset some of the heat.

As a bit of science, this is some nice work, as the researchers have not only identified a case where natural variability has a large influence on when climate change becomes apparent, but they've also identified the source of that variability. They point out that the findings could be helpful for policy as well. Over the last three decades, they note, heat has been the biggest weather-related killer in the US. Identifying the areas most at risk of increased heat would help us prepare for a future where that heat looks increasingly inevitable.

And, in the case of the West Coast, it may be arriving in as little as a decade.


Clean Power Plan: Just Repeal, Don't Replace

Marlo Lewis

Yesterday I submitted comments on the Environmental Protection Agency’s advance notice of proposed rulemaking (ANPRM), which discusses options for replacing the Obama administration’s Clean Power Plan (CPP) with some other regulation to control carbon dioxide (CO2) emissions from existing power plants. The EPA is in the process of repealing the CPP, which was President Obama’s marquee domestic climate policy and principal regulatory component of his Paris Climate Treaty emission-reduction pledge. My comments make the case that the EPA should simply repeal the Clean Power Plan without replacing it.

In this post, I first discuss—and supplement—the ANPRM’s statutory argument for repealing the CPP. I then summarize four reasons why any CPP replacement rule would also be unlawful.

Why the CPP is unlawful

The EPA promulgated the CPP under section 111(d) of the Clean Air Act (CAA). According to the ANPRM, the CPP exceeds EPA’s authority under that provision. CAA section 111(d) emission performance standards are supposed to reflect the “best system of emission reduction” (BSER) that has been “adequately demonstrated,” taking into account “cost” and the “remaining useful life” of each “source.” The statute defines “stationary source” as “any building, structure, facility, or installation which emits or may emit any air pollutant.” Consequently, all previous CAA section 111 performance standards were based on technologies and practices that could be applied by and at the source.

Abruptly departing from the text and the agency’s historic practice, the CPP requires states to establish performance standards more stringent than any individual fossil-fuel power plant can meet through measures implemented by and at the source. For example, the CPP establishes an emission performance goal for existing coal power plants of 1,305 lbs. CO2/MWh. That is beyond the capability of even new, highly efficient supercritical pulverized coal power plants, which typically emit about 1,720 lbs. CO2/MWh.
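
As a quick sanity check on those two figures (a back-of-the-envelope illustration using only the numbers quoted above, not anything from the comments themselves):

```python
# Figures quoted above: CPP goal for existing coal plants vs. a new
# supercritical pulverized coal plant's typical emission rate.
cpp_goal = 1305.0        # lbs CO2 per MWh
supercritical = 1720.0   # lbs CO2 per MWh

shortfall = (supercritical - cpp_goal) / supercritical
print(f"Even a new supercritical plant would need to cut its emission "
      f"rate by a further {shortfall:.0%} to meet the CPP goal.")
# -> roughly 24%
```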

To comply with unattainable standards, the CPP expects power plant owners or operators to reduce grid-wide emissions in their capacity as actors in the electricity marketplace. CPP compliance options include purchasing power from lower-emitting facilities, investing in new renewable generation, buying emission credits from other facilities that over-comply, or simply reducing output, which cedes market share to lower-emitting facilities.

In effect, the CPP regulates the “U.S. power sector” as if it were a single source, with individual power plants—the actual sources as defined in CAA section 111—conceived as mere cogs in a vast machine. However, the power sector cannot be a “source” because it is not a building, structure, facility, or installation. The power sector is a market process composed of hundreds of sources, hundreds of non-emitting generating units that are not sources, and millions of customers who do not produce power.

Or, as the ANPRM sums up the issue, a valid best system of emission reduction “must be based on a physical or operational change to a building, structure, facility, or installation at that source, rather than measures that the source’s owner or operator can implement on behalf of the source at another location.”

The ANPRM requests information on how EPA might replace the CPP with a new regulation “limited to [CO2] emission reduction measures that can be applied to or at an individual stationary source.”

Why EPA should not replace the Clean Power Plan

While I completely concur that lawful CAA section 111(d) emission performance standards must be based on measures that can be applied to individual sources, even a replacement rule based on such measures would still be unlawful. Four separate statutory reasons lead to that conclusion.

Section 112 Exclusion. CAA section 111(d) excludes from its regulatory purview “any air pollutant . . . emitted by a source category regulated under CAA section 112.” CAA section 112 requires EPA to list and regulate categories of industrial sources of hazardous air pollutants, such as arsenic, mercury, and cyanide. Coal- and oil-fueled power plants have been regulated as hazardous air pollutant sources under section 112 since 2012, and natural gas combined cycle (NGCC) combustion turbines since 2004. Therefore, EPA may not regulate power plants under CAA section 111(d). The CPP is unlawful under the very provision that purportedly authorizes it. Any CPP replacement rule would be unlawful for the same reason.

Historic practice. The ANPRM suggests that BSER for existing power plants could be based on “equipment upgrades” and “good practices” that increase the efficiency by which those facilities convert heat into electricity. The improvement in thermal efficiency would, in turn, reduce CO2 emission rates. Such measures would be applied by and at the source. Nonetheless, such a BSER is inconsistent with the EPA’s practice of more than 40 years.

Until the Clean Power Plan, the EPA always based Clean Air Act performance standards for both new and existing sources on specific emission control technologies, not recipes to improve the source’s operating efficiency. It would be ridiculous, for example, to define BSER for primary aluminum plants in terms of incremental efficiency gains rather than in terms of technologies that can actually control fluoride emissions. The ANPRM’s suggested BSER is inconsistent with the statutory understanding reflected in EPA’s historic regulatory practice under Clean Air Act section 111.

Non-existent BSER. The Obama EPA claimed carbon capture and sequestration (CCS) is the “adequately demonstrated” BSER for new coal power plants. That was highly dubious, because no utility-scale CCS power plant has ever been built without hefty government subsidies. Even with subsidies, CCS power plants are not economical unless they can sell the captured CO2 to firms engaged in enhanced oil recovery. A BSER must be “broadly applicable,” but many coal power plants are not located near enhanced oil recovery operations. Besides, even the Obama EPA acknowledged that retrofitting existing power plants with CCS technology is too costly to pass muster as BSER.

The Trump EPA should acknowledge the reality that its predecessor refused to face: An adequately demonstrated best system for reducing CO2 emissions from existing power plants does not exist. Absent a bona fide BSER, Clean Air Act section 111(d) may not be used to regulate CO2 emissions from those facilities.

Contrary to congressional intent. As EPA’s 1975 implementing rule explains, one of Congress’s major purposes in enacting CAA section 111(d) was to enable EPA to control air pollutants ineligible for regulation under the national ambient air quality standards (NAAQS) program. Such pollutants may not be regulated under the NAAQS program because they are not emitted by “numerous or diverse sources.” However, carbon dioxide is emitted by both numerous and diverse sources. It is exactly the type of ubiquitous “air pollutant” Congress did not intend to be addressed by CAA section 111(d).

As the 1975 implementing rule also explains, CAA section 111(d) was designed to address air pollutants with “highly localized” effects. For such pollutants, proximity to the source chiefly determines the associated health risks. In contrast, the CO2-greenhouse effect is global, not local. Whatever the impacts of CO2 emissions on global climate, or climate change on particular communities, the potential health and welfare risks are not affected by proximity to the source.

In short, carbon dioxide and CAA section 111(d) are a total mismatch.


Pompeo, Trump and the Paris climate agreement

John Stossel

President Trump's pick to be the new secretary of state, Mike Pompeo, is not a fan of the Paris climate agreement, the treaty that is supposed to slow global warming by reducing the world's carbon dioxide emissions. Politicians from most of the world's nations signed the deal, and President Obama said "we may see this as the moment that we finally decided to save our planet."

That's dubious.

Trump wisely said he will pull America out of the deal. He called it a "massive redistribution of United States wealth to other countries."

Unfortunately, Trump often reverses himself.

The climate change lobby has been trying to change Trump's mind. Al Gore called his stance "reckless and indefensible." Most of the media agree. So do most of my neighbors in New York.

That's why it's good that Pompeo opposes the Paris deal. Such treaties are State Department responsibilities. Pompeo is more likely to hold Trump to his word than his soon-to-be predecessor Rex Tillerson, who liked the agreement.

The Paris accord is a bad deal because even if greenhouse gases really are a huge threat, this treaty wouldn't do much about them.

I'll bet Al Gore and most of the media don't even know what's in the accord. I didn't until I researched it for this week's YouTube video.

Manhattan Institute senior fellow Oren Cass is the rare person who actually read the Paris accord.

Cass tells me it's "somewhere between a farce and a fraud." I interviewed him for a video project I am doing with City Journal, a smart policy magazine that often makes the case for smaller government. "You don't even have to mention greenhouse gases in your commitment if you don't want to. You send in any piece of paper you want."

The Paris accord was just political theater, he says. "They stapled it together and held it up and said, 'This is amazing!'"

The media announced that China and India made major commitments.

In truth, says Cass, "They either pledged to do exactly what they were already going to do anyway, or pledged even less. China, for instance, said, 'We pledge to reach peak emissions by about 2030.' Well, the United States government had already done a study to guess when Chinese emissions would peak, and their guess was about 2030."

In other words, China simply promised to do what was going to happen anyway.

"China was actually one of the better pledges," says Cass. "India made no pledge to limit emissions at all. They pledged only to become more efficient. But they proposed to become more efficient less quickly than they were already becoming more efficient. So their pledge was to slow down."

It's hard to see how that would help the planet.

"My favorite was Pakistan, whose pledge was to 'Reach a peak at some point after which to begin reducing emissions,'" says Cass. "You can staple those together, and you can say we now have a global agreement, but what you have is an agreement to do nothing."

However, Cass says one country did make a serious commitment. "The one country that showed up in Paris with a very costly, ambitious target was the United States. President Obama took all the zero commitments from everybody else but threw in a really expensive one for us."

Obama pledged to reduce US emissions by 26 percent below 2005 levels. If that ever happened, it would squash America's economy.

Nevertheless, when Trump said he was leaving the Paris accord, he was trashed by politicians around the world.

The UK's Theresa May was "dismayed," and Obama said, "This administration joins a handful of nations that reject the future."

Cass counters that if "the future is worthless climate agreements ... we should be proud to reject."

Don't get me wrong: The Earth has been warming, and humans probably contribute to it.

But the solution isn't to waste billions by making emissions cuts in America while other countries do nothing.

Trump was right to repudiate this phony treaty. It's good that Pompeo is around to remind him of that.


Greenie versus Greenie in Massachusetts

NAHANT — Since 1967, scientists at Northeastern University’s Marine Science Center have quietly gone about their work, studying ocean life from East Point, a spectacular rocky bluff that juts into the Atlantic Ocean and was once home to Henry Cabot Lodge’s estate and a World War II bunker.

But that scenic outpost has turned into a bitter battleground. Neighbors are fighting Northeastern’s proposal to build a 60,000-square-foot addition to the center as part of an ambitious plan to turn it into a nationally regarded coastal sustainability institute.

In an ironic twist, residents assert the institute dedicated to protecting vulnerable coastal communities will instead ruin the natural beauty of East Point — one of Nahant’s most cherished spots — and make the state’s smallest town feel more like a heavily traveled college campus.

“No matter how you design it, a 60,000-square-foot building on what we call Nahant’s last wild area will destroy it,” said Jim Walsh, a former selectman.

Jim Dolan, a retired high-tech worker who raised five children in Nahant, said he is so angry at Northeastern’s “total disrespect and total disregard” for the town he’s ready to set fire to the master’s degree he earned from the university in the 1970s.

“How about if we get everyone from Nahant who has a Northeastern degree to go to the board of directors meeting and burn them all?” Dolan said. “I don’t do that lightly. I’m a big education guy. But having a university take a position to not honor its neighbors is unconscionable.”

Such is the intensity of the opposition in Nahant, a one-square-mile peninsula that is home to 4,000 residents and has been a haven for wealthy families since the 19th century.

Hundreds of signs on front lawns in town declare, “Love Nahant, No Northeastern Expansion.” Residents have picketed outside the gates of the center and packed a Town Hall meeting last month, booing when Northeastern officials presented their expansion plans.

Last week, the bad blood reached a boiling point when Northeastern canceled a lecture at the center titled, “Nitrogen: Friend or Foe? Effects of Fertilization on a New England Salt Marsh.” University officials feared that residents who had been posting hostile messages on social media would disrupt the talk.

“It’s been really hard coming through town, the town that I’ve been driving through for 30 years, seeing such hostility,” said Geoffrey Trussell, the center director, who has worked there for three decades, studying ocean predators.


Green/Left governments want us to use public transport

But that puts us in the hands of bureaucrats who don't give a sh*t about us.  The story below is from the Australian city of Brisbane.  The Brisbane train system is actually one of the best in Australia's capital cities.  Sydney commuters have it much worse.  So it is interesting to see what counts as a good system below.  Nobody gives a sh*t in Brisbane either

SCHEDULED maintenance has caused public transport chaos on the night of Ed Sheeran’s first concert at Suncorp Stadium in Brisbane.

Passengers leaving from the city on the Caboolture/Sunshine Coast and Redcliffe lines were being moved on to buses at Northgate station and told to expect delays of up to an hour on their journey.

Buses replaced trains between Northgate and Petrie stations for the remainder of the evening.

As reported by The Courier-Mail, TransLink announced the works – maintenance on overhead powerlines – two months ago, warning commuters that buses would be used from 9.30pm onwards, before tracks reopened in the morning. That particular maintenance work was only scheduled for last night and will not impact tonight’s show.

Concert goer Katherine Lameree didn’t arrive home at Dakabin until after 1am due to the maintenance work. The gig finished at 10.30pm. Ms Lameree said it took her 30 minutes to reach the station.

Ms Lameree and her partner got off the train at Northgate, where they were forced to join the queue for the waiting bus.

“The lines were up the ramp for the overpass to get to the busses,” she said. “There was one waiting and they couldn’t keep up with the demand.”

Ms Lameree ended up calling a friend from the station and instead got a lift home, but was left disappointed that the maintenance went ahead despite the event.

“They knew the event was on, they were partnered with it offering free transport. Surely it could’ve waited until Thursday or be done in off peak during the day,” she said.

“A lot of people voiced it (frustration) on the train… but we all were like do we expect any better from Queensland Rail.”

A TransLink spokesman told The Courier-Mail last week of the track closure from Northgate to Petrie affecting the Redcliffe Peninsula and Sunshine Coast lines, encouraging Ed Sheeran fans to plan ahead. They did not give a reason why the closure was scheduled for that particular night.

Despite the warning, many Ed Sheeran fans were angry that TransLink chose the night of a major Brisbane event to conduct the maintenance.

In a Facebook comment, concert goer Ashley Darrenkamp called Queensland Rail “utterly ridiculous” for scheduling maintenance on the same night as the 52,000-capacity sellout gig.

“They really messed up! I had to end up finding another way home, costing heaps of money!” she wrote.

“Having to wait for buses to then stop at every station then to catch another train... very upset. I was fully aware and expecting delays due to high volumes but this was unacceptable.”

Another fan, Jessica Hopwood, said she was also caught off guard by the maintenance.

“Traffic to Roma street station from the concert took 40 minutes then been told at the platform to get off the train at Northgate, then waiting in line for 20 minutes for a bus,” she said.






Wednesday, March 21, 2018

An amusing sermon from a true believer, Peter Kraai

He seems to be a passionate believer in global warming but shows zero familiarity with science.  Science depends on numbers and he offers not a single number in support of anything he says. And the things he does say are so vaguely put that you would need a book to critique them all

But one of his assertions that is unambiguously wrong is his claim that only a small part of the Antarctic is melting. Zwally's 2015 study showed that the Antarctic ice sheet as a whole is gaining ice, not losing it.  And since the Antarctic holds 96% of Earth's glacial ice, those rising sea levels beloved of Warmists are just not going to happen.  Zwally's findings and many others like them are evidence enough to discredit the whole global warming theory but Mr. Kraai is ignorant of them.  He just believes what suits him.

The way he puffs up his chest as someone concerned for the future of "the children" shows what his motivation is.  He wants to be seen as on the side of the angels and damn the facts.  He is doing virtue signalling, not any serious discussion of the evidence.  His confidence in the truth of his delusions is remarkable, though

To the Editor:

I trust that you have defaulted into printing op-eds from Cal Thomas because you have paid for a contract to do so -- even if the content is tripe or a fable. However, consider the moral of his recent fable ("Apocalypse now and the $6,000 Costco meal," March 15, 2018) -- that, despite the decades of massive, peer-reviewed support for anthropogenic climate change, we should listen to charlatans funded by fossil fuel companies and ignore the corrections necessary (and readily available) to provide our offspring with a healthy future. This is indeed immoral.

The book he cites, Marc Morano's "Politically Incorrect Guide To Climate Change," is also scientifically incorrect. Not because there is no truth within, but because it is at least 10,000 respected studies away from the whole truth, and does not contain "nothing but the truth." To say Morano's pamphlet rises to the intellectual rigor of a comic book is to insult comic books; they try to get the science right. Thomas is neither stupid nor uneducated (so he can understand the basic science if he chooses), therefore he must be complacent or venal enough to allow our children's planet to fall further into disrepair.

No competent scientist has ever believed climate change (which, by the way, Cal, includes global warming -- they're not the same) "will destroy all life on Earth." However, neither will any of us deny that, through our obscene and opulent overuse of the world's resources, we have entered into the sixth mass extinction event. Yes, life will continue, but lacking many if not most of the algae, plants, animals and ecosystems that keep our lives viable, beautiful and awe-inspired.

As for climatologists (who interpret multi-year, -century, and -millennia trends rather than tell us what may happen in the sky tomorrow, as Thomas' meteorologist source John Coleman does), and their predictions, "none of which have materialized":

Ocean acidification, dilution by fresh water, and altered flow of climate-determining currents have materialized.

More flooding from bigger hurricanes (note, Cal, I didn't say more frequent storms) has materialized.

Progressive melting of the great majority of mountain range ice- and snow-caps, glaciers, permafrost and poles (except, Cal, as you stated, a small part of Antarctica) has materialized.

Wars over dwindling water, soil and food have materialized.

Drastically mutated terrestrial growth zones (normally stable for thousands of years) have materialized.

Perhaps most paradoxical about Thomas' indifference to Earth's (and its life forms') chronic degradation is his frequently revisited self-identification as a God-fearing Christian. As God views the destruction perpetrated by fossil fuel apologists on this used-to-be-Eden he created, the Almighty must surely be dusting off his smiting instruments. And I think even Thomas would agree with what Jesus wouldn't do: pat the egocentric and materialistic on the back while suggesting they continue their consumptive ways as the poorest and those least contributory to the global mayhem suffer the most.

But then, I'm only a "climate change fanatic," according to Thomas, willing to set aside my nonessential desires so that subsequent human generations and the other 2 million species can reclaim the unsullied life they also deserve. At least this fanatic can look his children and students in their eyes and truthfully say, "I'm on your side."


Scott Pruitt Will End EPA’s Use Of ‘Secret Science’ To Justify Regulations

Environmental Protection Agency (EPA) Administrator Scott Pruitt will soon end his agency’s use of “secret science” to craft regulations.

“We need to make sure their data and methodology are published as part of the record,” Pruitt said in an exclusive interview with The Daily Caller News Foundation. “Otherwise, it’s not transparent. It’s not objectively measured, and that’s important.”

Pruitt will reverse long-standing EPA policy allowing regulators to rely on non-public scientific data in crafting rules. Such studies have been used to justify tens of billions of dollars worth of regulations.

EPA regulators would only be allowed to consider scientific studies that make their data available for public scrutiny under Pruitt’s new policy. Also, EPA-funded studies would need to make all their data public.

“When we do contract that science out, sometimes the findings are published; we make that part of our rule-making processes, but then we don’t publish the methodology and data that went into those findings because the third party who did the study won’t give it to us,” Pruitt added.

“And we’ve said that’s fine — we’re changing that as well,” Pruitt told TheDCNF.

Conservatives have long criticized EPA for relying on scientific studies that published their findings but not the underlying data. However, Democrats and environmental activists have challenged past attempts to bring transparency to studies used in rule making.

Texas Republican Rep. Lamar Smith pushed legislation to end the use of what he calls “secret science” at EPA. In 2017, Pruitt instituted another policy backed by Smith, barring EPA-funded scientists from serving on agency advisory boards.

“If we use a third party to engage in scientific review or inquiry, and that’s the basis of rulemaking, you and every American citizen across the country deserve to know what’s the data, what’s the methodology that was used to reach that conclusion that was the underpinning of what — rules that were adopted by this agency,” Pruitt explained.

Pruitt’s pending science transparency policy mirrors Smith’s HONEST Act, which passed the House in March 2017. Smith’s office was pleased to hear Pruitt was adopting another policy the House Committee on Science, Space and Technology chairman championed.

“The chairman has long worked toward a more open and transparent rule-making process at EPA, and he looks forward to any announcement from Administrator Pruitt that would achieve that goal,” committee spokeswoman Thea McDonald told TheDCNF.

Junk science crusader Steve Milloy also called on EPA to end its use of “secret science” in rule making, especially when it comes to studies on the toxicity of fine particulates in the air.

EPA has primarily relied on two 1990s studies linking fine particulate pollution to premature death. Neither study has made its data public, but EPA used their findings to justify sweeping air quality regulations.

Reported benefits from EPA rules are “mostly attributable to the reduction in public exposure to fine particulate matter,” according to a White House Office of Management and Budget report. That’s equivalent to billions of dollars.

In fact, one of EPA’s most expensive regulations on the books, the Mercury and Air Toxics Standards (MATS), derived most of its estimated benefits from reducing particulates, not from reducing mercury, which the rule was ostensibly crafted to address.

EPA estimated MATS would cost $8.2 billion but yield between $28 billion to $77 billion in public health benefits. It’s a similar story for the Clean Power Plan, which EPA estimated would cost $8.4 billion and yield from $14 billion to $34 billion in health and climate benefits.

Democrats and environmentalists have largely opposed attempts to require the EPA to rely on transparent scientific data. Opponents argue that such a requirement would restrict the number of studies the EPA can use, and that making data public would reveal confidential patient information.

“A lot of the data that EPA uses to protect public health and ensure that we have clean air and clean water relies on data that cannot be publicly released,” Union of Concerned Scientists representative Yogin Kothari told E&E News.

“It really hamstrings the ability of the EPA to do anything, to fulfill its mission,” Kothari said.

Milloy, however, countered and argued it’s a “red herring” to claim that forcing regulators to use public science data would harm patient privacy.

“The availability of such data sets is nothing new,” said Milloy, publisher of JunkScience.com and a senior fellow at the Energy and Environmental Legal Institute.

“The state of California, for example, makes such data available under the moniker, ‘Public Use Death Files,'” Milloy said. “We used such data in the form of over two million anonymized death certificates in our recent California study on particulates and death.”

“Opponents of data transparency are just trying to hide the data from independent scrutiny,” Milloy added. “But the studies that use this data are taxpayer-financed, and they are used to regulate the public.”


Can climate litigation save the world?

Global moves to tackle climate change through lawsuits are poised to break new ground this week, as groups and individuals seek to hold governments and companies accountable for the damage they are causing.

On Tuesday, action by 12 UK citizens reaches the high court for the first time, while on Wednesday in San Francisco, the science of climate change will effectively be on trial at a key moment in a lawsuit.

The litigation represents a new front of climate action, with citizens aiming to force stronger moves to cut carbon emissions, and win damages to pay the costs of dealing with the impacts of warming.

They are inspired by momentous cases from the past, from the defeat of big tobacco to the racial desegregation of schools in the US. Big oil is fighting back hard, but though victories have been rare to date wins are more likely in future, as legal experts say the attitudes of judges often shift with the times.

A flurry of billion-dollar cases against fossil fuel companies brought by New York city and communities in California over the rising seas has pushed climate litigation into the limelight. But cases are being brought across the globe, with more than 1,000 suits now logged by the Sabin Center for Climate Change Law at Columbia law school in New York.

The UK government is now facing its first major climate change lawsuit, brought by 12 citizens through a legal group called Plan B and which already has the support of the government’s former chief scientific adviser, Prof Sir David King.

“The UK carbon target for 2050 does not match the Paris agreement goal and the government knows that,” says Tim Crosland, a barrister at Plan B. He says the purpose of the case is to make the government live up to its responsibilities: “It is about closing the accountability deficit which is one of the biggest problems with climate change – if everybody is responsible, nobody is responsible.”

The UK has had climate laws in place for a decade and is seen by some as a leading nation, but Crosland argues the great dangers of global warming make this irrelevant. “Either we don’t want to fall off the climate cliff edge or we do. Who is doing better than others is the wrong question.”

On Wednesday meanwhile, a landmark case in California, in which the cities of San Francisco and Oakland are suing major oil companies for damages, reaches an unprecedented moment with a day-long hearing on the science of climate change itself.

Further cases are under way from India to Uganda, and across Europe including the UK, Ireland, Belgium, Portugal and Norway, where campaigners are seeking to block oil drilling in the Arctic. In Colombia, 25 young plaintiffs are taking to the courts to halt deforestation.

Lawyers won a rare victory – now under appeal – in the 2015 Urgenda case in the Netherlands, with the court ruling the Dutch state must increase its cuts to emissions.

A Pakistani farmer has also won a ruling that the “lethargy of the state in implementing [climate policies] offends the fundamental rights of the citizens”.

And a Peruvian farmer is suing German energy company RWE over its alleged contribution to the melting glaciers near his Andean hometown.

But it is in the US, the world’s most litigious nation, that the greatest number of cases have been brought. The most high-profile suit against the government is the Juliana case, filed by 21 teenagers in Oregon, which saw off a Trump administration attempt to halt it earlier in March.

The basis of the case, says Julia Olson, lead counsel for Our Children’s Trust which is fighting the case, is failure of US administrations to protect its citizens by tackling global warming. “The US government has put these plaintiffs and other young people in a dangerous situation. First and foremost it is about personal security and the danger that exists presently. Beyond that, it is about protecting other fundamental rights under the US constitution – basic liberties such as to be able to decide where to live and to raise a family safely.

“I can’t convey how egregious and incredible this story is across every presidential administration of our government going back 60 years,” Olson says. “This is not a Republican versus Democrat issue. Every president made those choices.”

While lawsuits against governments seek stronger action, those against the fossil fuel industry seek a simpler remedy – money. The argument is that these companies knowingly sold products that caused damage and a financial settlement is required, drawing parallels with the titanic legal battle fought and won against the tobacco industry.

Michael Burger, at the Sabin Center, says there is a similarity in that these growing cases threaten a level of legal and reputational risk to the companies that might eventually force them to settle or face financial oblivion.

“The other obvious similarity with tobacco is you have a long history of corporate obfuscation and attempts to blur the science and public understanding,” he says. Some of the characters involved are even the same – on both sides – with the same scientists defending both tobacco and oil companies and the same lawyers prosecuting them.

“But there is also a key difference,” says Burger. “Tobacco is a product individual people inhale and it gives them cancer. Fossil fuel production is the beginning of a long chain of causation that includes numerous corporate actors and individual consumers, as well as government licensing and permitting schemes. The causal chain is much longer.”

Scientists are now confident they can quantify the emissions resulting from each big company’s fossil fuels – just 90 firms are responsible for two-thirds of all emissions. But lawyers for the fossil fuel companies are robust in their response, saying a causal link to damages is “unprovable”.

The oil giants are also fighting back, with ExxonMobil in January seeking court permission to “investigate potential claims of abuse of process, civil conspiracy, and constitutional violations” by the Californian officials suing them. One of them, Serge Dedina, mayor of Imperial Beach, says: “This appears to be the same kind of bullying tactic that the industry uses again and again to avoid accountability.”

Industry lobby groups are also mobilising against climate litigation in the US, with the National Association of Manufacturers (NAM) launching a campaign against “politically motivated legal attacks” in November. “It has become clear that these activist plaintiffs’ attorneys, sympathetic academics and agenda-driven media outlets are distorting the use of tort litigation to advance their narratives with the ultimate objective of undermining manufacturers and the engine of the American economy,” said Linda Kelly, NAM general counsel.

However, an international panel of senior judges concluded in January that many companies around the world may well already be in breach of existing laws in relation to their impact on climate change. “Very, very few enterprises currently meet their obligations – if they did [climate change] would mostly be solved,” said Jaap Spier, who was advocate general in the Dutch supreme court until 2016 and part of the panel that published the assessment.

Spier says judges are influenced by growing concerns in society, such as worries over climate change, and are increasingly likely to look favourably on climate litigation in coming years. “If you assume companies don’t [change] at some stage, I have not the slightest doubt that courts will understand that they must step in.”

More and more climate cases are being filed, with lawyers suggesting a range of factors, from the election of Donald Trump to more extreme weather events, to revelations about what fossil fuel companies knew about climate change dangers, and a growing awareness of the urgent need to act.

Despite this, major victories at a supreme court level still appear years off. But Burger says even wins along the way to the highest court add to the pressure for change: “There is a victory here which is just surviving the initial motions to dismiss. Having courts advance these cases towards a trial could itself move fossil fuel companies to want to start to seek some other solution.”

Nick Butler, who spent 29 years at BP and is now at King’s College London, says the companies do not believe the point has been reached where they are likely to lose cases but the pressure is real nonetheless: “The legal actions add a further dimension to the pressure for change in an industry that has begun to accept the need to reinvent itself.”

Olson argues that the courts are starting to recognise the urgency and highlights that the district judge in the Juliana case, Ann Aiken, said climate change needs to be addressed with “all deliberate speed”. That is a potent phrase in the US, taken from the 1955 supreme court judgment on the Brown v Board of Education case that ordered the end of racial segregation of schools.

“We need a decision that says you cannot discriminate against young people and deprive them of a climate system that will sustain their lives,” she says.

“[Aiken] very much understands that constitutional rights are at stake and that speed is the critical factor here. I think we will be on a very fast track.”


Environmentalist Publishes Op-Ed on Climate Change… Covers it With Blatant Lies

An anti-fossil fuel movement proponent dubiously claimed Tuesday that natural gas development’s methane emissions are hitting catastrophic levels.

Activists are failing to impress upon people the dangers associated with the fracking industry, according to Vermont’s Middlebury College Professor Bill McKibben. He also suggested most research shows methane emissions from natural gas are rising above a safe level, yet many studies show the opposite.

“When I think about my greatest failing as a communicator — and one of the greatest failings of the climate movement — it’s not that global warming still continues,” McKibben wrote Wednesday for Yale Environment 360.

The movement’s biggest moral failing, he said, was not selling people on the danger unchecked methane emissions pose to the climate.

Democrats, Republicans and the public have generally accepted the idea natural gas is a fine alternative to other forms of fossil fuel production, but the general population is unaware methane emissions from such energy put the climate in a precarious spot, McKibben added.

“It turns out that there are lots of places for leaks to happen — when you frack a field, when you connect a pipe, when you send gas thousands of miles through pumping stations — and so most studies show that the leakage rate is at least three percent and probably higher,” he noted without citing any specific study buttressing his claim.

McKibben relied on data from Cornell University ecology professor Robert Howarth’s studies to conclude methane emission leakage rates were nearly three percent, he told The Daily Caller News Foundation.

Howarth’s work has been criticized in the past for using too short a time frame. He uses a 20-year window to study the global warming potential of methane emissions in the atmosphere, as opposed to the more common 100-year horizon.

Environmental groups have also scrutinized Howarth’s work.

“While I can see an argument for using a time horizon shorter than 100 years, I personally believe that the 20-year GWP is too short a period to be appropriate for policy analysis,” former Natural Resources Defense Council director Dan Lashof said in 2011 of Howarth’s methodology.
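
To see why the choice of time horizon matters so much, here is a rough sketch using approximate IPCC AR5 global warming potentials for methane (about 84 over 20 years and about 28 over 100 years, without climate-carbon feedbacks). The function, the pure-methane simplification, and the per-tonne framing are my own assumptions for illustration, not a lifecycle analysis; the leakage rates compared are the roughly 3 percent figure McKibben cites and the roughly 1.2 percent EPA estimate mentioned below.

```python
# Approximate IPCC AR5 global warming potentials for methane (no feedbacks).
GWP = {"20-year": 84, "100-year": 28}

def co2_equivalent(leak_rate, horizon, gas_burned_kg=1000.0):
    """CO2-equivalent of the methane leaked while delivering `gas_burned_kg`
    of natural gas, under the chosen GWP horizon.  Treats natural gas as
    pure methane -- a simplifying assumption for illustration."""
    leaked_kg = gas_burned_kg * leak_rate
    return leaked_kg * GWP[horizon]

for rate in (0.03, 0.012):          # ~3% (Howarth/McKibben) vs ~1.2% (EPA)
    for horizon in ("20-year", "100-year"):
        print(f"leak {rate:.1%}, {horizon}: "
              f"{co2_equivalent(rate, horizon):,.0f} kg CO2e per tonne of gas")
```

Under the 20-year horizon the same leak looks roughly three times as damaging as under the 100-year horizon, which is why the choice of window is the crux of the dispute over Howarth's numbers.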

Environmental Protection Agency research and other studies, meanwhile, paint a much different story.

Actual emissions from gas power plants were “nearly 50 times lower than previously estimated by the Environmental Protection Agency,” a 2013 University of Texas study found. Researchers at UT concluded methane emissions from the supply chain’s upstream portion are 0.38 percent of production.

EPA’s latest methane emissions data from 2017 show very low methane leakage rates of approximately 1.2 percent. The agency’s and UT’s analyses both used the more reliable 100-year time frame.

McKibben has spent several years trashing Democratic leaders for promoting the natural gas industry.

McKibben was singing a different tune in 2009 when he felt so strongly about power plants switching to natural gas he was willing to be jailed in support of the cause. He was one of several celebrities who protested on Capitol Power Plant’s front steps in Washington, D.C.

“There are moments in a nation’s — and a planet’s — history when it may be necessary for some to break the law … We will cross the legal boundary of the power plant, and we expect to be arrested,” McKibben told reporters prior to the March 3, 2009, protest.

“(I)t would be easy enough to fix. In fact, the facility can already burn some natural gas instead, and a modest retrofit would let it convert away from coal entirely. … It would even stimulate the local economy,” he added.

A version of this article appeared on The Daily Caller News Found


Trudeau’s carbon tax plan is close to blowing up in his face

The carbon-tax system isn’t a tax grab. It’s an economic bulldozer. A carbon tax in a low pollution country with endless forests like Canada is ludicrous. The idea that it is to be collected by the provincial government, sent to the Federal government, and then returned to the provinces makes absolutely NO sense

Things have turned very much Jim Karahalios’s way lately, and they might not be done yet. If you haven’t heard of Karahalios, he was the noisy member of the Ontario Progressive Conservatives persecuted by his own party for refusing to let former leader Patrick Brown get away with making carbon taxes an official policy. Although Karahalios clearly spoke for most members, Brown was determined to stick with his carbon tax — and to muzzle Karahalios and his “Axe the Tax” campaign, which has since expanded to every province. Karahalios was even tossed out of PC events and stripped of his PC membership.

With Doug Ford now leading the party into a spring election, the Ontario PC party looks less like Brown’s than Karahalios’s. Karahalios got his official apology from the party earlier this month (and had the lawsuit appeal dropped). And with Canada’s largest province looking like it might soon be on the same warpath as other provinces against the federal Liberals over the carbon tax, the whole country could soon look more like Karahalios’s sort of place than Prime Minister Justin Trudeau’s.

Until now, most pundits have taken the federal Liberals’ word that the carbon tax is going to happen, whether provinces sign on to it or not. No one’s really questioned the legitimacy of Trudeau’s threat to use a “backstop” power that would see Ottawa collecting a price-fixed carbon tax within a particular province if the province itself will not. Even the National Post’s estimable Andrew Coyne suggested not long ago that the Ontario PC leadership candidates’ “declaration of opposition to a carbon tax is… meaningless”: Since Ottawa will levy the tax itself if it has to, “the tax will be collected” whether they liked it or not. In reality, though, a growing number of provinces are girding for battle in what could be a federal-provincial showdown for the ages. Far from being certain of getting its way, the federal government likely lacks the weapons it needs to win.

Federal Environment Minister Catherine McKenna’s tough talk this week warning the Saskatchewan government she was coming to get their carbon taxes (while revoking their $62-million low-carbon grant) could be a bluff. In a few months, she might be trying the same bluff on Ontario, if polls hold up and the next premier ends up being Ford. He has been fiercely decrying the carbon tax, insisting the “reckless” and “job-killing” tax would “do great damage to Ontario.” He promises to “take Justin Trudeau to court” to stop it.

And by next year, Ford could be teaming up in that fight not just with Saskatchewan’s Premier Scott Moe, but likely Jason Kenney in Alberta, who is remarkably popular and currently on track for a landslide victory to become the province’s first United Conservative Party premier in 2019, with a campaign built almost entirely on a promise to axe Alberta’s carbon tax. Heck, even before then, Alberta’s NDP Premier Rachel Notley said Thursday that she’s now refusing to raise Alberta’s carbon taxes to meet Trudeau’s minimum rates if he doesn’t stop B.C. from blocking the Trans Mountain pipeline.

Together, the provinces representing half the Canadian population now are or soon might be arming for a carbon-tax war with Ottawa. That doesn’t even count Manitoba, which is still refusing to align its own less-burdensome carbon-tax plan with Ottawa’s pricing scheme. Or New Brunswick, whose plan is also at odds with federal requirements. Or Nova Scotia, whose cap-and-trade scheme doesn’t come close to meeting McKenna’s stipulations. (Newfoundland and P.E.I. haven’t revealed their plans yet.)

So how many provinces exactly do Trudeau and McKenna think they can successfully fight? And if they thought trying to force wildly unpopular small-business taxes down people’s throats was a fiasco, wait till they see how ugly things get trying to force a carbon tax on hostile Canadians who loathe it even more than their premiers do; the premiers, at least, are tempted by the potential cash grab.

Yet McKenna sounded positively blithe this week about the ease with which she plans to deploy her “backstop” weapon, in responding to a letter from Saskatchewan’s Environment Minister Dustin Duncan, who said the province could not accept her carbon tax. McKenna’s response: If Duncan’s government didn’t start taxing carbon at the minimum price she requires, “we would have no choice but to ensure that a price on pollution applies …. We would do so by applying the federal carbon pricing system in Saskatchewan.” It sounds so simple. Except the closer you examine her position, the weaker it seems.

There is already the matter of the constitutional clash that will form the basis of the Ford court case, over whether the feds even have the power to tax carbon, especially connected to resources. University of Saskatchewan constitutional law expert Dwight Newman thinks the provinces have “more of a case than a lot of people are giving them credit for.”

More to the point, no one has explained the logistics that would let McKenna make good on her ultimatum. The ability of a Canadian government to implement policy has always relied on the co-operation of provincial governments, notes Newman. Indeed, B.C.’s current NDP government, with its pipeline-stalling mischief, is right at this moment revealing just how powerless the federal government can appear when a province refuses to play along with it. Besides that, Newman points out that the federal government has never tried exercising the power to levy a specific tax on Canadians of one province but not another. Even if this one were willing to try, in the face of so many obvious constitutional problems, does it really have the tools and the stomach to monitor and bill for the carbon use of every farmer, factory, fuel pump and furnace in Saskatchewan?

McKenna makes it all sound so simple and straightforward, from her perch there on the edge of a minefield of untold and unprecedented legal and logistical difficulties. But this is a government that has yet to succeed in executing on anything remotely this complicated. At least it will be entertaining watching McKenna trying to collect carbon bills from ill-disposed farmers in Melfort, without an ounce of help from the province or municipalities.






Tuesday, March 20, 2018

Thanks To Global Warming We May Run Out Of Cities To Host Future Winter Olympic Games

This is just modelling rubbish. There are many large cities in the North of the world where winter temperatures fall way below zero.  A 2 degree rise in global temperature would hardly touch them.  They would still be way below zero

If you’ve been enjoying the 2018 Winter Olympic Games, we have bad news for you. Climate change could be jeopardizing the future of this popular quadrennial sporting event.

Competitions such as ski jumping, bobsleigh racing, and snowboarding require a lot of snow and ice, which means host countries should really have average daily temperatures of below freezing.

Sadly, thanks to global warming, locations that have traditionally been perfect for the Games may not be up to scratch by the mid-century. This is the conclusion of researchers at the University of Waterloo in Ontario, Canada, who published their initial findings in 2015 in the journal Current Issues in Tourism. The study has recently been updated to include Pyeongchang (2018) and Beijing (2022).

The team, led by geography professor Daniel Scott, analyzed climate data from the former host locations and used models to predict the effect climate change will have on February temperatures throughout the next century. First, they considered a low-emission scenario, where average global temperatures increase by 2.6°F by 2100. Then, they considered a high-emission scenario, where average global temperatures rise by 8.5°F.

Shockingly, nine former sites would be considered “unreliable” or “high risk” hosts by 2080 under the low-emission scenario. This rises to 13 under the high-emission scenario. By 2050, between eight (low-emission) and nine (high-emission) locations would already be judged “unreliable” or “high risk”.

So, what can be done about it?

Well, one solution is to use artificial snow. This is made by pumping highly pressurized water through tiny nozzles; the fine mist freezes in cold air and transforms into "snow". There is one little problem, however.

“You’re relying on cold air to do the refrigeration for you,” Scott told the New York Times. This won’t happen if the air is above freezing.

Alternatively, you could bank snow from an earlier winter or cover bales of straw with a combination of artificial snow and natural snow excavated from somewhere colder. This is what they did in Sochi (2014) and Vancouver (2010) respectively, when temperatures were above freezing.

However, both times athletes complained of poor conditions.

There is the option to bring competitions inside, though this might work a little better for figure skating than it does for alpine skiing.

More likely, we'll see the list of possible locations shrink and the same sites will take it in turns to hold the Winter Olympic Games.


FEMA is preparing for the future. “Climate change” isn’t part of it

The United States is still reeling from last year’s megadisasters. Puerto Rico lingers in the longest blackout in US history after Hurricane Maria tore through the island, and the scorched earth left behind from record-breaking fires in California is now causing floods and mudslides.

The Federal Emergency Management Agency, which was on the front lines of many of these calamities, had to go back to Congress last September to ask for billions more dollars to handle the gargantuan relief efforts.

With some of the dust settled, it’s clear that the events in 2017 fit the pattern of extreme weather we expect as average global temperatures go up, with strong climate change signals emerging in fires and rainfall.

FEMA has responded, and will continue to respond, to climate change-influenced disasters. But the agency’s new strategic plan for 2018-2022, released Thursday, doesn’t mention climate change or global warming at all. That’s despite the fact that the 38-page document projects more frequent and more expensive disasters. This is a glaring omission from an agency that deals with anticipating and responding to weather extremes driven in part by a changing climate.

The damages from the hurricanes, heat waves, wildfires, and tornadoes in 2017 cost at least $306 billion.

“Disaster costs are expected to continue to increase due to rising natural hazard risk, decaying critical infrastructure, and economic pressures that limit investments in risk resilience,” according to the document.

FEMA did signal that it wants to invest more in “pre-disaster mitigation,” which may or may not include climate change. But it wasn’t so coy in its previous 2014-2018 strategic plan, which noted that the agency “will also ensure that future risks, including those influenced by climate change, are effectively integrated into the Agency’s risk assessment resources and processes.”

Dropping climate change from FEMA’s strategic plan is just the latest part of the Trump administration’s long pattern of erasing climate change as a public policy issue across the federal government. Agencies including the Department of Energy, the Environmental Protection Agency, and the Department of the Interior have removed climate change language from many of their websites. Other agencies have limited access to policy and technical documents dealing with climate change. And some government research grant applications faced extra scrutiny and rejection for mentioning the “double C-word.”

It’s not just rhetoric. President Trump wants the United States to back out of the Paris climate agreement, and the EPA is working to undo the main policy for restricting greenhouse gases, the Clean Power Plan. The White House’s latest budget proposal seeks a 72 percent cut to clean energy research and reduces FEMA’s budget by almost $600 million from $16.1 billion.


The bias towards bad news is getting worse, and affecting how we act

“Deadly new epidemic called Disease X could kill millions, scientists warn,” read one headline at the weekend. “WHO issues global alert for potential pandemic,” read another. Apparently frustrated by the way real infectious diseases keep failing to wipe us out, it seems that the nannies at the World Health Organisation have decided to invent a fictitious one.

Disease X is going to be a virus that jumps unexpectedly from an animal species, as happens from time to time, or perhaps a man-made pathogen from a dictator’s biological warfare laboratory. To be alert for such things is sensible, especially after what has happened in Salisbury, but to imply that the risk is high is irresponsible.

No matter how clever gene editors get, the chances that they could beat evolution at its own game and come up with the right combination of infectiousness, lethality and viability to spread a disease through the human race are vanishingly small. To do so in secret would be even harder.

I fear the only effect of the WHO’s decision could be to cause unnecessary alarm and damage public confidence in the very technology that brings more effective cures and vaccines for known and unknown diseases. It also feeds our appetite for bad news rather than good. Almost by definition, bad news is sudden while good news is gradual and therefore less newsworthy. Things blow up, melt down, erupt or crash; there are few good-news equivalents. If a country, a policy or a company starts to do well it soon drops out of the news.

This distorts our view of the world. Two years ago a group of Dutch researchers asked 26,492 people in 24 countries a simple question: over the past 20 years, has the proportion of the world population that lives in extreme poverty…

1) Increased by 50 per cent?

2) Increased by 25 per cent?

3) Stayed the same?

4) Decreased by 25 per cent?

5) Decreased by 50 per cent?

Only 1 per cent got the answer right, which was that it had decreased by 50 per cent. The United Nations’ Millennium Development goal of halving global poverty by 2015 was met five years early.

As the late Swedish statistician Hans Rosling pointed out with a similar survey, this suggests people know less about the human world than chimpanzees do, because if you had written those five options on five bananas and thrown them to a chimp, it would have a 20 per cent chance of picking up the right banana. A random guess would do 20 times as well as a human. As the historian of science Daniel Boorstin once put it: “The greatest obstacle to discovery is not ignorance — it is the illusion of knowledge.”

Nobody likes telling you the good news. Poverty and hunger are the business Oxfam is in, but has it shouted the global poverty statistics from the rooftops? Hardly. It has switched its focus to inequality. When The Lancet published a study in 2010 showing global maternal mortality falling, advocates for women’s health tried to pressure it into delaying publication “fearing that good news would detract from the urgency of their cause”, The New York Times reported. The announcement by Nasa in 2016 that plant life is covering more and more of the planet as a result of carbon dioxide emissions was handled like radioactivity by most environmental reporters.

What is more, the bias against good news in the media seems to be getting worse. In 2011 the American academic Kalev Leetaru employed a computer to do “sentiment mining” on certain news outlets over 30 years: counting the number of positive versus negative words. He found “a steady, near linear, march towards negativity”. A recent Harvard study found  that 87 per cent of the coverage of the fitness for office of both candidates in the 2016 US presidential election was negative. During the first 100 days of Donald Trump’s presidency, 80 per cent of all coverage was negative. He is of course a master of the art of playing upon people’s pessimism.

This is a human susceptibility and one that is open to exploitation. Even while saying that they would prefer good news, subjects in a subtle psychology experiment in Canada who were told to choose and read a newspaper article while waiting for the “experiment” to begin in fact “chose stories with a negative tone — corruption, setbacks, hypocrisy and so on — rather than neutral or positive stories”. Financial journalists have been found to report rising financial market indices with declining enthusiasm as rises continue, but falling ones with growing enthusiasm as the falls continue. As the Financial Times columnist John Authers said: “We are far more scared of encouraging readers to buy and ushering them into a loss, than we are of urging them to be cautious, and leading them to miss out on a gain.”

That is one reason for the pervasive negativity bias that afflicts the public discourse. Humans are loss-averse, disliking a loss far more than they like an equivalent gain. Such a cognitive bias probably kept us safe amid the dangers of the African savannah, where the downside of taking risks was big. The golden-age tendency makes us remember the good things about the past but forget the bad, with the result that the present seems worse than it is. For some reason people sound wiser if they think things are going to turn out badly. In fiction, Cassandra’s doom-mongering proved prescient; Pollyanna was punished for her optimism by being hit by a car.

Thus, any news coverage of the future is especially prone to doom-mongering. Brexit is a splendid example: because it has not yet happened, all sorts of ways in which it could go wrong can be imagined. The supreme case of unfalsifiable pessimism is climate change. It has the advantage of decades of doom until the jury returns. People who think the science suggests it will not be as bad as all that, or that humanity is likely to mitigate or adapt to it in time, get less airtime and a lot more criticism than people who go beyond the science to exaggerate the potential risks. That lukewarmers have been proved right so far cuts no ice.

Activists sometimes justify the focus on the worst-case scenario as a means of raising consciousness. But while the public may be susceptible to bad news they are not stupid, and boys who cry “wolf!” are eventually ignored. As the journalist John Horgan recently argued in Scientific American: “These days, despair is a bigger problem than optimism.”


Mystery solved: Rain means satellite and surface temps are different. Climate models didn’t predict this…

A funny thing happens when you line up satellite and surface temperatures over Australia. A lot of the time they are very close, but in some years the surface records from the Australian Bureau of Meteorology (BOM) are cooler than the UAH satellite readings by a full half a degree. Before anyone yells “adjustments”, this appears to be a real difference between instruments, but solving this mystery turns up a rather major flaw in climate models.

Bill Kininmonth wondered if those cooler-BOM years were also wetter years when more rain fell. So Tom Quirk got the rainfall data and discovered that rainfall in Australia has a large effect on the temperatures recorded by the sensors five feet off the ground. This is what Bill Johnston has shown at individual stations. Damp soil around the Stevenson screens soaks up heat through evaporation and keeps maximums lower. In this new work Quirk has looked at the effect right across the country, and the years when the satellite estimates diverge from the ground thermometers are indeed the wetter years. Furthermore, it can take up to six months for the ground to dry out after a major wet period and for the cooling effect to end.
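
The comparison Quirk describes can be reproduced in outline with a few lines of analysis. The sketch below is only an illustration of the idea, not his actual workings: it assumes annual means for the UAH series, the BOM surface series and Australia-wide rainfall are already to hand (the function name and data handling are my own), and it simply asks whether the surface-minus-satellite difference moves with rainfall. A lagged version of the same test would be needed to capture the six-month drying-out effect mentioned above.

# Minimal sketch (not Quirk's code): does the surface-minus-satellite
# temperature difference track annual rainfall over Australia?
from scipy.stats import pearsonr

def rainfall_vs_divergence(bom_annual, uah_annual, rain_annual):
    """Correlate the (BOM - UAH) annual temperature difference with rainfall.
    A negative correlation matches the claim above: wetter years show the
    surface thermometers running cooler than the satellite estimates."""
    diff = [surface - satellite for surface, satellite in zip(bom_annual, uah_annual)]
    r, p_value = pearsonr(diff, rain_annual)
    return r, p_value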

In Australia rainfall controls the temperature, which is the opposite of what the models predict, but things are different in the US.

In Australia maximum rainfall occurs in the summer but it is highly variable, whereas in the US, while the summer rain is heavier, it’s the winter precipitation where the big variations occur. This seasonal pattern makes a big difference. Both the Australian pattern and the US pattern appear in other places around the world, but the models only have the one scenario. It appears the modelers figured out the situation in New Jersey and programmed it in for the rest of the world, but whole zones of the world are behaving quite differently.

Models predict that temperature affects rainfall, but in Australia the rainfall affects the temperature. No wonder these models have no skill at predicting temperature, and on rainfall they are even worse.

As far as I know this is new and original research. Tom Quirk has run it past a few people, including John Christy of UAH who notes that this has been seen elsewhere. Let’s keep up with the peer review…


Australian PM 'disappointed' the Greens linked destructive wildfire to climate change

Prime Minister Malcolm Turnbull has expressed his "disappointment" that the Greens linked the catastrophic bushfire that ripped through Tathra overnight to climate change.

"I'm disappointed that the Greens would try to politicise an event like this," the Prime Minister told reporters, speaking from the fire-ravaged town this afternoon.

“You can’t attribute any particular event, whether it’s a flood or fire or a drought or a storm – to climate change."

As Tathra residents waited to hear if they'd lost their homes, businesses or livelihoods, Greens leader Richard Di Natale rose in the Senate and linked the catastrophic bush fires to climate change.

"We are seeing climate change in our every day lives have an impact on the risk of bush fires in our communities," he said.

"We can't any longer be complacent about risk of bush fires once the end of summer comes around. "And yet here we are with bush fires racing through my home state and indeed my community."

But Malcolm Turnbull said such intense fires are part and parcel of life in Australia.

“We are the land of droughts and flooding rains, we're the land of bushfires,” he said. “Nature hurls her worst at Australians – always has and always will."

“We saw from the air how the fire had not just leapt over a river, but had leapt over streets of houses, apparently without any damage, and then landed on a group of houses which had been burnt out. So, you can see how unpredictable it is.

“We have an environment which has extremes. Bushfires are part of Australia, as, indeed, are droughts and floods.”

Coalition Senator Ian Macdonald called the speech "hypocritical and a fraud". "These events happened before. They will happen in the future," he said.

The official New South Wales bush fire season ends at the end of March.






Monday, March 19, 2018

How to lie with graphs

According to climate alarmists at Earth-Sky, wildfires are on the increase in the US, and 2005, 2006, 2007, 2011 and 2012 were record years.

2015 wildfire season a record-breaker | Earth | EarthSky

The graph above starts in 1960, and it isn’t hard to see why. Their graph starts at one of the lowest years on record. The graph below is a much longer record from the US Forest Service.

Indicator 3.16: Area and percent of forest affected by abiotic agents

I overlaid the two graphs at the same scale below, showing the spectacular fraud behind the start date of 1960 in the 1960-2014 graph. Their record high years were actually closer to being record low years.

The USFS graph is quite real, and correlates well with newspaper reports from the time.


An interesting graphic from NOAA

The unobservant might well look at this graph and say: "There you are! The sea level is rising, just as global warming theory predicts!"

But look at the calibrations. The graph goes back to 1850, showing that the sea level was rising long before global warming was thought of. And the rate of rise has been smooth and steady, with no sign of the recent acceleration that would be expected from anthropogenic global warming.

Relative Sea Level Trend:  The Battery, New York

The relative sea level trend is 2.84 millimeters/year with a 95% confidence interval of +/- 0.09 mm/yr based on monthly mean sea level data from 1856 to 2017, which is equivalent to a change of 0.93 feet in 100 years.

The plot shows the monthly mean sea level without the regular seasonal fluctuations due to coastal ocean temperatures, salinities, winds, atmospheric pressures, and ocean currents. The long-term linear trend is also shown, including its 95% confidence interval. The plotted values are relative to the most recent Mean Sea Level datum established by CO-OPS. The calculated trends for all stations are available as a table in millimeters/year and in feet/century (0.3 meters = 1 foot). If present, solid vertical lines indicate times of any major earthquakes in the vicinity of the station and dashed vertical lines bracket any periods of questionable data or datum shift.
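
For readers who want to check the arithmetic, a linear trend of this kind is usually estimated by ordinary least squares on the monthly means, and the mm/yr slope converts to feet per century by dividing by 304.8 mm per foot. The snippet below is a generic sketch of that calculation, not CO-OPS code, and its 95% interval uses the usual large-sample approximation.

# Generic sketch (not CO-OPS code): fit a linear trend to monthly mean
# sea level and convert the slope from mm/yr to feet per century.
from scipy.stats import linregress

MM_PER_FOOT = 304.8

def sea_level_trend(decimal_years, msl_mm):
    """Return (trend in mm/yr, approx. 95% CI half-width, feet per century)."""
    fit = linregress(decimal_years, msl_mm)
    ci95 = 1.96 * fit.stderr                  # large-sample approximation
    ft_per_century = fit.slope * 100 / MM_PER_FOOT
    return fit.slope, ci95, ft_per_century

# The Battery figure quoted above: 2.84 mm/yr * 100 yr / 304.8 mm/ft = 0.93 ft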



THERESA May’s “green” energy dash was slammed yesterday as a report claimed wind turbines barely turned for TWO MONTHS last year.

The GMB union said that on 65 separate days, turbines supplied less than 10 per cent of their potential for at least half a day – meaning the UK was reliant on gas, nuclear and coal to keep the lights on.

Wind turbine ineffectiveness means the UK is reliant on gas, nuclear and coal to keep it running

In total there were 138 days with “low wind” for at least half an hour. And there have been a staggering 341 days since March 2017 when solar panels supplied less than 10 per cent of the “installed capacity”.
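
The union's measure is essentially a count of days on which output sat below 10 per cent of installed capacity for a given stretch of time. The sketch below shows one way such a count could be made from half-hourly generation data; the function, threshold and data handling are my assumptions, not the GMB's actual methodology.

# Minimal sketch (assumed inputs, not the GMB's method): count days where
# generation stayed below a fraction of installed capacity for at least
# `min_periods` half-hour settlement periods.
import pandas as pd

def low_output_days(gen_mw: pd.Series, capacity_mw: float,
                    fraction: float = 0.10, min_periods: int = 24) -> int:
    """gen_mw: half-hourly generation indexed by timestamp.
    min_periods=24 half-hours is 12 hours, i.e. 'at least half a day'."""
    low = gen_mw < fraction * capacity_mw
    low_periods_per_day = low.groupby(low.index.date).sum()
    return int((low_periods_per_day >= min_periods).sum())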

GMB national secretary Justin Bowden said: “It is the facts, not the hype, which should determine the UK’s energy policy decisions.” He added: “Those advocating a renewable-only energy policy cannot just shrug their shoulders on cloudy, windless days, or when it is dark, and pretend that more windmills and solar panels on their own can keep the lights on.

“They have to accept that unless and until there is a scientific breakthrough on carbon capture or solar storage, then a balanced energy supply mix - which includes nuclear and gas as the only reliable shows in town - is a reality.”

Hundreds of millions of pounds have been handed in subsidies to green energy giants to fund turbines and solar panels. Ex-PM David Cameron famously told aides to ditch the “green crap” in 2013 after the subsidies were blamed for hikes in household bills.

The figures from the union’s “Wind Watch” come a day after it emerged hundreds of offshore wind turbines in UK waters need emergency repairs after they started eroding. Owners of the 175-turbine London Array wind farm off Kent – the biggest offshore farm in the world – are among those to have applied to the Marine Management Organisation for permission to carry out urgent repairs.

The GMB has been the sole trade union to back fracking – and represents thousands of staff in the gas industry. The union attacked Labour for speaking “nonsense” when the party vowed to ban fracking.

The union said a fracking ban would force the UK to rely on foreign dictators’ “henchman, hangmen and headchoppers” for gas.


The Empirical Evidence Answers: What Is The Correlation Between CO2 & Climate-Related Deaths?

The empirical evidence strongly indicates an inverse correlation between CO2 levels and deaths from climate.

The adjacent chart superimposes annual atmospheric CO2 levels onto a chart that Bjorn Lomborg produced on his Facebook page.

Since 1920, while climate-related deaths have plummeted, deaths from non-climate-related natural events have essentially hovered in a narrow range.

Yet from 1920 to 2017 atmospheric CO2 levels have grown by an exceptional amount at an exceptional speed, a growth primarily attributed to the modern industrial/consumer combustion of fossil fuels.

That is the undeniable empirical evidence that lays total waste to the anti-science beliefs and doomsday claims of celebrity-seeking individuals who populate Washington D.C., Hollywood, the ivory towers and elsewhere.

It's just another example of 'elites' failing to connect the real science dots.

Simple Summary: The trace greenhouse gas CO2 should not be feared as some sort of death-machine unleashed by humans. Instead the empirical evidence suggests it is an indication of civilization advancement and the life-saving achievements it produces.

Long live the CO2-savior!


Opponents say Block Island wind farms are causing problems across prime fishing grounds

NEW BEDFORD — The five enormous turbines that have been generating electricity off Block Island over the past year are considered a model for the future of offshore wind.

But the nation’s first ocean-based wind farm also has exposed what fishermen say are serious threats to them caused by scattering massive metal shafts and snaking underwater cables across prime fishing grounds.

With state officials poised to announce the winners of bids to develop much larger wind farms south of Martha’s Vineyard, fishermen across the region have been pressing officials for answers to their concerns about where the turbines will be located, how far apart they’ll be built, and the placement of the cables to the mainland.

Some have pointed to issues they’ve encountered in the waters around Block Island as the reason they are worried.

“It’s true that the area where the turbines are have created habitat that attracts fish, which is good; but in the area where the cable lines extend to the mainland, it’s completely devoid of fish,” said Michael Pierdinock, chairman of the Massachusetts Recreational Alliance, which represents about 50,000 recreational fishermen. “These used to be fruitful fishing grounds.”

The opposition of the fishing industry, a powerful interest group in New England, could prove a hindrance for developers of the proposed wind farms, which will be chosen next month.

Those projects, which could ultimately span hundreds of thousands of acres some 14 miles south of Martha’s Vineyard, are expected to generate 1,600 megawatts of power within a decade, or enough electricity for about 800,000 homes.

At a meeting last month in New Bedford of fishermen, developers, and state and federal officials, Pierdinock and commercial fishermen urged regulators to study the potential impact of the proposed wind farms on marine mammals, spawning grounds of herring and squid, and other species that inhabit the area.

The fishermen also raised questions about the impact of electromagnetic waves pulsing across the seafloor on species such as sharks, which navigate and hunt in part by sensing electrical currents, and how rotating turbine blades could impede their ability to navigate with radar.

Wind power companies have dismissed most of their concerns, and fishermen have become increasingly frustrated, saying that they’re being ignored.

“We don’t know the causes of some of the things that have been happening, and it’s apples and oranges to compare fisheries off the United Kingdom and the North Atlantic,” said Beth Casoni, executive director of the Massachusetts Lobstermen’s Association. “There’s just a lot more we need to know.”

Officials at Deepwater Wind, the Providence-based company that built the Block Island wind farm and is seeking to develop one of the larger projects off the Vineyard, rejected assertions that the underwater cables from their turbines have harmed the fishery.

“There’s zero scientific evidence for that,” said Aileen Kenney, vice president of permitting and environmental affairs at Deepwater Wind. “We’ve heard of no decline of fishing activity around the project.”

She also dismissed claims that fishermen have had their lines caught on the concrete casings that cover small portions of cables that couldn’t be buried. Most of the cables connecting the turbines to electricity substations on land have been laid 4 to 6 feet below the seafloor.

Still, Deepwater Wind intends to bury as much cable as it can if it is selected to develop one of the new projects, she said.

Erich Stephens, chief development officer for Vineyard Wind, which is also bidding to develop one of the offshore projects, said he hopes the fishing and offshore wind industries will learn to coexist.

“All indications are that fish and wildlife are not harmed by wind turbines,” he said. His company, which has proposed building a $2 billion wind farm that could generate 800 megawatts of power, has sought to accommodate fishermen, he said.

Instead of placing turbines in an irregular pattern, which would produce the most energy, the company intends to position them in neat rows, eight-tenths of a mile apart. That would allow two fishing vessels to drag their nets through the area at the same time, he said.

“We have made an effort to meet with fishermen and understand their concerns,” Stephens said. “It’s going to take patience and understanding all around.”

State and federal officials said they’re also trying to address fishermen’s issues. Baker administration officials said they have told federal regulators that any decision about where to build the turbines “must include consideration of natural resources and important marine ecosystems.”






Sunday, March 18, 2018

Sen. Sheldon Whitehouse Says Opponents of Climate Change Regulations Guilty of Grave Sins

Shelly is making a half-witted attempt to get religious people onside.  It's doubtful if he believes his own words

Sen. Sheldon Whitehouse (D-R.I.) said in a speech on the Senate floor on Tuesday that those who stand in the way of regulations meant to control “climate change” are guilty of three grave “sins.”

“It is an evil mess we are in, and if there is any justice in this world, there will one day be a terrible price to pay if we keep listening to evil voices,” Whitehouse said.

Before he arrived at his point about the grave sins committed by opponents of climate change regulations, he urged Americans to “listen to the oceans” and “the oysters.”

“If you can listen quietly, you can listen to the oceans,” Whitehouse said. “They speak to us, the oceans do. They speak to us through thermometers, and they say: We are warming. They speak to us through tide gauges, and they say: We are rising along your shores. They speak to us through the howl of hurricanes powered up by their warmer sea surfaces. They speak to us through the quiet flight of fish species from their traditional grounds as the seawater warms beyond their tolerance.”

“We can go out and check and see the corals and the oysters and the pteropods corrode and die before our eyes,” said Whitehouse. “It is happening.”

“The climate change problems we are causing by failing to act are a sin, as Pope Francis has flatly declared, but that is not the only sin,” said Whitehouse.

“To jam Congress up, fossil fuel interests are interfering with and corrupting American democracy, and to corrupt American democracy is a second and a grave sin,” he said.

“The science denial apparatus—to mount a fraudulent challenge to the very enterprise of science, that is a third grave sin,” he said.

“Perhaps worst of all is that the world is watching,” he said. “It is watching us as the fossil fuel industry, its creepy billionaires, its front groups, its bogus think tanks all gang up and debauch our democracy.”


Scientist declares Sen. Whitehouse is ‘a complete moron, scientifically’

A prominent scientist is pushing back on Democratic Senator Sheldon Whitehouse (RI) for his lecture on the Senate floor that anyone skeptical of man-made global warming is guilty of “grave sin” and “listening to evil voices.” See: Sen. Whitehouse: ‘Climate deniers’ guilty of ‘grave sin’ & ‘LISTENING TO EVIL VOICES’ – Instead ‘listen to the oceans’

“I am really getting sick and tired of this blowhard Sheldon Whitehouse (D-RI) lecturing us for being sinners,” Dr. Thomas P. Sheahen, an MIT-educated physicist and author of the book “An Introduction to High-Temperature Superconductivity,” told Climate Depot. Sheahen is the writer of the popular newspaper column “Ask the Everyday Scientist” and is featured in the new book ‘The Politically Incorrect Guide to Climate Change’ by Marc Morano.

“Senator Whitehouse is a complete moron, scientifically.  He doesn’t know any real science at all.  He believes in the mythology initiated a generation ago by Al Gore,  where CO2 emitted by mankind is entirely to blame,” Sheahen explained.

Sheahen continued: “Here’s the reality:   There is no such thing as a ‘climate denier.’ That category doesn’t exist.  There are certain facts that we all agree on:  a)  the climate is always changing;  b) the globe is warming;  c) there is a finite human contribution  (e.g., the urban heat island effect).  Where disagreement begins is on the role of CO2 in heating the planet.  There is great scientific controversy about that point, because of factors such as how molecules absorb and re-radiate photons at various altitudes in the atmosphere, because of flow via convection of warm air from the surface to the upper atmosphere; and more.  It’s a really complicated field of science.”

Sheahen added: “Sheldon Whitehouse has no intention whatsoever to engage in any scientific debate at all. Instead, he quotes the entirely false and manufactured statistic that ‘97% of scientists agree…’ and goes from there to further faulty steps:

1) he asserts that he knows the truth perfectly;

2) he asserts that anybody who disagrees with him is a sinner.

I say it’s high time that our religious leaders stepped forth and shouted ‘Stop!’ to Senator Whitehouse and similar bloviators.”

Sheahen concluded: “No way is Whitehouse capable of defining some action as a ‘sin.’  His scientific acumen is so weak that he cannot even defend the position he holds but instead resorts to the ‘argument from authority’ to brush off any scientific disagreement.”


Low-Cost Natural Gas, the Environmentally Friendly Fuel

If you want to know the state of America’s environment today, a good place to start is with the dramatic decline in airborne emissions from power plants over the past decade.

As they generate electricity, hundreds of fossil-fuel power plants across the country emit sulfur dioxide, nitrogen oxides and carbon dioxide into the air. The first two substances cause acid rain and contribute to respiratory ailments and are the emissions of most concern to public health. The third is the principal greenhouse gas that accompanies the burning of oil, natural gas and coal because of their carbon content.

According to the Energy Information Administration, there has been a sharp reduction in power-plant emissions over a 10-year period. Since the start of the shale revolution in 2006 and leading up to 2016, annual sulfur-dioxide emissions dropped 81 percent, from 9.5 million metric tons to 1.8 million tons, and nitrogen oxides fell from 3.8 million metric tons to 1.63 million tons, a reduction of 57 percent.

And over the same period, annual carbon-dioxide emissions dropped 22.5 percent, from 2.5 billion metric tons to 1.9 billion tons. Today carbon-dioxide emissions from power production are at late-1980s levels. Think about it: Even as electricity production has risen, carbon emissions fell.
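
The percentage declines quoted above follow directly from the tonnage figures. The snippet below simply redoes that arithmetic with the rounded tonnages given here; on those rounded numbers the carbon-dioxide drop comes out nearer 24 percent, so the 22.5 percent figure presumably reflects EIA's unrounded data.

# Re-deriving the quoted percentage declines from the rounded tonnages above.
def pct_decline(start, end):
    return 100 * (start - end) / start

print(f"SO2: {pct_decline(9.5, 1.8):.0f}%")    # ~81%
print(f"NOx: {pct_decline(3.8, 1.63):.0f}%")   # ~57%
print(f"CO2: {pct_decline(2.5, 1.9):.0f}%")    # ~24% on rounded figures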

These numbers should bring home a clear message: The fossil fuel revolution in the United States is profoundly changing not only the economics of oil and gas production but also the environment. When it comes to electricity, the economics increasingly favor low-cost abundant natural gas.

Moreover, natural gas is replacing coal, not only in the United States but also in China and India, two countries with fast-growing economies that are beginning to use imports of liquefied natural gas for electric power production. It’s a powerful demonstration that the significant benefits of the shale revolution are beginning to reach other countries and that the United States has the know-how and resources to play a major role globally in reducing carbon emissions.

Everyone seems to recognize this except U.S. environmental groups and those politicians who are eagerly courting their endorsement by supporting efforts to ban the production and use of fossil fuels. Environmentalists participating in the keep-it-in-the-ground movement want to replace natural gas with renewable energy sources like solar and wind. That misguided approach would needlessly send energy costs soaring, is technologically unfeasible, and is far from the most efficient way to achieve environmental progress.

Greater use of clean natural gas has already helped us take a significant environmental leap forward. While solar and wind power will continue to become more market competitive, we ought to lean on the resources that are already winning in the marketplace today.

Regrettably, the proposition that reducing the U.S. carbon footprint can be done without natural gas has been gaining ground in political circles. Democrats in both the U.S. Senate and the California Assembly have proposed legislation calling for a full transition to solar and wind.

But relying entirely on renewables is both foolish and unrealistic. Solar and wind are growing as energy sources, and a case can be made for investing in renewables. But sacrificing natural gas is ill-advised. Given that solar and wind energy are intermittent, relying on them alone would require a fundamental change in our energy system and impose enormous costs on the nation’s economy.

Those who cling to the belief that natural gas can be replaced forget that the reason you hear so little about acid rain these days is that sulfur-dioxide emissions have declined significantly over the years. Climate change is still a concern to some.

However, the significant reduction in power-plant emissions to the lowest level in almost 30 years proves that we can grow the economy and have a healthy environment, too. And it’s a demonstration that the technology revolution — and a dose of reason and resolve — can address climate challenges without changing the way we live.


The Renewable Fuel Standard is beyond repair; it is time to repeal it

By Printus LeBlanc

For several years the Renewable Fuel Standard (RFS) has placed an undue burden on the consumers and producers of transportation fuel. It became clear early in the implementation of the RFS that it had significant flaws, but special interests have fought reform for fear of losing their gravy train. The RFS has turned into nothing more than a government subsidy for farmers. It is time to return competition to the transportation fuel market and repeal the RFS.

In 2005, Congress passed, and President Bush signed, the Energy Policy Act of 2005. Among the many new regulations created in the legislation was the RFS, which mandated that a certain amount of renewable fuel, mostly corn ethanol, be blended with gasoline: 4 billion gallons in 2006, rising to 7.5 billion gallons in 2012.

In 2007, the Energy Independence and Security Act of 2007 was passed. The bill increased the amount of renewable fuel to be blended, requiring 9 billion gallons in 2008 and rising to 36 billion gallons by 2022. The increase amounted to a massive government-ordered subsidy to be paid to biofuel producers.

Each refiner is given a Renewable Volume Obligation (RVO) by the EPA. A Renewable Identification Number (RIN) is a tracking number assigned to each batch of biofuel; the EPA devised it to ensure every refiner is following the laws outlined in the 2005 and 2007 acts. Refiners must hold enough RINs to meet their RVO. If a refiner does not have the capability to blend biofuel, it must purchase RINs from another refiner that can generate them. A government mandate forcing a private company to buy a product it doesn’t need or want: where have we heard this before?
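
As a rough illustration of how that obligation bites, the sketch below models a refiner that blends less renewable fuel than its RVO and has to buy the shortfall as RINs on the market. It is deliberately simplified (real RIN accounting involves fuel categories, carry-over credits and other details), and the numbers and names are hypothetical.

# Simplified, hypothetical illustration of RVO compliance via RIN purchases.
def rin_purchase_cost(rvo_gallons: float, blended_gallons: float,
                      rin_price: float) -> float:
    """Assume, simplistically, one RIN per gallon of renewable fuel blended.
    Any shortfall against the RVO must be covered by buying RINs."""
    shortfall = max(0.0, rvo_gallons - blended_gallons)
    return shortfall * rin_price

# Hypothetical example: a 100m-gallon obligation, no blending capability,
# and a $0.70 RIN would mean roughly $70m spent on credits.
print(rin_purchase_cost(100e6, 0.0, 0.70))  # 70000000.0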

The largest refinery on the East Coast was just bankrupted by the RFS. The refinery belonging to Philadelphia Energy Solutions (PES) was forced to declare bankruptcy in January. The 335,000 barrel per day refinery was over $600 million in debt, much of that due to the RFS. PES stated it spent $218 million in 2017 for RINs, more than it spent on personnel.

Even the U.S. Energy Information Agency knows the RFS isn’t worth it, stating, “The energy content of ethanol is about 33 percent less than pure gasoline. The impact of fuel ethanol on vehicle fuel economy varies depending on the amount of denaturant that is added to the ethanol. The energy content of denaturant is about equal to the energy content of pure gasoline. In general, vehicle fuel economy may decrease by about 3 percent when using E10 relative to gasoline that does not contain fuel ethanol.”
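
The EIA's roughly 3 percent figure is easy to sanity-check: if ethanol carries about a third less energy than gasoline and E10 is about 10 percent ethanol by volume, the blend holds roughly 3 percent less energy, and fuel economy falls by about the same amount. A back-of-envelope version of that arithmetic:

# Back-of-envelope check of the ~3% fuel-economy penalty for E10.
ETHANOL_RELATIVE_ENERGY = 1.0 - 0.33   # ethanol holds ~33% less energy than gasoline
ETHANOL_SHARE = 0.10                   # E10 is ~10% ethanol by volume

blend_energy = (1 - ETHANOL_SHARE) * 1.0 + ETHANOL_SHARE * ETHANOL_RELATIVE_ENERGY
print(f"E10 energy relative to gasoline: {blend_energy:.3f}")       # ~0.967
print(f"Approximate fuel-economy penalty: {1 - blend_energy:.1%}")  # ~3.3%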

This raises the question: why is the U.S. government mandating that consumers purchase a less efficient fuel?

Not only is ethanol less fuel efficient, but it also acts as yet another tax on the consumer. A 2014 study by the Congressional Budget Office found the RFS adds between $0.13 and $0.26 per gallon of regular gasoline and $0.30 to $0.51 for diesel.

Now the environmental lobby is turning against the RFS. Writing for The Hill, David DeGennaro of the National Wildlife Federation noted that farmers plowing up more than 7 million acres between 2008 and 2012 released carbon emissions equal to those of 20 million cars.

The Renewable Fuel Standard is a complete failure. It did not reduce dependence on foreign oil; fracking did that, as do electric cars that use no liquid fuel at all. The RFS did not help the environment; it made it worse. If it did nothing that it was supposed to do, then why is this Obamacare-style energy mandate still around? If the special interests are unwilling to reform it, the RFS must be repealed. At this point, it is nothing more than a tax on the consumer and a subsidy for big business.


Ice-Free Arctic Fantasies Melting Away As Temperatures Plummet…Sea Ice Mass Grows Impressively

German skeptic and weather expert ‘Schneefan’ here writes how climate activist Mark C. Serreze recently announced that this year’s Arctic sea ice extent was an all-time record low. But since then Arctic temperatures have plummeted and sea ice area has grown to over 14 million square kilometers:

At the sea ice portal, the development is clearly shown.

On March 10, 2018 sea ice extent in the Arctic reached 14.55 million km², so the end of Arctic sea ice growth had in fact not been reached.

The plunge in the mean temperature north of 80°N to -25°C can be seen in the plot by the DMI, and so a growth in sea ice was expected.

After an increase to about -10°C in February (due to a weather pattern) the average temperature above 80°N latitude has since fallen to -25°C. Source: DMI.

Naturally the German mainstream media, such as ARD television, pounced on the news, set off the climate catastrophe alarms, and thus ended up completely misreporting the real sea ice development in the Arctic:

A heat wave at a mean temperature of -10°C?

SOURCE   (See the original for links and graphics)


