Tuesday, December 17, 2013

More grief for Warmists

Up until now the Arctic has been their only friend.  And they still can't let go.  They talk about a mythical "long term" melting trend.  And even if it were a trend, how do we know it would continue?  Successful straight-line extrapolations are rare in nature.  An ogive -- an S-shaped curve that flattens out -- is more typical.

The amount of sea ice in the Arctic has increased by close to 50 per cent compared to last year, according to satellite measurements.

ESA’s CryoSat mission revealed that in October this year the Arctic had 9000 cu km of sea ice. This compares to just 6000 cu km in October 2012.
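The headline figure follows directly from the two volumes quoted. A minimal arithmetic check, using the article's own round numbers:

```python
# Check the "close to 50 per cent" claim from the article's own CryoSat figures.
volume_2012 = 6000  # cu km of Arctic sea ice, October 2012
volume_2013 = 9000  # cu km of Arctic sea ice, October 2013

increase = (volume_2013 - volume_2012) / volume_2012
print(f"Year-on-year increase: {increase:.0%}")  # → 50%
```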

Scientists believe part of this stronger performance is due to a greater retention of older ice.

Measurements from CryoSat show that the volume of Arctic sea ice has significantly increased this autumn

Over the last few decades, satellites have shown a downward trend in the area of Arctic Ocean covered by ice.

However, the actual volume of sea ice has proven difficult to determine, because the ice moves around and its thickness changes.

The CryoSat-2 satellite was designed to measure sea-ice thickness across the entire Arctic Ocean, and has allowed scientists, for the first time, to monitor the overall change in volume accurately.

Scientists claim around 90 per cent of the increase is due to growth of multi-year ice – which survives through more than one summer without melting – with only 10 per cent due to growth of first-year ice.

CryoSat-2 carries technologies to measure changes in the vast ice sheets of Greenland and Antarctica and marine ice floating in the polar oceans.

By measuring thickness change in both types of ice, CryoSat-2 is providing information to better understand the role ice plays in the Earth system.

Launched on 8 April 2010, CryoSat-2 is in a highly inclined polar orbit, reaching latitudes of 88° north and south, to maximise its coverage of the poles.

Its main payload is an instrument called Synthetic Aperture Interferometric Radar Altimeter (SIRAL). Previous radar altimeters have been optimised for operations over the ocean and land, but SIRAL is the first sensor of its kind designed for ice.

They claim that thick, multi-year ice indicates healthy Arctic sea-ice cover.  This year’s multi-year ice is now on average about 20 per cent, or around 30 cm, thicker than last year.

'One of the things we’d noticed in our data was that the volume of ice year-to-year was not varying anything like as much as the ice extent – at least in 2010, 2011 and 2012,' said Rachel Tilling from the UK’s Centre for Polar Observation and Modelling, who led the study.

'We didn’t expect the greater ice extent left at the end of this summer’s melt to be reflected in the volume. But it has been, and the reason is related to the amount of multi-year ice in the Arctic.'

While this increase in ice volume is welcome news, it does not indicate a reversal in the long-term trend.

'It’s estimated that there was around 20 000 cu km of Arctic sea ice each October in the early 1980s, and so today’s minimum still ranks among the lowest of the past 30 years,' said Professor Andrew Shepherd from University College London.

The findings from a team of UK researchers at the Centre for Polar Observation and Modelling were presented last week at the American Geophysical Union’s autumn meeting in San Francisco, California.

'We are very pleased that we were able to present these results in time for the conference despite some technical problems we had with the satellite in October, which are now completely solved,' said Tommaso Parrinello, ESA’s CryoSat Mission Manager.

In October, CryoSat-2’s difficulties with its power system threatened the continuous supply of data, but normal operations resumed just over a week later.

With the seasonal freeze-up now underway, CryoSat will continue its measurement of sea ice. Over the coming months, the data will reveal just how much this summer’s increase has affected winter ice volumes.


Climate Change This Week: Warmer Then, Not Now

A new study from Swedish climate scientists indicates that the earth was likely warmer during the ancient Roman empire and Medieval period than it is today. Leif Kullman, the study's author, found that tree lines were at higher elevation during those times than they are today, mainly because “summer temperatures during the early Holocene thermal optimum [Roman and Medieval period] may have been 2.3°C higher than present.” Something tells us that wasn't because of all the Roman SUVs.

Certainly, one report doesn't prove anything one way or the other, but the trend is certainly not going in favor of those who want to blame “global warming” on modern human activity and then clamp down on it with draconian government measures. Despite the alarmism today, temperatures have not increased globally since 1998, leaving warmists scrambling to come up with an explanation. Indeed, just last week saw 2,000 cold and snow records broken in the U.S.

In light of this mounting evidence, some climate scientists are – gasp – becoming skeptics. Judith Curry, a climatologist at the Georgia Institute of Technology, says, “All other things being equal, adding more greenhouse gases to the atmosphere will have a warming effect on the planet. However, all things are never equal, and what we are seeing is natural climate variability dominating over human impact.” That's worth repeating: Natural climate variability might have something to do with the climate.


Forest fires not a result of climate change

Some common sense from the normally moonbat state of California, in an analysis that would apply equally well in Australia:

For purposes of analysis, the history of wildfire in California can be loosely categorized into pre-European settlement fire regimes and post-European settlement fire regimes, especially the last fifty years where rigorous fire suppression efforts have been undertaken.

Natural fire regimes that existed prior to European settlement in California (pre-1700) involved a wide range of fire frequencies and effects on ecosystems; roughly one-third of the State supported frequent fire regimes of 35 years or less. Some areas likely burned on an almost annual basis. Pre-European settlement fire patterns resulted in many millions of acres burning each year, with fire acting as a major ecological force maintaining ecosystem vigor and ranges in habitat conditions. The pre-settlement period is often viewed as the period under which the “natural” fire regime standard for assessing the ecological role of fire developed...

In the suppression (modern) era, statewide fire frequency is much lower than before the period of European settlement. Between 1950 and 2008, California averaged 320,000 acres burned annually, only a fraction of the several millions of acres that burned under the pre-settlement regimes. Land uses such as agriculture and urbanization have reduced the amount of burnable landscape, and most wildland fires are effectively suppressed to protect resources, commodities, and people.

Before the twentieth century, many forests within California were generally open and park like due to the thinning effects of recurrent fire. Decades of fire suppression and other forest management have left a legacy of increased fuel loads and ecosystems dense with an understory of shade-tolerant, late-succession plant species. The widespread level of dangerous fuel conditions is a result of highly productive vegetative systems accumulating fuels and/or reductions in fire frequency from fire suppression. In the absence of fire, these plant communities accrue biomass, and alter the arrangement of it in ways that significantly increase fuel availability and expected fire intensity.

Paul Homewood (h/t) summarises thus:

* Large and frequent wildfires were the norm before European settlement.

* Regular wildfires provide an essential ecological function and increase forest health and diversity.

* Acreage burnt reduced drastically during the 20th century, as fire suppression methods took effect.

* This fire suppression, though, had the calamitous effect of allowing a dangerous build-up of biomass, which now makes fires larger and more intense.

Perhaps somebody might tell Obama.


A British municipality run by Greenies is a disaster

By James Delingpole

Another Little Nell moment in today's Guardian.  It seems that Britain's greenest town council is also turning out to be Britain's most disastrous town council. This is what happens when you put a bunch of Greens, led by a man called Jason Kitcat, in charge and it isn't pretty.

Author and local resident Lynne Truss delivered a picture of what happened. "The place turned into Armageddon," she wrote. "Helped by foxes and the seagulls … a tide of used teabags, eggshells, soiled kitchen paper, banana skins, smelly tin cans, and used sanitary towels (yes!) advanced in such a determined and menacing manner down nice residential streets, you could almost hear it breathing."  "It wasn't pleasant," says Kitcat. "It was very difficult. Of course it was. We didn't want to be there."

Poor Kitcat – quite moderate by Green standards – hasn't been helped by his deeper Green brethren.

Ben Duncan is one of the alleged watermelons, and the councillor who accused Kitcat of betrayal in his blogpost. A former journalist who now works for a Green party MEP, Duncan has floated a handful of provocative ideas, including a "tourist tax" on some of the city's bigger hotels, a possible boycott of one of the taxi firms opposed to the Greens' 20mph speed limit, and the possibility of Brighton allowing the opening of cannabis cafes and becoming the British version of Amsterdam. When asked on Twitter if he himself inhaled, he said this: "I only smoke weed when I'm murdering, raping and looting!" It was, he says, a reference to the famous anti-cannabis film Reefer Madness, but his political enemies didn't seem to get the joke.

Now, he is really on the warpath. There is mileage, he reckons, in the idea of the Greens following the lead of Trotskyite Labour councillors in 1980s Liverpool, refusing to set a cuts-based budget, and thereby putting Brighton in the vanguard of UK-wide anti-austerity resistance.

Yes: turning an affluent middle class seaside town into Eighties Liverpool. That would work! [Liverpool is a mainly working-class British city principally noted for football and unemployment  -- JR]


Wind energy costs four times more in UK than Brazil because of way green subsidies are handed out

Britain pays four times as much for its wind energy as Brazil, thanks to uncompetitive subsidies.

A damning report from the Policy Exchange urges the government to hold the wind industry to its pledge to slash costs by the end of the decade.

UK families are paying £95 per MWh for onshore wind, compared to £27 per MWh in Brazil, according to its report.  It argues that the government should hold an auction of renewable technologies to allow the industry to compete for state support.  This could start as early as next year for projects which would begin in 2017.

Currently, ministers only plan to introduce energy auctions in 2018 for projects that will be commissioned after 2020.

But this week Energy Minister Michael Fallon is set to unveil detailed plans about how to make subsidies more competitive.

Critics fear this could mean more onshore windfarms by the backdoor as they are cheaper to run compared to offshore wind, which costs about 50 per cent more to generate.

But government sources insisted that the plan will make it more difficult for the onshore windfarms to get state handouts because they will be subject to a ‘constrained allocation’.

In Brazil, prices for onshore wind have dropped to world record lows since auctioning was introduced.

But in Britain, ministers are much keener on more expensive offshore wind because it is less politically controversial.  While onshore windfarms trigger fury from local communities due to their visual blight and noise, offshore windfarms are far enough away from homes not to spark protest.  But offshore wind is about 50 per cent more expensive than onshore wind, with a strike price of £155 per MWh.

The most expensive technologies are tidal and wave power, which cost around £305 per MWh.

Simon Moore, author of the report, said: ‘The government needs to act more ruthlessly to reduce household energy bills by cutting state support for renewable technologies that do not come down in price.’

He added: ‘Offshore wind may play an important role in our future energy mix. But it should not be given favourable treatment at the expense of other low carbon technologies which could reduce our carbon emissions at a much cheaper price.’

The report said of offshore wind that ‘at its current costs it is simply too poor value an investment to be allowed to continue much longer’.

The Policy Exchange also called for the government to abolish the EU Renewable Energy Target which it said imposed unnecessary costs on Britain’s energy.

In its report, it said: ‘While the renewable energy target exists, introducing effective means of competition, or of cost-control more broadly, is seen as impossible. The legally-binding target makes no allowance for the potential high expense of meeting it.’

The EU Renewable Energy Target decrees that 15 per cent of all energy in the UK - or 30 to 35 per cent of electricity - must come from renewable sources by 2020.

But emissions from the electricity sector are already capped by a separate European Emissions Trading System.

Average electricity bills are already £563 a year, out of a total dual fuel bill of £1255.  By 2020, this will rise to £598 out of a total bill of £1331. Renewables targets already contribute £37 to the average bill, and will account for £110 of it in 2020.

‘A policy that imposes higher-than-necessary costs risks failing if public support is lost,’ the report said.

Part of the reason for Brazil’s low wind energy prices has been put down to unusually high wind speeds, a surplus of wind turbines and hidden incentives.

Its wind turbines have much higher capacity factors - up to 65 per cent, compared with up to 35 per cent in the UK and much of Europe.
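A capacity factor is the fraction of a turbine's maximum possible annual output that it actually delivers, so the gap between 65 and 35 per cent matters as much as any subsidy regime. A minimal sketch of the arithmetic, assuming a hypothetical 2 MW turbine (the turbine size is an illustration, not a figure from the article):

```python
# Annual energy = rated power (MW) x capacity factor x hours in a year.
HOURS_PER_YEAR = 8760
RATED_MW = 2.0  # hypothetical turbine size, for illustration only

energy_brazil_mwh = RATED_MW * 0.65 * HOURS_PER_YEAR  # up to 65% in Brazil
energy_uk_mwh = RATED_MW * 0.35 * HOURS_PER_YEAR      # up to 35% in the UK

print(f"Brazil: {energy_brazil_mwh:.0f} MWh/yr")  # → 11388
print(f"UK:     {energy_uk_mwh:.0f} MWh/yr")      # → 6132
print(f"Same machine delivers {energy_brazil_mwh / energy_uk_mwh:.2f}x the energy")
```

The same machine producing nearly twice the energy spreads its capital cost over nearly twice the output, which goes some way to explaining the price gap quite apart from how subsidies are handed out.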


The Power-Mad EPA

By Alan Caruba

Barely a week goes by these days without hearing of some new demand by the Environmental Protection Agency that borders on the insane.

Increasingly, EPA regulations are being challenged and now reach the Supreme Court for a final judgment. This marks the failure of Congress to exercise any real oversight and control of an agency that everyone agrees is now totally out of control.

Recently the EPA ruled that New York City had to replace 1,300 fire hydrants because of their lead content. The ruling was based on the Reduction of Lead in Drinking Water Act, passed by Congress in 2011. As Senator Charles Schumer (D-NY) pointed out while lambasting the agency, “I don’t know a single New Yorker who goes out to their fire hydrants every morning, turns it on, and brushes their teeth using the water from these hydrants. It makes no sense whatsoever.” Reportedly, the Senate is poised to consider legislation exempting fire hydrants if the EPA does not revise its ruling.

The EPA is not about making sense. It is about over-interpreting laws passed by Congress in ways that now continually lead to cases before the Supreme Court. The Court is composed of lawyers, not scientists. In an earlier case, they ruled that carbon dioxide (CO2) is a “pollutant” when it is the one gas that all vegetation requires. Without it, nothing grows and all life on Earth dies.

A federal appeals court recently heard a case about the EPA’s interpretation of the 2012 Mercury and Air Toxics Rule, yet another effort in the “war on coal” that would shut down more coal-fired plants that provide the bulk of the electricity the nation requires.

The EPA is asserting that the rule would annually prevent 11,000 premature deaths, nearly 5,000 heart attacks, and 130,000 asthma attacks. Moreover it asserts that it would help avoid more than 540,000 missed work days, and protect babies and children. These statistics are plucked from various studies published in journals and are typical of the way the EPA operates to justify its rulings. Their accuracy is dubious.

What makes this case, brought by EarthJustice--formerly the Sierra Club Legal Defense Fund--of interest is the way the NAACP, along with 17 other organizations, came to the defense of the ruling. Are you surprised that the NAACP has a director of Environmental and Climate Justice?

Apparently civil rights for Afro-Americans now embraces the absurd claims about climate change, formerly known as global warming. “Civil rights are about equal access to protections afforded by law,” said Jacqui Patterson, the NAACP director. “These standards provide essential safeguards for communities who are now suffering from decades of toxic exposure.” If these essential safeguards are in place, on what basis does she make such a claim?

The EarthJustice attorney, Jim Pew, claims the case is about protecting “hundreds of thousands of babies each year from development disorders, and spare communities of 130,000 asthma attacks each year. If, in a lawsuit, you find yourself arguing against the lives of babies, children with asthma, and people suffering from your toxic dumping, then you are on the wrong side of both the lawsuit and history.”

Here, again, the claims about health-related harm are absurd. Who believes that asthma or development disorders are related to mercury? Who believes that communities served by coal-fired power plants are subject to major health hazards?

The claims about mercury are baseless. In a 2011 commentary published in The Wall Street Journal, Dr. Willie Soon, a geoscientist at Harvard and an expert on mercury and public health issues, was joined by Paul Driessen, a senior policy advisor for the Committee For a Constructive Tomorrow (CFACT), in rebutting the claims about mercury that have been part of the environmental lies put forth for years.

“There is no factual basis for these assertions. To build its case against mercury, the EPA systematically ignored evidence and clinical studies that contradict its regulatory agenda, which is to punish hydrocarbon use.”

“Mercury has always existed naturally in the Earth’s environment…Mercury is found in air, water, rocks, soil and trees, which absorb it from the environment. This is why our bodies evolved with proteins and antioxidants that help protect us from this and other potential contaminants.”

Dr. Soon and Driessen do not deny that coal-burning power plants emit an estimated 41-to-48 tons of mercury per year, “but U.S. forest fires emit at least 44 tons per year; cremation of human remains discharges 26 tons; Chinese power plants eject 400 tons; and volcanoes, subsea vents, geysers, and other sources spew out 9,000-10,000 additional tons per year.”

“Since our power plants account for less than 0.5% of all the mercury in the air we breathe, eliminating every milligram of it will do nothing about the other 99.5% in our atmosphere.”
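Taking the commentary's tonnage figures at face value, the "less than 0.5%" share can be recomputed directly. A minimal arithmetic sketch using the low end of the quoted natural-emissions range (the figures are the article's, not independently verified):

```python
# Recompute the mercury share from the tonnage figures quoted in the commentary.
US_POWER_PLANTS = 48  # tons/yr, upper end of the quoted 41-48 range
OTHER_SOURCES = {
    "U.S. forest fires": 44,
    "cremation of human remains": 26,
    "Chinese power plants": 400,
    "volcanoes, vents, geysers etc. (low end)": 9000,
}

total = US_POWER_PLANTS + sum(OTHER_SOURCES.values())
share = US_POWER_PLANTS / total
print(f"U.S. power plant share of airborne mercury: {share:.2%}")  # → 0.50%
```

Using the high end of the natural-emissions range (10,000 tons) pushes the share down to about 0.46 per cent, so the "less than 0.5%" claim is at least internally consistent with the quoted figures.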

Such FACTS mean nothing to the EPA. The air and water of the United States are remarkably clean, but to justify its existence and expand its power, the EPA continues to impose idiotic and unscientific rules about fire hydrants and power plants.

The threat is the EPA, not mercury.



For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

