"Green" Britain gets the jitters amid fears of going dark brown
All those windmills and solar farms are doing nothing to avert looming blackouts
Hotels will be paid to turn down refrigerators and factories paid to make staff work overnight to cut energy consumption and prevent blackouts this winter, under emergency plans to be revealed this week.
Ed Davey, the Energy Secretary, has told The Sunday Telegraph that energy regulators have asked for extra contingency measures to cut consumption in the event of a cold winter or more power station failures.
Energy analysts have warned that Britain is now at risk of power shortages after two ageing nuclear plants were shut down for safety reasons, and a fire closed Didcot B, a gas-fired plant in Oxfordshire.
Mr Davey will on Tuesday unveil a package of contingency measures designed to reduce pressure on the National Grid over the winter months.
Several disused power stations will be brought back into service to increase supply, he said.
There will also be measures to reduce demand after officials from Ofgem, the energy regulator, and National Grid suggested that further action could be required.
“We have demand-side contingencies. We have had them for a long time, but they wanted – quite rightly – to see if we could increase that,” Mr Davey said.
The demand-side measures, he disclosed, would include National Grid paying large companies to generate their own power in the event of shortages.
“And some companies would change their behaviour, voluntarily, and be recompensed for it. Turning down their refrigerators by a degree, or changing a shift pattern for a week so staff come in earlier. The idea is to move factory production away from peak energy demand periods,” Mr Davey added.
The package of contingency measures will be more than sufficient to cope with the coldest possible winter, further nuclear shutdowns and power station fires, Mr Davey said. “There will not be blackouts,” he insisted.
To allay concerns over the security of British energy supplies, Mr Davey highlighted a report by the US Chamber of Commerce, which says that Britain has the fourth most secure energy supplies in the world.
However, the same report also warns that Britain’s shrinking spare energy capacity “could lead to blackouts”.
During the interview, Mr Davey also disclosed that he has been encouraged by colleagues to run as a replacement for Nick Clegg as leader of the Liberal Democrats, and said that he is “thinking” about standing.
The recent hiatus in the rise of global temperatures
Carl Mears, boss of Remote Sensing Systems, purports below to explain why his own data show a temperature "hiatus" over the last 18 years. He canvasses a number of explanations, with which skeptics are well familiar, but the fact that there are many candidate theories to explain why temperatures are not rising shows that no one in fact knows what is going on. So sticking with a failed prediction amid complete uncertainty is just an act of faith. "Ad hoc" explanations (being wise after the event) are rightly scorned in real science. You have to get your predictions right if people are to accept that you know what you are talking about. And Warmist predictions are nowhere near right.
Recently, a number of articles in the mainstream press have pointed out that there appears to have been little or no change in globally averaged temperature over the last two decades. Because of this, we are getting a lot of questions along the lines of “I saw this plot on a denialist web site. Is this really your data?” While some of these reports have “cherry-picked” their end points to make their evidence seem even stronger, there is not much doubt that the rate of warming since the late 1990s is less than that predicted by most of the IPCC AR5 simulations of historical climate. This can be seen in the RSS data, as well as most other temperature datasets. For example, the figure below is a plot of the temperature anomaly (departure from normal) of the lower troposphere over the past 35 years from the RSS “Temperature Lower Troposphere” (TLT) dataset. For this plot we have averaged over almost the entire globe, from 80S to 80N, and used the entire TLT dataset, starting from 1979. (The denialists really like to fit trends starting in 1997, so that the huge 1997-98 ENSO event is at the start of their time series, resulting in a linear fit with the smallest possible slope.)
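The start-point sensitivity described above is easy to demonstrate. The sketch below uses synthetic data (not the actual RSS TLT record): a steady 0.123 K/decade trend plus a single ENSO-like warm spike in 1998. A spike near the start of the 1997 window drags the fitted slope down, while the same spike near the middle of the full record barely changes it.

```python
import numpy as np

years = np.arange(1979, 2014)                    # 35 years, as in the plot
anom = 0.0123 * (years - 1979)                   # steady 0.123 K/decade trend
anom = anom + np.where(years == 1998, 0.4, 0.0)  # ENSO-like warm spike in 1998

def trend_per_decade(start_year):
    """Least-squares trend (K/decade) fitted from start_year onward."""
    mask = years >= start_year
    slope = np.polyfit(years[mask], anom[mask], 1)[0]  # K per year
    return 10.0 * slope

full = trend_per_decade(1979)  # spike sits mid-series: slope barely changes
late = trend_per_decade(1997)  # spike sits at the start: slope is depressed
```

With these synthetic numbers the full-record trend stays close to 0.123 K/decade while the 1997-start trend drops to well under half of that, which is exactly the cherry-picking effect being described.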
[Figure: RSS TLT temperature anomaly time series]
In this figure, the thick black line is from a climate data record derived from microwave sounding satellite (MSU and AMSU ) measurements. Each of the thin light blue lines represents the temperature anomaly time series for the same atmospheric layer from one of 33 IPCC climate model simulations that I have analyzed. I have adjusted each individual time series so that its average is 0.0 for the 1979-1988 period. This has no effect on the trend of each line, but it does make it easier to see long term changes in the plot.
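The baseline adjustment Mears describes (shifting each series so its 1979-1988 mean is zero) can be sketched as follows; as he notes, subtracting a constant changes only the offset, not the fitted trend. The data here are synthetic placeholders, not model output.

```python
import numpy as np

years = np.arange(1979, 2014)
series = 14.2 + 0.0123 * (years - 1979)  # synthetic absolute temperatures (C)

def rebaseline(y, base_start=1979, base_end=1988):
    """Shift a series so its mean over the baseline years is 0.0."""
    base = (years >= base_start) & (years <= base_end)
    return y - y[base].mean()

anom = rebaseline(series)

# The shift is a constant, so the least-squares slope is unchanged:
slope_raw = np.polyfit(years, series, 1)[0]
slope_adj = np.polyfit(years, anom, 1)[0]
```

This is why the adjustment "has no effect on the trend of each line" while still aligning the 33 simulations for visual comparison.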
The dips in the simulated model temperatures in 1983 and late 1991 are due to the eruptions of El Chichón and Mt. Pinatubo. These eruptions spewed enough volcanic ash into the stratosphere to block part of the incoming sunlight and cool Earth’s surface and troposphere. The cooling can easily be seen in the measured satellite data in 1992-1993. The cooling event in 1983 happened by chance at more or less the same time as an El Niño event in 1983-84, making it harder to see. (Note that the same events warmed the stratosphere. See our TLS dataset.) The year-to-year variability of the measured data is dominated by El Niño/La Niña events, with an overall warming trend (0.123K/decade) on longer timescales.
The plot shows that the measured temperature rise is within the envelope of model predictions up until the late 2000s. After that time, observed temperatures are sometimes less than any model prediction, and are clearly different from the mainstream model behavior. This slow-down in the warming, often called the “warming hiatus”, has become a major research topic over the last several years, and a source of much controversy across the blogosphere. In this post, I offer my view on the cause of the hiatus. Some of the following discussion is distilled from a moderated debate I took part in under the auspices of the Climate Dialogue website.
Does this slow-down in the warming mean that the idea of anthropogenic global warming is no longer valid? The short answer is ‘no’. The denialists like to assume that the cause for the model/observation discrepancy is some kind of problem with the fundamental model physics, and they pooh-pooh any other sort of explanation. This leads them to conclude, very likely erroneously, that the long-term sensitivity of the climate is much less than is currently thought.
The truth is that there are lots of causes besides errors in the fundamental model physics that could lead to the model/observation discrepancy. I summarize a number of these possible causes below. Without convincing evidence of model physics flaws (and I haven’t seen any), I would say that the possible causes described below need to be investigated and ruled out before we can pin the blame on fundamental modelling errors.
Also, a philosophical comment -- often, we are predisposed to the position that a given effect is due to a single cause. Part of the reason for this is probably human nature. We like to distill complex things into simple stories or parables. The other part is that for most of the science courses we take in school, simple experiments are presented that demonstrate the fundamental ideas in the topic under study. Single causes are often the case in laboratory experiments -- these experiments are usually designed to isolate a single causative effect. In “real-world” science, such as the study of Earth’s climate, things are very unlikely to be as clear cut. Instead, each observed “effect” will be due to the combination of numerous causes. My point is that I do not expect the disagreement between models and observations over the past 15 years to be due to a single cause. It is much more likely to be due to some combination of the possible causes listed below.
The possible causes for the model/observation discrepancies can be grouped into four categories:
Measurement Errors
Errors in Model “Forcing”
Internal Variability (Random Fluctuations) in the Climate System
Errors in Fundamental Model Physics
The first 3 causes have no effect on the long-term sensitivity of the climate to increased CO2, and only some of the fundamental model physics errors (4th cause) would change the long-term sensitivity. Some of the causes described below might delay the warming, but we would end up at the same point in the future; or we might see changes in the geographical patterns of the warming signal, but the overall change would be the same. In this post, I’ll address the first three possibilities in more detail. I am not an expert in modeling, so I will leave the discussion of possible model errors to others with more expertise.
New paper shows global sea level rise has greatly decelerated since ~2002, opposite of predictions
A new discussion paper published in Ocean Science evaluates multi-mission satellite sea level records and shows that the rate of sea level rise has greatly decelerated since ~2002, consistent with prior research finding that sea level rise decelerated by 31% since 2002, and by 44% since 2004, to less than 7 inches per century. This is obviously the opposite of climate model predictions in response to a steady rise in CO2 greenhouse gas levels, but is compatible with the ongoing "pause" of global warming.
Other notable findings from the paper include:
1) The positive global sea level rise trend is almost entirely due to an apparent huge "bulge" located in the Western equatorial Pacific region [Fig 12 immediately below, and the "bulge" 3-D illustrated by StevenGoddard.wordpress.com in the 3rd figure below].
2) Conversely, all areas shown in blue have experienced a drop in altimetric sea levels [different from relative sea levels which are more dependent upon land height changes] from 1993-2010, including most of the East and West coasts of North and South America.
3) As the 2nd figure below indicates, this "bulge" is almost entirely steric sea level rise from thermal expansion, as opposed to eustatic sea level rise from melting of ice. The fact that the "bulge" is so localized in the equatorial Western Pacific points to trade winds or ocean oscillations such as the Pacific Decadal Oscillation as responsible, rather than any effect from greenhouse gases, which would cause a generalized, not highly localized, effect on ocean thermal expansion or eustatic sea level rise from melting ice.
4) The 4th figure below shows that an alternative method of determining sea level rise, using ARGO + GRACE data, finds sea levels rising at 2.31 mm/yr, about 35% less than the rate determined by the satellite altimetry methods.
5) The 5th figure below [Figure 10 of the paper] shows sea level rise has greatly decelerated since ~2002, as has been documented in prior research. This is the opposite of climate model predictions in response to a steady rise in CO2 levels, but is compatible with the ongoing "pause" of global warming.
6) Figure 7 below shows how easy it is to tamper with previously published satellite data from the European Envisat satellite, which previously showed sea levels rising at 1.59 mm/yr [very similar to what global tide gauges show], but a convenient new processing algorithm magically increases, or "up-justs", the rate of Envisat sea level rise by 86% to match the US satellite data of 2.96 mm/yr. It's deja vu all over again.
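The percentages quoted in points 4 and 6 follow directly from the rates given. A quick check, using only the numbers in the text (the ~3.55 mm/yr altimetry figure is back-calculated here from "35% less" and is not stated in the paper):

```python
# Envisat rate before vs after reprocessing (mm/yr)
envisat_old, envisat_new = 1.59, 2.96
pct_increase = 100 * (envisat_new - envisat_old) / envisat_old  # ~86%

# Altimetry rate implied by "2.31 mm/yr is about 35% less"
argo_grace = 2.31                              # mm/yr from ARGO + GRACE
implied_altimetry = argo_grace / (1 - 0.35)    # ~3.55 mm/yr
```

So the two percentage claims are at least internally consistent with the underlying rates quoted in the post.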
More HERE (See the original for links, graphics etc.)
Improved sea level record over the satellite altimetry era (1993–2010) from the Climate Change Initiative Project
M. Ablain et al.
Sea level is one of the 50 Essential Climate Variables (ECVs) listed by the Global Climate Observing System (GCOS) for climate change monitoring. In the last two decades, sea level has been routinely measured from space using satellite altimetry techniques. In order to address a number of important scientific questions, such as "Is sea level rise accelerating?", "Can we close the sea level budget?", "What are the causes of the regional and interannual variability?", "Can we already detect the anthropogenic forcing signature and separate it from the internal/natural climate variability?", and "What are the coastal impacts of sea level rise?", the accuracy of altimetry-based sea level records at global and regional scales needs to be significantly improved. For example, the global mean and regional sea level trend uncertainties should become better than 0.3 and 0.5 mm/year respectively (currently 0.6 and 1-2 mm/year). Similarly, interannual global mean sea level variations (currently uncertain to 2-3 mm) need to be monitored with better accuracy. In this paper, we present the data improvements achieved within the European Space Agency (ESA) Climate Change Initiative project on "Sea Level" during its first phase (2010-2013), using multi-mission satellite altimetry data over the 1993-2010 time span. In a first step, an improved set of sea level products was produced using a new processing system with dedicated algorithms and adapted data-processing strategies. The main improvements include: reduction of orbit errors and wet/dry atmospheric correction errors, reduction of instrumental drifts and biases, intercalibration between missions, combination of the different sea level data sets, and an improvement of the reference mean sea surface.
We also present preliminary independent validations of the SL_cci products, based on tide gauge comparisons and a sea level budget closure approach, as well as comparisons with ocean re-analyses and climate model outputs.
Ocean Sci. Discuss., 11, 2029-2071, 2014. doi:10.5194/osd-11-2029-2014
Frac sand study: Lots of scare, little science
How would you feel if you walked into a doctor's office and the doctor told you about the potential dangers of heart surgery but didn't tell you the risks can be minimized with proper precautions or about any of the benefits of the surgery? That would be a frightening experience, because we need as much information as possible to make the best decisions, and withholding vital information from those who need it most is unethical.
Unfortunately, special interest groups have published a study attempting to scare the people of Wisconsin and other parts of the Upper Midwest about mining sand used for hydraulic fracturing, commonly referred to as "frac sand," by presenting only one side of the story.
Instead of basing its conclusions on the best available scientific evidence and discussing both the costs and benefits of frac sand mining, the study relies on anecdotal evidence (which is unscientific and unreliable and can lead to cherry-picking data) to focus on costs while completely ignoring benefits.
This study attempts to portray frac sand mining as an industry running amok, operating without oversight or regulation. It also tries to paint the industry as a threat to a clean water supply and as a possible cancer risk, but it doesn't provide even a grain of real science to support these claims.
Contrary to assertions that the frac sand industry lacks proper oversight, the Wisconsin Department of Natural Resources' website states all non-metallic mining operations (including frac sand) must obtain DNR water permits to operate in the state. Additional permits are needed for water withdrawal, modifying wetlands, stormwater discharge, air pollution for construction and operation of the facilities, mine safety and many more industry practices. DNR rules also require frac sand companies to restore the land upon completing the mining process, re-establishing wildlife habitats or farm fields.
The study also raises concerns about the amount of water used to wash frac sand, leading some to fear these operations could deplete water resources. However, frac sand washing and processing was only the sixth-largest use of water in the 10 counties that reported frac sand operations, and most facilities use a closed-loop process in which nearly 90% of the water can be recycled for onsite reuse. Because most of the water is recycled, the EOG sand plant in Chippewa Falls, which uses approximately 2 million gallons a day, requires only 18,000 gallons of "make-up water" each day.
A vital part of recycling water for frac sand processing is removing the small clay particles from the water by using the flocculant polyacrylamide, a safe chemical used by most municipal wastewater treatment facilities, to get clay particles to "clump together" and settle out of the water. Perhaps in an attempt to stir up fears about water contamination and cancer outbreaks, the study states polyacrylamide can also contain acrylamide, a known neurotoxin, but it fails to provide proper context. Polyacrylamide can contain acrylamide, but only in trace amounts.
The study additionally fails to acknowledge acrylamide breaks down quickly into CO2 and ammonia. Within 14 days, 74% to 94% of acrylamide breaks down in oxygen-rich soils and 64% to 89% in oxygen-poor soils. Because horizontal groundwater flow velocities are typically on the order of centimeters per day, acrylamide does not last long in ground water. This further reduces the probability of negative health effects.
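Treating the breakdown as first-order decay is an assumption here (the article gives only the 14-day fractions), but under that assumption the quoted ranges imply half-lives of only a few days:

```python
import math

def half_life_days(fraction_degraded, days=14.0):
    """Half-life implied by first-order decay, N(t) = N0 * exp(-k*t),
    given the fraction degraded after `days` days."""
    k = -math.log(1.0 - fraction_degraded) / days
    return math.log(2.0) / k

# Oxygen-rich soils: 74% to 94% degraded in 14 days
slow = half_life_days(0.74)  # roughly 7 days
fast = half_life_days(0.94)  # roughly 3.5 days
```

Combined with groundwater velocities of centimeters per day, a half-life of a few days means the acrylamide is largely gone before it can travel any meaningful distance.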
The study also purports to have evidence of acid mine drainage, which frac sand mining does not create.
Make no mistake, everything we do has an environmental impact, and frac sand mining is no exception. But to exaggerate the costs and ignore the benefits is dishonest. Wisconsin can take reasonable precautions to develop frac sand resources in an environmentally responsible way and continue to enjoy the benefits of creating thousands of high-paying jobs throughout the state.
Unscientific studies, half-truths and missing data stand in the way of an informed discussion about frac sand in the same way a doctor does when he or she tells you the costs and none of the benefits of a procedure. Wisconsinites should seek a second opinion.
EXPOSED: How a shadowy Greenie network funded by foreign millions is making household energy bills soar
The Mail on Sunday today exposes how a ‘Green Blob’ financed by a shadowy group of hugely wealthy foreign donors is driving Britain towards economically ruinous eco targets.
The phrase the ‘Green Blob’ was coined by former Environment Secretary Owen Paterson after he was sacked from the Cabinet in July.
He was referring to a network of pro-green lobbyists working at every level of the British Establishment, who have helped shape the eco policies sending household energy bills soaring.
But investigations by this newspaper reveal the Blob is not just an abstract concept. We have found that innocuous-sounding bodies such as the Dutch National Postcode Lottery, the American William and Flora Hewlett Foundation and the Swiss Oak Foundation are channelling tens of millions of pounds each year to climate change lobbyists in Britain, including Greenpeace and Friends of the Earth.
They have publicly congratulated themselves on their ability to create green Government policy in the UK – most notably after Ed Miliband steered through aggressive CO2 reduction targets in his 2008 Climate Change Act, and announced there would be no more coal power stations.
Yet the consequences of their continuing success are certain: further eye-watering rises in energy costs for millions of Britons and an increasing risk of blackouts.
According to leading energy analyst Peter Atherton of Liberum Capital, current UK energy policies shaped by the Blob will cost between £360 billion and £400 billion to implement by 2030. He said this will see bills rise by at least a third in real terms – on top of the increases already seen over the past ten years.
This bill dwarfs the EU’s £1.7 billion demand from Britain last week.
Lobbying by the Blob helped lead to a new European Union emissions deal announced on Friday, when EU leaders including the Prime Minister agreed to triple the current pace of emissions cuts.
Following earlier deals, EU-wide emissions of CO2 are supposed to fall 20 per cent over the 30-year period 1990 to 2020.
Under the new agreement, this reduction must be doubled in just a decade, reaching ‘at least’ 40 per cent by 2030 – a goal that could only be accomplished through further massive investment in wind and nuclear energy.
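The "triple the pace" figure mentioned above follows from simple arithmetic on the two targets: 20 percentage points of cuts spread over 30 years, versus a further 20 points in the single decade from 2020 to 2030.

```python
first_cut, first_span = 20.0, 30.0  # 20 points of cuts over 1990-2020
extra_cut, extra_span = 20.0, 10.0  # a further 20 points over 2020-2030

pace_old = first_cut / first_span   # about 0.67 points per year
pace_new = extra_cut / extra_span   # 2.0 points per year
ratio = pace_new / pace_old         # 3.0: triple the current pace
```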
At the heart of the Blob is a single institution – the European Climate Foundation (ECF) – which has offices in London, Brussels, The Hague, Berlin and Warsaw.
Every year it receives about £20 million from ‘philanthropic’ foundations in America, Holland and Switzerland, and channels most of it to green campaign and lobby groups.
It refuses to disclose how much it gives to each recipient, and does not publish its accounts. But it admits that the purpose of these grants is to influence British and EU climate and energy policy across a broad front.
Many more millions are fed directly to British and European lobby groups from the same overseas foundations which also fund ECF.
In its last annual report, ECF said working towards a 2030 deal was ‘a big focus area for ECF as a whole’.
ECF managing director Tom Brookes told The Mail on Sunday he provides ‘a fact-base’ to help policy-makers make the ‘many complex decisions that are necessary to move towards a high-innovation, prosperous and low-carbon future’. He added: ‘The UK is a leader in many of these fields.’
Much more HERE
The Australian government reveals position on renewable energy target
The Abbott government is supporting a scaling back of the renewable energy target which they say will better reflect changes in demand for electricity.
Industry Minister Ian Macfarlane on Wednesday revealed the government's long-awaited position on the target, which would reduce the amount of energy produced by renewable energy projects by 2020 from 41,000 gigawatt hours to about 26,000.
The position rejects the recommendations in the review of the target headed by businessman and climate change sceptic Dick Warburton.
The review, which cost the government more than $500,000, in August recommended Australia's RET be either closed to new projects or scaled back dramatically on the basis of yearly reviews.
But the government, which has been looking to restore a bipartisan agreement with Labor, faces a battle to negotiate its position through the parliament.
Labor has signalled it will reject the proposed scale-back, which Mr Macfarlane said would constitute a so-called "real 20 per cent" of Australia's electricity production.
"It won't be a 27 per cent renewable energy target, it will be 20 per cent renewable energy target," Mr Macfarlane said.
Mr Macfarlane said the position put to Labor on Wednesday included exemptions for emissions-intensive industries, including aluminium, copper, zinc and cement.
The small-scale solar panel scheme will remain untouched and biannual reviews of the target will cease.
The target legislated in 2009 set Australia's target at 41,000 gigawatt hours, which based on electricity demand at the time would have represented 20 per cent of the electricity produced in Australia in 2020.
But in recent years, electricity demand has collapsed, meaning the 41,000 gigawatt hour target is now closer to 27 per cent.
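The percentages in the last two paragraphs can be reproduced from the figures given. The 2020 demand projections below (about 205,000 GWh originally, about 152,000 GWh now) are back-calculated from the stated 20 and 27 per cent shares; they are not quoted in the article.

```python
target_gwh = 41_000  # the legislated 2020 target

demand_2009 = target_gwh / 0.20  # ~205,000 GWh: demand projection in 2009
demand_now = target_gwh / 0.27   # ~152,000 GWh: implied current projection

share_now = 100 * target_gwh / demand_now  # 27% by construction
real_20 = 0.20 * demand_now                # ~30,400 GWh at a strict 20%
```

Note that a strict 20 per cent of the implied current demand would be roughly 30,000 GWh, somewhat above the 26,000 GWh figure reported, presumably because the government's number rests on a still lower projection of 2020 demand.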
Labor is expected to reject the proposed RET reduction as too dramatic when it meets the government for talks on Wednesday afternoon.
Earlier on Wednesday, Labor Leader Bill Shorten said the opposition had made it clear it was open to discussing the target but it had established "no-go zones".
"The government say they want a real 20 per cent, I call it a fraud 20 per cent, a fake 20 per cent. The truth of the matter is that renewable energy is part of our energy mix. It's had a great benefit for a whole lot of consumers," Mr Shorten said.
"We've seen thousands of jobs created...and we've seen billions of dollars of investment. The real damage that this government's doing in renewable energy cannot be overstated."
In the ideological tussle within the Coalition about climate policy, the position announced on Wednesday, while a compromise, represents a win for those in the cabinet, such as Environment Minister Greg Hunt, who favour greener policies.
But the renewable energy industry said the target as proposed would devastate the industry and jeopardise millions of dollars in investment.
Lane Crockett, general manager of PacificHydro, said: "What reason can there be [for this cut] other than to protect the coal industry?"
Andrew Bray, national coordinator of the Australian Wind Alliance, said the government had "learnt nothing" from the Warburton review, noting its own commissioned research pointed to electricity prices being lower over the longer term with the RET as it is.
"What the government has indicated today is that it wants to increase the massive profits of big power companies by charging everyday Australians more for their electricity," Mr Bray said.
For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are here or here or here. Email me (John Ray) here.
Preserving the graphics: Most graphics on this site are hotlinked from elsewhere. But hotlinked graphics sometimes have only a short life -- as little as a week in some cases. After that they no longer come up. From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site. See here or here