Sunday, August 24, 2014

Large conclusions drawn from just 3 years of Cryosat-2 data

Even using ICESat data extends the series to only 5 years  -- far too little to support claims of a trend.  A longer series could reveal very different earlier changes. The report below is objective enough.  It is only the spin Warmists are putting on it that is fanciful.  Note that the report below admits that the Antarctic sheet is both thickening and thinning (in different places)  -- so any trend is not even Antarctic-wide, let alone global

Elevation and elevation change of Greenland and Antarctica derived from CryoSat-2

By V. Helm, A. Humbert, and H. Miller


This study focuses on the present-day surface elevation of the Greenland and Antarctic ice sheets. Based on 3 years of CryoSat-2 data acquisition we derived new digital elevation models (DEMs) as well as elevation change maps and volume change estimates for both ice sheets. Here we present the new DEMs and their corresponding error maps. The accuracy of the derived DEMs for Greenland and Antarctica is similar to those of previous DEMs obtained by satellite-based laser and radar altimeters. Comparisons with ICESat data show that 80% of the CryoSat-2 DEMs have an uncertainty of less than 3 m ± 15 m. The surface elevation change rates between January 2011 and January 2014 are presented for both ice sheets. We compared our results to elevation change rates obtained from ICESat data covering the time period from 2003 to 2009. The comparison reveals that in West Antarctica the volume loss has increased by a factor of 3. It also shows an anomalous thickening in Dronning Maud Land, East Antarctica which represents a known large-scale accumulation event. This anomaly partly compensates for the observed increased volume loss of the Antarctic Peninsula and West Antarctica. For Greenland we find a volume loss increased by a factor of 2.5 compared to the ICESat period with large negative elevation changes concentrated at the west and southeast coasts. The combined volume change of Greenland and Antarctica for the observation period is estimated to be −503 ± 107 km3 yr−1. Greenland contributes nearly 75% to the total volume change with −375 ± 24 km3 yr−1.
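The abstract's headline figures are easy to sanity-check. The sketch below uses only the central estimates quoted above and ignores the stated uncertainties:

```python
# Quick arithmetic check on the abstract's volume-change figures (km^3/yr).
total_change = -503   # combined Greenland + Antarctica
greenland    = -375

antarctica = total_change - greenland      # implied Antarctic contribution
share = greenland / total_change * 100     # Greenland's share of the total

print(f"Antarctica: {antarctica} km^3/yr")   # -128 km^3/yr
print(f"Greenland share: {share:.1f}%")      # ~74.6%, i.e. "nearly 75%"
```

The ~74.6% result matches the paper's "nearly 75%" wording.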

The Cryosphere, 8, 1539-1559, 2014.

Australia's BOM caught with its pants down

No surprise. NASA GISS does the same.  See also here and here and here

THE Bureau of Meteorology has been accused of manipulating historic temperature records to fit a predetermined view of global warming.

Researcher Jennifer Marohasy claims the adjusted records resemble “propaganda” rather than science.

Dr Marohasy has analysed the raw data from dozens of locations across Australia and matched it against the new data used by BOM showing that temperatures were progressively warming.

In many cases, Dr Marohasy said, temperature trends had changed from slight cooling to dramatic warming over 100 years.

BOM has rejected Dr Marohasy’s claims and said the agency had used world’s best practice and a peer reviewed process to modify the physical temperature records that had been recorded at weather stations across the country.

It said data from a selection of weather stations underwent a process known as “homogenisation” to correct for anomalies. It said it was “very unlikely” that data homogenisation had affected the empirical outlook.

In a statement to The Weekend Australian BOM said the bulk of the scientific literature did not support the view that data homogenisation resulted in “diminished physical veracity in any particular climate data set’’.

Historical data was homogenised to account for a wide range of non-climate related influences such as the type of instrument used, choice of calibration or enclosure and where it was located.

“All of these elements are subject to change over a period of 100 years, and such non-climate ­related changes need to be ­accounted for in the data for ­reliable analysis and monitoring of trends,’’ BOM said.

Account is also taken of temperature recordings from nearby stations. The bureau said it took “a great deal of care with the climate record, and understands the importance of scientific integrity”.

Dr Marohasy said she had found examples where there had been no change in instrumentation or siting and no inconsistency with nearby stations but there had been a dramatic change in temperature trend towards warming after homogenisation.

She said that at Amberley in Queensland, homogenisation had resulted in a change in the temperature trend from one of cooling to dramatic warming.

She calculated homogenisation had changed a cooling trend in the minimum temperature of 1C per century at Amberley into a warming trend of 2.5C per century. This was despite there being no change in location or instrumentation.

BOM said the adjustment to the minimums at Amberley was identified through “neighbour comparisons”. It said the level of confidence was very high because of the large number of stations in the region. There were examples where homogenisation had resulted in a weaker warming trend.


More on BoM shenanigans

WHEN raging floodwaters swept through Brisbane in January 2011 they submerged a much-loved red Corvette sports car in the basement car park of a unit in the riverside suburb of St Lucia.

On the scale of the billions of dollars’ worth of damage done to the nation’s third-largest city in the man-made flood, the loss of a sports car may not seem like much.

But the loss has been the catalyst for an escalating row that raises questions about the competence and integrity of Australia’s premier weather agency, the Bureau of Meteorology, stretching well beyond the summer storms.

It goes to the heart of the climate change debate — in particular, whether computer models are better than real data and whether temperature records are being manipulated in a bid to make each year hotter than the last.

With farmer parents, researcher Jennifer Marohasy says she has always had a fascination with rainfall and drought-flood cycles. So, in a show of solidarity with her husband and his sodden Corvette, Marohasy began researching the temperature records noted in historic logs that date back through the Federation drought of the late 19th century.

Specifically, she was keen to try forecasting Brisbane floods using historical data and the latest statistical modelling techniques.

Marohasy’s research has put her in dispute with BoM over a paper she published with John Abbot at Central Queensland University in the journal Atmospheric Research concerning the best data to use for rainfall forecasting. (She is a biologist and a sceptic of the thesis that human activity is bringing about global warming.) BoM challenged the findings of the Marohasy-Abbot paper, but the international journal rejected the BoM rebuttal, which had been prepared by some of the bureau’s top scientists.

This has led to an escalating dispute over the way in which Australia’s historical temperature records are “improved” through homogenisation, which is proving more difficult to resolve. If Marohasy is right, contrary to widely published claims, last year cannot be called the hottest year on record. BoM insists it is using world’s best practice to determine temperature trend and its methods are in accordance with those of its international peers.

But in furious correspondence with BoM, Marohasy argues the computer “homogenisation” of the records is being undertaken in a way that is at odds with its original intent.

“In (George Orwell’s) Nineteen Eighty-Four Winston Smith knows that, ‘He who controls the present controls the past’. Certainly the bureau appears intent on improving the historical temperature record by changing it,” Marohasy says.

There has been correspondence between Marohasy and BoM involving federal MP Dennis Jensen and the parliamentary secretary responsible for the bureau, Simon Birmingham.

After taking up the issue for Jensen, Birmingham says he is “confident that the bureau’s methods are robust’’. On its website, BoM says it is “improving Australia’s temperature record” by carefully analysing records “to find and address spurious artefacts in the data, thus developing a consistent — or homogeneous — record of daily temperatures over the last 100 years”.

BoM says historic high temperatures are unreliable, some having been collected by thermometers housed in a beer crate on an outback veranda.

In response to questions from Inquirer, BoM says “the premise that data homogenisation results in diminished physical veracity — in any particular climate data set — is unsupported in the bulk of the scientific literature’’.

But Marohasy is not convinced.

“Repetition is a propaganda technique,’’ she wrote back to Birmingham. “The deletion of information from records, and the use of exaggeration and half-truths, are others.

“The Bureau of Meteorology uses all these techniques, while wilfully ignoring evidence that contradicts its own propaganda.’’

Marohasy has analysed the physical temperature records from more than 30 stations included in the BoM set that determines the official national temperature record.

And she remains disturbed by a pattern whereby homogenisation exaggerates, or even produces, a record of steady warming against a steady or cooling trend in the raw data.

Marohasy says the clearly stated intent of homogenisation is to correct for changes in equipment, siting, and/or other factors that conceivably can introduce non-climatic factors into the temperature record.

“The bureau, however, is applying the algorithms subjectively and without supporting metadata, in such a way that the temperature record is completely altered, despite the absence of evidence that there were any changes in siting, equipment, or any other factor which could have conceivably introduced a non-climatic discontinuity,’’ she says.

Marohasy says the “corruption” of the data was of no practical consequence to climate scientists at BoM because they do not use historical data for forecasting either rainfall or temperature — they use simulation models that attempt to recreate the climate based on assumed physical ­processes.

But she says the remodelling is “of considerable political value to them, because the remodelled data better accords with the theory of anthropogenic global warming’’.

Marohasy presented a paper on her research to the Sydney Institute earlier this year. She has since expanded the number of physical temperature records analysed and says the results have only added weight to her concerns.

At Amberley, in Queensland, temperatures have been collected at the same well-maintained site within the perimeter of the air force base since 1941.

But through the homogenisation process BoM has changed what was a cooling trend in the minimum temperature of 1.0C per century into a warming trend of 2.5C per century.

“Homogenisation has not resulted in some small change to the data set, but rather a change in the temperature trend from one of cooling to dramatic warming,’’ Marohasy says.

This has been achieved by jumping-up the minimum temperatures twice through the homogenisation process: once in about 1980 and then around 1996 to achieve a combined temperature increase of more than 1.5C. NASA’s Goddard Institute for Space Studies, based in New York, also applies a jump-up to the Amberley series in 1980, and makes other changes, so that the annual average temperature for Amberley increases from 1941 to 2012 by about 2C.

In correspondence, Marohasy was told by NASA the Amberley data was adjusted to take account of historic temperature records at nearby stations.

But these 310 “nearby” stations stretched to a radius of 974km and include Frederick Reef in the Coral Sea, Quilpie post office in southwestern Queensland and even Bourke post office in NSW.

Compared with the unhomogenised data for the nearest weather station, BoM’s jump-up for Amberley creates an increase in the official temperature trend of 0.75C per century. Temperatures at old Brisbane aero, the closest station that is also part of the national temperature network, also show a long-term cooling trend.

“Perhaps the cooling at Amberley is real,’’ Marohasy says.

“Why not consider this, particularly in the absence of real physical evidence to the contrary?”

Another example is Rutherglen, a small town in a winegrowing region of northeast Victoria, where temperatures have been measured at a research station since November 1912.

There have been no documented site moves but an automatic weather station was installed on January 29, 1998.

Temperatures measured at the Rutherglen weather station also form part of the national temperature network (ACORN-Sat), so the information from this station is homogenised before inclusion in the official record that is used to calculate temperature trends for Victoria and also Australia.

Marohasy says the unhomogenised/raw mean annual minimum temperature trend for Rutherglen for the 100-year period from January 1913 through to December last year shows a slight cooling trend of 0.35C per 100 years.

After homogenisation there is a warming trend of 1.73C per 100 years. Marohasy says this warming trend essentially was achieved by progressively dropping down the temperatures from 1973 back through to 1913. For the year of 1913 the difference between the raw temperature and the ACORN-Sat temperature is 1.8C.

BoM is adamant the purpose of homogenisation is to remove non-climatic discontinuities. But Marohasy says that because there have been no site or equipment changes at Rutherglen, yet very large adjustments have been made to the data, it is perhaps reasonable to assume the bureau has changed the record for Rutherglen because it is very different from the record for the neighbouring stations.

According to a technical manual written by Blair Trewin from BoM, changes can be made where discontinuities are apparent when the trend at a site, for example Rutherglen, is compared with up to 40 neighbouring sites.

But Marohasy says analysis of nearby sites finds temperature trends show almost no warming during the past 100 years.

Marohasy says the changes to the minimum temperatures for Rutherglen are broadly consistent with many other changes made to temperature records for eastern Australia, which make the trends consistent with the theory of anthropogenic global warming.

But these changes are not consistent with the overriding principle of homogenisation, which is that changes should only be made to correct for non-climatic factors.

In the case of Rutherglen, she says, the changes do not even appear consistent with a principle in the bureau’s own technical manual, which is that changes should be consistent with trends at neighbouring weather stations.

At Bourke, in western NSW, BoM deleted the first 40 years of data because temperatures before 1908 were apparently not recorded in a Stevenson screen, the agreed modern method.

Marohasy says this could have been easily accounted for with an accepted algorithm, which would not have changed the fact that it was obviously much hotter in the early 20th century than for any period since. Instead, the early record is deleted, and the post-1910 data homogenised.

“Rather than searching for a real physical explanation for the early 20th century cooling at Bourke since the hot years of the late 1800s, the Bureau has created a warming trend,’’ Marohasy says.

“This homogenisation, and the addition of data recorded after 1996 from the airport, means that the official record shows an overall warming trend of 0.01C per century and 2013 becomes about the hottest year on record for Bourke.’’

BOM says major adjustments at Bourke related to site moves as well as comparisons with neighbouring areas, while the Amberley and Rutherglen adjustments also were made after comparison with neighbouring stations.

And the bureau says an extensive study has found homogeneity adjustments have little impact on national trends and changes in temperature extremes.


Jennifer's personal comments on the BOM

EARLIER this year Tim Flannery said “the pause” in global warming was a myth, leading medical scientists called for stronger action on climate change, and the Australian Bureau of Meteorology declared 2013 the hottest year on record. All of this was reported without any discussion of the actual temperature data. It has been assumed that there is basically one temperature series and that it’s genuine.

But I’m hoping that after today, with both a feature (page 20) and a news piece (page 9) in The Weekend Australian, things have changed forever.

I’m hoping that next time Professor Flannery is interviewed he will be asked by journalists which data series he is relying on: the actual recorded temperatures or the homogenized remodeled series. Because as many skeptics have known for a long time, and as Graham Lloyd reports today for News Ltd, for any one site across this wide brown land of Australia, while the raw data may show a pause, or even cooling, the truncated and homogenized data often shows dramatic warming.

When I first sent Graham Lloyd some examples of the remodeling of the temperature series I think he may have been somewhat skeptical. I know he forwarded this information to the Bureau for comment, including three charts showing the homogenization of the minimum temperature series for Amberley.

Mr Lloyd is the Environment Editor for The Australian newspaper and he may have been concerned I got the numbers wrong. He sought comment and clarification from the Bureau, not just for Amberley but also for my numbers pertaining to Rutherglen and Bourke.

I understand that by way of response to Mr Lloyd, the Bureau has not disputed these calculations.

This is significant. The Bureau now admits that it changes the temperature series, quite dramatically, through the process of homogenisation.

I repeat: the Bureau has not disputed the figures. The Bureau admits that the data is remodelled.

What the Bureau has done, however, is try to justify the changes. In particular, for Amberley the Bureau is claiming to Mr Lloyd that there is very little available documentation for Amberley before 1990 and that information before this time may be “classified”: as in top secret. That’s right, there is apparently a reason for jumping-up the minimum temperatures at Amberley but the Bureau just can’t provide Mr Lloyd with the supporting metadata at this point in time.

SOURCE (See the original for charts)


Written by Dr Tim Ball, Climatologist

My first involvement with the Acid Rain scare was indirect, but it added to my awareness of the limitations of the data and of our understanding of atmospheric and ocean mechanisms.

It also heightened my awareness of the political nature of environmental science. I knew the extent of it because of my membership in the Canadian Committee on Climate Fluctuation and Man (CCCFM). It was part of the National Museum of Natural Sciences Project on Climate Change in Canada During the Past 20,000 Years.

The committee was funded jointly by the National Museum of Natural Sciences and Environment Canada. It met yearly for several years, bringing together a wide range of specialists to focus on a region, time period, or area of study. Papers were published in Syllogeus, edited by Dr C.R. Harington of the Paleobiology Division. A review of them underlines how much the IPCC sidelined progress in climatology.

My election as Chair of the CCCFM likely caused its demise. In my acceptance speech I urged people not to rush to judgment on the recent anthropogenic global warming (AGW) hypothesis. I was unaware at the time of the involvement of Environment Canada (EC) in the promotion of the hypothesis and the work of the IPCC. Gordon McBean was Assistant Deputy Minister (ADM, the second-highest bureaucrat) at Environment Canada and chaired the IPCC foundation meeting in Villach, Austria, in 1985.

Within a few months of my election, EC withdrew funding and the Museum could not sustain it alone. One of the last projects was a detailed study of the global impact of the eruption of Mount Tambora, Indonesia, in 1815. The conference proceedings were published as C.R. Harington (ed.), The Year Without a Summer? World Climate in 1816 (National Museum of Natural Sciences, Canadian Museum of Nature, Ottawa, 1992). Environment Canada’s actions were part of the suppression of people and data that continues to this day.

Dangers of Bureaucrats Doing Research

At the one annual conference held under my chairmanship, an Environment Canada researcher approached me to talk about a problem with the Acid Rain issue. His dilemma underscored my argument that bureaucratic researchers are almost automatically compromised.

He was so nervous that he wouldn’t talk about it at the museum; instead, we met at the airport coffee shop. He had been directed to prove that US coal-burning plants were causing the Acid Rain blamed for the demise of the Quebec maple syrup industry. Canadian Prime Minister Brian Mulroney was already saying publicly that they were to blame.

The problem was that his research showed the decline in maple syrup production was caused not by Acid Rain but by two natural cyclical events. The major one was a drought. The other was a period of Meridional flow (the dreaded “polar vortex”) that produced a very early spring warming, causing the trees to start leafing, followed by leaf-destroying “black” frosts. Both events cause “dieback”, that is, a loss of leaves. Trees, like all vegetation, have recovery and catch-up mechanisms that drive them toward seed production. They will grow new leaves and go through a shortened and reduced production cycle. This includes the amount of sap flowing.

His dilemma was how to tell his bosses at Environment Canada that the evidence didn’t support the Prime Minister’s accusations. He even talked of publishing under an assumed name. I pointed out that he might then be fired because he would appear to have done nothing for two years, although that is no guarantee in a bureaucracy.

The solution was obvious; he had to retain his scientific integrity and present his evidence. It was not his job to determine what happened to the results. His job was to do thorough, well-documented research. He was not paid to make political decisions. The report would go up the bureaucratic ladder until somebody holding a job for political reasons put it on a shelf. Later, a joint investigation by three US and three Canadian investigators confirmed that Acid Rain was not the cause of the decline in maple syrup production. After climate conditions changed again, yields exceeded previous records.

There is no question that Acid Rain occurred in concentrations sufficient to destroy plants. I lived for a year in Sudbury, Ontario, whose smokestack was identified as the source of 10 percent of North American Acid Rain, and saw the effects. Town leaders were proud of the fact that NASA chose the region, as the place most like the moon, for astronaut training. At that time the philosophy was ‘the solution to pollution is dilution’, so they built the smokestacks higher to disperse the sulfur further downwind. Ironically, after scrubbers were put on the stacks, reports appeared of reduced tree growth downwind, because small amounts of sulfur had acted as a fertilizer enhancing growth. This appears to support Paracelsus’ 16th-century observation that the toxicity is in the dosage.

Types of Acid Rain And Chemical Variations

Water vapor condenses onto particles called condensation nuclei (CN), most of which are clay or salt particles. The CN influence the chemical nature of the liquid water drop created. For example, salt particles change the freezing temperature so the droplet becomes super-cooled and remains liquid below the freezing point. If the nucleus is a sulfur particle, the water droplet becomes a mild sulfuric acid droplet, and that became the Acid Rain of environmental focus.

Most people don’t know that all rain is acid rain, but not because of the CN. Water droplets, an estimated one million of which go to form a medium-sized raindrop, absorb CO2 from the atmosphere. Droplets have a very high ratio of surface area to volume and absorb CO2 at a known rate. The chemical formula is CO2 + H2O ⇌ H2CO3, which results in a weak carbonic acid in chemical equilibrium.
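The resulting acidity can be estimated from standard equilibrium chemistry. This is a rough sketch, not taken from any source quoted here; it assumes textbook constants at about 25C and a CO2 partial pressure of roughly 400 ppm:

```python
import math

# Rough equilibrium estimate of the pH of "natural" acid rain from
# dissolved CO2 alone (no sulfur or nitrogen compounds).
K_H  = 3.3e-2    # Henry's law constant for CO2 in water, mol/(L*atm), ~25C
Ka1  = 4.45e-7   # first dissociation constant of carbonic acid
pCO2 = 400e-6    # assumed CO2 partial pressure, atm (~400 ppm)

c = K_H * pCO2           # dissolved CO2 (H2CO3*), mol/L
h = math.sqrt(Ka1 * c)   # [H+] from CO2 + H2O <=> H2CO3 <=> H+ + HCO3-
pH = -math.log10(h)
print(f"pH of CO2-equilibrated rainwater: {pH:.2f}")  # ~5.6, mildly acidic
```

A pH of about 5.6 is the commonly cited value for unpolluted rain, which is the point being made: all rain is slightly acidic even without industrial sulfur.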

How much water is there in the atmosphere and how much does it vary regionally and over time? Two comments give an idea of the lack of accurate information.

One estimate of the volume of water in the atmosphere at any one time is about 3,100 cubic miles (mi3) or 12,900 cubic kilometers (km3).

At any moment, the atmosphere contains an astounding 37.5 million billion gallons of water, in the invisible vapor phase. This is enough water to cover the entire surface of the Earth (land and ocean) with one inch of rain.
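The two quoted figures can at least be checked for internal consistency, assuming the standard mile-to-kilometre conversion and a total Earth surface area of about 5.1 x 10^8 km^2:

```python
# Unit-conversion check of the atmospheric-water estimates quoted above.
MI3_TO_KM3 = 1.609344 ** 3      # cubic miles to cubic kilometres
EARTH_AREA_KM2 = 5.10e8         # Earth's total surface area (land + ocean)

km3 = 3100 * MI3_TO_KM3                  # first estimate, in km^3
depth_mm = km3 / EARTH_AREA_KM2 * 1e6    # uniform depth over the globe, mm
inches = depth_mm / 25.4

print(f"3,100 mi^3 = {km3:,.0f} km^3")        # ~12,900 km^3, as quoted
print(f"spread over Earth: {inches:.2f} in")  # ~1 inch of rain
```

So the 3,100 mi^3 figure does correspond to roughly one inch of rain over the whole planet; the question the text raises is how well that single number is actually known.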

Combine these with the extremely poor precipitation data for the entire globe and you have another example of climate science being a modern equivalent of counting angels on the head of a pin. One person claims:

…the approximate rate of washout of carbon dioxide from the Earth’s atmosphere via rainwater can be determined from the known ocean evaporation rate and from the known solubility of CO2 in distilled water as a function of temperature and CO2 partial pressure.

Fine, but what is the figure? I understand estimates of evaporation are very crude, if not essentially meaningless. In the early atmosphere/ocean computer models they simply assumed a “swamp” approach of 100 percent evaporation. The 2007 Intergovernmental Panel on Climate Change (IPCC) Report says,

The spatial resolution of the coupled ocean-atmosphere models used in the IPCC assessment is generally not high enough to resolve tropical cyclones, and especially to simulate their intensity.

Carol Anne Clayson of Woods Hole explains the difficulties.

The air-sea interface “is typically the most turbulent part of the ocean,” Clayson said. A dizzying mix of interrelated factors—waves, winds, water temperature and salinity, bubbles and spray, solar radiation, and others—each adds a layer of complexity that occurs over wide ranges of time (seconds to seasons) and space (millimeters to miles). [See illustration above.]

“Getting observations of what’s going on at the air-sea interface is not trivial, especially in extreme conditions such as high winds,” Clayson said. “It’s also difficult to simulate the air-sea interactions, especially in extreme conditions, in laboratory experiments in a wave tank. Current computers don’t have enough computational capacity to incorporate all the processes occurring, on all the spatial and temporal scales involved, to produce realistic simulations.”

So we don’t know, and can’t do anything with it. The IPCC knows the limits, but also knows that few people read or understand the science reports.

Unfortunately, the total surface heat and water fluxes (see Supplementary Material, Figure S8.14) are not well observed.

For models to simulate accurately the seasonally varying pattern of precipitation, they must correctly simulate a number of processes (e.g., evapotranspiration, condensation, transport) that are difficult to evaluate at a global scale.

How much CO2 is absorbed in the atmosphere by moisture? How much does it vary spatially with changing temperature of the water droplets and raindrops? How much does it vary with changing air temperature and saturation vapor pressure? How much does it vary with wind speed? How do the quantities relate to human additions of CO2? All we can do is ask questions to help the public realize the inadequacy of the data and lack of understanding of the mechanisms behind IPCC claims.


£11m for the wind farm that was not working

A wind farm has been paid £11 million not to produce electricity, The Telegraph can disclose.

An analysis shows that 10 wind farms have each been paid more than £3 million over the past three years to shut down their turbines.

The sums being handed out to renewable energy companies are up to double what they would have received for producing electricity.

The highest payment of £11.1 million was paid over three years to ScottishPower, a Spanish-owned firm, which operates the Whitelee wind farm, around 10 miles from Glasgow.

The disclosures prompted claims that the Government has failed to “rein in” the amounts being demanded by wind farm owners to turn off their turbines to stop the electricity network becoming overloaded.

The money, which is ultimately added to household bills, is being paid to a series of firms, including a handful owned by the Swedish, Norwegian and Danish governments.

National Grid is responsible for managing the flow of electricity and hands producers “constraint payments” to shut down when there is a risk of the grid overloading because too much is being generated.

Each owner asks for a particular sum for each megawatt hour of energy its turbines would have produced had they been switched on, and National Grid chooses whether to accept their assessment.

The industry says the payments cover forgone subsidies, the wear and tear of shutting turbines down and administration costs.

In September 2011 MPs demanded action by Ofgem, the energy regulator, after The Telegraph disclosed that the Norwegian owners of the Crystal Rig wind farm in the Scottish Borders had been paid £1.2 million not to produce electricity for eight and a half hours.

Ofgem said that since then it has brought down the cost of constraint payments, but critics said firms were still making “excessive” profits.

The Renewable Energy Foundation think tank, which compiled the latest figures from official data, warned that the companies had either made “token reductions” to the amounts or “simply ignored” warnings that they should be brought down.

The total payments since 2011 have exceeded £70 million. Many have been made to wind farms in Scotland, where a large proportion of the UK’s turbines have been built and there are “bottlenecks” of energy leaving the area during high winds.

Among the payments was a total of £7.6 million to RWE Innogy, a German-owned firm, for shutting down the Farr wind farm, 10 miles from Inverness, on dozens of occasions. The sum includes £357,353 for shutting down the wind farm over six days this month.

Fred Olsen has received £3.2 million for shutting down its 85-turbine Crystal Rig wind farm over the past three years, including £159,987 earlier this month amid the high winds caused by Hurricane Bertha.

It asked for £114 per megawatt hour of energy the turbines would have produced had they been switched on, which National Grid accepted.

However, the main loss from turning off its turbines was a forgone consumer subsidy amounting to around £50 per megawatt hour. Therefore, it was still receiving £64 per megawatt hour more than if it had been generating electricity, the foundation said.
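The foundation's arithmetic here is simple enough to check, using the per-megawatt-hour figures given in the article:

```python
# Constraint-payment arithmetic for Crystal Rig, per the article's figures
# (GBP per megawatt hour of forgone output).
bid_per_mwh     = 114   # what the operator asked, and National Grid accepted
subsidy_forgone = 50    # approximate consumer subsidy lost by not generating

excess = bid_per_mwh - subsidy_forgone   # margin over simply generating
print(f"excess over generating: {excess} GBP/MWh")  # 64 GBP/MWh
```

In other words, on these figures each megawatt hour of curtailed output paid £64 more than a generated one, which is the basis of the "excessive profits" claim.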

Fred Olsen declined to provide a breakdown of the costs involved in shutting down the wind farm.

Dr Lee Moroney, research director of REF, said: “It is now becoming crystal clear that the full cost of constraints is disturbingly high. A more robust position from both government and the regulator, Ofgem, would go a long way to reining in wind power’s very high constraint prices.”

Wind farm owners insisted that the number of times wind farms were paid to shut down was “very small” compared with the number of times conventional power stations were paid to reduce their output.

Zoltan Zavody, from RenewableUK, the industry body, said that last year the total cost of wind constraint payments was around 65p on the average annual domestic electricity bill.

Ofgem said that since it was given powers in 2012 to prevent firms getting an “excessive benefit” from the payments, the prices paid have fallen from an average of £197 per MWh to £83 per MWh.

A spokesman for National Grid said: “For the moment, constraint payments are the most economically efficient way to balance the system while we await improvements to the electricity network such as the Western Link — a £1 billion sub-sea link that will bring renewable energy from Scotland to the rest of the UK.”


Same EPA Office Still Beset with Heinous Bathroom Hijinks

Back in June, the Environmental Protection Agency’s Denver regional office called in a workplace violence expert to help stop its employees from defecating in the hallways. Two months later, it seems they may need to request a refund. The problem has gotten out of control.

Recently-released memos and emails from the same office show the problem is much worse than originally thought. The documents reveal the “beyond gross” conditions of the office bathrooms and the apparent fear that has gripped some employees:

"One of my employees refused to come into the office today because she is terrified after hearing a story on the train home last night." ...

"A male supervisor ... told her that management knows that it is a female on the [redacted] floor who has been wiping feces and menstrual blood on the walls (I'm really sorry, this is beyond gross) and that they are worried that her behavior is escalating."

One email chain disclosed “a list of at least nine suspected restroom incidents.” These events lasted throughout the summer. The situation escalated to such a degree that EPA officials called for Homeland Security officers to patrol the hallways and other trouble spots.

The first line of the EPA’s mission statement reads:

“EPA's purpose is to ensure that all Americans are protected from significant risks to human health and the environment where they live, learn, and work.”

The appalling conditions at this EPA office are both outrageously funny and highly unsafe for those who work there. Seeing as the EPA has proposed an avalanche of new regulations this year controlling every part of ordinary Americans’ lives, perhaps it should physically clean up its own house first.



For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

