Saturday, June 12, 2010

Surface temperature measurements: how reliable?

Transcript of an interview with Anthony Watts on the ABC, Australia's main public broadcaster. The interviewer is Michael Duffy, a former Labor Party politician, but a rational one.

Michael Duffy: First up, climate sceptic Anthony Watts. Anthony is a weatherman with KPAY AM radio in California; he's been a weatherman for 25 years. He also runs a very popular climate change sceptic blog, Watts Up With That, and in 2007 he founded This is a very interesting website: it invites readers to visit America's official climate monitoring stations and describe their physical location, often with the help of photos that are then posted on the website.

The results can be quite surprising; they show stations next to air conditioner outlets, for example, or in the middle of asphalt-coated car parks. Anthony Watts is about to begin a speaking tour of Australia. I caught up with him at his home in California late last week.

Your blog is hugely popular, I think it's the most visited climate site in the world, isn't it?

Anthony Watts: It is. It caught me quite by surprise, but it now regularly exceeds two million visits per month, and we get visitors from all over the world. When we compare the traffic to other climate sites, surprisingly it is the largest climate-related informational site in the world.

Michael Duffy: Can you tell us a bit about the work that you've done on America's network of climate monitoring stations? I know this is very important and we might need to take our listeners through it step by step. First of all, can you tell us something about the size of the network?

Anthony Watts: The size of the network is 1,221 different stations around the US, so it's quite large. It's taken us three years now to get over 1,000 stations surveyed. What we discovered was that there was a very simple rule that the Weather Service had put in place a long time ago called the 100-foot rule which basically said that you're to keep the weather station measurement instruments 100 feet away from other biasing influences such as buildings, asphalt, trees, structures, heat sources, whatever. And in our survey we discovered that 89.7%, almost 90% of all the stations that we surveyed, over 1,000 of them, didn't meet the government's own criteria of 100 feet.

Michael Duffy: What does one of these stations look like?

Anthony Watts: Typically a station looks like one of three kinds, depending on where it's at. The old traditional station is what's called a Stevenson screen, it looks like a slatted wooden box on stilts and it has a rain gauge next to it, and inside the wooden box they have an old style mercury thermometer that records the maximum and minimum temperatures for the day.

The newer electronic station looks like what some people describe as a beehive on a post, like a series of stacked dinner plates that are sitting on top of a metal pole about four and a half feet high. And then it has a cable that runs into the structure where the observer's office or their domicile or home is and they read the temperature remotely there. That caused a problem with a lot of the placements because the Weather Service personnel when they were converting these in the '80s and '90s weren't given any specific construction tools other than a shovel and a pickaxe and so oftentimes they could not get the trenching where they had to lay the cable past things like sidewalks or roadways out to where the old weather station used to be, which might have been further away from the building. So we've found a trend towards these new electronic thermometers moving closer to buildings and closer to heat sources.

Michael Duffy: You talked earlier about the 100-foot rule, why is that important, why is the siting of each of these stations significant?

Anthony Watts: In response to the problems that they recognised, the Weather Service and the National Climatic Data Centre created a new network, started in 2002, called the Climate Reference Network, where they apply an even more stringent set of exposure rules, much more stringent and much more detailed than the 100-foot rule. They rated the stations by categories and by quality.

The reason for keeping things away from the thermometer is that if you start building things up around it...let's say the thermometer is originally in a grass field, and around that field you get development: buildings go up, roadways get put in, sidewalks get put in and so forth. All of a sudden you have things close to the thermometer that retain heat overnight, such as asphalt. As anyone can tell you, if you have a very hot day and then at midnight go out to the middle of a football field and measure the temperature on the grass, then walk over to the football field's parking lot and measure the temperature there, you're going to see a significant difference.

And so what happens is that this overnight release of heat from things like asphalt, concrete and buildings biases those night-time temperatures upwards. And when you average the temperatures, the high and the low for the day, that shifts the average temperature higher.
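The arithmetic behind this claim can be sketched in a few lines. This is a minimal illustration with made-up numbers (the function name and temperature values are assumptions, not figures from the interview): because the daily mean is conventionally computed as the average of the day's high and low, any warm bias in the nighttime low passes half of itself straight into the mean, even if the daytime high is unaffected.

```python
def daily_mean(t_max, t_min):
    """Conventional daily mean temperature: average of the high and low."""
    return (t_max + t_min) / 2.0

# Hypothetical day: 30 C high, 15 C low, thermometer over grass.
unbiased = daily_mean(30.0, 15.0)      # 22.5 C

# Nearby asphalt releases stored heat overnight, raising the low by 2 C.
# The high is unchanged, yet the daily mean rises by half the bias (1 C).
biased = daily_mean(30.0, 15.0 + 2.0)  # 23.5 C

print(unbiased, biased, biased - unbiased)
```

If such a siting bias grows over time (as development encroaches on a station), it would show up not just in individual readings but in the computed temperature trend, which is the point at issue in the interview.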

Michael Duffy: Checking that large number of stations must have been quite an effort, how did you go about it?

Anthony Watts: I employed social networking. I'd never done a project like this before, and so we took a bit of a gamble. I worked with Dr Roger Pielke at the University of Colorado to get the project initially set up, and he supervised the method and the way we were gathering data, and then we used blogs and other types of social networking environments to advertise the project and created a series of simple steps that people could follow to locate the station, photograph and document the station, get a GPS reading and then submit that information to our central website where it could be posted, checked for quality control and then evaluated for the station rating tag.

Michael Duffy: Have you visited many of the stations yourself?

Anthony Watts: I've done a large number of them. I've done I would say now about 180 different stations, in California, in Nevada (I did most of those stations), in Texas, Oklahoma and Southern Kansas and parts of Arkansas I did a number of stations, and I did that because we had a huge gap, a missing gap of stations there that needed to be filled in. I also did some in Idaho and in Oregon for the same reason. So yes, I've done about 180 stations, and the experience that I had parallels what the rest of the observers had and that is the vast majority of them don't follow the basic simple exposure rules set down by the Weather Service, the 100-foot rule.

Michael Duffy: You must have driven a lot of miles.

Anthony Watts: I would say I've logged 8,000 miles driving to survey stations over the past three years.

Michael Duffy: It's quite an effort, and I should say to our listeners that you've got some very striking photographs on your website, and we'll give a link to that website. Can I ask you then, to summarise, what have you found having visited so many of the stations, how well are most of them sited?

Anthony Watts: The project summary is basically this: the majority of stations are out of compliance with the Weather Service's own siting rules, and while that non-compliance itself is clear and no-one denies it, the question then becomes how it has affected the temperature record. Dr Pielke, his research group and I are now in the process of finishing a paper for submission to a peer-reviewed scientific journal that illustrates how the siting differences we found have affected the US temperature record. And I can say with certainty that our findings show that there are differences in siting that cause differences in temperatures, not only in the high and low measurements but also in the trend calculation.

So we believe that the United States temperature record is biased by this problem, and that the problem also extends worldwide. We have found similar kinds of problems throughout the world. For example, in Rome the airport there has a similar problem. In Sydney there's a weather station downtown that has similar kinds of problems. Baltimore had a station closed because it had been giving erroneously high readings and the Weather Service recognised it. So the problem is real and the problem has contributed to a change in the data. Right now we're just finishing up calculations of the magnitude.

Michael Duffy: In which direction does the bias lie? Are you suggesting that the temperature has not got as hot as the American official historical record suggests?

Anthony Watts: That's correct. It's an interesting situation. The early arguments against this project said that all of these different biases are going to cancel themselves out and there would be cool biases as well as warm biases, but we discovered that that wasn't the case. The vast majority of them are warm biases, and even such things as people thinking a tree might in fact keep the temperature cooler doesn't really end up that way.

In fact if you have a thermometer underneath a tree, while it might suppress the daytime high a little bit because it suppresses the amount of direct sunlight, it also elevates the night-time low in a much greater fashion, particularly in the summer because the leaves of the tree reflect the infrared radiation back down towards the ground. What we've discovered is that this effect is fairly pronounced.

In fact, after we started going around looking at different stations and identifying these problems, the National Weather Service was following us and actually closing stations in our wake. They closed the Marysville station, which was originally the one that tipped me off to this problem, and they also closed one in Telluride, Colorado, that was right under a tree; they closed it specifically for that reason because they realised the night-time temperatures were no longer accurate.

Michael Duffy: Can you tell us anything at all about the scope of the bias that you're presently calculating? Do you think it might actually pretty much negate any suggestion there's been an increase in temperature?

Anthony Watts: It does not negate it completely, it is a contributor. First of all I want to make it clear that there is an effect from carbon dioxide in our atmosphere, I'm not disputing that. However, what we are disputing is that in the surface temperature records of the US and the world, the effects of urbanisation, the poor siting and a combination of those two can affect the temperature record in such a way that it biases the temperature record upwards.

Michael Duffy: It's the case, isn't it, that the National Oceanic and Atmospheric Administration has put out a paper referring to other papers by Matthew Menne and Thomas Peterson, and they claim that people have already done the sort of calculation you're now doing and come up with a result that in fact these poor sitings have not affected the trend. What would your response to that be?

Anthony Watts: I think before I respond to what their results were, I should tell you a little bit about their methodology. They borrowed (and I use that term loosely) some of my early data that I had published up to the website to help my volunteers locate stations. We had 43% of the network surveyed at that point, and I had never published any other data beyond that, and I advised them when they started doing this work that that data that they used had not been quality controlled yet, it was there just for the purposes of locating stations, and the data contained in it hadn't been quality checked, and it was far from complete. It had biases in it related to the spatial representation in the US, those holes that I tried to fill in, for example, in the middle of the country, in rural areas in the middle of Texas and Oklahoma and Idaho, away from cities. That data didn't have those things. And so what they ended up with was a set of data that was mostly urban, mostly around cities, not quality controlled and not complete, and they used that data because they were so keen on discrediting our work that they rushed to get that out.

And I made complaints with the journal saying that the use of my data to publish a paper that I hadn't even finished yet was wrong and it violated professional standards, and they went ahead anyway and did it. So I think that their methodology speaks to the credibility of the results.

Michael Duffy: And do you have any idea when your forthcoming paper will appear?

Anthony Watts: We are very close to finishing the final copy on it, and we have had two independent review teams do statistical analysis on it. I did not do any statistical analysis myself; my job was the data gathering and the quality control process. So we are very close to finishing that, literally within days. The question will be how long it will take to go through the journal and peer review process.

Some papers we have seen from the sceptical side of science have taken as long as 18 months to get processed and go through that whole chain of review. We hope it will be sooner than that. We'll be submitting it to the same journal for review that Mr Menne's and Mr Peterson's papers went to. Hopefully they will see that as a value to get a counterpoint view and move it through as quickly as their paper got reviewed and published which was on the order of about five to six months. So we hope that we'll be able to get the same kind of expedience in our review.

Michael Duffy: We'll keep an eye on that and look forward to covering it when it comes out. Anthony Watts, thanks for joining us today, and good luck when you come to Australia.

Anthony Watts: I'm looking forward to it.


Greenies building a fallback position for the demise of belief in global warming

World governments are meeting this week to try to set up a new international body that would put the global destruction of the natural world on an equal footing with the threat of climate change.

The proposed new organisation would be modelled on the Intergovernmental Panel for Climate Change (IPCC), which was set up 22 years ago. Since then, it has propelled global warming and climate change to the top of the political and economic agenda.

The meeting, at Busan in South Korea, follows growing evidence in the last few years about the huge rate of destruction of species and the ecosystem services they provide for humans – from regulating local weather and fertilising soil to providing a rich gene pool for medical researchers.

Another major report this summer, commissioned by the United Nations, is expected to say that the economic benefits of policies to protect and restore biodiversity are worth 10 to 100 times the costs.

"If the true value of ecosystem services – economic, social and spiritual – were factored into decision-making, wetlands, forests and reefs would be viewed and treated very differently," said French ecology secretary, Chantal Jouanno, and campaigner Janet Ranganathan in an article for the Guardian.

"How to ensure cross-governmental participation and buy-in is therefore the key question for countries gathering at Busan.

"The future health of the natural world, and humanity's wellbeing, may depend on it."

The proposed "IPCC for nature" could provide regular, independent reports on the state of global and regional biodiversity – reflecting the IPCC's five-yearly assessments of the state of climate science, forecasts for impacts and advice about how to tackle the problem.

Perhaps more important would be the symbolic significance of an organisation which sent out a message that governments and global organisations were finally taking the biodiversity crisis as seriously as they have climate change, say supporters.

"Climate change may have captured public attention, but the global collapse of ecosystems and loss of biodiversity is equally threatening to human wellbeing," said Ranganathan, a vice-president of the World Resources Institute.

"The IPCC helped give climate change a global profile. The time has come for an IPCC for nature."

The creation of the body, provisionally named the Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES), was first formally proposed last year.

This week delegates from 97 governments and 50 organisations are meeting for what could be the official go-ahead for the new body.

More HERE. (See the original for links)

Biodiversity scare just as poorly founded as global warming

Commentary on the report above

Shock! The UN is using protection of the natural world as a reason to make massive changes to the global economy? This sounds familiar, which I’m sure is why Morano posted it. Whenever the UN puts out a report that involves the world spending a lot of money, I get suspicious, so I decided to take a look at the interim report (the final isn’t going to be published until later this year). Here is the report.

I started at Chapter 1. On the second page of Chapter 1 (page 12 on the pdf) there is a short list of items showing how the earth has lost its biodiversity:
However, the levels of many of the benefits we derive from the environment have plunged over the past 50 years as biodiversity has fallen dramatically across the globe. Here are some examples:

• In the last 300 years, the global forest area has shrunk by approximately 40%. Forests have completely disappeared in 25 countries, and another 29 countries have lost more than 90% of their forest cover. The decline continues (FAO 2001; 2006).

• Since 1900, the world has lost about 50% of its wetlands. While much of this occurred in northern countries during the first 50 years of the 20th century, there has been increasing pressure since the 1950s for conversion of tropical and sub-tropical wetlands to alternative land use (Moser et al. 1996).

• Some 30% of coral reefs – which frequently have even higher levels of biodiversity than tropical forests – have been seriously damaged through fishing, pollution, disease and coral bleaching (Wilkinson 2004).

• In the past two decades, 35% of mangroves have disappeared. Some countries have lost up to 80% through conversion for aquaculture, overexploitation and storms (Millennium Ecosystem Assessment 2005a).

• The human-caused (anthropogenic) rate of species extinction is estimated to be 1,000 times more rapid than the “natural” rate of extinction typical of Earth’s long-term history (Millennium Ecosystem Assessment 2005b).

If you read this list you can see why we need to take urgent action. Forests have completely disappeared in 25 countries, and another 29 have lost more than 90% of their forest cover. Half of the world's wetlands have gone in only a century. Species are going extinct 1,000 times more quickly because of humans. This is frightening.

This also sounds familiar. Making startling claims about how much damage humans are doing to our planet is nothing new. But just because something is startling doesn’t mean it isn’t true, and these claims have citations, so let’s look at them.

The source for the claim about the 30% reduction of coral reefs isn't peer-reviewed, but the claim at least matches the source.

The source for the claim about mangroves isn't peer-reviewed either, although that source references a Science article, and the claim does match the source. So far, two of these five claims at least match their sources.

However, the rest are either loose estimates or patently false. Not only that, but none of the references for the entire first chapter of the TEEB report are peer-reviewed. They are nearly all (UN) government reports or environmental institute reports. Not only does the report rely entirely on non-peer-reviewed material, but its claims don't even match their cited sources. Let's start with the first claim:
“In the last 300 years, the global forest area has shrunk by approximately 40%. Forests have completely disappeared in 25 countries, and another 29 countries have lost more than 90% of their forest cover. The decline continues (FAO 2001; 2006).”

FAO 2001 and 2006 are referenced as:
FAO – Food and Agriculture Organization of the United Nations (2001) Global Forest Resources Assessment 2000. [Found here]

FAO – Food and Agriculture Organization of the United Nations (2006) Global Forest Resources Assessment 2005. [Found here]
None of these claims appears in the FAO reports. In fact, one of them is roundly contradicted by its own source. They claim that “Forests have completely disappeared in 25 countries”, yet the FAO report says (page 14 of the 2005 report):
“Seven countries or areas have no forest at all, and an additional 57 have forest on less than 10 percent of their total land area.”

This is repeated and explored in more depth in the report, but the numbers are the same: only 7 countries are without forests, not 25. The other claims are not in the report at all; it doesn't discuss forest loss before the 1940s, when countries started to report the state of their forests. Also, there is no mention at all of “another 29 countries have lost more than 90% of their forest cover”. So where did these claims come from?

Another UN document. Surprised? This time it is the Millennium Ecosystem Assessment, Chapter 21, Forest and Woodland Systems. Here is part of the first claim in the ‘Main Messages’ section at the beginning of the document:
In the last three centuries, global forest area has been reduced by approximately 40%, with three quarters of this loss occurring during the last two centuries. Forests have completely disappeared in 25 countries, and another 29 countries have lost more than 90% of their forest cover.

The TEEB claim repeats this practically verbatim. They clearly cited the wrong source.

The claim itself is suspect. The first part, about 40% reduction, appears here (pg. 588):
From today’s perspective, however, preagricultural impacts on overall forest cover appear to have been slight. Since that time, the planet has lost about 40% of its original forest (high certainty), and the remaining forests have suffered varying degrees of fragmentation and degradation (Bryant et al. 1997; Matthews et al. 2000; Ball 2001; Wade et al. 2003). Most of this loss has occurred during the industrial age, particularly during the last two centuries, and in some cases much more recently. Some analyses have yielded substantially smaller estimates. Richards (1990), for example, estimates global loss of forests to have been only about 20%.

Just reading this introduces some uncertainty: they admit that some research indicates the loss has been only about 20%. Also, all of those references (except Wade et al. 2003) were produced by environmental groups. But the real deception is in the statistic itself. The implication of including it is that this loss of forest is bad, but clearly this isn't the case, as the study itself admits in the very next sentence:
Much of the progress of human civilization has been made possible by the conversion of some forest areas to other uses, particularly for agricultural expansion.

Even if the 40% statistic is accurate, it is hardly a cause for concern in and of itself. It reflects mankind’s progress to this point, to be able to tame the outdoors and provide ourselves with food.

The second half of the claim, “Forests have completely disappeared in 25 countries, and another 29 countries have lost more than 90% of their forest cover”, is not mentioned in the report at all. If you find it in there, please let me know. As I mentioned before, it is contradicted by their cited source: the FAO report says only 7 countries have no forest, and it makes no mention of the 90% claim.

I’ll address the other two errors in another post, this one has gotten quite lengthy.

I’m uncertain why, but UN reports seem to have difficulty correctly citing their claims. It doesn't seem as though using one UN report is any better than using another (FAO paper versus Millennium Assessment), so why can't they keep their citations straight? Also, the reliance on other UN reports casts serious doubt on the report itself. Of the 16 references for Chapter 1, 7 are from UN reports (along with 4 news articles and 5 reports from environmental groups). I don't know what the full report will look like this summer, but just the very first chapter of this report is pretty pathetic.


A 35-Year History of Caribbean Coral Reefs

Discussing: Schutte, V.G.W., Selig, E.R. and Bruno, J.F. 2010. Regional spatio-temporal trends in Caribbean coral reef benthic communities. Marine Ecology Progress Series 402: 115-122.


Climate alarmists are quick to contend that earth's coral reefs are headed to hell in a handbasket, as it were, with Pelejero et al. (2010) arguing that the oceanic changes we are facing today, in pCO2 and in pH, "are happening ~100-times faster than during glacial-interglacial transitions," and that "the average surface pH levels that oceans have reached today are already more extreme than those experienced by the oceans during the glacial-interglacial changes and beyond, probably being more extreme than at any time during the last 20 million years."

What was done

In a study designed to determine regional-scale trends in coral cover on Caribbean reefs over the last 35 years in each of seven sub-regions -- which effort could logically be expected to shed light on the impacts of the highly-hyped oceanic changes lamented by Pelejero et al. -- Schutte et al., as they describe it, "analyzed the spatio-temporal trends of benthic coral reef communities in the Caribbean using quantitative data from 3,777 coral cover surveys of 1,962 reefs from 1971-2006."

What was learned

Schutte et al. determined that from 1971 to 1980, annual Caribbean-wide coral cover averages were highest and without trend, with all but two values falling between 30 and 40%. Then came the largest one-year decline in coral cover of the entire record -- a precipitous drop from about 37% to 12% between 1980 and 1981 that corresponded in time, in their words, "with the beginning of the Caribbean-wide Acropora spp. white band disease outbreak," after which (from 1982 to 2006) they note that "coral cover has been relatively stable," with values ranging from about 15% to 22%.

What it means

Clearly, the temporal history of Caribbean coral cover change does not bear any resemblance to the gradual and continuous decline that could have been expected from the concomitant increase in oceanic pCO2 and decrease in pH. Indeed, after suffering the sharp one-year decline caused by the white band disease outbreak, coral cover once again stabilized, which phenomenon, in the words of Schutte et al., "could be interpreted as relatively good news" -- which it truly is -- although they state that this pattern "could also be a temporary plateau preceding a potential collapse in coral cover." Then, again, we could just as easily say it could also be a temporary plateau preceding a potential increase in coral cover. (Isn't speculation wonderful?)


Oil fuels better lives

by Jeff Jacoby

AS THE DEEPWATER HORIZON SPILL continues to foul the Gulf of Mexico, pundits and policymakers everywhere are once again reaching for the A-word.

The BP disaster, proclaims Washington eminence David Gergen, is "a wake-up call to end our addiction to oil."

Without "a real climate bill," warn the editors of The Washington Post, "America might be addicted to oil a lot longer than it needs to be."

We must "begin to wean ourselves from our addiction to oil," intones Senator John Kerry on ABC, while syndicated columnist Thomas Friedman lambastes "the powerful lobbies and vested interests that want to keep us addicted to oil."

To be sure, this isn't a new trope. Barack Obama liked to say during his presidential campaign that we are bankrolling "both sides of the war on terror" through our "addiction to oil." George W. Bush, a onetime oilman, memorably announced in his 2006 State of the Union address that "America is addicted to oil." According to Nexis, the media database, the metaphor dates back at least as far as 1974, when psychiatrist Thomas Szasz wrote in the New York Times that "oil addiction is equivalent to drug addiction."

But it's not.

The explosion of BP's oil rig in the Gulf has been a calamity in so many ways, above all the loss of 11 human beings. With hundreds of thousands of gallons of crude oil gushing daily from the crippled wellhead, the environmental impacts have been excruciating. BP is responsible for a dreadful mess, one that will take years and many millions of dollars to clean up.

Awful as the catastrophe has been, however, life without oil would be far, far worse.

Americans consume oil not because they are "addicted" to it, but because it enriches their lives, making possible prosperity, comfort, and mobility that would have been all but unimaginable just a few generations ago. The life of a heroin junkie is pitiful, desperate, and unproductive; his addiction undermines his health and overpowers his self-control. Almost by definition, an addiction is something one is healthier without. But oil-based energy improves human health and reduces poverty -- it makes life longer, safer, and better. Addictions debase life. Oil improves and expands it.

"Oil may be the single most flexible substance ever discovered," writes the Manhattan Institute's Robert Bryce in Power Hungry, a new book on the myths of "green" energy. "More than any other substance, oil helped to shrink the world. Indeed, thanks to its high energy density, oil is a nearly perfect fuel for use in all types of vehicles, from boats and planes to cars and motorcycles. Whether measured by weight or by volume, refined oil products provide more energy than practically any other commonly available substance, and they provide it in a form that's easy to handle, relatively cheap, and relatively clean." If oil didn't exist, Bryce quips, we'd have to invent it.

Of course there are problems created by oil, as the Deepwater Horizon calamity so heartbreakingly demonstrates. But most things of great value come with downsides. There are 40,000 traffic fatalities in the United States each year, but no rational person suggests doing away with cars, trucks, and highways. Airplanes sometimes crash and boats sometimes sink, but air and sea travel are not derided as "addictions" we need to break. Iatrogenic deaths due to hospital infections, medication errors, or unnecessary surgery number in the scores of thousands annually, but who would recommend an end to modern medical care?

Someday there may be an energy source that is as abundant, efficient, clean, and economically viable as oil. But nothing available today fits that bill -- certainly not biofuels, wind farms, or solar power. Besides, it isn't only energy products -- gasoline, kerosene, diesel fuel, propane -- that we get from petroleum. Crude oil refining also makes possible plastics, synthetic fibers, lubricants, waxes, asphalt. "Other products made from petroleum," notes the US Energy Information Administration, "include ink, crayons, bubble gum, dishwashing liquids, deodorant, eyeglasses, CDs and DVDs, tires, ammonia, [and] heart valves." The list could be expanded almost endlessly.

The United States consumes more than 300 billion gallons of oil per year, nearly two-thirds of it imported. There is no denying the drawbacks associated with oil, but its advantages ought to be equally undeniable. American wealth, progress, and autonomy -- the most dynamic and productive economy in history -- would be impossible without it. What we have isn't an addiction, but a blessing.


Failed EPA Vote Undermines Economy

United States Senators went on record this afternoon and the result was unfortunate. 53 Senators voted against a resolution offered by Senator Lisa Murkowski (R-AK) that would have disapproved of the Environmental Protection Agency’s backdoor global warming regulations. Today’s outcome was a victory for anti-growth environmentalists, but a devastating loss for the American people.

The EPA’s regulations will marginalize any potential economic recovery by making investment and job creation more expensive. Why? Because the costs of regulation are staggering. The EPA estimates the average permit will cost applicants $125,000 and 866 hours of labor. Some businesses will simply close. The lucky ones will move overseas, cancel expansion plans and just lower wages. All of those are bad options considering the American economy has lost nearly 8 million jobs over the past 30 months.

Despite the outcome of today’s vote, many liberals recognize the EPA cannot be left to its own devices, which means there will be other, more subtle efforts to limit the EPA’s regulatory dragnet.

Chief among them is a proposal offered by Senator Jay Rockefeller (D-WV). His proposal would simply delay the implementation of the EPA's regulations. Delaying these destructive regulations is not inherently bad, but it does not address the fact that bad regulations are indeed coming. It creates regulatory uncertainty, which is bad for the economy and bad for the American people.

According to Greenwire (subscription required), Senate Majority Leader Harry Reid (D-NV) promised the Senate would vote on Rockefeller’s proposal before the elections. The article implied Reid’s promise was designed to prevent the Murkowski resolution from passing.

Another potential alteration of the EPA's regulatory scheme comes from Senators Tom Carper (D-DE) and Robert Casey (D-PA), both of whom voted against Senator Murkowski's resolution. Their approach is rumored to "protect" small businesses while focusing the economic pain on only the biggest emitters. Any student of economics knows those so-called "big emitters" will pass those costs along to businesses and families. Even worse, the plan would only "protect" the little guy until 2016.

While those two policy prescriptions are misguided, the real danger is that Senators will use this failure as an excuse to move forward legislatively on a cap-and-trade scheme or renewable electricity mandate. A Heritage analysis found that the House-passed global warming bill would destroy 2.5 million jobs and $9.4 trillion in economic growth. Similarly, an analysis of a renewable electricity mandate would reduce employment by more than 1 million jobs, add to our national debt and undermine our quality of life.

By voting against the Murkowski resolution, Senators have failed to address the primary concern of Americans—the economy.



For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, SOCIALIZED MEDICINE, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For readers in China or for times when is playing up, there are mirrors of this site here and here

