Friday, September 09, 2011

Damn! U.S. has just had only the SECOND warmest summer on record

Those pesky 1930s, when industrial activity was at a minimum (remember the Great Depression?), spoil the story again

The USA just endured its hottest summer in 75 years and the second-hottest summer on record, according to data released Thursday afternoon by the National Climatic Data Center in Asheville, N.C.

The average U.S. temperature during the summer of 2011 was 74.5 degrees Fahrenheit, which was 2.4 degrees above the long-term (1901-2000) average. Only the Dust Bowl year of 1936, at 74.6 degrees, was warmer.

Texas also suffered through its driest summer on record. The state is in the midst of its worst drought since the 1950s.

[So what was happening in the 1950s? They didn't have SUVs then and power usage was a fraction of what it is now]

On the flip side, two states — Oregon and Washington — had a cooler than average summer, while California had its wettest summer on record. [So the anomalies weren't even national, let alone global]

U.S. climate data go back to 1895. The climate center defines summer as June 1-Aug. 31. It will release global temperature data for the summer of 2011 next week.

More HERE




Howard Fineman Smears Climate Skeptics and Their 'New Ten Commandments'

This is a pretty well-known skeptical blog, and yet I am an atheist -- but facts never upset Leftists, of course. Warmism is itself a religion, totally dependent on prophecies -- JR

Former Newsweek editor Howard Fineman appeared on Hardball on Thursday to attack Rick Perry's climate change skepticism as a "war of the worlds between science and faith." Dismissing anyone who isn't sold on global warming as illogical, Fineman scoffed, "It's part of their new Ten Commandments."

Both Matthews and the Huffington Post contributor offered condescending takes on the Tea Party movement. Comparing Perry to a student, Fineman derided, "And he's not just the kid who didn't read the assignment. He's questioning the right of the teacher to make the assignment in the first place."

He added, "It's an article of faith with the Tea Party people and Perry is going right at every Tea Party voter he can."

Matthews argued that only those who accept climate change could have reasoned beliefs: "Why is [global warming skepticism] so theocratic? Why is it so close to their religious beliefs?"

Fineman, who was the senior editor at Newsweek until October of 2010, has previously attacked Perry. On August 14, 2011, he dismissed, "And you know when proponents of George Bush...are accusing Rick Perry of being shallow, you've got some questions to ask, okay?"

More HERE (See the original for links)




The Dessler 2010 Travesty: It's Now Obvious Why He Avoided Using HadCRUT Data, The Gold-Standard

Read here. Climate scientist Andy Dessler produced research that was a supposed refutation of the Spencer and Braswell research. It was pointed out previously that Dessler chose not to use the same HadCRUT data as Spencer, which smacks of an extreme form of cherry-picking.

If one is to challenge another scientist's research, should one not be held to the standard of using the same data to make one's case? Well... apparently not in Dessler's case, since he is obviously driven by a political agenda, not a scientific one.

So, what happens when the Dessler methodology uses the gold-standard HadCRUT data that Spencer used? As Steve McIntyre discovers, the new results actually resemble Spencer's output, suggesting that clouds provide a negative feedback.

Well, everyone now knows why Dessler avoided the HadCRUT data. His refutation of the Spencer study was a sham.
"Having exactly replicated Dessler’s regression results and Figure 2a, I’ve repeated the exercise with CERES clear sky in combination with CERES all sky, and with the widely used HadCRUT3 series and got surprising results...The supposed relationship between CLD forcing and temperature is reversed: the slope is -0.96 w/m2/K rather than 0.54 (and with somewhat higher though still low significance)."
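
For the technically inclined, the kind of regression at issue takes only a few lines. The sketch below is not McIntyre's actual code (the file names, data layout, and use of Python's statsmodels package are assumptions for illustration), but it shows that the temperature series is a one-line swap, so there was no practical obstacle to testing the HadCRUT data:

```python
# Minimal sketch (not McIntyre's code) of regressing cloud radiative
# forcing on a temperature anomaly series. The file names and data
# layout below are hypothetical stand-ins.
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly series, aligned on the same dates:
cld_forcing = np.loadtxt("cld_forcing.txt")    # cloud forcing anomaly, W/m^2
temp = np.loadtxt("hadcrut3_anomaly.txt")      # temperature anomaly, K

# Ordinary least squares: the slope is the claimed relationship (W/m^2/K)
fit = sm.OLS(cld_forcing, sm.add_constant(temp)).fit()
print(f"slope = {fit.params[1]:.2f} W/m^2/K, R^2 = {fit.rsquared:.3f}")

# Re-running with a different `temp` file (e.g. HadCRUT3 in place of the
# series Dessler used) is what reverses the sign of the slope in the
# quoted result: -0.96 instead of +0.54 W/m^2/K.
```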

More HERE (See the original for links)




More Evidence That Models Continue To Show Too Much Recent Warming

In our last World Climate Report article, we detailed a recent paper showing that climate models which fail to account for the evolution of stratospheric aerosols (that is, reflective particles in the earth’s upper atmosphere) during the past decade or two project less warming than they would have had they included the aerosols’ influence in their calculations. This means that the discrepancy between the observed warming trend during the past 10-15 years (which is near zero) and climate model projections should be even larger than it appears (and it is already quite large).

Now along comes a new paper which hints at another reason why the climate models should actually be projecting more warming than they currently do—again meaning that the models are faring even worse than it appears.

The new paper, by Stephen Hudson of the Norwegian Polar Institute, is soon to be published in the Journal of Geophysical Research and deals with the impact of Arctic sea ice loss on global temperatures.

Hudson points out that this topic is significant because it seems to be a pretty straightforward demonstration of a positive feedback resulting from global warming. It goes something like this: rising atmospheric carbon dioxide leads to higher temperatures, which melt the highly reflective Arctic sea ice, which lets more sunlight be absorbed, which raises temperatures further, and so on.
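
A toy calculation shows how such a loop plays out. This is our illustration, not anything from Hudson's paper, and the feedback fraction is an assumed number; the point is that each trip around the loop adds a smaller increment, so the feedback amplifies the initial warming to a finite total rather than a runaway:

```python
# Toy illustration (not from Hudson's paper) of positive feedback:
# each increment of warming melts ice, which absorbs more sunlight,
# which adds a further, smaller increment of warming.
f = 0.3                      # assumed fraction of each increment fed back
initial_warming = 1.0        # degC from CO2 alone (assumed round number)

total, increment = 0.0, initial_warming
while increment > 1e-6:      # geometric series: total -> initial / (1 - f)
    total += increment
    increment *= f

print(f"total warming = {total:.3f} degC")   # 1/(1-0.3), i.e. ~1.429
```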

In fact, this example is a favorite of Al Gore. Hudson explains:
In general introductions to the topic of climate change, the sea ice–albedo feedback (SIAF) is often singled out for use in explaining the concept of climate feedbacks (e.g., it is the only feedback mentioned by Gore [2006]), something that can give the impression to the interested public that it is the most important feedback process, while its popularity likely stems from the relative ease with which it can be explained and grasped.

Hudson’s intent is to find out just how much of an influence Arctic sea ice really has on the earth’s temperature (or at least its radiative effect, which is directly related to temperature). And he is intent on doing so without overly involving climate models. He describes his motivation:
This study focuses directly on the changes in the amount of solar radiation absorbed by Earth due to the loss of Arctic sea ice. It focuses only on the Arctic because that is where significant changes have been observed in recent decades. The estimates here are based mostly on observations, rather than on the results of climate models. Furthermore, they are kept relatively simple, to make the uncertainties and assumptions that go into the calculation of the increased absorption of solar radiation as clear as possible.

Included in Hudson’s calculations are the observed sea ice values from 1979 through 2007, the climatological cloud cover over the sea ice regions, various characteristics of ice and cloud reflectivity, and the changing angle of the sun over the seasons. Using these factors, Hudson calculated the total global radiative forcing anomaly for each year based on the amount of sea ice that year (Figure 1). The radiative forcing anomaly is basically the change in radiation that is absorbed at the surface and goes into heating the earth—positive forcing anomalies indicate a tendency towards higher temperatures. Plotted alongside the radiative forcing anomaly in Figure 1 is the observed Arctic sea ice extent in September of each year from 1979 to 2007 (with the left-hand axis flipped so that less ice is upwards). The plot shows that September sea ice is a pretty good indicator of the radiative forcing changes: the less September sea ice cover in the Arctic, the greater the radiative forcing anomaly and the greater the pressure imparted to raise the earth’s average temperature.
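
The bookkeeping behind such a calculation can be sketched very simply. What follows is a toy version, not Hudson's method, and every number in it (albedos, cloud transmission, insolation, the amount of ice lost) is an assumed round value for illustration only:

```python
# Toy version (ours, not Hudson's method) of the bookkeeping: how much
# extra solar power the planet absorbs when reflective ice is replaced
# by dark ocean. Every constant here is an assumed round value.
EARTH_AREA = 5.1e14          # m^2, surface area of the Earth
ALBEDO_ICE = 0.6             # assumed broadband albedo of sea ice
ALBEDO_OCEAN = 0.1           # assumed albedo of open water
CLOUD_TRANSMISSION = 0.5     # assumed fraction of sunlight reaching the surface
SOLAR_ARCTIC = 100.0         # W/m^2, assumed annual-mean insolation at ice latitudes

def global_forcing_anomaly(ice_loss_km2):
    """Extra absorbed solar power from losing `ice_loss_km2` of sea ice,
    averaged over the whole globe (W/m^2)."""
    area_m2 = ice_loss_km2 * 1e6
    extra_watts = (ALBEDO_ICE - ALBEDO_OCEAN) * CLOUD_TRANSMISSION * SOLAR_ARCTIC * area_m2
    return extra_watts / EARTH_AREA

# Example: roughly 2 million km^2 less September ice than the 1979 level
print(f"{global_forcing_anomaly(2.0e6):.2f} W/m^2")   # ~0.10 W/m^2 with these numbers
```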

In order to convert radiative forcing changes into actual global temperature changes, we need to use some value for the “climate sensitivity”—that is, how much the earth’s average surface temperature changes for a given forcing change. In general, the climate sensitivity that climate models spit out is about 0.75°C per Watt per square meter. We, along with a growing number of others, think that climate models overestimate the climate sensitivity and that, in fact, it is probably less than half of this value. But for the sake of this article, in which our goal is to assess climate model performance, we will use the model estimate of climate sensitivity. So to convert the radiative forcing changes in Figure 1 to global temperature changes, we multiply by 0.75°C/W/m2. In Figure 2, we show this result (updated with our best guess at the values through 2011) and add a trend line through the temperature change. The magnitude of this trend, which represents the rate of global warming that climate models would likely project from the decline in Arctic sea ice, is 0.034°C/decade (from 1979-2011). If we only look at the last 10-15 years of the record, a period of time in which sea ice decline has hastened, the trend increases to about 0.06°C/decade.
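
As a worked example of that conversion, here is the arithmetic in a few lines. The forcing series is a synthetic stand-in (assumed), chosen only so that the trend reproduces the 0.034°C/decade figure quoted above:

```python
# How the unit conversion works: multiply the forcing anomaly series by
# the model-typical climate sensitivity, then take the linear trend.
# The forcing series below is a synthetic stand-in (assumed), not
# Hudson's data.
import numpy as np

SENSITIVITY = 0.75                        # degC per W/m^2, the model value cited above

years = np.arange(1979, 2012)
forcing = 0.0045 * (years - 1979)         # assumed W/m^2, rising linearly
temp_change = SENSITIVITY * forcing       # implied global temperature change, degC

slope_per_decade = 10 * np.polyfit(years, temp_change, 1)[0]
print(f"trend = {slope_per_decade:.3f} degC/decade")   # 0.034 with these numbers
```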

So what does this have to do with climate models and their projections? We are glad you asked!

A couple of years ago, Julienne Stroeve and colleagues from the National Snow and Ice Data Center analyzed the rate of Arctic sea ice loss projected by a host of climate models used by the IPCC and then compared the model projections against the observations. In Figure 3, we reproduce the major finding from Stroeve et al.’s study (updating the September sea ice extents through an estimated value for 2011). Figure 3 demonstrates that the observed sea ice is declining at a rate about twice as fast as climate models projected. A fairly large body of scientific work has been focused on explaining why this is the case, with most studies concluding that natural variability (in such factors as wind and ocean circulation patterns) has been responsible for the extra amount of ice loss over the past several decades.

But regardless of the cause, it means that only about 50% of the warming from the loss of Arctic sea ice has been correctly captured by the climate models.

In other words, over the past 10-15 years, about 0.03°C/decade of warming (half of the roughly 0.06°C/decade implied by the observed ice loss) has been missed by the models. Had they correctly captured it, the model-projected warming rate for the past decade or more would have been even greater than it is now, and the resulting discrepancy between model projections and the observed rate of global warming would have been even larger than it is currently.

In other words, the models are doing worse than we realize.

Currently, a bevy of researchers are furiously scrambling to make excuses for why the models aren’t working so well, pointing out potential influences such as a slight decline in solar radiation (Lean and Rind, 2007) and a decrease in upper atmospheric water vapor (Solomon et al., 2010), which may be acting to impart a cooling pressure on surface temperatures and thus offset some warming from increasing greenhouse gas concentrations. Others contend that the observed lack of warming is perfectly consistent with model projections (e.g. Santer et al., 2011).

More HERE (See the original for links, graphics etc.)





EPA Declares War on Farmers: Hay Now a ‘Pollutant’

During his presentation on the status of the nation’s new country-of-origin labeling (COOL) law, given on behalf of the R-CALF USA COOL Committee at the 12th Annual R-CALF USA Convention held August 26-27 in Rapid City, S.D., R-CALF USA member and Kansas cattle feeder Mike Callicrate was asked a non-COOL question that set convention goers on their heels.

“Has the Environmental Protection Agency declared hay a pollutant?” an audience member asked. Callicrate responded affirmatively and explained that the Environmental Protection Agency (EPA) recently initiated a formal enforcement action against his Kansas feedlot for, among other things, failure to store his hay in a pollution containment zone. “Now that EPA has declared hay a pollutant, every farmer and rancher that stores hay, or that leaves a broken hay bale in the field is potentially violating EPA rules and subject to an EPA enforcement action,” Callicrate said. “How far are we going to let this agency go before we stand up and do something about it?”

Callicrate is permitted to handle 12,000 cattle at a time in his feedlot, which counts as small to mid-sized in an industry now dominated by mega-feedlots: one owned by the world’s largest beef packer, JBS-Brazil, has a one-time capacity of over 900,000 cattle; another, owned by the nation’s second-largest beef packer, Cargill, also feeds hundreds of thousands of cattle at a time; and a handful of others with capacities in the hundreds of thousands are owned by the likes of Cactus Feeders, Inc. and Friona Industries.

In comments submitted to the U.S. Department of Justice, R-CALF USA estimated that the above-named mega-feedlots feed 18 percent of the nation’s fed cattle each year, while one-fourth of the nation’s cattle are fed in feedlots with a one-time capacity of 50,000 head or more. The largest feedlots are getting larger, and Callicrate’s feedlot is among the group of small to mid-sized feedlots that are being pressured to exit the industry so beef packers and corporate feedlot owners can increase their respective capacities. Data from the U.S. Department of Agriculture (USDA) show that 45 feedlots with one-time capacities of between 1,000 and 16,000 cattle exited the industry from 2008 to 2010.

R-CALF USA contends beef packers are deliberately forcing small to mid-sized feedlots out of business through unfair and abusive cattle-buying practices that effectively restrict market access for all but the largest of feedlots. “The proposed GIPSA rule (USDA Grain Inspection, Packers and Stockyards Administration rule) will put a stop to such unfair and abusive practices, but only if USDA issues a final rule,” said Callicrate.

Callicrate’s feedlot is the perfect example. In late 1998, the nation’s largest beef packers blackballed Callicrate because he called attention to the unfair buying practices of the corporate meatpackers. Callicrate was forced to cease his feedlot operations until 2000, when he opened Ranch Foods Direct, a meat processing and distribution company in Colorado Springs, Colorado, and began marketing his own beef more directly to consumers.

“I believe the EPA’s enforcement action is a premeditated effort by EPA to partner with the beef packers to finish the job the beef packers couldn’t do alone,” said Callicrate, adding, “along with my feedlot, the EPA has filed enforcement actions against five other smaller feedlots, including one with only 400 cattle.”

Callicrate said the EPA does not appear to be going after the corporate feedlots. “EPA is turning a blind eye toward the mega-feedlots that are a real risk for pollution and, instead, is antagonizing small to mid-sized family operations in an effort to help their packer-partners capture the entire live cattle supply chain away from family farm and ranch operations.”

“We thought the Obama Administration was going to bring about a change to the ongoing corporate control and corporate dominance that has been decimating the U.S. cattle industry. I guess we’re seeing that change right now. Rather than reduce corporate control and dominance, the EPA is overtly partnering with the corporate beef packers to accelerate the exodus of sustainable, independent family operations. This really smells,” Callicrate concluded.

SOURCE





Unclogging America’s oil pipeline system

Two proposed pipelines that together have the potential to unclog the overburdened pipes from the oil sands – and in doing so provide America with a friendly, stable, and growing source of oil – took significant steps forward last week.

The State Department gave a crucial green light to the proposed Keystone XL pipeline, which would carry heavy oil from Canada’s oil sands across the Great Plains to terminals in Oklahoma and the Gulf Coast. And Calgary-based Enbridge announced that it has lined up enough shippers to fill its proposed Northern Gateway pipeline, which would move bitumen from Canada’s oil sands to the west coast for transport to Asian markets.

Together, Keystone XL and Northern Gateway would alleviate the mounting issue of how to get crude oil from the oil sands out to market. Oil sands production is very much on the rise: Canada produced 1.5 million barrels of crude a day from the oil sands in 2010 and plans to expand that total to 2.2 million barrels a day in 2015 and 3.7 million barrels a day by 2025.

The crude oil extracted from the sands is thick and heavy. Refining this bitumen (as it is known) requires heavy oil facilities, of which there are none in Canada. At present bitumen is sent to heavy oil refineries in America’s Midwest, but the pipelines that run from the oil sands to Oklahoma will reach maximum capacity in as little as four years. At that point oil sands producers will be stuck with increasing volumes of bitumen and no way to get it to market… that is, unless another pipeline or two gets built.

The Northern Gateway pipeline would bypass refining altogether, sending crude bitumen to the west coast to be loaded onto tankers and taken across the Pacific to Asian markets. The company looking to build the line – Enbridge – says companies have fully subscribed to long-term service on both the main line (a 525,000-barrel-per-day pipe running from Alberta to the west coast town of Kitimat) as well as on the subsidiary (a smaller line that would bring imported condensates inland). In its announcement, Enbridge did not identify which companies signed on to use the C$5.5-billion facility, but Chinese refining giant Sinopec says it is on board with the project.

Enbridge called the shipper agreements a “major step forward” for the project. The company says the option of selling crude oil to Asia, instead of only selling to the United States, will enable Canadian producers to command a better price for their product.

Then there’s Keystone XL. The Keystone pipeline, which started operations in mid-2010, crosses the border from Manitoba into North Dakota, then traverses South Dakota, Nebraska, Kansas, and Missouri to terminate in Illinois. In February of this year another terminus was added in Cushing, Oklahoma. But even with two end points, the line can still carry only 591,000 barrels of oil a day.

Owner TransCanada wants to expand the pipeline in two steps. Phase I would see a new, 2,700-kilometer line run from Alberta through Montana, South Dakota, and Nebraska, meeting up with the current line in Steele City. The second phase would connect the entire system to the Gulf Coast by adding a line from Cushing, Oklahoma, to Houston.

More HERE

***************************************

For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For readers in China or for times when blogger.com is playing up, there are mirrors of this site here and here

*****************************************
