Sunday, February 24, 2008


The enviroloonies seem to have found their way out of the asylum again: this time to tell us that 70 per cent of Britons should die for the sake of Gaia. That's not quite the way they put it, of course. Rather, the Optimum Population Trust (there's a pedantic part of me that wants to tell them it's Optimal) tells us that the maximum sustainable population of the UK is 17 million: given that there are north of 60 million currently, we can only avert the coming End Times if the extra pop their clogs soonest.

It's not bad going for a paper on demography, economics, the environment and their interactions to have been written by a physicist, that is, by someone with no training in any of the three underlying disciplines. The argument rests on two fundamental pieces of illogic. The first is the use of the Commoner-Ehrlich equation, which holds that ecological Impact (I) equals Population (P) times Affluence (A) times Technology (T), or:

I = P x A x T

Paul Ehrlich, you might recall, is the man who in the 60s predicted hundreds of millions starving in India in the 70s and in the US in the 80s, then in the 70s predicted the same for the 80s and 90s, and in his latest book predicts it Real Soon Now. The flaw in this equation is that technology is held to multiply the impact when, as is obvious to even the casual observer, it divides it.

If you haven't spotted why yet, consider this. Are we using higher technology than hunter gatherers? Yes? Good. Now, if there were 6 billion hunter gatherers around, would Gaia simply be, as at present, a bit grumpy, or would she be even worse off? Correct: there wouldn't be any biosphere at all, as that many humans armed only with flints and spears would have eaten everything on the planet and then each other. That is, indeed, what hunter gatherers did to the megafauna of every place they reached outside Africa: the Aborigines in Australia, the Clovis culture in North America, the Maori in New Zealand and so on. The equation should thus read:

I = (P x A)/T

For higher technology reduces environmental impact, and this upends the logic of the paper. As the paper has it, higher technology and increased affluence increase the pressure on the environment, and since none of us is prepared to give up the levels of either that we already enjoy, the only thing we can do to save the planet is to have fewer people. Getting the equation the right way around removes that constraint: we can reduce the impact by having better technology, so there's no need to go round slaughtering the chavs [disrespectful young British lower class people] (well, OK, not for this reason). And most importantly, as we'll see, we can do it by creating technology with lower carbon emissions.
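The difference between the two readings of the equation is easy to see with a few made-up numbers. This is a minimal sketch: the population, affluence and technology figures below are purely illustrative, not real-world estimates.

```python
# Toy comparison of the two readings of the Commoner-Ehrlich equation.
# All numbers are made-up illustrations, not real-world estimates.

def impact_ehrlich(P, A, T):
    """Original form: technology multiplies impact (I = P * A * T)."""
    return P * A * T

def impact_corrected(P, A, T):
    """Form argued for above: technology divides impact (I = (P * A) / T)."""
    return (P * A) / T

P, A = 60e6, 2.0            # hypothetical population and affluence index
low_tech, high_tech = 1.0, 4.0

# Under the original equation, better technology makes things worse...
assert impact_ehrlich(P, A, high_tech) > impact_ehrlich(P, A, low_tech)
# ...under the corrected one, the same technology improvement cuts impact.
assert impact_corrected(P, A, high_tech) < impact_corrected(P, A, low_tech)
```

Same population, same affluence: the only thing that changes between the two lines is which side of the fraction bar technology sits on.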

The second conceptual error is that in their calculation of the permissible population level they use the concept of ecological footprints as calculated by Mathis Wackernagel. Now in one way I've got a lot of time for him: it's not everyone who manages to turn their PhD thesis into a thriving international business, so hats off, well done sir. On the other hand, that thesis is what is technically known in serious circles as horse manure. For example, when looking at the carbon emissions of nuclear power, the calculation is:

Nuclear power, about 4 per cent of global energy use, does not generate CO2. Its footprint is calculated as the area required to absorb the CO2 emitted by using the equivalent amount of energy from fossil fuels.

Mat bubba: over the full cycle nuclear does have CO2 emissions, roughly the same as hydro or wind, less than half those of solar and a tiny fraction of those of coal. But the ecological footprint idea gets much worse than that. The essential method is to work out how much land a particular activity requires, then how many such activities there are, and then how many hectares of land we would need to do all of them. This is what gives us the regular yearly cycle (when Mathis and his boys release their annual update) of being told we need "three more earths" if we're all going to carry on living like this.

Again there's a conceptual error about technology: the assumption that the amount of land we need to do something is static, which it plainly isn't. We get more food off a hectare now than we did last year, as we have every year for at least a century (yields have been rising by about one per cent a year for at least that long), and so on. But wait, there's yet more.
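The compounding arithmetic behind that one-per-cent figure is simple enough to sketch. The starting hectare count below is a made-up illustration; only the growth rate comes from the text.

```python
# If crop yields rise about one per cent a year (the figure cited above),
# the land needed to grow a fixed amount of food shrinks accordingly.
# Purely illustrative arithmetic, not a forecast.

def land_needed(initial_hectares, annual_yield_growth, years):
    """Hectares required for the same output after compounding yield gains."""
    return initial_hectares / ((1 + annual_yield_growth) ** years)

h0 = 100.0  # hypothetical hectares needed today
after_century = land_needed(h0, 0.01, 100)
print(round(after_century, 1))  # prints 37.0 - about a third of the land
```

A footprint that shrinks by nearly two-thirds per century is anything but static.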

Each piece of land is only allowed to count once. The land needed to recycle CO2 emissions is treated as somehow different land from that needed to grow the food: the fact that plants eat CO2 precisely in order to turn it into my food gets missed.

Even given all of this exaggeration, the actual end finding of the ecological footprint calculations is that we've got plenty of land to do everything except recycle our CO2 emissions, which really isn't much of a surprise. We've had thousands of scientists labouring away for more than a decade to tell us exactly that; they even wrote a great big report about it. And guess what the conclusion of that report is? If we can invent a few more bright shiny new technologies that don't emit carbon, then everything is just hunky dory.

In the end this report is just another sad set of scribblings from people who would appear to have some deeper personal problems. Perhaps it's the thought of people having sex without a full body condom that does it, or perhaps they've come over all Fran Lebowitz ("Children don't smoke enough and I find that they're sticky, perhaps as a result of not smoking enough"), but something is clearly wrong when we read:

"It follows that if it is not possible to constrain affluence and technology, then the only parameter left to constrain and reduce is population."

Their sad misunderstanding about the effects of technology blinds them to the truth: that by not constraining technology we don't have to constrain either affluence or population. The late great Julian Simon once calculated that we have the resources for a permanently growing economy and population for the next 7 billion years. That might be a little Panglossian, to be honest, but it's more accurate than the insistence that there should be fewer, poorer people.



Three German biodiesel production plants were recently sold to the United States and Canada and more are up for sale after biodiesel sales collapsed, a German renewable fuels industry leader said on Wednesday. "I estimate that 30 percent of Germany's biodiesel plants are now up for sale," said Peter Schrum, president of the German renewable fuels industry association BBK. "Three have recently been sold to the US and Canada and I believe that more will follow," he told Reuters in an interview. Names of the companies involved are not being given. "Germany's biodiesel industry is currently being dismantled and sold abroad," he said.

The country's five-million-tonne biodiesel industry is producing at only about 10 percent of capacity, largely because a biofuels tax increase on Jan 1 has sharply cut sales, he said.

Although the European Union wants to increase biofuel use to stop global warming, Germany has started taxing biodiesel as the government said it cannot afford to lose the large tax revenue from fossil diesel. "The tax means that biodiesel is now more expensive than fossil diesel," said Schrum. "As biodiesel has eight percent less energy content, this means no-one is buying biodiesel."



The European Commission, bowing to industry concerns, said Thursday it was ready to exempt Europe's steel, chemical and power sectors from having to compensate for the environmental damage they cause - at least for a while.

The EU was keen to see a global deal to reduce greenhouse gas emissions and, until a deal was in place, the EU would hold back on plans to force more companies to pay to pollute from 2013, European Commission President Jose Manuel Barroso told European business leaders.

Last month, the EU executive said it would demand that major polluters buy all the carbon permits they need, which would raise the cost of manufacturing by charging them up to €50 billion (US$74 billion) a year and would likely hike electricity prices by 10-15 percent.

Until now, companies have got most of their carbon permits for free - and they say the extra costs will make it harder for them to compete against rivals in countries that are less active in curbing climate change, such as the U.S., China and India.

Barroso's comments aimed to soothe those worries. "Ultimately, the best solution is an international (emissions trading) agreement," he said. "But in the absence of an international agreement, we should be ready to look at interim solutions for energy intensive industries. For example, receiving their (emission trading) allowances free of charge, or requiring importers to obtain allowances alongside European competitors." Barroso added, "This is the most we can do for our energy-intensive industries."

The EU's carbon cap-and-trade program aims to cut overall releases by giving a financial incentive for companies to cut back on carbon because they can sell the permits they do not use. If they need to run plants for longer or fail to turn to cleaner technology, they need to buy more permits.

The EU insists that the costs of cutting emissions are far outweighed by the damage a warmer climate would otherwise cause. Reducing energy use and turning to renewable energy sources could also slim down Europe's huge bill for oil and natural gas imports.

Barroso was rewarded for his address with a round of warm applause from hundreds of European business leaders attending a two-day conference devoted to climate change.



THE LONE voice in the midst of the debate argued effectively that climate change is both unavoidable and a `cyclical' phenomenon. Scientist Dr Henry Clemmey, who is the managing director of Preston-based Woodford Global Group, asserted that global warming is part of a pattern that has been happening throughout the earth's history.

The former Leeds University academic admits to having seen the effects of climate change within his own lifetime - but believes that mankind cannot be held responsible. "Climate change is something that we can't stop or prevent - but we do need to be knowledgeable about its potential effects," he said. "I think that so many debates about climate change take place against an ignorance of both time and scale in terms of the earth's evolution. If you look back through the history of the planet, you will find a whole series of cycles. You will also see that these cycles have taken place before - and will take place again in the future."

He added that the cycle lasts for 60,000 years and is currently in a warming phase, saying: "It's not so long ago - in terms of the earth's evolution over the past four billion years - that the climate was far hotter than this."

Dr Clemmey also maintained that when the carbon particles found in the atmosphere now are compared with other periods in the earth's evolution, they show that mankind's recent behaviour cannot be held responsible for climate change. He believes that it is a gradual progression that has led to this point.

His figures were dismissed by glacier expert Professor David Collins and climate change expert Professor Kevin Anderson.

"I cannot always understand the scientists' current obsession with carbon emissions," said Dr Clemmey. "The current changes we are experiencing go far beyond what we have caused by our lifestyles in our own lifetimes - and ultimately the earth will look after itself."



For more than 100 years, climate scientists have fully understood that if all else were held constant, an increase in the atmospheric concentration of carbon dioxide (CO2) would lead to an increase in the near-surface air temperatures. The problem becomes a lot more complicated in the real world when we consider that "all else" cannot be held constant and there are a lot more changes occurring at any one time than just the concentration of CO2. Once the temperature of the Earth starts inching upward, changes immediately occur to atmospheric moisture levels, cloud patterns, surface properties, and on and on. Some of these changes, like the additional moisture, amplify the warming and represent positive feedback mechanisms. Other consequences, like the development of more low clouds, would act to retard or even reverse the warming and represent negative feedbacks. Getting all the feedbacks correct is critical to predicting future conditions, and these feedbacks are simulated numerically in global climate general circulation models (GCMs). Herein lies a central component of the great debate - some GCMs predict relatively little warming for a doubling of CO2, and others predict substantial warming for the same change in atmospheric composition.

If that is not enough, changes in CO2 in the real world would almost certainly be associated with other changes in the atmosphere - sulfur dioxide, mineral aerosols (dust), ozone, black carbon, and who knows what else would vary through time and complicate the "all else held constant" picture. By the way, the Sun varies its output as well. And when discussing climate change over the next century, even more uncertainties come from estimates of economic growth, adoption of various energy alternatives, human population growth, land use changes, and ... you get the message.

However, the fundamental question in the greenhouse debate still comes down largely to climate sensitivity, defined as the change in global temperature per unit change in radiative forcing associated with varying levels of atmospheric CO2. The United Nations Intergovernmental Panel on Climate Change (IPCC) suggests that the sensitivity is between 0.48 and 1.40 degrees Kelvin (K) per Watt per square meter (Wm-2), which translates into a global warming of 2.0 K to 4.5 K for a doubling of CO2 concentration (a change of 1 K equals one degree Celsius, which equals 1.8 degrees Fahrenheit). Rather than turn this into a review of a physics course, what we have is the IPCC predicting global warming of 3.2°F to 7.2°F for a doubling of CO2 concentration. Others have shown in very credible professional journals that there is a 66% chance of the IPCC being right in their estimate - this provides the fodder for alarmists to suggest that the IPCC acknowledges the possibility of a global warm-up of 10°F for a doubling of CO2.
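The arithmetic behind that conversion can be sketched directly. One assumption not stated in the text: the standard approximation that a doubling of CO2 adds about 5.35 × ln(2) ≈ 3.7 W per square meter of radiative forcing.

```python
import math

# Back-of-envelope version of the sensitivity-to-warming conversion.
# Assumed (not from the text): forcing from a CO2 doubling is roughly
# 5.35 * ln(2) W/m^2, the standard approximation.

forcing_2xco2 = 5.35 * math.log(2)        # about 3.71 W/m^2

low_sens, high_sens = 0.48, 1.40          # K per W/m^2, the IPCC range cited
warming_low = low_sens * forcing_2xco2
warming_high = high_sens * forcing_2xco2

# Kelvin and Celsius degrees are the same size; 1 K = 1.8 Fahrenheit degrees.
print(round(warming_low, 1), round(warming_high, 1))              # prints 1.8 5.2
print(round(warming_low * 1.8, 1), round(warming_high * 1.8, 1))  # prints 3.2 9.3
```

Note that the low end, 3.2°F, matches the figure in the text, and the high end is where the near-10°F alarmist number comes from.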

To say the least, these numbers are hotly debated in the climate community. A recent article in Geophysical Research Letters presents an interesting approach to pinning down the critical sensitivity value (K/Wm-2) for elevated levels of CO2. The article is by Petr Chylek and Ulrike Lohmann of New Mexico's Los Alamos National Laboratory and Switzerland's Institute for Atmospheric and Climate Science; funding was provided by the Los Alamos Laboratory. The team decided to re-examine the temperature, CO2, methane, and dust record from the Vostok ice core extracted from a site in Antarctica. Although the core record goes back nearly a half million years, Chylek and Lohmann elected to restrict their primary analysis to the past 42,000 years....

By combining temperatures, carbon dioxide concentrations, methane concentrations and importantly, dust amounts determined from the ice core during the past 42,000 years, the authors were able to derive the climate sensitivity from the combined variations for these factors. One of their largest uncertainties surrounded the dust amounts, and so Chylek and Lohmann turned to a climate model to see if changes in atmospheric dustiness could have the magnitude of the effect on global temperatures (and thus climate sensitivity) that they had determined empirically. The modeled results were consistent with their other calculations, giving them added confidence in their calculations.

The reason they were looking for independent confirmation was that their findings for climate sensitivity were near the low end of the bounding range given by the IPCC—and that means they are going to be subject to an endless amount of scrutiny from those folks who want potential global warming to seem as bad as possible.

To long-time readers of World Climate Report (and its predecessors), these results should hardly come as much of a surprise. For at least a good 7 or 8 years we have repeatedly been telling you to expect about 1.5 to 2.0°C of warming from greenhouse gas increases this century. Chylek and Lohmann's findings are simply further confirmation of this.

The biggest thing to take home in all of this is that the less the temperature rise, the less the chance for major disruption, such as a large sea level rise, at least anytime soon. That means we have more time to figure out a solution. Assuredly, had Chylek and Lohmann discovered that IPCC was underestimating the climate sensitivity, they would have been a front page news story the world over. Instead, they found that IPCC is likely overestimating the climate sensitivity to CO2, so they were reduced to coverage only at World Climate Report.



For more postings from me, see TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, SOCIALIZED MEDICINE, AUSTRALIAN POLITICS, DISSECTING LEFTISM, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For times when is playing up, there are mirrors of this site here and here.


1 comment:

Anonymous said...

The proper answer for those who say we need to "die for Gaia" is "You first: it's your thesis, so you need to lead by example."