Wednesday, May 17, 2006


Global Insight, Inc., the private company that brought together DRI and WEFA, the world's most respected economic analysis, forecasting and financial information companies, today announced the release of its annual European Power Price Report. The report found that the impact of the European Union Emissions Trading Scheme (EU ETS), while increasing electricity prices in all major European markets, will be materially different across these countries due to their very different mixes of generating plants.

The EU ETS, scheduled for implementation in January 2005, will make it more expensive to generate electricity using carbon-emitting fuels such as oil, gas and coal by putting a price on carbon dioxide (CO2) emissions. Despite the fact that at least 95% of the available emissions permits will be allocated free in the first phase of the scheme (2005-2007), the price at which those allowances are subsequently traded will directly affect operating decisions. As a result, the notional or opportunity costs of the emissions will still be passed through to consumers in the form of higher prices. Generators with the largest CO2 emission allowances will receive the largest revenue windfall. The size of the windfall to each generator plant will depend on the market CO2 price and the volume of emissions credits it receives against its actual emissions.

According to the study, Germany and the UK will see the largest wholesale electricity price rises, due to their strong reliance on coal plants to generate electricity, with prices increasing by as much as 40% by 2010.

Power prices in Italy are forecast to rise 15% given current CO2 market prices but could increase by up to 30% if those prices double. The smaller forecast impact on Italian prices, when compared to those in the UK and Germany, is due to its unusual mix of zero-carbon hydropower and high-carbon oil-fired plants. Spain and the Netherlands are forecast to fare much better, with electricity prices rising only between 10% and 20% by 2010, depending on the market price for CO2. Both countries rely primarily on gas-fired, low-carbon-emitting power generation; Spain additionally benefits from its hydropower plants. The study also found that Italy, Spain and the Netherlands will not be able to rely on emissions reductions from their power sectors to make progress against their Kyoto obligations and will need to import their CO2 reductions from other countries.

Dr Trevor Sikorski, Head of Global Insight's Power Service, observed, "The expected price rises and windfall gains of some generator plants raise a serious question: Will governments stand by and allow industrial and household consumers to pay the higher electricity rates, with the main financial beneficiaries being the power generators' shareholders? If so, the manufacturing industry will be dealt a further blow with the inevitable result of direct and indirect job losses."

A key factor shaping the power markets in the next few years will be environmental policy. The introduction of the EU ETS, the generous incentives for greater renewable power across Europe and the application of the Large Combustion Plant Directive (LCPD) will all have a significant impact on plant mix and the type of new plants being built. "It highlights starkly the great problems governments will face in resolving conflicts between environmental and social objectives, both of which figure very high on the electoral agenda," added Sikorski.



Climate sensitivity is defined as the average increase of the temperature of the Earth that you get (or expect) by doubling the amount of CO2 in the atmosphere - from 0.028% in the pre-industrial era to the future value of 0.056% (expected around 2100). If you assume no feedback mechanisms and you just compute how much additional energy in the form of infrared rays emitted by (or reflected from) the surface will be absorbed by the carbon dioxide (refresh your knowledge about Earth's energy budget), you obtain the value of 1 Celsius degree for the climate sensitivity.

While the feedback mechanisms may shift the sensitivity in either direction, Prof. Richard Lindzen of MIT, a world leader on the sensitivity issue, will convince you that the estimate is about right but that the true value, with the mostly unknown feedback mechanisms, is likely to be lower. Why is that?

You should realize that carbon dioxide only absorbs infrared radiation at certain frequencies, and it can absorb a maximum of 100% of the radiation at those frequencies. By this comment, I want to point out that the "forcing" - the expected additive shift of the terrestrial equilibrium temperature - is not a linear function of the carbon dioxide concentration. Instead, the additional greenhouse effect becomes increasingly unimportant as the concentration increases: the expected temperature increase is something like

* 1.5 ( 1 - exp[-(concentration-280)/200 ppm] ) Celsius

The decreasing exponential tells you how much radiation at the critical frequencies is able to penetrate through the carbon dioxide and leave the planet. The numbers in the formula above are not completely accurate and the precise exponential form is not quite robust either but the qualitative message is reliable. When the concentration increases, additional CO2 becomes less and less important.
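This saturation is easy to check numerically. Here is a minimal sketch using the rough constants from the formula above - which, as noted, are not precisely calibrated, so plugging them in literally gives values only in the same ballpark as the rounded figures quoted in the text:

```python
import math

def co2_warming(concentration_ppm):
    """Rough no-feedback warming (Celsius) above the pre-industrial era,
    using the illustrative constants 1.5, 280 ppm and 200 ppm quoted in
    the formula above (admittedly not completely accurate)."""
    return 1.5 * (1.0 - math.exp(-(concentration_ppm - 280.0) / 200.0))

# The marginal effect of extra CO2 shrinks as the concentration grows:
early_step = co2_warming(290) - co2_warming(280)  # the first 10 ppm added
late_step = co2_warming(570) - co2_warming(560)   # a 10 ppm step near a doubling
print(round(early_step, 3), round(late_step, 3))  # the late step is far smaller
```

Whatever the exact constants, the warming implied by this functional form can never exceed the 1.5 degree prefactor, no matter how much CO2 is added.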

In particular, there is no such thing as a "runaway effect" or a "point of no return" or a "tipping point" or any of the similar hysterical fairy-tales promoted by various Al Gores. The formula above simply does not allow more than 1.5 Celsius degrees of warming from the CO2 greenhouse effect. Similar formulae based on Arrhenius' law predict that the derivative "d Temperature / d Concentration" decreases as a power law rather than exponentially - but it still decreases.

One might also want to obtain a better formula by integrating the formula above over frequencies.


In all cases, such a possible warming distributed over centuries is certainly nothing that a person with IQ above 80 should be producing movies about.

When you substitute the concentration of 560 ppm (parts per million), you obtain something like a 1 Celsius degree increase relative to the pre-industrial era. But even if you plug in the current concentration of 380 ppm, you obtain about 0.76 Celsius degrees of "global warming". Although we have only completed about 40% of the proverbial CO2 doubling, we have already achieved about 75% of the warming effect that is expected from such a doubling: the difference is a result of the exponentially suppressed influence of the growing carbon dioxide concentration.

In reality, the increase in temperatures since the pre-industrial era was comparable to, or slightly smaller than, 0.76 Celsius degrees - something like 0.6 Celsius degrees. It is consistent to assume that the no-feedback "college physics" calculation of the CO2 greenhouse effect is approximately right, and if it is not quite right, it is more likely to be an overestimate rather than an underestimate, given the observed data.

The numbers and calculations above are actually not too controversial. Gavin Schmidt, a well-known alarmist from RealClimate, more or less agrees with the calculated figures, even though he adds a certain amount of fog - he selectively constructs various minor arguments that have the capacity to "tilt" the calculation above in the alarmist direction. But the figure of 1 Celsius degree - understood as a rough estimate - seems to be consistent with everything and Schmidt claims that only intellectually challenged climate scientists estimate the sensitivity to be around 5 Celsius degrees (I forgot Schmidt's exact wording).

Three weeks ago, Hegerl et al. have published a text in Nature that claims that the 95 percent confidence interval for the climate sensitivity is between 1.5 and 6.2 Celsius degrees. James Annan decided to publish a reply (with J.C. Hargreaves). As you might know, James Annan - who likes to gamble and to make bets about the global warming - is

* an alarmist who believes all kinds of crazy things about the dangerous global warming;

* a weird advocate of the Bayesian probabilistic reasoning.

However, he decided to publish a reply saying that

* the actual sensitivity is about 5 times smaller than the Hegerl et al. upper bound -- which means that the warming from the carbon dioxide won't be too interesting;

* Hegerl et al. have made errors in statistical reasoning; the error may be summarized as an application of Bayesian priors which are unscientific.

The second point means that Hegerl et al. use a "prior" (a dogma or a random religious preconception that is a crucial part of Bayesian statistical reasoning) that allows the sensitivity to be huge a priori - and such a huge preconception is then not removed by the subsequent procedure of "Bayesian inference". Such an outcome is a typical result of Bayesian methods: garbage in, garbage out. I am convinced that the fact that Annan was able to appreciate these incorrect points of Hegerl et al. is partially a result of my educational influence on James Annan.
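How a generous prior can inflate a Bayesian upper bound is easy to demonstrate. The sketch below is purely illustrative - the numbers (a 1.2 Celsius no-feedback sensitivity, a Gaussian constraint on a feedback factor) are hypothetical and not taken from Hegerl et al. - but it shows the mechanism: the same likelihood combined with a prior that is uniform in the sensitivity itself, rather than in the underlying feedback, yields a much larger 95% upper bound.

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical setup: sensitivity S = 1.2 / (1 - f), where f is a feedback
# factor that the data constrain as roughly Gaussian, f ~ N(0.2, 0.25).
S_grid = [0.5 + 0.01 * i for i in range(1950)]  # sensitivities 0.5..20 Celsius

def likelihood(S):
    f = 1.0 - 1.2 / S  # the feedback implied by sensitivity S
    return normal_pdf(f, 0.2, 0.25)

def percentile_95(weights):
    """Sensitivity below which 95% of the posterior mass lies."""
    total = sum(weights)
    acc = 0.0
    for S, w in zip(S_grid, weights):
        acc += w
        if acc >= 0.95 * total:
            return S
    return S_grid[-1]

# Prior 1: uniform in S itself -- huge sensitivities are plausible a priori,
# and the data never quite remove that preconception.
post_uniform_S = [likelihood(S) for S in S_grid]
# Prior 2: uniform in the feedback f; after the change of variables this is
# p(S) ~ 1.2 / S**2 -- the same data, a far more modest upper tail.
post_uniform_f = [likelihood(S) * 1.2 / S ** 2 for S in S_grid]

print(percentile_95(post_uniform_S), percentile_95(post_uniform_f))
```

With these made-up numbers, the uniform-in-S posterior's 95% bound lands several times higher than the uniform-in-f one, even though both were fed exactly the same likelihood - the prior, not the data, drives the headline upper bound.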

Nevertheless, Annan's reply was rejected by Nicki Stevens of Nature without review with the following cute justification:

"We have regretfully decided that publication of this comment as a Brief Communication Arising is not justified, as the concerns you have raised apply more generally to a widespread methodological approach, and not solely to the Hegerl et al. paper."

In other words, Annan's reply could have the ability to catch errors that influence more than one paper, and such replies are not welcome. Imagine that Nicki Stevens, instead of Max Planck, had been the editor of "Annalen der Physik" who received Albert Einstein's paper on special relativity. Even better, you can also imagine that Nicki Stevens is the editor who receives the paper on General Relativity whose insights apply more generally. ;-)

When we apply my reasoning more generally to a widespread methodological approach, we could also wonder whether the person named Nicki Stevens realized that one half of the internet was going to discuss how unusually dumb she was.

Lubos Motl's Reference Frame, 13 May 2006


Last week the Bank of England's May Inflation Report was released. As usual it contains its CPI inflation projection in the elegant form of a "fan chart". This chart comprises a series of bands fanning out from the start date and over the projection period, and includes just 90pc of all the expected outcomes, casting the remaining 10pc into limbo. The bands widen as the time horizon is extended, indicating increasing uncertainty about outcomes.

The Bank's fan chart is an exemplary attempt to display the relative probabilities of projected outcomes. Above all, it frankly acknowledges the huge uncertainties involved in any forecasting exercise and attempts to quantify them.
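The band construction itself is straightforward. Here is a toy sketch - a random walk with made-up parameters, not the Bank's actual model - of how such bands are obtained, keeping only the central 90% of simulated outcomes at each horizon:

```python
import random

random.seed(0)

def simulate_paths(n_paths=5000, horizon=12, start=2.0, step_sd=0.15):
    """Simulate toy quarterly inflation paths as a random walk."""
    paths = []
    for _ in range(n_paths):
        level, path = start, []
        for _ in range(horizon):
            level += random.gauss(0.0, step_sd)
            path.append(level)
        paths.append(path)
    return paths

def central_band(paths, quarter, lo_pct=5, hi_pct=95):
    """The band holding the central 90% of outcomes at a given quarter,
    casting the remaining 10% into limbo."""
    outcomes = sorted(path[quarter] for path in paths)
    lo = outcomes[int(len(outcomes) * lo_pct / 100)]
    hi = outcomes[int(len(outcomes) * hi_pct / 100)]
    return lo, hi

paths = simulate_paths()
near = central_band(paths, 0)   # one quarter ahead
far = central_band(paths, 11)   # three years ahead
print(near, far)  # the far band is wider: that widening is the "fan"
```

Stacking such bands at successive probability levels, quarter by quarter, produces the familiar fan shape, with uncertainty growing as the horizon extends.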

One would like to think the Intergovernmental Panel on Climate Change (IPCC) would take a similar approach. Alas, nothing could be further from the truth. The IPCC was set up in 1988 by two UN organisations, the World Meteorological Organisation (WMO) and the UN Environment Programme (UNEP). Its remit is to assess the "risk of human-induced climate change" by analysing the relevant economic and scientific information.

The IPCC is composed of representatives largely appointed by governments and is led by government scientists. It does not carry out its own research but relies on peer reviewed and published literature for its reports and assessments of future developments.

The IPCC is a hugely influential organisation. It alone provides the crucial assessments of manmade climate change that inform the decisions of international policymakers on global warming. Such policy making naturally occurs within the orbit of the UN.

The UN Framework Convention on Climate Change (UNFCCC) was initially developed at the Earth Summit held in Rio de Janeiro in 1992, the first international environmental treaty aimed at reducing greenhouse gas emissions to combat global warming. It did not set mandatory emission limits or contain enforcement provisions. The significant and path-breaking Kyoto Protocol to the UNFCCC of 1997 "corrected" these shortcomings. Under it, signatory countries agreed to legally binding reductions in emissions averaging 6-8pc below 1990 levels for the years 2008-12.

Compliance with Kyoto is not costless. And with the prospect of a "son of Kyoto" protocol, with potentially tighter emissions targets and higher costs, it is all the more important that the IPCC's analysis of future "human-induced climate change" is relevant, accurate and, above all, robust. This is a tall, though arguably not impossible, order. But no one should underestimate the difficulties of projecting the impact of mankind's activities on the climate. This is because the links in the chain that attempt to explain the relevant relationships are horrendously complex.

There are at least five identifiable links. First, how economic activity affects greenhouse gas (GHG) emissions; second, how changes in emissions alter atmospheric concentrations of GHGs; third, how altered atmospheric concentrations of GHGs impact on "radiative forcings" (changes in the balance of radiation entering and exiting the earth's atmosphere); fourth, how altered "radiative forcings" influence global temperatures; and, fifth, how these temperature changes impact on the environment and humanity, with costs and benefits.

Every one of these links is riddled with uncertainty and, therefore, should be put through the discipline of a Bank of England fan-chart style of exercise when projected forward.

Given that the IPCC's analysis looks forward to 2100, compared with the Bank's modest 2009, the only conclusion that could be drawn would surely be one of overwhelming uncertainty!

The IPCC does not, however, make any pretence at following the Bank's probabilistic approach. Its analysis of "future alternatives" is based on probability-free "scenarios". In its Special Report on Emission Scenarios (2000), for example, it discussed 40 scenarios, based on a limited range of assumptions including population change, economic growth, technological change and energy usage. It concluded that temperature increases could range from 1.7 to 6.1 degrees Celsius by 2100.

More probable scenarios are given no more prominence than less probable ones. This is a killer constraint, especially as some of the biggest doubts have been expressed about the high emission scenarios.

In addition, many of the IPCC's assumptions and much of the analysis are highly questionable, not least because of the probable influence of "political factors".*

The IPCC's analysis, on which major international policy makers depend, is quite simply inadequate. And its position as monopoly supplier of official climate change wisdom must be challenged by individual governments. More rigorous analysis is needed. It's time to "call time" on the IPCC.

The Daily Telegraph, 15 May 2006


Polar bears are cute. Just ask the marketing executives at Coca-Cola, which has used animated polar bears to hawk its wares in recent years. Bears, pandas, lions and elephants are "charismatic megafauna" -- meaning, basically, cute animals that people care about. If you want to sell a product, or a cause, just tie it to one of these animals and you've got the attention of millions of people, kids and adults alike.

Thus, environmental alarmists have made much of research claiming the Arctic's great white bear faces extinction from human-caused global warming. Snails, snakes and spiders withering in the sun just don't pack the same emotional punch as a cuddly, furry polar bear slipping beneath the melting ice.

Fortunately, a new study by David Legates, director of the University of Delaware's Center for Climatic Research, throws cold water on the claim that global warming threatens polar bears' survival.

Mr. Legates critiques the Arctic Climate Impact Assessment, which proclaimed that Arctic air temperature trends strongly indicate global warming, causing polar ice caps and glaciers to melt. However, Mr. Legates says, the Assessment ignored data that undermine these claims.

For example, coastal stations in Greenland are cooling and average summer air temperatures at the summit of the Greenland Ice Sheet have decreased by 4 degrees Fahrenheit per decade since measurements began in 1987. In addition, records from Russian coastal stations show the extent and thickness of sea ice has varied greatly over 60- to 80-year periods during the last 125 years. Moreover, the maximum air temperature they report for the 20th century was in 1938, when it was nearly four-tenths of a degree Fahrenheit warmer than the air temperature in 2000.

Ice core data from Baffin Island and sea core sediments from the Chukchi Sea also show that even if there is warming, it has occurred before. In Alaska, the onset of a climatic shift -- a warming -- in 1976-1977 ended the multidecade cold trend of the mid-20th century, returning temperatures to those of the early 20th century.

In addition, a study commissioned by Canada's Fisheries and Oceans Department examined the relationship between air temperature and sea ice coverage, concluding, "the possible impact of global warming appears to play a minor role in changes to Arctic sea ice."

According to the Arctic Assessment, the threat to polar bears is threefold: changes in rainfall or snowfall amounts or patterns could affect the ability of the bears' primary prey (seals) to reproduce successfully; decreased sea ice could result in a greater number of polar bears drowning or living more on land, negatively affecting their diet by forcing them to use their fat stores prior to hibernation; and unusual warm spells could cause the collapse of winter dens or force more bears into less desirable den areas.

Though uniquely adapted to the Arctic, polar bears are not wedded solely to its coldest parts nor a specific Arctic diet. Aside from a variety of seals, they eat fish, kelp, caribou, ducks, sea birds, the occasional beluga whale and musk ox and scavenged whale and walrus carcasses.

Interestingly, the World Wildlife Fund (WWF) has also written on the threats posed to polar bears from global warming. But, their own data on polar bear populations contradict claims that rising air temperatures are causing a decline in polar bear populations.

According to the WWF there are some 22,000 polar bears in about 20 distinct populations worldwide. Only two bear populations -- accounting for about 16.4 percent of the total -- are decreasing, and they are in areas where air temperatures have actually fallen, such as the Baffin Bay region. By contrast, another two populations -- about 13.6 percent of the total number -- are growing, and they live in areas where air temperatures have risen, near the Bering Strait and the Chukchi Sea.

As for the rest, 10 populations -- comprising about 45.4 percent of the total -- are stable, and the status of the remaining six is unknown. Conclusion: based on the available evidence there is little reason to believe the current warming trend will lead to extinction of polar bears.

These bears have survived for thousands of years, during both colder and warmer periods, and their populations are by and large in good shape. Polar bears may face many threats, but global warming is not primary among them. Global warming alarmists are like the Wizard of Oz, asking the public to fear the spectacle but not to pull back the curtain and unmask them for the charlatans they are.

The Washington Times, 15 May 2006


Many people would like to be kind to others, so Leftists exploit that with their nonsense about equality. Most people want a clean, green environment, so Greenies exploit that by inventing all sorts of far-fetched threats to the environment. But for both, the real motive is to promote themselves as wiser and better than everyone else, regardless of the truth.

Global warming has taken the place of Communism as an absurdity that "liberals" will defend to the death regardless of the evidence showing its folly. Evidence never has mattered to real Leftists.


