Tuesday, February 28, 2006

Bible Bending Propaganda

It's almost too bad that Jesus Christ has been historically depicted as long-haired, bearded, and sandal-clad -- because the enviro-hippies behind something called the "Evangelical Climate Initiative" have now claimed Him for their own alarmist agenda. While those physical representations of Christ may be accurate, the Biblical claims that these Birkenstocked believers make for global warming reductions are hermeneutically deficient. Most of their flawed interpretations emphasize the social gospel (surprise!) rather than genuine Divine intent -- a common liberal tendency.

They include the utterances of Sir John Houghton, a climate scientist, former chair of the Intergovernmental Panel on Climate Change, and an "evangelical Christian," according to the ECI. In a speech to the National Association of Evangelicals almost a year ago, he made a scientific case for the existence of global warming, and its benefits and drawbacks (with the second far outweighing the first). I will leave technical refutation and doubt to others, and address the Biblical support he attempted to use to buttress his position:

"[God] demonstrated this most eloquently by sending his son Jesus to be part of creation and by giving to us the responsibility of being good stewards of creation. What is more I believe that we do not do this on our own but in partnership with Him -- a partnership that is presented so beautifully in the early chapters of Genesis where we read that God walked with Adam and Eve in the garden in the cool of the day."

There you have it! God intended for man to live in temperate climes. But then again, He also intended for man to live naked: so much for that. And not surprisingly, Houghton butchered Genesis 3:8, which actually says that Adam and Eve (after the Fall) "heard the sound of the Lord God walking in the garden in the cool of the day," after which they tried to hide from Him -- hardly a harmonious stroll.

But seriously, Sir John's pining for the early days of the Bible is admirable. Who doesn't wish we were in the days of sinless perfection, in absolute accord with God? Alas, that is not the condition of the present globe on which we live. Rather than actually making us "good stewards of creation" as Houghton claims, God instead cursed the world and essentially said, "Here -- deal with this!" We've been toiling over the corrupted soil ever since, and the unspoiled creation that Jesus was allegedly sent "to be part of" disappeared long ago.

Still, the 86 ministry leaders behind the ECI bought into it, maintaining that they "are articulating a biblical, Christ-centered, business-friendly evangelical approach to climate change and providing a different way of understanding the problem":

Once we understand the profound impacts climate change will have on people, especially the most vulnerable, then we find plenty in the Bible calling us to take prompt action. Jesus' commands to love our neighbors (Mk. 12:30-31), do unto others as we would have them do unto us (Lk. 6:31), care for "the least of these" (Mt. 25:40, 45), and be proper stewards of His creation (Lk. 12:42-48; Col. 1:16) all require immediate and sustained action to solve global warming.

These "social Gospel" passages can hardly be construed as a legitimate case for the fight to reduce global warming. The first misinterpretation is the proper role of man in relation to the creation. Calls to "stewardship" in the Bible never have to do with caring for some pristine earth -- for its own sake or for God's. Instead God gave man "dominion" over the world and its creatures, for human consumption and use.

Of course, that doesn't mean you pollute willy-nilly. Dumping oil or chemicals where they can seep into someone's water supply certainly is unneighborly. But that has nothing to do with Biblical "earth" stewardship, and linking disputed negative global warming effects to proper social practices is misleading at best. If environmentally conscious Christians want to do something that will clearly and measurably help their poor neighbors, why don't they invest in waste removal in places like Port-au-Prince and Bangladesh instead?

The answer is, because ECI signees have been duped by environmentalist liberals and have failed to discern their Biblical illiteracy:

This is God's world, and any damage that we do to God's world is an offense against God Himself (Gen. 1; Ps. 24; Col. 1:16).

I hope the ECI endorsers didn't overstrain their eyes searching those Scripture references for evidence of God's anger at human abuse of the earth. Instead, they would do well to recall some other Biblical citations that emphasize what the real goals of Christian ministry should be in relation to the planet. They should remember that the Apostle Paul disdained those "who set their mind on earthly things. For our citizenship is in heaven, from which we also eagerly wait for the Savior, the Lord Jesus Christ" (Philippians 3:19-20).

As for Jesus, contrary to Sir John's assertions, He does not dwell on the earth but instead will return to the New Jerusalem (Rev. 21:22-23), after God also establishes a new heaven and a new earth (Rev. 21:1).

And don't forget, God has some serious global warming of His own planned (2 Peter 3:10). Christian leaders ought to be warning people about that rather than looking for ways to mitigate the questionable effects of the current heat wave.


Genetically engineered crops: Only the news that fits a Luddite agenda in the NYT

Newspapers are often criticized for bias in their "news" articles. A prime example was Andrew Pollack's Feb. 14 New York Times piece on biotechnology applied to agriculture: "At the dawn of the era of genetically engineered crops, scientists were envisioning all sorts of healthier and tastier foods, including cancer-fighting tomatoes, rot-resistant fruits, potatoes that would produce healthier French fries and even beans that would not cause flatulence. ... Resistance to genetically modified foods, technical difficulties, legal and business obstacles and the ability to develop improved foods without genetic engineering have winnowed the pipeline."

While Mr. Pollack misses many of the nuances about biotechnology applied to agriculture and food production, he devotes ample ink to the anti-biotech crowd, including the Pew Initiative on Food and Biotechnology (which he describes as a "nonprofit group," though "anti-biotechnology lobbyists" would be more accurate) and the radical Friends of the Earth.

Memo to Mr. Pollack: All points of view on scientific and technological issues are not created equal. Good journalism is not served by creating a kind of moral equivalence between those who hold ideological, anti-biotech views and those with supportable, legitimate viewpoints -- not unlike equating creation theory with Darwinian theory. How ironic that the same activists who opposed agbiotech relentlessly for 20 years now decry the "hype" and "overselling" of its benefits -- rather like the teenager convicted of murdering his parents who pleads for mercy from the courts because he's an orphan.

Reflecting the views of biotech's antagonists, Mr. Pollack approaches the subject as though genetic engineering of plants were fundamentally new. But virtually all the 200 major crops in North America have been genetically improved, or modified, in some way. Plant breeders, not nature, gave us seedless grapes and watermelons, the tangelo (a tangerine-grapefruit hybrid), the canola variety of rapeseed and fungus-resistant strawberries. In North American and European diets, only fish and wild game, berries and mushrooms may be said not to have been genetically engineered in some fashion.

North Americans have consumed more than a trillion servings of foods containing gene-spliced ingredients, without a single untoward reaction. Gene-splicing is essentially an extension, or refinement, of earlier, less precise, less predictable techniques. In fact, when conventional and gene-spliced seed materials are mixed, arguably the former should be thought of as contaminating the latter.

What makes false alarms about a new technology hard to expose is the virtual impossibility of demonstrating the absolute safety of any activity or product: It's always possible we haven't yet gotten to the nth hypothetical risk or to the nth dose or the nth year of exposure, when the risk will finally be demonstrated. It is logically impossible to prove a negative, and all activities pose some nonzero risk of adverse effects.

The use of gene-splicing to craft small, precise genetic changes that enhance or introduce desirable traits into plants has been a stunning technological success. But excessive and unscientific regulation and the intractable opposition of activists have slowed its translation into consumer-friendly foods. Contrary to Mr. Pollack's implication, gene-spliced "potatoes that would produce healthier french fries" (with higher-than-usual starch content) were available -- until anti-biotech activists bullied fast-food chains into rejecting them.

Mr. Pollack's statement, "Developing nonallergenic products and other healthful crops has also proved to be difficult technically" is simply untrue. A vast spectrum of such plants has been crafted by laboratory scientists, but they cannot afford the gratuitously inflated regulatory costs to test the plants in the field. Excessive and unwise regulation is a major reason products in the development pipeline "do not include many of the products once envisioned," to quote Mr. Pollack.

Unscientific and discriminatory Environmental Protection Agency and Agriculture Department regulatory policies make field trials with gene-spliced plants 10 to 20 times more expensive than a similar plant engineered with less precise, less predictable conventional genetic techniques. Unlike pharmaceutical development, agricultural R&D is a low-budget enterprise. Such counterintuitive regulation and gratuitous costs make it uneconomical to develop many promising and even important food products.

Then there is Mr. Pollack's puzzling, disparaging claim that "industry ... has been peddling the same two advantages -- herbicide tolerance and insect resistance -- for 10 years." These traits have been of monumental importance, not only to farmers' bottom line but to occupational health and the natural environment. Enhanced pest resistance in plants has obviated the need for hundreds of millions of pounds of chemical pesticides (and thereby reduced environmental and occupational exposures). Herbicide tolerance has made possible a shift to more benign herbicides and environment-friendly no-till farming.

As British historian Paul Johnson has written, "Left to themselves, the creative forces in society will always deliver, but keeping them reasonably free to do so is a perpetual, grinding battle. It is one that must never be lost." Once again, the New York Times is fighting on the wrong side.



After the owl was placed on the endangered list in 1990, the Clinton administration (in the mid-'90s) banned all logging on 24 million acres in the Pacific Northwest. This shut down the logging industry, costing around 130,000 jobs.

Last month the U.S. Fish and Wildlife Service published a call for proposals to develop a recovery plan for the northern spotted owl. It's about time: The owl was added to the nation's burgeoning list of threatened and endangered species nearly 16 years ago. That it took so long helps explain why only 10 of the 1,264 species listed under the federal Endangered Species Act (ESA) have ever recovered.

If my gut reading is correct, the owl won't be No. 11. It is already doomed across much of its range, and the reason is well known among field biologists who have been observing the bird for some 20 years. More aggressive barred owls are pushing them out of their 21-million-acre home range, or killing them, or both. In any case, spotted owls are fighting a losing battle, a fact that has me wondering if the Fish and Wildlife Service isn't whistling past the graveyard.

Barred owls, not to be confused with common barn owls, migrated from their native East Coast environs a century or more ago. No one knows why, and until they started killing already-threatened spotted owls, no one cared. Now they do. Just how long it will take the barreds to finish off their brethren isn't known, but the situation has become so precarious that a federal biologist recently opined that shooting barred owls might be the only way to save spotted owls.

How and why the government failed so miserably in its costly attempt to protect spotted owls is a sordid tale that illustrates what happens when science is politicized. Begin with the fact that protecting owls was never the objective: Saving old-growth forests from chainsaws was. The owl was simply a surrogate -- a stand-in for forests that do not themselves qualify for ESA protection. But if a link could be established between harvesting in old-growth forests and declining spotted owl numbers, the bird might well qualify for listing -- a line of thinking that in 1988 led Andy Stahl, then a resource analyst with the Sierra Club Legal Defense Fund, to famously declare, "Thank goodness the spotted owl evolved in the Northwest, for if it hadn't, we'd have to genetically engineer it. It's the perfect species for use as a surrogate."

Indeed it was. But to back their play, the Sierra Club, the Audubon Society and their friends in the Clinton administration needed a good story for the judge. They found it in three obscure reports: a 1976 master's thesis written by wildlife biology major Eric Forsman at Oregon State University; Mr. Forsman's 1980 doctoral dissertation; and a 1984 report written by him and two other biologists. All three reports suggested a strong link between declining owl populations and harvesting in old-growth forests. Unfortunately, the hypothesis has never been tested, so despite 16 years of research, no link between old-growth harvesting and declining owl populations has ever been established.

Moreover, we know little about the relationship between harvesting and owl populations. One such study -- privately funded -- infers a relationship between harvesting and owls that runs counter to expectations. In other words, in areas where some harvesting has occurred, owl numbers are increasing a bit, or at least holding their own, while numbers are declining in areas where no harvesting has occurred.

More here


Fossil wood gives vital clues to ancient climates

New research into a missing link in climatology shows that the Earth was not overcome by a greenhouse period when dinosaurs dominated, but experienced rapid fluctuations in temperature and sea level change that resulted in a balance of the global carbon cycle. The study is being published in the March issue of Geology.

"Most people think the mid-Cretaceous period was a super-greenhouse," says Darren Groecke, assistant professor and Director of the Stable Isotope Biogeochemistry Laboratory at McMaster University. "But in fact it was not too dissimilar to the climates over the past 5 million years."

By using high-resolution stable-isotope analysis from 95-million-year-old fossilized wood collected from Nebraska, Groecke and his team were able to precisely correlate the terrestrial carbon cycle with that from deep-sea records. However, when they compared the carbon curves from both records, it was evident that a chunk of about 500,000 years was missing from the terrestrial record. Other records already indicated a drop in sea level, a 2-4°C drop in oceanic temperature and a breakdown in oceanic stratification coincident with a marine extinction event.

"Rapid, large falls in sea-level in the ancient record are typically only produced by a glaciation, and so the combination of all the data during the mid-Cretaceous period suggests a short-lived glaciation during a period generally considered to be a super-greenhouse," says Groecke.

"Whatever hits the water causes a ripple effect on land," says Groecke. "Earth often undergoes rapid temperature fluctuations, and this new information may help us to understand how the biosphere will respond to human-generated alterations of CO2 concentration."

He said the research not only challenges conventional wisdom surrounding ancient climates, but also makes a case for the use of high-resolution sampling in order to reconstruct a more accurate picture of the ancient climate and its effect on the Earth.

Eurekalert, 23 February 2006


Many people would like to be kind to others so Leftists exploit that with their nonsense about equality. Most people want a clean, green environment so Greenies exploit that by inventing all sorts of far-fetched threats to the environment. But for both, the real motive is to promote themselves as wiser and better than everyone else, truth regardless.

Global warming has taken the place of Communism as an absurdity that "liberals" will defend to the death regardless of the evidence showing its folly. Evidence never has mattered to real Leftists

Comments? Email me here. My Home Page is here or here. For times when blogger.com is playing up, there are mirrors of this site here and here.


Monday, February 27, 2006


Report from The Wall Street Journal, 10 February 2006

Seeking to resolve a scientific dispute that has taken on a rancorous political edge, the National Academy of Sciences said it had agreed to a request from Congress to assess how well researchers understand the history of temperatures on earth. The study by the academy, an independent advisory body based in Washington, will focus on the "hockey stick," a chart of past temperatures that critics say is inaccurate. The graph gets its name because of the sudden, blade-like rise of recent temperatures compared with past epochs.

The controversy took a sharp political turn in July when Rep. Joe Barton (R., Texas), head of the House Energy and Commerce Committee, launched a probe into the work of three climate specialists who generated the graph, including Michael Mann, now a professor at Pennsylvania State University. Mr. Barton's inquiry drew a rebuke from several scientific societies as well as fellow Republican Sherwood Boehlert of New York, chairman of the House Committee on Science, who called it a blatant effort to intimidate global-warming researchers. After Mr. Barton didn't respond to an offer to jointly bring the issue to the National Academy, Mr. Boehlert independently asked for a review in November, science committee chief of staff David Goldston said. "It appeared that the issue was not going to go away by itself. We thought this was an appropriate way to get an assessment of the science," Mr. Goldston said in an interview.

Larry Neal, deputy staff director for Mr. Barton's committee, said in a statement that because "combating climate change is a breathtakingly expensive prospect," it deserved closer study, and that the academy was "unlikely" to address all of Mr. Barton's concerns. Mr. Barton has already sought a separate analysis of the hockey stick led by statistician Edward Wegman of George Mason University, people familiar with the matter said. Dr. Wegman couldn't be reached yesterday.

Using records stored in ice, tree rings, and coral reefs, scientists including Dr. Mann have estimated that current air temperatures exceed any in the past 1,000 years. Such findings are not only evidence for man-made global warming, but also underlie predictions of future temperature rises. An 11-member academy panel will now study the accuracy and importance of such research, in particular the work of Dr. Mann, whose hockey-stick graph was included in a report issued by the United Nations in 2001. An academy spokesman said the report would be completed in about four months.

Dr. Mann's critics, including two amateur Canadian climate researchers, say his work contains serious inaccuracies. Dr. Mann has denied that, but the debate has prompted several climate researchers to take a fresh look at temperature reconstructions. While some recent publications have found fault with the hockey stick and similar studies, others have sought to rebut critics.


Excerpt from a letter by Steve McIntyre and Ross McKitrick, 17 February 2006

We are writing to protest three of the appointments to the Panel because of bias, lack of objectivity and/or conflict of interest and to protest the failure of the Panel as presently constituted to meet policies of the National Academy of Sciences (NAS) regarding committee composition and balance. We have suggested several alternatives whose appointment would at least partly mitigate these problems.

Dr. Otto-Bliesner

The "Policy on Committee Composition and Balance and Conflicts of Interest for Committees Used in the Development of Reports", a policy statement of the National Academy of Science (NAS) issued in compliance with section 15 of the federal Advisory Committee Act, provides explicit statements about the issues of bias, lack of objectivity and conflict of interest. It states, with respect to conflict of interest:

It is essential that the work of committees of the institution used in the development of reports not be compromised by any significant conflict of interest. For this purpose, the term "conflict of interest" means any financial or other interest which conflicts with the service of the individual because it (1) could significantly impair the individual's objectivity or (2) could create an unfair competitive advantage for any person or organization. Except for those situations in which the institution determines that a conflict of interest is unavoidable and promptly and publicly discloses the conflict of interest, no individual can be appointed to serve (or continue to serve) on a committee of the institution used in the development of reports if the individual has a conflict of interest that is relevant to the functions to be performed. [bold in original]

and, with respect to bias and lack of objectivity:

Finally, it is essential that the work of committees that are used by the institution in the development of reports not be compromised by issues of bias and lack of objectivity. Questions of lack of objectivity and bias ordinarily relate to views stated or positions taken that are largely intellectually motivated or that arise from the close identification or association of an individual with a particular point of view or the positions or perspectives of a particular group.

The Panel is obviously going to have to consider our various criticisms of Mann et al. and will undoubtedly hear reference to a national Media Advisory by UCAR in May 2005 declaring that UCAR employee Caspar Ammann had shown that our various criticisms were "unfounded". This press release has been relied upon in material presented to the U.S. Congress by Sir John Houghton of IPCC, by Dr Mann and by the European Geophysical Union. Ammann has advised one of us that he has used these two unpublished articles in his annual employment review at UCAR.

One of the proposed panellists, Dr Otto-Bliesner, has not only been a frequent coauthor and presenter with Ammann, but is Ammann's immediate supervisor at UCAR (see http://www.cgd.ucar.edu/ccr/paleo/images/Bette1.jpg). As such, she has presumably considered Ammann's articles on our work in the course of carrying out Ammann's annual review. We presume that she would have been involved in preparing and/or approving the UCAR press release on Ammann's work last May. In addition, last year, she co-authored an article with Bradley (of Mann, Bradley and Hughes) and served on a committee with him. It appears to us that her association with Ammann rises to a conflict of interest within NAS policy, but, in the alternative, her associations with Ammann and Bradley certainly rise to bias and lack of objectivity. While she is undoubtedly a meritorious person, the field of candidates is not so limited that her participation in the panel is necessary to its functioning and indeed her continued participation might well diminish the actual and/or perceived ability of the panel to provide objective advice. For example, *** would be an equally competent alternate without the accompanying problems of bias, lack of objectivity and conflict of interest.

Dr. Nychka

Another proposed panellist, Dr Nychka, also a UCAR employee, is listed at Ammann's webpage as presently collaborating not only with Ammann, but with Mann (see http://www.assessment.ucar.edu/paleo/past_stationarity.html). This ongoing collaboration certainly creates the appearance of a "close identification or association of an individual with a particular point of view or the positions or perspectives of a particular group". Again, while Nychka is undoubtedly a meritorious person, the field of candidates is not so limited that he is irreplaceable on the panel and indeed his continued participation might well diminish both the actual ability and the perceived ability of the panel to provide objective advice.

Dr. Cuffey

We are also concerned about apparent bias and lack of objectivity in a third proposed panellist, Dr Cuffey, who in a newspaper op-ed recently wrote:

Mounting evidence has forced an end to any serious scientific debate on whether humans are causing global warming. This is an event of historical significance, but one obscured from public view by the arcane technical literature and the noise generated by perpetual partisans. (see http://www.sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/2005/10/09/ING5FF2U031.DTL&type=printable )

The panel is being asked to consider the "historical significance" of present climate change. A panellist who has a priori dismissed questions on the matter, some of which are necessarily quite technical, as being "arcane" and "noise generated by perpetual partisans" can be "reasonably perceived to be unwilling to consider other perspectives or relevant evidence to the contrary" as defined in NAS policy.


UK industry will face increased costs of around 350 million pounds ($614 million) after the European Commission's decision to reject the UK's amended emissions plan, according to business leaders. On Wednesday, the Commission announced it was rejecting "on the grounds of late submission" the UK's national allocation plan (NAP) for the first phase (2005-07) of the EU Emissions Trading Scheme (ETS). It would have increased the UK's overall allocation for the three years by the equivalent of 20 million tonnes (Mt) of carbon dioxide, to 756Mt. The Commission had been legally bound to consider the amendment after it lost a court case against the UK government in November 2005.

"This is very disappointing. It does nothing to reduce carbon emissions. It simply increases costs to UK Plc by about 350 million pounds," said David Porter, chief executive of the UK's Association of Electricity Producers. "We shall be seeking talks with the government to see what more can be done," he added. "The 350 million pound costs of covering this misguided shortfall in the UK's carbon emissions allowance is unaffordable," said Matthew Farrow, head of environment for the Confederation of British Industry. "We will urge the government to pursue the case further," he added.

The government also expressed its disappointment at the decision. "We are considering our position, which includes possible further legal action," said a spokesman for the Department of Environment, Food and Rural Affairs (Defra). The government has two months to appeal against the Commission's decision. The NAP sets out the number of allowances - effectively, the emissions targets - for installations in the five industry sectors covered by the EU ETS (power generation, iron and steel, pulp and paper, mineral oil refineries and building materials).

The Commission said it took note of the obligation to consider amendments "as long as these are notified prior to the deadline by which member states must take the final allocation decision. For the first trading period, the relevant deadline was 30 September 2004." The UK submitted its amendment to the Commission in October 2004.


Greenies take on the bottled water nonsense

Drinking water must be one of the most harmless things people can do, so I think the Greenies should be aiming their fire elsewhere (at soil erosion and uneconomic farming, for instance) -- but I do think they have got a point about what a lot of nonsense bottled water is

Australians' love affair with bottled water may be making healthy-living advocates happy but environmentalists say it's taking a heavy toll on the planet. With 65 per cent of plastic drink bottles ending up in landfill, environmentalists are calling for better recycling services to stop an increasingly popular healthy drinking habit from wreaking further damage. The popularity of buying water from a shop fridge is rising at a rate of 10 per cent a year as consumers become increasingly aware that staying well-hydrated is healthy. About 550 million litres of bottled water were consumed in 2004-05, the Australian Beverage Council said. Most purchases were in addition to consuming soft drinks rather than replacing them, it said. But the plastic containers are becoming a big environmental hazard because they use valuable fuels to manufacture and create mountains of rubbish when thrown away, environmentalists say.

Environmental scientist Tim Grant said it was "counterintuitive" that bottled water was such a successful product. "People pay $2.50 for something that's [otherwise] free," Mr Grant said. A recent report by the Washington-based Earth Policy Institute found that the global consumption of bottled water had risen by 57 per cent since 1999 to 154 billion litres in 2004. Much of the growth came from countries such as Australia, where most tap water was as high quality as any water that could be bought. The report's author Emily Arnold said bottled water worldwide required 2.7 million tonnes of plastic each year for its packaging. She said the manufacture of plastic water bottles used 1.5 million barrels of crude oil in the United States alone. "In contrast to tap water, which is distributed through an energy-efficient infrastructure, transporting bottled water long distances involves burning massive quantities of fossil fuels."

More here


Leftist elitists often deride the "Cookie-Cutter" houses of America's suburbs and exurbs. Well, below is what socialism gives you. The picture is taken from a helicopter so looks a bit unreal but I am assured that it has not been retouched.

The picture shows about 300 out of a total of about 10,000 "low income homes" in Ixtapaluca, Mexico (near Mexico City). Socialism gives you REAL cookie-cutter houses.



A reader writes:

"I saw your link to the housing project in Mexico. I also thought it had been computer generated or digitally enhanced until I blew it up and found minute differences to each dwelling. They use a moving slipform style of concrete construction and say they can build a house in 31 days" (Big PDF).




Sunday, February 26, 2006

Climate of Uncertainty: Why global warming is back in the headlines

Climate change is heating up again in American politics, the result of an orchestrated campaign to push the issue to the forefront. Al Gore is hitting the road with his animated computer slide show and has a documentary movie coming out. Climate action advocates skillfully exploited the Bush administration's clumsy moves to limit the public statements of NASA's chief climate scientist, Dr. James Hansen, and landed panicky stories about climate "tipping points" and scientific censorship on the front pages of the New York Times and Washington Post.

The real head-turner, however, was the recent launch of the Evangelical Climate Initiative, in which nearly 100 evangelical leaders signed on to the environmentalist party line. Some are the same liberal evangelicals who tub-thumped for the nuclear freeze during the Reagan years, but some are conservative evangelicals important to Bush's red-state base, such as Rick (The Purpose Driven Life) Warren. When the eco-apocalypse meets the New Testament apocalypse, you know something is up. That something is a sense of political desperation among climate change alarmists, as the world slowly turns against them.

If there is any subject more certain than the federal budget process to bring on eye-glaze, it is global warming and the drearily repetitive argument about the Kyoto Protocol to reduce greenhouse gas emissions. The issue combines the worst of wonky numerology (parts per million of various gases, complex computer models, opaque cost-benefit analyses), an alphabet soup of unctuous international bureaucracies (IPCC, UNFCCC, SRES, TAR, USGCRP, etc., etc.), and the incessant braying of interest groups. No wonder Al Gore loves it so much. Yet the issue, seemingly stuck in a rut for almost two decades, is starting to shake loose and head in new directions.

How do you go about sorting out sense from nonsense? Very few people who follow closely the subject of climate change argue that there's nothing to it. There is unanimity that the planet has warmed by about 1 degree over the last century. Just about everyone agrees that the growth of greenhouse gas emissions from fossil fuels cannot continue forever. That's where the agreement ends. The range of possible temperature increase over the next century is fairly wide in the official forecasts, from 1.4 degrees Celsius on the low side, which might not be difficult to cope with, to 5.8 degrees Celsius on the high side, which would mean major environmental problems for the planet. How probable is any point along the distribution? For reasons having to do with the cascading statistical uncertainties of the thousands of variables in computer climate models, we can't assign a probability to any narrower range of temperature forecasts, though very clever people are trying.
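The "cascading statistical uncertainties" point can be illustrated with a toy Monte Carlo sketch. All numbers below are invented for illustration and have nothing to do with any real climate model: the point is only that when several uncertain factors multiply together, the spread of the combined outcome is far wider than the spread of any single factor.

```python
import random

random.seed(0)

# Toy model: warming = baseline * sensitivity * emissions * feedback.
# Each multiplier is uncertain; the ranges below are made up.
def simulate():
    sensitivity = random.uniform(0.8, 1.2)  # model-sensitivity spread
    emissions = random.uniform(0.7, 1.5)    # emissions-scenario spread
    feedback = random.uniform(0.6, 1.8)     # cloud/ocean feedback spread
    return 2.0 * sensitivity * emissions * feedback  # deg C, toy baseline

runs = [simulate() for _ in range(100_000)]
print(f"combined outcome spread: {min(runs):.1f} to {max(runs):.1f} deg C")
```

Even though no single factor varies by more than a factor of three, the combined spread runs from well under 1 degree to over 6, which is why assigning a probability to any narrow slice of the official 1.4-5.8 degree range is so hard.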

So for most of the last decade we have been playing a back and forth game with signs and wonders that are offered as confirmation that catastrophic global warming is well under way. But these tend to be as controversial as the computer climate models. As good as our measurement techniques are, there is still large disagreement about basic facts. Are the polar ice caps melting or growing thicker? Both, depending on what data set you consult. Is the last decade the hottest in 2,000 years? You need a flak jacket to survive the crossfire on this one. Can variance in solar radiation account for some or most of the warming we've experienced to date? Better put on a second flak jacket. Do clouds warm or cool the planet? Both, and understanding the balance between their conflicting effects remains a huge problem for climate models. Are ocean temperatures rising and Gulf Stream currents changing? Probably, but we need better data to be sure. Will hurricanes get worse? Get a helmet to go with your flak jacket, and put FEMA on speed-dial. Aren't scientists overwhelmingly in agreement that the science is "settled"? Well, yes, except for the hundreds of scientists who've signed various statements and resolutions saying we lack adequate mastery of the subject.

At this point even most people with a scientific background throw up their hands and say, "Call me back in 50 years if I need to turn up my air conditioning." It does no good, as global warming skeptics and many official climate science reports often do, to call for reducing "uncertainty" in climate science. The uncertainties of climate change have less to do with the enormous complexity of the linkages of the various earth sciences comprising the issue, and more to do with the stakes involved. With near-term global greenhouse gas suppression costs called for at Kyoto calculated in the multiple trillions of dollars ($37 trillion according to one widely accepted estimate), political considerations magnify the importance of nailing down uncertainties beyond the ability of science to do so. In fact, with a subject as sprawling as climate change, the disciplinary diversity of science is going to magnify rather than narrow uncertainties.

Ultimately, policymakers will have to exercise their best judgment rather than wait for oracular scientific conclusiveness, which will never come. Notwithstanding the relentless drumbeat of studies offered as proof of onrushing catastrophe, policymakers are rightly wary of handing over the keys of the economy to the very same people who brought us the population bomb that turned out to be a wet firecracker, predicted imminent resource scarcity, which also fizzled, and even, in the 1970s, hyperventilated that our greatest climate risk was a new ice age. (The ice age scare was not the tiny sideshow climate action advocates today try to claim that it was; the EPA in the early 1970s thought one reason to reduce sulfur dioxide emissions was that "aerosols" like SO2 were reflecting too much sunlight and increasing the risk of cooling the planet.) The suspicion of hidden agendas is buttressed by the default position of the most vocal environmentalists and the front-page-seeking reporters who cover the climate beat: They greet with complete credulity the most extreme forecasts and portents, whether it is melting ice, boiling oceans, or expiring frogs.

This is more than just a problem of having cried wolf too often; there seems to have been little introspection or second thoughts among environmentalists about why their Malthusian alarms rang false in the past. Given their track record, why should anyone believe that this time the alarmists have it right? There has been only grudging acceptance among environmentalists of the positive role of economic growth, the resiliency of human beings, and the dynamic world human ingenuity creates. It might be possible to grant more credibility to the alarmists if there were signs that their current analysis incorporates fundamental corrections of their previous neo-Malthusian frameworks. The recently released U.N. Millennium Ecosystem Assessment appears to go some of the way toward this kind of reappraisal, but the 12-volume (so far), 3,000-page report, by its very length, defends itself against the risk of being read or comprehended.

This brings us to the official effort to assess climate change for the purposes of making policy: the U.N.'s Intergovernmental Panel on Climate Change (IPCC). In the abstract the IPCC deserves its due. The effort to get to the bottom of climate change may be the largest scientific inquiry in human history. It requires the coordination of thousands of specialists, the development of whole new scientific techniques, and the refinement of elaborate computer models that need weeks to run on the world's most powerful supercomputers. Even discounting for the inherent weaknesses of computer models, this kind of sustained effort is likely to generate valuable knowledge in the fullness of time. Producing a coherent report every few years that combines all of this work is an extraordinary feat. The IPCC is currently well into the process of producing its Fourth Assessment Report, due out next year.

The problem with the IPCC process, however, is that the scientists and experts participating in each iteration have become increasingly self-selected toward those with a taste for climate alarmism. Past reports, especially the Second Assessment Report in 1995, were badly politicized by U.N. bureaucrats, misrepresenting the "consensus" the report actually contained. Rumors abound of internal political pressures to "sex up" the reports to make the case for the economically ruinous Kyoto agreement more compelling. Honest skeptics qualified to participate have found the consensus-oriented IPCC process too frustrating and have dropped out. For example, Richard Lindzen, a participant and chapter author in the Third Assessment Report in 2001, is not participating in the next round. More and more, the IPCC is becoming an echo chamber for one point of view, and is closed to honest criticism from the outside. They have not merely rejected criticism; in the fashion of environmental activists, they have demonized their reasonable critics.

The case of David Henderson and Ian Castles is a good example. Henderson, the former chief economist of the OECD, and Castles, a highly regarded Australian economist, noticed three years ago a serious methodological anomaly in the IPCC's 100-year greenhouse gas emission forecasts, which are the primary input for the computer climate models. Henderson and Castles made a compelling argument that the forecasts were unrealistically high. Everyone recalls the first day of computer science class: garbage in, garbage out. If future greenhouse gas emissions are badly overestimated, then even a perfect computer climate model will spit out a false temperature prediction. If Henderson and Castles are right, it means we may have more time to address even the most alarmist global warming forecasts. Since Henderson and Castles opened the debate, the IPCC's emissions forecasts have been subject to withering criticism from dozens of other reputable economists, including from a number of climate alarmists who, to their credit, argue that this crucial question should be got right.
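The garbage-in, garbage-out point can be made concrete with a toy sketch. The figures and the "sensitivity constant" below are invented for illustration only; the point is simply that even a perfectly accurate model faithfully converts an inflated emissions input into an inflated temperature output.

```python
# Toy illustration of "garbage in, garbage out": a pretend-perfect
# model still gives a wrong answer when fed an inflated input.
# All numbers are invented; none come from the IPCC or any real forecast.

def toy_climate_model(cumulative_emissions_gt):
    """Pretend-perfect model: warming scales linearly with emissions."""
    degrees_per_gigatonne = 0.0015  # made-up sensitivity constant
    return cumulative_emissions_gt * degrees_per_gigatonne

true_emissions = 2000      # hypothetical "correct" 100-year forecast, Gt
inflated_emissions = 3000  # hypothetical overestimate, 50% too high

print(toy_climate_model(true_emissions))      # warming from sound input
print(toy_climate_model(inflated_emissions))  # same model, inflated input
```

The model itself is blameless; the 50 percent overestimate on the input becomes a 50 percent overestimate on the output, which is exactly the Castles-Henderson worry about the emissions scenarios feeding the climate models.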

The IPCC's reaction to Henderson and Castles was startling. The panel issued a vituperative press release blasting the two men for peddling "disinformation." A few scientists and economists connected with the IPCC had the decency to say publicly that the press release was a regrettable error. But it is typical of the increasingly arrogant IPCC leadership. The IPCC's chairman, Dr. Rajendra Pachauri, compared Danish eco-skeptic Bjorn Lomborg to Hitler because of Lomborg's wholly sensible and well-founded calculation that near-term emissions reductions make no economic sense. "What is the difference between Lomborg's view of humanity and Hitler's?" Pachauri told a Danish newspaper in 2004. "If you were to accept Lomborg's way of thinking, then maybe what Hitler did was the right thing." It is hard to have much confidence in an organization whose chairman can say this and keep his job. (The reductio ad Hitlerum is contagious: Two weeks ago NASA's James Hansen compared having a Bush political appointee listen in on his media phone calls--an obnoxious but routine practice in the federal government--to Nazi Germany or the Soviet Union, eliciting rapturous applause from an audience in New York. And Hansen wonders why people call him an alarmist.)

Moreover, despite the cascade of criticism of the IPCC's emissions forecasts, the same set of forecasts will be used in the next round of climate models, assuring a defective result. The IPCC says it would take too long to do a fresh set of forecasts. Despite the IPCC's wall of resistance, the consensus is coming around to the Castles and Henderson view that the IPCC has done a poor job of handling this important aspect of the issue. Nature magazine, normally aligned with the alarmists, editorialized in January that the IPCC's "macroeconomic assumptions . . . ought really to be discarded as wishful thinking," and criticized the IPCC for not incorporating "economists' latest thinking" in their next assessment.

Given its size and the imperatives of bureaucracy, the IPCC monopoly on official climate science is probably unreformable. What it needs is competition--the equivalent of the famous "Team B" of Sovietologists at the CIA in the 1970s. A robust independent effort at assessing climate science would have the tonic effect of making the IPCC behave with more circumspection in its methodology and judgment. In the absence of a full-fledged Team B effort, governments ought to require greater involvement of their finance ministries. It is astonishing how aloof most government finance ministries are to the entire IPCC and Kyoto process; in most European governments (and in the U.S. government, too), the whole mess is left to environment departments and foreign ministries, assuring a high level of economic naivete.

There is some movement toward broadening the climate portfolio and introducing some competitive analysis, especially in Britain, which has set for itself the most ambitious emissions cuts of any nation, aiming for a 60 percent reduction by the year 2050. Her Majesty's Treasury has embarked on a full-scale review of the economics of climate change science and policy, coincidentally right after a bipartisan select committee of the House of Lords issued a blistering report on the deficiencies of economic analysis of the issue.

This is merely one sign of the crackup of the global climate change caucus. Slowly, most governments are coming around to what has been President George W. Bush's position on the matter since taking office in 2001: The Kyoto Protocol is a nonstarter. With just a few years to go before the end of the initial target date of Kyoto, almost no nation is on course to meet its targets (except those Eastern European nations who saw emissions reductions from shuttering defunct state-owned industries after the Soviet Union dissolved, and even there the trend is again upward). Even though Britain is the one European nation that has come closest to fulfilling its Kyoto commitment, ironically it is Prime Minister Tony Blair's acknowledgment that the climate change emperor isn't wearing any clothes that has brought new candor to international discussion of the issue.......

The final game-changer was Bush's successful initiative to launch the Asia-Pacific Partnership (APP) last summer. The APP consists of the United States, China, India, Japan, Australia, and South Korea, which together account for about half of the world's total greenhouse gas emissions. As such the APP represents an alternative to the U.N. process that gave us Kyoto, and may one day put the U.N. climate change process out of business. As the new year began, the APP held its first meeting in Sydney, Australia, and began to articulate an alternative strategy to the Kyoto approach. The APP emphasizes as its first priority economic development and the eradication of poverty. It also struck notes of realism about energy use, observing that "fossil fuels underpin our economies, and will be an enduring reality for our lifetimes and beyond." The partnership members pledged more resources for advanced energy research, but also for work on making current fossil-fuel energy cleaner. The real game afoot behind the APP is probably to accelerate the transfer of advanced technology to India and China, whose greenhouse gas emissions are expected to soar in the coming years if they use current fossil fuel energy technology.

These developments suggest that however much more convincing the scientific case for serious global warming may become, most world leaders are recognizing that near-term emissions reductions aren't a sensible way to begin moving to a post-carbon energy future. Twenty or thirty years from now we are likely to look back on the Kyoto Protocol as the climate-policy equivalent of the discredited wage and price controls of the 1970s, even as the climate prediction models themselves may come to resemble the elaborate Keynesian models that were supposed to enable us to fine-tune the economy with perfect precision. The Keynesian understanding of the economy was not wholly wrong, but fell far short of the mastery of detail its backers claimed. Climate alarmists like to warn us of the danger of severe climate "surprises" that may come our way. But if we're really taken by surprise, what does it tell us about the limitations of their models?

Is there--to extend the analogy--a "supply-side" analogy for climate policy? Amazingly enough, a hot topic among environmental economists is the positive relationship between economic growth--the central pillar of Bush's climate strategy--and environmental improvement. There is even a conceptual curve for it, known as the "Environmental Kuznets Curve," that can be scribbled on a napkin. It looks just like the Laffer Curve.
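The inverted-U shape of the Environmental Kuznets Curve can be sketched with a stylized quadratic in log income. The functional form and coefficients below are purely illustrative, not a fitted curve: pollution rises during early industrialization, peaks at middle incomes, then falls as wealthy societies buy environmental quality.

```python
import math

def ekc_pollution(income):
    """Stylized Environmental Kuznets Curve: pollution as an
    inverted-U in log income. Coefficients are illustrative only."""
    x = math.log(income)
    return 10.0 - (x - 9.0) ** 2  # peak near income = e^9, about $8,100

# Pollution rises, peaks, then falls as per-capita income grows:
for income in (1_000, 8_000, 60_000):
    print(income, round(ekc_pollution(income), 2))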

More here


Some two dozen nuclear power plants are scheduled to be built or refurbished during the next five years in Canada, China, several European Union countries, India, Iran, Pakistan, Russia, and South Africa. In the US and the UK, governmental preparations are under way that may lead to 15 new reactor orders by 2007.

About 16% of the world's electricity supply comes from nuclear power, and energy demand is increasing (see PHYSICS TODAY, April 2002, page 54). Worldwide, nearly 80% of the 441 commercial nuclear reactors currently in operation are more than 15 years old. To maintain nuclear power's position in the overall energy mix, new reactors will have to replace decommissioned ones, says a report from the Paris-based International Energy Agency.

The new interest in civilian nuclear energy results from some heavy lobbying by groups involved in building reactors, says Edwin Lyman of the Union of Concerned Scientists, and from attempts to reduce carbon-dioxide emissions. EU Energy Commissioner Andris Piebalgs adds that there are also increasing concerns about energy security, particularly in light of the recent disruption of Russian gas supplies in Europe.

Most of the new reactor designs are third-generation pressurized-water reactors (PWR), although companies in China, France, and South Africa are looking to build a fourth-generation design, the gas-cooled pebble-bed modular reactor (PBMR). The new reactors are supposed to be inexpensive to build, more powerful, and safer; and they can be operated for up to 60 years, according to nuclear-power trade groups.

The international view

Late last year, officials from Bruce Power, one of Canada's largest power companies, announced a Can$4.25 billion (US$3.6 billion) investment to rebuild two reactors that have stood idle for nearly 10 years on the eastern shore of Lake Huron, north of Kincardine, Ontario. Last December, the Ontario Power Authority proposed plans to build 12 new nuclear plants to help phase out Ontario's coal-fired power stations.

New 1600-MW European PWRs are being built, one in Finland and one in France, with respective power-up dates of 2008 and 2012. On 5 January, France's president, Jacques Chirac, announced plans for an expansion of renewable and nuclear energy sources for France, including a PBMR by 2020. UK Prime Minister Tony Blair is expected to announce this spring six to eight new reactors in the UK.

Russia is currently constructing several reactors, including an 800-MW fast neutron reactor, but financial difficulties may delay four of them, says the London-based World Nuclear Association. Iran is building two Russian-designed reactors, the first of which should go on line later this year. The first South African PBMR is set to be completed in 2012.

Nuclear-industry officials have long said that the majority of growth would come in Asia. Japan is building five new power plants by 2010, and China plans to build 30 nuclear reactors, based on domestic designs, by 2020. China also sees nuclear technology as a major export opportunity, say industry analysts, and is building its second of four power plants for Pakistan, which may lead to a larger order. India has nine power plants under construction, including a fast-breeder reactor that generates its own fuel.

Six countries (Argentina, Brazil, Bulgaria, Chile, the Czech Republic, and Turkey) may build two to five PWRs each, while Germany, Sweden, and Switzerland are now reevaluating plans to phase out nuclear power.

US moves

The US nuclear power industry has been virtually frozen since the Three Mile Island accident in 1979, but in its 2005 energy bill the US Congress added tax credits worth $3.1 billion for the industry, along with liability protection and compensation for legislative delays. On 30 December 2005, for the first time in years, the Nuclear Regulatory Commission (NRC) certified the design of a new reactor: the 1000-MW Westinghouse advanced passive (AP) reactor.

Six US power-plant operators are preparing combined construction and operating license (COL) requests to the NRC that could restart construction in the next five years. NuStart Energy, a consortium of nine nuclear energy companies, submitted plans for a General Electric simplified boiling water reactor at the Grand Gulf nuclear station near Port Gibson, Mississippi, and an AP-1000 reactor at the Bellefonte nuclear plant near Scottsboro, Alabama.

Two AP-1000 reactors may be built in the Carolinas by Duke Energy, along with another reactor by Progress Energy. "Preparing this application provides us the option to continue using a diverse fuel mix in the future," says Brew Barron, Duke Energy's chief nuclear officer.

Constellation Energy of Baltimore, Maryland, is in partnership with AREVA, a large French-German engineering firm, to submit COL requests for a European PWR at the Calvert Cliffs Nuclear Power Plant site in southern Maryland and the Nine Mile Point nuclear plant in Oswego, New York. Entergy, another NuStart member, announced it was preparing its own COL request for a new reactor at its River Bend Station power plant in St. Francisville, Louisiana. On 6 December, two electric utilities, Scana Corp and Santee Cooper, filed a letter of intent with the Nuclear Regulatory Commission to build two new reactors north of Columbia, South Carolina, to meet growing regional power demands.

According to representatives of the electric utilities involved, the US government and the reactor technology suppliers are paying for most of the $150 million the certification process costs. "The utilities are waiting to see if they can get any more subsidies out of the government," says Lyman, "so it's still premature to say if any of them will go ahead." A satisfactory means for disposal of their radioactive waste products has not yet been announced.

But the nuclear power industry believes the first new US order is only two years away. Says NuStart Energy president Marilyn Kray, "Our country needs these advanced nuclear plants."



More coal-fired power plants threaten emissions targets

Japan's efforts to cut carbon dioxide emissions may be compromised as more coal-fired thermal power stations, which emit large amounts of the greenhouse gas, are being built, prompting the Environment Ministry to dig in its heels over the need to introduce an environment tax. These power stations are attractive for utilities because coal is cheaper than oil and natural gas. And the recent liberalization of the power industry makes it easier for newcomers to the market to build them.

(Photo caption: Thermal power plants and factories crowd Chiba Prefecture's Keiyo Coastal Industrial Zone, on the east side of Tokyo Bay, in this February 2005 photo.)

But the flip side of these advantages is that they are hampering the government's push to achieve its greenhouse gas reduction targets under the 1997 Kyoto Protocol, which aims to curb global warming. The protocol, which obliges Japan to reduce emissions by 6 percent by 2008-2012 from 1990 levels, went into effect last February. "The achievement of the protocol's target, which is severe in itself, will become even more difficult" because of the growing number of coal-fired power plants, an Environment Ministry official said.

At a news conference in late January, Environment Minister Yuriko Koike voiced her opposition to the construction of a coal-fired power plant planned in Ube, Yamaguchi Prefecture, by Sigma Power Yamaguchi Corp., a new power utility jointly owned by Toshiba Corp. and Orix Corp. "(The project) seems to be going in a considerably different direction from our pledge under the Kyoto Protocol, and the government's plans to achieve the target," she said. The power plant would emit 5.82 million tons of carbon dioxide annually, more than twice what a liquefied natural gas-fired station of the same power output would emit. In a note presented last week, the ministry asked the Ministry of Economy, Trade and Industry, which oversees power plants, to halve emissions at the plant, virtually calling on METI to order the facility to be converted to an LNG-fired plant.

The Environment Ministry's stance faces opposition from the power industry, including Sigma Power. "In line with the state's guidance, we have worked out power supply plans by balancing stable supply and prices, and environmental conservation. If we were to be suddenly asked to stop using coal, that's a problem," said an executive at one utility.

But Shigemoto Kajiwara, chief of the Environment Ministry's section fighting global warming, remains firm. "We will continue to express opinions in our environmental assessment to those parties that are large carbon dioxide emitters," he said. With the surge in coal-fired power generation in recent years, Japan's coal use in fiscal 2004 was 2.8 times higher than in fiscal 1990. While plans are in the works to construct 10 coal-fired power stations, including three with a generating capacity of 1,000 MW each, there is no plan to close older coal-fired plants or to reduce their power output.

The Environment Ministry is making a stand because emission controls are not proving effective. In fiscal 2004, Japan emitted 1.32 billion tons of carbon dioxide, 7.4 percent more than in fiscal 1990. It said increases in coal-fired power, coupled with the effects of prolonged nuclear power plant shutdowns due to accidents and safety problems, are pushing up the volume of carbon dioxide emitted per kilowatt-hour generated, a measure called energy intensity. "A worsened energy intensity impacts the volume of emissions stemming from industrial and household consumption of electricity," one think tank researcher said. "Japan's situation is serious because the share of alternative energy sources, such as wind power, is lower than in Europe, for example."
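The energy-intensity point is simple arithmetic, which the toy sketch below illustrates with invented round numbers (not Japan's actual statistics): total emissions are intensity times generation, so a dirtier generating mix pushes up emissions even when households and industry use no more electricity at all.

```python
# Hypothetical illustration: CO2 from electricity = carbon intensity
# (kg CO2 per kWh generated) times total generation (kWh).
# All figures are invented round numbers for illustration.

generation_kwh = 1_000_000_000  # electricity demand, held constant
intensity_1990 = 0.40           # kg CO2 per kWh, cleaner generating mix
intensity_2004 = 0.48           # kg CO2 per kWh, more coal, idled reactors

emissions_1990 = generation_kwh * intensity_1990 / 1000  # tonnes CO2
emissions_2004 = generation_kwh * intensity_2004 / 1000  # tonnes CO2

pct_rise = 100 * (emissions_2004 - emissions_1990) / emissions_1990
print(f"emissions rise {pct_rise:.0f}% with no change in consumption")
```

On these made-up figures, a shift from 0.40 to 0.48 kg per kWh raises emissions 20 percent with demand unchanged, which is the researcher's point about worsened intensity feeding straight through to household and industrial emissions.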

METI will map out a new state energy strategy in June, but at METI-sponsored meetings of experts, the thrust of discussions has been how to secure a stable energy supply, including stockpiling, with hardly any mention of steps to prevent global warming and promote energy conservation. The environment tax proposed by the Environment Ministry was discussed in 2004 and 2005 but not introduced, largely due to opposition from industry. Top Environment Ministry officials are becoming increasingly irritated with this turn of events.



Japan's Environment Ministry turned off its heating this week, leaving staff unable to even make a cup of tea, in an effort to spur the country to meet its target for reducing greenhouse gas emissions, an official said Thursday. The weeklong shutdown, which began Tuesday, comes as Japan lags far behind its Kyoto Protocol pledge to cut output of gases believed to be warming Earth's atmosphere to 6 percent below 1990 levels by 2010. In 2004, the last year for which statistics are available, output was up 7.4 percent from 1990.

The ministry's "Warm Biz" campaign urges Japan's bureaucracy and businesses to bundle up with sweaters and scarves to cut down on energy use. "It's actually not that cold. We're all keeping warm from the heat of our computers," ministry spokesman Masanori Shishido said, but admitted he has taken to wearing thermal underwear. Temperatures in Tokyo on Thursday were 10 degrees Celsius (50 degrees Fahrenheit).....

More here




Saturday, February 25, 2006


But cooling is a sign of warming, of course. I reproduce below the popular summary as given in "Science" magazine and follow that with the journal abstract:

Observations reveal that the substantial cooling of the global lower stratosphere over 1979-2003 occurred in two pronounced steplike transitions. These arose in the aftermath of two major volcanic eruptions, with each cooling transition being followed by a period of relatively steady temperatures. Climate model simulations indicate that the space-time structure of the observed cooling is largely attributable to the combined effect of changes in both anthropogenic factors (ozone depletion and increases in well-mixed greenhouse gases) and natural factors (solar irradiance variation and volcanic aerosols). The anthropogenic factors drove the overall cooling during the period, and the natural ones modulated the evolution of the cooling.


Anthropogenic and Natural Influences in the Evolution of Lower Stratospheric Cooling

V. Ramaswamy, M. D. Schwarzkopf, W. J. Randel, B. D. Santer, B. J. Soden, G. L. Stenchikov

Since 1980, the lower stratosphere has cooled significantly. This cooling trend has been ascribed to the influence of anthropogenic effects--mainly stratospheric ozone depletion and the buildup of greenhouse gases. However, this process occurred in two major steps. Ramaswamy et al. (p. 1138) investigated the temporal structure of the trend using simulations with a climate model, in order to delineate the roles of natural and anthropogenic forcings. Although the overall downward trend in temperature is the result of anthropogenic factors, natural forcing by changes in solar irradiance and volcanic aerosols have superimposed on the gradual longer term decrease the shorter time-scale structure recorded in the observations. Thus, while anthropogenic factors are responsible for the 25-year-long stratospheric cooling trend, the steps were caused by natural forcing.

From: "Science" 24 February 2006: Vol. 311. no. 5764, pp. 1138 - 1141 DOI: 10.1126/science.1122587


Global warming of course warms the oceans up and thus produces more evaporation -- which comes down again as rain or snow (precipitation). Global cooling, of course, reduces precipitation. So which are we seeing? The latest news from the U.K. below:

Hosepipe bans must be ordered within weeks to avoid the threat of standpipes and rationing this summer, the head of the Environment Agency said yesterday. Despite steady rainfall during the past fortnight the South East is facing its worst water shortages since the drought of 1976. Over the past 15 months rainfall has been below average for virtually the whole of England and Wales with southern and central regions being the driest. The position has been worsened by the fifth-driest winter since 1964, with the whole of Britain receiving below-average rainfall, and rivers and water tables are now at alarmingly low levels.

Baroness Young of Old Scone, the chief executive of the agency, said that the South East was facing its worst drought in a century and water shortages would be seen across England and Wales. The threat, she said, was so severe that hosepipe bans must be ordered by water companies by the end of next month or householders would face the prospect of having to queue at standpipes for water as they had done in 1976.

Lady Young also called for all non-essential water use, including washing windows, cleaning cars and watering gardens, to be banned. “If water companies delay introducing hosepipe bans now, extreme steps to manage water supplies over summer may be needed, such as standpipes and rota cuts,” she said. “We’re in a serious situation now, where both the environment and our water supplies are at risk. Water companies shouldn’t just hope for rain – they must act now in case the weather stays dry.” Water rationing, with supplies cut off for several hours at a time, may also be required unless rainfall levels for the next three months rise to 20 per cent above average.

The Met Office, cautioning that long-range rain forecasts were unreliable, said yesterday that there was only a one in five chance of there being sufficient rain to bring water levels back to normal by the end of spring. Kent and Sussex are likely to see the worst shortages with Southern Water and Mid Kent Water already having sprinkler and hosepipe bans in place. Further restrictions being considered include limiting crop irrigation. London, the Thames Valley, East Anglia and South Coast counties face bans on non-essential uses of water in homes and businesses, including crop watering, and localised shortages are expected to cause environmental damage in the rest of England and Wales. Plants and animals are expected to suffer with drought causing heathland, grass and forest fires, and many species, including birch and beech trees, being killed through dehydration.

The agency plans to introduce closer monitoring of water companies to ensure that they do everything to minimise the impact of drought, including the reduction of leakage from pipes. Britain has endured long, hot summers in recent years — most notably in 2003, when much of Europe suffered water shortages.



An environmentalist faces federal charges of teaching others how to start an arson fire during a 2003 lecture in San Diego, where the costliest act of ecoterrorism in U.S. history had just occurred. In an indictment unsealed Wednesday, prosecutors said Rodney A. Coronado gave the lecture 15 hours after a $50 million fire destroyed a massive apartment complex in a north San Diego neighborhood. The indictment, however, does not link Coronado to that fire. Coronado, 39, was arrested Wednesday in Tucson, Ariz., on a charge of distribution of information relating to explosives, destructive devices and weapons of mass destruction. He will be arraigned there Thursday. Defense attorney R. Antonio Felix of Tucson, Ariz., did not return a message left seeking comment.

Coronado previously served four years in federal prison for a 1992 blaze at a Michigan animal research facility. Daniel Dzwilewski, special agent in charge of the San Diego FBI office, alleged that Coronado was a national leader of the radical Earth Liberation Front. ELF is an underground movement with no public leadership, membership or spokesperson, according to its Web site. An e-mail sent to the Web site didn't elicit an immediate response.

The 2003 fire destroyed a five-story, 206-unit apartment complex, an underground parking garage and a construction crane in the University City area of San Diego. No one was injured. A 12-foot banner found at the scene read "If you build it, we will burn it" with the initials of the ELF. The group, which only communicates with the news media by e-mail, issued a brief statement in response to media inquiries, saying the banner "is a legitimate claim of responsibility by the Earth Liberation Front."

Coronado's subsequent talk covered animal rights and militant environmental activism. According to an account and photos of the speech posted on the Internet, Coronado demonstrated how to build a crude ignition device using a plastic jug filled with gasoline and oil. Three animal rights activists who attended the lecture were ordered jailed for contempt for their refusal to testify before a grand jury investigating the fire.

While he repeatedly insisted that he had no role in the arson, Coronado has said he sympathized with the arsonists. Describing himself as an unofficial ELF spokesman, Coronado told The Associated Press at the time that young activists are "doing the only thing they know to do and that is strike a match and draw a whole lot of attention to their dissatisfaction with protecting the environment."

Authorities said the charge on which Coronado was indicted has only been used four times since it was written in 1997, most recently in an Ohio case unsealed Tuesday against three men charged with attempting to wage terror attacks against the United States. The charge carries a maximum penalty of 20 years in prison.

Coronado was previously sentenced to nearly five years in prison for a crime in which he said he did not participate: the 1992 firebombing of a Michigan State University laboratory and the offices of two animal researchers that caused $1.2 million in damage.

In December, a federal jury in Tucson, Ariz., convicted Coronado of illegally entering the Sabino Canyon Recreation Area to interfere with efforts to trap and relocate mountain lions following public sightings. He faces up to 7 1/2 years in prison when he's sentenced in March. That indictment called Coronado a member of Earth First!, perhaps best known for forest protests aimed at halting logging.



But Greenies are stranger than fiction

"The next time you reach for bottled water stacked on the supermarket shelf, spare a thought for the planet. You may think that it is better for you to buy such water, but better for the environment it certainly is not.

Despite its pure image, bottled water is making a significant contribution to climate change. The industry produces as much greenhouse gas as the electricity consumption of about 20,000 homes in a year, according to research by The Times.

To supply the more than two billion litres of bottled water that is consumed by Britons every year, a quarter of which comes from abroad, bottled-water companies produce 33,200 tonnes of CO2 emissions, just less than the electricity consumption of 20,000 households, and the equivalent of the energy needs of 6,000 households.
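For what it's worth, the per-litre and per-household figures implied by the excerpt can be checked with simple arithmetic. The sketch below uses only the quantities The Times itself quotes (two billion litres, 33,200 tonnes of CO2, 20,000 households' electricity); nothing here is independently measured:

```python
# Back-of-envelope check of The Times' bottled-water figures,
# using only the numbers quoted in the excerpt above.
litres_per_year = 2_000_000_000   # "more than two billion litres"
total_co2_tonnes = 33_200         # claimed industry-wide CO2 emissions
households = 20_000               # "electricity consumption of 20,000 households"

# 33,200 tonnes spread over 2 billion litres
co2_per_litre_g = total_co2_tonnes * 1_000_000 / litres_per_year

# and the per-household electricity emission the comparison implies
co2_per_household_t = total_co2_tonnes / households

print(f"{co2_per_litre_g:.1f} g CO2 per litre")         # ~16.6 g
print(f"{co2_per_household_t:.2f} t CO2 per household")  # ~1.66 t
```

So the claim amounts to roughly 17 grams of CO2 per litre of bottled water -- useful context for judging how "significant" the contribution really is.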

The principal environmental cost comes from transport - about a fifth of bottles come from southeast France, about 600 miles (1,000km) away - but there are also costs involved in the manufacture and disposal of bottles. Evian transports its water about 930km from Lake Geneva, producing about 14,000 tonnes of CO2 in the process. Volvic, whose water comes from Auvergne, produces about 9,000 tonnes.

British suppliers, with smaller distances to travel, are less environmentally costly. Highland Spring, whose plant is in Blackford, Perthshire, produces about 5,500 tonnes each year, while Powwow produces an estimated 3,000 tonnes.

Most water bottles are made from PET plastic, a crude-oil extract that accounts for about 0.25 per cent of the world's annual oil consumption. The majority end up in landfill sites, where they take about 450 years to break down, or are incinerated. Of the 10 per cent of bottles that are recycled, more than half are shipped to countries such as China, 13,000km away, to be processed, and produce around half a million kilos of CO2 emissions getting there...."

Excerpt from "The Times". Tim Worstall has certainly had some fun with it.


Many people would like to be kind to others so Leftists exploit that with their nonsense about equality. Most people want a clean, green environment so Greenies exploit that by inventing all sorts of far-fetched threats to the environment. But for both, the real motive is to promote themselves as wiser and better than everyone else, truth regardless.

Global warming has taken the place of Communism as an absurdity that "liberals" will defend to the death regardless of the evidence showing its folly. Evidence never has mattered to real Leftists

Comments? Email me here. My Home Page is here or here. For times when blogger.com is playing up, there are mirrors of this site here and here.


Friday, February 24, 2006


A US city is hoping to harness the power of dog poo, which accounts for almost four per cent of its residential waste. San Francisco already recycles more than 60 per cent of its garbage, but officials hope to turn into energy the 5,900 metric tonnes of dog waste a year - nearly as much as disposable nappies, according to the city. Within the next few months, Norcal Waste, a garbage hauling company that collects San Francisco's rubbish, will begin a pilot program using biodegradable bags and dog-waste carts to pick up droppings at a popular dog park. The droppings will be tossed into a contraption called a methane digester, a tank in which bacteria feed on faeces for weeks to create methane gas. The methane could then be piped directly to a gas stove, heater, turbine or anything else powered by natural gas. It can also be used to generate electricity.

Methane digesters are nothing new. The technology was introduced in Europe about 20 years ago, and more than 600 farm-based digesters are in operation there. Nine are in use on California dairy farms, and chicken and hog farms elsewhere in the United States also use them.

Neither Norcal Waste spokesman Robert Reed nor Will Brinton, a Maine-based recycling and composting consultant, knew of anyone in the United States who is using the $A1.36 million devices to convert pet waste to energy. But Brinton said some European countries process dog droppings along with food and yard waste. "The main impediment is probably getting communities around the country the courage to collect it, to give value to something we'd rather not talk about," Brinton said. "San Francisco is probably the king of pet cities. This could be very important to them." San Francisco - the city named after Saint Francis, patron saint of animals - has an estimated 240,000 dogs and cats.

Some experts believe methane digestion must become more attractive economically before it gets popular. Landfill space is relatively cheap, and natural gas and electricity also remain fairly inexpensive. Reed points to San Francisco's groundbreaking food composting program, which began 10 years ago, as proof an unusual idea can work in this forward-thinking city. A Norcal Waste subsidiary collects 272 tonnes of food scraps per day from homes and restaurants and converts it into a rich fertiliser sold to vineyards and organic farms.



From the Adam Smith blog

The UK government is to give ministers a choice of 'green' cars - a Toyota Prius hybrid or a Jaguar that runs on biodiesel - alongside the conventional alternative. (The picture shows a Toyota Prius being driven by a government minister).

Ministers could, of course, save the planet in more effective ways. Why do they all have to have official cars in the first place? It's pretty appalling to see ministers and their officials being driven the 300 yards from the two ministries near the ASI to the House of Commons.

In any case, the chauffeur-driven lifestyle separates them from their electors, who have to crowd into the trains and buses. They actually forget how the rest of us live.

They could do most for the environment, however, by issuing less paper. Like all those Bills and regulations (and, no doubt, Whitehall rule-books on things like the specifications for ministers' cars). The volume of official reports and 'consultation documents' - not to mention just straight government puff pieces - that are regularly mailed or biked round to ASI from ministries and quangos is quite ridiculous.

Save trees - stop employing so many scribblers, having so many rules and passing so many laws!


Ruling against a lower court, the Oregon Supreme Court on Tuesday upheld a sweeping, voter-approved measure that could allow many Oregon landowners to develop their property more intensively than current land-use regulations allow. Measure 37, passed by Oregon voters in 2004, allows property owners to seek compensation from local or state agencies if land-use laws and rules reduce the value of their land. If governments can't afford to pay - and none in Oregon says it can - those regulations would be waived.

Measure 37's approval inspired the Washington State Farm Bureau to file a similar property-rights measure, Initiative 933, in this state earlier this month. But supporters and opponents of the initiative said the Oregon Supreme Court's decision won't affect their plans much, if at all. "It's good news for property owners in Oregon," said Dan Wood, the Farm Bureau's government-affairs director, "but we didn't take the language in Measure 37 as the model for ours."

Environmentalists and other opponents hope to persuade Washington voters to defeat I-933 if it is on the ballot this fall, which would mean its constitutionality would never need to be tested in court, said Aisling Kerins of the Community Protection Coalition. "It's basically a big win for big developers," she said of the Oregon ruling. To qualify I-933 for the November ballot, backers must collect signatures of at least 224,880 registered voters by July 7. Supporters and opponents are fighting over the wording of the ballot title, but Wood said petitions should be available by the second week of March.

Measure 37's approval in Oregon sent ripples across the nation. The state adopted land-use policies in 1973 that are often regarded as a national model for protecting farmland and open space and encouraging compact growth. Those policies sparked a property-rights revolt that eventually produced Measure 37. More than 2,000 claims for compensation or waivers were filed after the measure took effect in December 2004. Many landowners simply sought permission to build a home, but some wanted to put large subdivisions or shopping centers on farmland. The law has been in a legal limbo since October, when Marion County Circuit Judge Mary James said it violated the state and federal constitutions. The state's highest court ruled otherwise Tuesday, saying James' arguments were not persuasive. The ruling means people whose claims were bottled up after James' decision can go ahead and try to get local and state agencies to approve their development plans.

While Oregon's high court said the measure was constitutional, that is not the last word. There are still a raft of legal disputes involving such issues as whether a landowner can transfer a Measure 37 right to develop property through a sale or a bequest. "Without some action by the Legislature, it may be years before additional court cases begin to clarify all of the uncertainties about the law," Gov. Ted Kulongoski said in his response to the Tuesday ruling. "In the process, those cases will entail substantial costs and frustrations for state and local governments and private-property owners throughout Oregon."....

Andrew Cook, a lawyer in the pro-property-rights Pacific Legal Foundation's Bellevue office, said the Oregon court ruling could help I-933 if it passes and is challenged. "Courts do look to other states," he said. "This sets a good precedent for Washington state."

More here


Below are two open letters to the chief collaborators in the frauds

Open letter to "Science" magazine from Benny Peiser and others, dated 22 February 2006:

R. Brooks Hanson
Managing Editor, Physical Sciences, Science
American Association for the Advancement of Science

Dear Dr Hanson

In early March, the National Research Council of The National Academies of the United States is convening a committee to study "Surface Temperature Reconstructions for the Past 1,000-2,000 Years". According to the NAS announcement, "the committee will be asked to summarize the current scientific information on the temperature record over the past two millennia, describe the proxy records that have been used to reconstruct pre-instrumental climatic conditions, assess the methods employed to combine multiple proxy data over large spatial scales, evaluate the overall accuracy and precision of such reconstructions, and explain how central the debate over the paleoclimate temperature record is to the state of scientific knowledge on global climate change."

In order for the NAS panel and the invited scientific experts to evaluate the overall accuracy and precision of temperature reconstructions based on multiple proxy data, it is essential that a complete archive of the data is made available. This is particularly relevant for a number of contentious papers published in Science that will feature prominently during the NAS assessment.

We understand that some authors of paleo-climate reconstructions published in Science (Osborn and Briffa, 2006; Thompson et al., 1989; 1997; Esper et al., 2002) have failed to provide complete data archives. We would like to ask Science to ensure that the NAS assessors and scientific experts will have full access to the data and that the authors in question provide a complete archive as required under Science policies.

Yours sincerely

Benny Peiser, Liverpool John Moores University, UK
Sir Colin Berry, Queen Mary, University of London, UK
Freeman Dyson, Institute for Advanced Study, Princeton, USA
Chris de Freitas, The University of Auckland, New Zealand
Mick Fuller, University of Plymouth, UK
Lord Taverne, House of Lords, UK

Letter to "Science" from Steve McIntyre, dated 19 February 2006:

Dear Dr Hanson,

I am writing in connection with the failure of Osborn and Briffa [2006] to comply with Science's policies on data archiving, which are both explicit and mandatory. For example:

Science supports the efforts of databases that aggregate published data for the use of the scientific community. Therefore, before publication, large data sets ... must be deposited in an approved database and an accession number provided for inclusion in the published paper.

In addition, I am also writing to remind you of the similar continuing failures in connection with Esper et al [2002] and Thompson's Dunde and Guliya ice cores, both of which are cited directly or indirectly in Osborn and Briffa, and about which we have corresponded in the past without any positive outcome.

We note that D'Arrigo et al. [2006] have arrived at precisely opposite conclusions to Osborn and Briffa [2006] on the relationship of several Osborn and Briffa sites (Jaemtland, Boreal, Upperwright) to gridcell temperatures. In some cases, although Osborn and Briffa appear to say that they have used identical sites to Esper et al. [2002], the attributions of some datasets seem to differ (e.g. Esper et al. attribute a Quebec dataset to Payette and Filion, whereas Osborn and Briffa cite cana169). Obviously, the exact data, as contemplated under Science data archiving policies, is necessary to reconcile these differences.

Accordingly, would you please ensure that authors Osborn and Briffa provide a complete archive including the following information:

1. Digital versions of all 14 series as used in their final compilations;

2. For each of the tree ring sites analysed (both the 11 retained and the Esper sites not used, including Gotland, Jaemtland, Mackenzie Mts and Zhaschiviersk), an exact data citation to a public archive (e.g. WDCP) for the data set used; or, in the alternative, an archive of the data set at the Science website. In cases where the publicly archived dataset for a site is related to but different from the version used by Osborn and Briffa, please archive the data set as used.

3. Digital versions of the specific gridcell temperature series used in each of the reported temperature correlations together with version date.

Would you similarly ensure that Esper et al. also provide a complete archive including the following information:

4. Exact data citations to a public archive for all datasets used, or, if such do not exist, an archive of the data set at the Science website.

5. A clear and operational definition distinguishing "linear" and "nonlinear" trees, preferably with source code showing any differences in methodology.

Osborn and Briffa [2006] use a composite from Yang et al [2002], which uses data from Thompson's Dunde and Guliya ice cores, previously published in Science. As discussed in previous correspondence, there are several inconsistent grey versions of this data, which cannot be reconciled on the present record. We have previously corresponded about this without any information being provided by Thompson. The matter has re-surfaced once again with Thompson's grey data once again being used indirectly in Osborn and Briffa [2006]. This is a very unsatisfactory situation. Would you please ensure that:

6. Thompson provides a complete archive of both Dunde and Guliya ice cores, including both isotope and chemical data.

Thank you for your consideration.

Yours truly,

Stephen McIntyre

"Science" has replied to McIntyre but is still evading the issue. See here




Thursday, February 23, 2006

THE LATEST SCARE: Acidifying oceans

Even given that atmospheric CO2 levels are higher and stay higher, this seems implausible to me. Social scientists don't usually know much about chemistry, but from my limited recollection of it, carbonic acid is very unstable and breaks down rapidly. And neutralizing it is hardly a technological problem either. And given the admission that acidification has occurred naturally in the past for unknown reasons, connecting any such phenomenon to anthropogenic global warming is mere assertion.

Pollution is quickly making the world's oceans more acidic, and if unchecked this could cause a mass extinction of marine life similar to one that occurred when the dinosaurs disappeared, a researcher says. The researcher, Ken Caldeira of the Carnegie Institution of Washington, D.C., has developed computer models predicting a continuation of a trend other scientists have also noted: the oceans are slowly turning into mild acids. Caldeira said he compared his computer models predicting how far this will go in the next century, with evidence from the fossil record, and has found some startling similarities.

The finding offers a glimpse of what the future might hold for ocean life if society does not drastically curb carbon dioxide emissions, he added. "The geologic record tells us the chemical effects of ocean acidification would last tens of thousands of years," Caldeira said. "But biological recovery could take millions of years. Ocean acidification has the potential to cause extinction of many marine species." When carbon dioxide from the burning of coal, oil, and gas dissolves in the ocean, some of it becomes carbonic acid. Over time, accumulation of this carbonic acid makes ocean water more acidic.

Previous estimates, Caldeira said, suggest that in less than a century, the pH of the oceans could drop by as much as half a unit from its natural value of 8.2 to about 7.7. On the pH scale, lower numbers are more acidic and higher numbers are more basic.
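Since pH is a base-10 logarithmic scale, a half-unit drop is a bigger change than it sounds. A minimal sketch of the arithmetic, using only the pH values quoted above:

```python
# pH = -log10([H+]), so a drop from pH 8.2 to pH 7.7 multiplies the
# hydrogen-ion concentration by 10**(8.2 - 7.7) -- roughly a tripling,
# though the water stays on the basic side of neutral (pH 7) throughout.
ph_current, ph_projected = 8.2, 7.7
factor = 10 ** (ph_current - ph_projected)
print(f"[H+] rises by a factor of {factor:.2f}")  # ~3.16
```

Whether a threefold rise in hydrogen-ion concentration that still leaves seawater mildly basic counts as "acidification" is, of course, part of the argument.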

This trend would especially damage marine animals, such as corals, that make shells out of a mineral called calcium carbonate, Caldeira added. Under normal conditions the ocean is full of this substance, making growth easy for such creatures. A more acidic ocean would more easily dissolve calcium carbonate, putting these species at severe risk, he added.

The last time the oceans endured such a drastic change in chemistry, he added, was 65 million years ago, when the dinosaurs went extinct. Though researchers don't yet know what caused this ancient acidification, it was related to the cataclysm that wiped out the giant beasts, he added. The extinction pattern in the ocean was consistent with ocean acidification, he explained: the fossil record reveals a plunge in the number of species with calcium carbonate shells in the upper ocean, especially corals and plankton. During the same period, species with shells made from resistant silicate minerals were more likely to survive. "Our energy system could make the oceans corrosive to coral reefs and many other marine organisms," Caldeira cautioned. He presented the findings Monday in Honolulu at the Ocean Sciences Meeting of the American Geophysical Union and the American Society of Limnology and Oceanography.



A reader writes:

Regarding your post about carbonic acid: There are two glaring problems with what they say. The pH is becoming more acidic, but it is still on the basic side of neutral. Why not headline it with "seas becoming more neutral"?

Carbonic acid is required to form calcium carbonate in the first place! If the acidification was caused by some other acid, e.g. hydrochloric, then it is straightforward to show that calcium carbonate would be eroded since no more carbonate is being added. But here the acid is carbonic acid: its dissolution into the sea creates a weak acid but also ADDS more carbonate to the water in equal proportion. I'm not sure what the equilibrium equations are but I don't think it is as simple as the authors make out. The comparison with ancient extinctions is also dubious: what acids were responsible? They don't say. There are in fact acids based on silica... is it inconceivable that the acidification was caused by silicic acids, which affected carbonate shells far more than silicates? The cataclysm was probably a strike on the Earth by a large body, throwing up a lot of material. What is the bulk of the Earth made of? Silicas. What would form as they rained out? Silicic acids!
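For reference, the equilibria the reader alludes to are the standard textbook carbonate system (a sketch from general chemistry, not taken from either the article or the letter); how these coupled balances shift when extra CO2 is added is exactly what the competing claims turn on:

```latex
\mathrm{CO_2(aq) + H_2O \;\rightleftharpoons\; H_2CO_3
        \;\rightleftharpoons\; H^+ + HCO_3^-
        \;\rightleftharpoons\; 2\,H^+ + CO_3^{2-}}
\qquad
\mathrm{CaCO_3(s) \;\rightleftharpoons\; Ca^{2+} + CO_3^{2-}}
```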

You do have to wonder what the hell sort of modelling they are using! I'd expect, without doing any serious investigation, that the carbonates would thrive under these conditions, whereas the silicate-based shells may suffer a little. But silicate shells are sturdier, so they won't suffer very badly under a slight pH change.


You can never please a Greenie. If all we had were caves, Greenies would oppose them

President Bush's new nuclear energy initiative is supposed to help cure America's "addiction to oil" by redesigning a taboo technology, originally used to obtain plutonium for bombs, to reuse spent nuclear fuel. Unlike past reprocessing methods, the administration says, the new technique would make it prohibitively difficult for would-be proliferators to extract weapons-grade plutonium from spent fuel, and it would drastically reduce the volume of radioactive waste to be stored at repositories such as Nevada's Yucca Mountain.

The result, Energy Secretary Samuel W. Bodman said early this month, would be increased use of nuclear power, reduced oil consumption and fewer hydrocarbon emissions, "making the world a better, cleaner and safer place to live."

If it works. Both supporters and opponents of the Global Nuclear Energy Partnership agreed that although it marks a radical change in U.S. nuclear energy policy, it also relies on unproven technologies that will take decades to mature, and it does not guarantee success. Bodman, in congressional testimony last week, acknowledged that the $250 million requested for the program this year will be used to design a test reprocessing plant so that Bush over "the next two or three years" can make "a go or no-go decision as to whether this is something that makes sense."

But one problem with this calculation, opponents say, is that even a toe-wetting start-up requires that the United States reverse nearly 30 years of opposition to reprocessing at a time of increasing concern about weapons programs in North Korea, Iran and other nations. That "is the wrong signal to send," said Edwin Lyman of the Union of Concerned Scientists, which opposes reprocessing. Also, Lyman and others challenged the administration's claim that the new technology would produce "proliferation proof" plutonium, and suggested that would-be proliferators would almost certainly find new ways to handle the spent fuel by the time the new system is ready.

Deputy Energy Secretary Clay Sell acknowledged these concerns but noted that the U.S. refusal to reprocess spent fuel has been a stance "that virtually no one [else] followed." The world "has moved on without us," he added, and a new technology that makes it harder to obtain plutonium "will make the United States a leader rather than a spectator."

Still, there are other misgivings. Experts in both science and industry doubt that the plan could meet what Sell called an "admittedly aggressive time schedule" to have commercial reprocessing up and running by 2025. If development drags on, these experts say, reprocessing would have little immediate effect on nuclear waste storage. Meanwhile, the government will be spending billions of dollars developing a fuel that probably will be too expensive to buy in the foreseeable future, except with a government subsidy. "I'm not dogmatic -- the claims may not ultimately be wrong," said Richard K. Lester, a nuclear scientist at the Massachusetts Institute of Technology. "But on the time scale that's going to matter, it's very difficult to come close to achieving the objectives that have been set."

Reprocessing technology was first developed by the United States in the 1950s as a way to obtain plutonium for nuclear warheads, but President Jimmy Carter banned it in 1977 because of proliferation concerns. President Ronald Reagan rescinded the ban in 1981, but even then, reprocessing was so expensive and technologically daunting that no U.S. power company ever sought to develop it.

France, Japan, Russia, India and the United Kingdom do reprocess commercially, and all use the old U.S. technology, called purex, which derives plutonium oxide from spent fuel and then combines it with uranium to create a mixed-oxide fuel, called MOX, that can be used in some power plants. MOX is much more expensive than the uranium fuel in conventional reactors. The conventional plants, which include all 103 nuclear generators currently operating in the United States, use "once through" fuel rods in a controlled reaction to produce steam that drives turbine generators. The rods are replaced every 18 to 24 months, and the spent fuel -- about 2,000 metric tons annually -- is put into temporary storage on the reactor sites.

Eventually, the spent fuel is supposed to go to Yucca Mountain, which will open, at the earliest, in 2012. By that time, the industry will have 70,000 metric tons of spent fuel waiting to ship to it. "We need to solve a couple of big problems," said Phillip J. Finck, deputy associate director for applied science technology and national security at Argonne National Laboratory. "We have to deal with the waste and destroy plutonium." The new technology, as described by Finck in a telephone interview, begins with a new reprocessing technique called urex-plus, which, like purex, dissolves spent fuel rods in a bath of nitric acid. The used fuel rods are composed of uranium, plutonium, heavy radioactive metals called "transuranics" and lighter radioactive elements known as "fission products."

Unlike purex, which separates out the plutonium, urex-plus leaves the plutonium and transuranics mixed together, making the resulting product unsuitable for weapons and much more difficult to handle for anyone trying to build a bomb. The new fuel would be used in a "fast reactor," where neutrons move about much more energetically than in conventional reactors, breaking down the long-lived transuranics into lighter fission products with shorter half-lives. The spent fuel from the fast reactor would then be reprocessed using another new technology known as "pyroprocessing," which separates the fuel by dissolving it in molten salt and running an electric current through it. The fuel could be recycled several times until the long-lived transuranics all but disappear.

If successful, the new reprocessing method would replace purex, the stockpile of civilian plutonium would stop growing, and the whole cycle would become much more proliferation resistant, Finck said. Also, he added, Yucca Mountain's storage capacity "would increase by a factor of 100." Instead of filling up by 2030, or earlier, the repository would last beyond the end of the century.

That is, if the new reprocessing system is ready by 2025. Steven Kraft, senior director of used fuel management for the Nuclear Energy Institute, an industry policy group, voiced doubts: "This is a matter of developing future technologies, and those technologies are 50 to 60 years away." Kraft endorsed Bush's plan as a worthy long-range goal, but nonproliferation advocates said impurities in reprocessed plutonium are not likely to dissuade would-be proliferators from stealing it.

Arjun Makhijani, president of the Institute for Energy and Environmental Research, an energy think tank, said: "You can get a one-kiloton explosion with impure plutonium, and if you're a terrorist the most important thing is to have the capability." Such a blast would be the equivalent of 1,000 tons of dynamite. "You don't care whether you destroy the tip of Manhattan or the whole island," he said.


California: Junk science wins at the PUC -- families and jobs to be harmed -- but Greenies will cheer!

In a bid to slow global warming, California regulators are scheduled to vote today to limit the amount of greenhouse gases the state's utilities are allowed to pump into the air. The measure before the California Public Utilities Commission would place California at the forefront of a nationwide effort to rein in carbon dioxide emissions, blamed for raising temperatures worldwide. "If we're going to deal with the greenhouse gas issue in California, we're going to have to go down this road," said commission President Michael Peevey, who proposed the cap.

Today's vote would merely begin the process of setting a specific cap, with the key details to be worked out later in discussions with environmentalists and the state's three investor-owned utilities. Those specifics include the actual number of tons each utility could emit and the penalties for those who go over the limits. In addition, the commission has no legal authority to include under the cap the state's municipal utilities, including those in Los Angeles and Sacramento. Doing so would require action by the Legislature. Peevey said the process of nailing down the cap's details would take several years. "That's important also to provide business with some assurance that we're going to do this based on sound economic sense," he said.

While the federal government has resisted limiting carbon dioxide, citing potential costs, states have been far more aggressive. Gov. Arnold Schwarzenegger has made fighting global warming a central goal of his administration, aiming to reduce the state's emissions to 1990 levels by the year 2020. A coalition of seven Northeastern states, including New York, agreed in December to cut their emissions using a "cap-and-trade" system, which forces power plant operators to buy and sell credits for producing specific amounts of the gas.

California may adopt such a cap-and-trade system as a result of the commission's decision today. "If we didn't see any such action at the national level, then we need to see some action at the regional level," said Christy Dennis, spokeswoman for Pacific Gas and Electric Co. The utility, based in San Francisco, supports the cap-and-trade concept. A similar process already limits the amount of sulfur spewed from power plants, one of the main causes of acid rain.

But critics from both ends of the political spectrum warn that carbon dioxide could be much harder to control. All power plants running on coal, natural gas or oil emit carbon dioxide. So do cars, humans, animals, fireplaces and wildfires. The most cost-effective way to produce hydrogen, a fuel some environmentalists hope will replace oil, also produces carbon dioxide. Crafting a system that can significantly cut emissions of the gas won't be easy. "I think you cannot extrapolate from sulfur, which few things emitted," said David Hamilton, director of the Sierra Club's global warming and energy program. "It was basically power plants. Carbon is just kind of everything -- it's cars, it's industry, it's power plants, it's agriculture." The environmental organization doesn't take a position for or against cap-and-trade systems.
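The cap-and-trade mechanism the article describes -- utilities holding allowances for a fixed tonnage of gas, with under-emitters selling surplus credits to over-emitters -- can be sketched minimally as follows. The utility names and all numbers are invented for illustration.

```python
# Minimal sketch of the cap-and-trade mechanism described above.
# Utility names and tonnages are hypothetical illustration values.
allowances = {"UtilityA": 1_000, "UtilityB": 1_000}   # tons of CO2 permitted
emissions  = {"UtilityA": 800,   "UtilityB": 1_150}   # tons actually emitted

def trade(seller, buyer, tons):
    """Transfer allowances from a utility with surplus to one in deficit."""
    surplus = allowances[seller] - emissions[seller]
    assert surplus >= tons, "seller lacks sufficient surplus to trade"
    allowances[seller] -= tons
    allowances[buyer] += tons

trade("UtilityA", "UtilityB", 150)
compliant = {u: emissions[u] <= allowances[u] for u in allowances}
print(compliant)  # both utilities now meet their (adjusted) caps
```

The design point is that the regulator fixes only the total cap; the market, not the commission, decides which plant actually cuts -- which is why the same scheme worked for sulfur permits.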



I linked yesterday to a media report of a statement saying that recent storm activity cannot be linked to global warming. Below is the preamble from the actual scientific statement itself:

Statement from Australian Bureau of Meteorology, February 2006. Submitted to CAS-XIV under Agenda Item 7.3 by Dr G. B. Love, Permanent Representative for Australia. Prepared by the WMO/CAS Tropical Meteorology Research Program, Steering Committee for Project TC-2: Scientific Assessment of Climate Change Effects on Tropical Cyclones. February 2006

To provide an updated assessment of the current state of knowledge of the impact of anthropogenically induced climate change on tropical cyclones.

The WMO CAS Tropical Meteorology Research Program has undertaken a series of assessments of the potential influence of climate change on global tropical cyclone activity. The most recent was published in the Bulletin of the American Meteorological Society by Henderson-Sellers et al. (1998) and had the following major conclusions:

* Whilst there was evidence of substantial multidecadal variability (particularly for intense Atlantic hurricanes), there was no clear evidence of long-term trends;

* The Maximum Potential Intensity of cyclones will remain the same or undergo a modest increase of up to 10-20%. These predicted changes are small compared with the observed natural variations and fall within the uncertainty range in current studies;

* Little can be said about the potential changes of the distribution of intensities as opposed to maximum achievable intensity;

* Current knowledge and available techniques are not able to provide robust quantitative indications of potential changes in tropical cyclone frequency;

* The broad geographic regions of cyclogenesis and therefore also the regions affected by tropical cyclones are not expected to change significantly;

* The modest available evidence points to an expectation of little or no change in global frequency. Regional and local frequencies could change substantially in either direction, because of the dependence of cyclone genesis and track on other phenomena (e.g. ENSO) that are not yet predictable;

* The rapid increase of economic damage and disruption by tropical cyclones has been caused, to a large extent, by increasing coastal populations, by increasing insured values in coastal areas and, perhaps, a rising sensitivity of modern societies to disruptions of infrastructure.



Many people would like to be kind to others, so Leftists exploit that with their nonsense about equality. Most people want a clean, green environment, so Greenies exploit that by inventing all sorts of far-fetched threats to the environment. But for both, the real motive is to promote themselves as wiser and better than everyone else, regardless of the truth.

Global warming has taken the place of Communism as an absurdity that "liberals" will defend to the death regardless of the evidence showing its folly. Evidence has never mattered to real Leftists.
