Friday, April 02, 2010
Direct Evidence that Most U.S. Warming Since 1973 Could Be Spurious
My last few posts have described a new method for quantifying the average Urban Heat Island (UHI) warming effect as a function of population density, using thousands of pairs of temperature measuring stations within 150 km of each other. The results supported previous work showing that UHI warming increases logarithmically with population, with the greatest rate of warming per unit increase in population density occurring at the lowest population densities.
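The station-pair regression described above can be sketched as follows. This is a minimal illustration with invented station data, not Spencer's actual dataset or code: temperature differences between paired stations are regressed against differences in log population density, recovering the assumed logarithmic UHI signal.

```python
# Hypothetical sketch of the station-pair method: for pairs of nearby
# stations, regress the inter-station temperature difference against the
# difference in log(population density). All data here are invented.
import numpy as np

rng = np.random.default_rng(0)

# Invented population densities (persons/km^2) and mean temperatures for
# 500 station pairs assumed to lie within 150 km of each other.
pop_a = rng.uniform(1, 1000, 500)
pop_b = rng.uniform(1, 1000, 500)

def true_uhi(p):
    """Assumed log-shaped UHI signal (0.8 C per tenfold population)."""
    return 0.8 * np.log10(p + 1)

temp_a = 15 + true_uhi(pop_a) + rng.normal(0, 0.3, 500)
temp_b = 15 + true_uhi(pop_b) + rng.normal(0, 0.3, 500)

# Pairwise differences: dT versus d(log10 population density)
d_temp = temp_a - temp_b
d_logp = np.log10(pop_a + 1) - np.log10(pop_b + 1)

# Least-squares slope estimates UHI warming per tenfold population increase
slope = np.polyfit(d_logp, d_temp, 1)[0]
print(f"UHI warming per tenfold population increase: {slope:.2f} C")
```

With enough pairs, the fitted slope recovers the assumed 0.8 C per decade of population density despite the measurement noise.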
But how does this help us determine whether global warming trends have been spuriously inflated by such effects remaining in the leading surface temperature datasets, like those produced by Phil Jones (CRU) and Jim Hansen (NASA/GISS)?
While quantifying the UHI effect spatially (between stations at different population densities) is an interesting exercise, it does not by itself prove that the thermometer measurements at those stations have warmed spuriously over time. The reason is that, to the extent the population density of each thermometer site does not change over time, differing levels of UHI contamination at different sites would have little influence on long-term temperature trends. Urbanized locations would indeed be warmer on average, but "global warming" would affect them in about the same way as the more rural locations.
This hypothetical situation seems unlikely, though, since population does indeed increase over time. If we had sufficient truly-rural stations to rely on, we could just throw all the other UHI-contaminated data away. Unfortunately, there are very few long-term records from thermometers that have not experienced some sort of change in their exposure…usually the addition of manmade structures and surfaces that lead to spurious warming.
Thus, we are forced to use data from sites with at least some level of UHI contamination. So the question becomes, how does one adjust for such effects?
As the provider of the officially-blessed GHCN temperature dataset that both Hansen and Jones depend upon, NOAA has chosen a rather painstaking approach where the long-term temperature records from individual thermometer sites have undergone homogeneity “corrections” to their data, mainly based upon (presumably spurious) abrupt temperature changes over time. The coming and going of some stations over the years further complicates the construction of temperature records back 100 years or more.
All of these problems (among others) have led to a hodgepodge of complex adjustments.
A SIMPLER TECHNIQUE TO LOOK FOR SPURIOUS WARMING
I like simplicity of analysis, whenever possible. Complexity in data analysis should be added only when it is required to elucidate something that is not obvious from a simpler analysis. And it turns out that a simple analysis of publicly available raw (not adjusted) temperature data from NOAA/NCDC, combined with high-resolution population density data for those temperature monitoring sites, shows clear evidence of UHI warming contaminating the GHCN data for the United States.
I will restrict the analysis to 1973 and later since (1) this is the primary period of warming allegedly due to anthropogenic greenhouse gas emissions; (2) the period since 1973 has the largest number of monitoring sites; and (3) a relatively short 37-year record maximizes the number of continuously operating stations, avoiding the need to handle transitions as older stations stop operating and newer ones are added.
Similar to my previous posts, for each U.S. station I average together four temperature measurements per day (00, 06, 12, and 18 UTC) to get a daily average temperature (GHCN uses daily max/min data). There must be at least 20 days of such data for a monthly average to be computed. I then include only those stations having at least 90% complete monthly data from 1973 through 2009. The annual cycle in temperature and the anomalies are computed for each station separately.
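The averaging and completeness rules just described can be sketched in a few lines. This is an illustrative reconstruction with invented numbers, not the actual processing code; the month length and completeness count are assumptions for the example.

```python
# Minimal sketch of the averaging rules: four synoptic observations per day
# are averaged into a daily mean, a month needs at least 20 daily means,
# and a station must have at least 90% of months present over 1973-2009.
import numpy as np

rng = np.random.default_rng(1)
n_days = 30
# Invented 00/06/12/18 UTC temperatures (deg C) for one month at one station
obs = 10 + rng.normal(0, 2, size=(n_days, 4))

daily_mean = obs.mean(axis=1)                   # daily average temperature

# Require at least 20 valid days before computing a monthly mean
monthly_mean = daily_mean.mean() if len(daily_mean) >= 20 else np.nan

# Station completeness over the 444 months of 1973-2009 (count is invented)
n_months_present = 430
keep_station = n_months_present >= 0.9 * 444
print(monthly_mean, keep_station)
```

A station with 430 of 444 months present passes the 90% threshold; one missing more than about 44 months would be dropped.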
I then compute multi-station average anomalies in 5×5 deg. latitude/longitude boxes, and then compare the temperature trends for the represented regions to those in the CRUTem3 (Phil Jones’) dataset for the same regions. But to determine whether the CRUTem3 dataset has any spurious trends, I further divide my averages into 4 population density classes: 0 to 25; 25 to 100; 100 to 400; and greater than 400 persons per sq. km. The population density data is at a nominal 1 km resolution, available for 1990 and 2000…I use the 2000 data.
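The gridding-and-stratifying step can be sketched as follows. The station records here are invented for illustration; only the 5-degree box size and the four population-class boundaries come from the text.

```python
# Hedged sketch of the gridding step: station anomalies are binned into
# 5x5 degree latitude/longitude boxes and split by population class.
import numpy as np

stations = [
    # (lat, lon, population density persons/km^2, anomaly deg C) -- invented
    (35.2, -97.4,  12.0, 0.10),
    (36.8, -95.1, 310.0, 0.35),
    (38.9, -94.6, 950.0, 0.55),
    (34.1, -99.0,  60.0, 0.18),
]

def pop_class(density):
    """Population classes from the text: 0-25, 25-100, 100-400, >400 /km^2."""
    for i, upper in enumerate((25, 100, 400)):
        if density < upper:
            return i
    return 3

boxes = {}
for lat, lon, dens, anom in stations:
    # 5-degree grid box index plus population class as the averaging key
    key = (int(lat // 5), int(lon // 5), pop_class(dens))
    boxes.setdefault(key, []).append(anom)

# Multi-station average anomaly per (box, class) combination
box_means = {k: float(np.mean(v)) for k, v in boxes.items()}
print(box_means)
```

Each (box, class) average can then be compared against the CRUTem3 anomaly for the same grid box.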
All of these restrictions result in 24 to 26 5-deg grid boxes over the U.S. having all population classes represented over the 37-year period of record. In comparison, the entire U.S. covers about 40 grid boxes in the CRUTem3 dataset. While the following results are therefore for a regional subset (at least 60%) of the U.S., we will see that the CRUTem3 temperature variations for the entire U.S. do not change substantially when all 40 grids are included in the CRUTem3 averaging...
Significantly, the warming trend in the lowest population class is only 47% of the CRUTem3 trend, roughly a factor-of-two difference.
Also interesting is that in the CRUTem3 data, 1998 and 2006 are the two warmest years during this period of record, whereas in the lowest population class the two warmest years are 1987 and 1990. When the CRUTem3 data for the whole U.S. are analyzed (the lighter red line), the ranking of the two warmest years is reversed: 2006 is first and 1998 second.
From looking at the warmest years in the CRUTem3 data, one gets the impression that each new high-temperature year supersedes the previous one in intensity. But the low-population stations show just the opposite: the intensity of the warmest years is actually decreasing over time.
To get a better idea of how the calculated warming trend depends upon population density for all four classes, the following graph shows, just like the spatial UHI effect on temperatures I have previously reported, that the warming trend falls off nonlinearly as the population density of the stations decreases. In fact, extrapolating these results to zero population density might yield very little warming at all!
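The extrapolation just described can be sketched like this. The four trend values below are invented stand-ins, not Spencer's numbers, and the class midpoints are assumed; the point is only to show the fit-and-extrapolate step.

```python
# Illustrative sketch: fit the warming trend against log10(population
# density) for the four classes and read off the value near zero density.
import numpy as np

# Assumed class midpoints (persons/km^2) and hypothetical trends (C/decade)
density = np.array([12.5, 62.5, 250.0, 800.0])
trend = np.array([0.10, 0.15, 0.19, 0.22])

# Linear fit in log10(density) mirrors the nonlinear shape in the text
slope, intercept = np.polyfit(np.log10(density), trend, 1)

# Extrapolate toward near-zero population (1 person per km^2 here)
trend_at_low_density = slope * np.log10(1.0) + intercept
print(f"extrapolated trend near zero density: {trend_at_low_density:.3f} C/decade")
```

With these invented inputs the extrapolated trend at negligible population density is only a small fraction of the highest-class trend, which is the shape of argument the paragraph makes.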
This is a very significant result. It suggests the possibility that there has been essentially no warming in the U.S. since the 1970s.
More HERE (See the original for links, graphics etc.)
Destroying America with the EPA's Carbon Lies
By Alan Caruba
Lisa Jackson, Obama’s EPA director, has just announced the agency’s new auto regulations of gas mileage based on global warming. In addition, the agency asserts the right to regulate carbon dioxide (CO2) emissions under the Clean Air Act.
There is absolutely no scientific justification for this and, indeed, many observers believe the EPA lacks the legal authority regarding its stance on CO2.
There is NO need to limit greenhouse gas emissions because there is NO “global warming.”
Greenhouse gases are purported to be the primary cause of this fraud. The EPA, like a dozen other U.S. agencies, has been pushing the global warming fraud for decades. One more lie, even a whopper about CO2, is of little concern to the EPA at this point.
Beyond the issue of scientific fraud, there are the scientific facts that demonstrate that CO2 plays a minuscule role, if any, as regards the Earth's climate. Carbon dioxide is less than one percent of the Earth's atmosphere (386 parts per million, about 0.04 percent).
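The parts-per-million figure quoted above converts to a percentage with simple arithmetic; nothing here is assumed beyond the 386 ppm value in the text.

```python
# Converting 386 parts per million into a percentage of the atmosphere:
# ppm divided by 10,000 gives percent.
co2_ppm = 386
co2_percent = co2_ppm / 10_000
print(f"{co2_percent:.4f}%")   # 0.0386%
```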
There is, in fact, no greenhouse effect. The most active element of the atmosphere is water vapor, said to account for some 95% of any greenhouse warming, which forms a protective layer around the Earth.
The science involved is fairly simple. Clouds have a warming effect because, in order for water vapor to condense back into water droplets, the water molecules must first re-emit the energy they absorbed to become vapor. That latent heat warms the local environment. It is this constant interchange that determines whether the place you are right now is warmer or cooler.
The public is rarely, if ever, told that meteorologists have NO idea why clouds act as they do. All they can do is track cloud activity via satellite images, but they can only accurately predict the weather at best for three to four days ahead. This is why, when you watch a televised weather forecast, they mostly just point to cloud systems.
The Earth’s oceans contain fifty times the CO2 in the Earth’s atmosphere. The Earth’s biomass, oceans, near-surface rocks and soils contain 100,000 times the carbon in the atmosphere.
To declare CO2 toxic, the EPA is saying that all that natural CO2, plus the six pounds of carbon dioxide that every human exhales every day, is a "pollutant."
How can carbon dioxide be a pollutant when all life on Earth is dependent upon it?
CO2 is to vegetation what oxygen is to human and other animal life. Without CO2, all vegetation dies and then all animal life dies for lack of the nutrients provided by food crops.
The EPA will blame the generation of CO2 on energy use, but 97% of the Earth’s CO2 is produced by Nature!
Only about 3% of all the CO2 in the atmosphere is produced by humans via industrial and transport activity. This estimate, in fact, comes from the UN Intergovernmental Panel on Climate Change! The IPCC’s other alleged climate data is subject to serious challenge, but this is not. It falls into the category of common knowledge among climate scientists.
Environmentalists are insanely opposed to all energy use with the exception of bicycles, canoes, and walking. They particularly hate automobiles, but these are the same people, along with the EPA, that have insisted on the inclusion of ethanol, otherwise known as moonshine, in every gallon of gasoline. The immediate result is less mileage per gallon and the production of more CO2!
None of these facts is a secret, yet since 1989 the U.S. government has spent $79 billion of taxpayers' money on "climate change" research. To suggest that the government, using the data generated, has any "control" over the climate is absurd.
The result of all that government-funded research has been a public subjected to the massive fraud called global warming. Weather data provided by NASA and NOAA, for example, have had to be withdrawn due to errors.
Not only has the scientific community learned that the IPCC data was manipulated and that efforts were made to suppress data refuting global warming, but the Earth has irrefutably been in a cooling cycle for over a decade at this point.
The EPA's regulatory control of auto mileage and CO2 emissions is a complete fraud and a contemptible lie. In asserting it, the agency has become a gangster agency that has abandoned any credibility.
Finally, the Cap-and-Trade Act awaiting a vote in the Senate is based on the global warming fraud and, if enacted, would impose massive taxation on all energy use. It must be stopped.
The EPA's latest move must be stopped. The fate of the nation’s economy literally depends on this.
EPA Limits On Greenhouse Gases Will Shift U.S. Production Overseas
Climate change represents a tough and complex policy issue. That's the reason U.S. lawmakers — more than 30 years since scientists first introduced the concept of global warming into the American political dialogue — continue to debate the best way to structure legislation aimed at reducing greenhouse gas (GHG) emissions without damaging the economy.
As Congress debates domestic legislation, the administration participates in an international process with similar objectives.
As this takes place, the Environmental Protection Agency is engaging in a form of political blackmail, threatening to push ahead on its own if elected officials don't move as quickly as the agency wants. It has already taken the first step toward leveraging the Clean Air Act to mandate GHG reductions.
Headed toward regulatory seppuku, the agency recently finalized a climate change determination that current concentrations of GHGs — about 435 parts per million (ppm) in CO2 equivalents — in the atmosphere endanger public health and welfare.
EPA makes this claim not because these emissions are toxic like pollutants covered by the Clean Air Act, but because of their heat-trapping capacity. EPA has accepted the theory that increasing emissions of these gases will lead to unprecedented increases in the earth's temperature and that a much warmer earth will mean health- and even life-threatening problems.
Although it has been as warm or warmer in the past, EPA wants to act assuming a worst-case scenario — even if the probability of occurrence is incredibly small.
There is support for the theory (a theory, not a fact) that the projected path of human emissions will lead to global concentrations that pose a risk. The U.N.'s Intergovernmental Panel on Climate Change has set an ambitious goal of cutting GHG emissions so concentrations do not exceed 450 ppm and keeping the global average temperature increase below 2 degrees Celsius.
Leaving aside whether achieving such a goal is even technologically or economically feasible, let's focus on the fact that the international goal — which the Obama administration has acknowledged in the Copenhagen Accord — represents a higher concentration than what EPA claims is already endangering all of us.
If EPA Administrator Lisa Jackson really believes this, she must be setting the stage for a massive regulatory assault by declaring that any increase in GHG concentrations will further harm human health and welfare. Once the agency determines that current concentrations of greenhouse gases are unhealthy, how can U.S. negotiators agree to a U.N. limit that is higher?
The Battle Between Image and Reality
Several events, including record snowfalls in many parts of the country, have brought climate change, also known as global warming, back to the public's attention. These events include the release of emails indicating manipulation of temperature data, admissions by the former head of the Climate Research Unit about recent and historical temperatures, and the reluctant acknowledgement that the Intergovernmental Panel on Climate Change's (IPCC) last assessment report contained several glaring errors.
The image, carefully crafted and marketed for over two decades, that temperature increases have been accelerating in recent decades, as a result of human activity, is now suspect. Without that image it is hard to convince the public that a climate apocalypse is likely later this century.
The image that human activity is radically changing our climate does not conform with factual data. The facts suggest claims about accelerating temperatures are an exaggeration and temperature records are of questionable accuracy.
The apocalyptic rhetoric about rapidly rising temperatures is matched with similar rhetoric about sea levels, flooding of small islands and coastal regions, increased drought, diseases, and other ecological effects.
This vision of a dismal future pervaded the recent global climate meeting in Copenhagen. Here in the U.S. such images were used by the Environmental Protection Agency (EPA) to justify issuing an endangerment finding which alleges that temperature increases from carbon dioxide (CO2) emissions are a threat to human health and the environment.
Whenever advocacy groups and the media use scary images, it is always wise to be more than a little skeptical and to check the underlying facts. All too often, the facts do not support the images. That is what led the late historian, Daniel Boorstin, to observe that we live in an age where facts get tested by the image instead of the image getting tested by the facts.
The image that the apocalyptics are touting is that our temperatures were like Goldilocks’ porridge, just right until human activity caused them to increase rapidly because of fossil energy use to heat our homes, operate our businesses and power our cars.
Are the facts consistent with the image? No. There have been periods in the earth’s geological history when CO2 levels and temperatures were higher than they are today. Our more recent temperature history provides empirical evidence that the earth is not experiencing run-away warming and that there has been a cyclical pattern to temperatures, at least, over the past 115 years.
In the late 1800s, we were beginning to emerge from the Little Ice Age and our average temperatures were in the 51º range. Over the next several decades, they ranged between 52º and 53º. Then in the 1940s, temperatures began to decline and that brought concerns of another ice age. In the late 1970s, temperatures started increasing again, rising from 52.3º to a peak of 55.1º in 1998. An increase of nearly 3º in three decades would be worrisome if it indicated a new and continuing trend. But that is not what has happened. Since 1998, which was an El Niño aberration, temperatures have begun to decline once again...
Professor Robert Balling, a well-known climatologist, has observed, “Confounding many glib assertions is the fact that the warming rate in the early twentieth century (1915-1945) is not significantly different from the warming rate of the past three decades.”
The temperature record since 1895 is a series of ups and downs, with changes from year to year or decade to decade measured in tenths or hundredths of degrees. And last year's average temperature was similar to what we experienced in the 1920s and '30s.
The figure shows that temperatures have not increased in a manner consistent with the theory that human produced CO2 is causing unprecedented warming and that temperatures in the past decade are not significantly different from temperatures early in the 20th century.
More HERE (See the original for links, graphics etc.)
Make Sure of the Facts on Climate Change
The two largest purchases most people make today are houses and cars. Because of the amount of money involved, prudent consumers do thorough research before buying. When buying a car, the purchaser looks at the manual and test-drives the vehicle, but also talks to owners of the same model and reads reviews in car and consumer magazines and websites.
The prospective homebuyer walks around the neighborhood and looks at the house in person, but also gets reliable information on area home prices, crime statistics, local schools, and of course, a home inspection by a qualified expert.
In the legal field, this is called performing due diligence and as a popular business motto has it, “an informed consumer is our best customer.” If an individual buyer is willing to put hours of work into researching a purchase of tens or hundreds of thousands of dollars, how much effort should the nation put into investigating a proposal which will cost the nation trillions? That is the estimated cost of the climate legislation before Congress.
Much of the information relevant to houses and cars is easily understood by the general public – safety statistics or the presence or absence of radon and termites – but in the case of climate science, the way in which relevant information is developed and synthesized is far more complex and opaque.
Enough errors and misstatements by reputable climate researchers and organizations have come out recently to provoke a reaction in the public and this provides a good starting point for a discussion of how scientific data is generated, analyzed, stored and used.
Our understanding of the physical world will always be imperfect, but we still have a responsibility to test, validate and revalidate, to be as certain as we can be about the climate data on which our future well-being depends.
Since the late 1980s, there has been growing concern among climate scientists that the earth’s temperature is increasing, which they determined from the study of ancient climate proxies and from modern temperature records. The cause of this warming has been attributed by some to human emissions of greenhouse gases, chiefly carbon dioxide (CO2), although other causes such as natural climate variability have been suggested.
The United Nations' Intergovernmental Panel on Climate Change (IPCC) reports, which are meant to be the most accurate and up-to-date summary of the state of climate change research, include warnings about the potential for significant increases in temperature and the catastrophic consequences they would produce.
Because of their involvement in this international group, many governments have agreed to greatly reduce CO2 emissions, in spite of the economic and social costs which will result from such a rapid shift in energy use. (It is the abruptness of the change in energy generation and use, and the lack of cost-competitive alternatives to fossil energy, which are likely to cause economic disruption. Decarbonization, which is the gradual shift from higher-carbon, less efficient energy sources to lower-carbon, more efficient ones, has been going on for centuries and is the default "business as usual" evolution of energy use.)...
Corrections and even retractions of research findings are a normal part of the scientific method. For example, a prediction made last year in Nature magazine of future sea-level rise of up to 32 inches by 2100 was retracted recently by the authors due to mistakes they detected in their modeling.
Unfortunately, this kind of transparency is not as common as it should be. Leading climatology researchers at the Climate Research Unit (CRU) at the University of East Anglia and at Pennsylvania State University have also contributed substantially to the IPCC's reports. These centers and the scientists affiliated with them have been buffeted by charges of improper and unprofessional conduct in their research and analysis, particularly after a mass of their hacked email correspondence was released online in November 2009.
The petty and vindictive tone of many of the messages is not surprising to anyone who has worked in academia, but it shocked and disappointed many people who supposed that scientists were exempt from human prejudice and error. Professor Susan Dudley described the deference usually given to the scientific community as a “cultification of science,” complete with unimpeachable high priests.
Wallace Sayre of Columbia University once quipped, "The politics of the university are so intense because the stakes are so low," but the stakes here are of great consequence. Some of the errors which have been reported so far are simple to correct. The 2007 IPCC report stated that 55 percent of the Netherlands is below sea level, when the actual figure is 26 percent.
The same report claimed that it was “very likely” (meaning a greater than 90 percent chance) that Himalayan glaciers would disappear by 2035 if current warming trends continued. It was later found that this was simply an opinion expressed by one climate scientist a decade before.
Another claim in the 2007 report was that by 2020 global warming could reduce crop yields in some African nations by half; further investigation showed that the claim was wholly groundless.
Errors of this type might be expected in a high-school research paper, but they are indefensible at the level of the IPCC.
More serious, systemic problems have emerged from the hacked emails. Some messages suggest that the writers intended to keep contrary views from being published in scientific journals. Such actions would seriously endanger intellectual freedom and the scientific method.
Others show that scientists at the University of East Anglia’s Climate Research Unit repeatedly ignored Freedom of Information requests for climate data and discussed erasing email messages to avoid having them exposed.
Professor Philip Jones of the CRU, whose work figures prominently in the IPCC’s reports, has admitted that he had not organized his data well and his refusal to release the data behind his findings to outside researchers was based, at least in part, on his inability to document his sources. Jones’ colleagues have suggested that actually he may have lost the original data.
Poor organization is a fact of life and the loss or corruption of scientific data can usually be dealt with on the university department level. But when that data is used as the foundation for national and international economic policy, the standards of generating, organizing and maintaining it must be as high as humanly possible. If Professor Jones’ original data can be retrieved or reconstructed and if it confirms his conclusions, then it can be used in further study. If it cannot be recovered and therefore is not falsifiable, then his conclusions and anything based on them have no scientific validity.
As Karl Popper said, “The criterion of the scientific status of a theory is its falsifiability, or refutability, or testability.” Scientific principles require that an experiment, process or observation be testable and repeatable in order to be validated and without the original data, this is impossible. For that reason, transparency of both method and data is vital for progress in understanding climate.
Equally important, climate data must be reliable, or robust, as scientists say, which means that it must be thoroughly and repeatedly examined for flaws before it becomes the basis for analysis and public policy. No matter how sophisticated our climate modeling is, if the inputs are unreliable, the outcome is of no value.
One solution would be to require that data and analytical methodologies used in governmental and intergovernmental reports be made public for review by both experts and the interested public. This, together with independent external auditing and validation, would go a long way toward guaranteeing the quality of the data and the conclusions drawn from it.
If the scandals arising from the released climate emails and clumsy mistakes in the IPCC reports have any positive result, it will be a demand for greater transparency and corroboration of the scientific data which forms the basis of our public policy.
We cannot guarantee that our politicians will always make the right decision even with the best possible information, but without it, making the right decision is almost impossible.
Feminists and Greens, not Christians, are the real wowsers
"Wowser" roughly means "killjoy" and is of American origin but is now commonly used only in Australia. It was originally an abbreviation of "We only want social evils removed" and was part of the campaign for Prohibition
I'd have thought stripping was the last thing you'd do in a country called Iceland, and now it is. The country's parliament last week voted to ban striptease shows, making it a crime to turn a buck from a naked woman.
Now, normally news from Iceland - even news including words such as "stripper" and "nude" - cuts no ice with me. But there's a moral to this story that helps explain why Formula One driver Mark Webber protested last weekend that Australia had turned into "a bloody nanny state in which we've got to read an instruction book when we get out of bed - what we can do and what we can't do".
Mark, all the way from Iceland comes your explanation. The telling thing about Iceland's ban on strippers is that it's long been a Christian country - yet it's not Christians now forcing everyone else to live by their finger-wagging code.
True, Iceland's Christianity is of that wobbly northern European kind, with only one in 10 believers in a pew on Sundays. Sounds a bit like Australia. Yet the 90 per cent of Icelanders formally registered with a church have long tolerated the strip shows that a new breed of believers have now banned.
And who are these new wowsers? Why, followers of a creed that's also growing strong here, and dangerously lacks Christianity's tolerance. You see, Iceland is the first country in the world to ban stripping and lapdancing not for religious reasons but feminist. Indeed, it's the only European country other than the Vatican City and Andorra to ban stripping at all.
Kolbrun Halldorsdottir, the politician who first proposed this law, says its adoption by parliament is "mainly as a result of the feminist groups putting pressure on parliamentarians". Moreover, almost half of Iceland's parliamentarians are female, and prime minister Johanna Sigurdardottir is not only a feminist but the world's first openly lesbian head of government.
This is the coalition of ideologues who have banned what Christians wouldn't, ruling that men may not enjoy what feminists don't like, and other women may not make money in ways their betters think sinful.
True, stripping is a demeaning trade and the less of it the healthier. I'd even admit to being dismayed that our local councils are so indifferent to the trashing of our culture that they allow the huge twirling sign next to the busy Richmond train station that advertises two sleazier strip clubs.
But the issue here is not whether stripping should be banned (I'd say no), but who is most likely to ban it - and ban lots of other things they don't like. Or put it this way: it's to identify just who is most likely to tick off Mark Webber.
We're so often told that the real straighteners are Christians, and especially Catholics. "Get your rosaries off my ovaries," screeched an anti-Catholic Age columnist at Tony Abbott on the ABC's Q&A, as if the Opposition Leader really was about to ban abortions and anything else his Catholic faith didn't like.
In truth, Abbott as prime minister would do no such authoritarian thing, and not just because he's terrified of losing the votes of women. He actually knows Christianity has what so many in-your-face ideologies of the Left do not - a respect for freedom and individual conscience.
Christians, or at least those with brains, understand their God gave us freedom to choose, or else there'd be no Hell. How we choose is ultimately up to us - and it's in that free choice that we show our moral worth. That helps to explain why Australia, despite being an overwhelmingly Christian country, has so few laws passed on purely religious grounds.
We can watch strip shows, divorce, drink, draw rude pictures of Christ, work on Sundays, visit brothels, have abortions, buy condoms, blaspheme and call the Pope a Nazi. We are free to sin against the Christian creed - and to be judged.
But the new moralists aren't quite so keen on such freedom. It's the old problem. Most moralists are really after power, not goodness, and moralising licenses them to do almost anything to make other people nicer, since their bullying is for their victim's own good. And if they don't believe in God, they'll feel even more obliged to do his judging for him.
That's why feminists feel free to ban other women from stripping, and why Webber is being driven mad by laws that are intrusive, expensive, patronising and inconvenient impertinences.
The worst of them are our new racial and religious vilification laws that have done no good and much harm, most notoriously when they were used to punish Christian pastors for quoting Koranic passages on jihad to their flock. This preaching was illegal, ruled the VCAT judge, because it elicited "a response from the audience at various times in the form of laughter".
Amazing - a law against blasphemy that was passed not by Christians but secular multiculturalists.
Then there are those countless other pestering laws that are meant to spread the latest faith of the faithless, rather than achieve any practical good.
* Take the ban on shopping bags that inconveniences shoppers without saving the planet.
* Take the ban on bottled water passed by the NSW town of Bundanoon that strikes the approved attitude but won't stop global warming by a flicker.
* Take the new Victorian law that demands all new houses have six-star green rating, forcing buyers to pay extra to live someone else's green dream.
* Take the recycling laws that force people to ritually separate their garbage at no benefit to anyone.
* Most infuriating of all such laws is Victoria's mad ban on new dams that has forced hundreds of thousands of garden lovers to water their gardens by hand at dark dawn, thanks to the man-made water shortages that followed. We must stand, hose in hand, in our gardens at dawn because influential greens felt more water was a sin.
That's the new green moralist forcing you to genuflect to their faith, using the law in ways Christians can't and won't.
YOU don't believe me? You can't believe that our laws are being used just to ram this new faith down your throat? Then read this letter in The Age from a green in Northcote (where else) infuriated by the Brumby Government's pre-election announcement that it will relax its water restrictions, thanks to good rain and the new desalination plant.
"I was stunned to hear that Melburnians can shortly return to squandering water," this Gaian raged. "After years of drought ... the population was beginning to show that behaviour can be changed, we can become responsible consumers of the Earth's resources. "Now, with a state election looming ... we can return to the head-in-the-sand wasteful behaviour of years past. Shame on you, Mr Brumby." Other writers agreed.
This is the green version of a law to force us all to eat fish on Friday.
Some of these new moralists would go even further, and turn the country into a theocracy. A green Iran. Take James Lovelock, the Gaia guru, who preaches that "climate change may be an issue as severe as a war" and "it may be necessary to put democracy on hold for a while".
Or Prof Clive Hamilton, the Greens candidate, who claims global warming may require "emergency responses such as the suspension of democratic processes". Sure, Clive. Can I be the dictator, or is that job reserved for you?
You know the answer. After all, Hamilton is the fanatic who preaches that the "Gaian earth in its ecological, cybernetic way, (is) infused with some notion of mind or soul or chi".
And if it takes a law to make us see things his way - well, let Webber blubber. Or flee to a truly Christian land where he's still free to blaspheme.
Posted by JR at 5:32 PM