Tuesday, August 23, 2011


A cautious whitewash

The whitewash described below is not particularly surprising -- just establishment scientists covering for one another. What IS surprising is something not mentioned below: the heavily qualified conclusions of the investigation. Look carefully at the OIG report from the NSF. They basically found no reason to question Penn State's own very limited investigation, and said only that nothing improper was done WITH NSF FUNDS.

It doesn’t conclude that there is “nothing wrong” with Mann’s conclusions; all it concludes is that there is no basis for finding he did anything improper (WITH NSF FUNDING).


Michael Mann, a Pennsylvania researcher who’s been a target of climate-change skeptics, was cleared of wrongdoing by U.S. investigators in the flap surrounding e-mails hacked from a U.K. university.

Finding no “evidence of research misconduct,” the Arlington, Virginia-based National Science Foundation closed its inquiry into Mann, according to an Aug. 15 report from the inspector general for the U.S. agency. Pennsylvania State University, where Mann is a professor of meteorology, exonerated him in February of suppressing or falsifying data, deleting e-mails and misusing privileged information.

Climate-change doubters pointed to the stolen U.K. e-mails, which surfaced in blogs in 2009, as proof that researchers conspired to quash studies questioning the link between human activity and warming. Last week, Texas Governor Rick Perry, who is seeking the Republican nomination for president, renewed the assertion that scientists have “manipulated” data on climate change.

“It was a pretty definitive finding” that the charges “swirling around for over a year” were baseless, Mann said in an interview. “I was very pleased.”

The report confirms findings from the U.S. National Oceanic and Atmospheric Administration’s inspector general and a separate panel of seven scientists based at universities in the U.K., U.S. and Switzerland. The University of East Anglia convened the committee. Ron Oxburgh, the former head of Shell Transport & Trading Plc and a member of the U.K. House of Lords, was chairman.
‘Closes the Books’

“It certainly closes the books on Michael Mann and the e-mails,” Joe Romm, a blogger for the Center for American Progress, an advocacy group with ties to President Barack Obama’s administration, said in an interview. “They found nothing wrong with the science, or any evidence that there was anything wrong with how the scientists went about their work.”

More HERE






The Bait-and-Switch Greenie Swindle

Straining to remain relevant, the environmental movement has had to change tactics.

Back in the seventies, when America looked like China does today, environmental issues needed attention. But then we cleaned up the air and water. The skies and rivers went from brown to blue. As Greenpeace cofounder Patrick Moore explains, in order to stay relevant, environmentalists had to find new issues.

For most of the last decade global warming has been their cause, and carbon—or burning fossil fuels—was vilified as the culprit. This gave rise to a whole new industry: green. Green energy would replace fossil fuels. Wind and solar would replace coal as the source fuel for electricity, and ethanol or other fuels generated from biomass would replace liquid fuels. Green energy would provide new “green” jobs. The world would be a beautiful place.

This all sounded nice. It felt good. But that was before the data began to show how much more all of this was going to cost, and before the urgent need to save the planet passed. The polar bears were not drowning. The measurements were found to be falsified. Consensus science didn’t work. The seas did not rise, and the world seemed to adapt to whatever the various changes have been. There was a “newfound hostility to climate policy.” Suddenly, we did not want to spend so much on “feel good.”

Obama’s cap and trade campaign promise died. Ethanol is on the budget chopping block. Switching to wind and solar is not proving to be as easy as expected. Environmentalists admitted defeat.

But, wait! They have organizations set up, offices with leases, and employees who need to be paid. They can’t just pack up. A new approach was needed.

Enter public health. Last month when Mayor Bloomberg gave $50 million to the Sierra Club’s campaign to shut down coal plants, he stated: “Coal is a self-inflicted public health risk.” The discussion has changed to something every mother can get behind.

Along with this, we see television ads attacking the emissions from coal-fueled power plants, not for their CO2 emissions, not for their impact on climate change, but for the health risks. The American Lung Association and the EPA must be in cahoots on this campaign—the EPA has given the ALA nearly $30 million in taxpayer dollars.

According to the National Institutes of Health’s Data Fact Sheet on Asthma Statistics, “The prevalence of asthma has been increasing since the early 1980s.” If the prevalence of asthma has been increasing as America’s air has been getting cleaner and cleaner, perhaps adding new and expensive regulations on behalf of public health isn’t really about public health. In fact, a recent study by Kendle M. Maslowski and Charles R. Mackay published in Nature Immunology indicates that we may have cleaned up the air so much that the body doesn’t have the chance to build up immunities.

While only a small percentage of the population suffers from asthma, and the science is questionable as to whether pushing the law of diminishing returns will help, the Obama administration talks about rolling back regulations while pushing the EPA to enact harsh new ones that will eliminate the best economic asset America has: comparatively cheap energy.

Specifically in question here are the EPA’s new ozone regulations, with a final decision expected in the next couple of weeks. But there is more than just ozone: there is the Cross-State Air Pollution Rule, BART and MACT, all of which are expected to shut down a large percentage of existing coal-fueled power plants because the cost to retrofit is just too high. Many units have already shut down throughout the country.

With these “public health”-aimed regulations piled on top of one another, it is amazing that Americans are living longer and longer. If all of these regulations are really about health, why are they being rammed through by the Environmental Protection Agency and not the Department of Health?

As Congress continues to threaten to defund the EPA, perhaps, like the environmentalists, the agency has had to reinvent itself to stay relevant. But in doing so, it is raising the price of energy and of everything else, including food, clothing and all other basic necessities, as they, too, are energy dependent.

If they can so easily switch from climate change to public health, you have to wonder if climate change was ever the issue and if public health is the real concern now. Why is it that the powers that be are so set on raising the cost of energy—through whatever means seems publicly viable?

The obvious answer is something not palatable to most Americans. Which brings up the next question: What can we do to stop them?

At a recent meeting with Karl Rove, I asked: “Given the current administration, what can the public do to change the energy policy in America?” In short, his answer was: keep reminding people how important energy is. November 2012 is coming.

If Americans are to continue to have the freedoms we have, energy has to be a part of the discussion and Americans need to understand the real benefits to cost-effective energy. Together we can change the energy/environment discussion.

SOURCE





Behind the Aliens-Will-Smite-Us News Story

It isn’t every day that a research paper published in an obscure academic journal attracts its own, full-blown article in a major newspaper. It isn’t every day that a science correspondent writes an article that merits a headline as bizarre as the following: Aliens may destroy humanity to protect other civilizations, say scientists.

One of the main assertions of this Guardian news story has already been withdrawn. Over at WattsUpWithThat.com there’s a screengrab of what the article looked like yesterday. Just under the headline, the article was claiming:

Rising greenhouse emissions may tip off aliens that we are a rapidly expanding threat, warns a report for Nasa [bold added]

Today the reference to NASA has been removed. As the homepage of Seth Baum, the paper’s lead author, explains:

The article was not in any way prepared for or sponsored by NASA. Instead, it was a spare-time project of three researchers, one of whom happens to be a NASA employee…

It’s difficult not to feel some sympathy for the NASA-affiliated person, whose name is Shawn Domagal-Goldman. As he himself explains in a blog post:

This isn’t a “NASA report.” It’s not work funded by NASA, nor is it work supported by NASA in other ways. It was just a fun paper written by a few friends, one of whom happens to have a NASA affiliation.

…So here’s the deal, folks. Yes, I work at NASA. It’s also true that I work at NASA Headquarters. But I am not a civil servant… just a lowly postdoc. More importantly, this paper has nothing to do with my work there. I wasn’t funded for it, nor did I spend any of my time at work or any resources provided to me by NASA to participate in this effort.

…I do admit to making a horrible mistake. It was an honest one, and a naive one… but it was a mistake nonetheless. I should not have listed my affiliation as “NASA Headquarters.” I did so because that is my current academic affiliation. But when I did so I did not realize the full implications that has. I’m deeply sorry for that, but it was a mistake born our [sic] of carelessness and inexperience and nothing more. I will do what I can to rectify this… [bold in the original]

So what the newspaper told us was a report for NASA written by scientists turns out to be a fun paper written by a few friends. One of them (Baum) is still working on his PhD. Another (Haqq-Misra) got his PhD last year. The third (Domagal-Goldman) is, in his own words, just a lowly postdoc.

Lead author Baum sounds like the sweet young man you’d be delighted to learn was dating your daughter. He says all his activities and interests “revolve around the theme of making the world (universe(s)?) a better place.” As he explains:

My dissertation research, with advisor Bill Easterling, is on the ethics and moral psychology of discounting in the context of climate change assessment. I also work on reducing global catastrophic risk, which is anything that could end human civilization or even cause human extinction.

That’s all well and good. But if the academic paper that caught the attention of the Guardian science correspondent is any indication of how the current generation of young scientists thinks, we’ll need to start viewing scientific findings with more than a grain of salt.

You see, the worldview embraced by these youngsters is as depressing as it is astonishing. I’m no longer surprised to learn that arts students have absorbed a humans-are-a-pox-on-the-planet philosophy. But apparently this is now true of our scientifically-trained minds, as well.

In the event that God or Mother Nature doesn’t punish us for our eco sins, the three young men who wrote this paper speculate that maybe aliens will, instead. According to the Guardian:

…reducing our emissions might just save humanity from a pre-emptive alien attack, scientists claim.

Yes, and eating only cauliflower for breakfast might save us from an alien attack, too. Since no one has ever detected any aliens, never mind figured out what their value system might be, my guess is surely as good as anyone else’s.

At first I worried that the newspaper article was exaggerating. But then I took a look at the academic paper itself. What’s remarkable about it isn’t just the angst these young people exude. It’s that ideas I consider highly debatable – such as the claim that humans are responsible for widespread loss of biodiversity – are all assumed, by these scientific minds, to be well-established facts.

For example, on page 21, the authors write:

Given that we have already altered our environment in ways that may [be] viewed as unethical by [aliens] it may be prudent to avoid sending any message that shows evidence of our negative environmental impact…any message that indicates of [sic] widespread loss of biodiversity or rapid rates of expansion may be dangerous…On the other hand [the aliens] may already know about our rapid environmental impact by listening to leaked electromagnetic signals or observing changes in Earth’s spectral signature. In this case, it might be prudent for any message we send to avoid denying our environmental impact so as to avoid the [aliens] catching us in a lie.

This just makes me want to weep.



SOURCE






Climate Forecasting Models Aren’t Pretty, And They Aren’t Smart

By Dr. Larry Bell

Anyone who says they can confidently predict global climate changes or effects is either a fool or a fraud. No one can even forecast global, national or regional weather conditions that will occur months or years into the future, much less climate shifts that will be realized over decadal, centennial and longer periods.

Nevertheless, this broadly recognized limitation has not dissuaded doomsday prognostications that have prompted incalculably costly global energy and environmental policies. Such postulations attach great credence to computer models and speculative interpretations that have no demonstrated accuracy.

The primary source of scary climate change alarmism routinely trumpeted in the media originates from politically cherry-picked summary report items issued by the U.N.’s Intergovernmental Panel on Climate Change (IPCC). Yet even the IPCC’s 2001 report chapter titled “Model Evaluation” contains this confession: “We fully recognize that many of the evaluation statements we make contain a degree of subjective scientific perception and may contain much ‘community’ or ‘personal’ knowledge. For example, the very choice of model variables and model processes that are investigated are often based upon subjective judgment and experience of the modeling community.”

In that same report the IPCC further admits, “In climate research and modeling, we should realize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.” Here, the IPCC openly acknowledges that its models should not be trusted. Still, the IPCC obviously needs to apply them to justify its budget and influence. Without contrived, frightening forecasts, they would soon be out of business.

So in the IPCC’s most recent 2007 report the story changed significantly, placing “great confidence” in the ability of General Circulation Models (GCMs) to responsibly attribute observed climate change to anthropogenic (man-made) greenhouse gas emissions. It states that “climate models are based on well-established physical principles and have been demonstrated to reproduce observed features of recent climate and past changes.”

Yet even Kevin Trenberth, a lead author of 2001 and 2007 IPCC report chapters, has admitted that the IPCC models have failed to duplicate realities. Writing in a 2007 “Predictions of Climate” blog post on the science journal Nature’s website, he stated, “None of the models used by the IPCC are initialized to the observed state and none of the climate states in the models correspond even remotely to the current observed state.”

Syun-Ichi Akasofu, the former director of the International Arctic Research Center at the University of Alaska-Fairbanks, has determined that IPCC computer models have not even been able to duplicate observed temperatures in Arctic regions. While the CO2-driven forecasts did indicate warm Arctic conditions, the predicted temperatures were lower than those actually reported, and observed colder areas were absent from the model output. Akasofu stated, “If fourteen GCMs cannot reproduce prominent warming in the continental Arctic, perhaps much of this warming is not produced by greenhouse effect at all.”

Graeme Stephens at Colorado State University’s Department of Atmospheric Science warned in a 2008 paper published in the Journal of Climate that computer models involve simplistic cloud feedback descriptions: “Much more detail on the system and its assumptions [is] needed to judge the value of any study. Thus, we are led to conclude that the diagnostic tools currently in use by the climate community to study feedback, at least as implemented, are problematic and immature and generally cannot be verified using observations.”

The prominent, late scientist Joanne Simpson developed some of the first mathematical models of clouds in an attempt to better understand how hurricanes draw power from warm seas. Ranked as one of the world’s top meteorologists, she believed that global warming theorists place entirely too much emphasis upon faulty climate models, observing, “We all know the frailty of models concerning the air-surface system. We only need to watch the weather forecasts [to prove this].”

A recent study reported in the peer-reviewed science journal Remote Sensing concludes that NASA satellite data between 2000 and 2001 indicate that GCMs have grossly exaggerated warming retained in the Earth’s atmosphere. The study’s co-author, Dr. Roy Spencer, observes: “There is a huge discrepancy between the data and the forecasts that is especially big over the oceans. Not only does the atmosphere release more energy than previously thought, it starts releasing it earlier in the warming cycle.”

Spencer, a principal research scientist at the University of Alabama-Huntsville and former senior scientist for climate studies at NASA, has also observed that results of the one or two dozen climate modeling groups around the world often reflect a common bias. One reason is that many of these modeling programs are based upon the same “parameterization” assumptions; consequently, common errors are likely to be systematic, often missing important processes. Such problems arise because basic components and dynamics of the climate system aren’t understood well enough on either theoretical or observational grounds to even put into the models. Instead, the models focus upon those factors and relationships that are most familiar, ignoring others altogether. As Spencer notes in his book Climate Confusion, “Scientists don’t like to talk about that because we can’t study things we don’t know about.”

A peer-reviewed climate study that appeared in the July 23, 2009 edition of Geophysical Research Letters went even further in its characterization of faulty climate modeling practices. The paper noted IPCC modeling tendencies to fudge climate projections by exaggerating CO2 influences and underestimating the importance of shifts in ocean conditions. The research indicated that influences of solar changes and intermittent volcanic activity have accounted for at least 80% of observed climate variation over the past half century. Study coauthor John McLean observed: “When climate models failed to retrospectively produce the temperatures since 1950, the modelers added some estimated influences of carbon dioxide to make up the shortfall.” He also highlighted the inability of computer models to predict the El Nino ocean events that can periodically dominate regional climate conditions, further reducing the models’ meaningfulness.

J. Scott Armstrong, a professor at the University of Pennsylvania’s Wharton School, and a leading expert in the field of professional forecasting, believes that prediction attempts are virtually doomed when scientists don’t understand or follow basic forecasting rules. He and colleague Kesten Green of Monash University conducted a “forecasting audit” of the 2007 IPCC report and “found no references to the primary sources of information on forecasting methods” and that “the forecasting procedures that were described [in sufficient detail to be evaluated] violated 72 principles. Many of the violations were, by themselves, critical”.

A fundamental principle that the IPCC violated was to “make sure forecasts are independent of politics”. Armstrong and Green observed that “the IPCC process is directed by non-scientists who have policy objectives and who believe that anthropogenic global warming is real and a danger.” They concluded: “The forecasts in the report were not the outcome of scientific procedures. In effect, they were the opinions of scientists transformed by mathematics and obscured by complex writing. We have not been able to identify any scientific forecasts of global warming. Claims that the Earth will get warmer have no more credence than saying it will get colder.”

Trenberth argued in his 2007 Nature blog that “the IPCC does not make forecasts”, but “instead proffers ‘what if’ projections that correspond to certain emission scenarios”; and then hopes these “projections will guide policy and decision makers.” He went on to say: “there are no such predictions [in the IPCC reports] although the projections given by the Intergovernmental Panel on Climate Change (IPCC) are often treated as such. The distinction is important”.

Armstrong and Green challenge that semantic defense, pointing out that “the word ‘forecast’ and its derivatives occurred 37 times, and ‘predict’ and its derivatives occurred 90 times in the body of Chapter 8” of the IPCC’s 2007 Working Group I report.

Of course there would be very little interest in model forecasts at all if it were not for hysterical hype about a purported man-made climate crisis caused by carbon dioxide fossil fuel emissions. Without CO2 greenhouse gas demonization there is no basis for cap-and-tax schemes, unwarranted “green” fuel subsidies, expansion of government regulatory authority over energy production and construction industries through unintended misapplications of the Clean Air Act, claims of polar bear endangerment to prevent drilling in ANWR, or justifications for massive climate research budgets, including, guess what? Yup! Lots of money to produce more climate model forecasts that perpetuate these agendas.

SOURCE




Solar furnaces are not cheap and use lots of water

Concentrating solar has promised big additions to renewable energy production with the additional benefit of energy storage -- saving sun power for nighttime -- but there's a catch. Most of the new power plants are big water users despite being planned for desert locations.

With solar photovoltaic (PV) prices dropping so rapidly, does concentrating solar still make sense?

Concentrating solar thermal power uses big mirrors to focus sunlight and make electricity. Think kids with magnifying glasses, but making power instead of frying ants. The focused sunlight makes heat, the heat makes steam, and the steam powers a turbine to make electricity. In "wet-cooled" concentrating solar power plants, more water is used to make power than in any other kind of power plant. The following chart illustrates the amount of water used to produce power from various technologies:

Water consumption can be cut dramatically by using "dry-cooling," but this change increases the cost per kilowatt-hour (kWh) of power generated from concentrating solar power (CSP). In the 2009 report "Juice from Concentrate," the World Resources Institute reports that the reduction in water consumption adds 2-10 percent to levelized costs and reduces the power plant's efficiency by up to 5 percent.

Let's see how that changes the Institute for Local Self-Reliance's original levelized cost comparison between CSP and solar PV. Here's the original chart comparing PV projects to CSP projects, with no discussion of water use or energy storage:

To make the comparison tighter, we'll hypothetically transform the CSP plants from wet-cooled to dry-cooled, adjusting the levelized cost of power.

Using the midpoint of each estimate from "Juice from Concentrate" (6 percent increase to levelized costs and 2.5 percent efficiency reduction), the change in the cost per kWh for dry-cooling instead of wet-cooling is small but significant. For example, all three concentrating solar power projects listed in the chart are wet-cooled power plants. With a 6 percent increase in costs from dry-cooling and a 2.5 percent reduction in efficiency, the delivered cost of electricity would rise by approximately 1.7 cents per kWh.
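The arithmetic behind that 1.7-cent figure can be sketched in a few lines. The base wet-cooled cost used here is a hypothetical placeholder (the article's chart values aren't reproduced in this text); the 6 percent cost increase and 2.5 percent efficiency reduction are the midpoints quoted from "Juice from Concentrate."

```python
# Dry-cooling adjustment to a levelized cost of energy (LCOE):
# costs rise by ~6% while output falls by ~2.5%, so the cost per kWh
# is scaled by (1 + 0.06) / (1 - 0.025).

def dry_cooled_lcoe(wet_lcoe_cents, cost_increase=0.06, efficiency_loss=0.025):
    """Adjust a wet-cooled LCOE (cents/kWh) for dry cooling."""
    return wet_lcoe_cents * (1 + cost_increase) / (1 - efficiency_loss)

wet = 19.5  # hypothetical wet-cooled CSP cost, cents/kWh (illustrative only)
dry = dry_cooled_lcoe(wet)
print(f"Dry-cooled LCOE: {dry:.1f} c/kWh (+{dry - wet:.1f} c/kWh)")
```

At a base cost near 19.5 cents per kWh, the combined adjustment adds roughly the 1.7 cents per kWh cited above; a cheaper or costlier base plant would shift that delta proportionally.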

With the increased costs to reduce water consumption, CSP's price is much less competitive with PV. A distributed solar PV program by Southern California Edison has projected levelized costs of 17 cents per kWh for 1-2 megawatt solar arrays, and a group purchase program for residential solar in Los Angeles has a levelized cost of just 20 cents per kWh.

In other words, while wet-cooled CSP already struggles to compete with low-cost, distributed PV, using dry-cooling technology makes residential-scale PV competitive with CSP.

But there's one more piece: storage.

While Nevada Solar One was built without storage, the PS10 and PS20 solar towers were built with one hour of thermal energy storage. Let's see how that changes the economics.

To keep the comparison consistent, we'll add the cost of one hour of storage to our two PV projects, a cost of approximately $0.50 per watt, or 2.4 cents per kWh. The following chart illustrates a comparison of PV to CSP, with all projects having one hour of storage (Nevada Solar One has been removed as it does not have storage):

When comparing CSP with storage (and lower water use) to PV with battery storage, we have a comparison that is remarkably similar to our first chart. Distributed PV at a commercial scale (1-2 megawatts) is still cheaper than CSP, but residential PV is more expensive.
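The $0.50-per-watt storage figure maps to about 2.4 cents per kWh only under some financing assumptions. Here's a rough sketch of one such set (the capacity factor, lifetime, and discount rate are illustrative assumptions, not values taken from the article) that reproduces the figure:

```python
# Converting a capital cost in $/W into a levelized cost in cents/kWh
# using a standard capital recovery factor (annuity) formula.

def levelized_cents_per_kwh(capex_per_watt, capacity_factor, years, rate):
    """Spread capital cost over lifetime energy output per installed watt."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    annual_kwh_per_watt = capacity_factor * 8760 / 1000  # hours/year -> kWh
    return capex_per_watt * crf / annual_kwh_per_watt * 100  # dollars -> cents

# Illustrative: 20% capacity factor, 20-year life, 5.5% discount rate
cost = levelized_cents_per_kwh(0.50, capacity_factor=0.20, years=20, rate=0.055)
print(f"Levelized storage cost: {cost:.1f} c/kWh")
```

With those assumptions the conversion lands at about 2.4 cents per kWh; different financing terms would move the result up or down.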

Even though dry-cooled CSP competes favorably on price, it still uses much more water than PV. That issue is probably why many solar project developers are switching from CSP to PV technology for their large-scale desert projects.

Concentrating solar thermal power had its moment of cost advantage a few years ago, but the rapid price decline (and zero water use) of solar PV has quickly eroded even the energy storage advantage of CSP.

SOURCE (See the original for links and graphics)






The Dubious Science Of The Climate Crusaders

By William Happer (William Happer is the Cyrus Fogg Brackett Professor of Physics at Princeton University)

"The object of the Author in the following pages has been to collect the most remarkable instances of those moral epidemics which have been excited, sometimes by one cause and sometimes by another, and to show how easily the masses have been led astray, and how imitative and gregarious men are, even in their infatuations and crimes,” wrote Charles Mackay in the preface to the first edition of his Extraordinary Popular Delusions and the Madness of Crowds.

I want to discuss a contemporary moral epidemic: the notion that increasing atmospheric concentrations of greenhouse gases, notably carbon dioxide, will have disastrous consequences for mankind and for the planet. The “climate crusade” is one characterized by true believers, opportunists, cynics, money-hungry governments, manipulators of various types—even children’s crusades—all based on contested science and dubious claims.

I am a strong supporter of a clean environment. We need to be vigilant to keep our land, air, and waters free of real pollution, particulates, heavy metals, and pathogens, but carbon dioxide (CO2) is not one of these pollutants. Carbon is the stuff of life. Our bodies are made of carbon. A normal human exhales around 1 kg of CO2 (the simplest chemically stable molecule of carbon in the earth’s atmosphere) per day. Before the industrial period, the concentration of CO2 in the atmosphere was 270 ppm. At the present time, the concentration is about 390 ppm, 0.039 percent of all atmospheric molecules and less than 1 percent of that in our breath. About fifty million years ago, a brief moment in the long history of life on earth, geological evidence indicates that CO2 levels were several thousand ppm, much higher than now. And life flourished abundantly.
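The percentages quoted above are easy to verify. A minimal check, assuming a typical exhaled-breath CO2 concentration of about 40,000 ppm (a standard physiology figure, not stated in the article):

```python
# Checking the concentration figures: 390 ppm as a percentage of all
# atmospheric molecules, and as a fraction of the CO2 in exhaled breath.

def ppm_to_percent(ppm):
    """Convert parts-per-million to percent."""
    return ppm / 1e6 * 100

atmospheric_ppm = 390
exhaled_ppm = 40_000  # assumed typical CO2 level in human breath

print(f"390 ppm = {ppm_to_percent(atmospheric_ppm):.3f}% of atmospheric molecules")
print(f"Atmospheric CO2 is {atmospheric_ppm / exhaled_ppm * 100:.2f}% of the breath level")
```

Both results match the text: 0.039 percent of the atmosphere, and just under 1 percent of the breath concentration.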

Now the Environmental Protection Agency wants to regulate atmospheric CO2 as a “pollutant.” According to my Webster’s New Collegiate Dictionary, to pollute is “to make or render unclean, to defile, to desecrate, to profane.” By breathing are we rendering the air unclean, defiling or desecrating it? Efforts are underway to remedy the old-fashioned, restrictive definition of pollution. The current Wikipedia entry on air pollution, for example, now asserts that pollution includes: “carbon dioxide (CO2)—a colorless, odorless, non-toxic greenhouse gas associated with ocean acidification, emitted from sources such as combustion, cement production, and respiration.”

As far as green plants are concerned, CO2 is not a pollutant, but part of their daily bread—like water, sunlight, nitrogen, and other essential elements. Most green plants evolved at CO2 levels of several thousand ppm, many times higher than now. Plants grow better and have better flowers and fruit at higher levels. Commercial greenhouse operators recognize this when they artificially increase the concentrations inside their greenhouses to over 1000 ppm.

Wallis Simpson, the woman for whom King Edward VIII renounced the British throne, supposedly said, “A woman can’t be too rich or too thin.” But in reality, you can get too much or too little of a good thing. Whether we should be glad or worried about increasing levels of CO2 depends on quantitative numbers, not just qualitative considerations.

How close is the current atmosphere to the upper or lower limit for CO2? Did we have just the right concentration at the preindustrial level of 270 ppm? Reading breathless media reports about CO2 “pollution” and about minimizing our carbon footprints, one might think that the earth cannot have too little CO2, as Simpson thought one couldn’t be too thin—a view which was also overstated, as we have seen from the sad effects of anorexia in so many young women. Various geo-engineering schemes are being discussed for scrubbing CO2 from the air and cleansing the atmosphere of the “pollutant.” There is no lower limit for human beings, but there is for human life. We would be perfectly healthy in a world with little or no atmospheric CO2—except that we would have nothing to eat and a few other minor inconveniences, because most plants stop growing if the levels drop much below 150 ppm. If we want to continue to be fed and clothed by the products of green plants, we can have too little CO2.

The minimum acceptable value for plants is not that much below the 270 ppm preindustrial value. It is possible that this is not enough, that we are better off with our current level, and would be better off with more still. There is evidence that California orange groves are about 30 percent more productive today than they were 150 years ago because of the increase of atmospheric CO2.

Although human beings and many other animals would do well with no CO2 at all in the air, there is an upper limit that we can tolerate. Inhaling air with a concentration of a few percent, similar to the concentration of the air we exhale, hinders the diffusional exchange of CO2 between the blood and gas in the lung. Both the United States Navy (for submariners) and NASA (for astronauts) have performed extensive studies of human tolerance to CO2. As a result of these studies, the Navy recommends an upper limit of about 8000 ppm for cruises of ninety days, and NASA recommends an upper limit of 5000 ppm for missions of one thousand days, both assuming a total pressure of one atmosphere. Higher levels are acceptable for missions of only a few days.

We conclude that atmospheric CO2 levels should be above 150 ppm to avoid harming green plants and below about 5000 ppm to avoid harming people. That is a very wide range, and our atmosphere is much closer to the lower end than to the upper end. The current rate of burning fossil fuels adds about 2 ppm per year to the atmosphere, so that getting from the current level to 1000 ppm would take about 300 years—and 1000 ppm is still less than what most plants would prefer, and much less than either the NASA or the Navy limit for human beings.
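The 300-year figure follows directly from the stated numbers; a quick check, taking the article's 2 ppm per year as a constant rate:

```python
# At a constant ~2 ppm/year of added CO2, time to go from the current
# 390 ppm to a given target concentration.

def years_to_reach(target_ppm, current_ppm=390, ppm_per_year=2):
    """Years of emissions at a fixed rate to reach target_ppm."""
    return (target_ppm - current_ppm) / ppm_per_year

print(f"Years from 390 ppm to 1000 ppm: {years_to_reach(1000):.0f}")  # ~305
```

The exact result is 305 years, which the text rounds to "about 300."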

Yet there are strident calls for immediately stopping further increases in CO2 levels and reducing the current level. As we have discussed, animals would not even notice a doubling of CO2, and plants would love it. The supposed reason for limiting it is to stop global warming—or, since the observed warming has been nowhere near as large as computer models forecast, to stop climate change. Climate change itself has been embarrassingly uneventful, so another rationale for reducing CO2 is now promoted: to stop the hypothetical increase of extreme climate events like hurricanes or tornados. But this does not follow from the record. The frequency of extreme events has either not changed or has decreased in the 150 years that CO2 levels have increased from 270 to 390 ppm.

There have been many warmings and coolings in the past when the CO2 levels did not change. A well-known example is the medieval warming, about the year 1000, when the Vikings settled Greenland (when it was green) and wine was exported from England. This warm period was followed by the “little ice age” when the Thames would frequently freeze over during the winter. There is no evidence for significant increase of CO2 in the medieval warm period, nor for a significant decrease at the time of the subsequent little ice age. Documented famines with millions of deaths occurred during the little ice age because the cold weather killed the crops. Since the end of the little ice age, the earth has been warming in fits and starts, and humanity’s quality of life has improved accordingly.

A rare case of good correlation between CO2 levels and temperature is provided by ice-core records of the cycles of glacial and interglacial periods of the last million years or so. But these records show that changes in temperature preceded changes in CO2 levels, so that the levels were an effect of temperature changes. This was probably due to outgassing of CO2 from the warming oceans and the reverse effect when they cooled.

The most recent continental ice sheets began to melt some twenty thousand years ago. During the “Younger Dryas,” some 12,000 years ago, the earth cooled and then warmed dramatically, by as much as 10 degrees Celsius within about fifty years.

The earth’s climate has always been changing. Our present global warming is not at all unusual by the standards of geological history, and it is probably benefiting the biosphere. Indeed, there is very little correlation between the estimates of CO2 and of the earth’s temperature over the past 550 million years (the “Phanerozoic” eon). The message is clear: several factors must influence the earth’s temperature, and while CO2 is one of these factors, it is seldom the dominant one. The other factors are not well understood. Plausible candidates are spontaneous variations of the complicated fluid flow patterns in the oceans and atmosphere of the earth—perhaps influenced by continental drift, volcanoes, variations of the earth’s orbital parameters (ellipticity, spin-axis orientation, etc.), asteroid and comet impacts, variations in the sun’s output (not only the visible radiation but the amount of ultraviolet light, and the solar wind with its magnetic field), variations in cosmic rays leading to variations in cloud cover, and other causes.

The existence of the little ice age and the medieval warm period was an embarrassment to the global-warming establishment, because it showed that the current warming is almost indistinguishable from previous warmings and coolings that had nothing to do with burning fossil fuel. The organization charged with producing scientific support for the climate change crusade, the Intergovernmental Panel on Climate Change (IPCC), finally found a solution. They rewrote the climate history of the past 1000 years with the celebrated “hockey stick” temperature record.

The first IPCC report, issued in 1990, showed both the medieval warm period and the little ice age very clearly. In the IPCC’s 2001 report was a graph that purported to show the earth’s mean temperature since the year 1000. A yet more extreme version of the hockey stick graph made the cover of the Fiftieth Anniversary Report of the United Nations’ World Meteorological Organization. To the surprise of everyone who knew about the strong evidence for the little ice age and the medieval climate optimum, the graph showed a nearly constant temperature from the year 1000 until about 150 years ago, when the temperature began to rise abruptly like the blade of a hockey stick. The inference was that this was due to the anthropogenic “pollutant” CO2.

In this damnatio memoriae, inconvenient facts were simply expunged from the 2001 IPCC report, much as Trotsky and Yezhov were removed from Stalin’s photographs by dark-room specialists in the later years of the dictator’s reign. There was no explanation of why both the medieval warm period and the little ice age, very clearly shown in the 1990 report, had simply disappeared eleven years later.

More HERE

***************************************

For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For readers in China or for times when blogger.com is playing up, there are mirrors of this site here and here

*****************************************

1 comment:

slktac said...

Concerning aliens: There is an equal probability that lack of CO2 could indicate we have no ability to fight back. Has no one heard of "terra forming" (doesn't matter what Earth looks like then)? And a lot of CO2 might indicate we are too dumb a race to bother with. There are hundreds of possibilities--these guys chose the one that would get them the most attention. Maybe the aliens will come for them!