Sunday, December 23, 2007

The Hebrew University debate on Global Warming

By Nir J. Shaviv (D.Sc.)

On Sunday last week, a global warming debate was held at the Hebrew University in front of a large public audience. The speakers included myself and Prof. Nathan Paldor from the HU on the so-called sceptic side, and Prof. Dan Yakir (Weizmann) and Prof. Colin Price (Tel-Aviv Univ.) on the anthropogenic greenhouse gas (AGHG) side...

Although it was called a debate, it wasn't really one. It included 4 short presentations (about 12 mins each + 3 min for clarifying questions) and then another 45 mins of questions from the audience.

In my short presentation, I stressed a few major points. First, there are no fingerprints proving that 20th century warming is necessarily human-caused. Second, once you check the details, you find notable inconsistencies. In particular, the AGHG theory predicts warming over the whole troposphere, while in reality only the ground appears to have warmed over the past few decades, and Earth's climate response to volcanic eruptions is significantly smaller than computer models predict, because these models tend to have an exaggerated climate sensitivity. Third, the only reason the warming can be attributed to humans is that there is allegedly nothing else to blame. But there is: the increasing activity of the sun. I then quickly showed some of the evidence that the sun affects climate through the cosmic ray-climate link.

The second speaker was Prof. Dan Yakir. He started by saying that there is really no place or need to hold such debates anymore, since the vast majority of scientists believe that the warming is anthropogenic. He mentioned Gore's Nobel prize (yes, committees now decide scientific truths), Oreskes' finding that out of about 1,000 papers none contradicted anthropogenic global warming, and so on. He then attempted to debunk the cosmic ray-climate theory; some of his claims were supposed inconsistencies in the theory, and some were simply non-scientific arguments. Since I was not given a chance to address these claims, my response to each and every point raised can be found below.

The third speaker was Prof. Nathan Paldor. He emphasized the large uncertainties in our current understanding of climate systems. One such example is global dimming. Because of these large uncertainties, computer-based modelling of 20th century warming, or prediction of future climate change, is mostly pointless at this time. He mentioned the 1970s, when scientists urged Nixon to prepare the US for the coming ice age, especially considering that the Soviets were better prepared for it!

The fourth speaker was Prof. Price, who emphasized the agreement between the computer model predictions of the AGHG theory and the observations. He showed, for example, that computer models trying to reproduce 20th century warming with only natural radiative forcings cannot explain the observed temperature trend, while models with the anthropogenic contributions added can. He then went on to try to debunk the cosmic ray-climate theory, mentioning several inconsistencies in the cosmic ray-climate link (at least, so he supposed). He also argued that the cosmic ray-climate mechanism rests on many links, some of which he doubted.

Before addressing the critiques, let me add that neither Yakir nor Price brought any evidence proving the standard anthropogenic scenario. Yakir did not attempt to prove anything (no need, apparently, since the majority of scientists support it anyway), and Price did bring supporting evidence, but evidence that in reality proves nothing about the validity of the AGHG theory.

The main claim raised by Price was that when computer models are used to fit 20th century warming, they do a very lousy job if you include only the natural forcings, but a wonderful job if you include the anthropogenic forcing as well. This supposedly implies that one needs the large anthropogenic contribution to explain the warming. The key point here is that the "natural forcings" include only the known forcings, not the unknown ones, and specifically not the large indirect solar/climate link, which these models fail to include because the modelers bluntly neglect this mechanism. More about it here.

Although Yakir and Price did have a chance to address the critiques I raised of the AGHG theory (I was given no such chance with theirs), they chose not to.

Much more here -- including lots of lovely graphs.






Strange Greenie "science"

When I was a member of Greenpeace in the 1980s I received a request for money supported by the claim that about 30,000 species each year were becoming extinct. Until then I'd been an unsceptical environmentalist, but this sounded like an awful lot, so I called Greenpeace to ask how they knew. I made several queries but they didn't seem very interested. Finally they told me they didn't know where the figure came from, and I resigned from the organisation.

I later found the figure almost certainly came from the work of the biologist Edward Wilson, originally an expert on ants. Wilson made his name in the area of conservation biology in the 1960s when he proposed a mathematical model that could be used to calculate species loss due to habitat destruction. Based on this and his invention of the concept of "biodiversity", he later announced the world was experiencing "one of the great extinction spasms of geological history" and losing up to 100,000 species a year. Wilson's claims are one of the mainstays of the modern environmental movement, and a foundation of government environmental policies around the globe.

This experience with Greenpeace gave me a long-running interest in the way much environmental science involves mathematical formulas or computer models. The most famous recent examples of these are the "general circulation models" used to produce predictions of future climatic conditions. An important book has just been published by an Australian academic that raises the question of whether this should be regarded as science at all.

The book is Science And Public Policy (Edward Elgar Publishing), and the author is Professor Aynsley Kellow, the head of the school of government at the University of Tasmania. Kellow believes that environmental science has often been corrupted by the good intentions of its practitioners, so that it consists of wishful thinking rather than facts and provable theories. Perhaps the first big case of this was the notorious Limits To Growth study published by the Club of Rome in 1972, based on computer modelling and subsequently disproved. One might expect the quality of models to improve, but since then they have been used for all sorts of predictions, and there is little evidence they have got much better.

Despite this, the predictions made by such models are now contained in scientific papers published in leading journals, which gives the status of science to what is often little more than wishful thinking. Kellow describes one paper published in the journal Nature in January 2004 that "warned of the loss of thousands of species with a relatively small warming over the next century. But just how virtual was this science is apparent when we consider that the estimates of species loss depended upon a mathematical model linking species and area; modelled changes in the ... distributions of areas of habitat depended in turn upon the results of climate models tuned to reflect climate changes as a result of increasing greenhouse gases ... these in turn were driven by scenarios of what [such] emissions might look like over the next century, driven in turn by economic models." Kellow notes that a similar warming over the previous century had not left anything like the trail of species devastation being proposed in the paper, yet this observational data was considered irrelevant compared with the virtual world of the models.

The widespread concern over climate change is based substantially on calculations similar to the one just described. Is this a problem? Kellow thinks it is, because virtual science is ripe for manipulation, usually unconsciously, by virtuous scientists. Few people are aware of the large element of subjectivity, not only in the design of immensely complicated general circulation models, but in the data that goes into them. Even basic information such as contemporary temperatures is often incomplete or uncertain and tweaked by those who operate the models.

An interesting article on this appeared on the BBC website last month. The author is Dr John Christy, a professor of atmospheric science at the University of Alabama and a climate expert. He runs one of the two dozen or so general circulation models in the world. He says the only way to test models is to compare their predictions with outcomes "not known ahead of time", and when he has done that he has found "gross inconsistencies - hence I am sceptical of our ability to claim cause and effect about both past and future climate states".

It is not just in the field of climate change that modelling produces wildly over-pessimistic projections. Writing in the latest IPA Review, the biologist Dr Jennifer Marohasy, a senior fellow at the Institute of Public Affairs, notes that "in the lead-up to the 2001 federal election, the National Farmers' Federation and the Australian Conservation Foundation joined forces to lobby the government for $65 billion on the premise that vast areas of farmland were in ruin and salinity was spreading". But in 2005 Dr Wendy Craik, the head of the Murray-Darling Basin Commission, admitted publicly that flawed models had been used to talk up the salinity threat.

We often hear that the predictions accepted by the Intergovernmental Panel on Climate Change are based on "the science". It's important to realise that this is often a very different type of science from the sort that explains why a jumbo jet won't fall out of the sky or why a certain treatment will cure a certain disease.

Source





Sunshine duration accounts for 93% of all warming since 1951

Post below lifted from Gust of Hot air. See the original for links

Abstract: Using twenty-two weather stations across Australia, the variable sunshine duration is shown to have significantly increased since 1951. Its correlation with maximum temperature anomalies is highly statistically significant. By eliminating the influence of sunshine duration from the maximum temperature dataset, maximum temperature trends were shown to drop from an average increase of 1.4 degrees per 100 years to 0.1 degrees per 100 years. Hence the variable sunshine duration accounts for 93% of all positive trends in maximum temperature since 1951 in Australia. Implications of these findings, and the relationship of the variable sunshine duration to cloud cover trends and how it is measured, will be discussed.

Our introduction on Tuesday explained that we intend to look at the variable sunshine duration to see whether it has any effect on temperature change over the years.

Using our dataset we found a highly significant increase in maximum temperatures (t = 5.95, p < 0.001). Perhaps because we used urban stations, or perhaps because the bulk of the weather stations are on the east coast of Australia (the area which has seen the majority of the increase), the rate of temperature increase shown on that graph (linked) is 1.42 degrees per 100 years, which is greater than for Australia as a whole and for the rest of the world.

Either way, this doesn't matter, as we are merely looking at the relationship between sunshine duration and temperature. Because some stations have maximum temperature data that goes back further than their sunshine duration data (and vice versa), all years that did not have recordings for both sunshine duration and maximum temperature were eliminated from the dataset for each individual station.

Interestingly, sunshine duration has also increased significantly since 1951 (t = 2.58, p = 0.013). The trend is not as strong as for temperature, but it is still statistically significant.
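For readers who want to see what such a trend test looks like in practice, here is a minimal sketch (not the author's actual code; the synthetic anomaly series below are assumptions made purely so the example runs): an ordinary least-squares fit of the yearly anomalies against year, with the t-statistic taken as the slope divided by its standard error.

    import numpy as np
    from scipy import stats

    # Hypothetical yearly anomaly series, 1951-2007, standing in for the
    # station-averaged maximum-temperature and sunshine-duration anomalies.
    rng = np.random.default_rng(42)
    years = np.arange(1951, 2008)
    temp_anom = 0.014 * (years - 1951) + rng.normal(0.0, 0.3, years.size)
    sun_anom = 0.005 * (years - 1951) + rng.normal(0.0, 0.2, years.size)

    def trend_test(y, x):
        """OLS trend: slope per year, t-statistic and p-value."""
        fit = stats.linregress(x, y)
        return fit.slope, fit.slope / fit.stderr, fit.pvalue

    for name, series in (("max temperature", temp_anom), ("sunshine duration", sun_anom)):
        slope, t, p = trend_test(series, years)
        print(f"{name}: {slope * 100:+.2f} deg per 100 yr, t = {t:.2f}, p = {p:.4f}")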



The two variables are plotted together on the graph above. Note that, in general, when temperatures are high, so too is sunshine duration, and vice versa. The last six years of data highlight this.

The relationship between maximum temperature per year per station and sunshine duration per year per station is shown below. The correlation between them is highly significant (t = 14.71, p < 0.001), and the r-squared indicates that 17.5% of the variance in temperature can be explained by sunshine duration.

That might not sound like much, but when we account for the variable sunshine duration (i.e. subtract its fitted relationship with temperature from the original dataset), we can analyse temperature without any influence of sunshine duration. In other words, we can look at temperature trends over the past 50 years as if there had been no trend and no anomalies in sunshine duration at all.
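In practice this "accounting for" step is a simple residualisation: regress temperature on sunshine duration, subtract the fitted part, and then take the trend of what remains. A rough sketch of that procedure (again with made-up data; the real analysis used the matched station-year values):

    import numpy as np
    from scipy import stats

    # Made-up paired yearly anomalies standing in for the matched
    # station-year maximum-temperature and sunshine-duration values.
    rng = np.random.default_rng(0)
    years = np.arange(1951, 2008)
    sun_anom = 0.005 * (years - 1951) + rng.normal(0.0, 0.2, years.size)
    temp_anom = 0.6 * sun_anom + 0.001 * (years - 1951) + rng.normal(0.0, 0.2, years.size)

    # Step 1: fit temperature against sunshine duration.
    fit = stats.linregress(sun_anom, temp_anom)

    # Step 2: subtract the sunshine-related part, leaving residual temperature.
    temp_residual = temp_anom - (fit.intercept + fit.slope * sun_anom)

    # Step 3: the trend of the residuals is the warming that is not
    # attributable to changes in sunshine duration.
    raw = stats.linregress(years, temp_anom)
    adj = stats.linregress(years, temp_residual)
    print(f"raw trend:      {raw.slope * 100:.2f} deg per 100 yr")
    print(f"adjusted trend: {adj.slope * 100:.2f} deg per 100 yr")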

The results are amazing. The following graph shows the temperature trend since 1951 as it would be were there no variation in sunshine duration. The increase in temperature since 1951 still occurs and is still statistically significant (t = 5.8, p < 0.001), but take a look at the rate of change of temperature, in particular the formula for the line of best fit as well as the left hand axis.



When taking sunshine duration into account, the temperature rise in Australia is at the rate of 0.00099 degrees per year, or 0.099 degrees per 100 years. Now a 0.1 degree increase every 100 years is hardly anything to get worried about. It's not going to cause any great catastrophe. So we've gone from 1.4 degrees of warming per 100 years to 0.1 degrees of warming per 100 years. The variable sunshine duration has accounted for 93% (1.3/1.4) of all the warming trend that we have seen since 1951.
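Spelled out, the arithmetic behind that 93% figure (using the rounded trends quoted above) is simply:

    # Trends quoted in the post, in degrees C per 100 years.
    raw_trend = 1.42        # before adjusting for sunshine duration
    adjusted_trend = 0.099  # after removing the sunshine-duration effect

    explained = (raw_trend - adjusted_trend) / raw_trend
    print(f"share of the trend attributable to sunshine duration: {explained:.0%}")  # ~93%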

So the warming that we are seeing is by and large highly correlated with sunshine duration. Does this mean that clouds are the major cause of global warming? Well, probably not. In order to discuss why the variable "sunshine duration" has such a major effect on temperature change, we have to look into how it is measured and at the trends in cloud cover over Australia. That will be in the next post.







Beyond Bali: Fight Global Warming by Dumping Kyoto

Even assuming that man-caused warming is real, Kyoto is not a sane response to it

Last week at the UN's global warming meeting in Indonesia, activists in polar bear costumes passed out huge pieces of cake. They were celebrating the 10th anniversary of the Kyoto Protocol, a treaty aimed at cutting greenhouse gas emissions. I wonder if they understand how their obsessive focus on Kyoto as the "only solution" hinders progress.

Kyoto is both a technical and a political failure. (If fully implemented, Kyoto would reduce global temperature by only 0.03 degrees Celsius.) Activists demanded that the U.S. sign Kyoto, but it won't. Why? Because it is a terrible deal. The U.S. would have had to bear up to two-thirds, or more, of the cost of Kyoto, likely more than all other nations combined.

A new approach is required. Gwyn Prins and Steve Rayner outline one in a new paper, The Wrong Trousers: Radically Rethinking Climate Policy. The authors believe the threats of climate change are real and that action is warranted.

But they make explicit what every serious observer has known for quite some time: "The Kyoto Protocol is a symbolically important expression of...concern about climate change. But as an instrument for achieving emissions reductions, it has failed. It has produced no demonstrable reductions in emissions or even in anticipated emissions growth."

"The politically charged rhetoric within which the climate change question is discussed means that anyone who questions...reduction goals...is regarded with suspicion. Unquestioning support for...Kyoto...has become a litmus test for determining who takes the threat of climate change seriously."

"...[Kyoto's] narrow focus on mitigating the emission of greenhouse gases (in which it has failed) has created a taboo on discussing other approaches, in particular, adaptation to climate change. ...For the past fifteen years, it has given the...public an illusion of effective action, tranquillizing political concern. This has been, perhaps, its most damaging legacy." (Read the whole thing on FREE's website.)

The most vulnerable countries are those that depend on agriculture and are at low latitudes. They need to be more resilient to climate change today.

But reducing carbon emissions will not fix poor land-use practices, restore degraded local environments, improve emergency preparedness, or eliminate floods, droughts, or disease outbreaks.

Again Prins and Rayner: "Many...activists assume that slowing greenhouse-gas emissions has logical and ethical priority over adapting to climate impacts.... It is not clear to us that the interests of millions of people in poorer countries who depend on marginal ecosystems are best served by an exclusive preoccupation with mitigation. Indeed, such a narrow focus is likely to be a fatal error."

Reducing global carbon dioxide emissions fast enough and far enough to avoid "dangerous human interference in the climate system" (defined as an increase in temperature of about 2 degrees Celsius) requires an unprecedented transformation of our energy systems. For example, to cut global carbon emissions in half by 2050 requires that, on average, the world economy in the middle of the century will have the same carbon intensity as Switzerland had in 2004.

China and India are making huge investments in energy infrastructure that has a lifetime of 50 years or more. Their fuel of choice is coal. We will not achieve meaningful carbon reductions unless we can develop the technology to capture and safely store emissions from coal-fired power plants, and that will require huge R&D investment in this emerging technology.

Another area for investment is to develop storage for wind and solar energy. Wind and solar are growing energy sources, but their usefulness will be limited to a niche role unless economical, large-scale storage can be developed. This is a basic science question that has proven difficult to crack.

Addressing climate change requires international cooperation for a simple reason: it is a global problem. No nation is going to sign an agreement whose costs greatly outweigh the benefits. Permanent solutions require the discovery and adoption of new technologies. The next U.S. President should make this explicit. Constructive proposals include transferring efficient energy technologies to developing countries. This will be both cheaper and more effective than continuing to push the Kyoto Protocol.

Source







USA Today Won't Take Back Claim that Fish Poison Babies

Newspaper makes outrageous claim: 600,000 babies born annually with brain damage due to fish-eating mothers. More mercury madness. Prolonged exposure to high doses of mercury does indeed cause a form of madness, but the mere mention of mercury is enough to cause madness among Greenies and their acolytes.

It's no wonder businesspeople tend to distrust the media - especially when unsubstantiated alarmist statements are made and no recourse is offered. "As many as 600,000 babies may be born in the USA each year with irreversible brain damage because pregnant mothers ate mercury-contaminated fish, the Environmental Protection Agency says," USA Today's Larry Wheeler wrote. Sounds troublesome, right?

This article in the October 29 issue of USA Today ignored the tenets of ethical journalism by advocating the position that mercury-contaminated fish were responsible for half a million babies annually being born with brain damage. But the story didn't end there. That claim and others in the article prompted the National Fisheries Institute (NFI) to react with a series of exchanges, but the results were unsatisfactory, according to Jim McCarthy, a spokesman for NFI.

When eating fish is linked to hurting babies, it is certain to get attention. But according to the National Fisheries Institute (NFI), the statement that 600,000 babies are born in the USA each year with irreversible brain damage is incorrect. "That is false - and in several respects," John Connelly, president of NFI wrote in a November 1 letter to Gannett News Service Editor Val Ellicott. "First, the EPA has never made any such assertion. One official from EPA, Kathryn Mahaffey, extrapolated that figure at EPA's Fish Forum conference in 2004. The agency itself subsequently disavowed connection with the assertion - and placed a disqualifying reference to it on their public website. What's more, there is not a single documented case of any child in this country with mercury levels above RfD [Reference Dose] - let alone with resulting brain damage."

Despite the NFI's efforts, USA Today was uncooperative and even issued an incorrect correction in its November 5 edition: "In 2005, the Environmental Protection Agency estimated that 410,000 newborns in the USA were born to mothers whose methyl mercury levels were higher than the maximum level the EPA considers safe. An Oct. 30 story cited a 2004 study that estimated the number at 600,000 newborns."

But USA Today was also wrong with that statement, according to Connelly: "The figure you cite, that 410,000 infants are exposed to mercury above the EPA reference dose, is taken from a presentation delivered by one (1) EPA scientist, not the agency itself. In fact, here is what EPA had contributed as a slide in the beginning of that presentation: `The Findings and Conclusions in This Presentation Have Not Been Formally Disseminated by U.S. EPA and Should Not Be Construed to Represent Any Agency Determination or Policy.' We are unaware of any change in EPA's position. Therefore, obviously, this is not an EPA estimate as you assert."

Click here to see the EPA slides. See slide #25 for the 410,000 figure - an assertion solely of Ms. Mahaffey. On slide #2, EPA itself officially disavows that assertion.

According to the Society of Professional Journalists Code of Ethics, journalists are supposed to:

* Distinguish between advocacy and news reporting.

* Support the open exchange of views, even views they find repugnant.

* Examine their own cultural values and avoid imposing those values on others.

But Wheeler disregarded those tenets when he included background in his October 29 article. The article, which has recently popped up in other newspapers including the December 3 Poughkeepsie (N.Y.) Journal, has many similarities to the left-wing Natural Resources Defense Council (NRDC) Web site on mercury-contaminated fish. "Physicians and public health officials acknowledge it is extremely difficult to `prove' a direct link between eating a tuna steak and high blood mercury levels," Wheeler wrote.

However, both Wheeler's article and the NRDC Web site cite the same expert, Jane Hightower, a San Francisco doctor. Hightower claimed a link exists between fish consumption and what she diagnosed as a disorder she calls "fish fog." But as a November 3 letter from Connelly to Val Ellicott, editor of Gannett News Service, indicates - there is no published evidence of such a condition. "Medical and scientific researchers have, in fact, been studying the effects of mercury for decades," Connelly wrote. "It should also be noted that there is no medical condition known as `fish fog.' It is not a term used in the medical community and there is no documented evidence that anyone has such a condition."

Although the story didn't run until October 29, Wheeler originally contacted NFI on September 17 and insisted on a response by 3 p.m. the next afternoon. "Our PR firm took contemporaneous notes when Wheeler first called on Monday, September 17th," Mary Anne Hansan wrote in a letter to Gannett News Service Managing Editor Laura Rehrmann. "He said he was `handing my editor a copy of the piece at 5pm today.' He went on to say he would fill the `hole' in his story if we responded no later than 3pm on Tuesday, the 18th." USA Today's rush to have NFI respond was peculiar - because the paper then sat on the story for 40 days before finally publishing it October 29.

Published in every issue of USA Today is a message on its editorial page pledging a "Commitment to Accuracy": "To report corrections and clarifications, contact Reader Editor Brent Jones." However, Jones, who was unwilling to comment on this particular story to the Business & Media Institute, told BMI he isn't an ombudsman. "USA TODAY doesn't have an actual ombudsman," Jones wrote in an e-mail to BMI. "But as reader editor, I am the primary link between the news organization and its print and online audiences. The job includes reviewing and sharing input from readers, selecting and publishing letters to the editor and serving on the newspaper's editorial board."

Since the fallout from a USA Today scandal involving Jack Kelley, a longtime reporter at the newspaper who in March 2004 was discovered to have been fabricating stories, the prominence of the Reader Editor was raised by Ken Paulson, editor of USA Today, according to a Dec. 10, 2004, PBS "NewsHour" case study on the Kelley-USA Today scandal. "In every edition of the paper, a photo of Reader Editor Brent Jones appears on the editorial page, soliciting feedback and concerns about USA Today's coverage," the study said. "Jones then takes that feedback into a daily meeting with editors at the paper. He told the Online NewsHour that `readers appreciate having a direct line of communication' with the editors of the paper and that the overall reaction has been `positive.'"

Source
