Wednesday, October 23, 2013

Antarctic Ice Sets New All Time Record High In October

NSIDC is now back up and running after the Federal shutdown.

Quite astonishingly, Antarctic sea ice has set another record for maximum extent, beating the previous record of 19.513 million sq km, set on 21st September this year.

Note that current levels are more than 2SDs above average, which is huge  --JR

What makes the new record so astonishing is that it was set in October, on the 1st. Climatologically, the maximum extent is reached on 22nd September, so it is most unusual for the ice still to be growing 10 days later.

As at the 18th October, extent is still running at 998,000 sq km above normal.

With the Arctic ice running at 728,000 sq km below normal, this means that global sea ice is 270,000 sq km above the 1981-2010 norm.
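As a quick check on that arithmetic, the global figure is just the sum of the two hemispheric anomalies. A trivial sketch using the numbers quoted above:

```python
# Hemispheric sea ice extent anomalies vs the 1981-2010 norm, in sq km
# (figures quoted above, as at 18th October)
antarctic_anomaly = 998_000   # Antarctic extent above normal
arctic_anomaly = -728_000     # Arctic extent below normal

global_anomaly = antarctic_anomaly + arctic_anomaly
print(global_anomaly)  # 270000
```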


2013 – a year with MINIMAL extreme weather events in the US

There have been many forecasts in the news in recent years predicting more and more extreme weather-related events in the US, but for 2013 that prediction has been way off the mark. Whether you’re talking about tornadoes, wildfires, extreme heat or hurricanes, the good news is that weather-related disasters in the US are all way down this year compared to recent years and, in some cases, down to historically low levels.


To begin with, the number of tornadoes in the US this year is on pace to be the lowest total since 2000 and it may turn out to be the lowest total in several decades. The table below lists the number of tornadoes in the US for this year (through 10/17) and also for each year going back to 2000.

Year         # of Tornadoes
2013                    771
2012                   1119
2011                   1894
2010                   1543
2009                   1305
2008                   1685
2007                   1102
2006                   1117
2005                   1262
2004                   1820
2003                   1374
2002                    938
2001                   1219
2000                   1072


Second, the number of wildfires across the US so far this year is on pace to be the lowest it has been in the past ten years and the acreage involved is at the second lowest level in that same time period (table below).

2013            Fires: 40,306           Acres: 4,152,390
2012            Fires: 67,774           Acres: 9,326,238
2011            Fires: 74,126           Acres: 8,711,367
2010            Fires: 62,471           Acres: 3,233,461
2009            Fires: 78,792           Acres: 5,921,786
2008            Fires: 80,094           Acres: 5,254,109
2007            Fires: 85,822           Acres: 9,321,326
2006            Fires: 96,358           Acres: 9,871,939
2005            Fires: 66,552           Acres: 8,686,753
2004            Fires: 63,608           Acres: 8,097,880
*2013 data through 10/16

Extreme Heat

In addition to wildfires, extreme heat is also way down across the US this year. In fact, the number of 100 degree days across the country during 2013 is not only down for this year, but it is perhaps going to turn out to be the lowest in about 100 years of records (chart below).

The five summers with the highest number of 100-degree days across the US are as follows: 1936, 1934, 1954, 1980 and 1930. In addition to the vast reduction in 100-degree days across the US this year, the number of high temperature records (i.e., record high maximums and record high minimums) is way down compared to a year ago, with 22,965 records this year as compared with 56,885 at this same time last year.


Finally, as far as hurricanes are concerned, and keeping in mind that the season isn't over yet, there have been only two hurricanes so far this year in the Atlantic Basin (Humberto and Ingrid), and both were short-lived, weak Category 1 storms. Also, the first hurricane of the year formed at the second-latest date on record going back to the mid-1940s, when hurricane hunters began to fly. Overall, the tropical season in the Atlantic Basin has been generally characterized by short-lived and weak systems.

In addition, this suppressed tropical activity has not been confined to just the Atlantic Ocean. The eastern Pacific Ocean has had no major hurricanes this season, meaning there has been no major hurricane in either the Atlantic or the eastern Pacific, something that has occurred in only one other year in recorded history: 1968. This is actually quite extraordinary, since the two basins are generally out of phase with each other, i.e., when one is inactive the other is active.

One of the best ways to measure "total seasonal activity" in the tropics is through an index called the Accumulated Cyclone Energy (ACE), a metric that accounts for both the intensity and the duration of named tropical storms. Indeed, the ACE for this tropical season so far in the Atlantic Basin is only 29 percent of normal (through 10/17) when compared to the climatological average from 1981-2010, and it is the 7th lowest since 1950. Elsewhere, the ACE across the Northern Hemisphere is only 58 percent of normal and global ACE is 62 percent of normal.
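For readers curious how ACE is actually computed: the standard definition is the sum of the squares of a storm's maximum sustained winds (in knots), taken at each six-hourly observation while the system is at tropical-storm strength or above, scaled by 10^-4. A minimal sketch, using a hypothetical wind history rather than real storm data:

```python
def ace(six_hourly_winds_kt):
    """Accumulated Cyclone Energy for one storm: sum of squared
    6-hourly maximum sustained winds (knots) while the system is at
    tropical-storm strength or above (>= 34 kt), scaled by 1e-4."""
    return sum(v ** 2 for v in six_hourly_winds_kt if v >= 34) * 1e-4

# Hypothetical wind history for a short-lived, weak Category 1 hurricane
winds_kt = [30, 35, 45, 55, 65, 70, 65, 50, 40, 30]
print(round(ace(winds_kt), 2))  # 2.37
```

A season's ACE is then the sum over all its named storms, which is why a year of short-lived, weak systems scores so low on the index.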

Finally, another interesting stat with respect to hurricanes has to do with the fact that we are currently in the longest period since the Civil War Era without a major hurricane strike in the US (i.e., category 3, 4 or 5). The last major hurricane to strike the US was Hurricane Wilma during late October of that record-breaking year of 2005 - let’s hope this historic stretch continues. By the way, just as a point of comparison, in 1954 the US was hit by 3 major hurricanes in less than 10 weeks.

SOURCE  (See the original for links and references)

Feds Will Spend $18M to Develop ‘Reliable’ Climate Change Predictions

LOL!  Good luck with that!

The National Science Foundation (NSF) and the U.S. Department of Agriculture have plans to spend up to $18 million over the next five years to develop "reliable" climate change predictions for the next few decades.

The "funding opportunity enables interagency cooperation on one of the most pressing problems of the millennium: climate change and how it is likely to affect our world," according to NSF's official request for bids. (See NSF Decadal & Regional Climate Prediction.pdf)

“This solicitation is intended to support development of reliable regional and decadal climate predictions that take into account the influences of living systems and are essential for projecting how living systems might adapt to climate change and its consequences for their physical environment,” the program solicitation explains.

Current methods of predicting future climate change have proved to be wildly inaccurate. For example, none of the 73 computer models used by the United Nation’s Intergovernmental Panel on Climate Change (IPCC) predicted that there would be no statistically significant global warming for the past 17 years as determined by actual temperature records stored in five different databases worldwide.

And despite claims that climate change caused by global warming is causing "extreme weather," a new study by the SI Organization, Inc., a systems engineering firm, ranks 2013 as “one of the least extreme U.S. weather years ever,” noting that “there has been no major hurricane in either the Atlantic or eastern Pacific, which only occurred one other year in recorded history – 1968.”

“Whether you’re talking about tornadoes, wildfires, extreme heat or hurricanes, the good news is that weather-related disasters in the US are all way down this year compared to recent years and, in some cases, down to historically low levels,” SI points out.

But when Eric C. Itsweire, program director at NSF's Directorate for Geosciences, was asked how the lack of global warming over the past 17 years impacts the project's primary assumption that climate change is "one of the most pressing problems of the millennium," he replied that "there is more than one aspect in assessing whether there is climate change, so there is not a simple answer to your question. That's more a policy question."

“There’s no political agenda on our end," Itsweire added. "We’re just a research agency trying to get the best ideas to understand the complicated system we live in.”

However, merely increasing scientific knowledge about the Earth’s complex climate system is not the only goal of the $18 million federally funded study, according to NSF’s own request for bids, which are due December 23. The results will be used to “effectively translate climate predictions and associated uncertainties into the scientific basis for policy and management decisions related to human interventions and adaptation to the projected impacts of climate change.”

NSF, which has a $6.8 billion budget, is an independent, federally funded agency dedicated to the promotion of scientific progress.


Global warming or not: What can you believe?

The battle over whether our nation should take aggressive actions to deal with climate change continues unabated.  One would think that it would be easy to determine the nation’s policy based upon what the science on the ground shows.

However, that is where it gets tricky, because the science and indeed, even the headlines are anything but settled.

This summer, sailors who believed predictions that the Arctic would be ice-free became trapped, with predictably disastrous consequences.

Yet the Nordic Orion, an ice-strengthened bulk cargo freighter, became the first bulk carrier to traverse the Northwest Passage, giving credence to those who claim that the Arctic ice is melting.

Independent scientists report that Arctic ice has increased by 29 percent this past summer with 533,000 square miles more ice recorded than the previous year.

Yet NASA reports that this summer's minimum was the sixth lowest Arctic ice extent of the satellite record, 432,000 square miles (1.12 million square kilometers) below the 1981-2010 average.

The Washington Post reports that South Pole sea ice has reached a 35-year high, leaving scientists struggling to explain why Antarctic sea ice is actually growing rapidly from year to year.

Yet Environmental Times expresses concerns about the stability of the western Antarctic ice shelf due to allegedly warmer temperatures.

There is universal agreement that global temperatures have not risen over the past sixteen years, which has proven to be an inconvenient truth to those who have taken up the global warming mantle.  In fact, the release of the Intergovernmental Panel on Climate Change's most recent report was delayed as governments demanded that an explanation for this pause be included in the document.

Yet this latest report from the IPCC projects that temperatures over the next seventeen years will average 0.5 to 1.0 degrees higher than in the period from 1986 to 2005.  This projection simply ignores the broken hockey stick that underlay their previous erroneous projections.

The contradictions go on and on, as the former claimed scientific consensus collapses around data that just doesn’t conform to computer models used as the basis of directing policies that have harsh economic impacts on the economies of developed countries.

In fact, while the argument rages over whether the earth is warming, there is an even more intense debate over what is causing the “warming” with some NASA scientists pointing to solar cycles, others looking at normal ocean warming and cooling cycles and still others pointing to increases of CO2 in the atmosphere.

At a time when the United States Environmental Protection Agency is engaged in an aggressive anti-carbon campaign that will significantly diminish the supply of electricity in our nation over the next three years, one would hope that they would look up from their computers and ask the question, “What if we’re wrong?”

Somehow, I suspect that is a question that never crosses the minds of the green zealots who are determined to save the world, even when Mother Earth herself seems to be telling them that she doesn’t need their help.


The UN IPCC's Climate Modeling Procedures Need Serious Remodeling

The United Nations Intergovernmental Panel on Climate Change (IPCC) climate modeling produces terrifying scenarios of global warming and apocalyptic consequences including polar melting, coastal flooding, extreme weather, and species extinctions. Such forecasts assume that human emissions of greenhouse gases—principally carbon dioxide from burning coal and oil—will cause unprecedented warming.

Although the IPCC claims that they provide “scenarios” rather than forecasts, they conflate the terms in practice to suggest that their scenarios describe what will actually happen to climate over the 21st Century.

Are they right? Can they predict the future?

Two leading experts on forecasting and one on the physics of climate joined forces to investigate IPCC modeling procedures and conclusions. They are Dr. Kesten Green, Professor J. Scott Armstrong, and Dr. Willie Soon. What they found was truly shocking: the modeling procedures that the IPCC relies upon to produce their scary climate change scenarios ignore most principles of scientific forecasting.

Kesten Green, based at the University of South Australia in Adelaide, has published pioneering articles on forecasting methods and is co-director of a major website on forecasting methods. Scott Armstrong teaches at the University of Pennsylvania in Philadelphia and is a founder of the two major journals on forecasting methods, editor of the Principles of Forecasting handbook, and the world's most highly cited author on forecasting methods. Willie Soon is a solar physicist who has published many important empirical papers on the causes of climate change.

I have asked Kesten to discuss their findings.

Kesten, through your joint investigations into this matter I understand that you found that the modeling procedures the IPCC uses to create their climate change projections violated 72 of 89 relevant forecasting principles. Should we be concerned that an extraordinarily well-funded international agency reporting to governments has followed less than 20 percent of what could be thought of as scientific procedures?

Larry, there really is no excuse for the negligence IPCC has shown in overlooking the findings of nearly a century of research on forecasting methods in addressing matters so vital to the public interest. They can hardly claim they have lacked necessary resources to discover and apply the best that scientific forecasting has to offer.

Here I’ll briefly provide a little background on the discipline of forecasting.  Thirty-nine experts from many disciplines and from around the world developed forecasting principles from published experimental research. A further 123 forecasting experts reviewed their work. The principles were published in 2001, and are freely available on the Internet as a public service to help forecasters produce and validate their work. The 140 principles constitute the only published set of evidence-based standards for forecasting.

The media ascribe great importance to the IPCC scenarios, and they have considerable influence on government policies – energy policies in particular. To put the IPCC methodological failures into a familiar context, please consider this. Would you go ahead with an international flight operated by an obscure airline if you overheard two of the ground crew discussing how the pilot had skipped 80 percent of the pre-flight safety checklist?

Kesten, with all this knowledge available about how to forecast properly, has anyone taken the trouble to actually produce scientific climate forecasts?

Larry, astonishingly, given the extremely costly policies that have been proposed and implemented in the name of preventing dangerous “man-made global warming”, there is only one published peer-reviewed paper that claims to provide scientific forecasts of long-range global mean temperatures. The paper is our own, a 2009 article published in the International Journal of Forecasting.

So how do the long-range forecasts you obtained compare with the scenarios that the IPCC promotes to policymakers and the media?

First we followed forecasting principles to choose the most appropriate forecasting method. We then applied it to predict global mean temperatures since 1850 (roughly the start of the Industrial Revolution), as measured by the same data that the IPCC used. To choose a method, we examined the state of knowledge and available empirical data.

In this application we concluded that the “no-trend” model is the proper method to use. Our conclusion is based upon a substantial body of research that found complex models do not work well compared to simple models in complex and uncertain situations. This is the case here where the climate is so complex and insufficiently understood that any net effect of human emissions on global temperatures cannot be identified.
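To make the contrast concrete, here is a minimal sketch of a "no-trend" (no-change) benchmark of the general kind described above. The function names, data and error measure are illustrative, not taken from the 2009 paper:

```python
def no_trend_forecast(history, n_horizons):
    """No-trend (no-change) benchmark: every forecast, at every
    horizon ahead, is simply the last observed value."""
    return [history[-1]] * n_horizons

def mean_absolute_error(forecasts, actuals):
    """Average absolute forecast error across the horizons."""
    return sum(abs(f - a) for f, a in zip(forecasts, actuals)) / len(actuals)

# Illustrative global temperature anomalies (degrees C), not real data
history = [0.10, 0.12, 0.08, 0.11, 0.09]
future = [0.10, 0.13, 0.07]

forecasts = no_trend_forecast(history, len(future))
print(forecasts)  # [0.09, 0.09, 0.09]
print(round(mean_absolute_error(forecasts, future), 3))
```

The point of such a benchmark is that, in complex and poorly understood situations, a model that cannot beat the no-change forecast is adding no predictive value.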

By contrast, the IPCC relies upon complicated computer models to represent their assumption that the relatively small human contribution of CO2 to the atmosphere will cause dangerous global warming. And because the models are so complex, the modelers need to include numerous assumptions. Many mainstream scientists question those assumptions.

Not surprisingly, given that they were the product of scientific forecasting principles, our no-trend forecasts were much more accurate than the IPCC’s warming scenario temperatures over the period of exponentially increasing CO2 emissions from 1850.


Climate censorship

By forecasting expert Professor J. Scott Armstrong

Censorship of skeptic global warming views by the press has been going on for many years. This week, Paul Thornton, letters editor for the Los Angeles Times announced the paper will “no longer publish letters from climate change deniers,” as reported by

Thornton says, “Simply put, I do my best to keep errors of fact off the letters page; when one does run, a correction is published. … Saying ‘there’s no sign humans have caused climate change’ is not stating an opinion, it’s asserting a factual inaccuracy.”

Really? Is this kind of censorship good public-service policy for the Los Angeles Times?

It is a good policy for the global warming alarmist movement because those who are more knowledgeable about climate change are more likely to dismiss the alarm as unfounded.


It is not so good for citizens who would otherwise benefit from the freedom to make up their own minds after being exposed to different arguments and diverse evidence.

Is such censorship good business for newspapers and other mass media? Given that most people in the U.S. do not believe that there is a global warming problem, this seems doubtful.

One-sided coverage loses readers who do not share the editorial viewpoint.

Aristotle suggested that persuasiveness is higher when both sides of an issue are presented.

Later research found that Aristotle’s suggestion only works when one can rebut the other side.

Failing that, it is best to try to prevent the other side from being heard.

If persuasion is the goal, and not science, then it is sensible for the warming alarmists to avoid two-sided discussions.

In our study of situations that are analogous to the current alarm over global warming, Kesten Green and I identified 26 earlier movements based on scenarios of manmade disaster (including the global cooling alarm in the 1960s). None of them were based on scientific forecasts. And yet, governments imposed costly policies in response to 23 of them.

In no case did the forecast of major harm come true.

Will it be different this time?

Isn’t it important for the public to be informed about scientific evidence on the issue? And because the alarm is based on the fear of future harm, shouldn’t the public insist on scientific forecasts?

The UN’s Intergovernmental Panel on Climate Change (IPCC) uses models that provide computer scenarios, not forecasts, of dangerous manmade global warming.

When we assessed the scenarios as if they were forecasts of what would actually happen, Kesten Green and I found that they violated 72 of 89 relevant scientific forecasting principles.

Would you go ahead with your flight, if you overheard two of the ground crew discussing how the pilot had skipped 80 percent of the pre-flight safety checklist?

For rational policy-making and regulating, scientific forecasts are necessary.

We are astonished that there is only one published peer-reviewed paper that claims to provide scientific forecasts of long-range global mean temperatures. The paper is a 2009 article in the International Journal of Forecasting by Kesten Green, Willie Soon, and me.

When we tested our forecasts against the IPCC scenarios using data from 1850 to the present, our forecasts, based on a model that adhered to scientific principles, were more accurate over all forecast horizons from 1 to 100 years.

They were especially more accurate for long-term forecasts.

For example, for forecasts 91 to 100 years ahead, the IPCC forecast errors were over 12 times larger than our forecast errors. Perhaps that qualifies as relevant evidence for citizens. And it would be "news" for 99% of them. Yet our forecasts received virtually no mass media coverage. Meanwhile, non-scientific climate-scare "forecasts" regularly get widespread attention from the mass media.

Want to bet which forecast of global mean temperatures is going to be correct? Mr. Gore did not want to bet against me in 2007 when he was warning that the world was at a climate "tipping point." That was a wise decision on his part. Scientific forecasting methods tend to be more accurate than political forecasting methods.

Fortunately, with many mass media outlets attempting to influence people by using censorship, citizens are able to turn to alternative sources of information and argument on the Internet to inform their decisions. And many have. The polls provide evidence that the alarmist case is so weak that even with widespread censorship, citizens are not persuaded.



For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

