Friday, December 18, 2009

Statistically Sophisticated Model Predicts Flat Temperatures Through 2050

And refutes Greenie attempts to dismiss the temperatures of the last 10 years

While climate skeptics have gleefully pointed to the past decade's lack of temperature rise as proof that global warming is not happening as predicted, climate change activists have claimed that this is just “cherry picking” the data. They point to their complex and error-prone general circulation models which, after significant refactoring, are now predicting a stretch of stable temperatures followed by a resurgent global warming onslaught. In a recent paper, a new type of model, based on a test for structural breaks in surface temperature time series, is used to investigate two common claims about global warming. This statistical model predicts no temperature rise until 2050, but the more interesting prediction is what happens between 2050 and 2100.

David R.B. Stockwell and Anthony Cox, in a paper submitted to the International Journal of Forecasting entitled “Structural break models of climatic regime-shifts: claims and forecasts,” have applied advanced statistical analysis to both Australian temperature and rainfall trends and global temperature records from the Hadley Center's HadCRU3GL dataset. The technique they used is called the Chow test, invented by economist Gregory Chow in 1960. The Chow test is a statistical test of whether the coefficients in two linear regressions on different data sets are equal. In econometrics, the Chow test is commonly used in time series analysis to test for the presence of a structural break.

A structural break appears when an unexpected shift in a time series occurs. Such sudden jumps in a series of measurements can lead to huge forecasting errors and unreliability of a model in general. Stockwell and Cox are the first researchers I know of to apply this econometric technique to temperature and rainfall data (a description of computing the Chow test statistic is available here). They explain their approach in the paper's abstract:
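For readers curious about the mechanics, here is a minimal sketch of the Chow test in Python. This is my own illustration on made-up data, not the authors' code: fit one straight line to the whole series, fit two separate lines on either side of a candidate break, and compare residual sums of squares with an F-statistic.

```python
import numpy as np

def rss(x, y):
    """Residual sum of squares from an OLS straight-line fit y ~ a + b*x."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ beta
    return float(r @ r)

def chow_f(x, y, break_idx, k=2):
    """Chow F-statistic for a structural break before observation break_idx."""
    rss_pooled = rss(x, y)
    rss_split = rss(x[:break_idx], y[:break_idx]) + rss(x[break_idx:], y[break_idx:])
    n = len(x)
    return ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))

# Toy series: flat for 30 "years", then an abrupt step up -- the kind of
# regime-shift the test is designed to detect.
rng = np.random.default_rng(0)
years = np.arange(60.0)
temps = np.where(years < 30, 0.0, 0.5) + rng.normal(0, 0.1, size=60)
print(chow_f(years, temps, 30))
```

A value far above the F-distribution's critical value (about 3.2 at the 5% level for these degrees of freedom) means the two-segment model fits significantly better than a single line, which is the evidence for a structural break.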
A Chow test for structural breaks in the surface temperature series is used to investigate two common claims about global warming. Quirk (2009) proposed that the increase in Australian temperature from 1910 to the present was largely confined to a regime-shift in the Pacific Decadal Oscillation (PDO) between 1976 and 1979. The test finds a step change in both Australian and global temperature trends in 1978 (HadCRU3GL), and in Australian rainfall in 1982 with flat temperatures before and after. Easterling & Wehner (2009) claimed that singling out the apparent flatness in global temperature since 1997 is ’cherry picking’ to reinforce an arbitrary point of view. On the contrary, we find evidence for a significant change in the temperature series around 1997, corroborated with evidence of a coincident oceanographic regime-shift. We use the trends between these significant change points to generate a forecast of future global temperature under specific assumptions.

The climatic effects of fluctuations in oceanic regimes are most often studied using singular spectrum analysis (SSA) or variations on principal component analysis (PCA), that is, by decomposing rainfall and temperature into periodic components. Such approaches can capture short-period phenomena like the effects of El Niño, and the potential impact of longer-term phenomena such as the Pacific Decadal Oscillation (PDO) on variations in global temperature. These phenomena take place over a period of years or decades. For finding and testing less frequent regime-shifts, different techniques are called for. According to the authors: “An F-statistic known as the Chow test (Chow, 1960) based on the reduction in the residual sum of squares through adoption of a structural break, relative to an unbroken simple linear regression, is a straightforward approach to modeling regime-shifts with structural breaks.” All the statistical details aside, the point here is that a sequence of data that contains sudden shifts or jumps is hard to model accurately using standard methods.

The paper investigates two claims made in the climate literature: first, a proposed regime-shift model of Australian temperature with a slightly increasing trend to 1976, rapidly increasing to 1979 (the shift), and slowly increasing since then; and second, a claim of lack of statistical significance regarding the declining temperature since the El Niño event in 1998. Regarding the first, the authors state: “The increase in Australian temperature of around 0.9°C from the start of the readily available records in 1910 is conventionally modeled as a linear trend and, despite the absence of clear evidence, often attributed to increasing concentrations of greenhouse gases (GHGs).” The main reason to apply econometric techniques to climate time series data is that simple linear forecasting can fail if the underlying data exhibit sudden jumps. “That is, while a forecast based on a linear model would indicate steadily changing global temperatures, forecasts based on shifts would reflect the moves to relatively static mean values,” the study states. The choice of underlying model may also impact estimates of the magnitude of climate change, which is one of the major points put forth by this work.

As for the “cherry picking” assertion, the authors claim that the flat global temperatures since 1998 are not an anomaly but are representative of the actual climate trend. That climate trend exhibits two distinct breakpoints, one in 1978 and another in 1997. The proposed new climate model is what is known as a change point model. Such models are characterized by abrupt changes in the mean value of the underlying dynamical system, rather than a smoothly increasing or decreasing trend. Confidence in the 1978 breakpoint is strengthened by the results for global temperatures since 1910. These data indicate the series can be described as gradually increasing to 1978 (0.05 ± 0.015°C per decade), with a steeper trend thereafter (0.15 ± 0.04°C per decade).

The Chow test since 1978 finds another significant breakpoint in 1997, when an increasing trend up to 1997 (0.13 ± 0.02°C per decade) changes to a practically flat trend thereafter (−0.02 ± 0.05°C per decade). Contrary to claims that the 10 year trend since 1998 is arbitrary, structural change methods indicate that 1997 was a statistically defensible beginning of a new, and apparently stable, climate regime. Again, according to the authors: “The significance of the dates around 1978 and 1997 to climatic regime-shifts is not in dispute, as they are associated with a range of oceanic, atmospheric and climatic events, whereby thermocline depth anomalies associated with PDO phase shift and ENSO were transmitted globally via ocean currents, winds, Rossby and Kelvin waves.”
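The two-breakpoint picture is easy to reproduce on synthetic data. The sketch below fits ordinary least-squares trends within the segments delineated by the 1978 and 1997 breakpoints; the numbers are illustrative values patterned on the quoted trends, not the actual HadCRU3GL series.

```python
import numpy as np

def segment_trends(years, temps, breaks):
    """OLS slope (deg C per decade) within each segment between breakpoints."""
    edges = [years[0]] + list(breaks) + [years[-1] + 1]
    slopes = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (years >= lo) & (years < hi)
        slope_per_year, _ = np.polyfit(years[m], temps[m], 1)
        slopes.append(10 * slope_per_year)  # per-year slope -> per-decade
    return slopes

# Synthetic anomaly series mimicking the paper's description: slow warming
# to 1978, steeper warming to 1997, essentially flat thereafter.
years = np.arange(1910, 2009)
temps = np.piecewise(
    years.astype(float),
    [years < 1978, (years >= 1978) & (years < 1997), years >= 1997],
    [lambda y: 0.005 * (y - 1910),
     lambda y: 0.34 + 0.013 * (y - 1978),
     lambda y: 0.587 - 0.002 * (y - 1997)],
)
print(segment_trends(years, temps, [1978, 1997]))  # ~ [0.05, 0.13, -0.02]
```

On this noise-free series the fit recovers exactly the per-decade trends it was built from; on real data the same procedure would return the estimates with confidence intervals like those quoted above.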

Perhaps most interesting is the application of this analysis to the prediction of future climate change, something GCM climate modelers have been attempting for the past 30 years with little success. Figure 3 from the paper illustrates the prediction for temperatures to 2100 that follows from the structural break model under the assumptions of continuous underlying warming, a regime-shift from 1978 to 1997, and no additional major regime-shift. The projection formed by the presumed global warming trend to 1978 and the trend in the current regime predicts constant temperatures for fifty years, to around 2050. This is similar to the period of flat temperatures from 1930 to 1980.
Prediction of global temperature to 2100, by projecting the trends of segments delineated by significant regime-shifts. The flat trend in the temperature of the current climate-regime (cyan) breaks upwards around 2050 on meeting the (presumed) underlying AGW warming (green), and increases slightly to about 0.2°C above present levels by 2100. The 95% CI for the trend uncertainty is dashed. Figure 3 from Stockwell and Cox.
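The projection logic of the figure, as I read it, can be sketched in a few lines: hold the flat current-regime level until the slowly rising underlying trend line, extrapolated from the pre-1978 segment, catches up, then follow that underlying trend. The numbers below are illustrative, chosen only to mimic the figure's roughly 2050 crossover and roughly 0.2°C rise by 2100; they are not the paper's fitted values.

```python
# Sketch of the projection logic in Figure 3 (my reading, not the paper's
# code). All levels are illustrative temperature anomalies in deg C.

def projected_anomaly(year, flat_level, base_year, base_level, per_decade):
    """Projection: the larger of the flat regime and the underlying trend line."""
    underlying = base_level + per_decade * (year - base_year) / 10.0
    return max(flat_level, underlying)

FLAT = 0.70                  # current-regime mean anomaly (illustrative)
BASE_YEAR, BASE_LEVEL = 1978, 0.34
SLOPE = 0.05                 # presumed underlying trend, deg C per decade

crossover = next(y for y in range(2009, 2101)
                 if projected_anomaly(y, FLAT, BASE_YEAR, BASE_LEVEL, SLOPE) > FLAT)
rise_by_2100 = projected_anomaly(2100, FLAT, BASE_YEAR, BASE_LEVEL, SLOPE) - FLAT
print(crossover, round(rise_by_2100, 2))  # breaks upward near 2050, ~0.25 C by 2100
```

The point of the sketch is only that, given a flat regime and a slow underlying trend, the "break upward around 2050" and the small end-of-century rise both fall out of simple line intersection, with no GCM required.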

What is even more encouraging is that, even though temperatures resume their upward climb after 2050, the predicted increase for the rest of the century is only about 0.2°C above present levels. That is around one tenth the increase generally bandied about by the IPCC and its minions, who sometimes predict as much as a 6°C rise by 2100. It must be kept in mind that this extrapolation is based on a number of simplifying assumptions and does not incorporate many of the complexities and natural forcing factors that are incorporated in GCM programs. Can a relatively simple statistical model be more accurate than the climate modelers' coupled GCMs that have been under continuous development for decades?

Mathematical models based on statistics are often the only way to successfully deal with non-linear, often chaotic systems. Scientists often find that physical reality at its most detailed level can defy their computational tools. Consider fluid flow, which can be either laminar or turbulent. Laminar fluid flow is described by the Navier-Stokes equations. For cases of inviscid (non-viscous) flow, the Bernoulli equation can be used to describe the flow. The Navier-Stokes equations are differential equations, while the Bernoulli equation is a simpler mathematical relationship which can be derived from the former by way of the Euler equation.

In effect, both are ways of dealing with massive numbers of individual molecules in a flowing fluid collectively instead of individually. At the finest physical level, fluid flow is a bunch of molecules interacting with each other, but trying to model physical reality at the level of atomic interaction would be computationally prohibitive. Instead they are dealt with en masse, using equations that are basically statistical approximations of how the uncountable number of molecules in a flowing fluid behave. Often such mathematical approximations are accurate enough to be useful as scientific and engineering tools.
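A one-line example makes the point. Along a level streamline of inviscid, incompressible flow, Bernoulli's relation p + ½ρv² = constant collapses the behavior of countless molecules into simple algebra; the numbers below are illustrative.

```python
# Bernoulli's equation for a level streamline: p + (1/2)*rho*v^2 = constant,
# so the static pressure drop when the flow speeds up is (1/2)*rho*(v2^2 - v1^2).

RHO_WATER = 1000.0  # density of water, kg/m^3

def bernoulli_pressure_drop(v1, v2, rho=RHO_WATER):
    """Static pressure drop p1 - p2 (Pa) as flow accelerates from v1 to v2 (m/s)."""
    return 0.5 * rho * (v2**2 - v1**2)

# Water accelerating from 1 m/s to 3 m/s through a constriction
# loses 4 kPa of static pressure.
print(bernoulli_pressure_drop(1.0, 3.0))  # 4000.0
```

No molecular dynamics needed: the statistical behavior of the fluid is baked into the single density parameter.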

Indeed, many of these types of equations find their way into GCMs to model parts of the system climate scientists are trying to simulate. Instead of simply looking at the statistical behavior of Earth's climate, GCMs try to model all the bits and pieces that comprise the Earth's climate system. Unfortunately, not all of the pieces of the Earth system are well understood, and many factors cannot be modeled at the coarse physical scales forced on the modelers because of the lack of computational capacity. As I have discussed on this blog before, simply changing the structural components of a model, leaving all of the scientific assumptions and factors intact, can radically change the answers a model cranks out (see “Extinction, Climate Change & Modeling Mayhem”). Beyond that, there are the matters of inherent data inaccuracy and error propagation as presented in The Resilient Earth chapters 13 and 14.

If the new model's prediction is true, global temperatures in 2100 will not even approach the tripwire-for-Armageddon 2°C level set by the IPCC as humanity's point of no return. Can a statistical model be better at predicting future temperatures than complex yet incomplete GCMs? With the lack of theoretical understanding, paucity of good historical data, and overwhelming simplifications that have to be made to make climate models run on today's supercomputers, I would have to say that the statistical model comes off pretty well. Give me a well known statistical technique over a fatally flawed climate model any day.

Be safe, enjoy the interglacial and stay skeptical.

SOURCE (See the original for links, graphics etc.)

East Anglia CRU’s below-standard computer modeling

While the rest of the world focused on the e-mails from East Anglia’s CRU describing attempts to “hide the decline” and silence critics, a programmer in the UK focused on the programming code for CRU’s computer modeling. John Graham-Cumming says he’s not an AGW skeptic, but he’s becoming a skeptic of East Anglia, thanks to its below-standard computer programming for its climate modeling. Breitbart TV has his assessment and damning conclusion in this clip, which gets down to brass tacks: would you spend money on any conclusions offered by these models? Watch the clip below for the answer:


NASA admits sun/temperature link

It's been known to others for over 100 years but what the heck!

New measurements from a NASA satellite show a dramatic cooling in the upper atmosphere that correlates with the declining phase of the current solar cycle. For the first time, researchers can show a timely link between the Sun and the climate of Earth's thermosphere, the region above 100 km, an essential step in making accurate predictions of climate change in the high atmosphere.

Scientists from NASA's Langley Research Center and Hampton University in Hampton, Va., and the National Center for Atmospheric Research in Boulder, Colo., will present these results at the fall meeting of the American Geophysical Union in San Francisco from Dec. 14 to 18.

Earth's thermosphere and mesosphere have been the least explored regions of the atmosphere. The NASA Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) mission was developed to explore the Earth's atmosphere above 60 km altitude and was launched in December 2001. One of four instruments on the TIMED mission, the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) instrument, was specifically designed to measure the energy budget of the mesosphere and lower thermosphere. The SABER dataset now covers eight years of data and has already provided some basic insight into the heat budget of the thermosphere on a variety of timescales.

The extent of current solar minimum conditions has created a unique situation for recent SABER datasets. The end of solar cycle 23 has offered an opportunity to study the radiative cooling in the thermosphere under exceptionally quiescent conditions. "The Sun is in a very unusual period," said Marty Mlynczak, SABER associate principal investigator and senior research scientist at NASA Langley. "The Earth's thermosphere is responding remarkably -- up to an order of magnitude decrease in infrared emission/radiative cooling by some molecules."

The TIMED measurements show a decrease in the amount of ultraviolet radiation emitted by the Sun. In addition, the amount of infrared radiation emitted from the upper atmosphere by nitric oxide molecules has decreased by nearly a factor of 10 since early 2002. These observations imply that the upper atmosphere has cooled substantially since then. The research team expects the atmosphere to heat up again as solar activity starts to pick up in the next year.

While this warming has no implications for climate change in the troposphere, a fundamental prediction of climate change theory is that the upper atmosphere will cool in response to increasing carbon dioxide. As the atmosphere cools the density will increase, which ultimately may impact satellite operations through increased drag over time.

The SABER dataset is the first global, long-term, and continuous record of the nitric oxide (NO) and carbon dioxide (CO2) emissions from the thermosphere. "We suggest that the dataset of radiative cooling of the thermosphere by NO and CO2 constitutes a first climate data record for the thermosphere," says Mlynczak.

The TIMED data provide a fundamental climate data record for validation of upper atmosphere climate models, an essential step in making accurate predictions of climate change in the high atmosphere. SABER provides the first long-term measurements of natural variability in key terms of the upper atmosphere climate. As the TIMED mission continues, these data derived from SABER will become important in assessing long term changes due to the increase of carbon dioxide in the atmosphere.


Lying About Climate Change

The despicable Al Gore features in this story of how climate alarmists have knowingly and persistently lied to support their cause (or obsession). But so do prominent scientists who presumably should be held to a higher standard. Paul Reiter, a professor of medical entomology, writes in the Spectator:
I am a scientist, not a climatologist, so I don't dabble in climatology. My speciality is the epidemiology of mosquito-borne diseases. As [Al Gore's] film [An Inconvenient Truth] began, I knew Mr Gore would get to mosquitoes: they're a favourite with climate-change activists. When he got to them, it was all I feared. In his serious voice, Mr Gore presented a nifty animation, a band of little mosquitoes fluttering their way up the slopes of a snow-capped mountain, and he repeated the old line: Nairobi used to be 'above the mosquito line, the limit at which mosquitoes can survive, but now...' Those little mosquitoes kept climbing.

The truth? Nairobi means 'the place of cool waters' in the Masai language. The town grew up around a camp, set up in 1899 during the construction of a railway, the famous 'Lunatic Express'. There certainly was water there -- and mosquitoes. From the start, the place was plagued with malaria, so much so that a few years later doctors tried to have the whole town moved to a healthier place. By 1927, the disease had become such a plague in the 'White Highlands' that £40,000 (equivalent to about £350,000 today) was earmarked for malaria control. The authorities understood the root of the problem: forest clearance had created the perfect breeding places for mosquitoes. The disease was present as high as 2,500m above sea level; the mosquitoes were observed at 3,000m. And Nairobi? 1,680m.

These details are not science. They require no study. They are history. But for activists, they are an inconvenient truth, so they ignore them. Even if Mr Gore is innocent, his advisers are not. They have been spouting the same nonsense for more than a decade. As scientists, we have repeatedly challenged them in the scientific press, at meetings and in news articles, and we have been ignored.

In 2004, nine of us published an appeal in the Lancet: 'Malaria and climate change: a call for accuracy'. Clearly, Mr Gore didn't read it. In 2000, I protested when Scientific American published a major article loaded with the usual misrepresentations. And when I watched his animated mosquitoes, his snow-capped mountain was oddly familiar. It took a few moments to click: the images were virtually identical to those in the magazine. The author of the article, Dr Paul Epstein, features high in Gore's credits.

Dr Epstein is a member of a small band dedicated to a cause. And their work gains legitimacy, not by scholarship, but by repetition. While they publish their work in highly regarded journals, they don't write research papers but opinion pieces and reviews, with little or no reference to the mainstream of science. The same claims, the same names; only the order of authors change. I have counted 48 separate pieces by just eight activists. They are myth-makers. And all have been lead authors and/or contributory authors of the prestigious [United Nations Intergovernmental Panel on Climate Change] assessment reports.

Take their contention, for example, that as a result of climate change, tropical diseases will move to temperate regions and malaria will come to Britain. If they bothered to learn about the subject, they would know that in a period climatologists call the Little Ice Age, when Charles II held ice parties on the Thames, malaria -- 'the ague' -- was rampant in the Essex marshes, on a par even with regions in Africa today. In the 18th century, the great systematist Linnaeus wrote his doctorate on malaria in central Sweden. In 1922-23 a massive epidemic swept the Soviet Union as far north as Archangel, on the Arctic circle, killing an estimated 600,000 people. And malaria was only eliminated from the Soviet Union and large areas of Europe in the 1950s, after the advent of DDT. So it's hardly a tropical disease. And yet when we put this information under the noses of the activists it is ignored: ours is the inconvenient truth.

That's the story of climate activism. Whenever one portion of the evidence alarmists rely on is shown to be fraudulent, the response is, "But there's lots of other evidence." Yes, and that other evidence is fraudulent, too.


What Energy Crisis? The Truth About America's Oil Reserves

One of the stories that we’ve been hearing for years now, in justification of the government’s refusal to allow any more drilling or the construction of any new refineries, is that our oil reserves are so low that they won’t last us very long if we use them.

It’s true that many people believed that in the past, and many apparently still do, but the truth that’s beginning to emerge now is going to bury that idea, in the same way that we’re currently debunking the absurd idea of Anthropogenic Global Warming.

While there have been some stunning new discoveries, a huge part of our newfound reserves has come as a result of recent developments in drilling technology, which now allow us to drill far deeper, as well as to change the direction we’re drilling in once we’re already down quite deep. As a result of these new techniques, many of the existing proven reserves can now supply us with far more oil than we thought possible before.

But the numbers are still stunning! According to our government’s Energy Information Administration’s (EIA) latest report (March 2009), the total known oil reserves of the world amounted to roughly 1.3 trillion barrels, of which the USA had only 21 billion. While that may sound like a lot of oil to some, it amounted to only 1.6% of the world’s known reserves. According to those statistics, Saudi Arabia, with 265 billion barrels, had close to 13 times as much oil in reserves as the entire United States.

Now here’s the good news: According to an Investor’s Business Daily report dated Nov 5, 2009, the Congressional Research Service (CRS) is now reporting that the total known energy reserves of the world amount to roughly 5.58 trillion barrels, which is 4.3 times greater than what we had previously been told by the EIA. What’s even more exciting is that Saudi Arabia, with 540 billion barrels, no longer has the biggest slice of the pie. WE DO!

That’s right. According to the CRS statistics, America has 1.32 TRILLION barrels of reserves, or 2.4 times as much as Saudi Arabia, and this doesn’t include all the oil in the shale & tar sands, which are mostly in Canada anyway. According to the report, the breakdown of the world’s major reserves is as follows (in billions of barrels):

1: USA: 1,320
2: Russia: 1,250
3: Saudi Arabia: 540
4: China: 490
5: Iran: 430

Numbers 6 through 10 belong to the lesser Gulf States, along with Canada at 220, Venezuela at 160, and Nigeria at 130.
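The ratios quoted above are easy to check from the reported figures. Note one caveat in the wording: the EIA number is "oil reserves" while the CRS number is the presumably broader "energy reserves".

```python
# Quick arithmetic check of the ratios quoted above, using the figures as
# reported (billions of barrels).
eia_world, eia_usa, eia_saudi = 1300, 21, 265
crs_world, crs_usa, crs_saudi = 5580, 1320, 540

print(round(100 * eia_usa / eia_world, 1))  # 1.6 -- US share of world (EIA), %
print(round(eia_saudi / eia_usa, 1))        # 12.6 -- "close to 13 times"
print(round(crs_world / eia_world, 1))      # 4.3 -- CRS total vs EIA total
print(round(crs_usa / crs_saudi, 1))        # 2.4 -- US vs Saudi (CRS)
```

All four quoted ratios check out against the reported totals.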

The truth about the reserves has been known around the oil patch for some time now, but it totally freaks out the enviro fascists, because the majority of these reserves are located in our territorial waters, and all those mindless fools can think of is images of oil rigs visible all along the west coast, and the east & gulf coasts as well. Of course, the truth of the matter is that very few of those rigs would be visible at all, as most of the deposits are far enough from the beaches that the rigs couldn’t be seen from shore.

The next cry we’ll hear from the enviro fascists is how those rigs would cause an environmental disaster that would destroy the oceans of the world. By the time that happens we’ll probably be referring to BS like that as an “AlgorLie”, because it’s so darn far from the truth.

If you ask any experienced fisherman about offshore oil rigs, you’ll learn that the rigs actually become artificial reefs that attract marine life, not repel it. Fishermen LOVE them, and just like hunters, fishermen for the most part are the true environmentalists of their realm.

The risk to the environment from spills from offshore rigs is extremely small, and frankly insignificant by comparison to the benefits that they will produce. With the trillions of dollars in wealth that they’d be creating, we’ll be able to clean up the worst possible spills far more efficiently than we have in the past, and we must remember that even the worst spills can indeed be cleaned up. It took quite a while to clean up the majority of the mess from the EXXON Valdez, and mother nature will take care of the rest over time. The bottom line is that taking such a small risk in order to bring America back from her current bankrupt condition to being the richest nation in the world once again is a risk well worth taking. The only potential concern that I have is that we must first get rid of the socialists and communists who are currently running our country, because if a sociopath like Obama were to get his hands on that much wealth, we’d never be able to get out from under his control.

Is there nobody in the government or mainstream media who will tell us the truth about anything anymore? “America has an energy crisis, America is bad, the Israelis are the real terrorists, and on top of that, the world’s about to be roasted on a spit”


The biters bit: Greenie thugs complain that the Japanese use aggressive tactics

They don't like a dose of their own medicine. Typical attitude: Everybody else has to obey the rules but we can do as we like. This lot are very vicious and have attacked whaling ships in the past so the Japanese are fully entitled to use strong countermeasures

THE anti-whaling group Sea Shepherd said a Japanese security ship has illegally followed it into French Antarctic waters and fired "a military class weapon" at its helicopter. The Sea Shepherd Conservation Society said the Japanese ship the Shonan Maru No 2 had been following its vessel, the Steve Irwin, for nine of the 10 days it had been sailing since leaving Fremantle. Sea Shepherd said the Shonan Maru had been constantly reporting the Steve Irwin's location, preventing it from closing in on the Japanese whaling fleet.

A Sea Shepherd spokeswoman said the Steve Irwin entered French Antarctic territorial waters, with permission from the French base at Dumont d'Urville, in a bid to lose the Shonan Maru. But the Shonan Maru No 2 followed the Steve Irwin without French permission, Sea Shepherd said. The Sea Shepherd group said the French base had confirmed that the Japanese ship neither requested nor received permission to enter French waters.

The spokeswoman said the Steve Irwin helicopter then flew back to film the Shonan Maru in its pursuit. "In response, the crew of the Japanese ship activated their Long Range Acoustic Device (LRAD) at the Sea Shepherd helicopter," she said. "LRAD is a military class weapon." Helicopter pilot Chris Aultman said activating the weapon was extremely irresponsible. [And ramming whaling ships IS responsible??] "That device can cause nausea and disorientation and the use of it against an aircraft is extremely dangerous," he said.

The Sea Shepherd helicopter returned to the Steve Irwin for safety, at which point the Shonan Maru increased speed and aimed its water cannons at the helicopter on the landing pad, the spokeswoman said.

Steve Irwin Captain Paul Watson said the situation was now very dangerous. "We have deliberately led the Japanese ship into thick ice in order to lose them in the ice," he said. "The icebergs could easily damage either vessel."

Sea Shepherd said it had reported the incident to the French authorities, adding that so far the Steve Irwin was undamaged but that the pursuit continued.



For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, SOCIALIZED MEDICINE, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For readers in China or for times when is playing up, there are mirrors of this site here and here

