Friday, December 02, 2005

GULF STREAM HAS BEEN WEAKENING BUT NOBODY NOTICED

Now 30% weaker but average temperature changes have been minuscule. Another Greenie doom scenario looking absurd

The North Atlantic's natural heating system, which brings clement weather to western Europe, is showing signs of decline. Scientists report that warm Atlantic Ocean currents, which carry heat from the tropics to high latitudes, have substantially weakened over the past 50 years.

Oceanographers surveying the 'Atlantic meridional overturning circulation', the current system that includes the warm Gulf Stream current, report that it seems to be 30% weaker than half a century ago.

Failures of the Atlantic Ocean's circulation system are thought to have been responsible for abrupt and extreme climate changes during the ice age that lasted from 110,000 to 23,000 years ago. More recently, a fictional shutdown of the Gulf Stream inspired the 2004 Hollywood blockbuster The Day After Tomorrow.

The climate shifts depicted in the movie, in which New York is engulfed by an instant ice age, are mere fancy. But scientists are worried about the real changes measured in the North Atlantic. Both salinity and water density, which influence the transport of warm waters, have previously been found to be decreasing.

More here




NOT QUITE THE DAY AFTER TOMORROW: SUDDEN ICE AGE MYTH PUT TO REST

The controversial idea that global warming could trigger a sudden drop in temperatures - maybe not in a matter of days as portrayed in the recent disaster movie The Day After Tomorrow, but possibly within a century - has finally been put to rest. The latest ice core drilled from northern Greenland is showing that the last interglacial period, despite being warmer than today, did not end in a sudden freeze. Rather, it took thousands of years for the warm temperatures to give way to the next ice age.

The Greenland ice sheet is made from layers of snow that have compacted into ice over millennia. By drilling a core of ice, researchers can look back in time and determine the temperature when the snow fell by analysing the ratio of oxygen isotopes in the ice. Two previous Greenland ice cores, one known as GRIP extracted by European scientists in 1992, and another called GISP2 retrieved by Americans a year later, gave climatologists their best ever records of temperatures going far back in time.
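
For readers who want the mechanics, here is a minimal Python sketch of the oxygen-isotope thermometry just described. The VSMOW reference ratio and the linear Greenland calibration (roughly 0.67 per mil per degree C) are standard published values rather than the exact calibration any one coring team used, so treat the numbers as illustrative assumptions.

# Minimal sketch: converting an oxygen-isotope ratio measured in ice
# into a palaeotemperature. Constants are illustrative, not the exact
# values used by the GRIP/GISP2 teams.

R_VSMOW = 0.0020052  # 18O/16O ratio of Vienna Standard Mean Ocean Water

def delta_18O(r_sample: float) -> float:
    """delta-18O in per mil, relative to the VSMOW standard."""
    return (r_sample / R_VSMOW - 1.0) * 1000.0

def temperature_from_delta(d18o: float,
                           slope: float = 0.67,
                           intercept: float = -13.7) -> float:
    """Invert an assumed linear calibration d18O = slope*T + intercept."""
    return (d18o - intercept) / slope

# A typical Greenland ice value of about -35 per mil maps to roughly
# -32 C under this assumed calibration.
print(temperature_from_delta(-35.0))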

The two cores agreed almost perfectly all the way back to 113,000 years ago, but then diverged dramatically. GRIP showed that temperatures in Greenland, and presumably worldwide, underwent many sudden fluctuations between 113,000 and 125,000 years ago. In one instance, temperatures appeared to plummet by up to 14 degrees C within 70 years. This sparked alarm because the last interglacial period, known as the Eemian, lasted from about 130,000 to 115,000 years ago, and conditions then are thought to closely parallel today's climate. Scientists worried that warm temperatures during the Eemian could have shut down the Gulf Stream, which keeps the north-eastern US and northern Europe relatively warm for their latitudes.

But controversy erupted when GISP2 found no record of such fluctuations. It soon became clear that at least one team, and possibly both, had drilled in a region where the underlying rock is very hilly, potentially jumbling the bottom 10 per cent of the ice. To resolve the debate, European researchers went back to northern Greenland in 1996 and started drilling in a region with flat bedrock, which they reached in July 2003. The new core, known as NGRIP, goes back 123,000 years, and at roughly 3085 metres it is the longest ice core recovered from Greenland.

Besides analysing the oxygen isotopes in the ice, the Europeans also looked at levels of methane trapped in air bubbles. Methane levels rise during warm periods and fall when it gets cold, and the variations back up the oxygen-isotope data. "This time we are 100 per cent certain that the ice core is reliable," says team member Jurgen Peder Steffensen of the University of Copenhagen in Denmark. "The new analysis also shows that the two older ice cores are only reliable to 105,000 years."

The NGRIP core reaches into the final 8000 years of the Eemian. The team found that Greenland was then about five degrees warmer on average than today, and that the climate was stable. The warm period ended with a slow cooling over 5000 years (Nature, vol 431, p 147). "This is important information," says Eric Wolff of the British Antarctic Survey in Cambridge.

"The Eemian may not be a perfect analogue of a future warmer world, but it is the best we've got. A crucial point is that this part of the Greenland ice sheet apparently did not melt substantially in spite of the high temperatures." NGRIP contains nearly a centimetre of ice for each year towards the end of the Eemian and the inception of the ice age, enough to reveal air temperature and atmospheric chemistry for each year during that period. Such data will be invaluable for understanding how an ice age starts. "The NGRIP ice core gives us a climate record of unsurpassed detail from high latitudes where the ice sheets start to grow," says Wolff.


More here




MUCH OF GLOBAL WARMING MIGHT BE NATURAL AFTER ALL, SENIOR SCIENTISTS ADMIT

(Excerpts from Quaternary Science Reviews, Volume 24, Issues 20-21, November 2005, Pages 2164-2166)

Climate: past ranges and future changes

By Jan Esper et al.

Abstract

Comparison of large-scale temperature reconstructions over the past millennium reveals agreement on major climatic episodes, but substantial divergence in reconstructed (absolute) temperature amplitude. We here detail several research priorities to overcome this 'amplitude desideratum', and discuss the relevance of this effort for the prediction of future temperature changes and the meaning of the Kyoto protocol.




Persisting controversy (Regalado, 2005) surrounding a pioneering northern hemisphere temperature reconstruction (Mann et al., 1999) indicates the importance of such records to understanding our changing climate. Such reconstructions, combining data from tree rings, documentary evidence and other proxy sources, are key to evaluating natural forcing mechanisms, such as the sun's irradiance or volcanic eruptions, along with those from the widespread release of anthropogenic greenhouse gases since about 1850 during the industrial (and instrumental) period. We here demonstrate that our understanding of the shape of long-term climate fluctuations is better than commonly perceived, but that the absolute amplitude of temperature variations is poorly understood. We argue that knowledge of this amplitude is critical for predicting future trends, and detail four research priorities to resolve this incertitude: (i) reduce calibration uncertainty, (ii) preserve 'colour' in proxy data, (iii) utilize accurate instrumental data, and (iv) update old and develop new proxy data.

When matching existing temperature reconstructions (Jones et al., 1999; Mann et al., 1999; Briffa, 2000; Esper et al., 2002; Moberg et al., 2005) over the past 1000 years, although substantial divergences exist during certain periods, the timeseries display a reasonably coherent picture of major climatic episodes: 'Medieval Warm Period', 'Little Ice Age' and 'Recent Warming' (Fig. 1). However, when calibrated against instrumental temperature records, these same reconstructions splay outwards with temperature amplitudes ranging from 0.4 to 1.0 °C for decadal means (Moberg et al., 2005). Further, a comparison of commonly used regression and scaling approaches shows that the reconstructed absolute amplitudes easily vary by over 0.5 °C, depending on the method and instrumental target chosen (Esper et al., 2005). Overall, amplitude discrepancies are on the order of the total variability estimated over the past millennium, and undoubtedly confuse future modelled temperature trends via parameterisation uncertainties related to inadequately simulated behaviour of past variability......
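
The amplitude sensitivity described above is easy to reproduce with synthetic data: regressing a noisy proxy onto an instrumental target damps the reconstructed variance by the square of the proxy-target correlation, while simple variance scaling preserves it. A minimal Python sketch, with every series randomly generated rather than taken from any real reconstruction:

import numpy as np

rng = np.random.default_rng(0)

# Synthetic "instrumental" target and a noisy proxy of it.
n = 120
t = np.arange(n)
temp = 0.005 * t + 0.3 * np.sin(2 * np.pi * t / 60) + rng.normal(0, 0.1, n)
proxy = temp + rng.normal(0, 0.25, n)  # imperfect proxy, correlation < 1

# (i) Ordinary least-squares regression of the target on the proxy.
slope, intercept = np.polyfit(proxy, temp, 1)
recon_reg = slope * proxy + intercept

# (ii) Variance scaling: match the proxy's mean and standard deviation
# to the target's directly.
recon_scl = (proxy - proxy.mean()) / proxy.std() * temp.std() + temp.mean()

for name, rec in (("regression", recon_reg), ("scaling", recon_scl)):
    print(f"{name:10s} reconstructed std = {rec.std():.2f} C "
          f"(target std = {temp.std():.2f} C)")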

Solutions to reduce calibration uncertainty include the use of pseudo-proxy experiments (Osborn and Briffa, 2004; von Storch et al., 2004) derived from ensemble simulations of different models (Knutti et al., 2002; Stainforth et al., 2005) to test statistical calibration methods, e.g. principal component (Cook et al., 1994) and timescale-dependent (Osborn and Briffa, 2000) regression. Such analyses, however, should mimic the character of empirical proxy data, e.g. the decline of replication (numbers of sites, quality per site) back in time, and the addition of noise typical to empirical proxy data (i.e., not just white; Mann and Rutherford, 2002). Further, reconstructions from areas such as Europe (Luterbacher et al., 2004; Xoplaki et al., 2005), where long instrumental series and high densities of proxy records exist, allow extended calibration periods and increased degrees of freedom enabling the assessment of robust relationships at all timescales (i.e., low and high frequency), both critical to reduce calibration uncertainty. Subsequent comparison of such regional records with hemispheric reconstructions that can be downscaled should provide greater understanding of reconstructed amplitudes at larger spatial scales.....
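
The pseudo-proxy experiments cited above follow a simple recipe: take a 'true' temperature series from a model simulation, degrade it with proxy-like (non-white) noise, calibrate over a short recent window, and check how much of the known pre-instrumental variability the method recovers. A hypothetical Python sketch, using red noise as a stand-in for actual model output:

import numpy as np

rng = np.random.default_rng(42)

def red_noise(n: int, phi: float = 0.7, sigma: float = 1.0) -> np.ndarray:
    """AR(1) series, a stand-in for a model temperature simulation."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.normal(0.0, sigma)
    return x

n_years, n_cal = 1000, 150           # millennium, "instrumental" overlap
truth = red_noise(n_years) * 0.2     # pseudo "model" temperature, deg C

# Degrade: real proxy noise is not white either (Mann and Rutherford, 2002).
pseudo = truth + red_noise(n_years, phi=0.5, sigma=0.3)

# Calibrate only over the final n_cal years, as with real instruments.
cal = slice(n_years - n_cal, n_years)
slope, intercept = np.polyfit(pseudo[cal], truth[cal], 1)
recon = slope * pseudo + intercept

# Verification: how much pre-instrumental variance is recovered?
pre = slice(0, n_years - n_cal)
print(f"true pre-instrumental std : {truth[pre].std():.3f} C")
print(f"reconstructed std         : {recon[pre].std():.3f} C")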

The instrumental target data chosen (Esper et al., 2005), and adjustments made to these data, are also vital to the reconstructed amplitude. A recent analysis of a carefully homogenised instrumental network from the Alps and surrounding areas (Boehm et al., 2001), for example, shows the annual temperature trend over the last ca 110 years to be 1.1 °C, twice that observed over the same alpine gridboxes in the global dataset provided by the Climatic Research Unit (Jones et al., 1999). Such changes in the character of observational data, resulting from homogeneity adjustments and methodology differences (Moberg et al., 2003), directly affect the temperature amplitude in proxy-based reconstructions, since instrumental calibration sets the pulse in these paleorecords (Büntgen et al., 2005). Accurate instrumental data are therefore crucial to the reconstructed amplitude, and this again argues for regional studies where mutual verification between proxy and instrumental records is viable (Frank and Esper, 2005; Wilson et al., 2005).

Finally, more proxy data covering the full millennium and representing the same spatial domain as the instrumental target data (e.g., hemisphere) are required to solve the amplitude puzzle. The current pool of 1000-year long annually resolved temperature proxies is limited to a handful of timeseries, with some of them also portraying differing seasonal (e.g., summer or annual) responses. Furthermore, the strength of many of these local records, and of literally all tree ring chronologies, varies and almost always declines back in time (Cook et al., 2004). The reasons are manifold and include dating uncertainty, loss of signal fidelity in the recent period, assumptions about signal stationarity, reduction of sample replication, etc., and are generally not considered in the uncertainty estimates of combined large-scale reconstructions. Also, the absence of data from the most recent decades in many regional proxy records limits the calibration period length and hinders tests of the behaviour of the proxies under the present 'extreme' temperature conditions. Calibration including the exceptional conditions since the 1990s would, however, be necessary to estimate the robustness of a reconstruction during earlier warm episodes, such as the Medieval Warm Period, and would avoid the need to splice proxy and instrumental records together to derive conclusions about recent warmth.

So, what would it mean, if the reconstructions indicate a larger (Esper et al., 2002; Pollack and Smerdon, 2004; Moberg et al., 2005) or smaller (Jones et al., 1998; Mann et al., 1999) temperature amplitude? We suggest that the former situation, i.e. enhanced variability during pre-industrial times, would result in a redistribution of weight towards the role of natural factors in forcing temperature changes, thereby relatively devaluing the impact of anthropogenic emissions and affecting future predicted scenarios. If that turns out to be the case, agreements such as the Kyoto protocol that intend to reduce emissions of anthropogenic greenhouse gases, would be less effective than thought. This scenario, however, does not question the general mechanism established within the protocol, which we believe is a breakthrough.

(The DOI (permanent) address for the full article above is here)




THE DEATH OF SCIENTIFIC CAUTION

Mooney fails to address past and present liberal manipulations of science. While it is true that most climate scientists believe the Earth is warming, Mooney ignores arguments over the extent and implications of any warming. The Intergovernmental Panel on Climate Change (IPCC) has suggested that the Earth's temperature will rise anywhere between 0.5 and 6.5 degrees Celsius in the next 100 years, but there is little consensus on the precise figure (2). This is important, since a change of less than two degrees Celsius is likely to have negligible or even benign effects, while a change of three degrees Celsius and beyond is likely to be much more destructive.

An obvious reason why there is so little consensus is because modelling climate change is very difficult. Most computer climate simulations, for example, suggest a sharp warming in the low troposphere (the layer of air from just above the Earth's surface to about eight kilometres up) but measurements of the troposphere find less warming than predicted. Such inconsistencies are not unusual and reflect the vast number of variables that can enter a model - clouds, ground temperatures, air pressures, soil moisture, ocean currents, vegetation, population changes, energy consumption, to name a few - and the high degree of uncertainty in the prediction and measurement of these factors. Consequently different models can provide for radically different predictions of future warming, with variable implications for policy.
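
How far parameter choices alone can spread predictions is easy to show with a toy zero-dimensional energy-balance calculation: equilibrium warming for a doubling of CO2 is the radiative forcing divided by an assumed feedback parameter. The 5.35 ln(C/C0) forcing expression is a standard approximation; the feedback values below are illustrative assumptions, not any model's actual output.

import math

# Standard approximation for CO2 radiative forcing: F = 5.35 * ln(C/C0).
F_2XCO2 = 5.35 * math.log(2.0)  # about 3.7 W/m^2 for doubled CO2

# Equilibrium warming = forcing / feedback parameter (W/m^2 per K).
# Plausible-looking feedback choices alone span most of the quoted range.
for lam in (3.2, 1.8, 1.0, 0.6):
    print(f"lambda = {lam:.1f} W/m^2/K -> warming = {F_2XCO2 / lam:.1f} C")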

In the past, scientists might have hedged their predictions and provided caveats that were reasonable if infuriating to their political sponsors. A phenomenon that Mooney does not comment on is the apparent increasing willingness of scientists to abandon uncertainty in pursuit of policy changes that they see as desirable. Promoting environmental protection is seen by many scientists as a necessity that trumps any doubts they may have about their data. Even worse, scientists may engage in alarmism to promote their own field of research - attracting funding, media attention and political influence.

While Mooney is quick to denounce the pernicious influence of the fossil fuel industry he does not consider the financial, ideological and personal interests that may promote opportunism by scientists and their activist or media supporters. There have been multiple examples of scientists and their supporters peddling outlandish theories of disease and disaster (AIDS, SARS, mad cow disease, grey goo destruction, death by sugar and fat, terrorist threats, and so on) that Mooney either ignores or mentions with approval....

More here

***************************************

Many people would like to be kind to others, so Leftists exploit that with their nonsense about equality. Most people want a clean, green environment, so Greenies exploit that by inventing all sorts of far-fetched threats to the environment. But for both, the real motive is to promote themselves as wiser and better than everyone else, regardless of the truth.

Global warming has taken the place of Communism as an absurdity that "liberals" will defend to the death regardless of the evidence showing its folly. Evidence never has mattered to real Leftists.


Comments? Email me here. My Home Page is here or here. For times when blogger.com is playing up, there are mirrors of this site here and here.

*****************************************
