Tuesday, December 05, 2006


The paper below is very technical but focuses on the fact that much time-series data has gaps in it -- and that it is usual to replace the missing data with "estimates". The author shows that this can make the data look like it supports some theory when it does not. The parallel with global warming "models" (which are full of guesswork) is obvious. Excerpts only below.

Comment on "Long-period astronomical forcing of mammal turnover" by van Dam et al., Nature 443:687-691

By Mensur Omerbashich


[1] claimed 2.4-2.5 and 1.0 Myr turnover cycles in a record of Spanish rodent lineages. However, the record's variance spectrum, which is missing from [1], shows that the varying reliability and multiple alterations of the raw (gapped; unaltered) data by [1] unnaturally boosted the 2.5 and 1.0 Myr noise cycles to the 99% confidence level, while failing to recognize a third such "99% significant" noise period, of 0.55 Myr at 5.3 var%. Thus at least one claimed period (of 1.0 Myr) is a simple modulation of a relatively stronger noise cycle (of 0.55 Myr) overlooked by [1]. All the "99% significant" periods reach mere 5-6 var% levels, which can hardly be distinguished from noise: those periods' fidelity is a staggering 1-2 orders of magnitude below the usual signal-noise separation marker at 12.0. Remarkably, at least ten noise periods were boosted to the 95% confidence level, and some five noise periods to near the 95% confidence level, as well. Even the zero padding of just 4% of the data, as done by [1], significantly suppresses the strongest 99% significant period, of 7.28 Myr at 7.5 var% (hence it went unreported). Therefore, the periods claimed are due to strong noise reflection of some intermediary. As hand-waving cyclic-cataclysm claims start to frequent scientific journals, a revision of editorial policies is called for on spectral analyses of inherently gapped long records, and of records composed mostly of natural data of significantly inconsistent reliability.


Studies claiming that cataclysmic events are responsible for cyclic variations in long records of natural data appear occasionally in scientific journals. Due to their obvious sensationalism, such reports are almost certain to attract broader public attention. However, long records of natural data are in many cases inherently gapped, and since they are also long, the information they carry is burdened with various influences from the many different intermediaries that played a role in the creation of the record.

Unfortunately, it is a common approach in the spectral analysis community to simply proceed to edit such records in order to make them fit the (mostly Fourier) spectral analysis algorithms. This means that the original raw (gapped; unaltered) data, and all of the data distributions present, are generally assumed to be entirely understood. One assumption follows another, and many soon come, easily and erroneously, to believe that data "preparation" could not affect the raw data significantly. Consequently, records end up heavily edited and zero-padded, with values invented, trends subtracted, and so on.
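The distorting effect of such "preparation" can be seen with a toy example. The sketch below is illustrative only -- it is not the paper's method, and the series, gap fraction and bin numbers are invented. It builds a clean cyclic series, zeroes out about 4% of its samples to mimic zero-padded gaps, and compares periodogram power before and after: the true peak is suppressed while power leaks into frequencies that carried none.

```python
import cmath
import math
import random

def power(x, k):
    """Periodogram power of the series x at DFT bin k."""
    n = len(x)
    coeff = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
    return abs(coeff) ** 2

n = 1000
# A clean cycle of 50 samples: all spectral power sits in DFT bin 20.
x = [math.sin(2 * math.pi * t / 50) for t in range(n)]

random.seed(0)
gaps = random.sample(range(n), n // 25)   # ~4% of samples "missing"
gapped = list(x)
for t in gaps:
    gapped[t] = 0.0                       # zero-padding the gaps

peak_bin = n // 50                        # the true cycle's bin (20)
print(power(gapped, peak_bin) < power(x, peak_bin))  # peak power is suppressed
print(power(gapped, 37) > power(x, 37))              # power leaks into a noise bin
```

Least-squares spectral methods designed for unevenly sampled records (Lomb-Scargle and related techniques) avoid this by fitting sinusoids only to the samples that actually exist, rather than inventing values for the ones that do not.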

One such recent study of a mammal record from central Spain [1] reported "new periods" allegedly so close to certain astronomical cycles that a claim of cataclysmic causality was immediately laid.

Yet another recent study claimed to have found a new period in a world fossil record [2], which was unlike any other known astronomical cycle, but those authors too made a catastrophic causality claim... In the latter case a closer inspection showed that the cycles claimed by [2] were in fact a byproduct of the data treatment applied therein [3]. I show here that the former study too is biased in the same manner, except that it produces a result that is coincidentally (by way of a noise intermediary) close to a known astronomical period. The notion of wrong data treatment stands, since both of those studies claimed cataclysmic disappearance / reappearance of genera / species without seriously addressing the characteristics of the spectral analysis technique or its applicability to the data of interest.



Experience teaches us that even a single catastrophic event claim, when promoted in the media, could result in havoc. It seems inevitable that researchers could soon start making "predictions" out of the many reports alleging some "99% certain" past recurring cataclysms. As the hand-waving cyclic-cataclysm reports start to frequent scientific journals, a revision of editorial policies is called for in cases of spectral analyses of long and inherently gapped records of natural data or, more generally, of records containing natural data most of which have significantly varying reliability. Such revisions could entail approaches presently unthinkable when refereeing those reports, such as imposing mandatory blind test(s) using synthetic data, and/or repetitions of critical computations, and/or testing results (using independent methods) for spectrum distortion due to data manipulations, and so on. Although seemingly elementary, such measures could prevent researchers from missing the bigger picture as they get bogged down in technicalities.

Spectral analysis is at least as much art as it is science, requiring various choices to be made prior to feeding the data into an algorithm. The proper choice of analysis technique (algorithm) must be not the first but the last of those choices. It should be preceded by considerations such as selection of the criteria for the data-treatment approaches to be used. (One fundamental such criterion, the use of raw data, was applied here.)

Hopefully, this could prevent blunders like [1] from ever entering print. The primary goal of scientific analyses of physical time series should be to responsibly produce publicly useful information on cyclic natural phenomena, so that science is not undermined by the ill repute of failed cyclic claims. This call for action coincides in time with the challenges humankind faces in taking full responsibility (instead of accusing nature and divine acts alike) for harming life and the environment on Earth.



Cattle-rearing generates more global warming greenhouse gases, as measured in CO2 equivalent, than transportation, according to a new United Nations report released on Thursday. "Livestock are one of the most significant contributors to today's most serious environmental problems," senior UN Food and Agriculture Organization (FAO) official Henning Steinfeld was quoted by the Ghana News Agency as saying on Thursday. "Urgent action is required to remedy the situation," he said in a statement released by the UN Information Centre in Accra. According to the report, smarter production methods, including improved animal diets to reduce enteric fermentation and consequent methane emissions, are urgently needed.

Cattle-rearing is also a major source of land and water degradation, according to the FAO report, Livestock's Long Shadow - Environmental Issues and Options, of which Steinfeld is the senior author. "The environmental costs per unit of livestock production must be cut by one half, just to avoid the level of damage worsening beyond its present level," it warns.

When emissions from land use and land use change are included, the livestock sector accounts for 9 percent of CO2 deriving from human-related activities, but produces a much larger share of even more harmful greenhouse gases. It generates 65 percent of human-related nitrous oxide, which has 296 times the Global Warming Potential (GWP) of CO2.

The report said the livestock sector also accounts for 37 percent of all human-induced methane (with 23 times the warming potential of CO2), which is largely produced by the digestive system of ruminants, and 64 percent of ammonia, which contributes significantly to acid rain. "With increased prosperity, people are consuming more meat and dairy products every year," the report notes. Global meat production is projected to more than double from 229 million tons in 1999/2001 to 465 million tons in 2050, while milk output is set to climb from 580 million to 1,043 million tons.



They are at last admitting that the "hockey stick" picture of temperature stability before the 20th century is wrong -- but only because they think they can find an explanation for one of the earlier cold periods that does not upset their theories too much. The most honest sentence in the article, however, is the last one.

The Gulf Stream - the ocean current that helps to bring warm weather to much of the North Atlantic region - was significantly weakened during the period known to historians as the Little Ice Age, new research reveals. The discovery supports the notion that a slowing of ocean currents - as some fear might happen in our future - can have significant consequences for climate.

From around 1200 until 1850, a period during which average temperatures across the Northern Hemisphere dipped by around 1 °C, the strength of the Gulf Stream also slackened by up to 10%, oceanographers report. The Gulf Stream, which is part of a vast pattern of currents nicknamed the ocean conveyor belt, carries warm surface waters from the tropical Atlantic northeastwards towards Europe. The reduced flow that occurred during medieval times would have transported less heat, contributing to the icy conditions that persisted until Victorian times.

"This gives us some sense of the natural range in strength. If the change is greater in the future then maybe that will mean something unusual is happening," says David Lund of the California Institute of Technology in Pasadena, who led the research while based at the Massachusetts Institute of Technology and the Woods Hole Oceanographic Institution.

Weakened waters

A weakening Gulf Stream has been predicted to have dire consequences for temperate climates in the Northern Hemisphere. But oceanographers say that it is very unlikely to shut down, as depicted in the Hollywood blockbuster The Day After Tomorrow. "That's definitely an absurdity," Lund says.

But the new research by Lund's group shows what can happen if the Gulf Stream is weakened. He and his colleagues studied the remains of tiny animals called foraminifera in sediments off the coast of Florida, where currents feed into the Gulf Stream. Changes in the composition of oxygen isotopes in their shells reflect changes in water temperature and salinity, which in turn reveals the density of the water they were living in.

Mapping the water density between Florida and the Bahamas gives the researchers a picture of how fast the current was moving between them. Lund and his team report their results, which extend back some 1,000 years, in this week's Nature.

Fresh or salty

Lund and his colleagues think that the Gulf Stream's weakening was caused by a southward shift of the zone of tropical rains that usually feed fresh water into the Atlantic Ocean off the coast of Florida. This rain provides a less-dense top layer of water that bolsters the surface current flowing north. Their measurements show that, during times when the current was weakest, the waters were saltier, suggesting that they contained less fresh water from rain.

A current slowed in this way can fix itself, however: the extra saltiness of the water should help the water to sink at the northern end of its cycle, Lund says, driving the bottom half of the ocean circulation and re-energizing the current.

This process is in contrast to current fears about the Gulf Stream. Climatologists are worried that continued melting of the Greenland ice sheet could dump too much fresh water into the northern end of the circulation system, where cold waters normally sink and drive the bottom half of the current - dense waters flowing south along the ocean floor. Too much fresh water in the north makes the water less dense and less likely to sink, slowing the current.

Some fear that this process would not fix itself, but rather lead to a runaway effect that slows the current even more severely. Researchers measuring the ocean currents today say that the Gulf Stream shows no clear signs of slowing. Last month, a scientific meeting on the issue resulted in media reports that the Gulf Stream had shut down completely for 10 days in 2004. But as Harry Bryden of the University of Southampton, who led that study, explains, the temporary shutdown actually occurred in deeper currents that form just part of the complex circulation system. The Gulf Stream, he says, was unaffected. "The Gulf Stream seems rather robust to us," he adds.

But big changes could lie in our future. "Now, with the amount of carbon dioxide in the atmosphere, we're in a 'no analogue' situation," Lund says. With the world warming and the poles melting, it's impossible to say what might happen to the currents.

"We just don't know."


The relevant journal abstract follows:

Gulf Stream density structure and transport during the past millennium

By David C. Lund et al

The Gulf Stream transports approximately 31 Sv (1 Sv = 10^6 m^3 s^-1) of water [1,2] and 1.3 x 10^15 W of heat [3] into the North Atlantic Ocean. The possibility of abrupt changes in Gulf Stream heat transport is one of the key uncertainties in predictions of climate change for the coming centuries. Given the limited length of the instrumental record, our knowledge of Gulf Stream behaviour on long timescales must rely heavily on information from geologic archives. Here we use foraminifera from a suite of high-resolution sediment cores in the Florida Straits to show that the cross-current density gradient and vertical current shear of the Gulf Stream were systematically lower during the Little Ice Age (AD 1200 to 1850). We also estimate that Little Ice Age volume transport was ten per cent weaker than today's. The timing of reduced flow is consistent with temperature minima in several palaeoclimate records [4-9], implying that diminished oceanic heat transport may have contributed to Little Ice Age cooling in the North Atlantic. The interval of low flow also coincides with anomalously high Gulf Stream surface salinity [10], suggesting a tight linkage between the Atlantic Ocean circulation and hydrologic cycle during the past millennium.



A slowing of the Gulf Stream--the Atlantic Ocean's massive warm-water current--may have been responsible for a minor ice age that occurred between 1200 and 1850 C.E. If true, the finding could have implications for tracking future climate change in the northern hemisphere.

Ocean currents can influence weather on a continental scale. Witness the impact of El Niño, the building up of warm water in the western Pacific Ocean, which causes droughts and severe storms across North and South America. Similar effects can happen with the Gulf Stream, which carries tropical waters from the southeastern United States to Scandinavia--and thereby provides western Europe with a more temperate climate than its latitude would justify. A team of scientists hoped to get a better handle on the Gulf Stream's climatic influence by studying its history during the Little Ice Age. Between 1200 and 1800 C.E., average temperatures in Europe dropped about 4° Celsius.

It turns out that as temperatures chilled in Europe, the Gulf Stream decelerated. The team, led by David Lund, now at the California Institute of Technology, came to this conclusion by analyzing ocean sediment cores going back 1000 years from two widely separated sites in the Florida Straits, where the Gulf Stream originates. In particular, the researchers charted the chemical composition of foraminifera, microscopic creatures whose fossilized shells contain evidence of salinity. From the shells of the forams, as they are called, Lund's team deduced a spike in salinity at the water's surface, suggesting cooler temperatures and a slower current. The team's calculations, reported tomorrow in Nature, indicate that the Gulf Stream slowed by about 10% at just about the time the Little Ice Age began, and resumed its current speed around 1850.

Not everyone is convinced. Some scientists have suggested the core-sample data aren't precise enough. Part of the reason is continuing uncertainty about the entire North Atlantic circulation system itself (ScienceNOW, 17 November). Further skepticism comes from oceanographer Carl Wunsch at MIT in Cambridge, Massachusetts, who thinks the researchers are overinterpreting their data. "There are many problems," he says. For example, it is an "unjustified inference" that a weakened Gulf Stream implies less heat being transported northward, leading to a colder Europe.



The European Commission imposed swingeing cuts in permitted carbon emissions by industry yesterday, provoking a storm of protest from European governments and warnings that the cost of Europe's emissions trading system (ETS) would drive up electricity prices. Brussels has drawn a red line through the national allocation plans of nine EU governments and demanded overall cuts of 7 per cent in annual greenhouse gas emissions in the second phase of ETS, from 2008 to 2012.

Germany has been forced to cut its carbon cap from 482 million to 453 million tonnes, a reduction that Michael Glos, the Economy Minister, judged "totally unacceptable". Latvia and Lithuania have seen their proposed allocations halved. Greece, Ireland, Malta, Slovakia and Sweden have also suffered big cuts. Of the ten national allocation plans considered, only Britain's passed muster. The UK offered to cap its emissions at 246 million tonnes.

A spokesman for the Commission said that there was no appeal against its ruling. The crackdown on carbon follows the ETS's disastrous first year, in which the price of carbon permits plummeted because of massive over-allocation of permits by EU governments. Under the scheme, governments are expected to allocate carbon permits to firms such that emissions are capped at a level that creates a shortage of permits and an incentive to reduce emissions. Companies that emit less than their allocation of carbon can sell permits to polluters.
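The incentive mechanics the scheme relies on can be sketched with hypothetical figures -- the firm names, permit price and tonnages below are invented for illustration, not taken from the article. A firm emitting under its cap holds surplus permits it can sell, while an over-emitter must buy them, so every avoided tonne is worth the going permit price.

```python
# Illustrative cap-and-trade arithmetic (all numbers hypothetical).
PERMIT_PRICE = 15.0  # assumed price per tonne of CO2, in euros

firms = {
    # name: (permits allocated in tonnes, actual emissions in tonnes)
    "CleanCo": (100_000, 80_000),   # emits under its cap
    "DirtyCo": (100_000, 130_000),  # emits over its cap
}

cash_flow = {}
for name, (cap, emitted) in firms.items():
    surplus = cap - emitted                  # positive = spare permits to sell
    cash_flow[name] = surplus * PERMIT_PRICE  # revenue if positive, cost if negative
    print(f"{name}: {surplus:+,} t surplus -> {cash_flow[name]:+,.0f} EUR")
```

At the assumed 15 euros per tonne, CleanCo's 20,000-tonne surplus is worth 300,000 euros while DirtyCo faces a 450,000-euro bill -- the "shortage of permits" the article describes is precisely what keeps that price, and hence the incentive, above zero.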

Fearing a humiliating outcome, France withdrew its national allocation plan on Monday after an emergency meeting between Nelly Olin, the Environment Minister, and Stavros Dimas, the European Environment Commissioner. Mr Dimas said: "Today's decision sends a strong signal that Europe is fully committed to achieving the Kyoto target and making the EU ETS a success."

However, German utilities complained that the tighter caps on carbon dioxide could hinder new power generation and Latvia's Environment Minister said that the ruling was "far too tight for us to fit into".

The ETS has come under fire for allowing free allocations of carbon permits instead of a sale by auction. Power generators have enjoyed billions of pounds in windfall profits because electricity suppliers are entitled to pass on the cost of carbon to electricity consumers, despite having received their carbon permits at nil cost.

The windfall profits are deterring investment in cleaner power generation and encouraging utilities in Germany to continue operating power stations that use dirty coal, argues Centrica, the British utility.

David Miliband, the UK Environment Secretary, said that the Commission's decision was good news. "The EU has a responsibility to ensure scarcity in the carbon market and a sustainable price of carbon," he said.

The Commission is assessing a further eight national plans and has taken out infringement proceedings against six states - Austria, the Czech Republic, Denmark, Hungary, Italy and Spain - for failing to submit their plans.



Many people would like to be kind to others so Leftists exploit that with their nonsense about equality. Most people want a clean, green environment so Greenies exploit that by inventing all sorts of far-fetched threats to the environment. But for both, the real motive is to promote themselves as wiser and better than everyone else, truth regardless.

Global warming has taken the place of Communism as an absurdity that "liberals" will defend to the death regardless of the evidence showing its folly. Evidence has never mattered to real Leftists.


