Megafauna demise is traced to the arrival of Aboriginal people in Australia around 50,000 years ago. That has always been the obvious conclusion, but scientists have resisted it because of the Green/Left romanticization of native peoples. Any alternative to our hated modern world is to be preferred, even if you have to make most of it up.
Debate has raged about the demise of the "whopper hopper" P. goliah. A fossil study of the extinct giant kangaroo has added weight to the theory that humans were responsible for the demise of the "megafauna" 46,000 years ago. The decline of food plants through widespread fire, or a shift toward an arid climate, has also figured in the debate about the animals' demise. But an analysis of kangaroo fossils suggests they ate saltbush, which would have thrived in those conditions. The research is published in Proceedings of the National Academy of Sciences.
There has long been dissent in the palaeontology community about the cause of the worldwide extinctions that followed the end of the last ice age. Central to the debate has been the demise of the Australian megafauna, including animals such as marsupial lions, hippopotamus-sized wombats and the 2m-tall giant kangaroo Procoptodon goliah.
Last year, researchers dated fossils from Tasmania with the best precision yet, finding that many species survived more than 2,000 years after the arrival of humans. The researchers concluded that the megafauna eventually met their end due to hunting.
Now, researchers from Australia and the US have combined radiocarbon dating with a so-called microwear analysis of the teeth of P. goliah to determine what it ate and drank. Different sources of water and food leave trace amounts of particular types, or isotopes, of hydrogen and carbon atoms, which are deposited in the teeth like a recorded diet. Additionally, tiny patterns of wear give clues about the type of food a given creature chewed. The team concluded that the giant kangaroos fed mainly on saltbush shrubs.
Because fire does not propagate well among saltbush, and because the plant thrives in an arid climate, the case for two of the three potential extinction causes is weakened. The evidence therefore suggests that P. goliah was hunted to extinction.
However, it is just one of many species whose disappearance fuels the debate, and there is much more work to be done before this can be considered definitive proof. "I'm a little hesitant to make a big conclusion," said co-author of the study Larisa DeSantis of the University of Florida. "What's really exciting is that this is one of the first instances where we've been able to use both isotopes and the microwear method to identify this very unique diet," she told BBC News. Dr DeSantis said that she was pursuing a similar analysis of megafauna fossils in other regions of Australia.
"This study neatly ties up several loose threads in the long-running extinction debate," said Richard Roberts of the University of Wollongong in Australia. "By independently reaching the same conclusion for two very different environments - the mountainous rainforests of Tasmania and the dry rangelands of inland Australia - the mystery is no longer whether humans were ultimately responsible for the disappearance of the giant marsupials, but how they did it."
SOURCE
Hansen the political activist
Any pretense that he is an unbiased scientist is clearly disproved by his own actions -- denials notwithstanding. He is basically just a childish attention-seeker. When, as a little kid, he said "Mommy, look at me", his mother probably did not look.
James E. Hansen, the NASA climate scientist who has become an outspoken campaigner against coal burning, was among 31 protesters arrested on charges of obstructing officers and impeding traffic during a protest against mountaintop mining. They had initially sought to enter the grounds of a facility run by Massey Energy, the biggest company conducting mountaintop mining in West Virginia. But several hundred miners and relatives, along with supporters of the coal industry, blocked the entrance, according to the Charleston Gazette.
The protesters included Ken Hechler, 94, a former congressman, the actress Daryl Hannah and the executive director of the Rainforest Action Network. (More photos from the day’s events were taken by Antrim Caskey.)
In a statement distributed by the Rainforest Action Network, Dr. Hansen said:
I am not a politician; I am a scientist and a citizen. Politicians may have to advocate for halfway measures if they choose. But it is our responsibility to make sure our representatives feel the full force of citizens who speak for what is right, not what is politically expedient. Mountaintop removal, providing only a small fraction of our energy, should be abolished.
Dr. Hansen has said for years that growing reliance on coal, far more so than oil, is the biggest threat to the global climate. As a result, he has strongly criticized the climate bill that is facing a vote by the full House of Representatives on Friday. He cites studies concluding that various provisions would allow expanded coal use in coming decades despite an overall cap on emissions of carbon dioxide. In a profile of Dr. Hansen by Elizabeth Kolbert in the current issue of the New Yorker (subscription required), she pressed him on his stance:
Dr. Hansen pointed out that the bill explicitly allows for the construction of new coal plants and predicted that it would, if passed, prove close to meaningless. He said that he thought it would probably be best if the bill failed, so that Congress could “come back and do it more sensibly”.
I said that if the bill failed I thought it was more likely Congress would let the issue drop, and that was one reason most of the country’s major environmental groups were backing it. “This is just stupidity on the part of environmental organizations in Washington,” Dr. Hansen said. “The fact that some of these organizations have become part of the Washington ‘go along, get along’ establishment is very unfortunate.”
Dr. Hansen has pushed far beyond the boundaries of the conventional role of scientists, particularly government scientists, in the environmental policy debate.
SOURCE
ANALYSIS: CAP-AND-TRADE WAR
Despite indications that much of President Obama's agenda is meeting intra-party skepticism all over Capitol Hill, there is one policy nexus where congressional leaders are still doggedly determined to move the country left: energy and the environment. Speaker Pelosi will reportedly allow a vote on the controversial Waxman-Markey "cap-and-trade" legislation at the end of this week.
And it gets even better. Not content to tempt political fate by imposing huge carbon taxes on the American middle class, Democrats have added a provision which imposes stiff tariffs on our trading partners if they don't adopt aggressive carbon restrictions of their own.
You heard correctly: progressives have authored a bill that earns the mortal enmity of domestic energy consumers and our most crucial trading partners at the same time. Economy-killing climate policies and a trade war - together at last!
What happened is this: An early draft of Waxman-Markey already contained triggers that gave the president the choice to introduce carbon tariffs if jobs and industry "leak" overseas to countries that don't constrain emissions so dramatically. (China and India come to mind.) The original version empowered the president to impose the carbon-linked tariffs beginning in 2025.
But though the language is not public yet, the House Ways and Means Committee is reportedly considering provisions that will give extra comfort to protectionists. Leaks from Hill offices indicate that the president would now be forced to impose the carbon tariffs - and could only opt out of doing so with permission from both chambers of Congress. Carbon-intensive imports would be subject to penalties at the border unless the country of origin requires emission reduction measures at least 80 percent as costly as ours. (The original Waxman-Markey bill had a threshold of 60 percent.)
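As described, the reported provision reduces to a simple threshold test. Here is a minimal sketch in Python; since the actual bill language is not yet public, all names and the cost normalization are purely illustrative:

```python
# Hypothetical sketch of the reported carbon-tariff trigger; the bill
# text is not public, so names and figures here are illustrative only.
US_COST = 1.0        # cost of US emission-reduction measures (normalized)
THRESHOLD = 0.80     # reported Ways and Means threshold (up from 0.60)

def tariff_applies(origin_cost: float) -> bool:
    """Border penalties apply unless the exporting country's measures
    are at least 80 percent as costly as the US measures."""
    return origin_cost < THRESHOLD * US_COST

print(tariff_applies(0.5))   # measures half as costly as ours -> True
print(tariff_applies(0.9))   # measures 90% as costly -> False
```

Note what the reported change does: a country whose measures are 70 percent as costly as ours would have escaped the tariff under the original 60 percent threshold, but is caught by the 80 percent version.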
Unfortunately for the amendment's authors, World Trade Organization rules make fairly clear that trade-limiting measures imposed to protect the environment must actually serve that purpose, not address adverse competitiveness effects on domestic industry. Break that connection between measure and purpose, and you've got yourself a problem. The result could be litigation, retaliatory tariffs, or both. Does anyone really expect China to stand idly by in 2025 as its trade is embargoed?
And just for the sake of discussion, exactly how much global warming will be prevented by this assurance of future trade turmoil? Well, let's use the federal government's own model which - we are not making this up - is called MAGICC (Model for the Assessment of Greenhouse-gas Induced Climate Change). It comes from the National Center for Atmospheric Research in Boulder, Colorado.
Let's compare the effects of Waxman-Markey to the United Nations' "business-as-usual" emissions scenario that's in their big 2007 climate change compendium. If the U.S. only adopts Waxman-Markey, global warming would be reduced by a grand total of 0.2°F by 2100. This is too small to even detect, because global temperatures bounce around by about this amount every year. For those who like to think more near-term, the amount of warming prevented by 2050 would be 0.07 of a degree.
According to the UN, without Waxman-Markey the warming from 1990 to 2050 would be 2.8°F, and 5.3° by 2100. (Of course, observed warming since 1990 is running about 40 percent below the expected rate, largely because there hasn't been any net warming since the very warm year of 1998.)
Now, let's be completely unrealistic and assume that every nation that has "obligations" under the (failed) Kyoto Protocol cuts emissions as much as we do. Then the saved warming balloons all the way to 0.14°F by 2050 and 0.4° by 2100, or 5 and 7 percent, respectively, of the "business-as-usual" total.
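The percentages quoted above follow directly from the article's own figures; a quick check (numbers taken from the preceding paragraphs, in degrees Fahrenheit):

```python
# UN "business-as-usual" warming quoted above, and the warming avoided
# if every Kyoto-obligated nation cut emissions as much as the US.
baseline = {"2050": 2.8, "2100": 5.3}
saved = {"2050": 0.14, "2100": 0.4}

for year in ("2050", "2100"):
    pct = 100 * saved[year] / baseline[year]
    print(f"{year}: {pct:.1f}% of business-as-usual warming avoided")
```

This gives 5.0 percent for 2050 and about 7.5 percent for 2100, which the article quotes as 5 and 7 percent.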
Let's add it all up. We don't do anything measurable to reduce global warming, we alienate some of our biggest trade partners, we risk a trade war, and Americans are allowed to emit the same carbon volumes as the average citizen did in 1867. What's not to hate?
All of which explains why Waxman-Markey is being rushed to the floor. If people find out what is really in it, how risky it is and how small the purported benefits, it is hard to believe that it will pass.
SOURCE
GREENIE HEARTBURN OVER WAXMAN-MARKEY
As the Waxman-Markey Climate Bill nears a vote in the U.S. House of Representatives, environmental groups are "teetering at the edge of existential crisis," writes Josh Harkinson. "Almost all environmental groups agree that Waxman-Markey is far from ideal," but some are supporting it, while others "believe the bill is so deeply flawed it might actually make matters worse." Critics say the bill "lines the pockets of polluters with little to show for it. The most it would cut carbon emissions by 2020 is 17 percent below 1990 levels, nowhere near the 25 to 40 percent reduction sought by scientists and international climate negotiators."
Other concerns are that the bill may decrease clean energy production, as it would overrule higher renewable mandates in states like California; it would strip the Environmental Protection Agency of its ability to regulate carbon dioxide emissions from coal plants; and it would auction just 15 percent of emissions permits, giving a whopping 50 percent "to the fossil fuel industry for free."
Some environmentalists blame the United States Climate Action Partnership, "a coalition of industry and moderate environmental groups," for sticking with a "quietly hammered out" agreement developed during the Bush administration. Others criticize President Obama, "who spoke out in favor of auctioning off pollution permits during his campaign ... but is now thought likely to sign whatever bill crosses his desk." Meanwhile, the industry front group Cooler Heads Coalition is planning efforts to oppose the bill, with "scientific skeptics and legislative critics," reports Greenwire.
SOURCE
BRITAIN'S MYSTIC MET OFFICE PREDICTS NEIGHBOURHOOD THERMAGEDDON
On Thursday, the Met Office launched its new report on global warming: UK Climate Projections 2009, otherwise known as UKCP09. This is based on the output of Hadley Centre climate models that predict temperature increases of up to 6°C, with wetter winters, drier summers, more heatwaves, rising sea levels, more floods and all the other catastrophes that one would expect from similar exercises in alarmism.
What makes this report different from any of its predecessors is the resolution of the predictions that the Met Office is making. They are not just presenting a general impression of what might happen globally during this century, or even how climate change could affect the UK as a whole. They are claiming that they can predict what will happen in individual regions of the country - down to a 25km square. You can enter your postcode and find out how your street will be affected by global warming in 2040 or 2080.
All this is rather unexpected. In May last year, I posted here and here about a world summit of climate modellers that took place at Reading University. On the agenda was one very important problem for them: even the most powerful supercomputers developed so far are not capable of running the kind of high-resolution models that they claim would reduce the uncertainty in their predictions, and that would also make the detailed regional predictions that policy makers would like to have so that they can build climate change into infrastructure planning.
Here are a couple of excerpts from the conference website:
The climate modelling community is therefore faced with a major new challenge: Is the current generation of climate models adequate to provide societies with accurate and reliable predictions of regional climate change, including the statistics of extreme events and high impact weather, which are required for global and local adaptation strategies? It is in this context that the World Climate Research Program (WCRP) and the World Weather Research Programme (WWRP) asked the WCRP Modelling Panel (WMP) and a small group of scientists to review the current state of modelling, and to suggest a strategy for seamless prediction of weather and climate from days to centuries for the benefit of and value to society.
A major conclusion of the group was that regional projections from the current generation of climate models were sufficiently uncertain to compromise this goal of providing society with reliable predictions of regional climate change.
Modellers also fretted that the GCMs, or General Circulation Models, were blunt instruments.
Current generation climate models have serious limitations in simulating regional features, for example, rainfall, mid-latitude storms, organized tropical convection, ocean mixing, and ecosystem dynamics. What is the scientific strategy to improve the fidelity of climate models?
This was summed up by Julia Slingo (at that time Professor of Meteorology at Reading University, who also chaired part of the conference) in a report by Roger Harrabin on the BBC News website:
So far modellers have failed to narrow the total bands of uncertainties since the first report of the Intergovernmental Panel on Climate Change (IPCC) in 1990.
And Julia Slingo from Reading University admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present. "We've reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require," she told BBC News. "In terms of computing power, it's proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level... we know how to do that, but we don't have the computing power to deliver it."
Professor Slingo said several hundred million pounds of investment were needed. "In terms of re-building something like the Thames Barrier, that would cost billions; it's a small fraction of that. And it would allow us to tell the policymakers that they need to build the barrier in the next 30 years, or maybe that they don't need to."
If, since the conference, several hundred million pounds had been invested in producing a new generation of supercomputers, a thousand times more powerful than the present generation, and the Met Office had already developed and run the kind of high-resolution models that were so far beyond the scientists' grasp just a year ago, then I suspect that this might have seeped into the media and we would have heard about it. So far as I am aware, the fastest supercomputers are still a thousand times slower than the modellers consider necessary for credible regional-scale modelling of the climate.
So I wondered whether Professor Slingo had anything to say about the Met Office's new report. In fact, she did:
"Through UKCP09 [UK Climate Projections 2009] the Met Office has provided the world's most comprehensive regional climate projections with a unique assessment of the possible changes to our climate through the rest of this century. "For the first time businesses and other organisations have the tools to help them make risk-based decisions to adapt to the challenges of our changing climate." Slingo confidently explained the 'breakthrough' to Bloomberg. "We can attach levels of certainty," she said.
So what’s changed since last year? Well, one thing has: Julia Slingo has a new job. She has been appointed Chief Scientist at the Met Office. So far as I know, the limitations that lack of computing power places on the accuracy and resolution of models are just the same.
During a rather bad-tempered interview on Thursday evening’s Newsnight, Kirsty Wark asked Hilary Benn, the UK Environment Secretary, why local authorities were being told to use the Met Office predictions as a template for infrastructure planning when their report had not been peer reviewed and the authors had postponed publication of information about the methodology that they had used. She also told him that there was considerable concern among other climate scientists about the Met Office’s research.
Myles Allen made an appearance on the programme warning that local authorities should be very wary about planning infrastructure projects on the basis of climate models unless they were sure that the science was robust. Mr Benn parroted the usual mantras without addressing the questions, and looked as though he would have much preferred to be elsewhere.
SOURCE
MEDITERRANEAN SEA 'NOT WARMING'
17 JUN 2009. From the ongoing OGS conference on Observational Oceanography in Trieste, Italy - Rome, 17 June (Apcom) - No warming processes appear to be under way in the waters of the Mediterranean. That is one of the preliminary results obtained under MedArgo, the "sister project" coordinated by OGS [the Italian National Institute of Oceanography and Experimental Geophysics].
MedArgo deals specifically with the Mediterranean Sea and surrounding countries and is part of EuroArgo, the European component of the international Argo project.
Argo's objective is an intensive analysis of the seas to determine the impacts of climate change and global warming on the waters of our planet and, consequently, on its ecosystems. That is why 60 European scientists are comparing data and knowledge at the Second EuroArgo Conference on Observational Oceanography, being held in Trieste and organized by OGS.
In order to study the chemical and physical parameters of seawater, OGS uses special instruments called profiling floats: battery-powered cylindrical tubes released into sea currents. The devices last between three and four years and collect 150-200 profiles before being abandoned.
"These instruments - says Pierre-Marie Poulain, Head of the Remote Sensing Group at OGS and coordinator of MedArgo - go down to an average depth of 350 meters and remain there for five days. Then they do a quick foray to 2,000 meters and come back up, measuring the physical parameters of the water column and transmitting the data via satellite. Everything is done in real time: the data arrives at research centers, scattered throughout the world, where it is processed, managed and disseminated to the community of scientists."
At present, there are around 3,000 profilers worldwide, spaced about 300 kilometers apart. In European seas there are 800 profilers, 23 of them in the Mediterranean Sea, with the objective of bringing that total to 30 for complete coverage of the basin.
As well as coordinating the launching of the profilers, OGS is also involved in collecting the data recorded on the characteristics of currents, temperature and salinity. The researchers from Trieste are, in fact, among the few with the oceanographical skills needed to perform the necessary quality control.
MedArgo so far has collected a series of data that illustrate what is happening in the Mediterranean. "The Mediterranean current - adds Poulain - is an important engine of the local circulation, because it influences all motions of this enclosed sea. On the basis of information gathered so far, all we can anticipate is that at the moment there are no processes warming the waters. But we will have more details only at the end of the project, with the final data in hand."
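As a back-of-envelope check, the lifetimes (3-4 years) and profile counts (150-200) quoted above imply a full cycle somewhat longer than the five-day park that Poulain describes, which is plausible once the deep excursion, the ascent and the satellite transmission are added in:

```python
# Implied cycle length from the quoted float lifetimes and profile counts
# (pairing the shortest life with the most profiles, and vice versa,
# to bracket the range).
DAYS_PER_YEAR = 365
for years, profiles in ((3, 200), (4, 150)):
    days_per_cycle = years * DAYS_PER_YEAR / profiles
    print(f"{years} yr, {profiles} profiles -> {days_per_cycle:.1f} days/cycle")
```

The implied 5.5 to 9.7 days per cycle brackets the stated five-day park time plus the time spent diving, profiling and transmitting, so the quoted figures hang together.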
SOURCE
***************************************
For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, SOCIALIZED MEDICINE, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For readers in China or for times when blogger.com is playing up, there is a mirror of this site here.
*****************************************