An editorial by Greenie heretic Mark Lynas published in the August 2011 edition of Nature Climate Change is highly critical of the IPCC's use of non-peer-reviewed "grey" literature [such as propaganda from Greenpeace] and notes that "a Greenpeace campaigner was put in charge of reviewing and highlighting his own work within Working Group III...", leading to the embarrassing and widely debunked claim that 80% of the world's energy could be supplied by renewable energy by 2050.
A second editorial in the same issue states regarding this same conflict of interest, "For a body that represents the state of understanding on one of the most complex and important issues of our time, repeating previously acknowledged mistakes is completely unacceptable."
That such a Green organ as "Nature" has become a platform for Mark Lynas is undoubtedly a big retreat.
The full article is behind a paywall, but the graphic of the first article below gives you an idea of its content.
SOURCE (Edited)
Polar bear scare appears to have been a fraud
A FEDERAL wildlife biologist whose observations of apparently drowned polar bears in the Arctic helped galvanise the global warming movement seven years ago has been placed on administrative leave as officials investigate scientific misconduct allegations.
Although it wasn't clear what the exact allegations were, a government watchdog group representing Anchorage-based scientist Charles Monnett said investigators have focused on his 2006 journal article about the bears, which garnered worldwide attention.
The group, Public Employees for Environmental Responsibility, filed a complaint on Mr Monnett's behalf on Thursday with the agency, the US Bureau of Ocean Energy Management, Regulation and Enforcement. BOEMRE told Mr Monnett on July 18 that he was being put on leave, pending an investigation into "integrity issues".
The investigator has not yet told him of the specific charges or questions related to the scientific integrity of his work, said Jeff Ruch, the watchdog group's executive director. A BOEMRE spokeswoman acknowledged there was an "ongoing internal investigation" but declined to get into specifics.
The complaint seeks Mr Monnett's reinstatement and a public apology from the agency and inspector general, whose office is conducting the probe. The group's filing also seeks to have the investigation dropped or, failing that, to have the charges specified and the matter handled quickly and fairly, as the Obama administration's scientific integrity policy requires.
Mr Monnett, who has coordinated much of BOEMRE's research on Arctic wildlife and ecology, has duties that include managing about $US50 million worth of studies, according to the complaint. The agency spokeswoman said other agency scientists would manage the studies in Mr Monnett's absence.
According to documents provided by Ruch's group, which sat in on investigators' interviews with Mr Monnett, the questioning focused on observations that Monnett and fellow researcher Jeffrey Gleason made in 2004.
SOURCE
Deeds speak louder than words: Finland and Russia to modernise their icebreaker fleets
After last winter in the Baltic you can understand why:
More than 60 vessels stuck in ice in Gulf of Finland
"The Baltic Sea currently has the most extensive ice cover that it has seen in 24 years. On Thursday, 310,000 square kilometres of the sea were covered in ice, and the area is growing.
After the severe winter of 1987, the ice cover in the Baltic extended to nearly 400,000 square kilometres, which in practical terms means that the entire surface of the Baltic was effectively covered in ice. The entire sea has been completely frozen over just 20 times since 1720."
The global warming alarmists do not seem to have convinced the Finnish and Russian governments about ever-warmer winters. Finland's new government has decided to modernise its icebreaker fleet:
The programme of the new government contains a pledge to undertake replacement of outdated icebreakers. One matter under consideration is who will own the vessels. The present fleet is owned by Arctia Shipping, a state-owned company.
Finland has five traditional-type icebreakers and three multi-purpose icebreakers. The oldest is the Voima, which has been in service since 1954. The most powerful icebreakers, the Urho and the Sisu, have been in service for 35 years, and the most recent traditional icebreaker has seen 25 years of service.
The three multi-purpose icebreakers were constructed in the 1990s.
According to Ilmari Aro, an expert on winter shipping at the Transport Ministry, the Voima is to be replaced during the term of the present government. Arctia Shipping's CEO Vauraste says that the rest of the fleet is to be replaced by around 2020. The lifetime of the vessels can be extended with investment in repairs and maintenance.
A new traditional icebreaker carries a price tag of around 100 million euros. Multi-purpose icebreakers are a bit more expensive.
Icebreakers and their services are important because of Finland's heavy reliance on maritime shipping for exports and imports. Estonia and Finland are the only countries in the world where all of the nation's ports freeze over in the winter.
Nor does the Russian government trust the warmists' predictions:
“A very important decision was made in connection with the situation in the Gulf of Finland this winter, to build icebreakers. Currently all the fleet used to escort ships in the ice, except for two icebreakers, is obsolete. The new icebreakers to be ordered should be built to optimized projects”, stated A.Davydenko.
Both governments are, of course, basing their decisions to modernise their icebreaker fleets on the realities of the northern winter - not on climate scientists' computer models. One must hope that the Finnish and Russian governments - as well as others - perform the same reality checks in the other areas touched by the global warming scam.
SOURCE
Obama's NASA is no longer about space
NASA stands for "The National Aeronautics and Space Administration". You would associate it with astronauts, perhaps pilots, space research, and the Universe. Some young readers may even be unaware of it, but NASA brought the first men to the Moon and has done lots of other fascinating things. But look at this graph of funding from Nature:
Between 2011 and 2012, the astrophysics budget is expected to drop from $1.1 to $0.65 billion, i.e. by 40 percent. Astrophysics would become almost as small as heliophysics (physics of the Sun) which keeps its $0.6 billion.
Meanwhile, planetary science is proposed to grow by a percent or a few to $1.5 billion, and Earth science should drop by less than 5 percent to $1.7 billion, preserving its dominant position.
Just think how completely insane these ratios are. NASA has always been about space (well, at least from the time when airplanes became ordinary), about flying away from Mother Earth. Yet among these four scientific disciplines, astrophysics has less than 15 percent of the money, and astrophysics plus heliophysics together have something like 28 percent.
No wonder that the James Webb Space Telescope, Hubble's successor, is likely to die.
Meanwhile, the Earth science budget is meant to remain high, for no good reason. There are no projects that would be as important - and not even as well-known - as the James Webb Space Telescope. Still, the Earth science section is scheduled to get 3 times more money than astrophysics. The reasons are clearly ideological, not meritocratic.
Much of this money - $1.7 billion - pays people to spread lies about "global warming" that are convenient for making the government bigger.
These numbers are even scarier if one looks at the evolution over longer periods of time. In 2008, the Earth science budget in NASA was $1.25 billion. So it is meant to increase by 36 percent by 2012. Meanwhile, the astrophysics budget drops from $1.4 billion to $0.65 billion between 2008 and 2012.
The Earth-science/astrophysics budget ratio is scheduled to increase, between 2008 and 2012, by a factor (a double ratio) of almost three! A factor of three within four years is a pretty fast change: by comparison, the (absolute) global mean temperature has increased by only about 0.2% in the last 100 years.
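For readers who want to check the arithmetic, here is a minimal Python sketch (the dollar figures are the ones quoted above from the Nature graphic; the variable names and layout are mine):

# NASA budget figures in billions of USD, as quoted above
earth_2008, astro_2008 = 1.25, 1.40
earth_2012, astro_2012 = 1.70, 0.65

ratio_2008 = earth_2008 / astro_2008    # about 0.89
ratio_2012 = earth_2012 / astro_2012    # about 2.62
print(ratio_2012 / ratio_2008)          # about 2.9 - the "almost three" above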
Would someone dare to claim that this tripling has nothing to do with the global warming hysteria? Would someone dare to claim that the typical people in NASA such as Gavin Schmidt are not affected by the (relative) tripling of the money they're receiving? That they don't realize what they are actually being paid for? The likes of Gavin Schmidt get a tripling of their funds instead of the life in prison that they deserve.
Needless to say, the tripling of NASA's Earth science budget relative to its astrophysics budget over four years has absolutely no conceivable rational explanation. It's all about the misallocation of resources and the wasting of the U.S. taxpayers' money.
The House has its fingers in these nasty modifications of the structure of NASA's funding - and unsurprisingly, that's what Nature focuses upon. But Obama doesn't lift a finger to stop this pernicious replacement of high-tech science and technology at NASA with low-brow ideological babbling, propaganda, lies, pseudoscience, and emotions.
These people actively work on the decomposition of space research, astrophysics research, and cosmology research in the once-famous country named America.
In 1966, when it was all about space, NASA got 4.4 percent of the federal budget. This percentage has dropped to 0.5 percent in recent years - by almost an order of magnitude. The massive spending on space research didn't prevent America from running budget surpluses throughout the 1950s and 1960s.
SOURCE
Climate Witchcraft and Post-Normal Science
By Norman Rogers
French philosophers invented deconstructionism and postmodernism, or the theory that nothing means what it says. Followers of these ideas are adept at finding hidden messages of capitalist oppression in the most unexpected places. A related ideological disturbance is post-normal science. Post-normal scientists favor relaxing scientific rigor in order to better pursue political goals. Those goals often involve a reorganization of society that will elevate the importance of scientists. Archimedes supposedly said, "Give me a long enough lever and I can move the world." The post-normal scientists think that science is a lever that can be used to rule the world.
Certain important climate scientists are very eager to reorganize society. They proclaim, on weak evidence, that the Earth is doomed by global warming unless we follow a green plan to remake the economy and the social order. We have to give up cars for trolleys. Windmills will become ubiquitous. The most famous climate scientist, James Hansen, wants to put his opponents on trial for crimes against humanity.[i] Implicit in all this is the idea that a central committee of Dr. Strangeloves should rule the world. Instead of prince this and duke that, we will have doctor this and doctor that. These radical intellectuals secretly despise the present system of rule by the rabble, otherwise known as democracy.
Some intellectuals think that they don't get attention and status commensurate with their importance. This is especially true in America, where the cleaning lady or plumber is inclined to treat them as equals. One way to be important is to proclaim a theory that something very bad is going to happen. If the theory has some scientific basis and is backed by other prominent scientists, the claims will be credible.
A lot of this doomsday science, disguised as environmental concern, has been going around during the last 50 years. Global warming is just the latest example of ideologically motivated catastrophe theory. James Delingpole's book, Watermelons, describes the phenomenon in amusing detail. Like radical environmentalists, watermelons are green on the outside and red on the inside.
If it weren't for the prophecies of doom, climate science would be an obscure academic niche. Global warming has made everyone in the field rich, at least in academic currency if not dollars. The wealth has spread to other academic niches that have become more important in light of connections to climate. Global warming is a huge bonanza for the do-good environmental organization industry. Organizations like the Sierra Club or the Environmental Defense Fund[ii] need a perpetual stream of impending environmental disasters. When the public becomes bored with an impending disaster that never materializes, a new impending disaster must be found.
Climate science has embraced computer climate models as the tool it uses to compute the magnitude of the warming effect of CO2. The climate models are riddled with problems. Kevin Trenberth, a noted climate scientist and a prominent promoter of global warming alarmism, said this about the models: "none of the climate states in the models correspond even remotely to the current observed climate." The effect of CO2 is measured by a theoretical number called climate sensitivity. There are more than 20 climate modeling groups around the world. These groups each spend millions on programmers and supercomputers, searching for the value of climate sensitivity. They all get different answers, differing by a ratio of more than two to one. This failure of consensus would normally be considered a sign that the approach is not working. But if climate science can't make predictions of doom, it will cease to be important and funding will collapse. The climate science establishment had to relax the normal rules of science for its own survival and for the sake of its post-normal-science political goals.
The global warming establishment devised a solution. They decided to take the average of the various disagreeing models and claim that the average is closer to the truth than any of the models incorporated in it. They call this a multi-model ensemble. The skeptic will ask: if averaging together the results from more modeling groups makes the result better, why not spend a few billion dollars more and establish another 20 or 50 modeling groups to zero in on the truth still more closely? To read the justifications for multi-model ensembles is to enter a reality distortion field.
The climate models make predictions that cannot be tested because you would have to wait 50 or 100 years to see if the predictions are correct. The models are evaluated and calibrated by simulating the observed climate of the 20th century. The entirely unjustified assumption is made that if the models can match the 20th-century climate they must be working well and will be able to predict the future. This is known as backtesting. The problem with backtesting is that models may fit the historical data for the wrong reasons. If a model is complicated, with enough adjustable parameters, it may be capable of fitting almost anything. Many people have devised stock market models that work well when tested against history. If such models could predict the future movement of markets or pick winning stocks it would be far easier to make money in the stock market than it is.
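To see the backtesting trap numerically, here is a toy Python sketch with invented data - not an actual climate model, just a flexible curve with many adjustable parameters fitted to a noisy "history":

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)                   # a rescaled century of "history"
truth = 0.5 * x                                  # an underlying slow trend
history = truth + rng.normal(0.0, 0.1, x.size)   # trend plus noisy "weather"

# Many adjustable parameters: a degree-12 polynomial fits the history nicely
coeffs = np.polyfit(x, history, deg=12)
fit = np.polyval(coeffs, x)
print("in-sample RMS error:", np.sqrt(np.mean((fit - history) ** 2)))

# Run "into the future" and the same model extrapolates wildly off the trend
future = np.linspace(1.0, 1.5, 50)
print("forecast at the end:", np.polyval(coeffs, future)[-1])

The in-sample fit looks impressive; the out-of-sample "forecast" is nonsense. Fitting the past well is no evidence of predictive skill.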
The climate models have dozens of adjustable parameters. Inputs to the models, related to physical drivers of climate, are highly uncertain. For example, one input is aerosols or reflective particles injected into the air from smokestacks and natural sources. These have an effect on climate but the historical aerosol record is difficult to quantify. We don't know very accurately how much and what kind of aerosols there were year by year in the 20th century, and we don't know what the effect on the energy flows was. In order to model the 20th century, you must supply the model with a historical record of the effect of aerosols. Since this is poorly known you might be tempted to fabricate the historical record so as to make the model fit the 20th century better.[iii] This is either a clever strategy or circular reasoning. Climate scientists call this the inverse method of computing the effects of aerosols.[iv] Ocean heat storage provides another example of a necessary but poorly known aspect of climate models. Adjusting the internal model parameters related to this effect provides another lever for making a model fit the historical climate of the 20th century. The unfortunate result is that different climate models treat ocean heat storage quite differently, but the Earth has only one way of treating ocean heat storage.[v]
The Intergovernmental Panel on Climate Change, otherwise known as the IPCC, or perhaps as the Vatican of climate change, has an established procedure for making predictions using multi-model ensembles. Each of the modeling groups is instructed to fit or calibrate its model to the 20th century and then to run the model into the 21st century to get a prediction of the future. Each group is directed to use inputs as it deems appropriate for the 20th-century fitting.[vi] The modeling groups can independently adopt their own set of assumptions about the reality of the 20th-century climate. It's like the parallel Earths, in parallel universes, often seen in science fiction. There is only one Earth. There are no parallel universes.
The net result from these tricks is that fitting the models to the 20th century becomes an exercise in curve-fitting implemented by custom fudging with a different fudge recipe at each modeling laboratory. The result of this exercise in inventing historical data is illustrated by the figure below from the 2007 IPCC report.
The ensemble mean fits the observed temperature history[vii] very well, even taking dips when volcanos erupt and inject cooling aerosols. The only place where the fit fails is the early-20th-century warming from 1910 to 1940. The problem during that period is that there is nothing plausible to explain this early warming that is also consistent with the doctrine that CO2 is the only cause of the late-century warming. Most of the modelers assume that the early warming is due to a change in the sun's output, but they don't dare go too far with that because in general they have to minimize the effect of the sun and maximize the effect of CO2 to avoid giving comfort to the skeptic school that thinks climate is controlled mostly by the sun.
Multiple runs of the same model are included in the graph. Slightly different starting conditions are used for runs using the same model, so the results of different runs by the same model are not identical, and in fact exhibit considerable chaotic variation. The chaotic or random variations average out to a characteristic climate associated with that model, if enough runs from the same model are averaged. The interesting fact about the graph is that the 13 different models, averaged, give an excellent fit to the temperature history even though we know that the models disagree sharply on the effect of the rapidly rising CO2 in the second half of the 20th century.[viii]
The apparent good performance of the models in the graph is a consequence of stacking the deck by adjusting the assumptions about the Earth independently for each model. That adding more models to the mix makes the graph fit the observed climate better, as the IPCC claims, is an elementary result of curve fitting theory. If you use several different curve-fitting methods (e.g., different models) and the errors in the fits are random or uncorrelated, then the errors are reduced proportional to the square root of the number of fits averaged together. This has nothing to do with climate. It is a mathematical and statistical result. Of course all this is well-known to climate scientists.
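That square-root effect is easy to verify numerically. A minimal sketch in Python - pure statistics with made-up numbers, nothing climate-specific:

import numpy as np

rng = np.random.default_rng(1)
trials = 100_000                              # repeat the experiment many times
for n_models in (1, 4, 16, 64):
    # each "model" errs independently around the truth (here, zero)
    errors = rng.normal(0.0, 1.0, size=(trials, n_models))
    spread = errors.mean(axis=1).std()
    print(n_models, round(spread, 3))         # ~1.0, ~0.5, ~0.25, ~0.125

Quadrupling the number of "models" halves the spread of the ensemble mean, regardless of what the models are about.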
Why would the IPCC use such an unscientific scheme for predicting the future climate? A better scheme comes easily to mind. Why not have a contest to pick the best model? The conditions of the test against the 20th-century observed climate should be set strictly so that inputs are the same for all models and non-physical or physically inconsistent assumptions internal to the models would be prohibited. Although this scheme would hardly be guaranteed to result in reliable predictions of the future climate, it would surely be sounder than the corrupt scheme currently used.
But, wait a minute. If one laboratory out of 20 was picked as having the best model, what would the reaction of the other 19 laboratories be? After all, one can assume that the other 19 labs have 19 times the political influence that the winning lab would have. Wouldn't the other labs be deeply worried that their funding would be cut or diverted to the winning lab? Suppose the winning lab was an American lab. Might the European labs suspect cheating or bias? Suppose a French lab won. What would the Americans think? Would the Congress support research at a French lab, at the expense of the American labs? Obviously, a climate model shootout would break the unity of the climate science establishment and is thus unthinkable.
Climate models are useful heuristic tools that help in understanding climate. Most of the work done in developing models is honest. But the models are not remotely good enough to make predictions about the future climate under the influence of CO2. The IPCC and its allies have created a bizarre scheme to force doomsday predictions out of the disagreeing models in order to pursue bureaucratic and political goals. The resultant predictions are looking very foolish in the face of 14 years of no general climate warming, and of no ocean warming since a reliable monitoring system was deployed in 2003.
President Eisenhower anticipated post-normal science in his 1961 farewell address when he warned that public policy could become the captive of the scientific-technological elite. We are accustomed to various special interest groups cooking the books to promote their interests in Washington. We don't expect the science establishment to be cooking the science, but that is what is happening. The arrogance and irresponsibility exhibited by the science establishment is quite amazing. It will take a while for the public to adjust to the idea that organized science is as corrupt as the trial lawyers or the teachers' union.
SOURCE
The usual Greenie unrealism in "model" of carbon tax effects
By economist Henry Ergas, writing from Australia
EARLIER this week, Wayne Swan said the results of updated Treasury modelling of the government's proposed carbon tax would not differ much from those already released.
If our Defence planners told us it didn't matter to their modelling whether our next war was with China or with Vanuatu, we would worry about the quality of their planning.
So it is not reassuring that increasing the starting carbon price by more than 10 per cent, closing the Hazelwood generator early, slashing the number of permits that can be purchased overseas, precluding the borrowing of permits from the future and using realistic assumptions about what other countries are doing would have little impact on the policy's estimated costs. That said, many adjustments that should be made will likely not be made. The modelling will therefore remain an exercise in the economics of nirvana: easily assumed, less easily attained. But even were significant adjustments made, there is a technical reason why the model's estimate of costs might change little.
That reason is a quirk in Treasury's modelling. Called the "marginal abatement cost curve" or MAC, it provides abatement like manna from heaven, that is, at no cost. But it is even better: for the more the price of bread rises, the more manna showers from the skies. Or in this case, the higher the permit price, the more abatement we get for free.
The mechanics of this device can be explained as follows. As the carbon price rises producers replace more emissions-intensive processes with less emissions-intensive alternatives. This typically involves some investment costs. For example, a firm might spend an additional $100 on scrubbers to reduce emissions. As the scrubbers must be paid for, the firm's costs and prices would rise, causing, among other things, changes in demand.
But here comes the interesting bit. As the carbon price rises, the MAC kicks in, and provides further reductions in emissions, but without requiring new investment. And the higher the permit price, the more of those reductions it generates. It is as if the scrubber, without needing to be replaced, suddenly eliminated more emissions simply because the carbon price had increased.
And the savings generated by the MAC are not trivial. Indeed, thanks to a parameter in the model, in principle up to 90 per cent of emissions affected by the MAC could be eliminated at no cost. In practice, the reductions are unlikely to approach that ceiling. In the modelling for Australia, for example, the MAC does not apply to some sectors that are large emitters of carbon. But it does apply to other important activities, including mining.
And because the quantity of free emissions reductions increases as the carbon price rises, the model reduces the estimated cost of toughening the policy, as the government has done by (for instance) limiting purchases of permits from overseas.
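To make the mechanism concrete, here is a toy Python sketch of what a MAC-style term does. Treasury has not released its model, so the functional form, the responsiveness parameter and the price points below are my illustrative assumptions, not its actual code; only the 90 per cent ceiling comes from the description above.

import math

CEILING = 0.90   # per the article: up to 90% of covered emissions could go free
K = 0.02         # assumed responsiveness of free abatement to the carbon price

def free_abatement_share(price):
    # Toy MAC: the share of covered emissions eliminated at no cost
    # rises with the permit price and approaches the ceiling.
    return CEILING * (1.0 - math.exp(-K * price))

for price in (23, 30, 50, 100):               # dollars per tonne (assumed values)
    print(price, round(free_abatement_share(price), 2))
# Output: 0.33, 0.41, 0.57, 0.78 - the higher the price, the more free "manna"

Note that nothing in this toy function charges anyone for the abatement it delivers - which is precisely the complaint.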
How can such a mechanism be justified? The best gloss that can be put on it is that higher carbon prices would induce emissions-savings innovation beyond that assumed in the base case. And that could indeed happen. But if that is what the MAC is assumed to be doing, there are at least three problems with the way it does it.
First, induced innovation is highly uncertain and involves long delays: there can be many years between a price change and the successful technical advances it has encouraged. And even once innovations are available, their spread is typically slow. But the modelling assumes a virtually immediate and predictable response.
Second, once emissions are substantially reduced, finding innovations that can reduce them further becomes ever more difficult. But in Treasury's MAC curve, the opposite occurs.
Third and last, the best things in life may be free, but new technologies are not. Innovations are costly and must be paid for. Indeed, it is the prospect of reaping those rewards that ensures innovations occur. That Treasury, of all places, would instead assume a free lunch is truly remarkable.
How big is the resulting error? Without access to the model, no one can tell. It is therefore not surprising that the government refuses to disclose it. But this refusal hardly flatters Treasury's hard work and the millions of taxpayer dollars spent on the model. Has Swan so little confidence in his department that he cannot face the risk of criticism?
Nor is that refusal consistent with a loudly proclaimed commitment to science. For science grows by disclosure and refutation, not secrecy and manipulation.
And it is even less consistent with the pledge of openness on which Labor was elected. But few governments have shown as flexible an attitude to the relationship between principles and practice as that of Julia Gillard. It professes a belief in informed argument but works on the basis that what others don't know can't hurt it. Little wonder it is reduced to selling its policies like bars of soap. And its credibility lies in tatters. A modest step it could take to restore confidence would be to release the Treasury model. Until that is done, Swan's assurances will be little more than wasteful emissions of carbon dioxide.
SOURCE
***************************************
For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For readers in China or for times when blogger.com is playing up, there are mirrors of this site here and here
*****************************************