Sunday, June 25, 2006


By Steve McIntyre

The early rumors about the NAS Panel report were that it was "two-handed" - on the one hand, ..., on the other hand, ... - with something for everyone. I'd characterize it more as schizophrenic: it has two completely distinct personalities. On the one hand, they pretty much concede that every criticism of MBH is correct. They disown MBH claims to statistical skill for individual decades and especially individual years.

However, they nevertheless conclude that it is "plausible" - whatever that means - that the "Northern Hemisphere was warmer during the last few decades of the 20th century than during any comparable period over the preceding millennium". Here the devil is in the details, as the other studies relied on for this conclusion themselves suffer from the methodological and data problems conceded by the panel. The panel's recommendations on methodology are very important; when applied to MBH and the other studies (as they will be in short order), it is my view that they will have a major impact and that little will be left standing from the cited multiproxy studies.

Update: Eduardo Zorita's take, posted below, was:

In my opinion the Panel adopted the most critical position to MBH nowadays possible. I agree with you that it is in many parts ambivalent and some parts are inconsistent with others. It would have been unrealistic to expect a report with a summary stating that MBH98 and MBH99 were wrong (and therefore the IPCC TAR had serious problems) when the Fourth Report is in the making. I was indeed surprised by the extensive and deep criticism of the MBH methodology in Chapters 9 and 11.

I thought that the tone of the question period showed that some reporters were pretty unsettled - there were questions about the "over-selling" of MBH, with the panel taking pains to suggest that IPCC would be responsible rather than MBH (conveniently omitting that Mann was author of the section promoting MBH and, in his capacity as IPCC author, ratcheted up the statistical claims); there was discussion of what "plausible" meant, with a reporter wondering if this was "damning with faint praise".


In the preface, North summarizes the criticisms:

Critics of the original papers have argued that the statistical methods were flawed, that the choice of data was biased, and that the data and procedures used were not shared so others could verify the work. (ix)

He left out the criticism that concerned the Barton Committee and launched the entire matter - that adverse results were withheld or even misrepresented. In its text, the panel concedes every one of our criticisms of the statistical methods, providing some useful new guidelines. However, they do not apply these guidelines either to MBH or to other studies.

They do not clearly discuss biased data selection, but concede that strip-bark samples, such as bristlecones, which we had strongly criticized, "should be avoided in temperature reconstructions". However, they then proceed to rely on studies that use strip-bark bristlecones (and foxtails) and even the criticized MBH PC1 (which is even illustrated, in an alter ego, in Figure 11-2).

They do not grasp the nettle of reporting on previous data and method availability, but do endorse the principle that sharing data and methods is a good thing in paleoclimate. Schizophrenically, their graphics and conclusions rely heavily on studies where data and/or methods are not available.

They stay well away from grasping the nettle of providing an opinion on whether adverse MBH results were withheld or misrepresented. However, they report factual findings that MBH failed cross-validation tests and was not robust to presence/absence of all dendroclimatic indicators, contrary to prior claims of Mann et al.

Flawed Statistical Methods

On p 107, the panel reports our two principal criticisms of MBH statistical methods, finding

"Some of these criticisms are more relevant than others, but taken together, they are an important aspect of a more general finding of this committee, which is that uncertainties of the published reconstructions have been underestimated. Methods for evaluation of uncertainties are discussed in Chapter 9."

Chapter 9 then sets out some important guidelines, dealing with several critical issues that we raised in our presentation: that it is inadequate to just consider one statistic in assessing a statistical model; that confidence interval calculations should use verification period residuals rather than calibration period residuals; that autocorrelation should be considered in calculating confidence intervals.
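A minimal sketch of how these Chapter 9 guidelines translate into practice may help. The function names and the lag-1 adjustment formula below are my own illustration, not code from the panel or from MBH; they simply show confidence intervals derived from verification-period (not calibration-period) residuals, with an autocorrelation correction to the effective sample size:

```python
import numpy as np

def reconstruction_ci(obs, pred, z=1.96):
    """Approximate 95% half-width for reconstructed values, computed
    from verification-period residuals rather than calibration
    residuals (which understate out-of-sample error)."""
    resid = np.asarray(obs, float) - np.asarray(pred, float)
    rmse = np.sqrt(np.mean(resid ** 2))
    return z * rmse

def effective_n(resid):
    """Effective number of independent residuals under lag-1
    autocorrelation; autocorrelated residuals carry less information,
    so intervals on multi-year averages must be widened accordingly."""
    r = np.asarray(resid, float)
    r = r - r.mean()
    rho = np.dot(r[:-1], r[1:]) / np.dot(r, r)  # lag-1 autocorrelation
    n = len(r)
    return n * (1 - rho) / (1 + rho) if rho > 0 else float(n)
```

Using calibration residuals in `reconstruction_ci`, or ignoring the shrinkage in `effective_n`, both narrow the reported intervals - which is the direction of the underestimation the panel describes.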

The panel's schizophrenia is very evident here because, having set out these methods, they do not apply them to the models in front of them. D'Arrigo et al 2006 report that their model does not verify after 1985, during the period of warming of most direct interest. The panel was aware of this (the matter came up in presentations) but did not directly report or discuss it.

The panel recommends the use of a Durbin-Watson statistic for calibration, but does not report the failure of the various models under this statistic, even though they were aware of this failure. (We presented this information to them in our presentation. ...
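For concreteness, the Durbin-Watson statistic the panel recommends is straightforward to compute on calibration residuals. This is a sketch (the function is my own illustration, not the panel's code):

```python
import numpy as np

def durbin_watson(resid):
    """Durbin-Watson statistic on calibration residuals.
    Values near 2 indicate little lag-1 autocorrelation; values well
    below 2 indicate positive autocorrelation, a sign that the
    calibration model is misspecified."""
    e = np.asarray(resid, float)
    return np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
```

A reconstruction whose calibration residuals yield a statistic well below 2 fails this check; that is the kind of failure referred to above.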

More here


They will need lots more trailer parks for the poor in future

New Mexico is spearheading a national effort to redefine building standards so that they reduce emissions linked to global warming. The "2030 Challenge" is a national initiative backed by the U.S. Conference of Mayors. The challenge seeks to immediately reduce greenhouse gas emissions from new buildings and to make all buildings completely independent of fossil-fuel energy by the year 2030. It has so far been embraced by the mayors of Santa Fe, Albuquerque, Chicago, Miami and Seattle, all of whom have ordered that new city-owned buildings adhere to the standards.

New Mexico Gov. Bill Richardson issued an executive order earlier this year requiring that all new state buildings and major renovations meet the challenge's call for a 50 percent reduction in fossil fuel energy consumption from what traditional buildings use. The New Mexico Environment Department tracks state emissions and reports them as a member of the Chicago Climate Exchange, a voluntary, legally binding program whose members agree to reduce greenhouse gas emissions by one percent each year. The 2030 Challenge calls for new building standards to go into place every five years, with each new round of standards further reducing energy use in new buildings by another 10 percent. By 2030, the challenge calls for new buildings to be completely free from a dependence on fossil fuels and to not release greenhouse gases....

Bruce Milne, a biology professor and director of the Sustainability Studies Program at the University of New Mexico, says of the 2030 plan, "in my mind, it's the only strategy that is designed to work," explaining that the plan sets a specific goal for states to meet in reducing greenhouse gas emissions. Milne was recently appointed to serve on Gov. Richardson's climate change advisory group. He says the group has come up with 70 ideas on how to reduce greenhouse gas emissions and possible incentives to make it work. One of those ideas, he says, includes a tax break for businesses that adopt clean energy policies and for farmers who convert manure into renewable energy. He says any tax incentive would have to be passed by the state Legislature and that the group hopes to propose such incentives to the 2007 session.
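The five-year schedule described in the excerpt can be laid out explicitly. A small sketch, assuming (my reading, not stated in the article) a 2005 starting point for the initial 50 percent cut and steps of 10 percentage points per five-year standards update:

```python
def challenge_schedule(start_year=2005, initial_cut=50, step=10, end_year=2030):
    """Fossil-fuel reduction target (percent below a conventional
    baseline building) at each five-year standards update, capped at
    100% (fully fossil-fuel free) by the end year."""
    targets = {}
    cut = initial_cut
    for year in range(start_year, end_year + 1, 5):
        targets[year] = min(cut, 100)
        cut += step
    return targets

# Under these assumptions:
# {2005: 50, 2010: 60, 2015: 70, 2020: 80, 2025: 90, 2030: 100}
```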

More here


There's lots of talk about nasty-sounding "chemicals" ("liquid fluoride"!) below so a Greenie would froth just to read it

The generation and use of energy is central to the maintenance of organization. Life itself is a state of organization maintained by the continual use of sources of energy. Human civilization has reached the state it has by the widespread use of energy, and for the large fraction of the world that aspires to a higher standard of living, more energy will be required for them to achieve it.

Therefore, I embrace the idea that we need energy, and probably much more of it than we currently have. We should never waste energy and should always seek to use it as efficiently as possible and practical, but energy itself will always be needed.

This weblog is about the use of thorium as an energy source of sufficient magnitude for thousands of years of future energy needs. Thorium, if used efficiently, can be converted to energy far more easily and safely than any other energy source of comparable magnitude, including nuclear fusion and uranium fission.

Briefly, my basic principles are:

1. Nuclear reactions (changes in the binding energy of nuclei) release about a million times more energy than chemical reactions (changes in the binding energy of electrons); it is therefore logical to pursue nuclear reactions as dense sources of energy.

2. Changing the binding energy of the nucleus with uncharged particles (neutrons inducing fission) is much easier than changing the nuclear state with charged particles (fusion), because fission does not contend with electrostatic repulsion as fusion does.

3. Naturally occurring fissile material (uranium-235) will not sustain us for millennia due to its scarcity. To sustain energy production for millennia, we must fission the abundant fertile isotopes (uranium-238, thorium-232). Fertile isotopes such as U-238 and Th-232 basically require 2 neutrons to fission (one to convert, one to fission), and so require fission reactions that generate more than 2 neutrons per absorption in a fissile nucleus.

4. For maximum safety, nuclear reactions should proceed in a thermal (slowed-down) neutron spectrum, because only thermal reactors can be designed to be in their most critical configuration, where any alteration to the reactor configuration (whether through accident or intention) leads to fewer nuclear reactions, not more. Thermal reactors also afford more options for achieving negative temperature coefficients of reactivity (the basic measure of the safety of a nuclear reactor). Reactors that require neutrons that have not been slowed significantly from their initial energy (fast-spectrum reactors) can always be altered in some fashion, either through accident or intention, into a more critical configuration that could be dangerously uncontrollable because of the increased reactivity of the fuel. Basically, any fast-spectrum reactor that is barely critical will be extremely supercritical if its neutrons are moderated in some way.

5. "Burning" uranium-238 produces a fissile isotope (plutonium-239) that "burns" inefficiently in a thermal (slowed-down) neutron spectrum and does not produce enough neutrons to sustain the consumption of uranium-238. "Burning" thorium-232 produces a fissile isotope (uranium-233) that burns efficiently in a thermal neutron spectrum and produces enough neutrons to sustain the consumption of thorium. Therefore, thorium is the preferable fuel, if used in a neutronically efficient reactor.

6. Achieving high neutronic efficiency in solid-fueled nuclear reactors is difficult because the fuel sustains radiation damage, the fuel retains gaseous xenon (a strong neutron poison), and solid fuel is difficult to reprocess because it must be converted to a liquid stream before reprocessing.

7. Fluid-fuel reactors can continuously strip xenon and adjust the concentration of fuel and fission products while operating. More importantly, they have an inherently strong negative temperature coefficient of reactivity, which leads to inherent safety and vastly simplified control. Furthermore, decay heat from fission products can be passively removed (in case of an accident) by draining the core fluid into a passively cooled configuration.

8. Liquid-fluoride reactors have all the advantages of a fluid-fueled reactor; in addition, they are chemically stable across a large temperature range and are impervious to radiation damage, due to the ionic nature of their chemical bonds. They can dissolve sufficient amounts of nuclear fuel (thorium, uranium) in the form of tetrafluorides in a neutronically inert carrier salt (lithium-7 fluoride/beryllium fluoride). This leads to the capability for high-temperature, low-pressure operation, no fuel damage, and no danger of fuel precipitation and concentration.

9. The liquid-fluoride reactor is very neutronically efficient due to its lack of core internals and neutron absorbers; it does not need "burnable poisons" to control reactivity, because reactivity can be added continuously. The reactor can achieve the conversion ratio (1.0) needed to "burn" thorium, and has superior operational, safety, and development characteristics.

10. Liquid-fluoride reactors can retain actinides while discharging only fission products, which will decay to background levels of radiation in ~300 years and do not require long-duration (>10,000 year) geologic burial.

11. A liquid-fluoride reactor operating only on thorium and using a "start charge" of pure U-233 will produce almost no transuranic isotopes. This is because neutron capture in U-233 (which occurs about 10% of the time) produces U-234, which further absorbs another neutron to produce U-235, which is fissile. U-235 will fission about 85% of the time in a thermal-neutron spectrum; when it does not, it produces U-236, which further absorbs another neutron to produce Np-237, which will be removed by the fluorination system. But the production rate of Np-237 will be exceedingly low because of all the fission "off-ramps" along the way.

12. We must build thousands of thorium reactors to displace coal, oil, natural gas, and uranium as energy sources. This would be impractical if liquid-fluoride reactors were as difficult to build as pressurized water reactors, but they will be much simpler and smaller for several reasons. They will operate at a higher power density (leading to a smaller core); they will not need refueling shutdowns (eliminating the complicated refueling equipment); they will operate at ambient pressure and have no pressurized water in the core (shrinking the containment vessel dramatically); they will not require the complicated emergency core cooling systems and their backups that solid-core reactors require (because of their passive approach to decay heat removal); and their power conversion system will be much smaller and more power-dense (since in a closed-cycle gas turbine you can vary both initial cycle pressure and overall pressure ratio). In short, these plants will be much smaller, much simpler, much, much safer, and more secure.
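Some of the neutron-economy arithmetic above can be checked directly. In the sketch below, the 10% and 85% branching ratios are the ones quoted above for the U-233 chain, while the eta values are commonly quoted thermal-spectrum figures that I am adding for illustration (they are not in the text above):

```python
def breeding_margin(eta, neutrons_needed=2.0):
    """Spare neutrons per neutron absorbed in the fissile nucleus,
    after reserving one neutron to convert a fertile atom and one to
    fission the resulting fissile atom. The margin must cover leakage
    and parasitic capture, so it needs to be comfortably positive."""
    return eta - neutrons_needed

def np237_per_u233_absorption(p_capture_u233=0.10, p_nonfission_u235=0.15):
    """Probability that one neutron absorbed in U-233 eventually yields
    an atom of Np-237: a capture in U-233 (~10%) followed, later in the
    chain, by a non-fission absorption in U-235 (~15%)."""
    return p_capture_u233 * p_nonfission_u235

# Commonly quoted thermal-spectrum eta values (assumed, for illustration):
eta_u233, eta_pu239 = 2.29, 2.11
# U-233 leaves roughly 0.29 spare neutrons per absorption; Pu-239 only
# about 0.11, which real-world losses consume. That is the arithmetic
# behind preferring thorium/U-233 in a thermal spectrum.
# The Np-237 yield works out to about 0.015 per U-233 absorption, i.e.
# the "exceedingly low" transuranic production rate claimed above.
```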

More here


Many people would like to be kind to others, so Leftists exploit that with their nonsense about equality. Most people want a clean, green environment, so Greenies exploit that by inventing all sorts of far-fetched threats to the environment. But for both, the real motive is to promote themselves as wiser and better than everyone else, regardless of the truth.

Global warming has taken the place of Communism as an absurdity that "liberals" will defend to the death regardless of the evidence showing its folly. Evidence has never mattered to real Leftists.

Comments? Email me here. My Home Page is here or here. For times when the main site is playing up, there are mirrors of this site here and here.


1 comment:

Wojtek said...

You are wrong about fast-reactor criticality - "a fast reactor that is barely critical will be very supercritical if some moderation is introduced".

The opposite is true. Nuclear fission proceeds well (high chance of fission) where the spectrum is either well moderated (a good share of the neutrons in the thermal range) or mostly unmoderated (most neutrons at high energy).
A slightly moderated neutron (in the epithermal range) is most often captured without causing fission.

Adding some moderation to a fast reactor will make it subcritical. Unless you quickly replace the hundreds or thousands of tons of sodium (the most commonly used heat-transport medium) with pure water, you won't make it supercritical ;)

The opposite is the problem: sodium SLIGHTLY moderates neutrons. If some sodium is expelled because the reactor overheats and the sodium boils (about 900 degrees Celsius), the neutrons become faster, causing supercriticality.

On the other hand, it also allows more neutrons to fly away from the reaction zone.

By carefully arranging the fuel lattice, it is possible to avoid the danger by making the second effect stronger than the first.

And fast reactors operate at 500-550 degrees, leaving a lot of margin before boiling and a lot of time for the control rods to react.

Fast reactors ARE safe if properly designed.

In the US, operators of the EBR-II fast reactor performed tests such as stopping the pumps or stopping heat removal while not shutting down the reactor.

It safely stopped itself without any help!