The real story is carried in the words of the sceptical scientist Dr Roy Spencer on the excellent Watts Up With That? blog. The media hatchet job is most prevalent in the Guardian and on its broadcast arm, the BBC. Dr Spencer goes on to explain the findings in layman’s terms on his own website. In response to the resignation of Wolfgang Wagner, Dr Roger Pielke Snr puts the politicisation of science into context. And Indur Goklany on WUWT covers both the ludicrous position, advanced by Dr Pete Gleick, that observations must fit in with computer models, and Dr Phil Jones’ comment about keeping sceptical papers out of the public domain.
What we are seeing is anti-science. We are experiencing pseudo-science that aims not to question or challenge, but to reinforce the validity of a body of opinion that has yet to make the jump from theory to fact. It is being done to fit a political agenda. It is a corruption of science and the latest example of why people should be sceptical of the claims made about climate change and its causes and effects.
In closing, one comment left on Watts Up With That? sums up the situation superbly and deserves to be repeated widely to help others understand what really is going on:
This is all part of the same pattern that has characterized the warmists’ approach to climate “science” since the last century. They come up with models and use these to produce predictions which are then baptized as sovereign truth. In real science, they would have been required to demonstrate the predictive validity of their models before their predictions would be granted any confidence – and when observations contradicted predictions, they would have been expected to revise their models instead of beating the data until it fit the model outputs. Instead, thanks to Algore, Hansen, left-wing politicians looking for regulatory and legislative mechanisms to control the polity and extract more tax dollars, and a compliant left-leaning media hungry for “imminent disaster” headlines, the burden of proof has been shifted to those who challenge the modellers instead of being left where it belongs: with the modellers who still have not demonstrated the validity of their models. I simply cannot believe we are still discussing a theory that, 20 years after it went mainstream, has yet to produce a single scrap of confirmatory empirical evidence.
The extent to which the AGW true believers have warped the scientific method to serve their pecuniary and political ends is simply breathtaking. Climate science represents the greatest perversion of the scientific method since the Enlightenment. It is phlogiston, phrenology and Lysenkoism all rolled up into one big, fat, corrupt boil desperately in need of lancing.
More HERE
Keep a note of this "17 years" prophecy
The paper below says that temperature records of at least 17 years are needed to identify a human effect on tropospheric temperature. The hottest year since WWII was 1998, and 1998 plus 17 years takes us to 2015, so by 2015 we will know there is no global warming. The date-setting folly of Warmists never ceases to amaze me. They always get it wrong. They are not even clever false prophets.
Santer et al. have a new paper out on trends in tropospheric temperature.
Abstract
We compare global-scale changes in satellite estimates of the temperature of the lower troposphere (TLT) with model simulations of forced and unforced TLT changes. While previous work has focused on a single period of record, we select analysis timescales ranging from 10 to 32 years, and then compare all possible observed TLT trends on each timescale with corresponding multi-model distributions of forced and unforced trends. We use observed estimates of the signal component of TLT changes and model estimates of climate noise to calculate timescale-dependent signal-to-noise ratios (S/N). These ratios are small (less than 1) on the 10-year timescale, increasing to more than 3.9 for 32-year trends. This large change in S/N is primarily due to a decrease in the amplitude of internally generated variability with increasing trend length. Because of the pronounced effect of interannual noise on decadal trends, a multi-model ensemble of anthropogenically-forced simulations displays many 10-year periods with little warming. A single decade of observational TLT data is therefore inadequate for identifying a slowly evolving anthropogenic warming signal. Our results show that temperature records of at least 17 years in length are required for identifying human effects on global-mean tropospheric temperature.
SOURCE
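For readers curious what a timescale-dependent signal-to-noise ratio for trends looks like in practice, here is a minimal sketch in Python using synthetic data. The window lengths and the comparison of least-squares trends against the spread of unforced trends follow the abstract’s description, but the noise model, the assumed warming rate and all numerical choices are illustrative assumptions of mine, not Santer et al.’s data or method.

    # Minimal sketch of timescale-dependent trend signal-to-noise (S/N).
    # All numbers are illustrative assumptions, not Santer et al.'s data.
    import numpy as np

    rng = np.random.default_rng(0)

    def trend_per_decade(series):
        """Least-squares linear trend of a monthly series, per decade."""
        t = np.arange(series.size) / 12.0            # time in years
        slope_per_year = np.polyfit(t, series, 1)[0]
        return slope_per_year * 10.0

    n = 50 * 12                                      # 50 years of monthly data
    signal = 0.015 * np.arange(n) / 12.0             # assumed 0.15 C/decade warming
    noise = rng.normal(0.0, 0.25, size=n)            # assumed white "weather" noise

    for window_years in (10, 17, 32):
        w = window_years * 12
        starts = range(0, n - w)
        # "Signal": trends of the forced series over all windows of this length.
        forced = [trend_per_decade(signal[i:i + w] + noise[i:i + w]) for i in starts]
        # "Noise": spread of trends produced by unforced variability alone.
        unforced = [trend_per_decade(noise[i:i + w]) for i in starts]
        sn = np.mean(forced) / np.std(unforced)
        print(f"{window_years:2d}-year trends: S/N ~ {sn:.1f}")

Longer windows shrink the spread of trends that pure noise can generate, so the S/N rises with trend length, which is how a threshold like 17 years arises.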
Global Mean Sea Level Determination: An Ocean Of Uncertainty
What follows is a good overview explaining why the measurement of Global Mean Sea Level (GMSL) is fraught with much uncertainty and so subject to substantial error. Michael Limburg of the European Institute for Climate and Energy tells us why.
An exact determination of GMSL is a very difficult if not a fundamentally impossible task. Even more difficult is determining sea level rise (or drop) over time. Different authors using the same datasets arrive at completely different results. It’s little wonder sea level expert W. Siefert in Hamburg recently said in an interview: “When examined closely, sea level is being exposed more and more as a pure mathematical prop, inadequate and, foremost, not very meaningful. Especially when it is to be used as a sole standard of measure, or used to derive horror scenarios… (1)”
This is also confirmed by researchers like Douglas [Douglas, 1994], who illustrated in great detail why, for example, Barnett (1984), Emery and Aubry (1991) and Pirazzoli (1993) concluded: “…the determination of a single sea-level curve of global applicability is an illusory task.”
Douglas hoped that improved research instruments would bring better and more reliable results in the years ahead. And with the possibilities presented by satellite altimetry, such results may now be at hand. But so far the results have been mostly controversial, and no truly reliable findings have been gained.
And later, in spite of the many new instruments and techniques that can now be used to find the much sought signal of global warming in the GMSL, the same authors say: “…these tools seem to have raised more questions than they have answered.”
Obtainable accuracy
It must also be said that a claimed measurement accuracy of a few tenths of a millimetre per year for the (only measurable) Relative Sea Level (RSL), and for the GMSL, is not possible with the available historical data. Only the newest satellite altimetry tools may allow this in principle. Using the unit of measure “mm” is therefore grossly misleading. The IPCC, many experts, the media informing the public, and laymen are hence falsely claiming an accuracy that simply cannot be reached. These figures are computed values only.
In reality sea level changes can be measured accurately only to centimetres, and often only to within several centimetres – and very often not even that. Munk [Munk, 2003] (3) confirms this, writing: “…the jury is still out on the interpretation of the tide gauge records.”
For sea level, as for trends from global historical temperature data, the old saying of Carl Friedrich Gauss (1777-1855) – known as a great inventor of many basic statistical principles and algorithms – remains true: “Nothing shows a lack of mathematical understanding more than an exaggeratedly accurate calculation.”
Therefore the only thing certain is that the statistical construct of GMSL over the last 120 years indicates a rise of between 10 and 20 cm/century. The error is on the same scale as the calculated value itself, and may well be even larger. While Mörner expects a mean rise of 10 cm/century, the IPCC (AR4) sees approx. 19 cm/century. Today IPCC experts see an increase in the rate of rise over the last 20 years, while others explicitly exclude such a rate increase.
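Since rates in this piece are quoted variously in mm/year and cm/century, a trivial Python conversion sketch may help keep the figures comparable:

    # mm/year converts to cm/century: multiply by 100 years, divide by 10 mm/cm.
    for rate_mm_per_yr in (1.0, 1.9, 2.0):
        cm_per_century = rate_mm_per_yr * 100 / 10
        print(f"{rate_mm_per_yr} mm/yr = {cm_per_century:.0f} cm/century")

So Mörner’s 10 cm/century corresponds to 1 mm/yr, and the IPCC’s approx. 19 cm/century to about 1.9 mm/yr, the same range as the “1 mm/yr versus 2 mm/yr” question quoted later in this piece.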
Why all the uncertainty?
It is due to built-in systematic errors. Many of these errors are widely unknown in their historical size, appearance and direction. They are included in the data and they involve an array of factors, such as: determination of sea level measurement reference points, datasets of various lengths, contaminated datasets, rapid shifting of tectonic plates and their vertical components, barometric pressure, density of water, etc.
A close analysis of all these error factors indicates that the errors are of a systematic nature and, because they are mostly subtle, they cannot be determined at the scale of the local sea level rise being sought. Therefore they have to be indicated using error bars, in accordance with good scientific practice. But this has very rarely been done. Credible figures regarding the attainable accuracy are, as a rule, the exception, e.g. Mörner +10 ± 10 cm by the year 2100 (or +5 ± 15 cm) [Mörner, 2004] (2). Anything else has to be taken with much caution.
Assigning the causes of sea level rise
Detailed attempts to determine GMSL rise are made by authors of the IPCC and others by breaking it down into various components. But one has to keep in mind that this approach is prone to failure. Cazenave et al [Cazenave, 2004] is quoted on this: “…for the past 50 years, sea-level trends caused by change in ocean heat storage also show high regional variability,”
and
“…has led to questions about whether the rate of 20th-century sea-level rise, based on poorly distributed historical tide gauges, is really representative of the true global mean.”
The estimates for the eustatic and steric components cannot be brought into agreement with the observed data. The movement of single tectonic plates at speeds of more than 15 cm/year, and the vertical components of that movement, which can decisively impact the volume of the ocean above, is certainly a cause of the observed changes in RSL and thus GMSL. But recording these changes and quantitatively attributing them to a source has been impossible up to now.
Also, a potential temperature-dependency is not detectable over the last 1000 years, as clearly shown by Storch et al. [Storch, 2008] (5). In their hindcast model they did not find any correlation between sea level trend and temperature.
For all measurement locations, sea level changes can be more easily attributed to natural changes (glacial isostatic adjustment GIA / post-glacial rebound PGR, or other tectonic shifts) or, analogously to the UHI for temperature, to man-made socio-economic factors, e.g. urban growth and the resulting land subsidence. A greenhouse effect is not necessarily needed to explain them. This means future projections of the GMSL are purely speculative, because of the great lack of understanding of the processes involved and the lack of data.
Nothing makes this more explicit than the wide range of estimates among IPCC lead authors (e.g. Rahmstorf) and other specialists such as Jevrejeva, Mörner or Singer. Rahmstorf [Rahmstorf, 2007a] (4) believes a maximum of 140 cm is possible by the end of the century, James Hansen estimates up to 600 cm under certain conditions, the IPCC shows estimates between 14 and 59 cm (final), Singer only 18-20 cm, and Mörner [Mörner, 2004] a mere 10 cm. This might be the reason why the IPCC authors are unusually cautious (see AR4 of WG I, Observations: Oceanic Climate Change and Sea Level, page 410, Chapter 5.5.2): “…there is an increasing opinion that the best estimate lies closer to 2 mm/yr than to 1 mm/yr…”
Conclusion
When basing conclusions solely on the ever-escalating opinions of a few scientists – some of them well known for blowing the horn of alarmism – political leaders should not decide on extremely costly measures to curb a completely doubtful global sea level rise, which is a mere statistical construct from the very beginning.
The only rational conclusion one can draw is: All global mean sea level rise claims with an accuracy of better than ± 10 cm/century have to be taken with great caution. Breaking down the rise into components and attributing a respective rise to each, especially to temperature rise, is not possible with today’s level of knowledge. Each assignment of factors is simply too speculative.
SOURCE (See the original for references and graphics)
Carbon Dioxide Not a Well Mixed Gas and Can’t Cause Global Warming
One of the least challenged claims of global warming science is that carbon dioxide in the atmosphere is a “well-mixed gas.” A new scientific analysis not only debunks this assertion but also shows that standard climatology calculations, applicable only to temperature changes of the minor gas carbon dioxide, were fraudulently applied to the entire atmosphere to inflate alleged global temperature rises.
Acceptance of the “well-mixed gas” concept is a key requirement for those who choose to believe in the so-called greenhouse gas effect. A rising group of skeptic scientists has put the “well-mixed gas” hypothesis under the microscope and shown it contradicts not only satellite data but also measurements obtained in standard laboratory experiments.
Canadian climate scientist Dr Tim Ball is a veteran critic of the “junk science” of the Intergovernmental Panel on Climate Change (IPCC) and no stranger to controversy.
Ball is prominent among the “Slayers” group of skeptics and has been forthright in denouncing the IPCC claims: “I think a major false assumption is that CO2 is evenly distributed regardless of its function.“
School Children Prove Carbon Dioxide is Heavier than Air
Dr. Ball and his colleagues appear to be winning converts with their hard-nosed re-examination of the standard myths of climate science and this latest issue is probably one of the easiest for non-scientists to comprehend.
Indeed, even high school children are taught the basic fact that gravity causes objects heavier than air to fall to the ground. And that is precisely what CO2 is – this minuscule trace gas (just 0.04% of the atmosphere) is heavier than air and is soon down and out, as shown by a simple school lab experiment.
Or we can look at it another way, to make these physical relationships easy to grasp. Scientists refer to ratios based on common standards: rather than refer to unit volumes and masses, they use the concept of Specific Gravity (SG). Giving standard air a value of 1.0, the measured SG of CO2 is 1.5 (considerably heavier). [1.]
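As a rough cross-check of that 1.5 figure, the specific gravity of a gas relative to air follows from the ratio of molar masses. A minimal sketch in Python, where the molar masses are standard textbook values rather than figures taken from the article:

    # Specific gravity of CO2 relative to dry air, from molar masses.
    M_CO2 = 44.01   # g/mol: 12.01 for carbon plus 2 * 16.00 for oxygen
    M_AIR = 28.97   # g/mol: mean molar mass of dry air (standard value)
    print(f"SG of CO2 relative to air: {M_CO2 / M_AIR:.2f}")   # ~1.52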
CO2: The Heavy Gas that Heats then Cools Faster!
The same principle applies to heat transfer: the Specific Heat (SH) of air is 1.0, while the SH of CO2 is 0.8 (it heats and cools faster). Combining these properties allows for thermal mixing. Heavy CO2 warms faster and rises, as in a hot air balloon. It then rapidly cools and falls.
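The specific heat ratio can be checked the same way. A minimal sketch, where the heat capacities are standard room-temperature values, again my addition rather than figures from the article:

    # Specific heat (constant pressure, ~25 C) of CO2 relative to air.
    CP_AIR = 1.005   # J/(g K), standard value for dry air
    CP_CO2 = 0.844   # J/(g K), standard value for CO2
    print(f"SH of CO2 relative to air: {CP_CO2 / CP_AIR:.2f}")   # ~0.84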
This 'thermal' mixing is aided by wind flow patterns, but the ratios of gases in the atmosphere are never static or uniform anywhere on Earth. Without these properties CO2 would fill every low area to dangerously high levels. Not 'high' in a toxic sense, only that CO2 would displace enough oxygen that you could not have proper respiration. Nitrogen is 78% of the atmosphere and totally non-toxic, but if you continue to increase nitrogen and reduce oxygen the mixture becomes 'unbreathable.'
It is only if we buy into the IPCC’s “well mixed gas” fallacy that climate extremists can proceed to dupe us further with their next claim: that this so-called “well mixed” CO2 then acts as a “blanket” to “trap” the heat our planet receives from the sun.
More HERE
Affordable energy is essential for jobs, justice – and better health
Amid mounting criticism of its voluminous rulemaking proposals, EPA continues to insist that its new rules for coal-fired power plant emissions will generate benefits far in excess of their costs. Those claims have no basis in fact, as this article by Affordable Power Alliance co-chair Niger Innis emphasizes.
EPA is able to make these assertions only by cherry-picking data and studies – and, more significantly, by failing to address the serious adverse effects that its proposed rules will have on electricity prices, jobs, affordable heating and air conditioning, and other essential foundations of “human health,” “public welfare” and “environmental justice.” Any proper analysis would fully consider these impacts. EPA’s analysis ignores them.
The Environmental Protection Agency insists that its recent air quality initiatives will protect minority and poor Americans from pollution that “disproportionately affects” their health and impairs “environmental justice.” The Affordable Power Alliance is not convinced.
We believe EPA needs to reexamine its entire air pollution regulatory program and carefully consider all aspects of health, welfare and justice, especially those it has failed to address thus far.
As a coalition of minority, civil rights, religious, elderly and small business groups, the APA strongly supports public health, pollution control and justice. However, we are deeply concerned that EPA’s proposed rules actually undermine those objectives, by impairing access to affordable, reliable energy – and thus people’s health and welfare.
EPA’s health claims about mercury, soot, ozone, sulfur dioxide, nitrogen dioxide and other pollutants are speculative and based on selective literature searches, according to an extensive analysis by natural scientist Dr. Willie Soon (posted at www.AffordablePowerAlliance.org). The agency failed to consider studies that contradict its claims that poor and minority communities face serious, immediate health risks from power plant emissions, say Soon and scientists cited in his report.
These emissions have been declining for decades and are not related to asthma rates – which have been rising for reasons unrelated to outdoor air pollution, say air pollution consultant Joel Schwartz and other experts. Indeed, it defies logic to suppose that power plant emissions are causing increased asthma, if asthma rates are rising while pollution is declining. Rapid power plant emission reductions of the magnitude contemplated by EPA would thus not seem necessary.
Worse, EPA’s pollution rules will impair access to affordable electricity. They will force the closure of multiple power plants, send electricity prices soaring 12-60 percent, and severely impact business and family budgets, according to studies by Management Information Services (MIS), utility associations and other experts.
Especially in the 26 states that rely on coal for 48-98% of their electricity, EPA’s actions will raise family electricity costs by hundreds of dollars a year. They will increase factory, hospital, office, hotel, school, church, charity and other business electricity costs by thousands to millions of dollars annually.
Because every $30,000 in increased energy costs could mean the elimination of another entry-level job, EPA’s rules will cause further job losses. MIS predicts that 3.5 million jobs and up to $82 billion in annual economic production will be lost in just six Midwestern manufacturing states.
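To see how that rule of thumb scales, here is a minimal sketch; the $9 billion aggregate figure is a hypothetical input of mine, not a number from the article or from MIS:

    # The article's rule of thumb: every $30,000 of added energy costs
    # could eliminate one entry-level job.
    COST_PER_JOB = 30_000                # dollars per job, per the article
    added_costs = 9_000_000_000          # hypothetical aggregate cost increase
    print(f"Implied job losses: {added_costs / COST_PER_JOB:,.0f}")   # 300,000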
Chicago public schools alone will face an extra $2.7 million a year in electricity costs by 2014, notes the Chicago Tribune. These increases will mean reductions in school employment, salaries, and academic, sports and music programs.
Unemployment is already 9.1% nationally and over 17% in black communities. EPA’s plans will worsen these rates, significantly increase household energy costs, and make poor, minority and elderly families even less able to afford gasoline, food, clothing, healthcare and other basic needs.
Many families will suffer increased stress, drug and alcohol abuse, domestic violence and crime rates. Unable to afford proper heating and air conditioning, disproportionate numbers of people in low income communities will face hypothermia during frigid winter months and heat prostration during summer heat waves. People will die, as cash-strapped states run out of money for heating and AC assistance, even more rapidly than they did last year.
Retrofitting older power plants is often too costly to justify and, in today’s regulatory and litigious environment, replacing them will be extremely difficult, especially under EPA’s short timeframe for further cleaning up … or simply closing down … the older plants.
Analysts project that EPA’s rules could cost Illinois 3,500 megawatts of electricity generation by 2014 – enough to power 3,500,000 homes and small businesses. The United States could lose 17,000 to 81,000 megawatts of capacity by 2017, industry and independent experts forecast. The Federal Energy Regulatory Commission estimates up to 81,000 megawatts of capacity could be lost by 2018.
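The capacity-to-homes figure is easy to sanity-check; the assumption of roughly 1 kW of average demand per home is a common rule of thumb, mine rather than the analysts’:

    # 3,500 MW of lost generation versus homes served.
    capacity_kw = 3_500 * 1_000          # 3,500 MW expressed in kW
    avg_demand_kw_per_home = 1.0         # assumed average household demand
    print(f"Homes served: {capacity_kw / avg_demand_kw_per_home:,.0f}")   # 3,500,000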
That means further impaired electricity availability and reliability during peak use periods. It will likely result in brownouts and blackouts, further harming businesses, schools, families, jobs and health.
EPA says the benefits of its new rules “far exceed” their costs. However, the agency’s analyses and definitions of “human health,” “public welfare” and “environmental justice” fail to consider the vital factors presented here. The fact is, the adverse effects of unemployment, sharply higher energy costs and generally lower socio-economic conditions far outweigh asserted benefits of improved air quality.
“Even when properly done, science can only provide the analytical and factual basis for public policy decisions,” says Dr. Roger McClellan, former chair of EPA’s Clean Air Scientific Advisory Committee. “It cannot and should not dictate a particular policy choice in setting and implementing standards.”
Those decisions must consider the full spectrum of energy, employment, economic, health, welfare and justice issues presented here and by other analysts. So far, EPA has failed to do this and has relied on biased analyses in setting its unscientific pollution standards.
McClellan also agrees with Supreme Court Justice Stephen Breyer, whose commonsense, comparative health approach recognizes the detrimental impacts that unemployment and reduced living standards have on people’s health and welfare. “Those impacts far outweigh benefits from further improvements in already good air quality,” especially as calculated using EPA’s computer models and linear extrapolations from limited health and air quality data, McClellan explains.
EPA says it cannot consider the economic effects of its regulations. However, if the regulations also affect human health and welfare, EPA needs to consider those impacts fully and carefully.
EPA’s mission is to protect Americans from real health risks – not from speculative dangers based on cherry-picked data and extrapolations, McClellan, Schwartz, Soon and other experts emphasize. The agency must refrain from implementing rules that adversely affect vital components of “public health and welfare,” like those discussed here, until all these factors are examined fully and carefully.
Abundant, reliable, affordable energy is the foundation for everything we eat, make, ship and do – and for jobs, human health, environmental quality, civil rights progress and environmental justice.
America needs a full national and congressional debate on EPA’s rules, before they cause serious damage that many experts fear is inevitable if the regulations are implemented.
SOURCE
British green energy reforms ‘to put £300 on household energy bills'
Green energy policies are set to add more than £300 a year to the average household energy bill, according to Downing Street calculations. David Cameron has been warned that there will be a 30 per cent rise in consumer bills by 2020 as a direct result of the Coalition’s policies.
The note from the Prime Minister’s senior policy adviser Ben Moxham also labels as ‘unconvincing’ Energy Secretary Chris Huhne’s claims that price increases would be offset by lower consumption due to energy efficiency measures.
The projected rise of nearly a third in the average household energy bill of £1,059 is blamed on policies designed to promote the use of renewables and nuclear power sources.
New obligations on energy firms to use increasing amounts of electricity from renewable sources and to help low-income homes become energy-efficient are also major factors.
Worryingly, the 30 per cent rise is not described as a worst-case scenario, merely a ‘mid-case’ projection.
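The arithmetic behind the headline is straightforward:

    # 30 per cent of the average £1,059 household energy bill.
    average_bill_gbp = 1_059
    rise_gbp = 0.30 * average_bill_gbp
    print(f"Projected increase: about £{rise_gbp:.0f} a year")   # ~£318, i.e. 'more than £300'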
‘Over time it is clear that the impact of our policies on consumer bills will become significantly greater,’ Mr Moxham states.
He adds: ‘DECC’s (Department of Energy and Climate Change) mid-case gas price scenario sees policies adding 30 per cent to consumer energy bills by 2020 compared to a world without policies.’
The note is dated July 29, 2011, and is copied to senior Downing Street advisers including Mr Cameron’s chief of staff Ed Llewellyn, permanent secretary Jeremy Heywood and policy chief Steve Hilton.
Mr Moxham also warns: ‘We find the scale of household energy consumption savings calculated by DECC to be unconvincing’.
The projected rise in energy bills is a major headache for Mr Cameron, who promised before last year’s general election to tackle soaring prices by giving regulators more powers.
Mr Huhne has repeatedly dismissed claims that fuel bills will rise by hundreds of pounds as ‘absolute nonsense’ and ‘rubbish calculations’.
The rise projected by No 10 is still far lower than estimates made by independent experts.
Earlier this year, experts at the bank Unicredit said that a raft of green measures could mean energy bills doubling within just four years. Their report said: ‘According to our analysis, a typical UK energy bill could rise from the current level of £1,000 per year to over £2,000 per year by 2015. As investment occurs, bills could double every five years until 2020, in our view.’
SOURCE
***************************************
For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For readers in China or for times when blogger.com is playing up, there are mirrors of this site here and here
*****************************************