An announcement below from S. Fred Singer (singer@sepp.org)
The UN-COP-15 conference opens in Copenhagen on Dec 7 -- Pearl Harbor Day.
Geologist Leighton Steward and I will both be in CPH, holding forth at a briefing session at the Danish Parliament, together with Lord Monckton and other notables (including some surprise participants). But we also propose to conduct a Virtual Conference on the Internet, entitled "Copenhagen -- another Pearl Harbor." This title suggests that a CPH Accord would be a disaster for the US, and indeed the whole world -- even worse than the notorious Kyoto Protocol.
Leighton runs the website www.co2isgreen.org and will organize the VC. I am the organizer and coauthor of the NIPCC reports (www.NIPCCreport.org). [NIPCC is our answer to the IPCC, and presents the scientific evidence against the IPCC in the report "Nature, Not Human Activity, Rules the Climate."]
In my current role as a 'community organizer' I seek your participation in the VC. We plan to run dozens of short videos (30 to 120 seconds) where you can identify yourself and present a brief statement against the IPCC and AGW, against CO2 rationing, against the folly of Cap and Trade, and on the uselessness of the CPH conference.
If you are willing to cooperate, please send your name, a bio (20-40 words), phone number, and e-mail address directly to Leighton Steward via the contact form on CO2 is Green. A professional organization will then contact you and handle the production.
Your participation is entirely voluntary, of course. But I hope you will want to play a part in this pioneering venture, and I look forward to seeing your handsome face on the web on Dec 7.
Assumptions trump measurements
On the 11th I put up a translation of a report by a Warmist (Leinfelder) that noted many doubts about climate change being expressed by German scientists. The article was headed: GERMANY'S GEO-RESEARCH INSTITUTES CRITICISE CLIMATE ALARMISM.
In an endeavour to refute the doubts, Leinfelder said: "Not mentioned, however, was the essential fact: the CO2 we emit is now accumulating in the atmosphere -- unlike other greenhouse gases, it persists over thousands of years -- and it now exceeds all values of at least the last million years."
That is certainly an IPCC assumption, but it is simply not true. CO2 does NOT accumulate in the atmosphere for any length of time. It is absorbed, mainly by the oceans, within a few years of being emitted. I reproduce below a 2007 article on that subject, which reports comments by Norwegian geologist Tom Segalstad. Segalstad also has an August 2009 article here which reports further research confirmation of his comments given below.
We are doomed, say climate change scientists associated with the United Nations Intergovernmental Panel on Climate Change, the United Nations body that is organizing most of the climate change research occurring in the world today. Carbon dioxide from man-made sources rises into the atmosphere and then stays there for 50, 100, or even 200 years. This unprecedented buildup of CO2 then traps heat that would otherwise escape our atmosphere, threatening us all.
"This is nonsense," says Tom V. Segalstad, head of the Geological Museum at the University of Oslo and formerly an expert reviewer with the same IPCC. He laments the paucity of geologic knowledge among IPCC scientists -- a knowledge that is central to understanding climate change, in his view, since geologic processes ultimately determine the level of atmospheric CO2.
"The IPCC needs a lesson in geology to avoid making fundamental mistakes," he says. "Most leading geologists, throughout the world, know that the IPCC's view of Earth processes are implausible if not impossible."
Catastrophic theories of climate change depend on carbon dioxide staying in the atmosphere for long periods of time -- otherwise, the CO2 enveloping the globe wouldn't be dense enough to keep the heat in. Until recently, the world of science was near-unanimous that CO2 couldn't stay in the atmosphere for more than about five to 10 years because of the oceans' near-limitless ability to absorb CO2.
"This time period has been established by measurements based on natural carbon-14 and also from readings of carbon-14 from nuclear weapons testing, it has been established by radon-222 measurements, it has been established by measurements of the solubility of atmospheric gases in the oceans, it has been established by comparing the isotope mass balance, it has been established through other mechanisms, too, and over many decades, and by many scientists in many disciplines," says Prof. Segalstad, whose work has often relied upon such measurements.
Then, with the advent of IPCC-influenced science, the length of time that carbon stays in the atmosphere became controversial. Climate change scientists began creating carbon cycle models to explain what they thought must be an excess of carbon dioxide in the atmosphere. These computer models calculated a long life for carbon dioxide.
Amazingly, the hypothetical results from climate models have trumped the real world measurements of carbon dioxide's longevity in the atmosphere. Those who claim that CO2 lasts decades or centuries have no such measurements or other physical evidence to support their claims.
Neither can they demonstrate that the various forms of measurement are erroneous. "They don't even try," says Prof. Segalstad. "They simply dismiss evidence that is, for all intents and purposes, irrefutable. Instead, they substitute their faith, constructing a kind of science fiction or fantasy world in the process."
In the real world, as measurable by science, CO2 in the atmosphere and in the ocean reach a stable balance when the oceans contain 50 times as much CO2 as the atmosphere. "The IPCC postulates an atmospheric doubling of CO2, meaning that the oceans would need to receive 50 times more CO2 to obtain chemical equilibrium," explains Prof. Segalstad. "This total of 51 times the present amount of carbon in atmospheric CO2 exceeds the known reserves of fossil carbon -- it represents more carbon than exists in all the coal, gas, and oil that we can exploit anywhere in the world."
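The "51 times" figure follows from simple partition arithmetic. The sketch below works it through in a few lines of Python; the 750 GtC figure for the atmospheric carbon stock is an assumed illustrative value, not from the article -- only the 50:1 ratio and the "51 times" conclusion come from Segalstad's argument.

```python
# Sketch of the ocean/atmosphere partition arithmetic described above.
# ATM_CARBON_GTC is an assumed illustrative figure, not from the article.

ATM_CARBON_GTC = 750.0   # assumed atmospheric carbon stock (GtC)
OCEAN_ATM_RATIO = 50.0   # equilibrium ocean:atmosphere ratio, per the article

# Doubling atmospheric CO2 adds one atmosphere's worth of carbon to the air;
# restoring the 50:1 equilibrium would then require 50 atmospheres' worth
# to be added to the ocean as well.
added_to_atmosphere = ATM_CARBON_GTC                  # +1 unit
added_to_ocean = OCEAN_ATM_RATIO * ATM_CARBON_GTC     # +50 units
total_new_carbon = added_to_atmosphere + added_to_ocean

print(total_new_carbon / ATM_CARBON_GTC)  # 51.0 -- the "51 times" in the text
print(total_new_carbon)                   # 38250.0 GtC under these assumptions
```

Under the assumed 750 GtC starting stock, the total works out to over 38,000 GtC -- which is the basis of the claim that a sustained doubling would demand more carbon than known fossil reserves contain.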
Also in the real world, Prof. Segalstad's isotope mass balance calculations -- a standard technique in science -- show that if CO2 in the atmosphere had a lifetime of 50 to 200 years, as claimed by IPCC scientists, the atmosphere would necessarily have half of its current CO2 mass. Because this is a nonsensical outcome, the IPCC model postulates that half of the CO2 must be hiding somewhere, in "a missing sink." Many studies have sought this missing sink -- a Holy Grail of climate science research -- without success.
"It is a search for a mythical CO2 sink to explain an immeasurable CO2 lifetime to fit a hypothetical CO2 computer model that purports to show that an impossible amount of fossil fuel burning is heating the atmosphere," Prof. Segalstad concludes. "It is all a fiction."
SOURCE
Why Joe Romm is influential
Some excerpts below from a long article by Michael Shellenberger and Ted Nordhaus
Abstract:
What gave rise to Joe Romm and Climate McCarthyism? In a word: hyper-partisanship. America is more polarized politically today than it has been in 130 years. The fracturing of traditional media has political partisans looking for people who will filter news, analysis, and opinions for them. Democrats who care about the environment have been turning to Joe Romm. They wished for somebody tough to stand up to the bad guys on climate change. They wished for somebody to simplify complicated questions. In "The Hyper-Partisan Mind," we see why they should be careful what they wish for.
In Part 1 and in Part 2 we documented how Joe Romm uses McCarthyite tactics, including character assassination, misrepresentation, and guilt-by-association, to intimidate the press corps and discredit non-skeptical climate experts as "global warming deniers." In this post we will explore one of the main forces that gave rise to Climate McCarthyism: hyper-partisan polarization.
America is more polarized today than at any time since Reconstruction. A major quantitative analysis by social scientists Nolan McCarty, Keith Poole and Howard Rosenthal found today to be the most polarized period in 130 years.
Little wonder then that Romm's strength lies in his appeals to Democratic partisan identity. He writes for a Democratic audience and mobilizes liberal and environmentalist readers to attack reporters, activists, and policymakers who diverge, literally, from the Party line.
Today's fractured and polarized media environment has allowed Joe Romm to become the most influential liberal climate activist in the country, largely because he has convinced liberals and Democrats that he is an energy and climate science expert. This explains why Nobel Prize Winner and New York Times columnist Paul Krugman says "I trust Joe Romm," Thomas Friedman calls ClimateProgress.org "the indispensable blog," Al Gore relies on him for technical analysis, and the Center for American Progress makes him the organization's chief spokesperson on climate and energy issues.
In this post we will see how Romm helps Democrats make mental short-cuts about who to trust and distrust, which technologies are promising and which are chimeras, and which policies to advocate and which to oppose. We will document how Romm does this by inventing associations between people he disagrees with and various Republicans, particularly George W. Bush.
And we will argue - against those who pooh-pooh his influence - that Joe Romm is, in fact, far more influential today than Joe McCarthy was in the 1950s, a fact that, unfortunately, has proven poisonous to creating the consensus needed for serious action on climate.
Partisan Identity as a Mental Short-Cut
It's no coincidence that America's Climate McCarthyite-in-chief is a blogger at the largest liberal think tank and not a U.S. Senator. Busy fundraising and campaigning, members of Congress have largely outsourced the deliberative process of legislating to partisan interest groups and think tanks.
Meanwhile, the explosion of new media and the resulting flood of information mean that educated partisans -- including beat reporters and national columnists -- are looking for partisan specialists to filter their news, analysis, and media commentary. "We may believe intellectually in the clash of opinions," Times columnist Nicholas Kristof noted, "but in practice we like to embed ourselves in the reassuring womb of an echo chamber."
Much has been written about the ideological echo chamber conservatives like Sen. James Inhofe, Rush Limbaugh, and Glenn Beck have created to enforce anti-environmental orthodoxy on the Right. Less remarked upon has been the creation of its analog on the Left -- an accomplishment in which Romm has taken a leading role. Romm has mastered the echo chamber in its liberal expression and creates a reassuring green womb for his growing cadre of loyal readers. Every day of the week he dutifully filters the news, telling readers the good news of yet another McKinsey report on how energy efficiency more than pays for itself, and the bad news of yet another outrageous declaration by the dastardly Sen. Inhofe. In one post Romm serves up news stories of natural disasters as evidence of the imminent apocalypse, while in the next he touts new studies showing how cheap solar power is and how expensive nuclear is.
Most importantly Romm functions to inform his readers of the partisan identity of any given thing, whether it be a new technology, policy, or analysis. Thus, when it came time for Romm to criticize a rather technical piece on the rising carbon intensity of the global economy that appeared in the journal Nature -- which we discussed in our last post -- he attacked it, not as inaccurate or incorrect, but rather as Republican....
More HERE
How Dense Can They Get? Good fuels need no subsidies
When it comes to power, density is the key. Energy density. The reason that solar power, wind power, and ethanol are so expensive is that they are derived from very diffuse energy sources. It takes a lot of energy collectors such as solar cells, wind turbines, or corn stalks covering many square miles to produce the same amount of power that traditional coal, natural gas, or nuclear plants can on just a few acres.
Each of these alternative energy sources is based on mature technology. Agriculture and fermentation have their roots in prehistory; windmills date back at least to 65 B.C.; the photovoltaic effect was discovered in 1839. Yet nowhere in the world are these technologies serving as primary energy sources without significant government subsidies. While incremental improvements can be expected, what is needed for them to become viable is an order-of-magnitude increase in productivity. As old and as well-researched as the technologies are, such improvements are possible but unlikely. As significant future energy sources, these technologies are dead ends, which is why the government, and not the private sector, is funding them.
Industry is more than willing to risk research dollars on technologies that show real promise, but it is not willing to flush shareholder money down a rat hole. Politicians, however, operate from different incentives. When a crisis, real or imagined, makes headlines, they want voters to see them doing “something” about it, and they must move quickly because election cycles and constituent attention spans are short. Funding long-term research in promising technologies is not sufficient to meet politicians’ needs. Solar panels, wind turbines, and ethanol refineries are all current technology and can be erected quickly with fanfare and photo-ops. By the time these alternative power sources prove to be financial and, possibly, environmental busts, the politicians will have been reelected and voters’ attention will have shifted to the next crisis.
Another benefit of subsidizing “shovel ready” solutions is that existing technologies have existing supporters who can provide campaign funds. Such supporters, however, constitute a well-financed “status quo” that will make government funding, once started, difficult to end. For example, even though corn-based ethanol has driven up food and fuel prices, increased auto emissions, raised atmospheric carbon dioxide concentrations (by causing additional acreage to be tilled), and possibly resulted in net energy losses, the government is still subsidizing the industry and still requiring that the fuel be added to gasoline.
Wind turbines, for their part, kill large numbers of birds, and this will only get worse as more turbines are erected. Eventually, such kills could reach a level at which they hurt local environments by reducing the natural check that birds and bats place on insect and rodent populations. Even should this occur, however, the wind turbine juggernaut will be hard to stop in the face of an entrenched lobby.
By contrast, consider the significant oil-industry investments in researching biofuels made from algae. Unlike ethanol, biofuels are chemically similar to fuel made from petroleum and, like petroleum-based fuels, have a significantly higher energy content than does ethanol. Biofuels can also be handled by current fuel distribution systems and can be burned in today’s vehicles.
Algae can be grown in brackish water on desert land and, with today's technology, can produce over 2,000 gallons of fuel per acre each year. This compares favorably with the approximately 250 gallons of ethanol that can be produced from an acre of corn -- a ratio of 8 to 1. Accounting for the differences in BTU content, the ratio jumps to over 12 to 1. It may even be possible to boost productivity to 100,000 gallons per acre per year, raising algae's potential to over 600 times that of corn-based ethanol.
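The ratios above are easy to check. The sketch below reproduces them; the gallons-per-acre figures come from the article, but the BTU-per-gallon numbers are assumed illustrative values (typical published figures for ethanol and a diesel-like algal fuel), not from the source.

```python
# Checking the yield ratios quoted above. Gallon-per-acre figures are from the
# article; the BTU-per-gallon values are assumptions, not from the source.

ALGAE_GAL_PER_ACRE = 2000.0
CORN_ETHANOL_GAL_PER_ACRE = 250.0
ETHANOL_BTU_PER_GAL = 76_000.0       # assumed typical value
ALGAL_FUEL_BTU_PER_GAL = 118_000.0   # assumed, diesel-like fuel

volume_ratio = ALGAE_GAL_PER_ACRE / CORN_ETHANOL_GAL_PER_ACRE
energy_ratio = volume_ratio * (ALGAL_FUEL_BTU_PER_GAL / ETHANOL_BTU_PER_GAL)

print(volume_ratio)         # 8.0 -- the "8 to 1" in the text
print(round(energy_ratio))  # 12 -- the "over 12 to 1" after BTU adjustment

# The speculative 100,000 gal/acre figure yields the "over 600 times" claim:
speculative = (100_000.0 / CORN_ETHANOL_GAL_PER_ACRE) * (
    ALGAL_FUEL_BTU_PER_GAL / ETHANOL_BTU_PER_GAL
)
print(round(speculative))   # 621
```

So the article's "over 12 to 1" and "over 600 times" figures both follow from an assumed energy-content ratio of roughly 1.5 between algal fuel and ethanol.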
Biofuels are carbon-neutral because the carbon dioxide released when they are burned is first extracted from the atmosphere by the algae. Unlike burning petroleum-based fuels, then, burning biofuels will not result in a net increase in atmospheric CO2 levels.
With algae’s vast potential, it is easy to understand why private industry is interested and why no government subsidies are needed to encourage investment. Moreover, if algae-based fuels do not prove viable, the companies now researching them will have no “status quo” problems with ending their investments and shifting scarce resources to more promising technologies – where “promise” is measured in density.
SOURCE
Britain's Warmist fairytale
Britain has no chance of meeting its main carbon-reduction target because it lacks the engineering and manufacturing capacity to deliver the required renewable energy, a study has found.
The Government has made a legally binding commitment to cut emissions by 80 per cent by 2050 but has failed to set out how this could be achieved.
The study by the Institution of Mechanical Engineers says that the target, the central plank of Britain’s negotiating position at the UN climate change summit in Copenhagen next month, is “an act of faith” with no grounding in reality. Britain would need to build the equivalent of 30 nuclear power stations by 2015 to be on course to meet the target, the study says. On Monday the Government said it hoped that private companies would build ten by 2025.
The institution calls on the Government to accept the "uncomfortable reality" that the 80 per cent target, mandated in the Climate Change Act, is unachievable. It says: "Given the magnitude of the engineering challenge and the pace of action required, the institution concludes that the Climate Change Act has failed even before it has started. It seems likely that the Act will have to be revisited by Parliament or simply ignored by policymakers."
The study estimates that, even using optimistic assumptions about annual rates of carbon reduction, the earliest the target for 2050 could be achieved is 2100.
Emissions per unit of GDP, known as carbon intensity, would have to fall by 5 per cent a year for the next 40 years to meet the target. Britain’s highest rate of carbon intensity reduction was 2.3 per cent a year in the mid-1990s when several coal-fired power stations were replaced by more efficient gas-fired ones. In recent years, carbon intensity has been falling by about 1.3 per cent a year.
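The gap between the required 5 per cent rate and the historical 1.3 to 2.3 per cent is starker when compounded over 40 years, since emissions are GDP multiplied by carbon intensity. The sketch below compounds the rates from the article; the 2 per cent annual GDP growth rate is an assumed illustrative value, not from the study.

```python
# Why the 5%-a-year intensity figure is so demanding: emissions = GDP x intensity,
# so intensity cuts must outrun GDP growth. The 2% growth rate is an assumption.

YEARS = 40
GDP_GROWTH = 0.02  # assumed annual GDP growth rate

def emissions_ratio(intensity_decline: float) -> float:
    """Emissions after YEARS, as a fraction of today's emissions."""
    return ((1 + GDP_GROWTH) ** YEARS) * ((1 - intensity_decline) ** YEARS)

# Rates from the article: recent UK performance, the mid-1990s best, and
# the rate the study says the 2050 target requires.
for rate in (0.013, 0.023, 0.05):
    print(f"{rate:.1%}/yr intensity decline -> "
          f"emissions at {emissions_ratio(rate):.0%} of today's level")
```

Under this growth assumption, the recent 1.3 per cent rate actually leaves emissions higher than today, the mid-1990s 2.3 per cent rate barely holds them level, and even 5 per cent a year leaves emissions near 28 per cent of today's level -- a cut of roughly 72 per cent, in the neighbourhood of (though still short of) the 80 per cent target.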
The institution accuses the Government of ignoring its own evidence about how long it takes to deliver infrastructure. It dismisses the idea that Britain could recruit engineers from abroad. It also says that private companies, on which the Government is relying to deliver low-carbon infrastructure, are “simply not that interested”.
Stephen Tetlow, the institution’s chief executive, said Britain needed to adopt a “wartime mentality”, with people as acutely aware of their energy consumption as they were of food consumption during the Second World War.
SOURCE
Brits to pay up big in pursuit of a Greenie fantasy
Families will pay a new levy on electricity bills for at least the next 20 years to fund technology designed to capture the carbon from coal-fired power stations. The Government is planning to raise £9.5 billion from the levy to subsidise up to four carbon capture and storage (CCS) demonstration plants. Details of the first plant will be announced early next year. The Department for Energy and Climate Change said yesterday that uncertainty over the commercial viability of CCS meant that public support might have to continue beyond 2030.
The Government is promoting CCS to justify approving new coal plants to replace the eight due to close by 2015 under European rules on air pollution. Burning coal produces far more carbon than burning gas for the same amount of electricity but ministers want to build new coal plants to reduce Britain’s dependence on imported gas.
E.On announced last month that it was delaying its plan for a new coal station with CCS at Kingsnorth, Kent, for at least three years. However, the Kingsnorth plant may yet go ahead and, along with a proposed plant at Longannet in Scotland, is competing to be the first subsidised CCS demonstration project.
The department said the CCS levy, likely to start in 2011, would be about £17 a year per household. It said that the cost could be higher if its assumptions about the cost of CCS proved too optimistic. The initial levy, which will be imposed on electricity suppliers but passed on to consumers, will run for 15 years. This will pay for the first phase of CCS, under which new coal plants will have to capture the carbon from only about a quarter of their generating capacity.
Ed Miliband, the Energy and Climate Change Secretary, said that the levy could be continued beyond the 15-year period to subsidise CCS for the entire output of the four plants.
An official from the department said it was possible that the levy could remain in place for an additional 15 years, but this would depend on the price of permits to emit carbon. If the price remained at the present low level, CCS would continue to need huge subsidies because it would be cheaper for generators to buy permits for their carbon emissions than to invest in technology to reduce them.
Mr Miliband admitted that further regulations or financial incentives might be needed to encourage the development of CCS. He ordered a “rolling review” of progress on CCS and said it would report by 2018 on whether it was “technically or economically viable”. The department said its ambition was for any coal plant opening after 2020 to have CCS covering its full capacity from the outset. Its draft policy on “clean coal” said it hoped that the four demonstration projects would allow CCS to be applied to existing coal plants from 2020. “Our ambition is for CCS to be ready for widespread deployment from 2020.”
It admitted that there was a risk that CCS, which has yet to be shown to work commercially anywhere in the world, might prove unviable. “In the event that CCS is not on track to become technically or commercially viable, preventing retrofit, an appropriate regulatory approach for managing emissions will be needed.”
Keith Allott, head of climate change at the environmental group WWF-UK, said: “The acknowledgement that we need a safety net in place, in case carbon capture and storage technology doesn’t work or costs too much, is a sensible step forward. However, waiting until the 2020s to put such a plan into action is foolhardy. “It gives us no guarantee that the advice of the Committee on Climate Change, which urges the UK to decarbonise the power sector by 2030, will be met. “It would also do nothing to stop the building of largely unabated coal power stations in the interim.”
SOURCE
Australia: More dreamers who want you to believe that they can see 100 years into the future
The usual stupid straight-line extrapolation; No mention that sea levels have stopped rising in recent years; No mention that we could well be in the middle of an ice-age by then
Almost 250,000 homes, now worth up to $63 billion, will be "at risk of inundation" by the end of the century, under "worst-case but plausible" predictions of rising sea levels. The study -- released ahead of the crucial Senate vote on Labor's emissions trading scheme -- modelled the effect of a 1.1m sea-level rise on cities and towns around Australia. This is a higher level than the 79cm end-of-century rise predicted in the last Intergovernmental Panel on Climate Change report, but in the mid-range of some subsequently published research.
It found between 157,000 and 247,000 homes "at risk of inundation" -- meaning they would be permanently flooded or frequently flooded by storm surges or king tides -- with hospitals, water-treatment plants and other public buildings also found to be at risk. Even Sydney airport would be at "increased risk" of inundation, according to the study, written by the Department of Climate Change with input from CSIRO, Geosciences Australia and scores of academics.
The study -- which models possible risks down to township and local government areas complete with aerial photographs of towns showing the possible inundation -- appears timed to give the public a sharp reminder of the possible dangers of climate change. It also increases pressure on the opposition as the government's ETS bill is brought back to parliament next week.
It found NSW had "the greatest exposure", with between 40,800 and 62,400 homes at risk, followed by Queensland (35,900 to 56,900), Victoria (27,600 to 44,600), South Australia (25,200 to 43,000) and Western Australia (18,700 to 28,000). Within each state, it identified the local government areas where property was most "at risk" -- for NSW, Lake Macquarie, Wyong, Gosford, Wollongong, Shoalhaven and Rockdale; for Queensland, Moreton Bay, Mackay, the Gold Coast, Fraser Coast, Bundaberg and the Sunshine Coast; and for Victoria, Kingston, Geelong, Wellington and Port Phillip.
The study says that "based on the recent science 1.1m was selected as a plausible value for sea-level rise for this risk assessment. It is important to note that the purpose of a risk assessment is to identify areas of risk and therefore plausible worst-case scenarios need to be considered."
Andrew Ash, director of the CSIRO climate-change adaptation flagship, said the 1.1m sea-level rise was "certainly plausible". "As things stand, the only variation will be exactly when we reach that level," Dr Ash said. Given that the study was meant to help government planning decisions, it was therefore "both plausible and appropriate" to model a 1.1m rise. As well as the threat of inundation, the study calculates how many buildings are under threat from "soft" erodable shorelines.
SOURCE
***************************************
For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC, GUN WATCH, SOCIALIZED MEDICINE, AUSTRALIAN POLITICS, IMMIGRATION WATCH INTERNATIONAL and EYE ON BRITAIN. My Home Pages are here or here or here. Email me (John Ray) here. For readers in China or for times when blogger.com is playing up, there are mirrors of this site here and here
*****************************************