Wednesday, August 31, 2016
Arctic ice extent was lower 6,000 years ago than it is today
And they had no anthropogenic global warming then! So could the present ice level be just another natural fluctuation? If not, why not?
Arctic Ocean perennial sea ice breakdown during the Early Holocene Insolation Maximum
Christian Stranne et al.
Abstract
Arctic Ocean sea ice proxies generally suggest a reduction in sea ice during parts of the early and middle Holocene (∼6000–10,000 years BP) compared to present day conditions. This sea ice minimum has been attributed to the northern hemisphere Early Holocene Insolation Maximum (EHIM) associated with Earth's orbital cycles. Here we investigate the transient effect of insolation variations during the final part of the last glaciation and the Holocene by means of continuous climate simulations with the coupled atmosphere–sea ice–ocean column model CCAM. We show that the increased insolation during EHIM has the potential to push the Arctic Ocean sea ice cover into a regime dominated by seasonal ice, i.e. ice free summers. The strong sea ice thickness response is caused by the positive sea ice albedo feedback. Studies of the GRIP ice cores and high latitude North Atlantic sediment cores show that the Bølling–Allerød period (c. 12,700–14,700 years BP) was a climatically unstable period in the northern high latitudes and we speculate that this instability may be linked to dual stability modes of the Arctic sea ice cover characterized by e.g. transitions between periods with and without perennial sea ice cover.
SOURCE
Ice scares aren’t all they’re cracked up to be
The sea ice in the Arctic Ocean is approaching its annual nadir. By early September each year about two-thirds of the ice cap has melted, then the sea begins to freeze again. This year looks unlikely to set a record for melting, with more than four million square kilometres of ice remaining, less than the average in the 1980s and 90s, but more than in the record low years of 2007 and 2012. (The amount of sea ice around Antarctica has been increasing in recent years, contrary to predictions.)
This will disappoint some. An expedition led by David Hempleman-Adams to circumnavigate the North Pole through the Northeast and Northwest passages, intending to demonstrate “that the Arctic sea ice coverage shrinks back so far now in the summer months that sea that was permanently locked up now can allow passage through”, was recently held up for weeks north of Siberia by, um, ice. They have only just reached halfway.
Meanwhile, the habit of some scientists of predicting when the ice will disappear completely keeps getting them into trouble. NASA climate scientist Jay Zwally told the Associated Press in 2007: “At this rate, the Arctic Ocean could be nearly ice-free at the end of summer by 2012.” Two years later Al Gore quoted another scientist that “there is a 75 per cent chance that the entire north polar ice cap, during the summer months, could be completely ice-free within five to seven years” — that is, by now.
This year Professor Peter Wadhams of Cambridge University has a new book out called Farewell to Ice, which gives a “greater than even chance” that the Arctic Ocean will be ice-free next month. Not likely.
He added: “Next year or the year after that, I think it will be free of ice in summer … You will be able to cross over the North Pole by ship.” The temptation to predict a total melt of the Arctic ice cap, and thereby get a headline, has been counter-productive, according to other scientists. Crying wolf does not help the cause of global warming; it only gives amusement to sceptics.
Would it matter if it did all melt one year? Here’s the point everybody seems to be missing: the Arctic Ocean’s ice has indeed disappeared during summer in the past, routinely. The evidence comes from various sources, such as beach ridges in northern Greenland, never unfrozen today, which show evidence of wave action in the past. One Danish team concluded in 2012 that 8500 years ago the ice extent was “less than half of the record low 2007 level”. A Swedish team, in a paper published in 2014, went further: between 10,000 years ago and 6000 years ago, the Arctic experienced a “regime dominated by seasonal ice, ie, ice-free summers”.
This was a period known as the “early Holocene insolation maximum” (EHIM). Because the Earth’s axis was tilted away from the vertical more than today (known as obliquity), and because we were then closer to the Sun in July than in January (known as precession), the amount of the Sun’s energy hitting the far north in summer was much greater than today. This “great summer” effect was the chief reason the Earth had emerged from an ice age, because hot northern summers had melted the great ice caps of North America and Eurasia, exposing darker land and sea to absorb more sunlight and warm the whole planet.
The effect was huge: about an extra 50 watts per square metre at 80 degrees north in June. By contrast, the total effect of man-made global warming will reach 3.5 watts per square metre (but globally) only by the end of this century.
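To put those two numbers side by side, here is a back-of-the-envelope sketch in Python; both figures come from the paragraph above, and the comparison deliberately ignores that one is local and seasonal while the other is a global annual mean.

```python
# Crude comparison of the two forcing figures quoted above.
ehim_extra_insolation = 50.0  # W/m^2, extra June insolation at 80 degrees north
agw_forcing_2100 = 3.5        # W/m^2, projected global man-made forcing by 2100

ratio = ehim_extra_insolation / agw_forcing_2100
print(f"Local EHIM June anomaly is ~{ratio:.0f}x the projected global forcing.")
# ~14x -- but note the caveat: local-and-seasonal versus global-annual-mean.
```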
To put it in context, the EHIM was the period during which agriculture was invented in about seven different parts of the globe at once. Copper smelting began; cattle and sheep were domesticated; wine and cheese were developed; the first towns appeared. The seas being warmer, the climate was generally wet so the Sahara had rivers and forests, hippos and people.
That the Arctic sea ice disappeared each August or September in those days does not seem to have done harm (remember that melting sea ice, as opposed to land ice, does not affect sea level), and nor did it lead to a tipping point towards ever-more rapid warming. Indeed, the reverse was the case: evidence from stalagmites in tropical caves, sea-floor sediments and ice cores on the Greenland ice cap shows that temperatures gradually but erratically cooled over the next few thousand years as the obliquity of the axis and the precession of the equinoxes changed. Sunlight is now weaker in July than January again (on global average).
Barring one especially cold snap 8200 years ago, the coldest spell of the past 10 millennia was the very recent “little ice age” of AD1300-1850, when glaciers advanced, tree lines descended and the Greenland Norse died out.
It seems that the quantity of Arctic sea ice varies more than we used to think. We don’t really know how much ice there was in the 1920s and 30s — satellites only started measuring it in 1979, a relatively cold time in the Arctic — but there is anecdotal evidence of considerable ice retreat in those decades, when temperatures were high in the Arctic.
SOURCE
Warmists (sort of) eat humble pie
Below is a paper from 2013 that has lost none of its relevance today. The authors say that their models predicted twice as much warming as has actually occurred, and they admit the C21 "hiatus". And they can only guess why it all went so wrong.
Recent observed global warming is significantly less than that simulated by climate models. This difference might be explained by some combination of errors in external forcing, model response and internal climate variability.
Global mean surface temperature over the past 20 years (1993–2012) rose at a rate of 0.14 ± 0.06 °C per decade (95% confidence interval). This rate of warming is significantly slower than that simulated by the climate models participating in Phase 5 of the Coupled Model Intercomparison Project (CMIP5). To illustrate this, we considered trends in global mean surface temperature computed from 117 simulations of the climate by 37 CMIP5 models (see Supplementary Information).
These models generally simulate natural variability — including that associated with the El Niño–Southern Oscillation and explosive volcanic eruptions — as well as estimate the combined response of climate to changes in greenhouse gas concentrations, aerosol abundance (of sulphate, black carbon and organic carbon, for example), ozone concentrations (tropospheric and stratospheric), land use (for example, deforestation) and solar variability. By averaging simulated temperatures only at locations where corresponding observations exist, we find an average simulated rise in global mean surface temperature of 0.30 ± 0.02 °C per decade (using 95% confidence intervals on the model average).
The observed rate of warming given above is less than half of this simulated rate, and only a few simulations provide warming trends within the range of observational uncertainty (Fig. 1a).
The inconsistency between observed and simulated global warming is even more striking for temperature trends computed over the past fifteen years (1998–2012). For this period, the observed trend of 0.05 ± 0.08 °C per decade is more than four times smaller than the average simulated trend of 0.21 ± 0.03 °C per decade (Fig. 1b). It is worth noting that the observed trend over this period — not significantly different from zero — suggests a temporary ‘hiatus’ in global warming. The divergence between observed and CMIP5-simulated global warming begins in the early 1990s, as can be seen when comparing observed and simulated running trends from 1970–2012 (Fig. 2a and 2b for 20-year and 15-year running trends, respectively). The evidence therefore indicates that the current generation of climate models (when run as a group, with the CMIP5 prescribed forcings) do not reproduce the observed global warming over the last 20 years, or the slowdown in global warming over the past fifteen years. This interpretation is supported by statistical tests of the null hypothesis that the observed and model mean trends are equal, assuming that either: (1) the models are exchangeable with each other (that is, the ‘truth plus error’ view); or (2) the models are exchangeable with each other and with the observations (see Supplementary Information).
Differences between observed and simulated 20-year trends have p values (Supplementary Information) that drop to close to zero by 1993–2012 under assumption (1) and to 0.04 under assumption (2) (Fig. 2c). Here we note that the smaller the p value is, the stronger the evidence against the null hypothesis. On this basis, the rarity of the 1993–2012 trend difference under assumption (1) is obvious. Under assumption (2), this implies that such an inconsistency is only expected to occur by chance once in 500 years, if 20-year periods are considered statistically independent. Similar results apply to trends for 1998–2012 (Fig. 2d). In conclusion, we reject the null hypothesis that the observed and model mean trends are equal at the 10% level.
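For readers who want to see the mechanics behind those numbers, here is a minimal sketch (Python, with a synthetic series standing in for the observations; none of this reproduces the paper's actual datasets). It shows how a decadal trend and its 95% confidence interval are computed by ordinary least squares, and how a p value of 0.04 yields the "once in 500 years" figure when 20-year periods are treated as independent.

```python
import numpy as np

# Synthetic stand-in for 20 years (1993-2012) of annual global mean
# temperature anomalies; the real analysis uses observational datasets.
rng = np.random.default_rng(0)
years = np.arange(1993, 2013)
temps = 0.014 * (years - 1993) + rng.normal(0.0, 0.08, years.size)

# Ordinary least squares trend and its 95% confidence interval.
slope, intercept = np.polyfit(years, temps, 1)
resid = temps - (slope * years + intercept)
se = np.sqrt(resid.var(ddof=2) / ((years - years.mean()) ** 2).sum())
print(f"trend = {10 * slope:.2f} +/- {10 * 1.96 * se:.2f} C per decade")

# The 'once in 500 years' arithmetic: p = 0.04 means one in 25 independent
# 20-year periods would show a trend difference this large by chance.
p = 0.04
print(f"expected recurrence: once every {20 / p:.0f} years")
```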
SOURCE
H/T Paul Homewood
Warmest EVER?
We are bombarded with claims that some month or year (e.g., 2016) is the “warmest ever.” But what does that mean? We are living in a relatively cool era. Temperatures today are lower than they have been something like 90% of the time since the last Ice Age ended 12,000 or so years ago. In fact, “ever” means since approximately the 1880s, when thermometer records became widespread. As it happens, that was also around the time when the Little Ice Age ended, so, happily, the Earth is a bit warmer now than it was then.
One of the many problems with global warming hysteria is that it is based on the surface temperature record since the 1880s, which is deeply flawed when it is not outright falsified by alarmists who control the historical records. This happens often, as we and others have documented. This week’s The Week That Was from the Science and Environmental Policy Project explains some (but by no means all) additional problems with the surface temperature record:
Unfortunately, the IPCC, and others, use surface temperatures to evaluate the global climate models. The failure of the models to track the surface temperatures is not surprising. Historic data is very sparse, largely from western Europe and the US. The data is contaminated by significant changes in land use, particularly urbanization. And, as shown in the 2008 NIPCC report, since about 1970, there has been a marked decline in the stations used to establish surface temperatures, and a dramatic decline in the number of 5 degree by 5 degree grid boxes covered. Around the year 2000 about 100 of the total of 2,592 possible grid boxes ([180/5] x [360/5]) were covered – 4%. Complicating matters has been the trend, at least in the US, of using stations at airports. Both pavement and flying frequency create measurement problems.
When the Charney report was produced in 1979, there were no comprehensive, global temperature data. But starting in 1989, going back to December 1978, we have had comprehensive global satellite data of the atmosphere. As shown in the report by John Christy, the comprehensive satellite data show that, generally, the global climate models greatly overestimate warming of the atmosphere, where the greenhouse effect occurs. Both satellite and surface data are influenced by weather events such as the El Niño Southern Oscillation (ENSO). But, since the satellite data is “cleaner” it should be easier to separate natural and other human effects from CO2 caused warming.
If the purpose of the models is to estimate the effect of CO2, then surface data are poor proxy data at best. Atmospheric data is far superior. The kindest possible justification for the IPCC, and others, not to use satellite data is mental inertia.
Actually, the explanation is political. The IPCC was explicitly established by the U.N. for one purpose only, to “study” the impact of human-emitted CO2 on global temperatures. This was for the purpose of justifying government control over industry worldwide. Anyone who is interested in science rather than left-wing politics relies on the satellite data, which are transparent and have not been “adjusted” by political activists.
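Incidentally, the grid-box arithmetic in the quoted passage above is easy to verify; a two-line sketch:

```python
# Verify the grid-box coverage figures quoted above.
boxes = (180 // 5) * (360 // 5)   # 36 latitude bands x 72 longitude bands
covered = 100                     # boxes with data around the year 2000, per the text
print(f"{boxes} boxes, coverage = {100 * covered / boxes:.1f}%")  # 2592 boxes, ~3.9%
```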
SOURCE
EPA spills again in Colorado
The Environmental Protection Agency is admitting to a spill from a treatment plant it set up after it dumped 3 million gallons of toxic wastewater into a Colorado river last year.
The EPA said Thursday night that the spill happened on Tuesday, and officials are still attempting to determine how much and what metals were contained in the sludgy discharge, according to the Associated Press.
The spill occurred near the site of last year's spill at the abandoned Gold King Mine in Silverton, Colo., where agency contractors didn't adequately check the mine's pressure before attempting to open it up after several years of being idle. The result was a massive mine blowout that sent 3 million gallons of metal-tainted water into the waterways of three states.
The Navajo Nation sued the agency over the spill last week after the EPA inspector general and the Justice Department opened a criminal investigation into the incident a few days before the Aug. 5 anniversary of the 2015 spill. The Navajo argue in their lawsuit that the spill significantly harmed the tribe's primary source of revenue from crops and other agricultural products.
Local officials said this week's release was not large enough to warrant a public advisory.
Last year's spill sent nearly 1 million pounds of metals into the waterways of the Animas and San Juan rivers, which traverse three states. The metals include arsenic, cadmium, copper, lead, mercury, nickel and zinc.
This week's spill came from the treatment plant that the EPA set up near the mine to filter water coming from the mine before releasing it into the creek and river systems. A large amount of rain in Colorado caused the treatment facility to overflow and some of the untreated water to spill into the waterways.
EPA said the water that spilled from the plant was partially treated, and the metals present in it should quickly settle to the bottom of waterways where they are less harmful.
SOURCE
Australian Report Predicts Global Coffee Shortage Will Get Worse
It's hard to know where to start in dismissing this nonsense. All that global warming would do for ANY crop is to shift polewards the areas where it was grown. There is no conceivable reason for an OVERALL shortage. There are always new areas opening up for coffee growing anyway.
Secondly, the current problem is described as drought. Yet a warming world would mean a wetter world, so warming could in fact SOLVE problems of coffee growing!
Thirdly, if they understood any economics they would know that any lasting reduction in supply would cause price increases, and sustained price increases would then draw out more supply. Australia’s empty North, for instance, could undoubtedly be opened up to coffee growing in some parts. There is already a small operation on the Atherton Tableland. They even grow Arabica there.
A new report from Australia's Climate Institute predicts that by 2050, global warming will make at least half of the land currently used for coffee production unable to produce quality beans.
By 2080, it cautions, hot temperatures could make wild coffee plants completely extinct. Although this report is projecting what will happen to supplies in decades to come, the coffee shortage isn't really off in the distant future.
Supply has already started to fall. Brazil -- the source for over a third of the world’s coffee -- has seen its coffee stores dip dramatically in the last two years as the result of a long drought. So far, unusually large harvests in other world coffee markets have helped to make up most of the difference.
But we can hardly expect these big harvests to continue. In fact, their trend may actually reverse.
Much of Brazil's latest shortfall was made up for by a record-breaking coffee harvest in Honduras -- which is a coffee-growing area that this new report says will probably be hit particularly hard in the coming decades.
Even the relatively small shortfall from Brazil in the last couple of years resulted in a price surge and a jump in counterfeit coffee beans (which pretend to be fancier coffee varieties than they are).
With the spread of the shortage, we can only expect rising coffee prices and counterfeiting to show up as even more of a problem in our daily cups.
SOURCE
***************************************
For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are here or here or here. Email me (John Ray) here.
Preserving the graphics: Most graphics on this site are hotlinked from elsewhere. But hotlinked graphics sometimes have only a short life -- as little as a week in some cases. After that they no longer come up. From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site. See here or here
*****************************************
Tuesday, August 30, 2016
Substituting prophecy for facts
It must be hard being a Warmist at times. The article below admits that the Antarctic is not shrinking and notes that all the models say that it should. The scientific response to those facts would be to reject the models. But you can't do that, of course. So they simply do some more model runs with models that are already known to be wrong and predict that warming in the Antarctic will happen "real soon now".
So how do they account for what is not happening in the Antarctic so far? They say that what is happening there is all a product of large "natural variability". Maybe so, but in that case could the slight global warming during C20 also be a product of natural variability? If not, why not? They offer no test of when natural variability is at work other than whether it suits their preconceptions. So we have yet another example of how Warmism destroys science.
Anthropogenic impact on Antarctic surface mass balance, currently masked by natural variability, to emerge by mid-century
Michael Previdi and Lorenzo M Polvani
Abstract
Global and regional climate models robustly simulate increases in Antarctic surface mass balance (SMB) during the twentieth and twenty-first centuries in response to anthropogenic global warming. Despite these robust model projections, however, observations indicate that there has been no significant change in Antarctic SMB in recent decades. We show that this apparent discrepancy between models and observations can be explained by the fact that the anthropogenic climate change signal during the second half of the twentieth century is small compared to the noise associated with natural climate variability. Using an ensemble of 35 global coupled climate models to separate signal and noise, we find that the forced SMB increase due to global warming in recent decades is unlikely to be detectable as a result of large natural SMB variability. However, our analysis reveals that the anthropogenic impact on Antarctic SMB is very likely to emerge from natural variability by the middle of the current century, thus mitigating future increases in global sea level.
Environmental Research Letters, Volume 11, Number 9
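The "emergence" logic in that abstract (a forced signal counted as detectable once it climbs out of the noise of natural variability) can be illustrated with a toy calculation. The sketch below is hypothetical Python with invented numbers, not the authors' method.

```python
import numpy as np

# Toy time-of-emergence calculation: the forced signal "emerges" once it
# exceeds two standard deviations of natural variability. All numbers invented.
forced_trend = 0.5   # signal growth per decade, arbitrary units
natural_sigma = 4.0  # standard deviation of natural variability, same units

decades = np.arange(0, 21)   # decades elapsed from some start year
signal = forced_trend * decades
emerged = signal > 2 * natural_sigma
print("signal emerges after", decades[emerged][0], "decades")  # 17 here
```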
So cosy at Youngstown State U
No hint below that the conservative side of politics mostly thinks global warming is a load of hooey. No mention of debate, dissent or alternative viewpoints. More like a religious seminary than an institution of learning. Youngstown State University is an urban university located in Youngstown, Ohio. As of fall 2010, there were 15,194 students.
YSU lecture series to focus on global warming
The YSU Lecture Series on Energy and the Environment kicks off its third year Sept. 7 featuring lectures focusing on global warming.
“With all the extreme weather events we’ve been experiencing and with 2016 set to pass 2015 as the hottest year on record, these speakers are of the utmost relevance,” said Ray Beiersdorfer, Distinguished Professor of Geology and the founder/organizer of the lecture series.
The series goes international this year with a Skype talk from Denmark by Søren Hermansen, director of Samsoe Energy Academy and the head of the Samsoe renewable energy island project.
Also featured is the retired Chief Oceanographer of the Navy, Rear Admiral Jon White, who will be speaking about Ocean & Climate vs. National & Global Security. The lecture series also will include the local premiere of a documentary about the community rights movement, “We the People 2.0,” which had its world premiere in June at the Seattle International Film Festival.
The speaker series is sponsored in part by The James Dale Ethics Center and NextGen Climate Action.
All lectures are 7 p.m. Wednesdays in Room B100 of Cushwa Hall and run through Nov. 16.
SOURCE
Turning reality on its head
Any farmer knows that warm, moist weather improves crop growth, and any greenhouse owner can tell you how much lots of CO2 improves his crops, but the crooks below are trying to tell you the opposite of that. So how do they get from the reality that global warming would be a boon for food crops to an assertion that global warming would be a disaster for food crops? Easy! By combining models with simulations! Reality not included. How desperate the Warmists are!
A new study from the Potsdam Institute for Climate Impact Research (PIK) has revealed that global warming could create substantial economic damage in agriculture.
“Agriculture is very sensitive to climate change—even a small increase of global mean temperatures can have significant effects on regional crop yields, affecting both the profitability of agricultural production and the share of income spent on food,” lead PIK author Miodrag Stevanović said.
“Our study quantifies economic impacts and analyses the role of international trade as an adaptation measure. We find that economic losses in agriculture could add up to the annual amount of roughly 0.8 percent of global GDP at the end of the century with a very restricted trade regime,” Stevanović explained.
He said that as small as this percentage sounds, it actually translates to losses of $2.5 trillion, and is comparatively higher for regions with limited agricultural resources and growing agricultural demand, for example the Middle East, Africa and India.
“In contrast, further trade liberalization in agricultural commodities could reduce financial damage globally by 65 percent, to 0.3 percent of global GDP,” he added.
Alexander Popp, PIK coauthor, further explained that: “Both global warming and free trade favor northern regions, like Europe and the United States, since producers’ gains increase as trade patterns shift northward. At the same time, southern regions, like Africa or India, could theoretically reduce climate-change-related damages by half, through more liberalized food markets.”
Arguing, Popp said: “Irrespective of our assumptions on global trade, climate change will result in reduced crop yields in many areas. At the same time, intensifying production or expanding cultivated land into previously untouched areas may come at a risk: It could lead to additional greenhouse-gas emissions through tropical deforestation or increased fertilizer use.”
According to him, this could then further enhance climate-change pressure on agriculture.
Researchers at PIK laboratories combined 19 different climate projections with simulations of crop growth to assess economic impacts of climate change in the agricultural sector.
Researchers said the magnitude of damage varies with different assumptions about crop-productivity responses to climate change, carbon dioxide fertilization of plants, and socioeconomic projections.
The new study highlights the important role of trade as a key measure to partly reduce climate-change impacts.
Hermann Lotze-Campen, PIK’s chairman of research domain Climate Impacts and Vulnerabilities, said: “The best way to avoid these risks is to limit climate change. However, for impacts that cannot be avoided, an open and diversified trade system can be an important adaptation option.
“It can account for changes in global patterns of agricultural productivity and, thus, allow for reducing production costs and enhancing food security, as climate change will have an amplifying effect on the gap between developed and developing countries, and reductions in trade barriers will have to be accompanied by measures for poverty reduction and social safety nets.”
If food prices increase due to climate-change impacts, households will not only have to spend more on their food consumption, but could also face risks of insufficient access to food and malnutrition, the new study said.
SOURCE
The strangest libertarian yet
He hates guns and now wants a carbon tax. He calls it a "fee" but how a fee differs from a tax is not at all clear. Sounds like the Libertarian Party has been hijacked by a liberal.
Libertarian Party presidential nominee and former New Mexico Gov. Gary Johnson said he’s no skeptic of man-made global warming and endorsed a “fee” on carbon dioxide emissions.
It’s all part of his “free market” approach to global warming, Johnson told the Juneau Empire in an article published Sunday.
“I do believe that climate change is occurring,” Johnson said. “I do believe that it is man-caused” and “that there can be and is a free-market approach to climate change.”
Johnson’s “free market” approach to global warming includes “a fee — not a tax, he said — placed on carbon” to make those who emit the greenhouse gas pay the supposed cost of their actions, according to the Juneau Empire.
“We as human beings want to see carbon emissions reduced significantly,” he said, adding that the U.S. only emits “16 percent of the (global) load” of CO2.
Johnson said: “I don’t want to do anything that harms jobs.”
It’s not exactly clear how a “fee” on CO2 would differ from a “tax,” but Johnson’s announcement was picked up by environmentalists.
Johnson’s carbon “fee” was touted by RepublicEN, a group of conservatives who endorse a carbon tax. RepublicEN has joined with environmentalists to promote a carbon tax as the best way to tackle global warming.
But they’re basically alone on the right, as most conservative groups see a carbon tax as a fool’s errand, and the Republican Party explicitly rejected a carbon tax in its 2016 platform.
“We oppose any carbon tax,” reads the 2016 platform. “It would increase energy prices across the board, hitting hardest at the families who are already struggling to pay their bills in the Democrats’ no-growth economy.”
Republican presidential nominee Donald Trump told campaigners at the American Energy Alliance (AEA) in March he opposed a carbon tax.
“The Obama administration committed an overreach that punishes rather than helps Americans,” Trump answered in AEA’s survey. “Under my administration, all EPA rules will be reviewed. Any regulation that imposes undue costs on business enterprises will be eliminated.”
Republicans have been increasingly concerned about attempts to get a carbon tax through Congress. GOP lawmakers often argue taxing CO2 would amount to an energy tax that would raise the price of everything, hurting the poor.
Rhode Island Democratic Sen. Sheldon Whitehouse introduced a carbon tax bill last year to raise $2 trillion over 10 years and reduce CO2 emissions 40 percent. Whitehouse has also called on the Department of Justice to prosecute those who disagree with him on global warming.
SOURCE
Hurricane damages could grow faster than the U.S. economy
Just modelling -- and the models have never been right yet
When it comes to hurricanes in the U.S., large-scale trends are not in our favor. In fact, unless action is taken to curb both global warming and coastal development, the American economy may be set to take a perilous bashing from stronger storms, rising seas and too much high-value, high-risk property lying in harm’s way.
By the end of the century, a hurricane that strikes the eastern United States could cause up to three times more economic damage than a hurricane that strikes today, climate researchers warn in a new study.
If the world doesn’t drastically reduce its greenhouse gas emissions and Americans don’t move to safer ground, the U.S. could suffer an eight-fold jump in average annual financial losses from hurricanes by 2100, the study found.
In the study, published Tuesday in the journal Environmental Research Letters, the scientists from Germany's Potsdam Institute for Climate Impact Research show that future hurricane-related losses for American families, companies and communities could grow faster than the overall U.S. economy — meaning the country won’t be able to counteract the damages from extreme weather events by creating more jobs and wealth.
“We find that hurricane losses have risen and will rise faster than the economy,” Tobias Geiger, the paper’s lead author and a climate scientist at the Potsdam Institute, told Mashable.
“The impacts of climate change cannot be simply economically outgrown," he said.
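A toy compounding calculation shows what "cannot be economically outgrown" means arithmetically: if losses compound even slightly faster than GDP, their share of output rises without limit. The growth rates below are invented for illustration, not taken from the study.

```python
# If hurricane losses compound faster than GDP, the share of GDP lost rises
# year after year. Both growth rates are invented for illustration.
gdp_growth = 0.020   # 2.0% per year, assumed
loss_growth = 0.045  # 4.5% per year, assumed

years = 2100 - 2016
ratio = ((1 + loss_growth) / (1 + gdp_growth)) ** years
print(f"By 2100 the loss share of GDP has grown {ratio:.1f}x")  # ~7.6x here
```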
In the U.S. alone, hurricanes caused $400 billion in estimated losses between 1980 and 2014, accounting for more than half of all weather-related economic losses, the German reinsurance giant Munich Re estimated last year.
Damages from other extreme weather events — including floods, wildfires, tornadoes and droughts — are also on the rise due to both human-caused global warming and unchecked development into floodplains, fire-prone forests and waterfronts.
SOURCE
How to Milk a Bull! Bad bee science and activist capture at the FT
Here we go again! A second-rate correlation study gets published;
* Because it has the words “bees” and “pesticides” in the headline, the newly privatised research institute’s PR machine generates good media pick-up;
* Mainstream media organisations interview the researchers (who seem to find the right apocalyptic vocabulary) but have not read the study;
* The NGO activists, who might have read some of the news articles, go into campaign overdrive: buzz, buzz, buzz, spin, spin, spin;
* An activist from Friends of the Earth plants an editorial in the Financial Times;
* Green MEPs print up a banner announcing a bee-pocalypse and pose for a picture in Brussels.
Hasn’t anyone in the media learnt the skew of this cynical game yet? Have journalists abandoned responsibility to side with the tawdry tirades of the activists? Has everyone given up the “investigate, research and then report” practices that I used to teach journalism students (back when there used to be journalism students)? Are journalists now merely opting to take part in campaign activism? I am afraid that in a world where a media article’s success is measured by social-media viral lift, there is little incentive for journalistic responsibility.
And the activists know this! They march their bull onto their stage and attempt to milk it. In the drama of the thumping about, no one even notices the bloody thing’s got horns. Conclusion: It can’t produce milk anymore and someone is to blame. Print, publish, promote!
Bee-awful: Correlation studies are not scientific
If I go into a poor neighbourhood wanting to confirm my prejudice that immigrants push drugs, I’ll find immigrants and I’ll find drugs. Such a correlation would make me a racist … or, according to the media, a scientist with firm evidence! There is no question of other factors, societal variables or complicating circumstances – the correlation gives what the researcher wants: a simplistic “Yes!”, headlines and a guaranteed publication.
A correlation study starts with a person having a bias, and then defining data parameters to confirm it, moving the bias to a conclusion. It should be used by scientists as a first step in determining whether a study should be conducted (eg, “I’m noticing that a lot of long-term smokers are developing certain cancers; maybe we should look into this”); but now, with the advent of activist science and the media attention garnered by white-coat celebrities like Stephanie Seneff and Dave Goulson, correlation studies have become sufficient to draw apocalyptic conclusions (and glorious reputation-enhancing headlines).
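To make the pitfall concrete, here is a small synthetic demonstration (Python, entirely invented numbers): two variables that merely share a hidden driver correlate strongly even though neither causes the other.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# A hidden common driver (think: weather, habitat loss, land-use change ...).
confounder = rng.normal(size=n)

# Two outcomes that both respond to the driver but not to each other.
x = 0.9 * confounder + rng.normal(scale=0.4, size=n)   # e.g. "pesticide use"
y = -0.9 * confounder + rng.normal(scale=0.4, size=n)  # e.g. "bee counts"

r = np.corrcoef(x, y)[0, 1]
print(f"correlation = {r:.2f}")  # strongly negative, with zero causation
```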
A recent correlation study by researchers mostly from the UK Centre for Ecology and Hydrology (CEH) has drawn such headlines. The CEH started with a prejudice that wild bees are dying in the UK (nobody really knows how many species of wild bees there are in the UK, so apparently you can make that claim), and they attempted to correlate it to the increase in UK production of oilseed rape (and the parallel use of neonicotinoids).
This correlation study had a long list of inherent weaknesses, namely:
The claim of a wild bee decline was based on data collected by a loose organisation of volunteer enthusiasts who would spot bees and record them during their walks. There was no systematic effort to gather data for a defined study, and no new research was conducted. It was simply random data (although the authors of the study would prefer the euphemism: “not structured”) pushed through some statistical analysis “tool”.
Wild bee data is quite thin and not robust enough to support any legitimate conclusions (the number of bees and even the number of species in the UK are simply not known). Wild bees became the subject of save-the-bee activist concern after campaigners stopped claiming that honeybees were affected by neonicotinoids, because, well, the data seemed to prove that they weren’t and some activist scientists couldn’t keep ignoring the facts.
The study did not consider the shifts in agricultural practices over the 18-year survey period that might have had a detrimental effect on biodiversity levels. How farmers rotated their crops was also not brought into the parameters.
The researchers did not consider the variation from different types of neonicotinoids, the exposure levels of each, the type of dosage … they simply grouped a class of different chemicals as: “neonicotinoids” and did not get deeper into the science.
In fact, the researchers did not actually consider neonicotinoids, how or when they were applied. They looked at oilseed rape production, assumed it was treated, and then correlated it to the random data from their amateur bee counting enthusiasts. If the farmers had treated OSR fields with the older, less efficient pyrethroid insecticides, many more bees surely would have perished! This was not taken into consideration.
The researchers seemed, frankly, immature. At one point, the publication went off on a tangent and postulated that similar conclusions could be made for the influence on bees from treated sunflowers, even though they had not studied that and had no data. The key researcher, Dr Nick Isaac, acknowledged in a blog how he had designed the methodology to show how much of the wild bee decline was due to exposure to neonicotinoids. I believe the correct term is “if”. Building bias into the research methodology at the outset is, simply put, activist science.
A wide range of variables were simply ignored: weather, parasites, viruses, nutrition, other predators, regional urban development … the researchers included nothing that would complicate a clear correlation to prove their hypothesis.
There were no other hypotheses made to open further scientific debate or add to the body of knowledge. For example, organic farmers have introduced nematodes into their integrated pest management programme to protect against the increase of certain insects, without realising that these nematodes have a taste for wild bee nests and have been hoovering up wintering bumble bees at an alarming rate.
They did not recognise in the paper the limits of a correlation study. What they presented was framed as clear, factual conclusions on the state of wild bee health due to neonicotinoids (even putting it down to causal percentages). I could easily perform a correlation study that shows that honeybee populations have increased globally since the introduction of neonicotinoids, or that these bees are thriving in OSR areas not affected by Varroa mites, but I don’t see the point of using a flawed approach for some vain need to prove I’m right!
Scientists should consider other elements and not be driven to drawing prejudiced conclusions on limited and compromised data.
In short, the conclusion drawn from the CEH correlation study is at best: mediocre; at worst: shameful. Let me go out on a limb and say this is the worst study on bee health and pesticides that I have ever read: based on random data, bias built into the methodology, no research into the chemical class it aims to condemn, total ignorance of all variables that most scientists acknowledge to be the source of bee decline and no responsibility for the consequences of its contrived conclusions (which were aimed to draw media attention).
What happened next was, well, regrettable. The scientists at the CEH started giving interviews and writing blogs where they showed their juvenile innocence. Rather than saying that their correlation findings were interesting and needed further study before any responsible conclusions could be drawn, they started making clear conclusions that indicted neonicotinoids. Perhaps the journalists set them up; perhaps the media jumped to conclusions and twisted their statements out of context; perhaps anti-pesticide bias is so built into our agricultural narrative that nobody noticed the claims had no factual bearing.
Call in the opportunists! Did they consider the study’s weaknesses?
The media, the NGO activists and the Green politicians have decided to ride this bull for a wide range of opportunities:
it fills a slow August news cycle;
the CEH and its young researchers have yet to be tainted as an activist science organisation;
its scientists apparently haven’t received media training yet;
the EU will be reassessing the data from their neonic ban in January (expect a slew of activist-funded scare studies published in November and December);
the Brexit result implies the risk that the UK government might actually think about the plight of farmers and regulate rationally.
Bee-mused: How did Friends of the Earth capture the Financial Times?
While this CEH publication was pressed on most mainstream media, the Financial Times spin in the editorial “In defence of bees: a pesticide ban is justified” (FT View, 16 August 2016) was more than simply alarming. It was the kind of captured activism we would normally expect from campaign-driven outlets like Le Monde or the Guardian, but not the FT!
The author, presumably an FT editor since the piece is uncredited, begins the defence of bees by praising a small, random bee-counting stunt by Friends of the Earth that identified 370,000 bees in the UK. How is that an impressive or accurate count worth opening the article with – about one bee for every citizen of the British town of Bradford?
The article then introduces the CEH study:
“Now new research concludes that neonicotinoids — a group of pesticides chemically similar to the nicotine in tobacco — might have cut the presence of wild bees in the British countryside by up to 30 per cent since they became widely used on oilseed rape crops.”
So the FT says neonics may have wiped out up to 30 per cent of wild bees. Hmmm … citation please?
The CEH study does not actually say neonics are responsible for 30% of the wild bee deaths, and this headline shock factor is not even backed up with a link to the study or reference to it by name. So I guess the FT could say anything they want then!
The editorial then assumes the campaign attack position, declaring how much the global food supply depends on bees, that farmers don’t understand that neonicotinoids don’t work (no reference) and that the big industry players are lobbying hard to silence the critics. My other shoe dropped when I read:
“Policymakers should certainly treat the arguments put forward by pesticide manufacturers — whose tactics are compared by NGOs to those used in the past by tobacco companies — with scepticism.”
The source the FT referenced to justify likening pesticide manufacturers to Big Tobacco was none other than that same friendly bunch that counted all of those bees: Friends of the Earth (who published a rabid report in 2014 of loose innuendo on industry lobbying that was largely ignored and quickly forgotten … except, obviously, by the editor of the Financial Times!!!). At this point, does anyone not get that this article was ghost-written by Friends of the Earth?
The article concludes that the EU ban against neonicotinoids not only must be renewed in 2017, but made stronger, because, after all: “There are other ways to manage pests that deserve attention.” And that is the FT’s ‘defence of bees’!
This has to be, in my living memory, the worst example of activist capture of a major news organisation, and of all places, at the Financial Times. How did Friends of the Earth worm their way onto the FT’s editorial desk? Was the editor on holiday after inadvertently leaving the access codes on the Tube? Shouldn’t the author of this article, if he or she had followed FT standards, at least have read the CEH report (or perhaps bothered to do a Google search to provide a direct link)?
There was no critical analysis along the lines of the basic points drawn above. Instead, the FT editor chose to use the publication to make policy demands that are simplistic, harmful for agriculture and likely more detrimental to bee health once the alternatives are considered.
While the Friends of the Earth lobbyists must be proud of their ability to plant this story in the FT, I suspect that its readers must be questioning such poor journalism and activist bias. This is some seriously sour bull-milk!
Milking that bull will only get you a heap of bullshit
When will we learn that the more we yank on that bull’s “teat”, the more noise we will make, perhaps with a show of muscle … but after all that work, we won’t get any milk. The save-the-bee activists have this bull by the balls and they are pumping away.
For those who see that this heifer is a hoofer, it is quite amusing to watch them trying so hard. The rest, who have been deceived, including some readers of the FT, keep expecting milk and are getting impatient.
In the end, all we get from them is bullshit!
Apologies: Some may have got the crude pun in the title – note that this is what I feel activists are actually doing with their silly bee campaign.
Also I know that many farmers read my blog so let me assure them that I am fully aware that some dairy cows do have horns and some bulls do not – I was succumbing to artistic license!
SOURCE
*****************************************
Monday, August 29, 2016
All Natural… Four New Scientific Publications Show No Detectable Sea Level Rise Effect of CO2
It is widely assumed that sea levels have been rising in recent decades largely in response to anthropogenic global warming. However, due to the inherently large contribution of natural oscillatory influences on sea level fluctuations, this assumption lacks substantiation. Instead, natural factors or internal variability override the detection of an anthropogenic signal and may instead largely explain the patterns in sea level rise in large regions of the global oceans.
Scientists who have recently attempted to detect an anthropogenic signal in regional sea level rise trends have had to admit that there is “no observable sea-level effect of anthropogenic global warming,” or that the “sea level rise pattern does not correspond to externally forced anthropogenic sea level signal,” and that sea level “trends are still within the range of long-term internal decadal variability.”
Below are highlighted summaries from 4 peer-reviewed scientific papers published within the last few months.
1. Hansen et al., 2016
For the convenience of the readers, our basic results are shown in Figure 1. We identified five individual oscillations (upper panel), including a sea-level amplitude of 70 mm (top–bottom [t-b]) of the 18.6-year oscillation caused by the lunar nodal oscillation (LNO) … Together with a general sea-level rise of 1.18 mm/y, the sum of these five sea-level oscillations constitutes a reconstructed or theoretical sea-level curve of the eastern North Sea to the central Baltic Sea (Figure 1, lower panel), which correlates very well with the observed sea-level changes of the 160-year period (1849–2009), from which 26 long tide gauge time series are available from the eastern North Sea to the central Baltic Sea. Such identification of oscillators and general trends over 160 years would be of great importance for distinguishing long-term, natural developments from possible, more recent anthropogenic sea-level changes. However, we found that a possible candidate for such anthropogenic development, i.e. the large sea-level rise after 1970, is completely contained by the found small residuals, long-term oscillators, and general trend. Thus, we found that there is (yet) no observable sea-level effect of anthropogenic global warming in the world’s best recorded region.
2. Palanisamy, 2016
Building up on the relationship between thermocline and sea level in the tropical region, we show that most of the observed sea level spatial trend pattern in the tropical Pacific can be explained by the wind driven vertical thermocline movement. By performing detection and attribution study on sea level spatial trend patterns in the tropical Pacific and attempting to eliminate signal corresponding to the main internal climate mode, we further show that the remaining residual sea level trend pattern does not correspond to externally forced anthropogenic sea level signal. In addition, we also suggest that satellite altimetry measurement may not still be accurate enough to detect the anthropogenic signal in the 20-year tropical Pacific sea level trends.
3. Hadi Bordbar et al., 2016
The tropical Pacific has featured some remarkable trends during the recent decades such as an unprecedented strengthening of the Trade Winds, a strong cooling of sea surface temperatures (SST) in the eastern and central part, thereby slowing global warming and strengthening the zonal SST gradient, and highly asymmetric sea level trends with an accelerated rise relative to the global average in the western and a drop in the eastern part. These trends have been linked to an anomalously strong Pacific Walker Circulation, the major zonal atmospheric overturning cell in the tropical Pacific sector, but the origin of the strengthening is controversial. Here we address the question as to whether the recent decadal trends in the tropical Pacific atmosphere-ocean system are within the range of internal variability, as simulated in long unforced integrations of global climate models. We show that the recent trends are still within the range of long-term internal decadal variability.
4. Dangendorf et al., 2016
The observed 20th century sea level rise represents one of the major consequences of anthropogenic climate change. However, superimposed on any anthropogenic trend there are also considerable decadal to centennial signals linked to intrinsic natural variability in the climate system. … Gravitational effects and ocean dynamics further lead to regionally varying imprints of low frequency variability. In the Arctic, for instance, the causal uncertainties are even up to 8 times larger than previously thought. This result is consistent with recent findings that beside the anthropogenic signature, a non-negligible fraction of the observed 20th century sea level rise still represents a response to pre-industrial natural climate variations such as the Little Ice Age.
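The decomposition described in the first paper above (a linear trend plus a handful of fixed-period oscillations) is simple to emulate. The sketch below uses only the two parameters Hansen et al. actually quote; the other four oscillations and the phase are omitted or assumed, so this shows the general form of such a reconstruction rather than the paper's curve.

```python
import numpy as np

years = np.arange(1849, 2010)  # the 160-year record discussed above

# Parameters quoted in the Hansen et al. abstract; amplitude is half the
# 70 mm top-to-bottom range. Phase is unknown and assumed zero here.
trend = 1.18          # mm per year, general sea-level rise
lno_period = 18.6     # years, lunar nodal oscillation
lno_amplitude = 35.0  # mm
phase = 0.0

# Reconstructed curve: linear trend plus the LNO term. The paper sums five
# such oscillations; the other four are omitted for lack of quoted parameters.
t = years - years[0]
sea_level = trend * t + lno_amplitude * np.sin(2 * np.pi * t / lno_period + phase)
print(f"net rise over the record: {sea_level[-1] - sea_level[0]:.0f} mm")
```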
SOURCE
Don’t bee-lieve the latest bee-pocalypse scare
Now wild bee junk science and scare stories drive demands for anti-pesticide regulations
Paul Driessen
As stubborn facts ruin their narrative that neonicotinoid pesticides are causing a honeybee-pocalypse, environmental pressure groups are shifting to new scares to justify their demands for “neonic” bans.
Honeybee populations and colony numbers in the United States, Canada, Europe, Australia and elsewhere are growing. It is also becoming increasingly clear that the actual cause of bee die-offs and “colony collapse disorders” is not neonics, but a toxic mix of predatory mites, stomach fungi, other microscopic pests, and assorted chemicals employed by beekeepers trying to control the beehive infestations.
Naturally, anti-pesticide activists have seized on a recent study purporting to show that wild bee deaths in Britain have been correlated with neonic use in oil seed rape fields (canola is a type of OSR). In a saga that has become all too common in the environmental arena, their claims were amplified by news media outlets that share many activist beliefs and biases – and want to sell more subscriptions and advertising.
(Honeybees represent a small number of species that humans have domesticated and keep in hives, to produce honey and pollinate crops. Many are repeatedly trucked long distances, to pollinate almond and other crops as they flower. By contrast, thousands of species of native or wild bees also flourish across the continents, pollinating plants with no human assistance.)
The recent Center for Ecology and Hydrology study examined wild bee population trends over an 18-year period that ended in 2011. It concluded that there was a strong correlation between population and distribution numbers for multiple species of British wild bees and what study authors called their “measure of neonic dose” resulting from the pesticide, which is used as a seed coating for canola crops.
The study is deeply flawed, at every stage – making its analysis and conclusions meaningless. For example, bee data were collected by amateur volunteers, few of whom were likely able to distinguish among some 250 species of UK wild bees. But if even one bee of any species was identified in a 1-by-1 kilometer area during at least two of the study period’s 18 years, the area was included in the CEH study.
This patchy, inconsistent approach means the database that formed the very foundation for the entire study was neither systematic nor reliable, nor scientific. Some species may have dwindled or disappeared in certain areas due to natural causes, or volunteers may simply have missed them. We can never know.
There is no evidence that the CEH authors ever actually measured neonic levels on bees or in pollen collected from OSR fields that the British wild bees could theoretically have visited. Equally relevant, by the time neonics on seeds are absorbed into growing plant tissue, and finally expressed on flecks of pollen, the levels are extremely low: 1.3–3.0 parts per billion, the equivalent of 1–3 seconds in 33 years.
(Coating seeds ensures that pesticides are incorporated directly into plant tissue – and target only harmful pests that feed on the crops. It reduces or eliminates the need to spray crops, which can kill birds, bats and beneficial insects that are in the fields or impacted by accidental “over-sprays.” Indeed, numerous field studies on two continents have found no adverse effects from neonics on honeybees at the hive level.)
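The seconds-in-33-years analogy two paragraphs above checks out arithmetically; a quick sketch:

```python
# Check the analogy: 1.3-3.0 parts per billion vs "1-3 seconds in 33 years".
seconds_in_33_years = 33 * 365.25 * 24 * 3600  # ~1.04e9 seconds
for s in (1, 3):
    print(f"{s} s in 33 years = {s / seconds_in_33_years * 1e9:.1f} ppb")
# 1 s -> ~1.0 ppb, 3 s -> ~2.9 ppb, in line with the quoted 1.3-3.0 ppb range
```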
A preliminary U.S. Environmental Protection Agency risk assessment for one common neonic sets the safe level for residues on pollen at 25 ppb. Any observable effects on honeybee colonies are unlikely below that. Perhaps wild bees are more susceptible. However, at least two wild bee species (alfalfa leaf cutters and miner bees) are thriving in areas where OSR/canola fields are widespread, and the CEH study found reduced numbers of certain wild bees that do not collect pollen from oil seed rape.
Perhaps most important, the CEH authors appear to have assumed that any declines in wild bee numbers were due to neonicotinoid pesticides in OSR fields, even at very low doses. They discounted or ignored other factors, such as bee diseases, weather and land use changes.
For instance, scientists now know that parasitic Varroa destructor mites and phorid flies severely affect honeybees; so do the Nosema ceranae gut fungus, tobacco ringspot virus and deformed wing virus. Under certain circumstances, those diseases are known to spread to bumblebees and other wild bees.
Significant land development and habitat losses occurred in many parts of Britain from 1930 to 1990, causing wild bee populations to decline dramatically. Thankfully, they have since rebounded – during the same period that neonic use was rising rapidly, replacing older insecticides that clearly are toxic to bees! The CEH team also failed to address those facts.
To compensate for these shortcomings (or perhaps to mask them), the CEH researchers created a sophisticated computer model that supposedly describes and explains the 18 years of wild bee data.
However, as any statistician or modeler knows, models and output are only as good as the assumptions behind them and data fed into them. Garbage in/Garbage out (GIGO) remains the fundamental rule. Greater sophistication simply means more refined refuse, and faster computers simply generate faulty, misleading results more rapidly. They also enable emotional fear-mongering to trump real science.
The CEH models are essentially “black boxes.” Key components of their analytical methodologies and algorithms have not been made public and thus cannot be verified by independent reviewers.
However, the flawed data gathering, unjustified assumptions about neonic impacts, and failure to consider the likely effects of multiple bee diseases and parasites make it clear that the CEH model and conclusions are essentially worthless – and should not be used to drive or justify pesticide policies and regulations.
As Prime Minister Jim Hacker quipped in the theatrical version of the British comedy series Yes, Prime Minister: “Computer models are no different from fashion models. They’re seductive, unreliable, easily corrupted, and they lead sensible people to make fools of themselves.”
And yet studies like this constantly make headlines. That’s hardly surprising. Anti-pesticide campaigners have enormous funding and marvelous PR instincts. Researchers know their influence and next grant can depend on issuing studies that garner alarmist headlines and reflect prevailing news themes and imminent government actions. The news media want to sell ads and papers, and help drive public policy-making.
The bottom line is fundamental: correlation does not equal causation. Traffic lights are present at many intersections where accidents occur; but that does not mean the lights caused most or all of the accidents. The CEH authors simply do not demonstrate that a neonic-wild bee cause-effect relationship exists.
The price to society includes not just the countless dollars invested in useless research, but tens of billions in costs inflicted by laws and regulations based on or justified by that research. Above all, it can lead to “cures” that are worse than the alleged diseases: in this case, neonic bans would cause major crop losses and force growers to resort to older pesticides that clearly are harmful to bees.
There is yet another reason why anti-pesticide forces are focusing now on wild bees. In sharp contrast to the situation with honeybees, where we have extensive data and centuries of beekeeper experience, we know very little about the thousands of wild bee species: where they live and forage, what risks they face, even how many there really are. That makes them a perfect poster child for anti-neonic activists.
They can present all kinds of apocalyptic scenarios, knowing that even far-fetched claims cannot easily be disproven, certainly not in time to quell the public unease stirred up while a regulatory proposal is under discussion.
The Center for Ecology and Hydrology study involved seriously defective data gathering and analytical methodologies. More troubling, it appears to have been released in a time and manner calculated to influence a European Union decision on whether to continue or rescind a ban on neonicotinoid pesticides.
Sloppy or junk science is bad enough in and of itself. To use it deliberately, to pressure lawmakers or regulators to issue cures that may be worse than alleged diseases, is an intolerable travesty.
Via email
An update on Germany's "Energiewende"
Germany is still pursuing its goal of shutting down its nuclear plants but refuses to shut down its lignite plants. It is slashing renewable energy subsidies and replacing them with an auction/quota system. Public opposition is delaying the construction of the power lines that are needed to distribute Germany’s renewables generation efficiently. Renewables investment has fallen to levels insufficient to build enough new capacity to meet Germany’s 2020 emissions reduction target. There is no evidence that renewables are having a detectable impact on Germany’s emissions, which have not decreased since 2009 despite a doubling of renewables penetration in the electricity sector. It now seems certain that Germany will miss its 2020 emissions reduction target, quite possibly by a wide margin. In short, the Energiewende is starting to unravel.
This post discusses the Energiewende’s main problems under five subheadings, starting with arguably the most problematic:
Germany’s emissions are not decreasing:
Electricity sector emissions decreased between 1990, the baseline year, and 1999 but have remained essentially flat since then. Emissions from other sectors decreased between 1990 and 2009 but have also flattened out since then. As a result Germany’s emissions are about the same now as they were in 2009. The increase in renewables generation over this period has clearly not had the desired effect.
The electricity sector presently contributes only about 45% of Germany’s total emissions. 100% decarbonization of the electricity sector, which is already about 45% decarbonized if we add nuclear, would therefore in theory reduce total emissions by only another 25% or so. Yet Germany’s efforts to cut emissions continue to concentrate on the electricity sector.
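The arithmetic in that paragraph is worth making explicit. A back-of-envelope sketch in Python; the 45% figures, and the way they combine multiplicatively, are the article's own assumptions:

```python
# Back-of-envelope reproduction of the paragraph's arithmetic. The 45%
# figures and their multiplicative combination are the article's own.
sector_share = 0.45          # electricity sector's share of total German emissions
already_decarbonized = 0.45  # fraction of the sector already low-carbon (incl. nuclear)

# Removing the sector's remaining fossil generation:
further_cut = sector_share * (1 - already_decarbonized)
print(f"Theoretical further cut to total emissions: {further_cut:.0%}")  # ~25%
```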
The chances that Germany will meet its 2020 and 2030 emissions reduction targets do not look good.
Renewables have not reduced emissions
Since 1990 renewable energy generation has grown by a factor of more than ten, to the point where it now supplies 30% of Germany's electricity. One would think that this would have had a visible impact on Germany's electricity sector emissions, but as shown in Figure 3 it's difficult to detect any impact at all. Despite the 20% absolute increase in renewables penetration between 1999 and 2014, electricity sector emissions have barely changed over this period, and had it not been for the 2008/9 recession they would probably have increased:
The reason renewables have had no detectable impact is that the added generation has gone towards filling increased demand and replacing nuclear generation, rather than displacing generation from gas, coal and lignite, which remains about the same as it was in 1990.
Finally, Germany will discontinue direct renewable subsidies for new projects at the beginning of 2017. It will be interesting to see what happens to retail electricity rates as a result.
Summary:
Germany is a country of contradictions, at least as far as energy is concerned. Germans are in favor of more renewable energy yet oppose building the overhead power lines that are needed to distribute it. They are in favor of deep emissions cuts but also in favor of shutting down Germany’s nuclear plants, which will make the problem of meeting emissions targets far more difficult and costly. The government continues to pursue a nuclear shutdown but is unwilling to shut down Germany’s lignite plants. As a result of these conflicting and counterproductive viewpoints and policies the Energiewende has effectively gone nowhere. Despite the expenditure of many billions of dollars it has failed to achieve any visible reduction in Germany’s emissions or to make a meaningful difference to Germany’s energy mix (renewables still supply only 14% of Germany’s total energy). Its only demonstrable impact has been skyrocketing electricity bills.
And now Germany is discontinuing the direct renewables subsidies that have driven the Energiewende since its adoption in 2000. It might be premature to declare the Energiewende a failure, but things are certainly headed in that direction.
Much more HERE (See the original for links, graphics etc.)
The Troubling Science
Michael Hart is a Canadian academic with an impressive list of credentials. He has just put out a book – Hubris: The Troubling Science, Economics, and Politics of Climate Change.
This article covers many of the topics that have been raised here at Blackjay over the last couple of years. It is a must-read for anyone with lingering doubts about the supposed urgent need for action on climate change.
For example: Alarm over a changing climate leading to malign results is in many ways the product of the hunger for stability and direction in a post-Christian world. Humans have a deep, innate need for a transcendent authority. Having rejected the precepts of Christianity, people in the advanced economies of the West are turning to other forms of authority. Putting aside those who cynically exploit the issue for their own gain – from scientists and politicians to UN leaders and green businesses – most activists are deeply committed to a secular, statist, anti-human, earth-centric set of beliefs which drives their claims of a planet in imminent danger from human activity.
To them, a planet with fewer people is the ultimate goal, achievable only through centralized direction and control. As philosopher of science Jeffrey Foss points out, “Environmental science conceives and expresses humankind’s relationship to nature in a manner that is – as a matter of observable fact – religious.” It “prophesies an environmental apocalypse. It tells us that the reason we confront apocalypse is our own environmental sinfulness. Our sin is one of impurity. We have fouled a pure, ‘pristine’ nature with our dirty household and industrial wastes. The apocalypse will take the form of an environmental backlash, a payback for our sins. … environmental scientists tell people what they must do to be blameless before nature.”
The interview concludes: it will take a determined effort by people of faith and conscience to convince our political leaders that they have been gulled by a political movement exploiting fear of climate change to push a utopian, humanist agenda that most people would find abhorrent. As it now stands, politicians are throwing money that they do not have at a problem that does not exist in order to finance solutions that make no difference. The time has come to call a halt to this nonsense and focus on real issues that pose real dangers. In a world beset by war, terrorism, and continuing third-world poverty, there are far more important things on which political leaders need to focus.
It may be nitpicking but the one thing I disagree with is his use of the term “humanist” in the final paragraph. Humanism is a philosophical and ethical stance that emphasizes the value and agency of human beings, individually and collectively, and generally prefers critical thinking and evidence over acceptance of dogma or superstition. The utopian agenda is certainly not humanist. Any philosophy in which wilderness has greater value than community, in which humans are seen as a “scourge on the planet” a la Attenborough and which supports the dogma and pseudo-science of climate change is certainly not humanist.
But I agree with him about the rest of it.
SOURCE
The Global Effects of Global Warming on Human Mortality
Paper Reviewed: Guo, Y., Gasparrini, A., Armstrong, B., Li, S., Tawatsupa, B., Tobias, A., Lavigne, E., Coelho, M. de S.Z.S.C., Leone, M., Pan, X., Tong, S., Tian, L., Kim, H., Hashizume, M., Honda, Y., Guo, Y.-L.L., Wu, C.-F., Punnasiri, K., Yi, S.-M., Michelozzi, P., Saldiva, P.H.N. and Williams, G. 2014. Global variation in the effects of ambient temperature on mortality. Epidemiology 25: 781-789.
In a study they designed to determine the effects of daily high and low temperatures on human mortality, Guo et al. (2014) obtained daily temperature and mortality data from 306 communities located in 12 different countries (Australia, Brazil, Thailand, China, Taiwan, Korea, Japan, Italy, Spain, the United Kingdom, the United States and Canada) that fell somewhere within the time span of 1972-2011. And what did they learn from this monumental endeavor?
The 22 researchers, hailing from numerous places throughout the world, report that "to obtain an easily interpretable estimate of the effects of cold and hot temperatures on mortality," they "calculated the overall cumulative relative risks of death associated with cold temperatures (1st percentile) and with hot temperatures (99th percentile), both relative to the minimum-mortality temperature [75th percentile]" (see figure below). And despite the "widely ranging climates" they encountered, they report that "the minimum-mortality temperatures were close to the 75th percentile of temperature in all 12 countries, suggesting that people have adapted to some extent to their local climates."
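For readers who want the mechanics, the percentile approach the authors describe is easy to illustrate. A minimal sketch on synthetic data; the toy risk slopes are my assumptions, not the paper's fitted values, and the real study used far more sophisticated models across 306 communities:

```python
# Sketch of the percentile approach described above, on synthetic data.
# The risk slopes are illustrative assumptions, not the paper's estimates.
import numpy as np

rng = np.random.default_rng(0)
temps = rng.normal(15, 8, 365 * 20)   # 20 years of synthetic daily mean temps (degC)

cold = np.percentile(temps, 1)        # "cold" reference: 1st percentile
mmt = np.percentile(temps, 75)        # minimum-mortality temperature: ~75th percentile
hot = np.percentile(temps, 99)        # "hot" reference: 99th percentile

def relative_risk(t):
    # Toy mortality curve: risk rises away from the minimum-mortality
    # temperature, more steeply on the cold side (the paper's key finding).
    return 1 + (0.015 * (mmt - t) if t < mmt else 0.010 * (t - mmt))

print(f"RR at 1st percentile ({cold:.1f} degC): {relative_risk(cold):.2f}")
print(f"RR at 99th percentile ({hot:.1f} degC): {relative_risk(hot):.2f}")
```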
Once again, therefore, it is as clear as it can possibly be made that essentially everywhere in the world, typical cold temperatures are far more likely to lead to premature human deaths than are typical warm temperatures. And because of this fact, we must be thankful for the post-Little Ice Age warming of the world, which almost everywhere has been experienced predominantly at the cold -- and deadly -- end of the planet's daily temperature spectrum.
More HERE (See the original for links, graphics etc.)
Bats Save Billions In Pest Control
And wind turbines kill them by the millions
A secret war is waged above farmland every night.
Just after dusk, high-stakes aerial combat is fought in the darkness atop the crop canopy. Nature's air force arrives in waves over crop fields, sometimes flying in from 30 miles away. Bat colonies blanket the air with echolocation clicks and dive toward insect prey at up to 60 mph. In games of hide-and-seek between bats and crop pests, the bats always win, and the victories are worth billions of dollars to U.S. agriculture.
Bats are a precious but unheralded friend of farmers, providing consistent crop protection. Take away the colonies of pest killers and insect control costs would explode across farmland. And just how much do bats save agriculture in pesticide use? Globally, the tally may reach a numbing $53 billion per year, according to estimates from the University of Pretoria, U.S. Geological Survey (USGS), University of Tennessee, and Boston University.
A 2006 study estimated that bats saved cotton growers $74 per acre in pesticide treatments across eight Texas counties. In 2013-2014, graduate student Josiah Maines and his advisor at Southern Illinois University Carbondale, Justin Boyles, went beyond penciled estimates and ran a concrete field trial on the relationship between bats and corn protection. Funded by Bat Conservation International, Maines' unique test targeted density of corn earworm in southern Illinois bottomland in Alexander County.
Maines built a canopy system to prevent bats from accessing particular sections of corn at night. The controlled enclosure (65' by 65', and 23' high) was braced by steel structural poles interconnected with steel cables draped with netting. Maines operated the netting like a gigantic shower curtain every day of the crop season: open in daylight, closed at night. He kept the vigil for two years, sliding the big curtain across at dusk from May to late September to cut off bat access to earworm moths. The results? Maines was astonished.
He found a 50% reduction in earworm presence in control areas and a similar reduction in damage to corn ears. Not only did bats suppress earworm larvae and direct damage to corn, they also hindered the presence of fungal species and toxic compounds. “Globally, we estimate bats save corn farmers over $1 billion annually in earworm control,” Maines says. “It’s an incredible amount when we’re only considering one pest and one crop. Bats are truly a vital economic species.”
Would producers see greater crop protection with more bat habitat? In general, researchers don’t know how many bats fly over a single acre of farmland at night. Bats are extremely difficult to count during the day. They hide incredibly well in trees, caves, holes in the ground, and buildings. “Future research should look at the tradeoff of forested bat habitat and crop protection. Safe to say, more bats could mean even greater consumption of crop pests,” Maines says.
Paul Cryan, a USGS research biologist at the Fort Collins Science Center, says that of the up to 45 bat species in the U.S., 41 or 42 eat nothing but insects. "Our U.S. bats are small -- 10 to 20 grams. They have voracious appetites and eat half or all their body weight each night. Pest control value to agriculture is certainly in the billions of dollars per year."
However, pressing issues surround the future of U.S. bat populations. White nose syndrome (WNS) is a major threat to U.S. bat numbers. The fungal disease affects hibernating bats and has spread halfway across the U.S. since first appearing in New York in 2006. “WNS has killed up to 6 million bats and continues moving,” Cryan says. “I believe farmers would see an immediate impact in insect suppression if overall bat populations were seriously reduced.”
Cryan coauthored a seminal 2011 paper, Economic Importance of Bats in Agriculture, suggesting the loss of bats would cost U.S. agriculture at least $3.7 billion per year. “We’re typically scared of the dark, but bats shouldn’t be a part of that association. They’re such a beneficial and important part of the environment and farmland protection.”
Hat tip to the misunderstood bats of agriculture: phenomenal creatures patrolling farmland skies every night in the greatest show never seen.
SOURCE
***************************************
For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are here or here or here. Email me (John Ray) here.
Preserving the graphics: Most graphics on this site are hotlinked from elsewhere. But hotlinked graphics sometimes have only a short life -- as little as a week in some cases. After that they no longer come up. From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site. See here or here
*****************************************
Sunday, August 28, 2016
Solar activity has a direct impact on Earth's cloud cover
This new paper confirms that solar activity variation can account for a 2% variation in global cloud cover, sufficient to explain the warming of the 20th century and without any consideration of CO2 "radiative forcing."
A team of scientists from the National Space Institute at the Technical University of Denmark (DTU Space) and the Racah Institute of Physics at the Hebrew University of Jerusalem has linked large solar eruptions to changes in Earth's cloud cover in a study based on over 25 years of satellite observations.
The solar eruptions are known to shield Earth's atmosphere from cosmic rays. However, the new study, published in Journal of Geophysical Research: Space Physics, shows that the global cloud cover is simultaneously reduced, supporting the idea that cosmic rays are important for cloud formation. The eruptions cause a reduction in cloud fraction of about 2 percent, corresponding to roughly a billion tonnes of liquid water disappearing from the atmosphere.
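The "billion tonnes" figure can be sanity-checked with a back-of-envelope calculation. A sketch; the liquid water path value is an assumption of mine, and I read the 2 percent as two percentage points of global cloud cover:

```python
# Order-of-magnitude check on the "billion tonnes" figure. The liquid
# water path is an assumed typical value, not taken from the study.
earth_area = 5.1e14         # m^2, Earth's surface area
lost_cloud_fraction = 0.02  # reduction in cloud cover (two percentage points)
lwp = 0.1                   # kg/m^2, assumed mean liquid water path of the lost clouds

water_lost_kg = earth_area * lost_cloud_fraction * lwp
print(f"~{water_lost_kg / 1e12:.1f} billion tonnes of liquid water")  # ~1.0
```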
Since clouds are known to affect global temperatures on longer timescales, the present investigation represents an important step in the understanding of clouds and climate variability.
"Earth is under constant bombardment by particles from space called galactic cosmic rays. Violent eruptions at the Sun's surface can blow these cosmic rays away from Earth for about a week. Our study has shown that when the cosmic rays are reduced in this way there is a corresponding reduction in Earth's cloud cover. Since clouds are an important factor in controlling the temperature on Earth our results may have implications for climate change," explains lead author on the study Jacob Svensmark of DTU.
Very energetic particles
These particles generate electrically charged molecules -- ions -- in Earth's atmosphere. Ions have been shown in the laboratory to enhance the formation of aerosols, which can serve as seeds for the formation of the cloud drops that make up a cloud. Whether this actually happens in the atmosphere, or only in the laboratory, is a question that has been investigated and debated for years.
When the large solar eruptions blow away the galactic cosmic rays before they reach Earth, they cause a reduction in atmospheric ions of up to about 20 to 30 percent over the course of a week. So if ions affect cloud formation, it should be possible to observe a decrease in cloud cover during events when the Sun blows away cosmic rays, and this is precisely what this study does.
The so-called 'Forbush decreases' of the cosmic rays have previously been linked to week-long changes in Earth's cloud cover but the effect has been debated at length in the scientific literature. The new study concludes that "there is a real impact of Forbush decreases on cloud microphysics" and that the results support the suggestion that "ions play a significant role in the life-cycle of clouds."
Arriving at that conclusion was, however, a hard endeavor: very few strong Forbush decreases occur, and their effect on cloud formation is expected to be close to the limit of detection using global atmospheric observations measured by satellites and land-based stations. It was therefore essential to select the strongest events for study, since they should have the most easily detectable effect. Determining event strength required combining data from about 130 stations with atmospheric modeling.
This new method resulted in a list of 26 events in the period of 1987-2007 ranked according to ionization. This ranked list was important for the detection of a signal, and may also shed some light on why previous studies have arrived at varied conclusions, since they have relied on events that were not necessarily ranked high on the list.
Possible long term effect
The effect from Forbush decreases on clouds is too brief to have any impact on long-term temperature changes.
However, since clouds are affected by short-term changes in galactic cosmic radiation, they may well also be affected by the slower changes in solar activity that happen on scales from tens to hundreds of years, and thus play a role in the radiation budget that determines the global temperature.
The Sun's contribution to past and future climate change may thus be larger than the direct changes in radiation alone, conclude the scientists behind the new study.
SOURCE
Uncovered: Incoherent, Conflicting IPCC ‘Beliefs’ on Climate Sensitivity
This is a long and complex article but it needs to be, so that it can set out fully what the detailed scientific claims of Warmists are. It shows that to get their alleged "catastrophic" levels of warming they rely heavily on an assumption that water vapour in the air will greatly magnify the warming due to CO2 alone. So how do they work out the size of that magnifying effect? They don't. They just guess it. And the actual evidence indicates that the effect is negligible -- JR
For going on three decades now, Intergovernmental Panel on Climate Change (IPCC) reports have estimated that the climate's sensitivity to a doubling of preindustrial levels of CO2 (from 280 ppm to 560 ppm) may range from 1.5°C to 4.5°C, due significantly to the assumed "dangerous" warming amplification from positive water vapor feedback. Despite years of analysis, the factor-of-three difference between the lower and upper surface temperature thresholds has changed little. There apparently have been no breakthroughs in understanding the "basic physics" of water vapor amplification that would narrow this range further.
The theoretical conceptualization for the surface temperature change resulting from CO2 doubling alone — without the “dangerous” amplification from water vapor feedback — has also been in use, and unchanged, for decades. Since the 1960s it has been hypothesized that if preindustrial CO2 levels were to be doubled to 560 ppm, the surface temperature change would amount to a warming of a non-alarming 1.2°C in the absence of other feedbacks.
Below are brief summaries from scientific papers (and the Skeptical Science blog) confirming that the IPCC and models claim doubling CO2 only results in 1.2°C of warming.
IPCC (2001) :
“[T]he radiative forcing corresponding to a doubling of the CO2 concentration would be 4 Wm-2. To counteract this imbalance, the temperature of the surface-troposphere system would have to increase by 1.2°C (with an accuracy of ±10%), in the absence of other changes”
Skeptical Science :
“We know that if the amount of carbon dioxide (CO2) in the Earth’s atmosphere doubles from the pre-industrial level of 280 parts per million by volume (ppmv) to 560 ppmv, this will cause an energy imbalance by trapping more outgoing thermal radiation in the atmosphere, enough to directly warm the surface approximately 1.2°C.”
Gebhart, 1967 :
“The temperature change at the earth’s surface is ΔT=+1.2°C when the present [CO2] concentration is doubled.”
Hansen et al., 1981 :
“The increase of equilibrium surface temperature for doubled atmospheric CO2 is ∼1.2°C. This case is of special interest because it is the purely radiative-convective result, with no feedback effects.”
Lorius et al., 1990 :
“The radiative forcing resulting from doubled atmospheric CO2 would increase the surface and tropospheric temperature by 1.2°C if there were no feedbacks in the climate system.”
Torn and Harte, 2006 :
“An increase in atmospheric CO2 concentration from 275 to 550 ppm is expected to increase radiative forcing by about 4 W m-2, which would lead to a direct warming of 1.2°C in the absence of feedbacks or other responses of the climate system”
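All of the excerpts above are quoting the same simple calculation: divide the forcing from doubled CO2 by the no-feedback (Planck) response of the climate system. A sketch using the numbers as given; the ~3.3 W m-2 per °C response value is implied by those numbers rather than stated in them:

```python
# The no-feedback warming quoted throughout the excerpts above: forcing
# from doubled CO2 divided by the Planck (blackbody) response.
forcing = 4.0          # W/m^2, radiative forcing for 2x CO2 (per the IPCC 2001 quote)
planck_response = 3.3  # W/m^2 per degC of surface warming (implied by the quoted figures)

delta_t = forcing / planck_response
print(f"No-feedback warming for doubled CO2: {delta_t:.1f} degC")  # ~1.2
```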
IPCC: Dangerous future warming levels (3°C and up) are caused mostly by water vapor, not CO2
As mentioned, the IPCC authors have claimed that it is primarily the conceptualized positive water vapor feedback that pushes the projected surface temperature response up to the dangerous warming levels of 3.0°C and beyond as CO2 doubles to 560 ppm.
IPCC (2001) :
“The so-called water vapour feedback, caused by an increase in atmospheric water vapour due to a temperature increase, is the most important feedback responsible for the amplification of the temperature increase [from CO2 alone].”
In their 4th report, the IPCC acknowledged that humans have little influence in determining water vapor levels:
IPCC (2007) :
“Water vapour is the most abundant and important greenhouse gas in the atmosphere. However, human activities have only a small direct influence on the amount of atmospheric water vapour.”
The main reason why IPCC authors have asserted that water vapor will do most of the “dangerous” projected warming, while CO2 will contribute a much smaller fraction, is apparently because the greenhouse warming effect from water vapor forcing is “two to three times greater” than that of carbon dioxide:
IPCC (2013) :
“Water vapour is the primary greenhouse gas in the Earth’s atmosphere. The contribution of water vapour to the natural greenhouse effect relative to that of carbon dioxide (CO2) depends on the accounting method, but can be considered to be approximately two to three times greater.”
Even NASA agrees that water vapor and clouds together account for 75% of the greenhouse effect, while CO2 only accounts for 20%.
NASA :
“Carbon dioxide causes about 20 percent of Earth’s greenhouse effect; water vapor accounts for about 50 percent; and clouds account for 25 percent. The rest is caused by small particles (aerosols) and minor greenhouse gases like methane.”
IPCC: Positive water vapor feedbacks are believed to cause dangerous warming
It is curious to note that the insufficiently understood positive water vapor feedback conceptualization is rooted in . . . belief. Literally. In the third report (TAR), the IPCC authors actually used the word “believed” to denote how they reached the conclusion that 1.2°C will somehow morph into 1.5°C to 4.5°C of warming due to amplification from feedbacks.
IPCC (2001) :
“If the amount of carbon dioxide were doubled instantaneously, with everything else remaining the same, the outgoing infrared radiation would be reduced by about 4 Wm-2. In other words, the radiative forcing corresponding to a doubling of the CO2 concentration would be 4 Wm-2. To counteract this imbalance, the temperature of the surface-troposphere system would have to increase by 1.2°C (with an accuracy of ±10%), in the absence of other changes. In reality, due to feedbacks, the response of the climate system is much more complex. It is believed that the overall effect of the feedbacks amplifies the temperature increase to 1.5 to 4.5°C. A significant part of this uncertainty range arises from our limited knowledge of clouds and their interactions with radiation.”
IPCC climate sensitivity estimates have been based on hypotheticals, or the belief that water vapor positive feedback will cause another 1.8°C to 3.3°C of “extra” or “dangerous” warming (to reach upwards of 3.0°C to 4.5°C). CO2 alone only causes 1.2°C of warming as it is doubled from 280 ppm to 560 ppm. Since when are modeled beliefs about what may possibly happen to global temperatures at some point in the next 100 years . . . science?
IPCC: Water vapor increased substantially since 1970 — but didn’t cause warming
If water vapor is the primary determinant of the “extra” and “dangerous” warming we are expected to get along with the modest 1.2°C temperature increase as the CO2 concentration reaches 560 ppm, then it is natural to ask: How much of the warming since 1950 has been caused by the additional CO2, and how much has been caused by the water vapor feedback that is believed to cause the extra, “dangerous” warming?
This last question arises because, according to the IPCC, there has been a substantial increase in the potent water vapor greenhouse gas concentration in the last few decades. Specifically, in their 4th report, the IPCC authors claim there has been "an overall increase in water vapour of order 5% over the 20th century and about 4% since 1970" (IPCC 2007).
Considering its abundance in the atmosphere (~40,000 ppm in the tropics), if water vapor increased by 4% since 1970, that means that water vapor concentrations could potentially have increased by more than 1,500 ppm in the last few decades. The overall magnitude of this water vapor concentration increase is therefore more than 20 times greater than the increase in atmospheric CO2 concentration (~70 ppm) since 1970.
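That back-of-envelope arithmetic checks out, with the caveat that ~40,000 ppm is a tropical near-surface value, so the absolute numbers are an upper-bound illustration:

```python
# Checking the paragraph's arithmetic. Note that ~40,000 ppm is a tropical
# near-surface value, so this is an upper-bound illustration.
h2o_ppm = 40_000
h2o_rise = 0.04 * h2o_ppm   # 4% increase since 1970
co2_rise = 70               # ppm increase in CO2 since 1970

print(f"Water vapor rise: ~{h2o_rise:,.0f} ppm")              # ~1,600 ppm
print(f"Ratio to CO2 rise: ~{h2o_rise / co2_rise:.0f} to 1")  # ~23 to 1
```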
But even though the IPCC claims that (a) water vapor will cause most of the “dangerous” warming in the future, (b) water vapor climate forcing is “two to three” times greater than CO2 forcing within the greenhouse effect, and (c) water vapor concentrations have increased substantially since 1970, the IPCC simultaneously claims that (d) CO2 has caused most — if not all — of the warming since the mid-20th century anyway. In the 5th report, the IPCC’s “consensus” statement reads like this:
IPCC (2013, 2014) :
“It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together.”
For advocates of dangerous anthropogenic global warming (DAGW) projections, the “more than half” CO2 attribution apparently isn’t quantitatively strong enough. After all, “more than half” could be interpreted as only slightly more than 50%. To rectify this, Gavin Schmidt — a primary overseer of NASA temperature adjustments — has calculated that the anthropogenic impact on climate has not just been “more than half,” but more than 100%. In a recent RealClimate blog entry, Schmidt claims that humans have caused 110% of the global warming since 1950 — and that IPCC analysis (found in Fig. 10.5 in IPCC AR5) also supports an anthropogenic CO2 attribution of “near 100%”.
Real Climate :
“The best estimate of the warming due to anthropogenic forcings (ANT) is the orange bar [in Fig. 10.5] (noting the 1σ uncertainties). Reading off the graph, it is 0.7±0.2°C (5-95%) with the observed warming 0.65±0.06 (5-95%). The attribution then follows as having a mean of ~110%, with a 5-95% range of 80–130%. This easily justifies the IPCC claims of having a mean near 100%, and a very low likelihood of the attribution being less than 50% (p < 0.0001!).”
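The division behind that attribution figure is straightforward to reproduce, at least for the central values. A rough sketch; the quoted ~110% mean and 80-130% range presumably come from more precise readings of Fig. 10.5 and a proper treatment of both uncertainty ranges:

```python
# Rough reproduction of the quoted attribution arithmetic, central values
# only. The quoted ~110% and 80-130% range presumably reflect more precise
# graph readings and proper handling of both uncertainty ranges.
ant = 0.7   # degC, best-estimate anthropogenic warming (read off Fig. 10.5)
obs = 0.65  # degC, observed warming

print(f"Central attribution: {ant / obs:.0%}")  # ~108%
print(f"Naive range: {(ant - 0.2) / obs:.0%} to {(ant + 0.2) / obs:.0%}")
```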
Conflicting IPCC climate sensitivity feedback suppositions
The IPCC believes that the climate's overall surface temperature sensitivity to the doubling of preindustrial CO2 ranges between 1.5°C and 4.5°C, with the projected higher warming levels due primarily to amplifying water vapor feedback. This conceptualization appears to be in conflict with other IPCC suppositions.
On one hand, the IPCC reports have claimed that (a) water vapor is much more potent than CO2 within the greenhouse effect, that (b) the bulk of the 3.0°C and up “dangerous” warming that is believed to occur in the future will be forced by positive water vapor feedback, and that (c) water vapor levels have significantly increased in recent decades (by 4% since 1970).
On the other hand, (d) water vapor is claimed to have caused right around 0% of the warming in the last several decades.
In sum, these conflicting explanations or suppositions about what can happen, what will happen, and what has already happened to the climate due to water vapor feedback raise two questions:
Why hasn’t the “dangerous” water vapor warming found in models “kicked in” during the last several decades, when water vapor levels have increased (according to the IPCC)?
Since it reportedly hasn’t yet, at what point in the future will the “dangerous” water vapor warming projections found in modeling finally show up in the temperature record?
Considering how fundamental climate sensitivity estimates are to climate science, and ultimately to the direction of political policies and energy production and consumption, these questions deserve to be answered . . . with something more substantive than what the IPCC authors have long believed to be true.
SOURCE
What Obama Is Doing to Seal His Environmental Record
Before his last day in office, President Barack Obama wants to impose new fuel-efficiency standards and establish a green energy plan for North America to top off an environmental legacy including major international agreements and a massive expansion of regulations and subsidies.
“He will be leaving office with a very strongly negative legacy,” predicted Nick Loris, research fellow on energy and environment with The Heritage Foundation, in a phone interview. “After he failed to get a ‘cap and trade’ bill through Congress, he has used unelected bureaucrats to implement and pioneer regulatory onslaught.”
Early in his presidency, Obama and liberals in Congress unsuccessfully proposed financial incentives for companies to reduce carbon emissions, saying such a "cap and trade" approach would help curb global warming.
During his weekend address Aug. 13, Obama spoke about “ambitious investments” that led to tripling the use of wind power, increasing the use of solar energy “thirtyfold,” and more energy-efficient vehicles.
“We’re not done yet. In the weeks and months ahead, we’ll release a second round of fuel-efficiency standards for heavy-duty vehicles,” Obama said of his Jan. 20 departure after eight years, adding:
We’ll take steps to meet the goal we set with Canada and Mexico to achieve 50 percent clean power across North America by 2025. And we’ll continue to protect our lands and waters so that our kids and grandkids can enjoy our most beautiful spaces for generations.
‘Little to Mitigate Global Warming’
Three days after that address, the Environmental Protection Agency and the National Highway Traffic Safety Administration formally announced they are adopting new fuel-efficiency standards for heavy-duty vehicles such as tractor-trailers and buses.
This will mark the second time the Obama administration has put new fuel-efficiency standards in place. The White House, in a press release, asserts that 20 percent of carbon pollution comes from heavy-duty vehicles.
Separately, the Energy Department created a new program to spend $140 million on research and development for “fuel-efficient truck technologies.”
This will almost certainly mean higher costs with minimal impact on global warming, Loris said.
“Trucks, buses, and garbage trucks, these are all industries that measure their fuel to a tenth of a mile because energy efficiency is key to their bottom line,” Loris said. “There is little this would do to mitigate global warming. You could shut down the entire economy and the temperature would only move a few degrees Celsius.”
Obama's other ambitious goal before leaving office was set during the North American Leaders' Summit in late June, where Obama met with Mexican President Enrique Peña Nieto and Canadian Prime Minister Justin Trudeau in Ottawa. The plan is to have the three countries operating on 50 percent clean energy by 2025.
Such a goal will be nearly impossible to reach in nine years, said Patrick Michaels, director of the Center for the Study of Science at the libertarian Cato Institute.
“Of course it’s not doable,” Michaels told The Daily Signal in a phone interview. “Even if you substitute nuclear power for fossil fuels, that wouldn’t be enough time to build enough nuclear plants.”
‘Legacy of Unconscionable Costs’
Sticking to the deal will be a challenge, agreed David Kreutzer, a senior research fellow for energy economics and climate change at The Heritage Foundation.
“Whatever the cost, it won’t be incurred by the Obama administration,” Kreutzer said in a phone interview. “He can take on the role of an energy reformer and his successor will have to deal with the lost jobs and high energy prices. The current government of Canada might seem inclined to sign on, but Mexico needs investment and might not want to tie itself into poverty.”
The regulatory costs of environmental regulations artificially raise energy prices, which are typically shouldered by lower-income Americans, according to an analysis by The Heritage Foundation.
A 2011 poll by the National Energy Assistance Directors Association found that 37 percent of low-income families sacrificed medical and dental coverage to pay for higher energy bills. The poll found almost one in five identified a family member who became sick because their home was too cold.
“It’s a legacy of unconscionable costs imposed with no climate impact,” Kreutzer said.
The president could have made a larger investment in cutting-edge technologies such as those associated with nuclear power, including fusion research, contends Tony Sadar, a certified consulting meteorologist and author of “In Global Warming We Trust: Too Big to Fail.”
“Progressives are looking at sunbeams and windmills,” Sadar told The Daily Signal in a phone interview. “You’re not really progressive if you’re looking at ancient technologies. Early on, the president supported research into nuclear power generation. But we’ve seen a return to the alternative energy that leaves much to be desired economically and even environmentally.”
‘He Is Doubling Down’
Michaels, of the Cato Institute, said much of the Obama legacy will be the “boondoggles” of solar and wind power along the countryside.
“His long-term legacy will be that he committed this country to sources of power that will never supply much dependable electric power,” Michaels said, adding:
The fact that solar and wind have been subsidized for years shows they are not successful. He makes no attempt to hide the fact that he believes Europe is doing so many wonderful things that we should. If he was consistent on that, he would observe that most of Europe is disengaging from these energy sources, while he is doubling down.
The Daily Signal sought comment from the Sierra Club and Greenpeace, both of which support much of Obama’s environmental agenda, but neither responded by publication time.
Courts have delivered a setback to some of Obama’s environmental agenda.
In February, the Supreme Court blocked EPA rules limiting carbon emissions from power plants. The high court ordered a stay, until more than two dozen lawsuits challenging the regulations can be sorted.
Lower federal courts halted the Interior Department from imposing stricter regulations on hydraulic fracturing, and separately stopped an EPA rule on small waterways and wetlands. The lawsuits and court rulings were based in part on executive overreach.
The United States entered an international climate agreement with 171 other countries negotiated in Paris that is intended to curb carbon emissions that government leaders say contribute to global warming. The governments hammered out the deal last year, and U.S. Secretary of State John Kerry signed the agreement in April.
Though treaties require Senate ratification, negotiators from the Obama administration and other countries worded much of the agreement to allow the measures to be handled by the executive branch.
Scaling Back Taxpayer Subsidies
The Obama administration has scaled back some taxpayer subsidies after spending hundreds of millions on loan guarantees for green energy companies that failed, Loris noted.
Solyndra, the politically connected solar panel company that went bankrupt despite a $500 million Energy Department loan, was the most publicized debacle. But dozens of other companies got taxpayer subsidies.
In congressional testimony, Loris noted the recurring themes of green energy subsidies: taxpayer money going to failed companies that couldn't survive even with such help; to projects backed by larger companies that should have been able to operate without taxpayer help; and to numerous other companies that simply benefited from taxpayer subsidies.
The government surprisingly seems to have learned something from the bad investments, Loris said.
“I haven’t seen new major loan guarantees, though there have been extended tax credits,” Loris told The Daily Signal. “There could be a recognition that government isn’t good at picking winners and losers. Folks will recognize that politicians shouldn’t invest in energy.”
SOURCE
UK: Health warning over plan to use hospital generators to avoid blackouts
In which universe might a plan to use hospital generators to avoid blackouts seem sane?
National Grid’s drive for hospitals to help keep the UK's lights on by using their back-up diesel generators is "highly questionable" because it will cause air pollution right in the vicinity of patients, a think-tank has warned.
The energy utility is encouraging NHS sites to sign up for schemes where they will be paid to use their back-up generators for electricity routinely, not just in the event of an emergency power cut.
National Grid argues that making greater use of these existing generators represents a cost-effective way of helping to meet peak UK power demand as the country builds more intermittent wind and solar, instead of building new power plants that would sit dormant much of the time.
But Policy Exchange has urged the Government to restrict the use of such diesel generators beyond genuine emergency back-up because of concerns about air quality, especially in urban areas that are already polluted.
Diesel generators emit significant amounts of nitrogen oxides and particulate matter which can be "extremely damaging to health", it warns.
"National Grid has been actively recruiting hospitals and other organisations to make back-up generators available at peak times and avoid blackouts.
"Whilst this is desirable from a security of supply point of view, it is highly questionable from an air quality point of view – particularly since hospitals are typically located in urban locations close to some of the most sensitive receptors," Richard Howard, Policy Exchange’s head of energy and environment wrote.
Mr Howard said he had even heard of "generator flues venting directly into car parks and communal areas in hospitals used by patients".
Ministers are currently considering how to curb the growth in diesel generators, dozens of which are being built around the country after becoming the unintended beneficiaries of the Government’s "capacity market" subsidy scheme, which procures power plant capacity.
The environment department is considering new emissions regulations to target diesel, which are also likely to affect existing generators.
"The regulations need to be designed so as to avoid placing undue restrictions on genuine back-up generators, but at the same time limit the extent to which these same generators can run purely for commercial reasons," Mr Howard said.
However, any such restrictions could be a setback for National Grid’s efforts to keep the lights on cost-effectively.
The company is trying to promote "demand side response" schemes where industrial or commercial users reduce their demand on the grid at times when national supplies are scarce.
To date, about 95% of the capacity procured has come from users switching to alternative sources of power such as diesel engines, rather than actually reducing the total amount of electricity they are using.
A separate review by regulator Ofgem is currently considering removing some of the financial benefits that diesel generators currently enjoy as a result of connecting directly into local distribution networks.
A spokesman for National Grid said: "Demand side measures are good for bill payers as they provide flexibility at a lower cost and help the country shift to a more low-carbon energy system.
"National Grid is obliged to be agnostic about technology and to procure the most cost-effective solutions to help us balance supply and demand. However, the Government is currently examining the regulations surrounding diesel generation. "
SOURCE
Green Fiasco: Biofuels ‘Worse Than Petrol’ For The Environment, New Study Finds
“Green” biofuels such as ethanol and biodiesel are in fact worse for the environment than petrol, a landmark new study has found.
The alternative energy source has long been praised for being carbon-neutral because the plants it is made from absorb carbon dioxide, which causes global warming, from the atmosphere while they are growing.
But new research in the US has found that the crops used for biofuel absorb only 37 per cent of the CO2 that is later released into the atmosphere when the resulting fuel is burnt, meaning the process actually increases the amount of greenhouse gas in the air.
The scientists behind the study have called on governments to rethink their carbon policies in light of the findings.
The use of biofuels is controversial because it means crops and farm space that could otherwise be devoted to food production are in fact used for energy.
They currently make up just under 3 per cent of global energy consumption, and use in the US grew from 4.2 billion gallons a year in 2005 to 14.6 billion gallons a year in 2013.
In the UK the Renewable Transport Fuel Obligation now means that 4.75 per cent of any suppliers’ fuel comes from a renewable source, which is usually ethanol derived from crops.
Professor John DeCicco, from the University of Michigan, said his research was the first to carefully examine the carbon on farmland where biofuels are grown.
“When you look at what’s actually happening on the land, you find that not enough carbon is being removed from the atmosphere to balance what’s coming out of the tailpipe,” he said.
“When it comes to the emissions that cause global warming, it turns out that biofuels are worse than gasoline.”
Professor DeCicco said the study, which is published in the journal Climatic Change, challenges the assumption that biofuels, as renewable alternatives to fossil fuels, are inherently carbon-neutral simply because the CO2 released when they are burned was originally absorbed from the atmosphere through photosynthesis.
SOURCE
How Britain will keep the lights on
Millions being spent to do what existing infrastructure could do if it was all brought back online
Eight new battery storage projects are to be built around the UK after winning contracts worth £66m to help National Grid keep power supplies stable as more wind and solar farms are built.
EDF Energy, E.On and Vattenfall were among the successful companies chosen to build new lithium ion batteries with a combined capacity of 200 megawatts (MW), under a new scheme to help Grid balance supply and demand within seconds.
Power generation and usage on the UK grid have to be matched as closely as possible in real-time to keep electricity supplies at a safe frequency so that household electrical appliances function properly.
National Grid says that maintaining the correct frequency is becoming more challenging as more renewable generation is built, because this makes the electricity system less stable and leads to more volatile fluctuations in frequency.
As a result, it has launched a new scheme to support technologies such as batteries that can respond within less than a second to either deliver or absorb power to or from the grid, bringing the system back into balance.
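In control terms, what is being described is a proportional ("droop") frequency response: the battery measures grid frequency and injects or absorbs power in proportion to the deviation from 50 Hz. A minimal sketch; the deadband and droop figures are illustrative assumptions, not the scheme's actual specification:

```python
# Minimal sketch of proportional ("droop") frequency response, the kind of
# sub-second balancing service described above. The deadband and droop
# figures are illustrative assumptions, not National Grid's specification.
NOMINAL_HZ = 50.0
DEADBAND_HZ = 0.015      # no response inside this band (assumed)
FULL_RESPONSE_HZ = 0.5   # deviation giving a full-power response (assumed)

def battery_power_mw(freq_hz, capacity_mw):
    """Positive = deliver power to the grid; negative = absorb (charge)."""
    deviation = NOMINAL_HZ - freq_hz
    if abs(deviation) <= DEADBAND_HZ:
        return 0.0
    # proportional response, clipped at the installation's full capacity
    fraction = max(-1.0, min(1.0, deviation / FULL_RESPONSE_HZ))
    return fraction * capacity_mw

# E.g. a 49 MW installation (the size of EDF's West Burton project below):
for f in (49.8, 49.99, 50.0, 50.2):   # low frequency -> deliver power
    print(f"{f:5.2f} Hz -> {battery_power_mw(f, 49):+6.1f} MW")
```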
Projects with a capacity of more than 1.2 gigawatts entered a competition for contracts to provide this service to the grid.
EDF Energy’s was the biggest individual project to secure a contract, winning a £12m deal to build 49 megawatts (MW) of battery storage by its coal and gas plants at West Burton in Nottinghamshire.
Vattenfall won a contract to build 22MW of batteries next to its Pen y Cymoedd wind farm in Wales, while E.On is to build a 10MW battery by its biomass plant at Blackburn Meadows near Sheffield.
Low Carbon secured £15m of deals to build two projects, one in Kent and one in Cumbria, with a combined capacity of 50MW. The other winners were Element Power, RES and Belectric.
SOURCE
Feds Fund Scientists Who Protect The ‘Global Warming Paradigm,’ Says Report
The Obama administration has been pumping billions of taxpayer dollars into science that’s “heavily biased in favor of the paradigm of human-induced climate change,” according to researchers.
Policy experts wanted to know if the lure of federal dollars was biasing climate science research. What they found is that the group responsible for a significant portion of government climate science funding seems more concerned with promoting the “anthropogenic global warming” (AGW) paradigm than with studying natural variability in weather patterns.
“In short there appears to be virtually no discussion of the natural variability attribution idea. In contrast there appears to be extensive coverage of AGW issues,” David Wojick, a freelance reporter and policy analyst, wrote in a blog post, referring to research he did with climate scientist Patrick Michaels of the libertarian Cato Institute.
“This bias in favor of AGW has significant implications for US climate change policy,” Wojick wrote for the blog Climate Etc., which is run by climate scientist Judith Curry.
Wojick and Michaels published a working paper in 2015, asking the question: Does federal funding bias climate science?
They conducted a “semantic” analysis of three years of budget requests for the U.S. Global Change Research Program (USGCRP), which usually gets around $2.5 billion. They found USGCRP overwhelmingly used language supporting the AGW paradigm.
“The ratio of occurrences is roughly 80 to one,” Wojick wrote. “This extreme lack of balance between considerations of the two competing paradigms certainly suggests that paradigm protection is occurring.”
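The "semantic analysis" at issue is, mechanically, term-frequency counting across budget documents. A hedged sketch of the idea; the term lists and filename are my illustrative guesses, not Wojick and Michaels' actual search strings:

```python
# Sketch of the kind of term-frequency comparison described. The term
# lists and filename are illustrative guesses, not the actual search
# strings used by Wojick and Michaels.
import re

AGW_TERMS = ["anthropogenic", "human-induced", "emissions", "mitigation"]
NATURAL_TERMS = ["natural variability", "solar influence", "ocean oscillation"]

def count_terms(text, terms):
    text = text.lower()
    return sum(len(re.findall(re.escape(term), text)) for term in terms)

budget_text = open("usgcrp_budget_request.txt").read()  # hypothetical input file
agw = count_terms(budget_text, AGW_TERMS)
natural = count_terms(budget_text, NATURAL_TERMS)
print(f"AGW : natural = {agw} : {natural} (the paper reported roughly 80 : 1)")
```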
Politicians have become more concerned with global warming in recent years, and have been willing to shell out more money for potential solutions to the problem. The Obama administration, for example, reported spending $22.2 billion on global warming efforts in 2013, including $2.5 billion to the USGCRP.
That’s a lot of money, and illustrates why Wojick and Michaels are so concerned about federal money’s influence on science.
“Present policy is based on the AGW paradigm, but if a significant fraction of global warming is natural then this policy may be wrong,” Wojick wrote. “Federal climate research should be trying to solve the attribution problem, not protecting the AGW paradigm.”
Wojick and Michaels have already weighed in on the bias in climate science towards using models, which they say “is a bad thing.”
“Climate science appears to be obsessively focused on modeling,” they wrote in May. “Modeling can be a useful tool, a way of playing with hypotheses to explore their implications or test them against observations. That is how modeling is used in most sciences.”
“But in climate change science modeling appears to have become an end in itself. In fact it seems to have become virtually the sole point of the research,” they wrote. “The modelers’ oft stated goal is to do climate forecasting, along the lines of weather forecasting, at local and regional scales.”
SOURCE
***************************************
For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are here or here or here. Email me (John Ray) here.
Preserving the graphics: Most graphics on this site are hotlinked from elsewhere. But hotlinked graphics sometimes have only a short life -- as little as a week in some cases. After that they no longer come up. From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site. See here or here
*****************************************