Sunday, August 28, 2016



Solar activity has a direct impact on Earth's cloud cover

This new paper confirms that solar activity variation can account for a 2% variation in global cloud cover, sufficient to explain the warming of the 20th century and without any consideration of CO2 "radiative forcing."

A team of scientists from the National Space Institute at the Technical University of Denmark (DTU Space) and the Racah Institute of Physics at the Hebrew University of Jerusalem has linked large solar eruptions to changes in Earth's cloud cover in a study based on over 25 years of satellite observations.

The solar eruptions are known to shield Earth's atmosphere from cosmic rays. However, the new study, published in Journal of Geophysical Research: Space Physics, shows that the global cloud cover is simultaneously reduced, supporting the idea that cosmic rays are important for cloud formation. The eruptions cause a reduction in cloud fraction of about 2 percent, corresponding to roughly a billion tonnes of liquid water disappearing from the atmosphere.

Since clouds are known to affect global temperatures on longer timescales, the present investigation represents an important step in the understanding of clouds and climate variability.

"Earth is under constant bombardment by particles from space called galactic cosmic rays. Violent eruptions at the Sun's surface can blow these cosmic rays away from Earth for about a week. Our study has shown that when the cosmic rays are reduced in this way there is a corresponding reduction in Earth's cloud cover. Since clouds are an important factor in controlling the temperature on Earth our results may have implications for climate change," explains lead author on the study Jacob Svensmark of DTU.

Very energetic particles

These particles generate electrically charged molecules -- ions -- in Earth's atmosphere. Ions have been shown in the laboratory to enhance the formation of aerosols, which can serve as seeds for the formation of the cloud drops that make up a cloud. Whether this actually happens in the atmosphere, or only in the laboratory is a topic that has been investigated and debated for years.

When the large solar eruptions blow away the galactic cosmic rays before they reach Earth, they cause a reduction in atmospheric ions of up to about 20 to 30 percent over the course of a week. So if ions affect cloud formation, it should be possible to observe a decrease in cloud cover during events when the Sun blows away cosmic rays, and this is precisely what is done in this study.

The so-called 'Forbush decreases' of the cosmic rays have previously been linked to week-long changes in Earth's cloud cover but the effect has been debated at length in the scientific literature. The new study concludes that "there is a real impact of Forbush decreases on cloud microphysics" and that the results support the suggestion that "ions play a significant role in the life-cycle of clouds."

Arriving at that conclusion was, however, a hard endeavor: very few strong Forbush decreases occur, and their effect on cloud formation is expected to be close to the limit of detection using global atmospheric observations measured by satellites and land-based stations. It was therefore essential to select the strongest events for study, since they would have the most easily detected effect. Determining this strength required combining data from about 130 stations with atmospheric modeling.

This new method resulted in a list of 26 events in the period of 1987-2007 ranked according to ionization. This ranked list was important for the detection of a signal, and may also shed some light on why previous studies have arrived at varied conclusions, since they have relied on events that were not necessarily ranked high on the list.

Possible long term effect

The effect from Forbush decreases on clouds is too brief to have any impact on long-term temperature changes.

However, since clouds are affected by short-term changes in galactic cosmic radiation, they may well also be affected by the slower changes in solar activity that happen on scales of tens to hundreds of years, and thus play a role in the radiation budget that determines the global temperature.

The Sun's contribution to past and future climate change may thus be larger than merely the direct changes in radiation, conclude the scientists behind the new study.

SOURCE






Uncovered: Incoherent, Conflicting IPCC ‘Beliefs’ on Climate Sensitivity

This is a long and complex article but it needs to be that so it can set out fully what the detailed scientific claims of Warmists are.  It shows that to get their alleged "catastrophic" levels of warming they rely heavily on an assumption about water vapour in the air having a large magnifying effect on the warming due to CO2 alone.  So how do they work out exactly what the size of that magnifying effect will be?  They don't.  They just guess it.  And the actual  evidence for the size of such an effect is that it has no effect -- JR

For going on 3 decades now, Intergovernmental Panel on Climate Change (IPCC) reports have estimated that the climate’s sensitivity to the doubling of preindustrial levels of CO2 (from 280 ppm to 560 ppm) may range between 1.5°C and 4.5°C, due significantly to the assumed “dangerous” warming amplification from positive water vapor feedback.  Despite years of analysis, the factor-of-three difference between the lower and higher surface temperature range thresholds has changed little.  There apparently have been no breakthroughs in understanding the “basic physics” of water vapor amplification to narrow this range further.

The theoretical conceptualization for the surface temperature change resulting from CO2 doubling alone — without the “dangerous” amplification from  water vapor feedback — has also been in use, and unchanged, for decades.  Since the 1960s it has been hypothesized that if preindustrial CO2 levels were to be doubled to 560 ppm, the surface temperature change would amount to a warming of a non-alarming 1.2°C in the absence of other feedbacks.

Below are brief excerpts from scientific papers (and the Skeptical Science blog) confirming that, per the IPCC and the models, doubling CO2 by itself results in only 1.2°C of warming.

IPCC (2001) :

“[T]he radiative forcing corresponding to a doubling of the CO2 concentration would be 4 Wm-2. To counteract this imbalance, the temperature of the surface-troposphere system would have to increase by 1.2°C (with an accuracy of ±10%), in the absence of other changes”

Skeptical Science :

“We know that if the amount of carbon dioxide (CO2) in the Earth’s atmosphere doubles from the pre-industrial level of 280 parts per million  by volume (ppmv) to 560 ppmv, this will cause an energy imbalance by trapping more outgoing thermal radiation in the atmosphere, enough to directly warm the surface approximately 1.2°C.”

Gebhart, 1967 :

“The temperature change at the earth’s surface is ΔT=+1.2°C when the present [CO2] concentration is doubled.”

Hansen et al., 1981 :

“The increase of equilibrium surface temperature for doubled atmospheric CO2 is ∼1.2°C.  This case is of special interest because it is the purely radiative-convective result, with no feedback effects.”

Lorius et al., 1990 :

“The radiative forcing resulting from doubled atmospheric CO2 would increase the surface and tropospheric temperature by  1.2°C if there were no feedbacks in the climate system.”

Torn and Harte, 2006 :

“An increase in atmospheric CO2 concentration from 275 to 550 ppm is expected to increase radiative forcing by about 4 W m-2, which would lead to a direct warming of 1.2°C in the absence of feedbacks or other responses of the climate system”
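The 1.2°C no-feedback figure quoted throughout these excerpts follows from dividing the doubled-CO2 forcing by the Planck (no-feedback) response of roughly 3.2 W m-2 per °C, a standard textbook value that is assumed here rather than stated in the quotes themselves:

```latex
\Delta T_0 \;=\; \frac{\Delta F_{2\times}}{\lambda_P}
           \;\approx\; \frac{4~\mathrm{W\,m^{-2}}}{3.2~\mathrm{W\,m^{-2}\,^{\circ}C^{-1}}}
           \;\approx\; 1.2\,^{\circ}\mathrm{C}
```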

IPCC: Dangerous future warming levels (3°C and up) are caused mostly by water vapor, not CO2

As mentioned, the IPCC authors have claimed that it is primarily due to the conceptualization of positive feedback with water vapor that the surface temperature response is projected  to reach the dangerous warming levels of 3.0°C and up as CO2 doubles to 560 ppm.

IPCC (2001) :

“The so-called water vapour feedback, caused by an increase in atmospheric water vapour due to a temperature increase, is the most important feedback responsible for the amplification of the temperature increase [from CO2 alone].”

In their 4th report, the IPCC acknowledged that humans have little influence in determining water vapor levels:

IPCC (2007) :

“Water vapour is the most abundant and important greenhouse gas in the atmosphere. However, human activities have only a small direct influence on the amount of atmospheric water vapour.”

The main reason why IPCC authors have asserted that water vapor will do most of the “dangerous” projected warming, while CO2 will contribute a much smaller fraction, is apparently because the greenhouse warming effect from water vapor forcing is “two to three times greater” than that of carbon dioxide:

IPCC (2013) :

“Water vapour is the primary greenhouse gas in the Earth’s atmosphere. The contribution of water vapour to the natural greenhouse effect relative to that of carbon dioxide (CO2) depends on the accounting method, but can be considered to be approximately two to three times greater.”

Even NASA agrees that water vapor and clouds together account for 75% of the greenhouse effect, while CO2 only accounts for 20%.

NASA  :

“Carbon dioxide causes about 20 percent of Earth’s greenhouse effect; water vapor accounts for about 50 percent; and clouds account for 25 percent. The rest is caused by small particles (aerosols) and minor greenhouse gases like methane.”

IPCC: Positive water vapor feedbacks are believed to cause dangerous warming

It is curious to note that the insufficiently understood positive water vapor feedback conceptualization is rooted in . . . belief.  Literally.   In the third report (TAR), the IPCC authors actually used the word “believed” to denote how they reached the conclusion that 1.2°C will somehow morph into 1.5°C to 4.5°C of warming due to amplification from feedbacks.

IPCC (2001) :

“If the amount of carbon dioxide were doubled instantaneously, with everything else remaining the same, the outgoing infrared radiation would be reduced by about 4 Wm-2. In other words, the radiative forcing corresponding to a doubling of the CO2 concentration would be 4 Wm-2. To counteract this imbalance, the temperature of the surface-troposphere system would have to increase by 1.2°C (with an accuracy of ±10%), in the absence of other changes. In reality, due to feedbacks, the response of the climate system is much more complex. It is believed that the overall effect of the feedbacks amplifies the temperature increase to 1.5 to 4.5°C. A significant part of this uncertainty range arises from our limited knowledge of clouds and their interactions with radiation.”

IPCC climate sensitivity estimates have been based on hypotheticals, or the belief that water vapor positive feedback will cause another 1.8°C to 3.3°C of “extra” or “dangerous” warming (to reach upwards of 3.0°C to 4.5°C).  CO2 alone only causes 1.2°C of warming as it is doubled from 280 ppm to 560 ppm.  Since when are modeled beliefs about what may possibly happen to global temperatures at some point in the next 100 years . . . science?

IPCC: Water vapor increased substantially since 1970 — but didn’t cause warming

If water vapor is the primary determinant of the “extra” and “dangerous” warming we are expected to get along with the modest 1.2°C temperature increase as the CO2 concentration reaches 560 ppm, then it is natural to ask: How much of the warming since 1950 has been caused by the additional CO2, and how much has been caused by the water vapor feedback that is believed to cause the extra, “dangerous” warming?

This last question arises because, according to the IPCC, there has been a substantial increase in the potent water vapor greenhouse gas concentration in the last few decades.  Specifically, in their 4th report, the IPCC authors claim there has been “an overall increase in water vapour of order 5% over the 20th century and about 4% since 1970” (IPCC [2007]).

Considering its abundance in the atmosphere (~40,000 ppm in the tropics), if water vapor increased by 4% since 1970, that means that water vapor concentrations could potentially have increased by more than 1,500 ppm in the last few decades.  The overall magnitude of this water vapor concentration increase is therefore more than 20 times greater than the increase in atmospheric CO2 concentration (~70 ppm) since 1970.
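The back-of-envelope comparison above can be checked directly; the ~40,000 ppm tropical concentration and ~70 ppm CO2 rise are the article's own figures, simply multiplied out here:

```python
# Rough arithmetic behind the water vapor vs. CO2 comparison.
water_vapor_ppm = 40_000                # approximate tropical concentration (from the text)
wv_increase = water_vapor_ppm * 0.04    # IPCC's "about 4% since 1970"
co2_increase = 70                       # approximate CO2 rise since 1970, ppm

print(wv_increase)                      # 1600.0 ppm -> "more than 1,500 ppm"
print(wv_increase / co2_increase)       # ~22.9   -> "more than 20 times greater"
```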

But even though the IPCC claims that (a) water vapor will cause most of the “dangerous” warming in the future, (b) water vapor climate forcing is “two to three” times greater than CO2 forcing within the greenhouse effect, and (c) water vapor concentrations have increased substantially since 1970, the IPCC simultaneously claims that (d) CO2 has caused most — if not all — of the warming since the mid-20th century anyway.   In the 5th report, the IPCC’s “consensus” statement reads like this:

IPCC (2013, 2014) :

“It is extremely likely that more than half of the observed increase in global average surface temperature from 1951 to 2010 was caused by the anthropogenic increase in greenhouse gas concentrations and other anthropogenic forcings together.”

For advocates of dangerous anthropogenic global warming (DAGW) projections, the “more than half” CO2 attribution apparently isn’t quantitatively strong enough.  After all, “more than half” could be interpreted as only slightly more than 50%.   To rectify this, Gavin Schmidt  — a primary overseer of NASA temperature adjustments — has calculated that the anthropogenic impact on climate has not  just been “more than half,” but more than 100%.   In a recent RealClimate blog entry, Schmidt  claims that humans have caused 110% of the global warming since 1950 — and that IPCC analysis (found in Fig. 10.5 in IPCC AR5) also supports an anthropogenic CO2 attribution of  “near 100%”.

Real Climate :

“The best estimate of the warming due to anthropogenic forcings (ANT) is the orange bar [in Fig. 10.5] (noting the 1𝛔 uncertainties). Reading off the graph, it is 0.7±0.2ºC (5-95%) with the observed warming 0.65±0.06 (5-95%). The attribution then follows as having a mean of ~110%, with a 5-95% range of 80–130%. This easily justifies the IPCC claims of having a mean near 100%, and a very low likelihood of the attribution being less than 50% (p < 0.0001!).”
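Schmidt's numbers can be roughly reproduced with elementary error propagation. The conversion of 5-95% half-widths to standard deviations (dividing by 1.645) and the independence assumption are mine, not stated in the quote, and they yield a somewhat wider range than his quoted 80-130%:

```python
import math

Z90 = 1.645  # converts a 5-95% half-width to roughly one standard deviation

ant, ant_hw = 0.70, 0.20   # anthropogenic warming, degC, with 5-95% half-width
obs, obs_hw = 0.65, 0.06   # observed warming, degC, with 5-95% half-width

ratio = ant / obs          # central attribution estimate, ~108%, i.e. "near 110%"
# Naive propagation of independent relative uncertainties through the ratio.
sig = ratio * math.hypot((ant_hw / Z90) / ant, (obs_hw / Z90) / obs)
lo, hi = ratio - Z90 * sig, ratio + Z90 * sig

print(f"{ratio:.0%} ({lo:.0%} to {hi:.0%})")   # 108% (75% to 140%)
```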

Conflicting IPCC climate sensitivity feedback suppositions

The IPCC believes that the climate’s overall surface temperature sensitivity to the doubling of preindustrial CO2 ranges between 1.5°C and 4.5°C, with the projected higher warming levels due primarily to amplifying water vapor feedback.  This conceptualization appears to be in conflict with other IPCC suppositions.

On one hand, the IPCC reports have claimed that (a) water vapor is much more potent than CO2 within the greenhouse effect, that (b) the bulk of the 3.0°C and up “dangerous” warming that is believed to occur in the future will be forced by positive water vapor feedback, and that (c) water vapor  levels have significantly increased in recent decades (by 4% since 1970).

On the other hand, (d) water vapor is claimed to have caused right around 0% of the warming in the last several decades.

In sum, these conflicting explanations or suppositions about what can happen, what will happen, and what has already happened to the climate due to water vapor feedback raise the questions:

Why hasn’t the “dangerous” water vapor warming found in models “kicked in” during the last several decades, when water vapor levels have increased (according to the IPCC)?

Since it reportedly hasn’t yet, at what point in the future will the “dangerous” water vapor warming projections found in modeling finally show up in the temperature record?

Considering how fundamental climate sensitivity estimates are to climate science, and ultimately to the direction of political policies and energy production and consumption, these questions deserve to be answered . . . with something more substantive than what the IPCC authors have long believed to be true.

SOURCE  





What Obama Is Doing to Seal His Environmental Record

Before his last day in office, President Barack Obama wants to impose new fuel-efficiency standards and establish a green energy plan for North America to top off an environmental legacy including major international agreements and a massive expansion of regulations and subsidies.

“He will be leaving office with a very strongly negative legacy,” predicted Nick Loris, research fellow on energy and environment with The Heritage Foundation, in a phone interview. “After he failed to get a ‘cap and trade’ bill through Congress, he has used unelected bureaucrats to implement and pioneer regulatory onslaught.”

Early in his presidency, Obama and liberals in Congress unsuccessfully proposed financial incentives for companies to reduce carbon emissions, saying such a “cap and trade” approach would help curb global warming.

During his weekend address Aug. 13, Obama spoke about “ambitious investments” that led to tripling the use of wind power, increasing the use of solar energy “thirtyfold,” and more energy-efficient vehicles.

“We’re not done yet. In the weeks and months ahead, we’ll release a second round of fuel-efficiency standards for heavy-duty vehicles,” Obama said of his Jan. 20 departure after eight years, adding:

We’ll take steps to meet the goal we set with Canada and Mexico to achieve 50 percent clean power across North America by 2025. And we’ll continue to protect our lands and waters so that our kids and grandkids can enjoy our most beautiful spaces for generations.

‘Little to Mitigate Global Warming’

Three days after that address, the Environmental Protection Agency and the National Highway Traffic Safety Administration formally announced they are adopting new fuel-efficiency standards for heavy-duty vehicles such as tractor-trailers and buses.

This will mark the second time the Obama administration has put new fuel-efficiency standards in place. The White House, in a press release, asserts that 20 percent of carbon pollution comes from heavy-duty vehicles.

Separately, the Energy Department created a new program to spend $140 million on research and development for “fuel-efficient truck technologies.”

This will almost certainly mean higher costs with minimum impact on global warming, Loris said.

“Trucks, buses, and garbage trucks, these are all industries that measure their fuel to a tenth of a mile because energy efficiency is key to their bottom line,” Loris said. “There is little this would do to mitigate global warming. You could shut down the entire economy and the temperature would only move a few degrees Celsius.”

Obama’s other ambitious goal before leaving office was reached during the North American Leaders’ Summit in late June, where Obama met with Mexican President Enrique Peña Nieto and Canadian Prime Minister Justin Trudeau in Ottawa. The plan is to have the three countries operating on 50 percent clean energy by 2025.

Such a goal will be nearly impossible to reach in nine years, said Patrick Michaels, director of the Center for the Study of Science at the libertarian Cato Institute.

“Of course it’s not doable,” Michaels told The Daily Signal in a phone interview. “Even if you substitute nuclear power for fossil fuels, that wouldn’t be enough time to build enough nuclear plants.”

‘Legacy of Unconscionable Costs’

Sticking to the deal will be a challenge, agreed David Kreutzer, a senior research fellow for energy economics and climate change at The Heritage Foundation.

“Whatever the cost, it won’t be incurred by the Obama administration,” Kreutzer said in a phone interview. “He can take on the role of an energy reformer and his successor will have to deal with the lost jobs and high energy prices. The current government of Canada might seem inclined to sign on, but Mexico needs investment and might not want to tie itself into poverty.”

The regulatory costs of environmental regulations artificially raise energy prices, which are typically shouldered by lower-income Americans, according to an analysis by The Heritage Foundation.

A 2011 poll by the National Energy Assistance Directors Association found that 37 percent of low-income families sacrificed medical and dental coverage to pay for higher energy bills. The poll found almost one in five identified a family member who became sick because their home was too cold.

“It’s a legacy of unconscionable costs imposed with no climate impact,” Kreutzer said.

“It’s a legacy of unconscionable costs imposed with no climate impact,” @dwkreutzer says.

The president could have made a larger investment in cutting-edge technologies such as those associated with nuclear power, including fusion research, contends Tony Sadar, a certified consulting meteorologist and author of “In Global Warming We Trust: Too Big to Fail.”

“Progressives are looking at sunbeams and windmills,” Sadar told The Daily Signal in a phone interview. “You’re not really progressive if you’re looking at ancient technologies. Early on, the president supported research into nuclear power generation. But we’ve seen a return to the alternative energy that leaves much to be desired economically and even environmentally.”

‘He Is Doubling Down’

Michaels, of the Cato Institute, said much of the Obama legacy will be the “boondoggles” of solar and wind power along the countryside.

“His long-term legacy will be that he committed this country to sources of power that will never supply much dependable electric power,” Michaels said, adding:

The fact that solar and wind have been subsidized for years shows they are not successful. He makes no attempt to hide the fact that he believes Europe is doing so many wonderful things that we should. If he was consistent on that, he would observe that most of Europe is disengaging from these energy sources, while he is doubling down.

The Daily Signal sought comment from the Sierra Club and Greenpeace, both of which support much of Obama’s environmental agenda, but neither responded by publication time.

Courts have delivered a setback to some of Obama’s environmental agenda.

In February, the Supreme Court blocked EPA rules limiting carbon emissions from power plants. The high court ordered a stay, until more than two dozen lawsuits challenging the regulations can be sorted.

Lower federal courts halted the Interior Department from imposing stricter regulations on hydraulic fracturing, and separately stopped an EPA rule on small waterways and wetlands. The lawsuits and court rulings were based in part on executive overreach.

The United States entered an international climate agreement with 171 other countries negotiated in Paris that is intended to curb carbon emissions that government leaders say contribute to global warming. The governments hammered out the deal last year, and U.S. Secretary of State John Kerry signed the agreement in April.

Though treaties require Senate ratification, negotiators from the Obama administration and other countries worded much of the agreement to allow the measures to be handled by the executive branch.

Scaling Back Taxpayer Subsidies

The Obama administration has scaled back some taxpayer subsidies after spending hundreds of millions on loan guarantees for green energy companies that failed, Loris noted.

Solyndra, the politically connected solar panel company that went bankrupt despite a $500 million Energy Department loan, was the most publicized debacle. But dozens of other companies got taxpayer subsidies.

In congressional testimony, Loris noted the underlying themes of subsidies to green energy companies showed taxpayer money going to failed companies that couldn’t survive even with such help; projects backed by larger companies that should be able to operate without taxpayer help; and numerous other companies that benefit from taxpayer subsidies.

The government surprisingly seems to have learned something from the bad investments, Loris said.

“I haven’t seen new major loan guarantees, though there have been extended tax credits,” Loris told The Daily Signal. “There could be a recognition that government isn’t good at picking winners and losers. Folks will recognize that politicians shouldn’t invest in energy.”

SOURCE




UK: Health warning over plan to use hospital generators to avoid blackouts

In which universe might a plan to use hospital generators to avoid blackouts seem sane?

National Grid’s drive for hospitals to help keep the UK's lights on by using their back-up diesel generators is "highly questionable" because it will cause air pollution right in the vicinity of patients, a think-tank has warned.

The energy utility is encouraging NHS sites to sign up for schemes where they will be paid to use their back-up generators for electricity routinely, not just in the event of an emergency power cut.

National Grid argues that making greater use of these existing generators represents a cost-effective way of helping to meet peak UK power demand as the country builds more intermittent wind and solar, instead of building new power plants that would sit dormant much of the time.

But Policy Exchange has urged the Government to restrict the use of such diesel generators beyond genuine emergency back-up because of concerns about air quality, especially in urban areas that are already polluted.

Diesel generators emit significant amounts of nitrogen oxides and particulate matter which can be "extremely damaging to health", it warns.

"National Grid has been actively recruiting hospitals and other organisations to make back-up generators available at peak times and avoid blackouts.

"Whilst this is desirable from a security of supply point of view, it is highly questionable from an air quality point of view – particularly since hospitals are typically located in urban locations close to some of the most sensitive receptors," Richard Howard, Policy Exchange’s head of energy and environment wrote.

Mr Howard said he had even heard of "generator flues venting directly into car parks and communal areas in hospitals used by patients".

Ministers are currently considering how to curb the growth in diesel generators, dozens of which are being built around the country after becoming the unintended beneficiaries of the Government’s "capacity market" subsidy scheme, which procures power plant capacity.

The environment department is considering new emissions regulations to target diesel, which are also likely to affect existing generators.

"The regulations need to be designed so as to avoid placing undue restrictions on genuine back-up generators, but at the same time limit the extent to which these same generators can run purely for commercial reasons," Mr Howard said.

However, any such restrictions could be a setback for National Grid’s efforts to keep the lights on cost-effectively.

The company is trying to promote "demand side response" schemes where industrial or commercial users reduce their demand on the grid at times when national supplies are scarce.

To date, about 95pc of the capacity procured has come from users switching to alternative sources of power such as diesel engines, rather than actually reducing the total amount of electricity they are using.

A separate review by regulator Ofgem is currently considering removing some of the financial benefits that diesel generators currently enjoy as a result of connecting directly into local distribution networks.

A spokesman for National Grid said: "Demand side measures are good for bill payers as they provide flexibility at a lower cost and help the country shift to a more low-carbon energy system.

"National Grid is obliged to be agnostic about technology and to procure the most cost-effective solutions to help us balance supply and demand. However, the Government is currently examining the regulations surrounding diesel generation. "

SOURCE




Green Fiasco: Biofuels ‘Worse Than Petrol’ For The Environment, New Study Finds

“Green” biofuels such as ethanol and biodiesel are in fact worse for the environment than petrol, a landmark new study has found.

The alternative energy source has long been praised for being carbon-neutral because the plants it is made from absorb carbon dioxide, which causes global warming, from the atmosphere while they are growing.

But new research in the US has found that the crops used for biofuel absorb only 37 per cent of the CO2 that is later released into the atmosphere when the plants are burnt, meaning the process actually increases the amount of greenhouse gas in the air.
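Under the study's headline figure, the net accounting is simple arithmetic. The 37 per cent uptake number is the article's; normalizing to one unit of tailpipe CO2 and framing the remainder as a net emission is my illustration:

```python
tailpipe_co2 = 1.0    # normalize: 1 unit of CO2 released when the fuel is burnt
crop_uptake = 0.37    # fraction re-absorbed by the crops (per the study)

# The remainder is a net addition of greenhouse gas to the air.
net_emitted = tailpipe_co2 - crop_uptake
print(round(net_emitted, 2))   # 0.63 -> 63% of the tailpipe CO2 is uncompensated
```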

The scientists behind the study have called on governments to rethink their carbon policies in light of the findings.

The use of biofuels is controversial because it means crops and farm space that could otherwise be devoted to food production are in fact used for energy.

They currently make up just under 3 per cent of global energy consumption, and use in the US grew from 4.2 billion gallons a year in 2005 to 14.6 billion gallons a year in 2013.

In the UK the Renewable Transport Fuel Obligation now means that 4.75 per cent of any supplier’s fuel comes from a renewable source, which is usually ethanol derived from crops.

Professor John DeCicco, from the University of Michigan, said his research was the first to carefully examine the carbon on farmland where biofuels are grown.

“When you look at what’s actually happening on the land, you find that not enough carbon is being removed from the atmosphere to balance what’s coming out of the tailpipe,” he said.

“When it comes to the emissions that cause global warming, it turns out that biofuels are worse than gasoline.”

Professor DeCicco said the study, which is published in the journal Climatic Change, resets the assumption that biofuels, as renewable alternatives to fossil fuels, are inherently carbon-neutral simply because the CO2 released when they are burned was originally absorbed from the atmosphere through photosynthesis.

SOURCE




How Britain will keep the lights on

Millions being spent to do what existing infrastructure could do if it was all brought back online

Eight new battery storage projects are to be built around the UK after winning contracts worth £66m to help National Grid keep power supplies stable as more wind and solar farms are built.

EDF Energy, E.On and Vattenfall were among the successful companies chosen to build new lithium ion batteries with a combined capacity of 200 megawatts (MW), under a new scheme to help Grid balance supply and demand within seconds.

Power generation and usage on the UK grid have to be matched as closely as possible in real-time to keep electricity supplies at a safe frequency so that household electrical appliances function properly.

National Grid says that maintaining the correct frequency is becoming more challenging as more renewable generation is built, because this makes the electricity system less stable and leads to more volatile fluctuations in frequency.

As a result, it has launched a new scheme to support technologies such as batteries that can respond within less than a second to either deliver or absorb power to or from the grid, bringing the system back into balance.
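The sub-second "deliver or absorb" behaviour described above amounts to proportional (droop-style) frequency response. A toy sketch, with made-up capacity, deadband, and scaling values that are purely illustrative rather than National Grid's actual service parameters:

```python
def battery_response_mw(freq_hz, capacity_mw=10.0, nominal_hz=50.0,
                        deadband_hz=0.015, full_response_hz=0.5):
    """Positive = discharge to the grid (low frequency); negative = absorb (high frequency)."""
    deviation = nominal_hz - freq_hz
    if abs(deviation) <= deadband_hz:
        return 0.0   # frequency close enough to nominal: do nothing
    # Respond linearly, saturating at full capacity for a +/-0.5 Hz deviation.
    fraction = max(-1.0, min(1.0, deviation / full_response_hz))
    return capacity_mw * fraction

print(round(battery_response_mw(49.8), 2))   # 4.0  -> under-frequency, discharge
print(round(battery_response_mw(50.2), 2))   # -4.0 -> over-frequency, absorb
print(battery_response_mw(50.0))             # 0.0  -> within deadband
```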

Projects with a capacity of more than 1.2 gigawatts entered a competition for contracts to provide this service to the grid.

EDF Energy’s was the biggest individual project to secure a contract, winning a £12m deal to build 49 megawatts (MW) of battery storage by its coal and gas plants at West Burton in Nottinghamshire.

Vattenfall won a contract to build 22MW of batteries next to its Pen y Cymoedd wind farm in Wales, while E.On is to build a 10MW battery by its biomass plant at Blackburn Meadows near Sheffield.

Low Carbon secured £15m of deals to build two projects, one in Kent and one in Cumbria, with a combined capacity of 50MW. The other winners were Element Power, RES and Belectric.

SOURCE  






Feds Fund Scientists Who Protect The ‘Global Warming Paradigm,’ Says Report

The Obama administration has been pumping billions of taxpayer dollars into science that’s “heavily biased in favor of the paradigm of human-induced climate change,” according to researchers.

Policy experts wanted to know if the lure of federal dollars was biasing climate science research. What they found is that the group responsible for a significant portion of government climate science funding seems more concerned with promoting the “anthropogenic global warming” (AGW) paradigm than with studying natural variability in weather patterns.

“In short there appears to be virtually no discussion of the natural variability attribution idea. In contrast there appears to be extensive coverage of AGW issues,” David Wojick, a freelance reporter and policy analyst, wrote in a blog post, referring to research he did with climate scientist Patrick Michaels of the libertarian Cato Institute.

“This bias in favor of AGW has significant implications for US climate change policy,” Wojick wrote for the blog Climate Etc., which is run by climate scientist Judith Curry.

Wojick and Michaels published a working paper in 2015, asking the question: Does federal funding bias climate science?

They conducted a “semantic” analysis of three years of budget requests for the U.S. Global Change Research Program (USGCRP), which typically receives around $2.5 billion a year. They found USGCRP overwhelmingly used language supporting the AGW paradigm.

“The ratio of occurrences is roughly 80 to one,” Wojick wrote. “This extreme lack of balance between considerations of the two competing paradigms certainly suggests that paradigm protection is occurring.”
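A "semantic" analysis of this kind amounts to counting occurrences of two competing sets of keywords in the budget documents and comparing the totals. A minimal sketch of such a count; the keyword lists and sample text here are invented for illustration and are not taken from the Wojick-Michaels working paper:

```python
import re

# Hypothetical keyword sets; the actual search terms in the paper may differ.
AGW_TERMS = ["anthropogenic", "human-induced", "greenhouse gas", "emissions"]
NATURAL_TERMS = ["natural variability", "solar cycle", "ocean oscillation"]

def count_occurrences(text, terms):
    """Total case-insensitive occurrences of all phrases in `terms`."""
    text = text.lower()
    return sum(len(re.findall(re.escape(t), text)) for t in terms)

# Invented stand-in for a budget-request passage.
sample = ("Research priorities include anthropogenic forcing, greenhouse gas "
          "emissions scenarios, and impacts of human-induced change. "
          "Natural variability is noted briefly.")

agw = count_occurrences(sample, AGW_TERMS)
natural = count_occurrences(sample, NATURAL_TERMS)
print(agw, natural)   # the ratio of the two counts is the reported statistic
```

Whatever one makes of the conclusion, the method itself is just keyword frequency: the reported "80 to one" is the ratio of the two totals across the documents analysed.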

Politicians have become more concerned with global warming in recent years, and have been willing to shell out more money for potential solutions to the problem. The Obama administration, for example, reported spending $22.2 billion on global warming efforts in 2013, including $2.5 billion to the USGCRP.

That’s a lot of money, and illustrates why Wojick and Michaels are so concerned about federal money’s influence on science.

“Present policy is based on the AGW paradigm, but if a significant fraction of global warming is natural then this policy may be wrong,” Wojick wrote. “Federal climate research should be trying to solve the attribution problem, not protecting the AGW paradigm.”

Wojick and Michaels have already weighed in on the bias in climate science towards using models, which they say “is a bad thing.”

“Climate science appears to be obsessively focused on modeling,” they wrote in May. “Modeling can be a useful tool, a way of playing with hypotheses to explore their implications or test them against observations. That is how modeling is used in most sciences.”

“But in climate change science modeling appears to have become an end in itself. In fact it seems to have become virtually the sole point of the research,” they wrote. “The modelers’ oft stated goal is to do climate forecasting, along the lines of weather forecasting, at local and regional scales.”

SOURCE  

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************
