Sunday, June 26, 2016

Global warming really COULD leave you hot-headed: Scientists say scorching year-long temperatures make people violent

This is an old, old theory founded on the fact that there IS more interpersonal violence in the tropics.  But the people who live in the tropics are not the same people as those who live in temperate climes, so there could be other factors at work.  IQs, for instance, are notably lower in the warm-climate areas of the globe, and low IQ is reliably associated with crime and violence.  The average IQ of almost any prison population is well below average.  So the case is moot.  I once thought I had some evidence in support of the theory in my own research, but the difference turned out to be unreliable.

Near the equator, sweltering temperatures persevere day after day, with little chance that the upcoming season will break the routine. And according to a new theory, it just might make you snap.

Researchers say the combination of high temperatures and lack of seasonal variation causes people to lead 'faster' lifestyles, contributing to more aggression and violence, and say it could get worse as global warming causes temperatures to rocket.

In the 'CLASH' model – CLimate, Aggression, and Self-control in Humans – researchers say hot temperatures and little seasonal variation contribute to more aggression and violence.

This is because people in these regions lead a 'faster' lifestyle, and spend less time planning for the future.

They also say people in these climate areas are likely to behave with less self-control.

This may be because they don't plan ahead for drastic seasonal changes, they say, and are faced with more immediate risks.

Researchers from Vrije Universiteit Amsterdam developed the 'CLASH' model – CLimate, Aggression, and Self-control in Humans – to understand why violent crime is so high in hot climates.

'Climate shapes how people live, it affects the culture in ways that we don't think about in our daily lives,' said Paul Van Lange, lead author of the article and professor of psychology at Vrije Universiteit Amsterdam.

'We believe our model can help explain the impact of climate on rates of violence in different parts of the world.'

Previous studies have linked violence and aggression simply to hot climates, but the two leading explanations of why that is so aren't satisfactory, said Brad Bushman, a professor of psychology at The Ohio State University and VU Amsterdam.

The General Aggression Model, which Bushman helped to develop, attributes the aggression in hot climates to discomfort and irritation.  'But that doesn't explain more extreme acts, such as murder,' he said.

A second theory, the Routine Activity Theory, says that warm weather leads people to be outdoors more often, thus creating more opportunities for conflict.

But, this doesn't explain why violence increases as the temperatures grow hotter, with 95 degrees seeing more violence than 75.

In the new model, the researchers consider lack of seasonal variation a factor as well. 'Less variation in temperature, combined with heat, brings some measure of consistency to daily life,' said Maria I. Rinderu of VU.

As a result of this, the researchers say people have less of a need to plan ahead for weather differences, causing them to be less concerned about the future, and have less need for self-control.

'Strong seasonal variation in temperature affects culture in powerful ways,' said Van Lange.

'If there is less variation you're freer to do what you want now, because you're not preparing foods or chopping firewood or making winter clothes to get you through the winter. You also may be more concerned with the immediate stress that comes along with parasites and other risks of hot climates, such as venomous animals.'

Instead, the researchers say people who live in hot, consistent regions are more likely to act according to the present.

'We see evidence of a faster life strategy in hotter climates with less temperature variation – they are less strict about time, they have less use of birth control, they have children earlier and more often,' Bushman said.

While a person's behaviours may not entirely be the result of the climate they live in, this does help to shape the culture, the researchers explain.

'How people approach life is a part of culture and culture is strongly affected by climate,' Van Lange said.

'Climate doesn't make a person, but it is one part of what influences each of us. We believe it shapes the culture in important ways.'


Brexiteers are also climate skeptics

The article below is from a few months back, but current observations say the same thing.

Here are a number of parallels between the climate wars and the current Brexit skirmishes that I have noticed and found interesting. Make of these what you will:

1. There’s the stereotyping. Those in the “Out” camp are often viewed in the media as right-wing Little Englanders – except they’re not. George Galloway, anyone? Likewise, those in the climate sceptic camp here in the UK are often viewed in the media as right-wing Little Englanders – except they’re not. Piers Corbyn, right wing?

2.  Somewhat illogically, there’s also a perception in the media that the Brexit gang are a diverse and divisive rag-tag alliance (Nigel Farage and George Galloway on the same platform). The same could also generally be said about climate sceptics. I think this is actually not far from the truth, and it might indeed be a strength rather than a weakness, as not everyone can then be tarred with the same stereotypical brush.

3. There’s a bit of overlap between EU and CAGW scepticism – if these were circles in a Venn diagram, we would find UK politicians Owen Paterson and Graham Stringer (Conservative and Labour, respectively) in the area where they intersected (and they would probably be joined by lots of non-politicians, too).

4. There are also the big battalions lined up against both the Brexiteers and the climate sceptics. Against the “Out” camp are ranged a giant army of big business concerns, environment agencies, world leaders, the EU itself, Emma Thompson and President Obama. Against the CAGW sceptics are ranged a giant army of big business concerns, environment agencies, world leaders, the EU itself, Emma Thompson and President Obama. And the Pope. The power of authority! (Or the power of deeply vested interests, looked at in another way.)

5. And, of course, there’s Project Fear. Both Britain leaving the EU and “inaction on climate” will lead to Bad Things happening. Very Bad Things. I don’t need to spell these out, really. On climate change, Project Fear has actually been going for decades, although when they periodically realise people aren’t all that scared, something akin to Cameron’s “Project Fact” then gets proposed (just as long as the purported facts are frightening facts, mind). That doesn’t work, either, and so they go back to the Fear.

Anyway, why are there apparent close similarities between these two conflicts? I don’t have the definitive answer to this but suspect that something they have in common, very broadly speaking, is the age-old antagonism between Freedom and Authority.


Obama-Appointed Judge Strikes Down Fracking Regulation

Well, this is embarrassing for Barack Obama. Judge Scott Skavdahl — a judge Obama appointed to the Federal District Court in Wyoming — ruled that the Interior Department’s regulations on fracking were unlawful because Congress didn’t give it the power to hand down such rules. While the vast majority of fracking occurs on state and private land, the rules would have required oil companies operating on federal land to follow stricter safety guidelines.

“Hydraulic fracturing is one of the keys that has unlocked our nation’s energy resurgence in oil and natural gas, making the United States the largest energy producer in the world, creating tens of thousands of good-paying jobs, and lowering energy prices for consumers,” said House Speaker Paul Ryan in a statement. “Yet the Obama administration has sought to regulate it out of existence. This is not only harmful for the economy and consumers, it’s unlawful — as the court has just ruled.”

Congress, in a 2005 law, explicitly stated that the executive branch did not have the power to regulate fracking, the Wall Street Journal points out. That leaves room for states to decide the level of red tape they want to impose on the industry. But Obama, the erstwhile lecturer of constitutional law, didn’t need a 2005 law to tell him that; the spirit of that same statute is found in the Tenth Amendment. The courts have been striking down executive action after executive action of Obama’s because he doesn’t follow the Constitution. It’s especially significant that a judge Obama nominated has called a halt to this instance of unlawful executive overreach.


Bostonians are enjoying being scared by global warming

It relieves the boredom.  The sentence I like best below:  “We have a lot to fear from Antarctica.”  Since Antarctica is actually GAINING mass, that reveals that the whole report is entertainment.

The consequences of climate change on Boston are expected to be far more calamitous than previous studies have suggested, a new report commissioned by the city says.

In the worst-case scenario, sea levels could rise more than 10 feet by the end of the century — nearly twice what was previously predicted — plunging about 30 percent of Boston under water. Temperatures in 2070 could exceed 90 degrees for 90 days a year, compared with an average of 11 days now.

And changes in precipitation could mean a 50 percent decline in annual snowfall, punctuated by more frequent heavy storms such as nor’easters.

The report, by scientists from the University of Massachusetts and other local universities, has raised concerns in City Hall just two weeks after Mayor Martin J. Walsh attended a climate summit in Beijing.

“The updated climate projections confirm that we must work together to take bold approaches to prepare Boston for the impacts of climate change,” Walsh said in a statement.

The report, he said, is part of the city’s effort to assess its vulnerability and to seek solutions. Next year, Boston will host the same climate conference that Walsh attended, with leaders from some 60 US and Chinese cities.

“We take climate change seriously, because we take the health and resilience of our city seriously,” Walsh said. “We will continue to focus on using the best data to inform decisions and understand future investments.”

The updated projections for Boston take into account new research that suggests the accelerating melt of the ice sheets covering Antarctica will have a disproportionate impact on cities along the East Coast.

As ice melts at the South Pole, the resulting change in the gravitational pull on the ocean, as well as the gradual sinking of land in the Northeast, means that Boston and other nearby communities are likely to experience about 25 percent greater sea level rise than other parts of the planet, according to the new research.

“Boston is a bull’s-eye for more sea level damage,” said Rob DeConto, a climate scientist at UMass Amherst who helped develop the new Antarctica research and who co-wrote the new Boston report. “We have a lot to fear from Antarctica.”

If high levels of greenhouse gases continue to be released into the atmosphere, the seas around Boston could rise as much as 10.5 feet by 2100 and 37 feet by 2200, according to the report.

Even under optimistic forecasts that factor in significant cuts to carbon emissions, sea levels are projected to rise as much as 6 feet by 2100 and nearly 12 feet by 2200.

Such a dramatic rise would be devastating to Boston. Faneuil Hall, for example, now floods at 5 feet and Copley Square at 7.5 feet above today’s high tides, city officials say.

“If seas rise that much, the New England coastline would look very different from space,” said DeConto, referring to the worst-case scenarios. “There would be huge impacts on our ecosystems, and we would be talking about a managed retreat from the coastline rather than engineering a way to harden our coastline.”

The most comprehensive previous projection of the impact of climate change on Boston was released two years ago in a report by the federal government called the National Climate Assessment.

That report found that the Northeast was already bearing the brunt of climate change, with prolonged heat waves, torrential rains, and increased flooding, which it attributed to the burning of fossil fuels and other human activity.

It noted that over the past century average temperatures in Northeastern states have risen by 2 degrees Fahrenheit. It also found that the region’s precipitation has risen by more than 10 percent, while the worst storms have brought significantly more precipitation.

But the federal report forecast that seas would rise, under the worst case, between 3 and 6 feet by 2100 and projected that the southern states in the Northeast, by midcentury, would experience about 60 additional days per year of temperatures above 90 degrees.

The new report, submitted to city officials this month, raises the stakes for policymakers to curb emissions, said Julie Wormser, vice president for policy and planning at Boston Harbor Now, a local advocacy group.

“In a word, this is awful,” she said of the new projections. “It’s so stark it’s hard to wrap one’s head around.”

She noted that the increased storm surge and high tides could bring significant damage and flooding to the city far sooner than the end of the century, just as Tropical Storm Sandy devastated parts of coastal New Jersey and New York in 2012.

“We will need to come together to prevent Boston’s people and places from flooding where we can, and learn to live with more water where we can’t,” she said.

On the bright side, Carl Spector, commissioner of the city’s Environment Department, said the worst scenarios remain unlikely and a historic agreement reached last year in Paris offered hope that nations around the world could work together to reduce emissions.

But he said the new data about Boston underscore why the city has to consider taking action in the coming years to build barriers and other defenses against the rising seas, revise its building codes, and find other ways to adapt to the changing climate.

“We know even relatively small amounts of sea level rise affect us,” he said. “All the models we’re seeing are concerning.”


Who wants wind turbines?

Last month’s wind-turbine fire near Palm Springs, CA, that dropped burning debris on the barren ground below, serves as a reminder of just one of the many reasons why people don’t want to live near the towering steel structures. In this case, no one was hurt as the motor fire was in a remote, unincorporated area of Palm Springs. But imagine if it was located just hundreds of feet from your back door—as they are in many locations—and the burning debris was raining down into your yard where your children were playing or onto your roof while you are sleeping.

Other reasons no one wants them nearby include the health impacts. Last month, Dave Langrud, of Alden, MN, sent a six-page, detailed complaint to the Minnesota Public Regulatory Commission. In it, he states: “Wisconsin Power and Light constructed the Bent Tree Wind Farm surrounding my home. There are 19 turbines within one mile and 5 within ½ mile. Both my wife and I have had difficulty sleeping in our home since the turbines started operating. If we leave the area, we don’t have this problem. The turbines have also caused severe headaches for my wife. She didn’t have this problem before the turbines, and this isn’t a problem for her when we spend time away from our home and away from the turbines. When we are home, the problems return.”

In response to other recent ongoing complaints at multiple Minnesota wind projects about the proximity of the turbines to residences, commissioners from the Minnesota Department of Health, Department of Commerce, and Pollution Control Agency acknowledged that regarding permitting and setbacks, “the noise standard was not promulgated with wind turbine-like noise in mind. It addresses audible noise, not infrasound. As such, it is not a perfect measure to use in determining noise-related set-backs between wind turbines and residences.” Yet, it is the “measure” that is used. The Commissioners also acknowledged: “At present there is no available funding to conduct such studies.”

Langrud’s letter addresses property values. He asks: “How do we get a fair price if we sell in order to save our health?” But recent studies prove that it isn’t just those forced to live in the shadows of the turbines whose property values are diminished. Waterfront properties that have offshore wind turbines in their viewshed would have a “big impact on coastal tourism,” according to a study from North Carolina State University. The April 2016 report in Science Daily states: “if turbines are built close to shore, most people said they would choose a different vacation location where they wouldn’t have to see turbines.” The economic impact to the coastal communities is estimated to be “$31 million dollars over 20 years.”

A similar study done in Henderson, NY, found a proposed wind project could have “a total loss in property value of up to about $40 million because of the view of turbines.” An interesting feature of the NY study, not addressed in the NC one, is how the loss in property taxes, due to reduced values, will be made up. The Watertown Daily Times points out that most of the homes whose values “would fall sharply due to the view of turbines” are “assessed above $1 million.” It states: “homes in the $200,000 range without a view of turbines would probably see an increase in property taxes to make up for the overall drop in property values.” Robert E. Ashodian, a local resident, is quoted as saying: “If property values go down and the town isn’t going to spend less money, the tax rate is going to go significantly up for all of the homeowners who aren’t impacted.” Henderson Supervisor John J. Calkin expressed concern over the “devastating impact” the wind project would have on the town and school district.

Offshore wind turbines were supposed to offer a visual benefit, but they, obviously, bring their own set of problems.

The Financial Times reports: “Building wind farms out at sea, rather than on land where critics say they are an eyesore, has made these power stations a less contentious form of clean energy … But it also makes them dearer than most other power stations and many EU governments face pressure to cut green subsidies that opponents say raise electricity prices and make some industries uncompetitive.” The higher cost argument is what has caused Denmark—known as the international poster child for green energy and the first to venture into offshore wind power—to abandon the policies that subsidized the turbines. Cancelling the coastal wind turbines is said to “save the country around 7 billion Krones ($1 billion).” According to Bloomberg: “The center-right government of [Prime Minister] Lars Loekke Rasmussen wants to scrap an electricity tax that has helped subsidize wind turbines since 1998.”

The Danish People’s Party, the largest group in the ruling bloc, is part of the “policy about-face.” Party leader Kristian Thulesen Dahl says: “You have to remember this is a billion-figure cost that we’re passing on to the Danes.” He added: “We also have a responsibility to discuss the costs we impose on Danes over the next 10 years.”

Germany is facing similar problems with its green energy policies. Energy Digital magazine points out that Germany’s rapid expansion of green energy has “driven up electricity costs and placed a strain on the grid.” As a result, Germany has capped wind power expansion. In fact, subsidies—which drove the growth in renewable energy—are being cut throughout Europe. Bloomberg states: “Europe is falling out of love with renewables.”

Then, there are the U.S. utility companies who are forced to buy the more expensive wind-generated electricity due to an abused—but little known to the public—1978 law that was intended to help the U.S. renewable energy industry get on its feet. The Public Utility Regulatory Policies Act (PURPA) was designed to give smaller power players an entry into the market. If wind-turbine projects meet the guidelines, utilities must buy the electricity generated at “often above-market” costs. Instead, in many cases, big projects, owned by one company, get divided up into different parcels with unique project names, but are still owned by the major developer.

Energy Biz magazine reports: “PacifiCorp, for one, estimates that such abuses will cost its customers up to $1.1 billion in the coming decade by locking the company into unneeded electricity contracts at rates up to 43-percent higher than market price.” It quotes John Rainbolt, federal affairs chief for Wisconsin-based Alliant Energy: “Our customers essentially pay for PURPA power at 20-percent higher-than-market-based wind prices.” Led by Senator Lisa Murkowski (R-AK), Rep. Fred Upton (R-MI) and Rep. Ed Whitfield (R-KY), a move is underway in Congress to review the nearly 40-year-old legislation.

So, residents who live near wind turbines don’t want wind turbines. Nor do residents and renters who have them in the viewshed, governments looking to cut costs, utility companies, or ratepayers. And we haven’t even mentioned those who want to protect birds and bats. Scientific American just addressed the concern that “Bat killings by wind energy turbines continue.” It claims: “wind turbines are, by far, the largest cause of bat mortality around the world” and this includes three species of bats listed—or being considered for listing—under the Endangered Species Act. Bats are important because they eat insects and, therefore, save farmers billions of dollars in pest control each year. Scientific American reports that, in addition to dead hawks and eagles, thousands of bats are found under the wind turbines.

Who does want wind turbines?

Wind turbine manufacturers, the American Wind Energy Association, and the crony capitalists who benefit from the tax breaks and subsidies—which Robert Bryce, author of Power Hungry and Smaller Faster Lighter Denser Cheaper, reports total more than $176 billion “given to the biggest players in U.S. wind industry.” He states that the growth in wind energy capacity has “not been fueled by consumer demand, but by billions of dollars’ worth of taxpayer money.”

To address those who defend rent-seeking wind turbines and squawk about the favorable tax treatment the oil and gas sector gets, Bryce points out: “on an energy equivalent basis, wind energy’s subsidy is nearly three times the current market prices of natural gas.” Even billionaire Warren Buffett acknowledged that the only reason his companies are in the wind business is: “We get a tax credit if we build a lot of wind farms.”

(Note: Each of these stories is from just the past several weeks. There are far more concerns that could be addressed, but that would require a length beyond the attention span of everyone but policy wonks.)

If no one but the rent-seeking crony capitalists want wind turbines, why must people like Minnesota’s Langrud have to endure them? Because the wind energy lobby is powerful and “green energy” sounded good decades ago when the pro green-energy policies like PURPA were enacted.

However, as the Bloomberg story on Denmark points out: wind power is “a mature industry that no longer needs state aid.” Unfortunately, in December 2015, Congress extended the wind energy tax credits through 2021. But tweaks, such as reforming PURPA, can take place and a new president could totally change the energy emphasis—which would be good, because, it seems, no one really wants wind turbines.


Fukushima -- fact and fiction

Damaging myths about radiation

On March 11, 2011, Japan was struck by an earthquake and tsunami, which triggered a nuclear accident. Four years later and 9,000 kilometers away, it was February 2015, I was a master's student at the University of Edinburgh, and a guest lecture was about to begin by Japanese researchers on their work in Fukushima.

I knew there had been a nuclear accident in Fukushima. I assumed this had led to dangerous radiation levels and increases in cancers. I had never entertained the thought of visiting.

What happens next could be described as a clash between what I thought I knew and reality.

The researchers gave a series of presentations. They showed us what they had found in Fukushima; there were overwhelmingly low levels of internal and external radiation in residents,1,2 and a mass screening of babies and children revealed that none had detectable levels of internal radiation contamination.3 Yet, other health problems were emerging; in contrast to low levels of radiation, an increased burden of diseases unrelated to radiation, such as diabetes, cardiovascular disease, hypertension and more, was being found.4,5 Particular health risks associated with evacuation were highlighted,5 including evidence that immediate evacuation of the elderly from nursing homes was associated with three times higher mortality risk than non-evacuation.6 It was presented to us that radiation may not be the biggest problem for Fukushima.

I was surprised. This appeared to be, in fact, the exact opposite of what one may think about Fukushima. This surely was not the Fukushima I had heard of or visualized, and my curiosity was piqued. I talked to the researchers and proposed an idea for further research. They, in turn, invited me to come to Fukushima to write my Master's dissertation. I agreed.

In May 2015, I first arrived in Fukushima, and began research at Minamisoma Municipal General Hospital. I wrote my master's dissertation, graduated, and then was offered a full-time job at the hospital, which is where I am today.

There are a lot of things I could write about that I have learned from Fukushima. Yet one of the most unexpected parts of this experience has been the confrontation between what I thought I knew, and the reality which I found. There were few things in front of me in Fukushima that matched my original expectations, and I was struck by the feeling that I had been unaware of so much. Yet I also realized that the inaccurate ideas I previously held were surprisingly common. This has led me to think more than ever about what it means to 'know' something, in terms of both myself and others.

Because really, how do we know things? There's not one answer.

Talking about knowledge is difficult. Our own feelings and opinions can become what we know. Observations become what we know. The media can be said to be a source for knowledge. Science is a method of knowing.

But what happens when our knowledge does not reflect the reality of a situation? This brings me to the second biggest thing I have learned since coming to Fukushima: the damage of misinformation. Or in other words, how the ideas that I previously held and continue to see in others can be dangerous.

I never saw the actual results of misinformation until I moved to Fukushima. Now, I see them everywhere.

There is not one all-encompassing example, but we can start by talking about rumors and stigma. A particular problem here has been misinformation about radiation levels and the health implications of such levels; I have heard from many residents about the ways their lives have been affected because of the incorrect information held by others. When trying to evacuate, some were turned away from the homes of their families because radiation was misunderstood as contagious. I am told about the parents of young men, opposing their choice to marry a woman from Fukushima because it is assumed that she will not be able to bear healthy children. Some children themselves believe they will never be able to have healthy offspring in the future, because of what they have heard. There are unending examples.

This is not a beautiful subject to talk about, in fact, this is a terrible subject to talk about. And it is made worse when considering that these beliefs directly contradict what is being found scientifically. Recently, the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) formally predicted that there will be no effects of radiation exposure on the health of the general public in Fukushima.7 It was additionally highlighted that there are no expected hereditary or genetic effects that will be seen in new generations.7 The misinformation that has led to stigma and subsequent disruption of lives here therefore appears to be in conflict with the reality of the situation; an example of the tragic impacts mistaken knowledge can have on the lives of disaster-affected populations.

Lots of people say they want to help those in Fukushima. Many specifically mention the children of Fukushima. For this purpose, one of the most common programs I have seen is summer camps specifically for children from Fukushima. Yet, a trend is that these camps often take place outside of Fukushima prefecture. Some programs do not explicitly explain the reason for this, while others market it as an opportunity for respite from the radiation, a chance to run around and play outdoors in nature. And I wonder, are these organizers, these people who say they want to help the children in Fukushima, are they aware of the actual radiation levels here? Are they aware of the beautiful nature in this prefecture, and that it is safe for children to play outside in most places? Of course summer programs for children are great, and I would want any child to have the experience of a fun summer.

But I wonder, do these programs come with the cost of marking these children as victims of their prefecture? I wonder, are the foundations of these camps based on scientific information, or opinion? I wonder, would these camps be more beneficial and allow for more participants if they were held inside Fukushima prefecture? If we really want to make a difference and help people, we should base our actions on reality to be most effective, shouldn't we? But the camps are just the tip of the iceberg. Some people suggest that all the children should be taken out of Fukushima.

Has anyone thought of the negative effects this may have on the lives and livelihoods of these individuals?

I actually had not, until I came here.

A nuclear disaster is a terrible event. It's understandable that people may react emotionally to an unexpected situation that carries risks. Perhaps it's easy to assume the worst, and to spread rumors. Yet, it is of paramount importance to be aware that misinformation carries consequences. Unfounded ideas have led to suffering, and misinformation is one of the biggest things to overcome for the future of Fukushima. I urge everyone to look deeper at the foundations of their knowledge, and to be aware of the reasons something may be viewed in a particular way. Ask yourself what you think about Fukushima, for example, and then why. The second step is to be grounded in information. Read things you agree with, and just as importantly, read things you disagree with. Read and consider everything; I have come to think that this is the only way to get as close to reality as possible without being present at the scene of an event.

Simultaneous realization of the limits of my own knowledge and the impacts that misinformation can have on the lives of people has been one of the most striking aspects of encountering Fukushima. I write this article in hopes that it may prompt others to assess the way they "know," in Fukushima and beyond. If we want to pragmatically help people or improve a situation, we must understand the reality of that situation first.

I moved to Fukushima because I realized that I didn't know enough, and I wanted to know more. I still want to know more, and I hope that others will want to know more too.



For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here


Friday, June 24, 2016

New crop varieties 'can't keep up with global warming'

How ridiculous can you get?  Do the BBC have no pride, publishing this excreta?  For a start, global warming has been so slight in the 21st century that there is debate over whether it exists at all.  It is certainly not racing ahead in the way the article below implies.

Secondly, we don't need new crop varieties.  We can just use ones we already have.  There are heaps of areas on earth that are very hot and which grow crops.  A warming world would simply see them more widely used.  Just as a minor example of heat adaptation, the tropical Australian city of Townsville produces grapes, normally a cool-climate crop.  And what is the effect of growing grapes there?  They are bigger and juicier and reach the table up to a month before most other table grapes.  We ALREADY have heat-adapted crops if we need them.

A large muscadine grape native to sub-tropical Florida

Some very tasty Chambourcin grapes from Townsville

Warmer temperatures tend to suit crops, in fact, which is why the greatest biodiversity is in the tropics.  And maize is just such a plant.  It is "cold-intolerant".  It likes warmth.  It is already grown in temperatures up to 35C in India.  The most usual limitation on maize crops is drought.  But warming oceans should give off more water vapor -- which comes down as rain -- so maize should get more water and yield very well in a warming world.

And I suppose I should mention the obvious: According to Warmist theory, there will be lots more CO2 in the atmosphere of a warming world. And plants LOVE CO2. They suck it up. It's the raw material that they use to build themselves. So again, a warmer world would be a CO2-rich world in which plants would flourish as never before.

So a bit of global warming would IMPROVE maize crops. The picture below of the sad lady holding maize ears is just another example of Warmists lying with pictures.

Crop yields around the world could fall within a decade unless action is taken to speed up the introduction of new varieties. A study says temperatures are rising faster than the development of crop varieties that can cope with a warmer world.

In Africa, researchers found that it can take 10-30 years before farmers can grow a new breed of maize. By the time these new crops are planted, they face a warmer environment than they were developed in.

The scientists behind the study, published in the journal Nature Climate Change, looked closely at the impact of temperature rises on crop duration - that's the length of time between planting and harvesting.

They found that in a warmer world durations will be shorter, meaning these varieties will have less time to accumulate biomass, and yields could be affected.
Out of date

In their paper, the researchers write that crop duration will become significantly shorter as early as 2018 in some regions but by 2031, the majority of maize-growing areas of Africa will be affected.

"The actual changes in yield may be different but this effect is there, the impact of this change in duration will occur unless breeding changes," said lead author Prof Andy Challinor from the University of Leeds.

"The durations will be shorter than what they were bred for - by the time they are in the field they are, in terms of temperature, out of date."

The scientists say the lag is down to a combination of factors, including the limited number of crops you can grow in a season, the need for government-approved testing, and problems of access to markets that can increase the time it takes before farmers have the new seeds to plant.

"We can use the climate models to tell us what the temperatures are going to be," he told BBC News, "We can then put those temperature elevations into the greenhouses and then we can breed the crops at those temperatures. People are beginning to do this, but this paper provides the hard evidence of the necessity of it."

Researchers are also working on the impact of heat stress on crops at sites in Zimbabwe, Kenya and Ethiopia. Data from these trials is being used to identify species that could cope with warmer conditions.

But would the use of genetic modification (GM) help speed up this type of work? "GM does some things faster, so you would get a new variety of crop faster," said Prof Challinor.

"But it doesn't get you out of the testing requirement; in fact the testing may be greater. And it doesn't help at all with farmers accessing seeds and markets - the problem will remain even for a magic GM crop."

Better techniques and more money for research are the keys, according to others in this field who are familiar with the study.

"Investment in agricultural research to develop and disseminate new seed technologies is one of the best investments we can make for climate adaptation," said Dr Andy Jarvis, from the International Centre for Tropical Agriculture.

"Climate funds could be used to help the world's farmers stay several steps ahead of climate change, with major benefits for global food security."

The researchers believe that the study also has implications beyond Africa, especially in the maize growing regions of the tropics.


How ironic that the modern green movement got started with the book Silent Spring and a concern for bird deaths

Hard to believe that this is a green thing to do. (Via J. Munshi)

The Nanny State Advances: Statement on Passage of Anti-Soda Tax in Philadelphia

In the first success of its nature for “nanny state” advocates after many years of trying, Philadelphia Thursday became the first major city to attempt to control the non-alcoholic drink choices of its residents by enacting a 1.5-cent-per-ounce tax on soda, tea, sports and energy drinks. This is expected to embolden nanny state tax advocates across the United States.

The tax, like others on food and food-related items, will fall disproportionately on lower income individuals.

The National Center for Public Policy Research’s director of Risk Analysis, Jeff Stier, is available to speak with reporters and has a statement:

The only good thing about Philadelphia’s newly-imposed soda tax is that proponents were somewhat honest about it, admitting it wasn’t about improving public health. Instead, they admitted it was a money grab, albeit a highly regressive one.

Perhaps it was a wise tactical move, because soda-tax campaigners have failed to persuade scientists or the public that the tax reduces caloric consumption, obesity, or diabetes.

Adding to the absurdity of this tax, Philly's tax treats diet soda and full-sugar soda alike, failing even to distinguish between sugary drinks, which, like all caloric food and beverages, can contribute to obesity, and zero- or low-calorie beverages. Similarly, advocates across the country are pushing to equalize cigarette and e-cigarette sin taxes, the latter of which is primarily used by adult smokers trying to lower their risk. If soda was the new tobacco, diet soda is now the new e-cigarette.

In March, Stier told the Daily Caller that “Soda tax proponents are asking us to suspend normal assumptions about human behavior and simply assume that people who reduce soda consumption to avoid the tax, won’t just make their own sugary drinks and won’t replace the calories with other high-calorie foods or drinks.”

In an op-ed in the Houston Chronicle in 2014, Stier explained the real rationale for soda taxes: “Rather simply, it is Sutton’s Law. The ‘law’ is named after the infamous American bank robber Willie Sutton, who was incorrectly credited with answering a reporter who asked him why he robs banks by saying, ‘That’s where the money is.'”


It’s more like global LUKEwarming

Turning to the investigation of climate change: What do we know about climate?  Climate has always changed, is changing, and will always change. There were times when the earth was much colder or warmer than it is now, and during both those circumstances CO2 levels were at times higher or lower than now. Solar cycles, volcanic activity, greenhouse gasses, ocean currents, and macro weather patterns such as El Nino/La Nina all have an effect on climate. Our understanding of climate most evidently suggests there is much we don’t understand about climate.  It would therefore stand to reason that any investigation of the human influence on climate should begin with a broadly exploratory study of climate and the factors influencing climate.

However, that has not been the case. The Intergovernmental Panel on Climate Change (IPCC) temperature modeling is based on the following deductive reasoning:  CO2 is a relatively abundant greenhouse gas.  The noncontroversial physics of atmospheric CO2 predicts that a doubling of atmospheric CO2 concentrations should result in a temperature increase of 1.1 - 1.2 degrees C.  The IPCC computer modeling further incorporates a 2-3X or more amplification of the predicted CO2 temperature increase, postulating that CO2-driven warming will warm the oceans, creating more water vapor - a greenhouse gas - and thereby amplifying the CO2 greenhouse temperature effect.
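The deductive chain described above reduces to simple arithmetic. The sketch below is purely illustrative (it is not the IPCC's actual model code): warming is assumed to scale with the logarithm of the CO2 concentration ratio, using the roughly 1.2 degrees-per-doubling figure and the 2-3X amplification quoted above.

```python
import math

def warming_from_co2(concentration_ratio, degrees_per_doubling):
    """Equilibrium warming, assuming warming scales with the
    logarithm of the CO2 concentration ratio."""
    return degrees_per_doubling * math.log2(concentration_ratio)

# No-feedback physics: about 1.2 C for a doubling of CO2
no_feedback = warming_from_co2(2.0, 1.2)

# With the 2-3X water-vapour amplification the models assume
amplified_low = warming_from_co2(2.0, 1.2 * 2)
amplified_high = warming_from_co2(2.0, 1.2 * 3)

print(no_feedback, amplified_low, amplified_high)
```

The point of the logarithmic form is that each doubling contributes the same increment, so on these assumptions going from 280 to 560 ppm yields the same warming as going from 560 to 1120 ppm.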

So how well has this deductive reasoning predicted the observed reality?

John Christy, a climate expert from the University of Alabama, gave the following report on climate change to a joint meeting of Senate and House committees on Dec. 8, 2015.

He first compared the observed temperature data to the IPCC computer modeled temperature for the middle troposphere.   The troposphere is the earth’s active weather zone, and extends from the surface to around 40,000 feet.  The observed temperature record was a product of two different temperature measurements – balloon data and satellite data.

The balloon data is a compilation of four separate data sets from weather balloons launched twice a day simultaneously across the world, so as to get a snapshot of the physical properties of that day's atmosphere.  These balloon launches have occurred twice daily since 1979.  The satellite temperature recordings go back 35 years and are derived from measuring microwave emissions from diatomic oxygen in the lower atmosphere, which turns out to produce a much more accurate temperature measurement than standard mercury-in-glass instruments.

The data demonstrates that for the 36-year period from 1979 to 2015, the observed tropospheric temperature was less than that predicted by the mean of the 102 computer models, and at times significantly so.  Over that time period, the observed warming has been roughly one-third of that predicted by the models.  This data also shows the observed tropospheric temperature increase over the last 10 years has been less than 0.05 degrees C.

Dr. Christy also compared the most recent revision of each of the five observed global temperature records to the average of the 108 IPCC climate models' predicted temperatures.  His analysis demonstrates that for all periods from 10 years (2006-2015) to 65 years (1951-2015) in length, the observed temperature trend was in the lower half of the climate model temperature predictions, and for several periods the observed trend lies very close to (or even below) the 2.5th percentile of those predictions.
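Christy's comparison of an observed trend against a spread of model trends is essentially a percentile ranking. The sketch below illustrates the method only; the trend numbers are invented for the example, not taken from his testimony.

```python
def percentile_of(observed, model_trends):
    """Per cent of model trends that fall at or below the observed trend."""
    at_or_below = sum(1 for trend in model_trends if trend <= observed)
    return 100.0 * at_or_below / len(model_trends)

# Hypothetical per-decade trends for 102 models, evenly spread,
# and a hypothetical observed trend near the low end of the spread.
models = [0.10 + 0.005 * i for i in range(102)]
observed = 0.112

p = percentile_of(observed, models)
print(round(p, 1))  # 2.9 -- the observation sits near the models' 2.5th percentile
```

An observed trend ranking at or below the 2.5th percentile means fewer than about 1 in 40 model runs warmed as slowly as the real world, which is why that threshold is treated as notable.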

This empirical data also demonstrates that a "pause" or "slowdown" in the rate of global warming has taken place over the past 15 years - a period during which more than 100 billion tons of carbon dioxide have been released into the atmosphere.

This pause has only recently been acknowledged in the climate change scientific journals. One such article, whose authors included Michael Mann, the Penn State climatologist accused of fudging data to create the famed hockey-stick-shaped global warming graph, states, "It has been claimed that the early-2000s global warming slowdown or hiatus, characterized by a reduced rate of global surface warming, has been overstated, lacks sound scientific basis, or is unsupported by observations. The evidence presented here contradicts these claims."

Climate scientists have proposed over 40 explanations for the warming hiatus including particulate matter from small volcanoes and pollution, ocean movements, data gathering problems, natural variability, and several more. The 40-plus explanations can’t all be right, but all potentially provide insight into better understanding climate change. The pause tells us that there is significant underlying natural climate variability. The pause tells us that our knowledge of climate change is limited and incomplete.  The pause tells us that the science is not settled.

Given that the observed rate of warming in the satellite-sensed and balloon data is barely a third of that predicted by global climate models, it is both reasonable and prudent to cut the modeled temperature forecasts for the rest of this century by 50 percent.

Most experts believe that warming of less than 2 degrees Celsius from pre-industrial levels will result in no net economic or ecological damage.  In fact, for up to two degrees of total warming, the benefits will generally outweigh the harmful effects. Warming of up to 1.2 degrees Celsius over the next 70 years (0.8 degrees have already occurred), most of which is predicted to happen in cold areas in winter and at night, would extend the range of farming further north, improve crop yields, slightly increase rainfall (especially in arid areas), have a continued greening effect on the earth, and lower winter-related deaths.

What conclusions should be drawn from the observed – as opposed to computer predicted – temperature data?

First, our knowledge of climate and climate change remains limited and incomplete: the science is not settled!  Second, models are not evidence.  Finally, given the huge political and economic implications of climate policy, climate change study merits a vigorous, broad and open-ended investigation - not research to confirm a pre-ordained conclusion.


Report: World Not Building Enough Nuclear Power To Fix Global Warming

A report published Tuesday by the World Nuclear Association found reactors are not being built quickly enough to meet the world’s global warming goals.

The report found 1,000 gigawatts of new nuclear capacity need to be added by 2050 to come close to limiting global warming. A single gigawatt of power provides enough energy for roughly 700,000 homes.

That means roughly 100 new nuclear power plants need to be built worldwide by 2050, but only three were constructed last year. The report blames the slow rate of construction on a lack of public support in Europe and tough economic conditions in America. It also points out that Japan’s permanent shutdown of six reactors since the Fukushima accident in 2011 has substantially slowed the industry’s growth.

“The situation facing the nuclear industry globally is challenging,” Agneta Rising, director general of the World Nuclear Association, stated in the report preface. “Substantial progress has also been made towards the commercialization of small and advanced reactor designs. The rate of new build is, however, insufficient if the world is to meet the targets for reducing the impacts of global warming.”

America currently operates 99 nuclear reactors across 61 commercially-operated nuclear power plants, according to the Energy Information Administration. Of the 66 new nuclear reactors under construction worldwide, only four of them are being built in the U.S. — just enough to compensate for shutting down older reactors. Instead of building more modern reactors, the government is planning to simply extend the operating licenses against the advice of its own technical staff. It takes an average of 73 months to construct a new nuclear reactor, according to the report.

The average American nuclear reactor is 35 years old, nearly obsolete by modern design standards and near the end of its operating license. Within the past two years, six states have shut down nuclear plants and many other reactors are risking premature retirement. America could get less than 10 percent of its electricity from nuclear by 2050, according to the International Atomic Energy Agency.

Sixteen American nuclear reactors are more than 42 years old, according to government data compiled and mapped in April by The Daily Caller News Foundation.

Other countries haven’t shown the same reluctance as the U.S. to embrace nuclear power. India has a rapidly growing nuclear power program and the country plans to get 25 percent of its electricity from nuclear reactors by 2050. China is also planning to build new nuclear plants and has plans to build 20 floating nuclear reactors in the South China Sea, strengthening its claim to the valuable and disputed region. The country plans to have 150 gigawatts of nuclear power by 2030, according to the World Nuclear Association.

The average nuclear plant employs between 400 and 700 highly skilled workers, has a payroll of about $40 million and contributes $470 million to the local economy, according to the Nuclear Energy Institute.


Solar and wind power simply don’t work — not here, not anywhere

By Keith DeLacy, a former Labor treasurer of Queensland, Australia.

One policy which seems to have escaped scrutiny during this election campaign is Labor’s commitment to increase the Renewable Energy Target to 50 per cent by 2030. I am surprised because it is a proposal that has enormous ramifications for economic growth and living standards, and disproportionate impacts on traditional Labor constituencies.

The problem we have in Australia is when we talk renewable energy we are talking wind and solar only — low value, expensive, unreliable, high capital cost, land hungry, intermittent energy.

According to the Department of Industry and Science wind currently generates 4.1 per cent and solar 2 per cent of Australia’s electricity. But even this is highly misleading because it is such low value power. You could close it down tomorrow (which it regularly does by itself) and it would make no difference to supply.

If we talk about total energy, as opposed to just electricity, wind and solar represent 1 per cent of Australia’s energy consumption. This despite billions of dollars of investment, subsidies, creative tariffs, mandates, and so on.
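Taking the article's figures at face value, a quick cross-check backs out the implied share of electricity in Australia's total energy use. This is my own arithmetic, not a figure from the article:

```python
# Figures quoted above: wind 4.1% and solar 2% of electricity,
# but together only about 1% of total energy consumption.
wind_electricity = 0.041
solar_electricity = 0.020
combined_energy = 0.01

combined_electricity = wind_electricity + solar_electricity  # 0.061

# If x is electricity's share of total energy, then
#     combined_electricity * x = combined_energy
x = combined_energy / combined_electricity
print(round(x, 2))  # 0.16 -- electricity is roughly a sixth of total energy
```

That is in the right ballpark for electricity's share of final energy use, so the two figures quoted are at least mutually consistent.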

Solar and wind simply don’t work, not here, not anywhere.

The energy supply is not dense enough. The capital cost of consolidating it makes it cost-prohibitive. But they are not only much more expensive because of this terminal disadvantage, they are low-value intermittent power sources - every kilowatt has to be backed up by conventional power, dreaded fossil fuels. So we have two capital spends for the same output - one for the renewable and one for the conventional back-up. Are you surprised it is so much more expensive, and inefficient, and always will be? So wind and solar, from a large-scale electricity point of view, are duds. Now I know that will send the urgers into paroxysms of outrage. But have you ever seen an industry that so believed its own propaganda? Note that when they eulogise the future of renewables they point to targets, or to costly investments, never to the real contribution to supply.

Let’s look overseas where many countries have been destroying their budgets and their economies on this illusion for longer and more comprehensively than we in Australia. The Germans are ruing the day they decided to save the world by converting to solar and wind. Germany has spent $US100bn on solar technology and it represents less than 1 per cent of their electricity supply.

Energy policy has been a disaster. Subsidies are colossal, the energy market is now chaotic, industry is decamping to other jurisdictions, and more than a million homes have had their power cut off.

It is reported electricity prices in Germany, Spain and the UK increased by 78 per cent, 111 per cent and 133 per cent between 2005 and 2014 as they forced additional renewable capacity into their electricity markets. Sunny Spain used to be the poster boy for renewables in Europe — photovoltaic cells and wind turbines stretching on forever. Now they are broke, winding back subsidies, even the feed-in tariffs which were guaranteed for 20 years. But wait, what about the green energy jobs that everybody gushes about? Spain has an unemployment rate of 21 per cent with a youth rate of 45.5 per cent.

Britain is little better. Subsidies are being wound back, and a Department of Energy report points out that in 2013, the number of households in fuel poverty in England was estimated at 2.35 million representing around 10.4 per cent of all households.

It is no better in the US either. States with renewable energy mandates are backtracking faster than Sally Pearson can clear hurdles. Ohio has halved its mandate level (it was 25 per cent by 2025) because of high costs. West Virginia has repealed its mandate because of high costs, and New Mexico has frozen its mandates. Kansas was repealing its mandate which reportedly would save ratepayers $171m, representing $4367 for each household, and so the dismal story goes on. The US Department of Energy has found electricity prices have risen in states with mandates twice as fast as those with no mandate. As of 2013 California was the only state to adopt a feed-in tariff for solar power. It was immediately dubbed a failure by the renewable energy community because it offered only 31 cents per kWh, only five times the rate for conventional base load power.

Ah, but Asian countries are jumping on the bandwagon. Maybe. China built one new coalfired power plant every week in 2014, and India’s coal-powered investment in that same year equalled the total electricity capacity of NSW and Queensland. To summarise — with all of the trillions spent worldwide on wind and solar, wind currently represents 1.2 per cent of global consumption of energy, and solar 0.2 per cent.

The good news, it is possible to reduce fossil fuel use in electricity generation — through hydro-electricity and nuclear fuel. Plenty of countries have done it — Canada 60 per cent hydro and 15 per cent nuclear; Sweden 45 per cent hydro and 48 per cent nuclear; Switzerland 54 per cent hydro and 41 per cent nuclear; France 11 per cent hydro and 79 per cent nuclear.
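Tallying the hydro-plus-nuclear shares quoted above makes the point directly; the percentages below are exactly those in the paragraph.

```python
shares = {  # per cent of electricity generation: (hydro, nuclear)
    "Canada": (60, 15),
    "Sweden": (45, 48),
    "Switzerland": (54, 41),
    "France": (11, 79),
}

for country, (hydro, nuclear) in shares.items():
    print(f"{country}: {hydro + nuclear}% of electricity from hydro and nuclear")
```

On these figures each country gets 75-95 per cent of its electricity from the two low-emission sources combined.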

But Australia has zero tolerance of these two workable alternatives to fossil fuels. At least we are consistently inconsistent.

So where does that leave us? On the basis of evidence everywhere we could easily double the price of electricity and get nowhere near the 50 per cent target. What would that mean?

First, it means rapidly disappearing blue-collar jobs in high-energy industries like manufacturing, car and ship building, smelting and refining, steel making and food processing. There may still be some construction jobs, but they will largely be assembly only, as all of the components will come from those countries more interested in growing the economy and eliminating poverty than in stoking the warm inner glow. Make no bones about it, a clean green economy has no place for high-vis shirts.

Second, rapidly rising electricity prices, and the subsequent increase in the cost of living, disproportionately affect those at the bottom of the income scale.

Policies like this are OK for the Greens. They can keep their virtue intact because they never have to deliver. As Gough Whitlam once said, only the impotent are pure.

Mainstream parties don’t have that luxury. They need to look at the true costs, and benefits, of all policy proposals.





Thursday, June 23, 2016

Which animals will cope with climate change droughts?

One has to assume that the Warmist prophecies below will be as good as all the other Warmist prophecies:  Totally useless and wrong.  Prophecy is a mug's game and those who engage in it reveal themselves as mugs.  And why is prophecy needed anyway?  Australia is always having a drought somewhere so if response to drought is of interest, go out and observe it directly instead of theorizing about it from your armchair

And the basic assumption below is a crock -- that warming would cause drought.  It would not.  Warmer water gives off more water vapour, which eventually comes down as rain.  Flooding might be a problem, not drought.

So the whole story below is just an arid intellectual exercise. Tasmin Rymer should stick to rhymes

James Cook University, Australia

Summary: Scientists believe the current rate of climate change is unprecedented in Earth's history and will lead to more and worse droughts in many areas. Now a research team may have found a way to predict which mammals will best cope with drought -- and which won't do so well.

JCU's Dr Tasmin Rymer led a study that produced a template measuring several crucial factors, including an animal's physiology and environment, to determine how it would handle a severe drought.

Dr Rymer said scientists believe the current rate of climate change is unprecedented in Earth's history and will lead to more and worse droughts in many areas.

"So we developed a theoretical framework that allows researchers to estimate the likelihood that a species will be able to cope," she said.

Dr Rymer said the "Adaptive Triquetra" model considers the primary driving stressors of droughts: temperature, limited water, and reduced food availability. Then it looks at how well an animal's specific body system could mount a response, and the extent to which its traits are adaptable.

"We have provided a comprehensive suite of traits to consider when making predictions about species' resilience to drought. It's designed to help scientists assess the potential for a species or population to cope with increasing aridity," she said.

Dr Rymer said the process is more complex than it sounds, with much work still needed to fully determine the characteristics of many species before the model can be applied to them.

She said the Adaptive Triquetra is still a conceptual framework in need of empirical testing, but held great promise for fine-tuning wildlife management.

"If you found a species was particularly vulnerable to water stress, such as in a drought, you might design a management plan that provides access to artificial water points. If you found a species was vulnerable to increased temperatures, you might provide subterranean shelters."

Dr Rymer said in one example of where the model would have been useful, managers of a reserve in South Africa assumed their animals were suffering from lack of water during a drought, but in fact they had denuded the vegetation around their few artificially-built water holes and the animals were starving.

"If they had dropped fences and spaced water sources widely apart, this would have promoted movement and foraging over a wider area. Our model may have suggested this course of action if it had been in use," said Dr Rymer.

"Knowing which species are at risk and what stressors have the greatest impact allows for more effective management strategies to be put into place."


Did ‘Stonewall’ Jackson Sleep Here? Farmer Sues Green Group Over Claim

Martha Boneta is still duking it out with the Greenies

No historical evidence locates Confederate Gen. Thomas “Stonewall” Jackson on what is now Liberty Farm in Fauquier County, Virginia, on the evening of July 18, 1861, three days before the First Battle of Bull Run.

So why would a prestigious state preservation group represent that as a fact?

The current owner of the farm, Martha Boneta, has sued the Piedmont Environmental Council, a nonprofit land trust, accusing the organization of knowingly making a false historical claim when selling her the property.

The environmental council, Boneta claims in the lawsuit, told her the celebrated Civil War general bivouacked on the open fields surrounding the farm.

Liberty Farm, also known as Paris Farm, is located about an hour’s drive outside Washington, at the foot of the Blue Ridge Mountains in the rural village of Paris. Boneta purchased the property in 2006, 145 years after Jackson’s supposed overnight stay.

While negotiating the real estate transaction with Boneta, according to her suit in Fauquier County Circuit Court, the environmental council presented her with a document describing Jackson’s movements and coordinates in July 1861.

After a “strenuous march” from Winchester, Virginia, Jackson and his men spent the night nearby, according to the deed detailing terms of the conservation easement that was part of the environmental council’s sale of the property to Boneta.

The next day, July 19, Jackson resumed his march to what was then Piedmont Station and is now Delaplane, the environmental council’s document explains. From there, it says, Jackson and his troops boarded a train on their way to what would be called the First Battle of Bull Run.

Also known as the Battle of First Manassas, Bull Run was where Jackson earned the nickname “Stonewall.”

The “Oak Grove” situated on the high point of the property “is recognized as the heart of the bivouac of General Thomas ‘Stonewall’ Jackson’s men on the evening of July 18, 1861 … ,” the document says.

But here’s the problem: Even a general as nimble and agile as Jackson could not be in two places at the same time.

Historians believe he camped in the vicinity, but don’t agree on where. One historian, in an email to the Virginia Outdoors Foundation, insists Jackson did not overnight on the land that is now Liberty Farm.

Jackson, who served under Gen. Robert E. Lee, was a decisive factor in significant Civil War battles until he was fatally wounded by friendly fire at age 39 during the Battle of Chancellorsville in May 1863.

Boneta’s lawsuit against the Piedmont Environmental Council, filed in May, argues that the organization’s linking of Jackson to her property was not just a mistake but a deliberate act of fraud.

“Stonewall Jackson did not bivouac on Paris Farm on the eve of the First Battle of Bull Run,” Boneta’s suit says, adding:

PEC knew, in negotiating with Ms. Boneta for the sale of the Paris Farm, that its representation regarding Stonewall Jackson bivouacking on the Paris Farm prior to the First Battle of Bull Run was false. PEC’s knowledge of the falsity of its claim regarding Stonewall Jackson bivouacking on Paris Farm is demonstrated by the fact that PEC claimed that another one of its properties was the scene of Stonewall Jackson’s famous night watch.

The false historical designation, the suit claims, greatly inflated the purchase price of the property beyond its actual value and restricted Boneta from accessing roughly 18 acres for agricultural operations.

The dispute over Stonewall Jackson’s whereabouts that day in 1861 is the latest wrinkle in a long, complicated dispute between Boneta and the Piedmont Environmental Council that reaches back to 2009. That’s when Boneta first filed suit against the land trust, accusing it of violating the terms and conditions of the conservation easement.

Boneta purchased the 64-acre property for $425,000 on July 31, 2006. Without the historical designation involving the Civil War general,  the property actually was worth $100,000—the value the environmental council listed in 2005 tax filings, according to the suit.

“PEC’s knowing, false representations to Ms. Boneta were made in order to obtain more money from Ms. Boneta for Paris Farm than she would otherwise have paid for it and more money from Ms. Boneta than Paris Farm was worth at the time,” the suit alleges.

The environmental council’s willingness to invoke what turns out to be dubious history as a way to restrict Boneta’s farming operations points to the disproportionate influence green groups have acquired across the country, an energy policy analyst who has followed the story closely told The Daily Signal.

“Under the guise of practicing conservation, land trusts—operating with little, if any, oversight—are becoming states within a state,” said Bonner Cohen, a senior fellow at the National Center for Public Policy Research, adding:

Lording over millions of acres throughout the U.S., they have learned that they can harass property owners and force them into prohibitively expensive litigation with impunity.  By claiming—in the absence of any evidence—that Martha Boneta’s farm was of particular historical significance, the PEC could both limit her ability to use her land and line their pockets by jacking up the sales price of the property they sold her.  We’re living in an age of green robber barons.

In November 2014, the Virginia Outdoors Foundation, which holds the easement with the Piedmont Environmental Council, “indicated that there was no accurate historical evidence in support of the PEC’s claim,” according to the suit.

In response to requests from The Daily Signal under the Freedom of Information Act, the Virginia Outdoors Foundation released email records detailing the role it played in obtaining historical information that raised questions about Jackson’s precise location on July 18, 1861.

Reached by telephone, a spokeswoman for the Piedmont Environmental Council told The Daily Signal that the land trust “disputes the claims” in Boneta’s suit but “cannot comment further” on ongoing litigation.

Jason McGarvey, spokesman for the Virginia Outdoors Foundation, told The Daily Signal that his agency had done its own research before hearing from the local historian who disputed the claim that  Jackson camped on what is now Boneta’s property.

“We had already determined that the documentation supporting the restrictions in the deed did not meet our standards for stewardship,” McGarvey said in a June 17 email.

‘Well-Heeled’ Activists

Matthew Vadum, senior vice president of the Capital Research Center, which studies politically oriented nonprofits, describes the Piedmont Environmental Council as a “well-heeled activist group” that is well positioned to wage legal battles. Indeed, from 2005 through 2014, the land trust pulled in $55.4 million in donations and other revenue, according to publicly available tax  filings.

Boneta’s suit says she fenced about 18 acres of her property in response to the environmental council’s historical claim. Erecting, maintaining, and removing the fencing cost about $18,000, the suit says.

Boneta also spent thousands of dollars to trademark the name Liberty Farm with the U.S. Patent and Trademark Office, so her property could be marked for guided historical tours.

Her suit accuses the environmental council of fraud and breach of contract, and asks for no less than $325,000 in compensatory damages to cover what she considers an inflated sale price and the loss of the use of her property.

Cohen, the energy policy analyst, said he sees a financial motivation behind the environmental council’s tactics.

“By boasting about all the land it has ‘saved,’ including the Stonewall Jackson fabrication,” he told The Daily Signal,  “the PEC can receive millions of dollars in government grants and it can hustle donors to write even bigger checks to the green money machine.”


Going Out With A Bang: Could Algal Sex Save The Great Barrier Reef?

It’s no secret that the domestic situation between corals and the algae that live inside them has become a little heated in recent months, but scientists may have found a way to get that steamy relationship back on track.

First, a bit of background: The mass coral bleaching that has savaged the Great Barrier Reef over recent months occurred because of unusually warm ocean temperatures, driven by climate change and an El Nino weather system.

The bleaching starts when corals expel a type of algae that normally lives inside them and gives them their colour. When the water becomes too warm, the algae get all hot under the collar and start producing toxins that damage the corals.

That’s why the algae get turfed out. But the algae are the coral’s main source of food, so the corals starve, are bleached white, and are eventually overrun by a different kind of algae.

Clearly, it’s a marriage in crisis – which is why scientists have mounted an intervention.

New research published in the Journal of Molecular Biology and Evolution has revealed that the water of the Coral Sea isn’t the only thing that has been getting hot of late.

The algae appear to have responded to the conditions by starting to reproduce sexually, instead of asexually, and it turns out this promiscuity could help save the corals’ relationship with their special algae friends too.

The difference is that when the algae reproduce asexually, they produce a more-or-less identical copy of themselves. If they reproduce sexually, different algae’s genetic codes get spliced together, which produces new variants of algae.

The algae that can stand the heat are less likely to get all toxic, and therefore less likely to be sent to the dog-house by the corals, which are in turn less likely to bleach. It’s a raunchy sort of survival of the fittest.

Professor Madeleine van Oppen, from the Australian Institute of Marine Science, was one of the scientists involved in the study. She said the findings are “critical in terms of developing more climate-resilient algae and corals”.

The algae’s sexual reproduction was only a small part of the study. The main finding was that some algae use a mechanism to switch on genes which produce special proteins in order to protect themselves from heat exposure and mop up some of the toxic chemicals that poison their symbiotic relationship with the coral.

The sexual reproduction is important, though, because it speeds up evolution and might allow the algae to adapt quickly enough to tolerate the rise in sea temperatures.

It’s a bit of good news in a sea of bad, for those of us rooting for the Great Barrier Reef.


The Watermelons Are Here

“Recently I was foolish enough to try to reason with an environmentalist,” wrote Stanford economist Thomas Sowell. “But it became obvious that he had his mind made up and didn’t want to hear any evidence to the contrary. The pope is more likely to have read Karl Marx than an environmentalist is to have read even a single book that criticized environmentalism.”

One might say a lot about the Pope and Marx, but I want to focus on Sowell’s juxtaposition of the ideologies of socialism and environmentalism. Socialism is an economic and political ideology, but surely environmentalism is just a concern for the environment?

Sowell conflated these ideas because socialism and environmentalism have become opposite sides of the same coin. Socialists want to ban private ownership and favor government ownership and control over the means of production. Socialists believe that removing individual freedom of economic and political action results in a reduction of inequity and thereby brings about a just society in which everyone is equal.

But that seems a million light years away from the idea of cleaning up a roadside, protecting rare birds, or concern about polluted water. In such context the word ‘ideology’ seems inappropriate to apply to concern for a healthy environment. Most people, like myself, believe that it is proper and good to seek a fruitful and beautiful environment. If that is environmentalism then count me in.

Patrick Moore, a founder and past president of Greenpeace who has since left the group, prefers to call himself a ‘sensible environmentalist’ because he appreciates that the environmentalist movement has changed. It is, he says, no longer science based but “a political activist movement.” It has taken on the form of a total ideology erasing boundaries between radical activism and sensible environmentalism.

Moore identifies the point where the ideology of socialism co-opted ‘sensible’ environmentalism. In an interview with the Vancouver Sun he said, “The collapse of world communism and the fall of the Berlin Wall during the 1980s added to the trend toward extremism. The Cold War was over and the peace movement was largely disbanded. The peace movement had been mainly Western-based and anti-American in its leanings. Many of its members moved into the environmental movement, bringing with them their neo-Marxist, far-left agendas. To a considerable extent the environmental movement was hijacked by political and social activists who learned to use green language to cloak agendas that had more to do with anti-capitalism and anti-globalization than with science or ecology.”

Dany Cohn-Bendit, co-president of the group European Greens–European Free Alliance, exemplifies the all-too-common Marxist-Green connection. When he transformed himself from Dany the Red into Dany the Green he surfed the fashionable green political wave onto a deeper Red tide.

Cohn-Bendit said, “We have a project for Europe, an idea—the ecological transformation of our way of production and our way of life.” Says Dany the Green: “It’s for the survival of mankind.”

Self-described socialist activist Tom Athanasiou, director of U.S.-based EcoEquity, wrote “[E]nvironmentalism is only now reaching its political maturity.” He explains that there is a wonderful convergence of Red political concerns that Green concerns enable.

President Obama’s short-lived Green Jobs Czar, Van Jones, who self-identifies as a “communist,” explained why he was not on the streets burning down the system but instead working within it. “I’m willing to forgo the cheap satisfaction of the radical pose for the deep satisfaction of radical ends,” said he. He had discovered in environmentalism a means to satisfy his need for both the radical pose and Marxist ends because environmentalism serves policies he already believes in.

The ecosocialist current within the Green movement has become a red tide engulfing the planet. That is presumably why there is often a profusion of hammer and sickle communist party flags proudly flown by Green activists outside climate conferences, while inside leaders like the late Hugo Chavez, the former president of Venezuela, insist that socialism is the path to saving the planet. At the last climate conference he attended, Chavez added, “Capitalism is the road to hell, to the destruction of the Earth.”

Edward Said once described environmentalism as “the indulgence of spoiled tree-huggers who lack a proper cause.”

That may be true to a certain extent. However, as I hope you see, for many in the green movement the environment is no longer the cause, but the vehicle. The environment, and climate change in particular, is the big sail at the backs of activists who have hijacked the green movement. They are watermelons—green on the outside, red on the inside.


Wind-Energy Sector Gets $176 Billion Worth of Crony Capitalism

Last month, during its annual conference, the American Wind Energy Association issued a press release trumpeting the growth of wind-energy capacity. It quoted the association’s CEO, Tom Kiernan, who declared that the wind business is “an American success story.”

There’s no doubt that wind-energy capacity has grown substantially in recent years. But that growth has been fueled not by consumer demand, but by billions of dollars’ worth of taxpayer money. According to data from Subsidy Tracker — a database maintained by Good Jobs First, a Washington, D.C.–based organization that promotes “corporate and government accountability in economic development and smart growth for working families” — the total value of the subsidies given to the biggest players in the U.S. wind industry is now $176 billion.

That sum includes all local, state, and federal subsidies as well as federal loans and loan guarantees received by companies on the American Wind Energy Association’s board of directors since 2000. (Most of the federal grants have been awarded since 2007.) Of the $176 billion provided to the wind-energy sector, $2.9 billion came from local and state governments; $9.4 billion came from federal grants and tax credits; and $163.9 billion was provided in the form of federal loans or loan guarantees.

General Electric — the biggest wind-turbine maker in North America — has a seat on AWEA’s board. It has received $1.6 billion in local, state, and federal subsidies and $159 billion in federal loans and loan guarantees. (It’s worth noting that General Electric got into the wind business in 2002 after it bought Enron Wind, a company that helped pioneer the art of renewable-energy rent-seeking.)

NextEra Energy, the largest wind-energy producer in the U.S., has received about 50 grants and tax credits from local, state, and federal entities as well as federal loans and loan guarantees worth $5.5 billion. That’s more than what the veteran crony capitalist Elon Musk has garnered. Last year the Los Angeles Times’s Jerry Hirsch reported that Musk’s companies — Tesla Motors, Solar City, and Space Exploration Technologies — have collected subsidies worth $4.9 billion. NextEra’s haul is also more than what was collected by such energy giants as BP ($315 million) and Chevron ($2.2 billion).

About $6.8 billion in subsidies, loans, and loan guarantees went to foreign corporations, including Iberdrola, Siemens, and E.On. Those three companies, and five other foreign companies, have seats on AWEA’s board of directors.

Many of the companies on the AWEA board will be collecting even more federal subsidies over the next few years. In December, the Congressional Joint Committee on Taxation estimated that the latest renewal of the production tax credit will cost U.S. taxpayers about $3.1 billion per year from now until 2019. That subsidy pays wind-energy companies $23 for each megawatt-hour of electricity they produce.

That’s an astounding level of subsidy. In 2014 and 2015, according to the Energy Information Administration, during times of peak demand, the average wholesale price of electricity was about $50 per megawatt-hour. Last winter in Texas, peak wholesale electricity prices averaged $21 per megawatt hour. Thus, on the national level, wind-energy subsidies are worth nearly half the cost of wholesale power, and in the Texas market, those subsidies can actually exceed the wholesale price of electricity.

Of course, wind-energy boosters like to claim that the oil-and-gas sector gets favorable tax treatment, too. That may be so, but those tax advantages are tiny when compared with the federal gravy being ladled on wind companies. Recall that the production tax credit is $23 per megawatt-hour. A megawatt-hour of electricity contains 3.4 million Btu. That means wind-energy producers are getting a subsidy of $6.76 per million Btu. The current spot price of natural gas is about $2.40 per million Btu. Thus, on an energy-equivalent basis, wind energy’s subsidy is nearly three times the current market price of natural gas.
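The energy-equivalent comparison above is simple arithmetic, and can be checked with a few lines; every figure used here is the one quoted in the text:

```python
# Check the energy-equivalent subsidy arithmetic quoted above.
PTC_DOLLARS_PER_MWH = 23.0   # federal production tax credit, $ per MWh
BTU_PER_MWH = 3.4e6          # energy content of one MWh in Btu, per the text
GAS_SPOT_PER_MMBTU = 2.40    # spot price of natural gas, $ per million Btu

# Express the subsidy per million Btu, then compare with the gas price.
subsidy_per_mmbtu = PTC_DOLLARS_PER_MWH / (BTU_PER_MWH / 1e6)
ratio = subsidy_per_mmbtu / GAS_SPOT_PER_MMBTU

print(f"subsidy: ${subsidy_per_mmbtu:.2f} per million Btu")  # $6.76
print(f"ratio to gas spot price: {ratio:.1f}x")              # 2.8x
```

The result matches the article's claim: the subsidy works out to $6.76 per million Btu, nearly three times the quoted gas price.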

MidAmerican Energy Company, a subsidiary of Berkshire Hathaway, has a seat on AWEA’s board. Berkshire’s subsidy total: $1.5 billion — and it’s primed to collect lots more. In April, the company announced plans to spend $3.6 billion on wind projects in Iowa. Two years ago, Berkshire’s CEO, Warren Buffett, explained why his companies are in the wind business. “We get a tax credit if we build a lot of wind farms. That’s the only reason to build them,” he said. “They don’t make sense without the tax credit.”

Keep in mind that the $176 billion figure in wind-energy subsidies is a minimum number. It counts only subsidies given to companies on AWEA’s board. Not counted are subsidies handed out to companies like Google, which got part of a $490 million federal cash grant for investing in an Oregon wind project. Nor does it include the $1.5 billion in subsidies given to SunEdison, the now-bankrupt company that used to have a seat on AWEA’s board. (To download the full list of subsidies garnered by AWEA’s board members, click here.)

Nor does that figure include federal money given to J. P. Morgan and Bank of America, both of which have a seat on AWEA’s board. The two banks received federal loans or loan guarantees worth $1.29 trillion and $3.49 trillion, respectively. In an e-mail, Phil Mattera, the research director for Good Jobs First, told me that the loan and loan-guarantee figures for the banks include the federal bailout package known as the Troubled Asset Relief Program as well as “programs instituted by the Federal Reserve in the wake of the financial meltdown.”

When all of the subsidies, loans, and loan guarantees given to the companies on AWEA’s board are counted, the grand total comes to a staggering $5.1 trillion.

According to Wikipedia, crony capitalism “may be exhibited by favoritism in the distribution of legal permits, government grants, special tax breaks, or other forms of state interventionism.” Wind-energy companies are getting favoritism on every count. The U.S. Fish and Wildlife Service wants to give those companies permits allowing them to legally kill bald and golden eagles with their turbines for up to 30 years. The industry is getting grants, tax breaks, and loans worth billions. And thanks to federal mandates like the Clean Power Plan and state renewable-energy requirements — nearly all of which are predicated on the specious claim that paving vast swaths of the countryside with wind turbines is going to save us from catastrophic climate change — the industry is surfing a wave of state interventionism.

AWEA’s Kiernan likely has it right. In a country where having a profitable business increasingly requires getting favors from government, the U.S. wind industry is definitely a “success.”


As Readiness Declines, U.S. Military Fiddles with 'Greening'

America's military faces a readiness crisis. The Marine Corps is looting aviation museums for spare parts to repair combat aircraft. The Navy is three dozen ships short of what the chief of naval operations says is necessary to meet operational requirements. Reduced training hours have led to an increase in fatal training incidents for the Army. And the Air Force is flying increasingly old and worn-out planes.

With these issues piling up for our service members, one would think our commander in chief would dedicate his final year in office to rebuilding the military. Yet throughout his presidency, Mr. Obama has preferred to steer taxpayer dollars to wasteful environmental campaigns.

Global warming, he claims, is one of the greatest threats to American security — on par with North Korean nukes or terrorism. He thus "justifies" the allocation of scarce resources within the Defense Department to feel-good power projects driven by arbitrary energy consumption and production targets rather than military utility. Dubious "military" projects, such as building more solar power facilities to generate electricity on bases, provide no additional security, cost much more than conventional power sources and put the stability and security of bases at risk.

Solar power is famously unreliable. It provides consistent power only where it's consistently sunny, and of course it can't harness any power at night. Because security demands reliable power, many bases shifting to solar find they still must rely largely on conventional energy sources for power.

And building solar fields isn't the only major cost incurred by the military. Because of the way solar panels function, most military bases pursuing commercial-scale solar projects must also upgrade their power grids just to make it safe for them to handle solar. All of these expenses are being incurred at a time when conventional fuel sources are far more affordable.

This is not to say that alternative and renewable power is always a waste. In fact, the military has engaged in such practices to much success in the past. One example is the geothermal power generated by two plants at Naval Air Weapons Station China Lake in California.

There, the Defense Department leased land to a private company that recognized the potential for producing consistent, reliable, renewable power from the Earth's heat. These plants generate 270 megawatts — enough power for 180,000 homes.

But what drove the renewable energy project in China Lake were free market principles, well-thought-out investments and recognition of a legitimate demand for power — not feel-good environmental crusades or political posturing.

Ironically, the Navy is forced to use some of its income from this successful renewable energy project to fund wasteful energy initiatives. Environmental regulations require the Defense Department to "reinvest" a portion of the plants' payments into solar and other initiatives aimed at meeting arbitrary targets for renewable energy production and consumption.

The Obama administration also forces the Defense Department to spend taxpayer dollars on "renewable energy certificates." These certificates enable the department's agencies to "meet" renewable standards by essentially buying credits, without actually engaging directly in the production or consumption of renewable energy. This acts as a cap-and-trade structure internal to the military, through which taxpayer dollars are spent on symbolic pieces of paper that contribute nothing to military capability.

Certainly there is a role for renewable energy projects in the Pentagon. Some may save taxpayer dollars. Others may enhance war fighter capability. As an example of the latter, troops stationed in sunny environments have been able to use portable solar panels to recharge batteries — a practice that reduces the weight carried by combat units. Either type of initiative should be lauded and pursued by the government.

Instead, many Defense Department energy projects exist only because of mandates imposed upon our armed forces. The next commander in chief and Congress owe it to the services to give them the resources needed to reverse the growing readiness crisis. Continuing to divert defense dollars to pet environmental projects is unwise and unsafe.



For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  



Wednesday, June 22, 2016

The great ozone embarrassment

Do you ever wonder why we don't hear much about the ozone hole these days?  There's a reason.  I made some mocking comments about the messed-up talk from Greenies about stratospheric ozone yesterday.  I now want to tell more of the story.

When I searched the net for the numbers about CO2 levels and global temperature, I very rapidly found the numbers nicely set out for both.  So I initially expected that I would have no trouble finding the numbers for atmospheric ozone levels.  I found quite a lot of sites that gave information about that but none of them gave the underlying numbers.  The information was always presented in pretty multi-colored pictures.

That is very strange.  Numbers are food and drink to scientists.  Pictures just cannot give you precision.  So what is going on? Is there a reason for the imprecision?

I think I have eventually found out. The numbers are pretty embarrassing. Ozone levels are at least not rising and may be FALLING. Yet, according to the Ozone-hole enthusiasts, the levels should be rising. When the very expensive Montreal protocol of 1989 was imposed on us, we were told that CFCs were destroying ozone at a dangerous rate (ALL change is dangerous according to Greenies), so if we stopped producing CFCs, the ozone would bounce back and the "hole" over Antarctica would shrink away. So ozone levels should have been RISING for quite a while now.

But the opposite may have happened. I eventually found an official New Zealand statistics site which informed me that: "From 1978 to 2013, median monthly ozone concentrations decreased slightly, about 4 percent." And I found another source which put the loss to the year 2000 at 7 percent.

And the cooling trend in the stratosphere can only reasonably be explained by falling ozone levels. It is the absorption of UV by ozone that keeps the stratosphere warm. I showed yesterday that the cooling trend cannot be explained by CO2 levels.

Greenies are always cautious about when they expect the ozone hole to close, generally putting it quite a few years in the future. They say, reasonably, that these things oscillate, so the process of ozone recovery must be a gradual one and you need a long series to see a trend. But for the level to be DECLINING looks very much like proof of failure.

But I needed those elusive numbers to be certain of what was going on. And I did eventually find them at Mauna Loa. They give almost daily readings up to this year. I looked at the readings for three years: 1996, 2010 and this year. I noted that the readings in all three years varied between around 230 and 270 Dobson units, according to the time of the year. I saw no point in calculating exact averages, as it was clear that, at this late stage, when the effects of the CFC ban should long ago have cut in, essentially nothing was happening. The ozone level may not have risen in recent years but it is not dropping either. The predicted rise was not there. The levels just bob up and down in the same old way within the same old range, year after year.
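A check of that kind is easy to script. The sketch below summarises a year of readings the way described; the sample values are purely illustrative, not actual Mauna Loa data:

```python
from statistics import mean

# Illustrative Dobson-unit readings for one year (NOT real Mauna Loa data):
# the point is just to show the annual mean and seasonal range at a glance.
readings = [235, 248, 262, 270, 255, 241, 233, 230, 244, 258, 266, 250]

print(f"mean = {mean(readings):.1f} DU, "
      f"range = {min(readings)}-{max(readings)} DU")
```

Running the same summary for each year of interest makes it obvious whether the annual means are trending up, trending down, or just bobbing around in the same range.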

So it looks like the Montreal protocol did nothing.  The whole thing seems to have been wholly misconceived. The "science" behind it was apparently wrong.

Yet it was the "success" of the Montreal protocol that inspired the Greenie assault on CO2.  We have paid a big price for that hasty bit of scientific speculation.

Al Gore Might Want to Oppose the Prosecution of Exxon

The Left is heading into dangerous legal waters. In recent months, leftist attorneys general from blue states like New York and Massachusetts have been trying to build a case against Exxon Mobil Corp. Massachusetts AG Maura Healey demanded Exxon hand over 40 years of documents related to the company’s climate change research in an effort to build a case that the company committed fraud because it’s a “climate change denier.” Exxon, of course, is fighting the subpoena, saying handing over mountains of banker boxes infringes on its First Amendment rights.

In effect, the leftist AGs are pushing for the criminalization of dissent. But that cuts both ways. Climate change may or may not be occurring, and the cause — whether it’s human industry or the climate’s natural cycle — is up for debate. Responding to the prosecution of Exxon, 13 AGs from red states penned a letter that pointed out if the Left wants to prosecute anyone who doesn’t believe socialism is the response to warmer weather, global warming activists could be prosecuted for overstating the threat.

“We all understand the need for a healthy environment, but we represent a wide range of viewpoints regarding the extent to which man contributes to climate change and the costs and benefits of any proposed fix,” read the letter headed by Alabama AG Luther Strange and Texas AG Ken Paxton. “Nevertheless, we agree on at least one thing — this is not a question for the courts. Using law enforcement authority to resolve a public policy debate undermines the trust invested in our offices and threatens free speech.”

While the conservative AGs said in the letter they would not mount such a prosecution, the same legal logic could lead to the prosecution of climate change activists who advocate for the redistribution of taxpayer money to green energy companies — like failed solar energy company Solyndra. For example, Al Gore made statements that were demonstrably false in “An Inconvenient Truth” and he’s continued to double-down on the Chicken Little rhetoric. Is it just a coincidence that he’s a senior partner in a venture-capital firm that invests in clean energy technology?


Climate Change Prediction Fail? What did ‘climate hero’ James Hansen actually predict back in 1986?

The Senate Environment and Public Works Committee held a hearing on June 10 and 11, 1986, to consider the problems of ozone depletion, the greenhouse effect, and climate change. The event featured testimony from numerous researchers who would go on to become major figures in the climate change debate. Among them was James Hansen, who was then a leading climate modeler with NASA's Goddard Institute of Space Studies and who has subsequently been hailed by the Worldwatch Institute as a "climate hero." When the Washington Post ran an article this week marking the 30th anniversary of those hearings, it found the old testimony "eerily familiar" to what climate scientists are saying today. As such, it behooves us to consider how well those 30-year-old predictions turned out.

At the time, the Associated Press reported that Hansen "predicted that global temperatures should be nearly 2 degrees higher in 20 years" and "said the average U.S. temperature has risen from 1 to 2 degrees since 1958 and is predicted to increase an additional 3 or 4 degrees sometime between 2010 and 2020." These increases would occur due to "an expected doubling of atmospheric carbon dioxide by 2040." UPI reported that Hansen had said "temperatures in the United States in the next decade will range from 0.5 degrees Celsius to 2 degrees higher than they were in 1958." Citing the AP report, one skeptical analyst reckoned that Hansen's predictions were off by a factor of 10. Interpreting a different baseline from the news reports, I concluded that Hansen's predictions had in fact barely passed his low-end threshold. Comments from unconvinced readers about my analysis provoked me to find and re-read Hansen's 1986 testimony.

Combing through Hansen's actual testimony finds him pointing to a map showing "global warming in the 1990's as compared to 1958. The scale of warming is shown on the left-hand side. You can see that the warming in most of the United States is about 1/2 C degree to 1 C degree, the patched green color." Later in his testimony, Hansen noted that his institute's climate models projected that "in the region of the United States, the warming 30 years from now is about 1 1/2 degrees C, which is about 3 F." It is not clear from his testimony if the baseline year for the projected increase in temperature is 1958 or 1986, so we'll calculate both.

In Hansen's written testimony, submitted at the hearing, he outlined two scenarios. Scenario A featured rapid increases in both atmospheric greenhouse gases and warming; Scenario B involved declining emissions of greenhouse gas and slower warming. "The warming in Scenario A at most mid-latitude Northern Hemisphere land areas such as the United States is typically 0.5 to 1.0 degree C (1-3 F degrees) for the decade 1990-2000 and 1-2 degree C (2-4 F degrees) for the decade 2010-2020," he wrote.

The National Oceanic and Atmospheric Administration (NOAA) offers a handy Climate at a Glance calculator that allows us to figure out what various temperature trends have been for the U.S. since 1901 and the globe since 1881. So first, what did happen to U.S. temperatures between 1958 and 1986? Inputting January 1958 to January 1986 using a 12-month time scale, the NOAA calculator reports that there was a trend of exactly 0.0 F degrees per decade for that period. Curiously, one finds a significant divergence in the temperature trends depending on which half of the year one examines. The temperature trend over the last half of each of the 28 years considered here is -0.13 F degree per decade. In contrast, the trend for the first half of each year yields an upward trend of +0.29 F degrees.

What happens when considering "global warming in the 1990's as compared to 1958"? Again, the first and second half-year trends are disparate. But using the 12-month time scale, the overall trend is +0.25 F degrees per decade, which would imply an increase of about 1 F degree during that period, or just over ½ C degree.

So what about warming 30 years after 1986—that is, warming up until now? If one interprets Hansen's testimony as implying a 1958 baseline, the trend has been +0.37 F degree per decade, yielding an increase of about 1.85 F degrees, or just over 1 C degree. This is near the low end of his projections. If the baseline is 1986, the increase per decade is +0.34 F degrees, yielding an overall increase of just over 1 F degree, or under 0.6 C degree. With four years left to go, this is way below his projection of a 1 to 2 C degrees warming for this decade.
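The arithmetic behind these figures is just the per-decade trend multiplied by the number of decades elapsed, with Fahrenheit differences converted to Celsius by a factor of 5/9. A minimal sketch, using the NOAA trend values quoted above:

```python
def total_change(trend_per_decade, years):
    """Total temperature change implied by a linear per-decade trend."""
    return trend_per_decade * years / 10.0

def f_to_c_delta(delta_f):
    """Convert a temperature *difference* from Fahrenheit to Celsius."""
    return delta_f * 5.0 / 9.0

# 1986 baseline: +0.34 F per decade over the 30 years to 2016
delta_f = total_change(0.34, 30)
print(round(delta_f, 2))                # 1.02 -> "just over 1 F degree"
print(round(f_to_c_delta(delta_f), 2))  # 0.57 -> "under 0.6 C degree"
```

The same arithmetic reproduces the other figures in the text, e.g. +0.25 F per decade over the four decades from 1958 to the late 1990s gives about 1 F degree.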

Hansen pretty clearly believed that Scenario A was more likely than Scenario B. And in Scenario A, he predicted that warming in "most mid-latitude Northern Hemisphere land areas such as the United States" would be "typically 0.5 to 1.0 degree C (1-3 F degrees)." According to the NOAA calculator, average temperature in the contiguous U.S. increased between 1990 and 2000 by 1.05 F degree, or about 0.6 C degree.

Hansen's predictions go definitively off the rails when tracking the temperature trend for the contiguous U.S. between 2000 and 2016. Since 2000, according to the NOAA calculator, the average temperature trend has been downward at -0.06 F degree per decade. In other words, no matter what baseline year Hansen meant to use, his projections for temperatures in the U.S. for the second decade of this century are 1 to 3 F degrees too high (so far).

What did Hansen project for global temperatures? He did note that "the natural variability of the temperature in both real world and the model are sufficiently large that we can neither confirm nor refute the modeled greenhouse effect on the basis of current temperature trends." It therefore was impossible to discern a man-made global warming signal in the temperature data from 1958 to 1986. But he added that "by the 1990's the expected warming rises above the noise level. In fact, the model shows that in 20 years, the global warming should reach about 1 degree C, which would be the warmest the Earth has been in the last 100,000 years."

Did it? No. Between 1986 and 2006, according to the NOAA calculator, average global temperature increased at a rate of +0.19 C degree per decade, implying an overall increase of 0.38 C degrees. This is less than half of Hansen's 1 C degree projection for that period. Taking the analysis all the way from 1986 to today, the NOAA calculator reports a global trend of +0.17 C degree per decade, yielding an overall increase of 0.51 C degree.

Hansen did offer some caveats with his projections. Among them: The 4.2 C degree climate sensitivity in his model could be off by a factor of 2; less solar irradiance and more volcanic activity could affect the trends; crucial climate mechanisms might be omitted or poorly simulated in the model. Climate sensitivity is conventionally defined as the amount of warming that would occur as the result of doubling atmospheric carbon dioxide. Three decades later, most researchers agree that Hansen set climate sensitivity way too high and thus predicted increases that were way too much. The extent to which his other caveats apply is still widely debated. For example, do climate models accurately reflect changes in the amount of cloudiness that have occurred over the past century?

The U.N.'s Intergovernmental Panel on Climate Change's 1990 Assessment Report included a chapter on the "Detection of the Greenhouse Gas Effect in the Observations." It proposed that total warming of 1 C degree since the late 19th century might serve as a benchmark for when a firm signal of enhanced global warming had emerged. It also suggested that a further 0.5 C degree warming might be chosen as the threshold for detecting the enhanced greenhouse. According to the NOAA calculator, warming since 1880 has been increasing at a rate of +0.07 C degree per decade, implying an overall increase of just under 1 C degree as of this year. As noted above, global temperatures have increased by 0.51 C degree since 1986, so perhaps the man-made global warming signal has finally emerged. In fact, Hansen and colleagues suggest just that in a 2016 study.

The upshot: Both the United States and the Earth have warmed at considerably slower pace than Hansen predicted 30 years ago. If the three-decades-old predictions sound eerily familiar, it's because they've been updated. Here's hoping the new predictions will prove as accurate as the old ones.


Bat Killings by Wind Energy Turbines Continue

Industry plan to reduce deadly effects of blades may not be enough, some scientists say

On a warm summer evening along the ridgetops of West Virginia’s Allegheny Mountains, thousands of bats are on the move. They flutter among the treetops, searching for insects to eat and roosts on which to rest. But some of the trees here are really metal towers, with 30-meter-long blades rotating at more than 80 kilometers per hour even in this light breeze. They are electricity-generating wind turbines—a great hope for renewable energy, but dangerous for bats. The flying animals run into spinning blades, or the rapid decrease in air pressure around the turbines can cause bleeding in their lungs. By morning, dozens will lie dead on the ground. Countless more will die at wind turbines elsewhere in the U.S. and Canada in the forests and fields of the Midwest and the windy prairies of the Great Plains.

Much of this slaughter—the greatest threat to animals that are a vital link in our ecosystem—was supposed to end last year. In 2015, with great fanfare, the American Wind Energy Association (AWEA), a trade group, announced voluntary guidelines to halt turbines at low wind speeds, when bats are most active, which would save lives. Conservationists praised the move.

But some scientists say this promise falls short. The industry plan claims to reduce bat deaths by 30 percent, but holding the blades in check just a little longer could reduce deaths by 90 percent or more, a decade of research indicates, and would do so with little additional energy loss. A research review published in January of this year found that wind turbines are, by far, the largest cause of mass bat mortality around the world. White-nose syndrome, the deadly fungal disease that has decimated bat populations throughout the northeastern U.S., came in second. Biologist Cris Hein of the nonprofit group Bat Conservation International says that if the current industry practices continue and wind turbine installation grows, bat populations already weakened by the fungus will crash. Industry has balked at holding the blades still at higher wind speeds, however, saying the energy loss will be larger than scientists claim.

Bats eat insects, saving farmers billions of dollars in pest control each year, but they generally do not get much attention. No one was even looking for bats under turbines until 2003, according to wildlife biologist Ed Arnett, currently a senior scientist at the Theodore Roosevelt Conservation Partnership. But on routine checks for dead hawks and eagles under West Virginia turbines that summer, surveyors found an estimated 2,000 dead bats. The discovery prompted creation of the Bat and Wind Energy Cooperative, a consortium of federal agencies, the wind energy association and Bat Conservation International. The consortium hired Arnett in 2004 to conduct the first major studies of why turbines kill bats and to find solutions.

In what is now considered a classic study at the Casselman Wind Project in Somerset County, Pa., in 2008 and 2009 Arnett “feathered” the blades in the evening hours of bats’ critical fall migration period. Feathering involves turning the blades parallel to the wind so the turbines do not rotate. Arnett feathered blades at wind speeds of five to 6.5 meters per second, slightly above the cut-in speed (the speed at which the turbines connect with the power grid) now typical in the industry, which is 3.5 to four meters per second. Delaying the cut-in speed reduced bat deaths by 44 to 93 percent, depending on the night studied and conditions. And delaying turbine starts until slightly higher wind speeds during this two-month migration period, Arnett estimated, would reduce annual wind energy production by less than 1 percent. A flurry of research by other scientists followed, showing feathered blades and higher cut-in speed saved more bat lives than other proposed solutions.
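The reason the energy penalty is so small is the roughly cubic relationship between wind speed and turbine power: the slow winds at which blades would be feathered carry very little of the total energy. A rough illustration, assuming an idealized P ∝ v³ power curve and made-up sample wind speeds (real turbine power curves and site wind distributions differ, and Arnett's sub-1-percent figure applied only to night hours in the two-month migration season):

```python
def energy_fraction_lost(speeds, old_cut_in=3.5, new_cut_in=5.5, rated=12.0):
    """Fraction of energy forgone by raising the turbine cut-in speed.

    Uses the simplified cubic power law (power ~ v**3, capped at the
    rated speed); purely illustrative, not a real power curve.
    """
    def power(v, cut_in):
        if v < cut_in:
            return 0.0
        return min(v, rated) ** 3

    old = sum(power(v, old_cut_in) for v in speeds)
    new = sum(power(v, new_cut_in) for v in speeds)
    return 1.0 - new / old if old else 0.0

# Hypothetical hourly wind speeds (m/s) at a moderately windy site
speeds = [2.0, 3.0, 4.0, 5.0, 6.0, 8.0, 10.0, 12.0]
print(round(energy_fraction_lost(speeds), 3))  # 0.052
```

Even with these toy numbers, cutting out everything below 5.5 m/s costs only about 5 percent of the energy, because the 10-12 m/s hours dominate the total; restricting the measure to migration-season nights shrinks the annual loss much further.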

Paul Cryan, a bat biologist with the U.S. Geological Survey and a co-author of the January bat mortality review, praised the industry’s voluntary guidelines as an important first step. But like Cris Hein, he worries about the ongoing impact of turbines on bat populations. “Bats are long-lived and very slow reproducers,” he says. “Their populations rely on very high adult survival rates. That means their populations recover from big losses very slowly.” He questions whether bats can handle such damage year after year.

Defending the wind turbine policy, John Anderson, AWEA’s senior director of Permitting Policy and Environmental Affairs, says the guidelines were just a first move, not necessarily the last. “The initial step was to find that sweet spot between reducing our impact while maintaining energy production levels that make a project economic,” he says.

To date, however, the industry has resisted feathering at speeds higher than what the guidelines recommend. “For every megawatt hour that wind is not operating, that’s a megawatt hour that has to be replaced by a far more impactful form of energy from fossil fuel,” Anderson notes. He maintains that the low energy cost estimated at Casselman does not hold for other locations. “I wish it was 1 percent everywhere,” he says. “But the reality is that you have different wind profiles in different locations, and different costs of energy. So 1 percent in one location may be very inexpensive and in another [it could be] extremely expensive and make or break the difference in a very competitive market.”

Now the U.S. Fish and Wildlife Service (FWS), part of the bat consortium, is weighing in on the debate, and it appears to be following the conservation research. In a draft Habitat Conservation Plan covering eight Midwestern states the FWS proposes raising turbine cut-in speeds to five or 6.5 meters per second to protect three bat species listed (or being considered for listing) under the Endangered Species Act. One such species is the Indiana bat. To date, few other bat species are officially listed as endangered, and the species most frequently killed by turbines are not among them. And the FWS can only require action by a wind facility if it has proof that the facility killed an endangered Indiana bat, a difficult task without close monitoring.

Right now, “many, many, many facilities within the range of the Indiana bat” do not participate in any plan, says Rick Amidon, a biologist in the FWS’s Midwest office. The service hopes that a region-wide Habitat Conservation Plan will make it easier for facilities to opt into good conservation practices in advance, before the bodies of endangered species appear under their blades and the FWS takes action. The public comment period for the proposed plan closes July 14.

The situation right now puts Hein and other conservationists in a difficult position. “We see the impact of climate change on bats, and so we’re in favor of renewable energy,” Hein says. “It’s unfortunate that one of those—wind energy—has this negative impact.” He is frustrated that industry has not acted more quickly on existing studies but acknowledges “it’s hard to get an industry to move on anything very rapidly.” In the meantime he and the consortium will keep searching for the ultimate environmental sweet spot.


Poland severely restricts wind farms

Position of the [Polish] National Institute of Public Health – National Institute of Hygiene on wind farms:

The National Institute of Public Health – National Institute of Hygiene is of the opinion that wind farms situated too close to buildings intended for permanent human occupation may have a negative impact on the well-being and health of the people living in their proximity.

The human health risk factors that the Institute has taken into consideration in its position are as follows:

the emitted noise level and its dependence on the technical specifications of turbines, wind speed as well as the topography and land use around the wind farm,

aerodynamic noise level including infrasound emissions and low-frequency noise components,

the nature of the noise emitted, taking into account its modulation/impulsive/tonal characteristics and the possibility of interference of waves emitted from multiple turbines,

the risk of ice being flung from rotors,

the risk of turbine failure with a rotor blade or its part falling,

the shadow flicker effect,

the electromagnetic radiation level (in the immediate vicinity of turbines),

the probability of sleep disruptions and noise propagation at night,

the level of nuisance and probability of stress and depression symptoms occurring (in consequence of long exposure), related both to noise emissions and to non-acceptance of the noise source.

In the Institute’s opinion, the laws and regulations currently in force in Poland (regarding risk factors which, in practice, include only the noise level) are not only inadequate for noise sources such as wind turbines, but they also fail to guarantee a sufficient degree of public health protection. The methodology currently used for environmental impact assessment of wind farms (including human health) is not applicable to wind speeds exceeding 5 m/s. In addition, it does not take into account the full frequency range (in particular, low frequency) and the nuisance level.

In the Institute’s view, owing to the current lack of a comprehensive regulatory framework governing the assessment of health risks related to the operation of wind farms in Poland, an urgent need arises to develop and implement a comprehensive methodology according to which the sufficient distance of wind turbines from human habitation would be determined. The methodology should take into account all the above-mentioned potential risk factors, and its result should reflect the least favourable situation. In addition to landform (natural topography) and land use characteristics, the methodology should also take into consideration the category, type, height and number of turbines at a specific farm, and the location of other wind farms in the vicinity. Similar legislative arrangements, providing for multi-criteria assessment based on complex numerical algorithms, are already in use elsewhere in the world.

The Institute is aware of the fact that owing to the diversity of factors and the complicated nature of such an algorithm, its development within a short time period may prove very difficult. Therefore, what seems to be an effective and simpler solution is the prescription of a minimum distance of wind turbines from buildings intended for permanent human occupation. Such setback criteria are also a common standard-setting arrangement.

Having regard to the above, until a comprehensive methodology is developed for the assessment of the impact of industrial wind farms on human health, the Institute recommends 2 km as the minimum distance of wind farms from buildings. The recommended value results from a critical assessment of research results published in reviewed scientific periodicals with regard to all potential risk factors, with the distances usually specified within the following limits:

0.5-0.7 km, often obtained as a result of calculations, where the noise level (dBA) meets the currently acceptable values (without taking into account adjustments for the impulse/tonal/modulation features of the noise emitted),

1.5-3.0 km, resulting from the noise level, taking into account modulation, low frequencies and infrasound levels,

0.5-1.4 km, related to the risk of turbine failure with a broken rotor blade or its part falling (depending on the size of the piece and its flight profile, rotor speed and turbine type),

0.5-0.8 km, where there is a risk of ice being flung from rotors (depending on the shape and mass of ice, rotor speed and turbine type),

1.0-1.6 km, taking into account the noise nuisance level (between 4% and 35% of the population at 30-45 dBA) for people living in the vicinity of wind farms,

the distance of 1.4-2.5 km, related to the probability of sleep disruptions (on average, between 4% and 5% of the population at 30-45 dBA),

2.0 km, related to the occurrence of potential psychological effects resulting from substantial landscape changes (based on the case where the wind turbine is a dominant landscape feature and the rotor movement is clearly visible and noticeable to people from any location),

1.2-2.1 km, for the shadow flicker effect (for the average wind turbine height in Poland, including the rotor, from 120 to 210 m).

In its opinion, the Institute has also considered the recommended distances of wind farms from buildings, as specified by experts, scientists, as well as central and local government bodies around the world (in most cases recommended from 1.0 to 5.0 km).
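The "least favourable situation" rule amounts to taking, across all the risk factors, the criterion that demands the greatest setback. A hedged sketch of that rule using the distance ranges quoted above (the labels and the simple max-rule are illustrative only, not the Institute's actual algorithm):

```python
# Lower and upper bounds (km) of the distance ranges quoted above,
# one entry per risk factor
criteria_km = {
    "noise, dBA only":                    (0.5, 0.7),
    "noise incl. modulation/infrasound":  (1.5, 3.0),
    "blade failure / falling parts":      (0.5, 1.4),
    "ice throw":                          (0.5, 0.8),
    "noise nuisance":                     (1.0, 1.6),
    "sleep disruption":                   (1.4, 2.5),
    "landscape / psychological effects":  (2.0, 2.0),
    "shadow flicker":                     (1.2, 2.1),
}

# Least favourable situation: the binding criterion is the one
# demanding the greatest distance
max_of_lower = max(lo for lo, hi in criteria_km.values())
max_of_upper = max(hi for lo, hi in criteria_km.values())
print(max_of_lower, max_of_upper)  # 2.0 3.0
```

Notably, the maximum of the lower bounds is 2.0 km, which coincides with the Institute's recommended minimum setback, though the position paper does not state that the figure was derived this way.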


Despite huge investments, renewable energy isn’t winning

For hydrocarbon doomsayers, there’s good news and bad news. In 2015, there were record investments in renewable energy, and record capacity was added, much of it in emerging economies. Yet despite the huge investment, the global share of fossil fuels is not shrinking very fast. Renewables such as wind, solar and geothermal still account for a tiny share of energy production, and there are factors that may inhibit their growth in the next few years.

REN21, the international renewable energy association backed by the United Nations Environment Program, has summarized impressive developments in the sector in 2015. Total investment in renewable power and fuels reached $285.9 billion, an all-time record, and renewable power capacity, including hydropower, increased by 148 gigawatts — another record — to 1.8 terawatts. For the sixth consecutive year, investment in new renewable capacity was higher than in hydrocarbon-burning power plants.

Much of the increase came from the developing world. China was in first place; the U.S. came in second, and added more solar and wind capacity than any other country. Turkey added the most geothermal generation. The narrative about the environmentally conscious rich nations and the laggard poor ones is obsolete; Mauritania invested the biggest share of economic output in sustainable energy in 2015, followed by Honduras, Uruguay and Morocco. Bangladesh is the biggest market for home-based solar systems.

One might think the energy revolution is fast displacing fossil fuels. Not really. Although investment in renewables and in the oil industry are of comparable magnitude — $522 billion was invested in oil last year — sustainable energy is growing from a very low base.


We read about the big successes — Costa Rica with 99 percent of energy generated from renewable sources, Uruguay with 92.8 percent, three German states with most of their energy coming from wind — but weaning the world off fossil fuels is an uphill battle.

One reason is regulators’ understandable fixation on generation. Wind and solar installations are relatively easy to promote: The technology is already there, all governments need to do is subsidize its use by levying additional taxes or “feed-in tariffs.” It’s much harder to set up an equally effective mechanism in transportation, which uses the lion’s share of oil products. Although solar and wind generation is already price-competitive with fossil fuels in many countries, modern electric vehicles are pricey, clunky (yes, even the Teslas) and far behind gas-powered competitors in terms of driving range. It would be an expensive proposition for governments to subsidize them to a degree that would make them popular.

Now, because oil is relatively cheap, the global market is moving toward cars that use more gas, especially SUVs. No wonder global oil consumption grew at the fastest rate in five years in 2015.

This year, the growth is set to continue. And increases in renewables capacity may hit some obstacles soon.

Most of last year’s expansion came from additional wind and solar capacity. Countries such as Germany and Poland added a lot of wind power because their governments are about to end direct subsidies and move to tendering programs, which allow only the lowest bidders to build new power plants. This is fair: European governments nursed sustainable energy producers when it was hard for them to compete with traditional generation on price, and now it’s time for a more market-based approach. The policy shift, however, will probably cause an investment slowdown starting in 2017.

Solar photovoltaic generation has another problem in markets where it has a large, established share, especially in Europe. “The more that solar PV penetrates the electricity system, the harder it is to recoup project costs,” the REN21 report says. “So an important shift is under way: from the race to be cost-competitive with fossil fuels to being able to adequately remunerate solar PV in the market.”

Other markets, too, will eventually reach a point where government support has to be scaled back because it’s harder to justify, and the huge investments of today will become harder to recoup. The current investment and growth rates in renewables are not quite natural, and they are not likely to last. Only major technological breakthroughs in energy storage, both for grids and for vehicles, could ensure another leap in sustainable energy use.

Without such breakthroughs, which will make traditional generation and powertrains vastly inferior to modern ones, demand for fossil fuels will remain strong for decades. The International Energy Agency’s projection for 2040, based on the current growth rate in renewables, has the share of natural gas used in power generation roughly at the same level as today. It doesn’t predict any drops in oil demand.

Those who have predicted the end of the petrostates and permanently low oil prices are in for a long wait. Fortunes will still be made in fossil fuels, and oil dictatorships will probably keep squabbling and menacing their neighbors at least for most of our remaining lifetimes.


1,500 scientists lift the lid on reproducibility

Survey sheds light on the ‘crisis’ rocking research

More than 70% of researchers have tried and failed to reproduce another scientist's experiments, and more than half have failed to reproduce their own experiments. Those are some of the telling figures that emerged from Nature's survey of 1,576 researchers who took a brief online questionnaire on reproducibility in research.

The data reveal sometimes-contradictory attitudes towards reproducibility. Although 52% of those surveyed agree that there is a significant 'crisis' of reproducibility, less than 31% think that failure to reproduce published results means that the result is probably wrong, and most say that they still trust the published literature.

Data on how much of the scientific literature is reproducible are rare and generally bleak. The best-known analyses, from psychology and cancer biology, found rates of around 40% and 10%, respectively. Our survey respondents were more optimistic: 73% said that they think that at least half of the papers in their field can be trusted, with physicists and chemists generally showing the most confidence.

The results capture a confusing snapshot of attitudes around these issues, says Arturo Casadevall, a microbiologist at the Johns Hopkins Bloomberg School of Public Health in Baltimore, Maryland. “At the current time there is no consensus on what reproducibility is or should be.” But just recognizing that is a step forward, he says. “The next step may be identifying what is the problem and to get a consensus.”

Failing to reproduce results is a rite of passage, says Marcus Munafo, a biological psychologist at the University of Bristol, UK, who has a long-standing interest in scientific reproducibility. When he was a student, he says, “I tried to replicate what looked simple from the literature, and wasn't able to. Then I had a crisis of confidence, and then I learned that my experience wasn't uncommon.”

The challenge is not to eliminate problems with reproducibility in published work. Being at the cutting edge of science means that sometimes results will not be robust, says Munafo. “We want to be discovering new things but not generating too many false leads.”

But sorting discoveries from false leads can be discomfiting. Although the vast majority of researchers in our survey had failed to reproduce an experiment, less than 20% of respondents said that they had ever been contacted by another researcher unable to reproduce their work. Our results are strikingly similar to another online survey of nearly 900 members of the American Society for Cell Biology. That may be because such conversations are difficult. If experimenters reach out to the original researchers for help, they risk appearing incompetent or accusatory, or revealing too much about their own projects.

A minority of respondents reported ever having tried to publish a replication study. When work does not reproduce, researchers often assume there is a perfectly valid (and probably boring) reason. What's more, incentives to publish positive replications are low and journals can be reluctant to publish negative findings. In fact, several respondents who had published a failed replication said that editors and reviewers demanded that they play down comparisons with the original study.

Nevertheless, 24% said that they had been able to publish a successful replication and 13% had published a failed replication. Acceptance was more common than persistent rejection: only 12% reported being unable to publish successful attempts to reproduce others' work; 10% reported being unable to publish unsuccessful attempts.

Survey respondent Abraham Al-Ahmad at the Texas Tech University Health Sciences Center in Amarillo expected a “cold and dry rejection” when he submitted a manuscript explaining why a stem-cell technique had stopped working in his hands. He was pleasantly surprised when the paper was accepted. The reason, he thinks, is because it offered a workaround for the problem.

Others place the ability to publish replication attempts down to a combination of luck, persistence and editors' inclinations. Survey respondent Michael Adams, a drug-development consultant, says that work showing severe flaws in an animal model of diabetes has been rejected six times, in part because it does not reveal a new drug target. By contrast, he says, work refuting the efficacy of a compound to treat Chagas disease was quickly accepted.

One-third of respondents said that their labs had taken concrete steps to improve reproducibility within the past five years. Rates ranged from a high of 41% in medicine to a low of 24% in physics and engineering. Free-text responses suggested that redoing the work or asking someone else within a lab to repeat the work is the most common practice. Also common are efforts to beef up the documentation and standardization of experimental methods.

Any of these can be a major undertaking. A biochemistry graduate student in the United Kingdom, who asked not to be named, says that efforts to reproduce work for her lab's projects doubles the time and materials used — in addition to the time taken to troubleshoot when some things invariably don't work. Although replication does boost confidence in results, she says, the costs mean that she performs checks only for innovative projects or unexpected results.



For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here