Thursday, March 31, 2016



Arctic news

Sea ice was down a bit for most of March but it has now popped back up to a level similar to that in other recent years.  The graph tells all:


2016 is the black line

How awful for the Warmists.  The Arctic is all they've got.  I imagine they will console themselves by saying today is the 15th lowest or some such.  You have to be creative with the truth to be a Warmist -- which mostly means taking refuge in trivialities.

SOURCE for the graph

UPDATE:

The following is amusing.  It looks like the upswing began on the 25th.

"Scientists at the National Snow and Ice Data Center (NSIDC) said that the sea ice cover attained an average maximum extent of 14.52m sq km (5.607m sq miles) on 24 March, the lowest winter maximum since records began in 1979. The low beats a record set only last year of 14.54m sq km (5.612m sq miles), reached on 25 February 2015."

SOURCE




No matter how warm or cool a year is, it always proves global warming

As we all know, Warmists have seized on the slight warming in 2015 as "proving" Warmism to be right.  El Nino is ignored. So the year 2011 must have been hard for them.  I downloaded the 2011 chart from CRU in 2012.  It is below.  That was only the 12th warmest year on record.  So did such a dismal figure shake their faith in Warmism at all?  No way!  They went on proclaiming their twisted gospel as before.







Why renewable energy is a worse option than nuclear

Comment from South Africa

THERE is a strange belief in the new green religion that "renewable" always means "good". It doesn’t. Slave labour is a form of renewable energy but is far from good. Wood is renewable but the burning of trees for firewood is causing environmental calamity in Africa. Solar and wind energy are both excellent for many applications such as solar water-heating, windmills on Karoo farms and the provision of small amounts of electricity in remote households, clinics and schools. But they are bad for generating grid electricity — bad for the environment and bad for the economy.

The Western Cape provides a good demonstration of energy realities. About 30km north of Cape Town is Koeberg Nuclear Power Station; a further 30km north is the Darling Wind Farm. A comparison of the two is instructive.

Koeberg consists of two units of 900MW capacity each. It was built in nine years, which included a long delay caused by sabotage, and completed in 1985. Its average electricity production is about 12,600 gigawatt hours (GWh) a year.

The Darling Wind Farm consists of four wind turbines of 1.3MW capacity each. It was built in eight months and completed in 2008. According to its website, it is estimated to produce 8.6GWh a year. Wind farms typically produce less electricity than predicted, but let us accept this figure.

The "load factor" or "capacity factor" of a power plant tells what the plant actually generates compared with its capacity. If it has a capacity to generate 100MW but over a period of time actually produces an average of 70MW, its load factor is 70%.

On these figures, Koeberg has a load factor of 80%. This is not bad but it is by no means the best for nuclear stations. In the US, the load factor for nuclear power is 90%. The Darling Wind Farm has a load factor of 18.9%. This is pretty good for wind. In Germany, Europe’s biggest generator of wind power, the load factor is 17%.
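As a rough check on these figures, the load-factor arithmetic can be sketched in a few lines of Python (the function name and the 8,760-hour year are illustrative, not from the article):

```python
def load_factor(avg_annual_gwh, capacity_mw):
    """Average output over a year as a fraction of nameplate capacity."""
    hours_per_year = 8760
    max_possible_gwh = capacity_mw * hours_per_year / 1000  # MWh -> GWh
    return avg_annual_gwh / max_possible_gwh

koeberg = load_factor(12600, 2 * 900)  # two 900 MW units, ~12,600 GWh/year
darling = load_factor(8.6, 4 * 1.3)    # four 1.3 MW turbines, ~8.6 GWh/year

print(f"Koeberg: {koeberg:.1%}")  # ~79.9%, the article's "80%"
print(f"Darling: {darling:.1%}")  # ~18.9%
```

Both figures quoted in the article fall straight out of this ratio.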

It would require 5,860 Darling wind turbines to generate the same amount of electricity as Koeberg. Imagine 5,860 of these huge machines, each 81m high, compared with Koeberg’s two reactor buildings, each 57m high. Imagine the thousands of kilometres of transmission lines. Imagine the colossal, wasteful, inefficient use of the earth’s resources (wind requires 10 times more concrete and steel than nuclear per kilowatt hour, or kWh).

Wind turbines elsewhere are even bigger than Darling’s, looming over local landscapes like Goliaths. "Gigantic is beautiful!" could be the slogan of wind power.

If these 5,860 wind turbines were built at the same rate as the Darling Wind Farm, it would take 970 years. If they were built in the same time as Koeberg, it would mean building more than 12 wind turbines every week for nine years. It is a fallacy that wind turbines can be built more quickly than nuclear power plants.
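The turbine-count and build-rate claims above follow from simple arithmetic on the same figures; a minimal sketch (variable names are mine):

```python
koeberg_gwh = 12600   # Koeberg's average annual output
darling_gwh = 8.6     # Darling Wind Farm's annual output (4 turbines)

farms_needed = koeberg_gwh / darling_gwh         # ~1,465 four-turbine farms
turbines_needed = round(farms_needed * 4)        # ~5,860 turbines

# Built one farm at a time, at 8 months per farm:
years_sequential = farms_needed * 8 / 12         # ~977 years (article says 970)

# Built within Koeberg's nine-year construction window:
turbines_per_week = turbines_needed / (9 * 52)   # ~12.5 turbines every week
```

The exact totals depend on rounding, but the orders of magnitude are robust.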

But this does not tell half of wind’s problems. With nuclear (or coal or gas), the electricity is generated when you want it for as long as you want it. It is reliable and predictable. With wind, the electricity is produced only if the wind happens to be blowing at the right strength, which happens seldom and unpredictably. Because of this, one kWh of wind electricity has far less value than one kWh of nuclear electricity, if indeed it has any value at all. (In 2008, our gold mines shut down because Eskom could not guarantee electricity supply. Unreliable electricity was worthless to them.)

Wind for grid electricity depends completely on governments. Because it is so expensive and unreliable, nobody will put a single cent into it unless the government forces taxpayers or consumers to pay huge operating subsidies for it. Governments compel utilities to buy wind electricity at very high prices, whether they want it or not, whenever the wind happens to be blowing. With nuclear, coal and gas, the generator serves the customer. With wind, the customer serves the generator.

The UK has more than 3,000 wind turbines with a capacity of more than 5,000MW. Because of its latitude, the UK has relatively good wind conditions. But a study by the John Muir Trust (which looked only at the records of electricity production) showed that on 124 occasions from November 2008 to December 2010, the total generation of wind power was less than 20MW. The load factor over these periods was less than 0.4%.

This exposes another fallacy of wind power, that "the wind is always blowing somewhere". In recent cold winters in northern Europe, when electricity was desperately demanded, the wind turbines from Ireland to Germany were producing next to nothing.

If you look at any graph of a nation’s electricity demand, you will see a fairly predictable curve that peaks at breakfast and supper time on weekdays and dips on weekends and at night. The difference between minimum and maximum demand is about two to one. Now look at a graph of wind electricity production. It shows violent, unpredictable fluctuations. The difference between minimum and maximum production is hundreds to one or more.

Nuclear power has by far the best safety record of any energy technology, much better than wind.

The Fukushima nuclear accident of 2011 provided a spectacular demonstration of nuclear safety. A monstrous earthquake and tsunami, which killed 25,000 people, hit old-fashioned Japanese nuclear plants run by a negligent and corrupt utility; four were severely damaged and thousands of people were evacuated, yet the radiation from the accident has killed nobody and is unlikely to do so.

Meanwhile, in recent years, thousands of people have been killed in accidents in coal, gas, hydro, oil and wind.

Because of the vast amounts of uranium and thorium on earth, nuclear power is sustainable for the remaining life of the planet. Nuclear waste, tiny in volume, solid and stable, is easy to store so that it presents no danger to people or the environment.

The waste from wind includes the toxic, long-lived wastes from the mining of neodymium, used in wind generators, which are causing death and disease in Chinese mining communities. (It is literally true that every single energy technology, including wind and solar, produces "deadly waste that lasts for thousands of millions of years" but with proper care we know how to deal with it from generation to generation. Nuclear waste presents nothing new, including plutonium and fission products.)

Solar energy, especially in sunny South Africa, seems better than wind but, for grid electricity, it is even more expensive and with even lower load factors.

Nuclear versus renewable energy boils down to this simple question: do you want to work with nature or against it?

Nature has made nuclear energy highly concentrated and reliable, allowing us to generate large amounts of electricity from small amounts of materials, very economically and with the least disruption to the environment. Nature has made wind and solar power diluted and unreliable.

It would be stupid to build a nuclear reactor plant in your attic to heat your water; solar power is far better. Similarly it is stupid to use solar or wind for grid electricity; nuclear is far better.

SOURCE  





New Study Debunks Polar Bear Scare

If past predictions are any indication, enough ice should have melted by now as a result of anthropogenic global warming to threaten the existence of polar bears. That’s not to say Arctic sea ice is doing exceedingly well or even that it’s near average. For the record, ice extent appears to have registered a new record low this winter, though any alarm is dampened by the fact that El Niño and an overall warm Pacific Ocean contributed to more heat across the globe, and likely significantly so. On the flip side, it’s true also that the Arctic has not experienced the death spiral that was predicted by so many. That goes for both ice and polar bears. A new study conducted by scientists at Lakehead University in Canada should help alleviate any concerns we might have that polar bears are nearing extinction.

The authors write, “[W]e suggest that the current status of Canadian polar bear subpopulations in 2013 was 12 stable/increasing and one declining (Kane Basin). We do not find support for the perspective that polar bears within or shared with Canada are currently in any sort of climate crisis.” They continue: “We show that much of the scientific evidence indicating that some polar bear subpopulations are declining due to climate change-mediated sea ice reductions is likely flawed by poor mark–recapture (M-R) sampling and that the complex analysis models employed to overcome these capture issues apparently fail to provide accurate estimates of the demographic parameters used to determine subpopulation status.”

These findings are more or less in line with other studies, including one by Dr. Susan Crockford of the Global Warming Policy Foundation. Last year she wrote, “On almost every measure, things are looking good for polar bears. Scientists are finding that they are well distributed throughout their range and adapting well to changes in sea ice. Health indicators are good and they are benefiting from abundant prey.” Moreover, other estimates show that the polar bear population has increased significantly over the years and now sits in the tens of thousands, perhaps as high as 30,000.

Paradoxically, alarmists may be looking at the situation totally backwards — and again we turn to Crockford for explanation. As The Daily Caller’s Michael Bastasch writes, “Shrinking Arctic sea ice may not be the real threat to polar bears. Veteran zoologist Susan Crockford argues that thick spring ice is a bigger problem for polar bears than sparse summer ice.” Crockford says, “Thick spring ice near shore drives seals to give birth elsewhere because they cannot maintain their breathing holes in the ice. This leaves mothers emerging from onshore dens with newborn cubs with nothing to eat at a time when they desperately need food: cubs die quickly, mothers more slowly.”

We conclude by noting the Lakehead University researchers aren’t what we would label climate “deniers.” In the report, they write, “We see reason for concern, but find no reliable evidence to support the contention that polar bears are currently experiencing a climate crisis. We suggest that the qualitative projections for dramatic reductions in population numbers and range are overly pessimistic given the response of polar bears, climate, and sea ice to the present.” In other words, they’re demonstrating an objective approach to the scientific evidence. And that’s something the entire climate community should emulate.

SOURCE  





The EPA Is Using Private Emails to Talk to Lobbyists

A recent report from the Daily Caller highlights how the Environmental Protection Agency frequently uses private email accounts to communicate with environmental lobbyists, ducking the transparency and record-keeping requirements that are supposed to bind the agency.

One characteristic email from a lobbyist for green advocacy groups, obtained under the Freedom of Information Act (FOIA), explicitly requested that EPA Senior Counsel Joe Goffman forward an email to EPA Administrator Gina McCarthy's private account.

“Joe," the email begins, "would you please send this email to Gina for me? I would have sent it to her directly with a cc to you but I don’t have a private email address for her and would prefer to not use an office email address.” Following that introduction is a message outlining specific concerns about a pending regulation, and how it would impact the author's clients.

Upon seeing the report, Executive Director of FreedomWorks Foundation Curt Levey, who heads the organization’s regulatory reform project, commented: "Under the best circumstances, the growth of the regulatory state is a threat to the constitutional limits on the power of the federal government. The cronyism and contempt for accountability at these executive branch agencies only makes the problem worse. Not only are the regulatory agencies run by unelected bureaucrats, with no incentive to do right by the American people, but they continue to act in ways that indicate that they think they are above the law."

While the private email server used by Hillary Clinton when she served as Secretary of State may be the most outrageous example, it appears that this type of behavior is far from an anomaly in Washington. Transparency guidelines exist for a good reason; government is uniquely positioned to impose burdens on businesses and individuals, and enforce them with any legal means necessary. Such power is dangerous if unchecked, and so the American people have a right to know what regulators at the EPA are up to. By using private email accounts, the agency robs the public of that ability.

Private communications with lobbyists indicate a desire to cut deals or trade favors far away from the watchful eyes of the citizenry, a motivation that can't be good for freedom of any kind. The EPA is doing this in more than a few cases, and who knows what other federal agencies are doing the same or worse. So long as government bureaucrats sufficiently cover their tracks, even FOIA requests are unlikely to uncover the truth.

All this underscores the need for restoring the separation of powers originally intended by America's Founding Fathers and enshrined in the Constitution – that is, Congress makes the laws and the executive branch executes them. Federal bureaucrats, who have little accountability to voters in the best case and even less when they evade transparency requirements, must be prevented from writing de facto laws under the guise of interpreting legislation.

SOURCE  





Health Officer Gets Migraines When Visiting Wind Project

DENMARK, WI - Brown County appears to be digging a deeper and deeper hole for itself as more facts come to light surrounding Duke Energy’s Shirley Windpower. After an unusually long, almost three-month delay in satisfying a resident’s open records request, the records ultimately provided reveal that former Brown County Health Officer Chua Xiong feels ill when visiting the Shirley Wind facility. In an email to her intern Carolyn Harvey she states:

“Carolyn, the times I have been out there by the Wind Turbines, I get such migraine headaches. I think I should take some preventative Tylenol before I head out there.”

Despite this admission, approximately one month later Ms. Xiong went on to make her declaration that “Currently there is insufficient scientific evidence-based research to support the relationship between wind turbines and health concerns.” She then went further in saying that this was her “final decision” and that she would only monitor the situation “on an annual basis”. In this decision she completely ignored the real world health impact of Duke Energy’s wind turbines on Brown County families as evidenced through their sworn affidavits and their documentation of past and continued suffering, not to mention her own repeated migraines when in proximity to Duke’s turbines.

So what has happened between Ms. Xiong’s declaration and the March 18th release of the open records showing that Brown County’s Health Officer Chua Xiong suffers migraines when she is by the Shirley Wind turbines? On March 4th, Ms. Xiong submitted her resignation to County Executive Troy Streckenbach. He did not share this with County department heads until just two days prior to March 18th, Ms. Xiong’s last day. This date also coincides with Executive Streckenbach’s announcement of Brown County Corporation Counsel Juliana Ruenzel’s resignation.

Ms. Ruenzel served as head legal counsel who participated in all closed sessions meetings regarding Shirley Wind, and was in charge of reviewing open records requests. The reasons for her resignation have not been disclosed. According to sources, Ruenzel chose not to explain the sudden departure.

It is high time that Brown County and its Health Director follow the lead of its own Board of Health who unanimously declared Duke’s wind turbines in Glenmore a “Human Health Hazard”. They need to recognize that residents are sick, homes have been abandoned, that outsiders (even the County’s own Health Director) feel ill while in the project area, and FINALLY do whatever is necessary to protect the health and safety of southern Brown County residents. Brown County does not need Shirley Wind to become its Flint, Michigan. Until the County does the right thing and takes action, families will continue to suffer, the County’s inaction will escalate their legal liability, and this issue will not go away.

SOURCE  





Australia: Great Barrier Reef coral bleaching at 95 per cent in northern section -- attributed to global warming

What bulldust!  For a start, coral bleaching is NOT coral death.  It is a stress response that leads to the expulsion of symbiotic algae.  There are about half a dozen things that can cause it.  And the ONE thing that can be excluded as a cause is anthropogenic global warming.  Why?  Because there has been none of that for nearly 19 years.  Things that don't exist don't cause anything.  

The ocean waters MAY have warmed but that will be due to natural factors such as El Nino.  The 2015 and early 2016 temperature upticks were DEMONSTRABLY due to El Nino and other natural factors, as CO2 emissions had plateaued at the relevant time.

And it is not at all certain that a small temperature rise causes bleaching.  An ancient coral reef specimen now on display at the Natural History Museum in London is instructive.  It dates back 160 million years.  The exhibit is proof that ancestors of modern corals somehow thrived during the Late Jurassic period, when temperatures were warmer and atmospheric levels of carbon dioxide higher than they are today.

And if that's ancient history, how come corals survive in the Persian Gulf today at temperatures up to 8 degrees hotter than what we see in the tropical Pacific?

Bleaching may even be a positive thing. In recent years, scientists have discovered that some corals resist bleaching by hosting types of algae that can handle the heat, while others swap out the heat-stressed algae for tougher, heat-resistant strains.

And a recent study by the Australian Institute of Marine Science showed that warming in Australian waters actually INCREASED coral growth over the 20th century.

I could go on but I think I have said enough.

All the points I have made above could have been made by any competent marine biologist -- and I can provide references for them all.  But I am not a marine biologist.  I am a psychologist.  What a harrowing world we live in when a psychologist has to give the basic information that marine biologists dare not give.

An aerial survey of the northern Great Barrier Reef has shown that 95 per cent of the reefs are now severely bleached — far worse than previously thought.

Professor Terry Hughes, a coral reef expert based at James Cook University in Townsville who led the survey team, said the situation is now critical.

"This will change the Great Barrier Reef forever," Professor Hughes told 7.30.

"We're seeing huge levels of bleaching in the northern thousand-kilometre stretch of the Great Barrier Reef."

Of the 520 reefs he surveyed, only four showed no evidence of bleaching.  From Cairns to the Torres Strait, the once colourful ribbons of reef are a ghostly white.

"It's too early to tell precisely how many of the bleached coral will die, but judging from the extreme level even the most robust corals are snow white, I'd expect to see about half of those corals die in the coming month or so," Professor Hughes said.

This is the third global coral bleaching since 1998, and scientists have found no evidence of these disasters before the late 20th century.

"We have coral cores that provide 400 years of annual growth," explains Dr Neal Cantin from the Australian Institute of Marine Science.

"We don't see the signatures of bleaching in reduced growth following a bleaching event until the recent 1998/2000 events."

Environment Minister Greg Hunt flew over the reef just eight days ago, before Professor Hughes' aerial survey, and announced some additional resources for monitoring the reef.

"There's good and bad news — the bottom three quarters of the reef is in strong condition," he said at the time.

"[But] as we head north of Lizard Island it becomes increasingly prone to bleaching."

The northern part of the Great Barrier Reef is the most pristine part of the marine park — and that is one possible glimmer of hope.

"On the bright side, it's more likely that these pristine reefs in the northern section will be better able to bounce back afterwards," Professor Hughes said.

"Nonetheless we're looking at a 10-year recovery period, so this is a very severe blow."

Professor Justin Marshall, a reef scientist from the University of Queensland, said the reason for these bleaching events was clear.

"What we're seeing now is unequivocally to do with climate change," he told 7.30.

"The world has agreed, this is climate change, we're seeing climate change play out across our reefs."

Professor Hughes said he is frustrated about the whole climate change debate.

"The government has not been listening to us for the past 20 years," he said.

"It has been inevitable that this bleaching event would happen, and now it has.

"We need to join the global community in reducing greenhouse gas emissions."

SOURCE

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************


Wednesday, March 30, 2016



Arctic sea ice reaches a record low: Scientists say 'disturbing' data points to a long-term trend in global warming

Arrant nonsense.  Arctic temperatures increased FAR more than global temperatures.  So this is a local effect, not a global one.  It is Arctic-specific with no demonstrable relevance to CO2 emissions or the alleged effects of CO2 emissions. Since CO2 emissions were in fact flat overall in 2015 and into 2016, it is DEMONSTRABLE that they did not cause the Arctic warming. Non-change doesn't cause change.  The warming could have been caused by oscillations in ocean currents, oscillations in air currents or subsurface vulcanism.  Nobody knows.

The growth of Arctic sea ice this winter peaked at another milestone.

It recorded the lowest maximum level of ice on record, thanks to extraordinarily warm temperatures.

The National Snow and Ice Data Center says ice covered a maximum of 5.607 million square miles of the Arctic Ocean in 2016.

That's 5,000 square miles less than the old record set in 2015 — a difference slightly smaller than the state of Connecticut.

It's also some 431,000 square miles less than the 30-year average. That difference is about the size of Texas and California combined.

Records go back to 1979 when satellites started measuring sea ice, which forms when Arctic Ocean water freezes.

This year's ice didn't break the record by much, but it's 'an exclamation point' on a longer-term trend, said Nasa scientist Walt Meier, who helped calculate the data.

The sub-par showing doesn't necessarily mean that the minimum extent this summer will also break a record, scientists said.

The summer minimum is more important for affecting Earth's climate and weather.

Data center scientist Julienne Stroeve says winter temperatures over the North Pole were 16 degrees warmer than normal, while other parts of the Arctic ran 4 to 11 degrees warmer than normal.

Data center chief Mark Serreze said: 'I have never seen such a warm, crazy winter in the Arctic.'

'It was so warm that the Barents Sea was pretty much close to ice-free for almost the whole winter, which is very unusual,' Meier said.

Stroeve said early indications show that the sea ice is thinner than last year.

A leading but still controversial theory says loss of sea ice in the Arctic may change the jet stream and bring more extreme weather to the US, Stroeve said.

The new report reveals 'just the latest disturbing data point in a disturbing trend wherein climate changes are happening even faster than we had forecast,' Pennsylvania State University climate scientist Michael Mann said.

However, Nasa adds that the cap of sea ice over the Arctic Ocean is always changing.

Each winter it grows dramatically, usually reaching its maximum in March. It melts just as dramatically each summer, reaching its minimum in September.

In 2015-16, that winter growth got off to a leisurely start due in part to a month of unusually warm weather in the region.

They link this to a phenomenon known as the Arctic Oscillation. This involves differences in air pressure over the Arctic and lower latitudes.

Scientists say a shift in the Arctic Oscillation likely weakened the atmospheric barrier between the polar latitudes and the mid-latitudes.

SOURCE





More than global warming afflicts endangered Shishmaref

The village of Shishmaref is as much a global symbol as an Alaska town. Located on Sarichef Island off the northern coast of the Seward Peninsula, the tiny community is remote even by Alaska standards. The population, which hovers around 560 residents, is almost entirely Inupiat and the economy primarily built around subsistence rather than cash.

What brought Shishmaref international renown is its precarious state. Sarichef is eroding in no small part due to the pernicious effects of climate change. There’s no question that the village will be abandoned at some point in the relatively near future. This impending fate has brought researchers and reporters from all over the world into town, and their consensus is that the residents of Shishmaref will soon be among the world’s first people to become climate refugees in the truest sense of the term.

The reality is somewhat more complicated.

“Fierce Climate, Sacred Ground” is a brief book by Oregon State University-Cascades anthropology professor Elizabeth Marino. Based on her ethnographic studies of the people of Shishmaref, it places their plight in a broader context than just global warming. Drawn from her doctoral dissertation, written while she was a graduate student at the University of Alaska Fairbanks, it’s a somewhat academically oriented work, but one that remains accessible. It is essential reading for the heightened perspective it offers on a situation too often simplified for the purposes of those with differing political objectives.

The popular version of the story, Marino explains, is that Shishmaref is a small village of people living a deeply traditional lifestyle who are in danger of losing everything to forces far away and beyond their control. This understanding is true to an extent, but it overlooks the history of the community.

Government role
What has caused Shishmaref to stand out for many is the somewhat mistaken belief that unlike climate refugees elsewhere on the planet, residents of this village are losing their homes entirely to climate change. This differs from climate crises in developing nations, where a combination of political ineptitude, economic failures and environmental challenges created vulnerable populations and where rising temperatures are simply the final blow to fragile communities.

As Marino explains, in America we like to think that our own government didn’t play a role in creating problems like the one in Shishmaref. She doesn’t explicitly say it, but this view is useful to both environmentalists and economic conservatives. For environmentalists it means the entirety of the situation can be blamed on human carbon emissions, while conservatives can insist that since the government didn’t place the village on Sarichef, it shouldn’t be responsible for relocating it.

In truth, the government did play an important role in locating the town. Shishmaref has on one level existed hundreds of years, but historically it was a seasonally occupied settlement used by people who migrated across the landscape seeking the best places to gather food.

Early in the 20th century the U.S. government pursued a deliberate policy of ending all nomadic lifestyles among Native Americans. The people of Shishmaref weren’t forcibly collectivized in the way that Natives were elsewhere in the country in the 19th century, but the government’s opening of a school in Shishmaref, coupled with the onset of compulsory education, had the same effect.

For the traditionally mobile Inupiat who settled there, Shishmaref made a certain amount of sense. It’s ideally located for winter hunting on sea ice and close enough to the mainland to access traditional subsistence grounds in summer. It was, however, always tenuous ground to build on.

Desperate situation
Erosion has been impacting Sarichef for a very long time, and as early as the 1970s there was already talk of moving Shishmaref to the mainland.

The problem is that the village itself lacks the resources to do so, while neither the state of Alaska nor the federal government is eager to pick up the tab, and the bureaucratic hurdles are enormous.

Compounding residents’ woes, since it is considered temporary even by some of the people who live there, the village has not seen the sort of upgrades other rural communities in Alaska have received. Instead a series of mostly failed stopgap measures have been taken to try to ward off erosion while the decision of where and when to move the town keeps getting studied and discussed into oblivion. Meanwhile, the steadily lengthening ice-free season has left shorelines exposed to storms that themselves are aggravated by climate change, speeding the pace of erosion and consistently thwarting efforts at maintaining the ground beneath the town.

For residents of Shishmaref, it’s a desperate situation. As a people they have lived in the region for centuries, and they see remaining there as integral to their cultural identity. If, as many have suggested, they simply integrate into other towns, they lose their sense of who they are. For the people of Shishmaref, Marino explains, this would be cultural genocide. Their lands and subsistence lifestyle define them. Everything else about their culture has already been taken away. That they live in a town rather than nomadically was entirely due to decisions made in Washington, D.C., and Juneau. Their present dilemma springs from a history over which they were often deprived of a say. What they want most this time is a voice in their own fate.

Midway through her book, Marino asks, “Is the risk posed to Shishmaref the product of climate change or the product of a history of development that ignored local knowledge and removed local adaptation strategies?” While much of the reporting on Shishmaref has focused on the former cause, Marino’s important book shows us that, in her own words, “the simple equation that anthropogenic climate change = erosion = relocation is not an accurate analysis of this complex sociological system.”

History, she demonstrates, shows us why this is.

SOURCE





The Catch-22 of Energy Storage

Pick up a research paper on battery technology, fuel cells, energy storage technologies or any of the advanced materials science used in these fields, and you will likely find somewhere in the introductory paragraphs a throwaway line about its application to the storage of renewable energy.  Energy storage makes sense for enabling a transition away from fossil fuels to more intermittent sources like wind and solar, and the storage problem presents a meaningful challenge for chemists and materials scientists… Or does it?

Guest Post by John Morgan. John is Chief Scientist at a Sydney startup developing smart grid and grid scale energy storage technologies.  He is Adjunct Professor in the School of Electrical and Computer Engineering at RMIT, holds a PhD in Physical Chemistry, and is an experienced industrial R&D leader.  You can follow John on twitter at @JohnDPMorgan. First published in Chemistry in Australia.

Several recent analyses of the inputs to our energy systems indicate that, against expectations, energy storage cannot solve the problem of intermittency of wind or solar power.  Not for reasons of technical performance, cost, or storage capacity, but for something more intractable: there is not enough surplus energy left over after construction of the generators and the storage system to power our present civilization.

The problem is analysed in an important paper by Weißbach et al.1 in terms of energy returned on energy invested, or EROEI – the ratio of the energy produced over the life of a power plant to the energy that was required to build it.  It takes energy to make a power plant – to manufacture its components, mine the fuel, and so on.  The power plant needs to make at least this much energy to break even.  A break-even power plant has an EROEI of 1.  But such a plant would be pointless, as there is no energy surplus to do the useful things we use energy for.
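For the numerically inclined, the ratio being described can be sketched in a few lines of Python. The figures used are invented for illustration only; they are not from the paper.

```python
# Illustrative sketch of the EROEI ratio defined above.
# The plant sizes here are hypothetical, chosen only to make the arithmetic clear.

def eroei(energy_delivered_over_lifetime: float, energy_invested: float) -> float:
    """Energy returned on energy invested: lifetime output / build-and-run input."""
    return energy_delivered_over_lifetime / energy_invested

# A plant that delivers 600 PJ over its life and took 20 PJ to build and fuel:
print(eroei(600, 20))   # 30.0 -- roughly the fossil-fuel figure the article quotes

# A break-even plant returns exactly what it cost: EROEI = 1, surplus = 0.
surplus = 600 - 600
print(eroei(600, 600), surplus)  # 1.0 0
```

The surplus, not the ratio itself, is what is available to run everything else in society, which is why an EROEI of 1 is useless.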

There is a minimum EROEI, greater than 1, that is required for an energy source to be able to run society.  An energy system must produce a surplus large enough to sustain things like food production, hospitals, and universities to train the engineers to build the plant, transport, construction, and all the elements of the civilization in which it is embedded.

For countries like the US and Germany, Weißbach et al. estimate this minimum viable EROEI to be about 7.  An energy source with lower EROEI cannot sustain a society at those levels of complexity, structured along similar lines.  If we are to transform our energy system, in particular to one without climate impacts, we need to pay close attention to the EROEI of the end result.

The EROEI values for various electrical power plants are summarized in the figure.  The fossil fuel power sources we’re most accustomed to have a high EROEI of about 30, well above the minimum requirement.  Wind power at 16, and concentrating solar power (CSP, or solar thermal power) at 19, are lower, but the energy surplus is still sufficient, in principle, to sustain a developed industrial society.  Biomass, and solar photovoltaic (at least in Germany), however, cannot.  With an EROEI of only 3.9 and 3.5 respectively, these sources cannot, on their energy output alone, support both their own fabrication and the societal services we use energy for in a first world country.

Energy Returned on Invested, from Weißbach et al.,1 with and without energy storage (buffering).  CCGT is closed-cycle gas turbine.  PWR is a Pressurized Water (conventional nuclear) Reactor.  Energy sources must exceed the “economic threshold”, of about 7, to yield the surplus energy required to support an OECD level society.

These EROEI values are for energy directly delivered (the “unbuffered” values in the figure).  But things change if we need to store energy.  If we were to store energy in, say, batteries, we must invest energy in mining the materials and manufacturing those batteries.  So a larger energy investment is required, and the EROEI consequently drops.

Weißbach et al. calculated the EROEIs assuming pumped hydroelectric energy storage.  This is the least energy intensive storage technology.  The energy input is mostly earthmoving and construction.  It’s a conservative basis for the calculation; chemical storage systems requiring large quantities of refined specialty materials would be much more energy intensive.  Carbajales-Dale et al.2 cite data asserting batteries are about ten times more energy intensive than pumped hydro storage.

Adding storage greatly reduces the EROEI (the “buffered” values in the figure).  Wind “firmed” with storage, with an EROEI of 3.9, joins solar PV and biomass as an unviable energy source.  CSP becomes marginal (EROEI ~9) with pumped storage, so is probably not viable with molten salt thermal storage.  The EROEI of solar PV with pumped hydro storage drops to 1.6, barely above breakeven, and with battery storage is likely in energy deficit.
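Holding lifetime output fixed, the buffered and unbuffered figures imply how much the up-front energy investment grows once storage is added, since invested energy is just output divided by EROEI. A quick sketch using the figures quoted from Weißbach et al. (the calculation itself is mine, a back-of-envelope illustration, not from the paper):

```python
# Back out the growth in energy investment implied by the drop from
# unbuffered to buffered EROEI, assuming the same lifetime output E_out.
# E_invested = E_out / EROEI, so the ratio of investments is the inverse
# ratio of the two EROEI values (E_out cancels).

def investment_multiplier(eroei_unbuffered: float, eroei_buffered: float) -> float:
    return eroei_unbuffered / eroei_buffered

for source, (unbuf, buf) in {
    "wind": (16, 3.9),
    "CSP": (19, 9),
    "solar PV (Germany)": (3.5, 1.6),
}.items():
    mult = investment_multiplier(unbuf, buf)
    print(f"{source}: energy investment grows ~{mult:.1f}x with pumped hydro buffering")
# wind ~4.1x, CSP ~2.1x, solar PV ~2.2x
```

In other words, on these figures, firming wind with pumped hydro roughly quadruples the energy that must be sunk into the system before any surplus is delivered.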

This is a rather unsettling conclusion if we are looking to renewable energy for a transition to a low carbon energy system: we cannot use energy storage to overcome the variability of solar and wind power.

In particular, we can’t use batteries or chemical energy storage systems, as they would lead to much worse figures than those presented by Weißbach et al.  Hydroelectricity is the only renewable power source that is unambiguously viable.  However, hydroelectric capacity is not readily scaled up as it is restricted by suitable geography, a constraint that also applies to pumped hydro storage.

This particular study does not stand alone.  Closer to home, Springer have just published a monograph, Energy in Australia,3 which contains an extended discussion of energy systems with a particular focus on EROEI analysis, and draws similar conclusions to Weißbach.  Another study by a group at Stanford2 is more optimistic, ruling out storage for most forms of solar, but suggesting it is viable for wind.  However, this viability is judged only on achieving an energy surplus (EROEI>1), not sustaining society (EROEI~7), and excludes the round trip energy losses in storage, finite cycle life, and the energetic cost of replacement of storage.  Were these included, wind would certainly fall below the sustainability threshold.

It’s important to understand the nature of this EROEI limit.  This is not a question of inadequate storage capacity – we can’t just buy or make more storage to make it work.  It’s not a question of energy losses during charge and discharge, or the number of cycles a battery can deliver.  We can’t look to new materials or technological advances, because the limits at the leading edge are those of earthmoving and civil engineering.  The problem can’t be addressed through market support mechanisms, carbon pricing, or cost reductions.  This is a fundamental energetic limit that will likely only shift if we find less materially intensive methods for dam construction.

This is not to say wind and solar have no role to play.  They can expand within a fossil fuel system, reducing overall emissions.  But without storage the amount we can integrate in the grid is greatly limited by the stochastically variable output.  We could, perhaps, build out a generation of solar and wind and storage at high penetration.  But we would be doing so on an endowment of fossil fuel net energy, which is not sustainable.  Without storage, we could smooth out variability by building redundant generator capacity over large distances.  But the additional infrastructure also forces the EROEI down to unviable levels.  The best way to think about wind and solar is that they can reduce the emissions of fossil fuels, but they cannot eliminate them.  They offer mitigation, but not replacement.

Nor is this to say there is no value in energy storage.  Battery systems in electric vehicles clearly offer potential to reduce dependency on, and emissions from, oil (provided the energy is sourced from clean power).  Rooftop solar power combined with four hours of battery storage can usefully timeshift peak electricity demand,3 reducing the need for peaking power plants and grid expansion.  And battery technology advances make possible many of our recently indispensable consumer electronics.  But what storage can’t do is enable significant replacement of fossil fuels by renewable energy.

If we want to cut emissions and replace fossil fuels, it can be done, and the solution is to be found in the upper right of the figure.  France and Ontario, two modern, advanced societies, have all but eliminated fossil fuels from their electricity grids, which they have built from the high EROEI sources of hydroelectricity and nuclear power.  Ontario in particular recently burnt its last tonne of coal, and each jurisdiction uses just a few percent of gas fired power.  This is a proven path to a decarbonized electricity grid.

But the idea that advances in energy storage will enable renewable energy is a chimera – the Catch-22 is that in overcoming intermittency by adding storage, the net energy is reduced below the level required to sustain our present civilization.

BNC Postscript

When this article was published in CiA some readers had difficulty with the idea of a minimum societal EROI.  Why can’t we make do with any positive energy surplus, if we just build more plant?  Hall4 breaks it down with the example of oil:

Think of a society dependent upon one resource: its domestic oil. If the EROI for this oil was 1.1:1 then one could pump the oil out of the ground and look at it. If it were 1.2:1 you could also refine it and look at it, 1.3:1 also distribute it to where you want to use it but all you could do is look at it. Hall et al. 2008 examined the EROI required to actually run a truck and found that if the energy included was enough to build and maintain the truck and the roads and bridges required to use it, one would need at least a 3:1 EROI at the wellhead.

Now if you wanted to put something in the truck, say some grain, and deliver it, that would require an EROI of, say, 5:1 to grow the grain. If you wanted to include depreciation on the oil field worker, the refinery worker, the truck driver and the farmer you would need an EROI of say 7 or 8:1 to support their families. If the children were to be educated you would need perhaps 9 or 10:1, have health care 12:1, have arts in their life maybe 14:1, and so on. Obviously to have a modern civilization one needs not simply surplus energy but lots of it, and that requires either a high EROI or a massive source of moderate EROI fuels.
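Hall’s ladder can be restated as a simple lookup: the minimum wellhead EROI needed before each layer of society becomes affordable. The thresholds below are his rough figures as quoted above; where he gives a range like “7 or 8:1” I have taken a midpoint, so treat the exact numbers as illustrative.

```python
# Hall's illustrative EROI "ladder", reproduced from the quoted passage.
# These are rough, rhetorical figures, not measured values.

HALL_EROI_LADDER = [
    (1.1, "extract the oil"),
    (1.2, "refine it"),
    (1.3, "distribute it"),
    (3.0, "run and maintain a truck, roads and bridges"),
    (5.0, "grow and deliver grain"),
    (7.5, "support the workers' families"),
    (10.0, "educate the children"),
    (12.0, "provide health care"),
    (14.0, "support the arts"),
]

def affordable_services(eroi: float) -> list[str]:
    """Everything society can 'afford' at a given wellhead EROI."""
    return [service for threshold, service in HALL_EROI_LADDER if eroi >= threshold]

print(affordable_services(7))
# At EROI 7 the list stops at growing and delivering grain, on these rough numbers.
```

The point of the ladder is exactly the one Hall makes: a positive surplus is not enough; each additional layer of civilization demands a step up in EROI.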

SOURCE






A New Study Shows How Climate Science Could Be All Wrong

What could the theory of “ego depletion” possibly have to do with global warming?

Ego depletion is the idea in psychology that humans have a limited amount of willpower that can be depleted. It’s been largely accepted as true for almost two decades, after two psychologists devised an experiment in self-control that involved fresh-baked cookies and radishes.

One group of test subjects was told they could eat only the radishes; another could eat the cookies. Then both groups were given an unsolvable puzzle. The researchers found that the radish eaters gave up on the puzzle more quickly than the cookie eaters. The conclusion was that the radish eaters had used up their willpower trying not to eat the cookies.

Daniel Engber, writing in Slate, notes that the study has been cited more than 3,000 times, and that in the years after it appeared, its findings “have been borne out again and again in empirical studies. The effect has been recreated in hundreds of different ways, and the underlying concept has been verified via meta-analysis. It’s not some crazy new idea, wobbling on a pile of flimsy data; it’s a sturdy edifice of knowledge, built over many years from solid bricks.”

But, he says, it “could be completely bogus.”

A “massive effort” to recreate “the main effect underlying this work” using 2,000 subjects in two-dozen different labs on several continents found … nothing.

The study, due to be published next month in Perspectives on Psychological Science, “means an entire field of study — and significant portions of certain scientists’ careers — could be resting on a false premise.”

Engber laments that “If something this well-established could fall apart, then what’s next? That’s not just worrying. It’s terrifying.”

Actually, it’s science.

As Thomas Kuhn explained in his 1962 book “The Structure of Scientific Revolutions,” this kind of event is typical in the course of scientific progress.

A “paradigm” takes hold in the scientific community based on early research, which subsequent studies appear to confirm, but which can later collapse as findings that don’t fit the paradigm start to accumulate. Kuhn found several such “paradigm shifts” in history.

The ego depletion findings also come as scientists are starting to realize that much, if not most, of what gets published is essentially bogus because it can’t be reproduced by subsequent studies.

“By some estimates,” notes an article in Quartz, “at least 51% — and as much as 89% — of published papers are based on studies and experiments showing results that cannot be reproduced.”

The Quartz article says one reason is a bias in scientific journals to produce “exciting studies that show strong results.”

“Studies that show strong, positive results get published, while similar studies that come up with no significant effects sit at the bottom of researchers’ drawers.”

So what does any of this have to do with global warming?

Democrats routinely accuse Republicans of being “anti-science” because they tend to be skeptical about claims made by climate scientists — whether it’s about how much man has contributed to global warming, how much warming has actually taken place, or scary predictions of future environmental catastrophes.

There’s a scientific consensus, we’re told, and anyone who doesn’t toe the line is a "denier".

Yet even as deniers get chastised, evidence continues to emerge that pokes holes in some of the basic tenets of climate change.

Evidence such as the fact that actual temperature trends don’t match what climate change computer models say should have happened since the industrial age. Or that satellite measurements haven’t shown warming for two decades. Or that past predictions of more extreme weather have failed to come true.

It is certainly possible then, that today’s climate change paradigm — and all the fear and loathing about CO2 emissions — could one day end up looking as quaint as Ptolemy’s theory of the solar system or Galen’s theory of anatomy.

It’s possible. And anyone who believes in science has to admit that.

SOURCE





Calls for Fracking Bans Ignore Sound Science

Some politicians and environmental activists have been quick to call for blanket bans on hydraulic fracturing under claims that the process is poisoning America’s drinking water. Scientific evidence, from both government agencies and independent analyses, proves otherwise.

For instance, the Environmental Protection Agency’s latest study, released in June 2015 and the most comprehensive government study of fracking’s impact so far, clearly states that “we did not find evidence … [of] widespread, systemic impacts on drinking water resources in the United States.”

The EPA’s analysis is hardly the first study to refute the oft-repeated myth that fracking poses a serious threat to American drinking water. In 2009, the Department of Energy conducted a report that declared fracking “safe and effective.” In 2014, the Department of Energy released another study of the Marcellus Shale that found no evidence of fracking contaminating water supplies.

Again in 2014, the National Academy of Sciences released a report finding that the contamination of water resources in Pennsylvania and Texas was attributable to well leaks, not hydraulic fracturing.

Groundwater aquifers sit thousands of feet above the level at which fracking takes place, and energy companies construct wells with steel surface casings and cement barriers to prevent gas migration. If any leaks or contamination do occur, companies should pay for the economic and environmental damages they cause from such well leaks. But these leaks are not a systemic problem of the industry, much less something that causes widespread polluted water.

Such statements by progressives and environmental activists, manifestly in conflict with actual experience and the science of the issue, pose a serious threat to the vast economic benefits of fracking. Scholars of all stripes agree that fracking is excellent for the economy, providing Americans with jobs, communities near fracking wells with economic booms, and U.S. households with significant energy savings. According to a recent Energy Information Administration report:

Wholesale electricity prices at major trading hubs on a monthly average basis for on-peak hours were down 27 to 37 percent across the nation in 2015 compared with 2014, driven largely by lower natural gas prices.

Prices at the pump are down significantly, too, allowing American families to keep more of their money to use for other purposes. The current average price of regular gasoline is less than $2 per gallon. Many factors contribute to the price of gas, but domestic supply is a key component.

We save money not only through lower energy bills and cheaper gasoline, but through cheaper goods and services, because energy is a necessary component for just about all we do. Lower gas prices also reduce input and transportation costs for businesses around the country, savings that are also passed on to consumers through reduced prices in other sectors of the economy.

Moreover, hydraulic fracturing benefits low-income families most of all, which is why the Wall Street Journal termed fracking “America’s best antipoverty program.” Such an energy revolution should be embraced, not rejected out of hand.

Anti-fracking rhetoric not only conflicts with experience and science, but ignores the effective state-based regulatory system in place. The process has been regulated successfully at the state level for decades.

States have the most to gain when they permit fracking to take place, but also the most to lose if the process is done irresponsibly. The states’ effective regulation underscores the need for members of Congress to prevent duplicative federal intervention that would unnecessarily stall the oil and gas boom and drive up costs for producers and therefore consumers.

Fracking has safely provided a much needed boon to the American economy. Attacking it with unfounded rhetoric is an assault not just on the industry itself, but on American businesses and families who benefit from the influx of domestic natural gas and oil fracking companies supply. Congress should resist the demands of the environmental lobby and put more authority in the hands of the states, not less.

SOURCE





Some fun with a Leftist genius

I don't get a lot of emails or blog comments from Leftists but those I do get are invariably abusive.  Most conservative bloggers have that experience, I gather.  But sometimes the abuse is unintentionally amusing.

I recently got an email from an apparently Australian person named Leigh Williams (willeye1978@gmail.com) who gave his mobile phone no. as 0405205252.  He started his first email with something I certainly believe: "I don't know anything about the specifics of climate change science".  But there was no rational argument or presentation of facts after that.  It was just abuse. So it was solid "ad hominem" abuse.

But here's the funny bit:  What did he accuse me of?  He accused me of "ad hominem abuse"!  That good old Leftist projection cut in good and hard!

He appeared to be upset that I had spoken ill of someone but did not say whom.  Since he mentioned climate however, I imagine he might be referring to my comments on writings by Warmism acolyte  Sarah Perkins-Kirkpatrick.  I put up her university picture and called her "gorgeous" in a subtitle to it.  That is abuse? Calling someone "gorgeous" is abuse?  Leigh Williams is certainly in a mental fog.  But most of the Green/Left seem to be in a permanent mental fog.

In any case, there is no reason why Leftists should have any monopoly on criticizing others.  If you offer facts and arguments in criticism of somebody else's claims, that is a reasonable and routine thing to do.  When you offer no facts and arguments but proceed straight to abuse, that is what is called "ad hominem abuse". And I did offer facts and figures in support of my disagreement with Ms. Perkins-Kirkpatrick. Strictly speaking, an "ad hominem" argument is one where you accept or reject a claim SOLELY because of who made it.  But that was all too deep for the foggy one.

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************



Tuesday, March 29, 2016



Medical researchers embrace open access to research data

Will Warmists follow suit?  Not likely

Warmist "scientists" regularly disgrace themselves by refusing to make publicly available their raw research data.  What was once a routine courtesy has been destroyed by Warmist crookedness.  But the recent uproar in the social and biological sciences about unreplicable findings makes the issue more critical than ever.  It is clear that researchers regularly sift through their data and report only the bits that they like for whatever reason.  And from a statistician's viewpoint the regular practice in the medical literature of reporting only extreme quintiles is simply laughable.  Who knows what relaationships are obscured by throwing away three fifths of your data?  So there is clearly much to be gained by having the analysis of a dataset open to all comers.

And the medical literature is coming on board with that.  Below are two scans from the latest issue of JAMA.  The point of the second scan is that even those evil old drug companies are making their raw research data generally available.  So Warmists are less ethical than drug companies.  Drug companies are a favorite hate-object of the Green/Left, so that contrast would embarrass them if they were real scientists.






A vegan who loves nukes

There is a HUGE rant by Geoff Russell on "New Matilda" about  global warming being caused by farm animals.  It bemuses me to see how many words the Green/Left usually take to make their points and this is an example of that. The article seems to go on forever.  The Green/Left must be boiling with rage to pour out so many bile-filled words.  

And despite all those words absolutely nothing is said about how humans have evolved to be omnivores and that any attempt to take meat off our dinner tables would be so widely and strongly resisted as to make the attempt futile. He seems to think it is only a "conspiracy" that keeps us eating meat.  What a wacko!

He also doesn't question global warming orthodoxy but that is unsurprising. It gives him a hook to hang his vegan crusade on.

That he is actually capable of critical thought is revealed by the second oddity about him.  He likes nuclear power.  That's perfectly rational if you believe in the evils of CO2 and CH4 but is rare on the Green/left.

And speaking of CH4, the usual swipe that Warmists take at farm animals is at their farts, which do have a lot of CH4 in them.  But CH4 intercepts warming in certain wavelengths only and water vapour also absorbs those wavelengths so the theoretical effect of CH4 on global warming translates in practice to a nil effect. So that part of Mr Russell's argument is a washout.

It's amusing, though, that Mr Russell aims primarily at fellow Greenies.  He thinks they are conveniently overlooking  a major source of global warming. Just a few excerpts:


The makers of the US eco-ethical-documentary “Cowspiracy” are attempting to explain why the world’s largest environmental organisations have ignored the role of meat in both climate change and more generally in trashing the planet.

They use the well-worn tactic of simply asking them… or trying to. When it comes to slandering people for buggering the planet, Greenpeace apparently thinks it’s more noble to give than to receive, so they aren’t keen on being asked inconvenient questions.

This doco has lots of Michael Moore moments. People knocking on doors, asking pointed questions and getting sheepish looks. All the big US players get a mention: The Sierra Club, Greenpeace, NRDC, Rainforest Action Network, Amazon Watch, and more.

These groups all love asserting the high moral ground and aren’t used to being questioned about their submersion in a deep trench of cattle excrement.

The inconvenient truth is that none of these environmental icons care enough about their beloved planet to order the vegan option, let alone make the whole menu vegan.

In the case of Greenpeace, their PR people did the old “turn that camera off” shuffle and refused to be interviewed; … priceless!

But after all the fun and games… does Cowspiracy actually explain the inaction of at least the US environmental movement on the meat and dairy industries? Is it really a conspiracy? Is it organised and funded?

US Professor of Nutrition, Marion Nestle blew the whistle years ago with “Food Politics” on how the meat industry stacked and bullied US Government nutritional advice committees.

Cowspiracy lacks Nestle’s academic rigor, but still delivers a few hits.

When asked if the meat and dairy industries donate to environmental organisations, the Animal Agriculture Alliance spokesperson looked like a kid caught with both hands and feet in the cookie jar, and said she couldn’t comment. She refused to answer a direct question about funding Greenpeace.

In Australia, the funding link is clear and a matter of public record. As is the lack of any major campaign against meat by the big green groups (ACF, FOE, AYCC, Greens to name but a few) getting this funding. Tim Flannery is also a recipient of pastoral largess from the bovine broverhood.

Let’s be clear here: different meats have different impacts. It gets tiresome to differentiate constantly, so I’ll do it once now.

Ruminants are the primary climate culprits by way of methane and deforestation, while pigs and chickens primarily pollute air, water and other foods while diverting deforested land from food to feed, while also killing people directly via new diseases (e.g. Swine Flu) while adding to our risk of losing antibiotics.

The cattle barons supporting our big green groups obviously don’t care that their funding is common knowledge. Why? Probably because our mainstream media don’t give a damn. Aussie BBQ culture is at least as strong here as in the US; and don’t forget meat industry advertising.....

Environmental tribalism has our environmental groups automatically anti-GM and anti-nuclear as a matter of ideology. This illustrates a profoundly anti-science bias. They simply don’t get it.

You can’t credibly accept climate science but reject any other science which contradicts your policies. All the science of the last 30 years on the causes of cancer and the mechanism of DNA repair contradict the radiophobia behind green anti-nuclear policy.

When science conflicts with your policy, you may wait a little to make sure the science is solid and well supported, but if it is, then you change your policies. Any high school student can understand this, except perhaps those in AYCC.

When your science is shallow and you don’t really understand the process, you tend to pick and choose what you like. But science isn’t like that.

The human population, even the 9 billion of us expected by 2050, could actually live without doing too much environmental damage if we ate at the bottom of the food chain (vegan) and used nuclear power for all our energy needs.

Energy doesn’t have to have a large adverse footprint on the planet, unless we go with sources having a low power density, like wind, solar and biofuels. It is ironic that our environmental movement has opted for the sources of energy that will have the most impact on wildlife habitat, and therefore biodiversity.

SOURCE





Global warming will dump rain on dry areas – but not in a helpful way (?)

Some unwarranted journalistic enthusiasm below. The Donat study simply showed that there will be more big storms.  It showed nothing about how helpful or useful they would be.  Donat speculated about that but his study had no way of showing it

New research challenges the view that drier areas will get drier with global warming. Climate scientists suggest that as the world warms, dry regions will get more rain. Drought-stricken farmers, rejoice!

Hold the champers. While there will be more rain overall on populated areas, it's unlikely to be useful, and may make life harder for those unused to regular drenchings.

In a study published in Nature Climate Change, climate scientists from Sydney's University of New South Wales and the Massachusetts Institute of Technology looked at 60 years of climate observations and modelled future rainfall.

They found the tropics will receive more rain with climate change – as will arid areas such as western and central Australia, California, central Asia and southwestern Africa.

Most of the rainfall will be tied up in massive storms that could lead to flash floods. Areas used to little rainfall may not be able to handle such deluges.

"The concern with an increased frequency and in particular intensity of extreme precipitation events in areas that are normally dry is that there may not be infrastructure in place to cope with extreme flooding events," lead author Markus Donat says.

And when it's not pouring, the added heat in the atmosphere will lead to more evaporation.

While climate modelling studies have suggested that wet parts of the world will become wetter while dry parts of the world will become drier, this, the authors argue, holds for large-scale simulations over oceans, not land.

To determine how wet and dry parts of the world will fare under climate change, the researchers decided to steer clear of comparing wet places with dry ones and trying to work out the complexities between the two. Instead, they compared like with like.

They took the most extreme rainfall (as in, the most that fell in a day in a year) from similarly dry land in Australia, Asia, Africa and elsewhere from 1951 to 2010. They repeated it with similarly wet regions across the world.

They averaged, separately, the extreme rainfall events across the wetter and drier areas. Over the 60-year period, they saw that the fraction of annual rain that falls on the wettest day of the year matched what's known as the Clausius-Clapeyron rate.

Named after German Rudolf Clausius and Frenchman Benoît Clapeyron, both physicists known for their work in thermodynamics, the Clausius-Clapeyron rate predicts those days of extreme rainfall should increase by 6 to 7% per 1°C of warming.
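As a rough illustration (not from the study itself), the 6 to 7% per degree figure compounds like an interest rate. A minimal sketch in Python, using the 7% upper end as an assumed midpoint for illustration:

```python
# Illustrative compound-growth reading of the Clausius-Clapeyron rate:
# extreme daily rainfall is said to increase by 6-7% per 1 C of warming.
# The 7% rate used here is an assumption chosen for illustration.

def extreme_rain_scaling(warming_c: float, rate_per_degree: float = 0.07) -> float:
    """Multiplicative change in extreme daily rainfall for a given warming."""
    return (1.0 + rate_per_degree) ** warming_c

for dt in (1.0, 2.0, 3.0):
    print(f"{dt:.0f} C of warming -> extreme rainfall x{extreme_rain_scaling(dt):.2f}")
```

So even a modest 2 C of warming would imply extreme daily rainfall roughly 14% heavier, on this simple reading.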

Donat's simulations, using a general climate model, of that period matched the observations. When the modelling was extended into the late 21st century, he saw that rate continue for arid regions.

And while the tropics will receive more rain, Donat admits exactly how much is as yet unclear. This could be because there's simply less historical data from those areas.

William Ingram, a climate scientist at Oxford University, writes in a News and Views article that while the work won't help local meteorologists forecast days of extreme rainfall, it tells us "how risks will change – which is precisely the information needed by emergency planners".

SOURCE





Bees: What happened on Oahu didn’t stay on Oahu

Scientific detective work stopped cholera – now it needs to separate myths, mites and neonics

Paul Driessen

If modern activist groups had held sway in the mid-nineteenth century, countless multitudes would have died from typhoid fever and cholera. The “miasma” paradigm held that the diseases were caused by foul air arising from putrid matter – and only dogged scientific work by William Budd, John Snow and others finally convinced medical and health authorities that the agent was lethal organisms in drinking water.

Ultimately, the investigators’ persistence led to discoveries of Vibrio and Salmonella bacteria, the use of chlorine-based disinfectants for drains, water purification and hand washing, programs that kept sewage away from drinking water supplies, and steady advances in germ and virus theories of medicine.

Parallels exist today, with activist politics driving the science, rather than solid science guiding informed public policy decisions. One such arena is neonicotinoid pesticides and large-scale bee deaths.

Europeans introduced domesticated honeybees to North America in the early 1600s. They helped foster phenomenal growth in important food crops like tomatoes and almonds. Indeed, over 60% of all U.S. beehives are needed each spring just to pollinate California’s extensive almond groves. By contrast, staples like wheat, rice, corn and most citrus fruits do not require animal pollination (by bees, hummingbirds, hover flies, butterflies and bats) at all; these crops are self-pollinating or wind-pollinated.

Commercial beekeeping grew steadily, and today about 1% of all beekeepers manage nearly 80% of the 2.7 million U.S. honeybee colonies. The system generally functioned well until 1987, when a vicious new pest arrived. As the appropriately named Varroa destructor mite spread, beekeepers began reporting major to total losses of bees in Iowa, Michigan and Wisconsin hives in spring 2006, and later in Florida, the Dakotas, southern states, both U.S. coasts, Europe and elsewhere.

Dubbed “colony collapse disorder” (CCD), the problem led to scarifying news stories about a “bee-pocalypse” and the imminent demise of modern agriculture. However, inexplicable bee colony losses had been reported in 1898, 1903, the 1960s and 1970s – even as far back as 940 AD in Ireland!

Explanations included an undefined “disappearing disease,” organophosphate pesticides, cell phone towers, GM crops that embed Bt insect killers in their genetic makeup, climate change (of course), and even a lack of “moral fiber” in bees, Paradigms and Demographics blogspot editor Rich Kozlovich notes. A psychic, he adds, claimed she was communicating with domesticated bees, who told her they were tired of being enslaved by humans and were leaving their hives to protest their crowded, inhumane conditions!

Mounting evidence suggests that today’s die-offs are primarily due to Varroa mites, along with parasitic phorid flies, Nosema fungal parasites, the tobacco ringspot virus – and even beekeepers misusing or over-using pesticides in hives to control disease outbreaks, by killing tiny bugs on little bees.

However, anti-pesticide activists and some news stories continue to blame colony deaths and other bee problems on neonicotinoid insecticides. This new class of chemicals protects crops primarily (97% of the time) by coating seeds, letting plants incorporate the pesticide into their leaves and stems, to target insects that feed on them, without harming beneficial bugs. The regular rotation of different neonic products is also the only means currently available to kill the Asian psyllids that spread “citrus greening disease” (HLB), which is decimating citrus groves in Florida and is now spreading to Texas and California groves.

This is where solid scientific detective work becomes vital. Without it, the wrong conclusions are drawn, the wrong “solutions” are applied, and the unintended consequences can be serious. For example, banning neonics will likely mean farmers are forced to use insecticides that truly are dangerous for bees.

Over the past 50 years, Varroa mites have killed off millions of honeybee colonies around the world, scientists note. Among the diseases the mites carry is deformed wing virus, which results in short, twisted or otherwise deformed and useless wings. Like many other viral infections, DWV had long been present in hives, but was generally considered harmless before Varroa became ubiquitous. Disease-carrying mites bite through the bees’ hard shell (exoskeleton) and inject viruses and infections directly into the bee blood (hemolymph). The mites’ saliva also carries an enzyme that compromises the bees’ immune systems, making the diseases far more toxic. Modern transportation methods disperse the problems far and wide.

Making the beekeepers’ challenge even more daunting, female Varroas often lay eggs in the same hexagonal beehive cells where the queen lays newly fertilized eggs, before worker bees “cap” the incubator cells. New honeybees then emerge with an infected mite already attached. And to top it off:

Trying to kill vicious bugs you can’t even see, in a box filled with some 40,000 buzzing bees that you don’t want to hurt, using chemicals that could easily become toxic – and that the Varroa mites quickly become resistant to – is a devilishly complicated business, beekeepers like Randy Oliver attest. In fact, they are already on their third generation of miticides, and Varroa have become resistant to all of them. So the battle rages on, as pesticide companies again try to gain the upper hand against the crafty pests.

Varroa was discovered on Oahu in August 2007. By spring 2008, 274 of 419 honeybee colonies on Oahu had collapsed, and wild bees had disappeared from its urban areas. Despite quarantine measures, by late 2010 the mite had spread throughout the island of Hawaii. Now even effective Varroa control cannot eradicate DWV, since the disease is in the bees’ hemolymph and is transmitted through feeding and sexual activity.

Studies in the United Kingdom and New Zealand found similar mite, DWV infection and CCD patterns.

Another nasty plague on honeybee houses involves parasitic phorid flies, which have now been found in California, Vermont and South Dakota hives. The flies stab bee abdomens and lay their eggs inside. When they hatch, fly larvae attack the bees’ bodies and brains, disorienting them and causing them to fly in circles and at night – giving rise to stories about zombie bees, or “zombees.” As the larvae mature into new flies, they exit the bees at their necks, decapitating them. Not surprisingly, phorid flies also carry DWV, Nosema parasites and other bee diseases.

Meanwhile, in the real world where bees interact with nature, agriculture and pesticides (rather than with artificial laboratory conditions and egregious over-exposure to those pesticides), multiple studies in Canadian and other countries’ canola and corn fields have concluded that neonicotinoids do not harm bees when used properly. And in equally good news, U.S. Department of Agriculture, StatsCanada, EU and UN data show that bee populations have been increasing over the past several years, with American and Canadian colony totals reaching their highest levels in a decade or more.

And yet, news stories still say neonics threaten domesticated and wild bees with zombee-ism and extinction. That’s partly because anti-pesticide groups are well funded, well organized, sophisticated in public relations, and aided by journalists who are lazy, gullible, believe the activist claims and support their cause, or simply live by the mantra “if it bleeds, it leads.” A phony bee-pocalypse sells papers.

The activists employ Saul Alinsky tactics to achieve political goals by manipulating science. They select and vilify a target. Devise a “scientific study” that predicts a public health disaster. Release it to the media, before honest scientists can analyze and criticize it. Generate “news” stories featuring emotional headlines and public consternation. Develop a Bigger Government “solution,” and intimidate legislators and regulators until they impose it. Pressure manufacturers to stop making and selling the product.

Too often, the campaigns are accompanied by callous attitudes about the unintended consequences. If banning neonics means older, more toxic pesticides kill millions of bees, so be it. If a DDT ban gives environmentalists more power and influence, millions of children and parents dying from malaria might be an acceptable price; at least they won’t be exposed to exaggerated or fabricated risks from DDT.

When activism and politics drive science, both science and society pay dearly. The stakes are too high, for wildlife and people, to let this continue. The perpetrators must be outed and defanged.

Via email






EPA Chief: Climate Regs Meant To Show ‘Leadership’, Not Fight Global Warming

Environmental Protection Agency Administrator Gina McCarthy admitted her agency’s signature regulation aimed at tackling global warming was meant to show “leadership” rather than actually curb projected warming.

McCarthy admitted as much after being questioned by West Virginia Republican Rep. David McKinley, who pressed the EPA chief on why the Obama administration was moving forward with economically-damaging regulations that do nothing for the environment.

“I don’t understand,” McKinley said in a Tuesday hearing. “If it doesn’t have an impact on climate change around the world, why are we subjecting our hard working taxpayers and men and women in the coal fields to something that has no benefit?”

“We see it as having had enormous benefit in showing sort of domestic leadership as well as garnering support around the country for the agreement we reached in Paris,” McCarthy responded.

McKinley was referring to EPA’s so-called Clean Power Plan, which forces states to cut carbon dioxide emissions from coal-fired power plants. The CPP is expected to double the amount of coal plant closings in the coming years, and even EPA admits it won’t have a measurable impact on projected global warming.

EPA has long argued the point of the CPP was to show the world America was serious about tackling global warming in order to galvanize support for United Nations delegates to sign a global agreement to cut emissions. Nearly 200 countries agreed to a U.N. deal last year.

“But even then no one is following us,” McKinley said. “Since that Paris accord China has already announced that they’re going to put up 360 [coal plants]. India has announced that they’re going to double their use of coal since the Paris accord.”

China has made promises to curb its coal use in order to tackle the country’s horrible air pollution problems, but China still plans on using more coal in the future. Likewise, India promised in December to double its coal production by 2020.


EPA, however, has bigger problems than global concern over warming. The Supreme Court forced the agency to stop implementing its rule in February, siding with a coalition of 29 states and state agencies suing to have the CPP thrown out.

SOURCE





Tasmania is on the brink of an entirely avoidable power crisis

Because of Green bribery for "renewable" power from the former Gillard government, Tasmania ran down its big hydro dams.  So the water is not now there when it is needed to cover a drought

Tasmania appears to be on the brink of a crisis, with the island state only weeks away from serious blackouts if there is no significant rainfall.

Techly does not suggest that the seriousness of the issue at hand is down to mismanagement by Tasmanian officials, but simply to a sequence of unforeseen problems.

Multiple sources in Tasmania and the mainland describe the situation as dire.

Tasmania has just two months' supply of water to feed its hydroelectric dams, unless there is significant rainfall. Energy storage, or the level of water available to generate hydro-power, is at historic lows. Rainfall into catchment areas in the past 12 months has been around one-third of projected rainfall, based on thirty-year modelling. Without hydropower, Tasmania’s energy demands at normal peaks far exceed current generation.

Dam levels were reduced during the carbon tax era, when hydroelectric or carbon-neutral power generation was extremely valuable. Hydro Tasmania, the body that maintains and runs a series of 55 major dams and 30 hydropower stations, was very profitable during this time, as it drained water for great revenues.

Indeed, in the quirks of the carbon tax arrangements, the sale of renewable energy certificates or RECs accounted for more than 70 per cent of revenue inflows. (It is not suggested that reducing dam levels during this time was malfeasant.)

Then there is Basslink. Tasmania is supplied with both power and data connections via the Basslink submarine cable. That cable is no small matter – it runs for 370 kilometres undersea, it is rated to 500MW and cost over half a billion dollars to install between 2003-06, including testing and commissioning.

However, on 21 December 2015, it was announced that Basslink was disconnected due to a faulty interconnector. Given that the cable is underwater, and the fault was located approximately 100 kilometres off the Tasmanian coast, the controlling body, also called Basslink, first announced that it would be repaired and returned to service by 19 March 2016.

That date has since fallen into the abyss as more than 100 experts, including 16 or more from Italy, plus a specialist ship, try to fix the cable. Basslink advised on March 13th that the cable would be fixed by late May.

Normally, a Basslink outage isn’t a big deal. The mainland has to adjust how it distributes power across the Eastern Seaboard, and given the cable supplies an absolute peak of 500MW, it doesn’t shoulder the entire load, but provides greater flexibility for operators, and reduces the average cost of power. It also helps to balance peak and off-peak loads across the grid.

Additional power from non-renewables in Tasmania includes three significant gas turbine and thermal power stations which provide 535 MW of power at full capacity.

But Tasmania has far more hydroelectric power – more than 2300MW of hydropower at full capacity.
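Putting the article's capacity figures side by side shows why losing hydro is so serious. A back-of-envelope sketch (nameplate ratings only, not actual output; Basslink is assumed repaired and running at its full 500 MW rating, which is optimistic):

```python
# Back-of-envelope comparison of the capacity figures quoted in the
# article. These are nameplate capacities; real demand and real
# output are not modelled here.
hydro_mw = 2300       # hydroelectric capacity at full dams
gas_thermal_mw = 535  # gas turbine and thermal stations
basslink_mw = 500     # submarine interconnector peak rating

non_hydro_mw = gas_thermal_mw + basslink_mw
shortfall_mw = hydro_mw - non_hydro_mw
print(f"Non-hydro supply even with Basslink working: {non_hydro_mw} MW")
print(f"Hydro capacity it would have to replace: {hydro_mw} MW "
      f"(gap of {shortfall_mw} MW)")
```

Even on these generous assumptions, everything except hydro covers less than half of what the dams can supply.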

Techly understands that if Basslink can’t be fixed for an economic cost, it may not be fixed at all, depending on the assessments currently underway.

SOURCE

***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************



Monday, March 28, 2016


The Next Great Global Warming 'Hiatus' Is Coming!

A strange argument from astrophysicist Ethan Siegel below.  He admits the recent influence of El Nino and says that recent temperature upticks therefore prove nothing by themselves.  But he adds that the long term trend measured over a century or more is upwards and that THAT is what we should worry about.

He says "But the fact that the global average temperature is rising — and that it continues to rise — is a real long-term problem facing the entire world".

But how does he know that?  There has been NO steady rise in temperature in C20.  All we can factually say about the 20th century temperature record is that there were some periods of warming and other periods of cooling or stasis.  Whether we will have more periods of warming is completely unknown.  We could have more periods of cooling that will undo past warming.  Nobody knows.  He has a sort of paranoid certainty about him that is supported by no data

UPDATE: I must say that I found Siegel's reasoning fascinating -- fascinating in a psychiatric sort of way. I suspect that he may be manic. The file picture of himself that he puts up would be unusual in a well person.



But I will do him the courtesy of taking him seriously and will add a few more comments.

You can certainly put a rising trend line through the C20 temperature record and it appears that Siegel has taken that for reality. It isn't.  It is a statistical artifact only.  And it does not describe the data well.  Perhaps the most striking feature in the data is the long stasis between 1945 and 1975 -- 30 years, almost a third of the record.  That totally busts any tale of continually rising temperatures.  It makes the C21 "hiatus" of nearly 20 years look ephemeral in comparison.

Warmists of course dismiss the C21 stasis as in some way not meaningful.  But how long does it have to be for it to be meaningful?  Warmists don't say these days but when the C21 stasis was young, they  said it would have to be 15 to 17 years long to represent anything.  Now that it has exceeded that mark they no longer put a number on the matter, which reveals their argument as unscientific.  If they had any basis for accepting or rejecting meaningfulness, they would be able to put a number on it.

So by that logic the 30-year stasis tells us nothing either.  But it does tell us something.  It tells us that the temperatures are unpredictable.  It tells us that we do NOT know what the future may bring.  But since we are at the end of a warm interglacial, my bet would be on future cooling


Global warming has been occurring at a steady rate for many decades now — possibly for over a century, depending on how you interpret the temperature records — with the past few years setting unprecedented temperature records around the globe. If you go back to 1948-49, the earliest time we’ve had global temperature maps for the entire world, you’ll find that over the vast majority of the Earth, there are more locations seeing the warmest temperatures right now than at any other time. But in terms of “cause for alarm,” what does this actually mean?

The first thing we have to realize is that there are two things at play here: long-term trends, which is the gradual warming we’re seeing over generational timescales, and short-term variations, which are due to things like the seasons, volcanic eruptions, and weather events like El Niño and La Niña. The record-breaking temperatures we’re seeing across the globe are due to a combination of all the short-term and long-term variations superimposed atop one another, and so although last month — February of 2016 — was the hottest month ever recorded, that isn’t necessarily a reason to freak out.

You see, we’re currently experiencing an El Niño event. If you take a look back through the temperature record, many of the largest upward “spikes” you see are due to El Niño years, such as the famous one in 1998. In fact, if you take a look at global average temperatures throughout Februaries, we haven’t had one warmer than the one in 1998 until now.

This peak in temperatures that we’re seeing now, the one that spans from 2015-2016, isn’t due to global warming. That is to say, most of the anomalously high temperatures we’re seeing are due to these short-term variations. But what should be far more concerning to anyone who wants to know the truth about climate change is this: the long-term rise in temperatures is continuing at a steady rate. The fact that temperatures appear to be rising at a rate of between  0.40-0.80 °C (0.72-1.44 °F) per century, unabated, is the real cause for concern. That’s what global warming really is, the slow, long-term rise in temperatures. That’s also the component that humans — through emissions reduction, energy efficiency, renewable power, policy changes and (possibly) geoengineering — can do something about.

But there’s an insidious argument that’s going to come up over the coming years (and possibly the next decade or two), once the current spike in temperature subsides: the idea that global warming will have stopped. Global warming doesn’t just stop. It won’t stop unless there’s a causative reason for it to stop, and — at present — there isn’t one. But because the long-term rise (i.e., the “global warming” component) is gradual, and the short-term variations (i.e., the fluctuations above and below the trend-line) are large, it’s going to appear, over 13-to-17 year timescales, that global warming has ceased.

This is because the long-term rise can be easily masked by short-term variations, and the Berkeley Earth Surface Temperature (BEST) study — the one conducted by global warming skeptics that reached the same conclusions as the rest of the climate science community — reached the following conclusion:

"Some people draw a line segment covering the period 1998 to 2010 and argue that we confirm no temperature change in that period. However, if you did that same exercise back in 1995, and drew a horizontal line through the data for 1980 to 1995, you might have falsely concluded that global warming had stopped back then. This exercise simply shows that the decadal fluctuations are too large to allow us to make decisive conclusions about long term trends based on close examination of periods as short as 13 to 15 years"

There are prominent climatologists who have made these arguments before (who will likely make these arguments again), and they will be quoted in a great many news outlets and by numerous science writers. If you see an article that cites one of them claiming global warming has stopped and it isn’t yet 2033, the 17 years from now that we’re required to wait to see if the rise continues, please refer them back to this article.

Temperature spikes, like the one we’re experiencing now, are temporary, and in all honesty are part of the normal variations we experience over the short term. But the fact that the global average temperature is rising — and that it continues to rise — is a real long-term problem facing the entire world. Don’t let dishonest arguments that gloss over the actual issue dissuade you from the scientific facts. We can fool ourselves into believing that there isn’t a problem until it’s too late to do anything about it, or we can own up to what the science tells us, and face this problem with the full force of human ingenuity. The choice is ours.

SOURCE  



Fracking, methane and Bill McKibben

Well-known Warmist preacher Bill McKibben has an article out under the heading: "Global Warming’s Terrifying New Chemistry".  He is easily terrified in the hope that we will be too. Like a lot of articles in Leftist publications, it is VERY long-winded.  I sometimes wonder about that.  If they had purely factual statements to make there would surely be no more than a few paragraphs needed.  Readers will note that my posts are very short.  If you know what you are talking about, it doesn't take long to say it.

Anyway, I will not attempt to reproduce any of the huge rant concerned.  The point of the article can indeed be presented with great brevity.  McKibben says that fracking releases methane into the atmosphere and that methane there will soon fry us with global warming.  So he wants to stop fracking!

Such a simple story and so wrong.  It's probably true that atmospheric methane levels have increased as a result of leaks from fracking but does that matter?

No.  It is true that methane can absorb heat at certain wavelengths of infrared radiation.  And molecule for molecule, it absorbs a lot more heat than does CO2.

Warmists normally stop the discussion there.  But the atmosphere is a complex thing and we have to look at methane in the context of what normally goes on in the whole atmosphere. And it so happens that water vapour absorbs the same wavelengths that methane does.  And there is a heck of a lot more water vapour in the atmosphere than methane.  So the water vapour will already have intercepted most or all of the wavelengths that methane might -- leaving no heating effect due to methane. The effects of CH4 are completely masked by H2O.  So methane is a POTENTIAL warming gas but not an ACTUAL one.  No foreseeable increase in methane would generate any increase in warming.

Isn't it strange that in his long article Bill McKibben found no space to discuss that matter?  Just another climate crook.





Coral Reefs Bounce Back Despite Warming Of Oceans

This study is one of many to find that corals are very resilient

Coral reefs have managed to bounce back, despite being under constant threat of extinction. However, marine scientists caution these fragile ecosystems are still being threatened by global warming, pollution and human activity.

The discovery of a large number of coral reefs in excellent health has been quite a joyous occasion for the researchers who routinely deal with ominous news like mass die-offs, worldwide bleaching events, oil spills, and such other calamities which have been pushing the coral reefs towards extinction, reported The Washington Post.

A decade-long study of remote islands in the Central Pacific has indicated that these coral reefs might survive despite threats posed by global warming brought on by climate change and warming of the oceans due to increasing amounts of carbon dioxide introduced by burning of fossil fuels.

In a large scale study covering 56 islands, researchers studied 450 locations that were once teeming with coral reefs. Researchers looked at regions spanning from Hawaii to American Samoa. They even investigated locations in the remote Line and Phoenix Islands as well as the Mariana Archipelago. To their surprise, they realized there are quite a few locations where coral reefs have defied the odds and bounced back to life. The team’s report was published recently in the journal Proceedings of the Royal Society B.

The researchers wanted to investigate the impact of climate change as well as a 1998 El Nino event that led to widespread bleaching. Since 1998, coral reefs had been increasingly banishing the symbiotic algae that gave them their brilliant colors and welcoming seaweed, which encroaches on the real estate once occupied by the corals. Study leader Jennifer Smith, a professor at Scripps’ Center for Marine Biodiversity and Conservation, said the following.

    “After a bleaching event, it really matters what happens to all those dead skeletons. Do they get colonized by big seaweeds, or do they get covered by coralline algae, which are providing settlements for baby corals and providing an environment that facilitates recovery.”

The majority of the reefs that have shown signs of regaining their structure are located near far-flung islands. They are significantly healthier than the reefs near islands that are heavily populated and frequented by humans. In other words, human influence, coupled with the coral reef bleaching event — fueled in part by El Niño-driven ocean warming — has had its detrimental effect on the delicate undersea ecosystem. Such was the impact that scientists had painted a very gloomy picture, stating that up to 70 percent of coral reefs would vanish before 2050.

It now appears the fear that these reefs were on their way to extinction has been largely alleviated. The coral reefs that have clearly bounced back strongly indicate that such features won’t fade from existence in the coming decades, as previously feared. Speaking about the discovery of such healthy coral reefs, Smith explained its significance for the researchers.

    “There are still coral reefs on this planet that are incredibly healthy and probably look the way they did 1,000 years ago. The scientists were practically in tears when we saw some of these reefs. We’ve never experienced anything like it in our lives. It was an almost religious experience.”

Smith seems justifiably euphoric because, just like other environmental scientists, coral-reef researchers have been dealing with dying and degraded ecosystems, which can be a traumatic and rather depressing experience. However, the sight that greeted the researchers is certainly a breath of fresh air, continued Smith.

    “It’s hard to fathom. I would jump into the water and there would be so much coral, so many different species of fish, so much complexity and color. I would find myself underwater, shaking my head, looking around in disbelief that these places still existed.”

Though coral reefs occupy less than 0.1 percent of the ocean floor, they shelter close to 25 percent of all marine species, reports Los Angeles Times. Besides helping oceanic life, coral reefs also offer food, tourism and flood protection to human settlements along the coastline.

SOURCE  






New Survey Casts More Doubt On The ‘97% Consensus’ On Global Warming

A recent survey conducted by George Mason University of more than 4,000 American Meteorological Society (AMS) members found about one-third of them don’t agree with the so-called global warming “consensus” that humans are the cause of most recent warming.

The GMU survey of AMS members found “14% think the change is caused more or less equally by human activity and natural events; and 7% think the change is caused mostly by natural events.”

“Conversely, 5% think the change is caused largely or entirely by natural events, 6% say they don’t know, and 1% think climate change isn’t happening,” according to the GMU poll.

“Fully 33% either believe climate change is not occurring, is mostly natural, or is at most half-natural and half-manmade (I tend toward that last category) … or simply think we ‘don’t know,’” Dr. Roy Spencer, a climate scientist who compiles satellite-derived temperature data at the University of Alabama in Huntsville, wrote in his blog. “For something that is supposed to be ‘settled science’, I find that rather remarkable,” wrote Spencer, who is a prominent skeptic of claims of catastrophic man-made global warming.

GMU found that 29 percent of AMS members thought global warming was “largely or entirely” caused by humans and another 38 percent believe warming is “mostly” due to humans. It should be noted, however, that only 37 percent of AMS respondents considered themselves climate “experts.”

“But what I find interesting is that the supposed 97% consensus on climate change (which we know is bogus anyway) turns into only 67% when we consider the number of people who believe climate change is mostly or entirely caused by humans,” Spencer wrote.

Spencer is referring to claims from politicians and environmentalists that 97 percent of climate scientists think humans are causing global warming.

“Ninety-seven percent of scientists, including, by the way, some who originally disputed the data, have now put that to rest,” President Barack Obama said in 2013. “They’ve acknowledged the planet is warming and human activity is contributing to it.”

The 97 percent figure has largely been cited by activists looking to squash public debates about climate science. The figure is based on a now-debunked 2013 study by Australian researcher John Cook.

“Our analysis indicates that the number of papers rejecting the consensus on [anthropogenic global warming] is a vanishingly small proportion of the published research,’’ Cook and his fellow authors wrote in their study, which was published in the journal Environmental Research Letters.

But the definition Cook used to get his consensus was over-simplified. Only 41 out of the 11,944 published climate studies examined by Cook explicitly stated that mankind caused most of the warming since 1950 — meaning the actual consensus is 0.3 percent.
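The 0.3 percent figure is simply the share of explicitly attributing papers among all papers examined; recomputing it from the numbers quoted above:

```python
explicit_attribution = 41  # papers explicitly stating humans caused most warming since 1950
total_examined = 11944     # climate studies examined in Cook's 2013 survey

share = 100 * explicit_attribution / total_examined
print(round(share, 1))  # 0.3
```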

“It is astonishing that any journal could have published a paper claiming a 97% climate consensus when on the authors’ own analysis the true consensus was well below 1%,” Dr. David Legates, a geology professor at the University of Delaware, said about a study he and four other prominent researchers authored debunking Cook’s consensus claim.

The new AMS survey does show that most of the science group’s members believe global warming “is happening,” according to the GMU poll. The disagreement is over the driving force behind global warming: is it mostly caused by humans, or mostly due to natural variability?

SOURCE  





Scientists Say Obama’s Global Warming Plan Will Fail

The efforts of President Barack Obama and other world leaders to prevent global warming will almost certainly fail, according to a new study published recently by Texas A&M scientists.

“It would require rates of change in our energy infrastructure and energy mix that have never happened in world history and that are extremely unlikely to be achieved,” Glenn Jones, a professor of marine sciences at Texas A&M who co-authored the study, said in a Wednesday statement on Science Daily. “For a world that wants to fight climate change, the numbers just don’t add up to do it.”

The study modeled projected population growth and per capita energy consumption, as well as the size of known reserves of oil, coal and natural gas, and greenhouse gas emissions. It determined that it would be essentially impossible to meet the goal, set by the December Paris agreement, of limiting global warming to 2 degrees Celsius by 2100.

“The latest study just adds to what everyone other than those with their heads in the clouds already knows: the combination of a growing demand for energy and a growing population will lead to continued growth in the most practical form of energy production—one reliant on fossil fuels,” Chip Knappenberger, a climate scientist at the libertarian Cato Institute, told The Daily Caller News Foundation. “Unless a technological breakthrough in non-carbon emitting energy production occurs in the very near future, the global production of energy and the global emissions of carbon dioxide will stay pretty tightly coupled for the remainder of the century.”

Significant reductions to carbon dioxide (CO2) emissions are extremely difficult to achieve due to the immense costs involved, according to the scientists. They estimate that simply limiting global warming to the Paris agreement targets would require the annual installation of 485,000 wind turbines by 2028. Only 13,000 turbines were installed in 2015, despite the enormous tax breaks and subsidies offered to wind power.
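The scale mismatch the scientists point to can be checked directly: the required annual build-out is roughly 37 times the actual 2015 installation rate (figures as quoted above):

```python
required_per_year = 485_000  # turbines/year the study says would be needed by 2028
installed_2015 = 13_000      # turbines actually installed worldwide in 2015

# How many times faster than 2015 the build-out would need to run
print(round(required_per_year / installed_2015, 1))  # 37.3
```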

“The costs of reducing emissions are enormous, while the reductions in atmospheric concentrations of greenhouse gases are non-existent,” Myron Ebell, director of the Center for Energy and Environment at the free market Competitive Enterprise Institute, told The Daily Caller News Foundation. “It is nice to see scientists in the alarmist community realizing what has been obvious for decades.”

The likely costs of the kind of wind and solar power program the scientists say would be necessary to actually slow global warming would be measured in the tens of trillions of dollars, and even then success would be far from assured. The scientists conclude that other methods of reducing CO2 emissions, such as significantly increasing the number of nuclear reactors, would run into political opposition from environmental groups.

“Current efforts, like US EPA regulations or the UN’s Paris Agreement may chip away at the tightness of the gross world product/global CO2 emissions relationship but, they probably won’t be successful in breaking it so long as they are relying on current technologies (with perhaps the exception of a rapid build-out of nuclear power plants—something that doesn’t seem to be in the cards),” Knappenberger concluded.

The study’s conclusions are mirrored by Environmental Protection Agency (EPA) Administrator Gina McCarthy’s comments during a Tuesday hearing that the Clean Power Plan (CPP), her agency’s signature regulation aimed at tackling global warming, was meant to show “leadership” rather than actually prevent projected warming.

The EPA has long argued that the point of the Clean Power Plan was to show the world America was serious about tackling global warming, in order to galvanize support for United Nations delegates to sign a global agreement to cut emissions. Nearly 200 countries agreed to a U.N. deal last year.

SOURCE  





Shades of "Smart Growth" in Australia: Busybodies want to limit other people's choices in apartment sizes

A small, low-cost inner-city "pied-à-terre" might be just what is needed for someone who works in the city during the week but who spends the weekend at a pleasant rural property.  Many men work away from their families during the week.  My father did.

THEY’VE been labelled “crappy” and “dog boxes in the sky”, apartments so small and badly designed there’s barely enough room to swing a cat — let alone a pooch.

There’s no space for luxuries like, you know, a dining room table, while some rooms don’t even sport windows.

The tiniest units in Australian cities are so small they would be illegal in crowded Hong Kong and New York.

But far from being spurned, compact flats are being heralded by some as the solution to the growing demand for city living.

However, there are moves afoot to clamp down on so-called “micro apartments” with calls for a minimum size for flats to stop developers squeezing more people into ever smaller spaces.

Earlier this month, Melbourne Lord Mayor Robert Doyle criticised developers who were sacrificing design for density.

“I am pro-development but some of the developments that have been put before us are shameful”, he told the Urban Development Institute in Adelaide.

Talking to news.com.au he reeled off a list of developer requests he was outraged by, including windows separated from the rooms they were supposed to illuminate by a corridor so long it was “like something out of Alice in Wonderland”, glass walls whose role it was to filter light into windowless bedrooms but actually created “little caves”, and fridge doors that couldn’t open because of the cramped space.

There was even the builder who created a micro apartment without a kitchen with the reason that it would be ideal for someone who enjoyed eating out.

A critic of unchecked development, Mr Doyle said good design needed to be at the centre of new apartments to prevent “building the slums of tomorrow”.

Yet, for 24-year-old public relations consultant Elena Eckhardt, her tiny Sydney apartment, which she shares with her partner, is a bijou beauty.

“The apartment has a double bedroom, bathroom, laundry, joint kitchen and living room and balcony,” she told news.com.au.

“Despite it being so small I’ve decorated it so it feels very personal.”

At 48sq m her flat is skirting the regulations in NSW, known as SEPP 65, that set a minimum apartment size. One-bedroom units can be no smaller than 50sq m, but studio apartments can go down to a super snug 35sq m.

Ms Eckhardt’s bedroom is partially separate, with openings in the wall letting in some natural light “borrowed” from the living room, which has large windows.

“It’s the smallest place I’ve lived,” she said of the unit in the city fringe suburb of Chippendale. “We wouldn’t be able to afford a big apartment in the CBD so I do definitely like being here at this stage in our lives.”

Ms Eckhardt said she could walk to work and any number of pubs and shops were in the local area. The couple are out most nights, so see the flat as less a place to linger and more somewhere to bed down in.

Nevertheless, they’ve had to make compromises. “We decided not to have a kitchen table because it’s too cluttered, so we only have a table on the balcony and eat there or on the couch.”

“But having a separate bedroom was really important because there is two of us so it doesn’t feel like we’re sharing one room.”

Ms Eckhardt’s 48sq m are an indulgence of open space compared to an apartment advertised for rent in Melbourne CBD that was just 20sq m, or roughly the size of two car parking spots, the Age reported.

In Victoria, unlike NSW, there is no minimum apartment size. In the Victorian Government’s ‘Better Apartments’ consultation, Planning Minister Richard Wynne raised the prospect of a new apartment code which could see minimum sizes alongside a raft of other measures around natural light, noise and outdoor space.

The consultation found daylight and space were the top concerns for apartment dwellers with 76 per cent of respondents calling for a minimum apartment size.

SOURCE


***************************************

For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here

*****************************************