Wednesday, July 29, 2015



Regulations, mostly Green, are destroying America's future

The Jetsons, a sci-fi fantasy family of the 1960s, lived in impossible luxury in 2062. Not only are we not nearing their living standards, we are going in the opposite direction. This column has wondered why, and a recent Supreme Court case has given us the answer. In one of its few good decisions of the recent term, the Court by a slim 5-4 vote struck down an EPA regulation whose costs were at least 1,000 times its benefits. It’s good to know imposing costs of 1,000 times the benefits is a no-no, but the myriad regulations where costs are only 100 times, 10 times or even twice the benefits pervade the entire U.S. economy – and are leading us towards the Flintstones rather than the Jetsons.

The case, Michigan vs. EPA, concerned an EPA decision to regulate power plants directly, beyond the general requirements of the Clean Air Act of 1970, if it determined that emissions (primarily mercury) from those plants posed a significant risk to public health. Thus the figure of $90 billion for benefits, bandied about in the media by friends of the EPA, included all the benefits from mercury regulation under the Clean Air Act – a dubious figure even under that definition, but of no relevance whatever to the further regulations proposed by the EPA. The new regulations, according to the Supreme Court ruling, imposed costs of $9-10 billion on the electric utility industry, while achieving benefits of $4-6 million – that’s million with an m. The cost/benefit ratio was thus in the region of 2,000 to 1.
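For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python, using only the dollar figures quoted above:

    # Cost/benefit arithmetic for the regulation struck down in Michigan vs. EPA:
    # $9-10 billion in costs against $4-6 million in benefits.
    cost_low, cost_high = 9e9, 10e9
    benefit_low, benefit_high = 4e6, 6e6

    print(f"best case:  {cost_low / benefit_high:,.0f} to 1")   # 1,500 to 1
    print(f"worst case: {cost_high / benefit_low:,.0f} to 1")   # 2,500 to 1
    print(f"midpoints:  {9.5e9 / 5e6:,.0f} to 1")               # 1,900 to 1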

Maddeningly, this Supreme Court ruling will in itself provide no benefit to the U.S. economy. Coming as it does three years after the regulations were imposed, it arrives only after most of the $9-10 billion of costs have been incurred, as utilities across the country have closed power plants in response to the EPA regulation.

Friends of regulation will no doubt claim that this was a rogue outlier, or (as many of the mainstream media have done) that the true benefits of the rogue regulation were a huge multiple of those claimed in the Supreme Court ruling. Both claims are implausible. The higher figure for benefits could be arrived at only by including the provisions of the Clean Air Act itself, and is in any case highly likely to be spurious if examined closely. (A quick calculation: $90 billion claimed benefit divided by 11,000 claimed lives saved gives a value of $8.2 million per life, three or four times the value assumed in any reasonable actuarial calculation.)
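The parenthetical calculation is easy to reproduce, using the figures quoted above:

    # $90 billion in claimed benefits spread over 11,000 claimed lives saved.
    claimed_benefit = 90e9
    claimed_lives_saved = 11_000
    print(f"${claimed_benefit / claimed_lives_saved / 1e6:.1f} million per life")  # $8.2 million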

The claim that the 1,000 to 1 cost/benefit ratio of this particular regulation is a rogue outlier is statistically highly implausible. Yes, it’s likely that the ratio was at the extreme of cost/benefit ratios produced by regulations as a whole, if only because 1,000 to 1 is a very rare cost/benefit ratio for anything. But it is vanishingly unlikely that the 1,000 to 1 regulation is one plucked from a population of regulations the rest of which are close to 1 to 1 or even show a net benefit. Were that the case, the 1,000 to 1 cost/benefit ratio would be 25 or 50 standard deviations from the mean of all regulatory cost/benefit ratios, a deviation that occurs once in the life of a million universes.

Statistically, it is much more plausible that the 1,000 to 1 cost/benefit ratio is only 3 or 4 standard deviations from the mean, and the population of regulations as a whole is full of 100 to 1, 200 to 1, 50 to 1, 10 to 1 and even 2 to 1 cost-benefit ratios. In other words, the entire population of regulations from the EPA (and we have no reason to believe the EPA to be especially egregious among government regulators) is likely to have costs a substantial multiple of its benefits.
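To make the standard-deviation argument concrete, here is a small illustrative sketch. The assumption that cost/benefit ratios are roughly normal on a log10 scale, and the example means and spreads, are mine, chosen purely to show the shape of the argument:

    import math

    observed = math.log10(1000)   # the 1,000-to-1 regulation = 3.0 in log10 units

    # If regulations really clustered near 1:1 (log10 mean 0) with a spread of
    # 0.1 in log10 units, the observed rule would sit 30 SDs from the mean:
    print(f"{observed / 0.1:.0f} standard deviations -- effectively impossible")

    # If instead 1,000:1 is only 3 SDs above a mean of, say, 10:1 (log10 = 1),
    # the implied spread leaves plenty of regulations deep in negative territory:
    mean, sd = 1.0, (observed - 1.0) / 3
    for z in (0, 1, 2):
        print(f"{z} SDs above the mean: {10 ** (mean + z * sd):,.0f} to 1")
    # -> 10 to 1, ~46 to 1, ~215 to 1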

When you look at the incredible density of regulations inflicted on the U.S. economy since around 1970, and more particularly since 2009, it’s clear that they should have a major economic effect. As this column has pointed out before, from the productivity statistics, the effect itself is clear, even if the causal link isn’t. The average annual rate of labor productivity growth in the United States from 1947 to 1972 was 2.88%. From 1973 to 2010 it declined by around a third, to 1.98%. Since 2011, the productivity growth rate has fallen still further, to 0.51% annually from the fourth quarter of 2010 to the first quarter of 2015. If average productivity growth had been maintained since 1973 at the rate obtaining before 1973, we would today be 54% richer. The United States would be richer than Singapore, rather than having fallen to a level one third below Singapore’s per capita wealth.
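The compounding behind the "54% richer" figure can be reconstructed as follows (the exact span, 1973 through the first quarter of 2015, is my assumption about the author's arithmetic):

    # Actual path: 1.98% a year for 1973-2010, then 0.51% a year to Q1 2015.
    actual = 1.0198 ** 38 * 1.0051 ** 4.25
    # Counterfactual: the pre-1973 rate of 2.88% maintained over the same span.
    counterfactual = 1.0288 ** (38 + 4.25)
    print(f"{counterfactual / actual - 1:.0%} richer")   # ~54%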

This is not especially an anti-environmentalist point. The personnel restrictions generated by the Occupational Safety and Health Act of 1970 (another of Richard Nixon’s less stellar moments) and the various anti-discrimination acts generate huge costs, partly for employers attempting desperately to avoid the flood of frivolous lawsuits the legislation has generated. The licensing requirements of the FDA add enormously to the cost of developing new drugs, making the United States’ pharmacopeia the costliest in the world.

The CAFE fuel economy restrictions on automobiles have come close to destroying the U.S. automobile industry, by far the world leader in 1970. The recent restrictions on financial services appear to be generating mostly gigantic fines for trivial offenses such as manipulating LIBOR by a basis point or so. They have effectively closed the financial sector to new entrants, while in the long run enormously raising the cost of financial transactions. Even trivial tech improvements such as Uber are banned from various cities by their local governments acting in concert with taxicab companies. Finally, there is the disaster that is U.S. healthcare, more expensive than anywhere else in the world, and always liable to zap ordinary citizens with outrageously padded medical bills, which they have no hope of paying. And so the list goes on.

Further clear evidence of the recent intensification in regulation, and its pernicious effects, is the decline in U.S. entrepreneurship since 2008. In recent years, the exit rate of new firms has exceeded the entry rate, something never seen before in the postwar economy. Part of this can be blamed on the Fed, whose ultra-low interest-rate policies stifle saving and thereby prevent many smaller new businesses from getting started. But there can be no doubt that the plethora of modern regulation plays at least an equally important role.

The left invented Gross Domestic Product, so they could include all government activities, however wasteful and even damaging, in national output figures, as though they were truly productive. This statistical legerdemain flattered historical periods such as the middle 1930s and the 1960s and early 1970s, when the U.S. government was increasing rapidly in size. Now they want to move away from GDP towards a measure of output that includes such things as cleaner air and water, and other measures that are merely evidence of compliance with left-devised regulations rather than anything tangibly benefiting the populace as a whole.

The objective of this will be to move further towards the regulatory state, impoverishing ordinary citizens and causing immense economic misery, while being able to claim that their new “Gross National Happiness” index is increasing at a rapid rate and that all is for the best in the best of all possible worlds. Global Warming legislation, ideally on a global scale where democratic forces are impotent, is likely to be a key element in the move to the ultimate regulatory state in which all economic activity is controlled by Platonic Guardians – and non-Party members lead a miserable existence. Curiously, this was very much the Soviet dream, and was set out powerfully in George Orwell’s 1984. The success of regulation in the U.S. since 1970 and its effect on the overall economy indicate clearly that the dream never dies – and for the rest of us the nightmare lives on too.

The Commissar wears many hats – and if he comes in the form of a kindly environmental regulator, concerned about the level of mercury in the drinking water, he should be resisted as fiercely as if he bore a hammer and sickle.

SOURCE





New Little Ice Age Started: Climate Change with a Difference

by Professor Cliff Ollier

In recent decades we have been overwhelmed by books on Global Warming and its successor Climate Change. We have also been exposed to a substantial (though much smaller) number of books that take a skeptical view of these issues.

Here is a book with something new in the Climate Change debate: 'A New Little Ice Age Has Started: How to survive and prosper during the next 50 difficult years.' [1]

This book goes beyond global warming and the usual arguments against it. It does not deal with the details of carbon dioxide as a greenhouse gas, simply noting that its amount has gone up in the past 60 years from about 350 to 400 ppm, while temperatures have not risen for the past 18 years. Clearly there is no correlation. Instead the arguments are assembled to show that a new ice age is upon us.

On the scientific side he gets into the role of the alignment of planets affecting gravity, cosmic rays (the link between solar flares and climate), and the relationship between volcanoes and climate (big eruptions cause cooling).

But this book is for the layman, so he does not use masses of facts and statistics, but rather anecdotal evidence. Instead of using satellite measurements to show the growing Greenland ice cap he recounts that a plane lost in World War II was discovered in 1989 under 87m of ice.

He goes on to show the fallacious science that has been used to blind the public to the reality, with discussion of the role of Climategate where climate scientists exchanged cynical e-mails discussing their fraud and manipulation very openly.

Lawrence Pierce describes the work of the IPCC (Intergovernmental Panel on Climate Change), which publishes its political Summaries for Policymakers months before the actual scientific reports. The IPCC claims to use first-class data but in fact relies on all kinds of non-refereed reports from green agencies such as Greenpeace instead of scientific evidence.

Pierce has a few words to say on the disgraced ex-chairman of the IPCC, Dr. Pachauri, Al Gore’s misleading propaganda film, and Michael Mann’s infamous hockey-stick. Why does he do this? It turns out that the author is an ex-lawyer who retired to grow grapes in British Columbia.

But the weather didn’t warm as he had been promised and the business failed. So he started his own investigation. Of course he found the pause in global warming. But more than this he found a completely different story. Carbon dioxide was barely a player, and the thing that has the best correlation with climate is the sunspot cycle. He describes the cycle using good diagrams and tables, and recounts the climatic history of the past few hundred years, with the Mediaeval Warm Period and the subsequent Little Ice Age.

As an aside Lawrence Pierce gives an account of Mann’s famous ‘hockey stick’ graph showing ever-accelerating temperature increase (a one-time logo for the alarmists), the construction of which required the elimination of both the Medieval Warming and the Little Ice Age – which are incontrovertible facts. He describes the cold periods of the past as starting roughly as follows: the Oort (1000), Wolf (1250), Spörer (1400), Maunder (1645) and Dalton (1780) – all related to sunspot minima.

And then comes the shocking discovery – we have already started our descent into the next Little Ice Age.

Solar Cycle 24 has started, and could be the Solar Cycle with the lowest recorded sunspot activity since accurate records began in 1750, so we are likely to have very cold weather for the next fifty to eighty years. Pierce points out that the minima are not times of permanent cold, but have great variation, with short hot spells and many storms.

In general life is good in the warm spells between little ice ages – the Roman, Medieval and Twentieth-century warm periods – but harsh in the cold periods. He ties historical events to his narrative, such as Bonaparte’s attack on Moscow in one of the very cold winters, the collapse of the Nordic settlement in Greenland, the Irish potato famine and many others.

We have come to accept the twentieth-century warmth as the norm, but the time of abundance is over. He sees the oncoming Ice Age as a real cause for alarm, and he asks why it has been kept from us. Why are our governments spending trillions to ‘avoid’ global warming when the real peril is just the reverse, and we have no plans to meet it? Lawrence Pierce feels cheated that the governments, scientists and journalists whom he trusted have in fact completely misled him. Finally he writes about what to do about the coming cold.

Unfortunately this part takes a very parochial view and really tells only people in Canada what to do; 35 of the 125 pages of the main text are devoted to the topic. But he points out that the cold periods of previous little ice ages wiped out hundreds of thousands of people outside Canada through famine and the associated war and disease. At present there are many countries, especially in the Middle East, that have booming population growth but are entirely dependent on buying food from elsewhere.

If the boundary of the wheat belt in the northern hemisphere moves 300 miles to the south, they are in jeopardy. Guess what they will do. So if you believe his text you must make your own strategy to survive the hard times that are coming.

SOURCE




Using NOAA's cooked data, NASA says June tied as hottest month

 by Thomas Richard

NASA announced on Wednesday that by using NOAA's recently altered temperature data, June 2015 was tied as the warmest June on record. As previously reported here, the US National Oceanic and Atmospheric Administration (NOAA) reworked its climate data in order to eliminate the 18-year-and-counting pause in global warming. In early June, NOAA released a study saying that long-existing instrument biases have been masking rising sea surface temperatures. Once they "readjusted" the data, the current warming hiatus disappeared. Put simply, by cooling the past, NOAA made the last two decades look warmer.

With the release of global temperature data for June, the National Aeronautics and Space Administration (NASA) has essentially changed how it analyzes measurements by using the same sea surface dataset that was readjusted by NOAA. In using NOAA's highly controversial dataset, NASA can now say that June 2015 tied June 1998 as the warmest June on record. The global surface temperature anomaly for June was +0.78 degrees Celsius, which they say was driven by temperature inconsistencies in the Northern Hemisphere.

The June 2015 data released by NASA uses the same readjustments of global sea surface temperature records created by NOAA, which increase the apparent rate of overall global warming (both land and sea) over the last 15 years. NOAA's dataset, known as the Extended Reconstructed Sea Surface Temperature version 4 (ERSST v4), reflects these readjustments and has now been adopted by NASA.

More troubling is the fact that NASA and NOAA have joined forces to hide the global warming pause, even though there are more robust, accurate datasets available that clearly show it. One item of contention is that both agencies have essentially overlooked the satellite record, which shows a global warming pause since 1998. Starting in 1979, orbiting satellites have been measuring the atmosphere five miles up and are accurate to within 0.001 degrees Celsius.

Satellite data show that the upper atmosphere is warming much less than global surface temperatures, even though computer models predicted the opposite would happen. Worse still, the satellite-derived measurements clearly show a global warming pause. The data are analyzed both by the U.S. firm Remote Sensing Systems (RSS) and by the University of Alabama in Huntsville (UAH). Both the RSS and UAH datasets are unaffected by the issues that plague land-based measurements and the ship- and buoy-based biases in sea surface temperatures. NOAA's readjustments to the climate temperature record do not affect satellite measurements, as they are not susceptible to such distortions.

Even the data from weather balloons agree with the satellite temperature measurements. They show much less warming than was predicted, and in the past 18.6 years have shown no statistically significant warming worldwide. The other major player in the global temperature measurement field is the UK Met Office surface temperature dataset, which also shows a global warming pause since 1998. Oddly enough, NASA announced on July 9 that the oceans slowed the global temperature rise by "trapping the heat," while simultaneously claiming temperatures haven't stopped rising.

NASA also said it has "eliminated GHCN's Amundsen-Scott temperature series" and will only be using the SCAR reports for the South Pole (Antarctica). The Goddard Institute for Space Studies (GISS) also announced it was using the readjusted NOAA ERSST v4 dataset. Climate researcher Bob Tisdale writes that, unlike the UK Met Office and NCEI products, "GISS masks sea surface temperature data at the poles where seasonal sea ice exists, and they extend land surface temperature data out over the oceans in these locations."

Even the Intergovernmental Panel on Climate Change (IPCC) acknowledged two years ago that the "rise in Earth's mean surface temperatures had begun to slow since 1998, and since then everything from volcanic activity to solar output has been used to explain the pause." Currently there are more than 66 excuses to explain the global warming hiatus.

Critics argue all of this comes at a time when President Obama has shifted his focus to climate change ahead of the Paris Climate Talks, and that NOAA and NASA are using this new dataset of revised sea surface temperatures to push other countries into crippling regulations. Even EPA Administrator Gina McCarthy admitted to Congress last week that all the new rules and regulations it is rolling out would only avert warming by .01 degrees.

SOURCE




Science or Selective Ignorance?

In an editorial published in Science magazine on July 3, Marcia McNutt, Editor-in-Chief of the Science journals, removed all doubt concerning the direction that this once prestigious journal is taking.

In "The beyond-two-degree inferno", she wrote: "The time for debate has ended. Action is urgently needed."

She then strongly supports the contrived effort of the European Union to keep "global warming" below 2°C above the preindustrial level – a number for which we have no rigorous measurement or logic.

She advocates the political position of the Administration in forcing reductions in carbon dioxide (CO2) emissions by stating "The United States has pledged reductions of 26 to 28% below 2005 levels by 2025..."

Of course, there is no such pledge by the American people and their representatives in Congress. The Administration's pledge is arbitrary and authoritarian. Ms. McNutt concludes with a description of the nine circles of Hell found in Dante's Inferno.

Ms. McNutt continues a trend established in the Science journals by Donald Kennedy (2000-2008), who declared that, while he was editor, Science would no longer accept articles contradicting the pronouncements of the Intergovernmental Panel on Climate Change (IPCC) on global warming, later termed climate change, regardless of the empirical data presented.

The IPCC reports featured glaring deficiencies such as the falsely named distinct human fingerprint, a hot-spot over the tropics, which no one can empirically find; Mr. Mann's hockey-stick, based on sparse data, from which contradicting data was deleted; and global climate models, which greatly overestimate warming, as current measurements demonstrate. The logic behind this editorial policy can be described as selective ignorance. Please see links under Defending the Orthodoxy, including an excellent critique by Judith Curry.

16th Century Thinking: European scientific thinking of the 16th century was dominated by the re-discovery of the works of the Greeks. Their works in geometry and astronomy were very good, particularly considering the lack of precise instruments. Estimates of the size of the earth and the moon, and the distance between them, were quite accurate. However, they generally underestimated the size of the sun and its distance from the earth.

The concept of a heliocentric solar system was suggested by Aristarchus (died about 230 B.C.) and was accepted by some astronomers but eventually rejected, particularly by Ptolemy (about 150 A.D.), whose system became the one widely accepted in the 16th century. During the 16th century, learning and written documents were still extremely limited, and authority and consensus were dominant.

Copernicus disagreed with the Ptolemaic concept of the solar system, but his work was not published until the year of his death in 1543. It was up to Galileo to earn the full wrath of the Greek scholars (often called Aristotelian scientists) who dominated science in the period. Galileo confronted the scientific models and assumptions of the era with observations from nature and experiments.

The most dramatic of these confrontations was proposing a heliocentric solar system, with an earth that orbited the sun annually, rotated daily, and tilted on its axis. [Kepler proposed elliptical, not circular, orbits and non-uniform speeds, doing away with epicycles.] Using a telescope, Galileo identified spots on the sun, refuting the notion that it was immutable (unchanging). There are various versions of what occurred in the 17th century (until Newton) and the importance of various groups.

However, for the purposes of this discussion, one must note that Galileo was the first influential astronomer of the Renaissance to propose that observations take precedence over authority and consensus of opinion as the objective standard in science. He incurred the full wrath of the scientific establishment of that time.

The Sun? The Royal Astronomical Society published a study of a new model of the Sun's solar cycle that is "producing unprecedentedly accurate predictions of irregularities" within the Sun's 10-to-12-year solar cycle. "The model draws on dynamo effects in two layers of the Sun, one close to the surface and one deep within its convection zone. Predictions from the model suggest that solar activity will fall by 60 per cent during the 2030s to conditions last seen during the 'mini ice age' that began in 1645."

"It is 172 years since a scientist first spotted that the Sun's activity varies over a cycle lasting around 10 to 12 years. But every cycle is a little different and none of the models of causes to date have fully explained fluctuations. Many solar physicists have put the cause of the solar cycle down to a dynamo caused by convecting fluid deep within the Sun. Now, Zharkova and her colleagues have found that adding a second dynamo, close to the surface, completes the picture with surprising accuracy."

"We found magnetic wave components appearing in pairs, originating in two different layers in the Sun's interior. They both have a frequency of approximately 11 years, although this frequency is slightly different, and they are offset in time. Over the cycle, the waves fluctuate between the northern and southern hemispheres of the Sun. Combining both waves together and comparing to real data for the current solar cycle, we found that our predictions showed an accuracy of 97%," said Zharkova.

Zharkova and her colleagues derived their model using a technique called 'principal component analysis' of the magnetic field observations from the Wilcox Solar Observatory in California. They examined three solar cycles' worth of magnetic field activity, covering the period from 1976 to 2008. In addition, they compared their predictions to average sunspot numbers, another strong marker of solar activity. All the predictions and observations were closely matched.

Looking ahead to the next solar cycles, the model predicts that the pair of waves become increasingly offset during Cycle 25, which peaks in 2022. During Cycle 26, which covers the decade from 2030-2040, the two waves will become exactly out of synch and this will cause a significant reduction in solar activity.

"In cycle 26, the two waves exactly mirror each other - peaking at the same time but in opposite hemispheres of the Sun. Their interaction will be disruptive, or they will nearly cancel each other. We predict that this will lead to the properties of a 'Maunder minimum'," said Zharkova. "Effectively, when the waves are approximately in phase, they can show strong interaction, or resonance, and we have strong solar activity. When they are out of phase, we have solar minimums. When there is full phase separation, we have the conditions last seen during the Maunder minimum, 370 years ago."

Since the period covered in the testing is only three solar cycles, 1976 to 2008, it is far too brief to draw any long-term conclusions. However, the accuracy in the testing is significant. Further, the cooling corresponds with predictions from some other solar scientists.
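As a purely illustrative sketch of the two-wave mechanism Zharkova describes above (this is not her actual dynamo model; the periods are invented for the example), the beat arithmetic of two nearly-matched waves looks like this:

    import math

    p1, p2 = 10.8, 11.2            # two "approximately 11 year" periods (assumed)
    beat = 1 / (1 / p1 - 1 / p2)   # years between successive full cancellations
    print(f"beat period: {beat:.0f} years")

    # Envelope of sin(2*pi*t/p1) + sin(2*pi*t/p2): 2 when the waves are in
    # phase (strong cycles), near 0 at full phase separation (a Maunder-type
    # minimum).
    for t in range(0, 161, 40):
        envelope = abs(2 * math.cos(math.pi * t * (1 / p1 - 1 / p2)))
        print(f"t = {t:3d} yr: envelope = {envelope:.2f}")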

The short period of study notwithstanding, the Summary for Policymakers of the IPCC's Fifth Assessment Report (AR5) Synthesis Report also covers a relatively short period. Table SPM.3 presents "Contributions to observed surface temperature change over the period 1951-2010." Yet the IPCC expressed 95% certainty in its work.

The total of natural forcings presented by the IPCC in this table covers a temperature range of about minus 0.1 ºC to plus 0.1 ºC. If the new report of the Royal Astronomical Society is borne out, and we experience a cooling greater than 0.1 ºC, the IPCC and the climate establishment have significant problems.

SOURCE







EPA head: We don't need to justify our regulations with data

EPA Administrator Gina McCarthy took a drubbing yesterday when she refused to release the 'secret science' her agency used when drafting new regulations. Testifying before the House Science, Space and Technology Committee, Rep. Lamar Smith (R) began the Q&A by asking McCarthy why she wouldn't release the studies and data on which her regulations are based. Rep. Smith told McCarthy that his Secret Science Reform Act would make the data public without interfering in the EPA's primary job, while maintaining the confidentiality of third parties.

Rep. Smith also quoted Obama's science adviser, John Holdren, saying "The data on which regulatory decisions are based should be made available to the committee and should be made public. Why don't you agree with the president's science adviser?" McCarthy replied that while she supports transparency in the regulatory process, the bill would make public the personal information of the people working on the science.

Smith reiterated that under his Secret Science Reform Act, personal information would be redacted, but the underlying studies and data being used to justify costly regulations would be made public so that other scientists and the American people could review them. This is especially important as the EPA has a 60-day comment period after a new proposal is issued, but the science behind the new regulations is not included. Smith's new bill would rectify that issue.

McCarthy also said she "doesn't actually need the raw data in order to develop science. That's not how it's done."

Rep. Smith: "But why don't you give us the data you have and why can't you get that data you do have? Surely you have the data that you based the regulations on?"

McCarthy: "EPA actually has the authority and the need to actually get information that we have provided to you."

Rep. Smith: "You're saying you can't give us the information because it is personal and then you're saying you don't have the information. Which is it?"

McCarthy: "There is much information we don't have the authority to release."

Rep. Smith reiterated again that any personal information would be redacted and once again asked why she won't release this information, when doing so would meet all the criteria McCarthy had used to justify withholding it. Rep. Smith reminded her that every other agency does this, so why can't the EPA simply redact the personal information and release the underlying science on which the EPA's regulations are based?

McCarthy stressed that the science is generated through the peer-review process and not by the agency itself, prompting Rep. Smith to say that by not showing the American people and the Congress the studies and data used to make new regulations, it looks like the EPA has something to hide. Rep. Smith said there was no good reason other scientists couldn't review the data, no good reason his committee couldn't review it, and, most important of all, no good reason the American people couldn't review it.

Changing topics, Rep. Smith asked McCarthy about the Clean Power Plan, reminding her that after enormous amounts of money are spent and burdensome regulations implemented, raising the cost of electricity in a way that would hurt the poorest Americans, it would only lower global temperatures by 1/100 of a degree. "How do you justify such an expensive, burdensome, onerous rule that isn't going to do much good?…Isn't this all pain and no gain?"

McCarthy admitted the goal of the Clean Power Plan was to show strong domestic action which could trigger strong global action, i.e., getting other countries to follow the U.S. lead. McCarthy refused to say whether Rep. Smith's analysis of the minuscule effect on global temperatures was correct, stating again that it was more about leading on a global scale. She also refused to give Rep. Smith a timetable on when he could expect the supporting documentation he had been requesting for months.

Later in the hearing, Rep Dana Rohrabacher (R) was shocked that McCarthy did not have any idea what percentage of the atmosphere was made up of carbon dioxide (CO2). Stunned by this admission, Rohrabacher said, "You’re head of the EPA and you did not know? …Now you are basing policies that impact dramatically on the American people and you didn’t know what the content of CO2 in the atmosphere was… the justification for the very policies you’re talking about?"

McCarthy: "If you’re asking me how much CO2 is in the atmosphere, not a percentage but how much, we have just reached levels of 400 parts per million."

Rohrabacher: "I think I was very clear on what I was asking. I think it was very clear you didn’t know."

This is not the first time McCarthy has flunked basic science. In a Senate hearing in March, McCarthy was unaware of climate data showing no increase in extreme weather. At that hearing, she was asking for additional money to be dedicated to the president's controversial Clean Power Plan, an initiative to limit the carbon dioxide (CO2) emissions that are blamed for any type of bad weather.

As previously reported here, carbon dioxide levels reached a global level of 400 parts per million (ppm) in March, even though global temperatures have not risen for nearly 19 years. You can find 400 carbon dioxide molecules per one million parts of dry air. By volume, "dry air contains 78.09% nitrogen, 20.95% oxygen, 0.93% argon, 0.039% carbon dioxide (.04% in March 2015), and small amounts of other gases." Carbon dioxide levels vary between 390 and 400 ppm and change seasonally as more plant life is around to absorb it in the spring and summer.
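The ppm-to-percent conversion in that quote is simple to verify:

    co2_ppm = 400
    print(f"{co2_ppm} ppm = {co2_ppm / 1e6:.4%} of dry air by volume")  # 0.0400%

    # The quoted composition of dry air should (and does) sum to ~100%.
    composition = {"nitrogen": 78.09, "oxygen": 20.95, "argon": 0.93, "CO2": 0.04}
    print(f"major components sum to {sum(composition.values()):.2f}%")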

SOURCE





SCOTLAND’S WIND FARMS CAUSE WATER POLLUTION

Anti-windfarm campaigner Susan Crosthwaite is calling for an immediate and full independent investigation into the pollution of surface water and groundwater at ALL Scottish windfarm developments sited on River Basin Districts. The construction of giant wind turbines has led to the industrialisation of water catchment areas, damaging water quality and public health. She demands that the relevant legislation be vigorously adhered to, so that complete protection of Scotland’s reservoirs, lochs and private water supplies can be restored.

Commenting from her home in South Ayrshire Susan Crosthwaite said:

“Windfarm development in Scotland is clearly breaching the Environmental Liability Directive and the Water Framework Directive. Developers and government bodies have allowed these developments to proceed in the full knowledge that there are risks to surface and groundwater. Authorities such as SEPA, Scottish Water, Councils and the Scottish Government have failed in their legal duty to protect the water environment. Public authorities should ensure the proper implementation and enforcement of the scheme provided for by this Directive.

“People wonder how windfarms can possibly contaminate our water. Firstly, most are constructed on areas of unspoilt moss, heather and deep peat, often with associated forestry. Construction vehicles churn up the ground to make access roads and clear the forests (approximately 3 million trees were cleared at Whitelee). Trees are pulled up, and the churned up peat is washed into the river systems by heavy rain, releasing excessive carbon which the water treatment works are not able to deal with.

“The construction teams then blast quarries and ‘borrow-pits’ to provide rock foundations for access roads and turbine bases - six quarries with 85 articulated dump lorries ferried almost 6 million tons of excavated rock around the Whitelee site for roads and turbine foundations. These excavations allow access to the numerous faults (fractures) and dykes (intrusions) which crisscross Scotland and act as conduits for ground water. Chemical and diesel spills, therefore, have an immediate channel to the aquifer. It is also a great irony that anti-fracking campaigners make spurious claims about potential water pollution and then support the construction of industrial wind turbines, which are demonstrably causing widespread pollution of our water supplies in Scotland.


“The evidence of pollution discovered by radiologist Dr. Rachel Connor stems from her own experience of living close to Whitelee, the largest windfarm in Europe, and experiencing first-hand the results of drinking contaminated water. Evidence of pollution was discovered in monitoring reports which were a requirement of the Whitelee windfarm construction 2006-2009 and were brought before a Public Inquiry regarding a third extension to Whitelee, where Dr. Connor underwent a five-hour cross-examination. (This material has not yet been ruled on by the Scottish Government.) It included a failure to monitor and test for instances of specific contamination related to chemical spills, or diffuse contamination from dangerous chemicals – some of which may have come from the 160,000 m3 of concrete used in turbine foundations and other areas.

“There was also evidence of contamination of private water supplies where springs had failed completely, boreholes had silted up temporarily and water quality was rendered unfit to drink. There is no effective protective mechanism for private water supplies if the local authority responsible for protecting the water supply has no mechanism to insist that a developer find, chart and protect the water source, and is subsequently not responsible for the hydrological environment upon which that water supply depends.

“Windfarm developments have not been monitored or assessed according to the legal requirements which under a European Directive require Member States to ensure the establishment of programmes for the monitoring of water status in order to establish a coherent and comprehensive overview of water status within each river basin district. It is clear that incidents and concerns have been reported by a Planning Monitoring Officer to the regulatory authorities but have not been investigated. Indeed Planning Monitoring Officers are not routinely employed and in any case, information from such officials may be difficult and costly for the public to access. Consequently developments proceed unchallenged.

“Wind farm construction has coincided with an increase in raw water colour at Amlaird and other Scottish Water treatment works. Scottish Water test results indicated high levels of colour, iron, manganese, coliforms, E. coli and turbidity, but these were not investigated and resolved by the appropriate authority. The disinfection procedures meant that drinking water failed to meet European and UK regulatory standards, leading to increased levels of trihalomethanes – recognised by the WHO as possible human carcinogens.

“Now Scottish Water test results from 2005 to 2014 for colour, iron, manganese, coliforms and E. coli in Loch Bradan, Afton Reservoir and Penwhapple Reservoir also show a deterioration in water quality associated with windfarm construction and pre-construction forestry clearance. This means that many people in East and South Ayrshire are drinking water below the Drinking Water Regulatory Standards. Where water quality has fallen consistently below regulatory standards, statutory authorities have not informed the public of the potential risks to their health, despite an EC Directive that insists ‘Member States shall ensure the necessary protection for the bodies of water identified with the aim of avoiding deterioration in their quality in order to reduce the level of purification treatment required in the production of drinking water’.

“As Whitelee is Scottish Power Renewable’s flagship windfarm, the credibility of all their windfarm developments is based on the belief that their professed mitigation measures are successfully preventing any water pollution. How can the public be confident that this is the case if they do not constantly and consistently monitor all subsequent developments with results made easily available to the public?

“Arecleoch SPR windfarm consists of 60 turbines, operational since the autumn of 2011. This windfarm, along with the Hadyard Hill, Hadyard Hill Extension, Assel Valley, Millenderdale and Straid windfarms, is sited within the River Stinchar water catchment protected area. Tralorg plus the 5 ‘Straiton’ windfarms including Dersalloch are sited on the Girvan and Doon protected water catchment zones. None of these developments, according to the FOI, has been adequately monitored or assessed. Indeed the failure to monitor the impact on surface and ground water before, during and after the construction of the 60 turbines at Arecleoch constitutes a direct breach of the water directives.”

SOURCE

***************************************



Tuesday, July 28, 2015



Is the Pope a Fascist?

If we compare him with Fascists of the past, his ideas are clearly Fascist.  Fortunately, however, he has none of their power.  "But how can such a nice guy be a Fascist?" one might ask. In answer to that remember that "Pope" is a version of the Italian word for "father" and that both Mussolini and Hitler were seen as fatherly figures in their times.  Hitler had most Germans convinced that he loved them. And even in the mouth of a holy man bad ideas can be destructive when other people take them seriously.

And the church has always accommodated Fascism. In 1929 Mussolini and Pope Pius XI signed the Lateran treaty -- which is the legal basis for the existence of the Vatican State to this day -- and Pius in fact at one stage called Mussolini "the man sent by Providence". The treaty recognized Roman Catholicism as the Italian State religion as well as recognizing the Vatican as a sovereign state. What Mussolini got in exchange was acceptance by the church -- something that was enormously important in the Italy of that time.

It should also be noted that Mussolini's economic system (his "corporate State") was a version of syndicalism -- having workers, bosses and the party allegedly united in several big happy families -- and syndicalism is precisely what had been recommended in the then recent (1891) "radical" encyclical Rerum Novarum of Pope Leo XIII. So that helped enormously to reconcile Mussolini to the church. Economically, Fascism was more Papal than capitalist (though in the Papal version of syndicalism the church naturally had a bigger role).

Syndicalism was of course a far-Leftist idea (with Sorel as a major prophet) long before it was a Papal one but the Holy Father presented a much more humanized and practical version of it and thus seems in the end to have been more influential than his Leftist rivals.  Mussolini was of course acutely aware of both streams of syndicalist thinking and it was a great convenience to him to be able to present himself as both a modern Leftist and as a supporter of the church.

So that is the Catholic intellectual inheritance, making Frank's ideas not at all outlandish in a Catholic context.  Catholic economic ideas in fact formed the basis of Italian Fascism.  And Frank has built on that foundation using more modern ideas.

In his recent encyclical, Frank has made it clear that he idealizes a simple and definitely non-capitalist rural past. Hitler did the same and the modern-day Green/Left do the same.  So exactly from where did Frank get those ideas?  As well as from Catholic economic thinking, he got them from liberation theology.  Liberation theology is a very Leftist doctrine that is widespread among South American priests and Frank is a South American priest.  So where did South American priests get their ideas?  From the prevailing South American culture.  And South American thinking is typically Fascist.  Latin America has had heaps of Fascist-type dictatorships in the recent history of its governance so that is hardly controversial.  Fascism explains Latin-American poverty.  Fascism is a form of Leftism and Leftism is always economically destructive.

So where did South American Fascism come from? Initially from Simon Bolivar, the great liberator of South America. Bolivar wanted to replace the king of Spain with a South American elite, not with mass democracy. And to this day the Venezuelan regime describes itself as Bolivarian. Bolivar and his ideas are far from forgotten. Bolivar emphasized the importance of a strong ruler, and the constitution he wrote aimed to establish a lifelong presidency and an hereditary senate. He explicitly rejected the liberal ideas of the U.S. founders. Fascist enough? Memories of a certain Tausendjähriges Reich come to mind. So the Latin American dictators have simply been good Bolivarians.

So that is the mental world that formed Pope Frank as he was growing up in Argentina. And who is to this day the most influential political figure in Argentina? Juan Peron, another Fascist and a friend of Mussolini in his day. And it was of course Peron who gave refuge to many displaced Nazis after WWII. And what was Peron's appeal? He claimed to be standing up for the "descamisados", the "shirtless ones". In typical Leftist style he claimed to be an advocate for the poor.

Is Frank's thinking coming into focus yet? He is actually a pretty good Peronist. He has brought Argentinian Fascism to the Holy See. He is certainly no original thinker. Paul Driessen sets out below how his prescriptions would perpetuate poverty, disease, and premature death in the Third World -- just as they have done in Argentina.


The Laudato Si encyclical on climate, sustainability and the environment prepared by and for Pope Francis is often eloquent and always passionate, but frequently encumbered by platitudes, many of them erroneous.

“Man has slapped nature in the face,” and “nature never forgives,” the pontiff declares. “Never have we so hurt and mistreated our common home as in the last 200 years.” It isn’t possible to sustain the present level of consumption in developed countries and wealthier sectors of society. “Each year thousands of species are being lost,” and “if we destroy creation, it will destroy us.”

The pope believes climate change is largely manmade and driven by a capitalist economic system that exploits the poor. Therefore, he says, we must radically reform the global economy, promote sustainable development and wealth redistribution, and ensure “intergenerational solidarity” with the poor, who must be given their “sacred rights” to labor, lodging and land (the Three L’s).

All of this suggests that, for the most part, Pope Francis probably welcomes statements by his new friends in the United Nations and its climate and sustainability alliance.

One top Intergovernmental Panel on Climate Change official bluntly says climate policy is no longer about environmental protection; instead, the next climate summit will negotiate “the distribution of the world’s resources.” UN climate chief Christiana Figueres goes even further. UN bureaucrats, she says, are undertaking “probably the most difficult task we have ever given ourselves, which is to intentionally transform the global economic development model.” [emphasis added]

However, statements by other prominent prophets of planetary demise will, one hopes, give the pope pause.

Obama science advisor John Holdren and Population Bomb author Paul Ehrlich, in their Human Ecology book: “We need to de-develop the United States” and other developed countries, “to bring our economic system into line with the realities of ecology and the global resource situation.” We will then address the “ecologically feasible development of the underdeveloped countries.” [emphasis added]

Ehrlich again: “Giving society cheap energy is like giving an idiot child a machine gun.” And most outrageous: The “instant death control” provided by DDT was “responsible for the drastic lowering of death rates” in poor countries; so they need to have a “death rate solution” imposed on them.

Radical environmentalism’s death campaigns do not stop with opposing DDT even as a powerful insect repellant to prevent malaria. They view humans (other than themselves) as consumers, polluters and “a plague upon the Earth” – never as creators, innovators or protectors. They oppose modern fertilizers and biotech foods that feed more people from less land, using less water. And of course they are viscerally against all forms and uses of hydrocarbon energy, which yields far more energy per acre than alternatives.

Reflect on all of this a moment. Unelected, unaccountable UN bureaucrats have given themselves the authority to upend the world economic order and redistribute its wealth and resources – with no evidence that any alternative they might have in mind will bring anything but worse poverty, inequality and death.

Moreover, beyond the dishonest, arrogant and callous attitudes reflected in these outrageous statements, there are countless basic realities that the encyclical and alarmist allies sweep under the rug.

We are trying today to feed, clothe, and provide electricity, jobs, homes, and better health and living standards to six billion more people than lived on our planet 200 years ago. Back then, reliance on human and animal muscle, wood and dung fires, windmills and water wheels, and primitive, backbreaking, dawn-to-dusk farming methods made life nasty, brutish and short for the vast majority of humans.

As a fascinating short video by Swedish physician and statistician Hans Rosling illustrates, human life expectancy and societal wealth have surged dramatically over these past 200 years. None of this would have been possible without the capitalism, scientific method and hydrocarbon energy that radical, shortsighted activists in the UN, EPA, Big Green, Inc. and the Vatican now want to put in history’s dustbin.

Over the past three decades, fossil fuels – mostly coal – helped 1.3 billion people get electricity and escape debilitating, often lethal energy and economic poverty. However, 1.3 billion still do not have electricity. In India alone, more people than live in the USA still lack electricity; in Sub-Saharan Africa, 730 million (equal to Europe) still cook and heat with wood, charcoal and animal dung.

Hundreds of millions get horribly sick and 4-6 million die every year from lung and intestinal diseases, due to breathing smoke from open fires and not having clean water, refrigeration and unspoiled food.

Providing energy, food, homes and the Three L’s to middle class and impoverished families cannot happen without nuclear and hydrocarbon energy and numerous raw materials. Thankfully, we still have these resources in abundance, because “our ultimate resource” (our creative intellect) has enabled us to use “fracking” and other technologies to put Earth’s resources to productive use serving humanity.

Little solar panels on huts, subsistence and organic farming, and bird-and-bat-butchering wind turbines have serious cost, reliability and sustainability problems of their own. If Pope Francis truly wants to help the poor, he cannot rely on these “alternatives” or on UN and Big Green ruling elite wannabes. Who are they to decide what is “ecologically feasible,” what living standards people will be “permitted” to enjoy, or how the world should “more fairly” share greater scarcity, poverty and energy deprivation?

We are all obligated to help protect our planet and its people – from real problems, not imaginary ones. Outside the computer modelers’ windows, in The Real World, we are not running out of energy and raw materials. (We’re just not allowed to develop and use them.) The only species going extinct have been birds on islands where humans introduced new predators – and raptors that have been wiped out by giant wind turbines across habitats in California and other locations. Nor are we encountering climate chaos.

No category 3-5 hurricane has struck the USA in a record 9-3/4 years. (Is that blessing due to CO2 and capitalism?) There has been no warming in 19 years, because the sun has gone quiet again. We have not been battered by droughts more frequent or extreme than what humanity experienced many times over the millennia, including those that afflicted biblical Egypt, the Mayas and Anasazi, and Dust Bowl America.

The scientific method brought centuries of planetary and human progress. It requires that we propose and test hypotheses that explain how nature works. If experimental evidence supports a hypothesis, we have a new rule that can guide further health and scientific advances. If the evidence contradicts the hypothesis, we must devise a new premise – or give up on further progress.

But with climate change, a politicized method has gained supremacy. Based on ideology, it ignores real-world evidence and fiercely defends its assumptions and proclamations. Laudato Si places the Catholic Church at risk of surrendering its role as a champion of science and human progress, and returning to the ignominious persecution of Galileo.

Nor does resort to sustainable development provide guidance. Sustainability is largely interchangeable with “dangerous manmade climate change” as a rallying cry for anti-hydrocarbon, wealth redistribution and economic transformation policies. It means whatever particular interests want it to mean and has become yet one more intolerant ideology in college and government circles.

Climate change and sustainability are critical moral issues. Denying people access to abundant, reliable, affordable hydrocarbon energy is not just wrong. It is immoral – and lethal.

It is an unconscionable crime against humanity to implement policies that pretend to protect the world’s energy-deprived masses from hypothetical manmade climate and other dangers decades from now – by perpetuating poverty, malnutrition and disease that kill millions of them tomorrow.

Paul Driessen is senior policy analyst for the Committee For A Constructive Tomorrow, author of Eco-Imperialism: Green power - Black death, and coauthor of Cracking Big Green: Saving the world from the Save-the-Earth money machine





Hansen in the gun

Some excerpts below from comments by Judith Curry on Jim Hansen's latest brainstorm. She first notes that lots of people thought Hansen had gone well beyond the realm of the probable this time. She then gives the actual journal abstract and adds some comments of her own.

Ice Melt, Sea Level Rise and Superstorms: Evidence from Paleoclimate Data, Climate Modeling, and Modern Observations that 2°C Global Warming is Highly Dangerous

J. Hansen, M. Sato, P. Hearty, R. Ruedy, M. Kelley, V. Masson-Delmotte, G. Russell, G. Tselioudis, J. Cao, E. Rignot, I. Velicogna, E. Kandiano, K. von Schuckmann, P. Kharecha, A. N. Legrande, M. Bauer, and K.-W. Lo

Abstract.

There is evidence of ice melt, sea level rise to +5–9 m, and extreme storms in the prior interglacial period that was less than 1°C warmer than today. Human-made climate forcing is stronger and more rapid than paleo forcings, but much can be learned by combining insights from paleoclimate, climate modeling, and on-going observations. We argue that ice sheets in contact with the ocean are vulnerable to non-linear disintegration in response to ocean warming, and we posit that ice sheet mass loss can be approximated by a doubling time up to sea level rise of at least several meters. Doubling times of 10, 20 or 40 years yield sea level rise of several meters in 50, 100 or 200 years. Paleoclimate data reveal that subsurface ocean warming causes ice shelf melt and ice sheet discharge. Our climate model exposes amplifying feedbacks in the Southern Ocean that slow Antarctic bottom water formation and increase ocean temperature near ice shelf grounding lines, while cooling the surface ocean and increasing sea ice cover and water column stability. Ocean surface cooling, in the North Atlantic as well as the Southern Ocean, increases tropospheric horizontal temperature gradients, eddy kinetic energy and baroclinicity, which drive more powerful storms. We focus attention on the Southern Ocean’s role in affecting atmospheric CO2 amount, which in turn is a tight control knob on global climate. The millennial (500–2000 year) time scale of deep ocean ventilation affects the time scale for natural CO2 change, thus the time scale for paleo global climate, ice sheet and sea level changes. This millennial carbon cycle time scale should not be misinterpreted as the ice sheet time scale for response to a rapid human-made climate forcing. Recent ice sheet melt rates have a doubling time near the lower end of the 10–40 year range. We conclude that 2°C global warming above the preindustrial level, which would spur more ice shelf melt, is highly dangerous. Earth’s energy imbalance, which must be eliminated to stabilize climate, provides a crucial metric.

The paper is in Atmospheric Chemistry and Physics Discussions, the discussion forum of the European Geosciences Union journal Atmospheric Chemistry and Physics

This is an intriguing and wide-sweeping paper, which has put together a multi-disciplinary team to examine the possibility of near-term catastrophic sea level rise.

For context, Hansen et al. present a much more extreme scenario than the last report from the Intergovernmental Panel on Climate Change and the most recent (2014) assessment, "Expert assessment of sea-level rise by AD 2100 and AD 2300."

The biggest issue raised by Hansen is the potential (plausible? possible?) for a catastrophic >5 m sea level rise in the 21st century. Hansen et al. have proposed a new mechanism for faster sea level rise – can we falsify this?  The collapse of the West Antarctic ice sheet (WAIS) is arguably the most alarming potential impact of global warming. WAIS has collapsed before during previous interglacials, and will undoubtedly collapse again (with or without AGW), with a ~5 m sea level rise. The issue is whether the WAIS can collapse on timescales of decades to a century. Based on what we know (summarized by Tad Pfeffer above), this is a process that would take centuries.
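To see what the doubling-time claim in the abstract means numerically, here is a small sketch. The exponential melt-rate form and the 1 mm/yr starting rate are my assumptions for illustration, not figures taken from the paper:

    import math

    def cumulative_rise(m0, T, t):
        # Metres of sea level rise after t years if the melt rate starts at
        # m0 m/yr and doubles every T years (integral of m0 * 2**(u/T)).
        return m0 * T / math.log(2) * (2 ** (t / T) - 1)

    m0 = 0.001                 # assumed initial ice-sheet contribution: 1 mm/yr
    for T in (10, 20, 40):     # the doubling times named in the abstract
        print(f"T = {T:2d} yr: {cumulative_rise(m0, T, 100):6.2f} m after a century")
    # -> ~14.8 m, ~0.89 m, ~0.27 m: the doubling time, not the current rate,
    #    dominates the outcome.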

I am not an expert on sea level rise or ice sheets, but here are a few things that frame my own understanding, including some recent research:

Sea level has been rising for millennia. I am not convinced that there is a significant acceleration of sea level rise that can be attributed to human-caused global warming (see this previous post).

Recent research from Scripps finds that the Greenland ice sheet did not melt as much as expected during the Eemian, which may mean the Antarctic ice sheets melted more than expected.

A new paper, summarized by Cato, found that the size of the Greenland ice sheet – especially the best observed portions covering the west and southwestern parts of Greenland – during the mid-Holocene was smaller than it is today, but not by a whole lot.

A study finds surprisingly high geothermal heating beneath the West Antarctic Ice Sheet.

So it looks like we should be more worried about WAIS than about Greenland, and it seems that natural processes (natural climate change and geothermal processes) have caused large sea level changes in the past during interglacial periods (albeit not rapid ones) and will continue to cause sea level to change in the future.

The human contribution to sea level rise so far does not seem particularly significant, given that the early 20th century rate of sea level rise is about the same as the current rate.

Our ways of inferring future rates of sea level rise from ice sheet melting are crude – we can speculate, but not with much confidence. The danger posed by sea level rise is a function of the rate of change far more than the actual sea level itself.

Do Hansen et al. make any contribution to all this? Well, their proposed mechanism with feedbacks is of interest and should be explored further. But their conclusions regarding an alarming rate of sea level rise are at best possible (and not plausible).

SOURCE




House Action to Ban GMO Labeling Laws Merits Praise

If the U.S. House has its way, state laws passed (and those being considered) requiring that foods produced using genetic modification (i.e. genetically modified crops or biotech foods) be labeled would become moot. On July 24, 2015, the House voted 275 to 150 to pass a bill banning state laws that force food makers to place labels on products that contain genetically modified organisms (GMOs).

The agriculture industry complained that individual state labeling standards would be costly and confusing and, more importantly, that any standard, even a universal federal standard, would unfairly lend credence to environmentalists’ false assertions or suggestions that biotech foods are not as safe or healthy as conventional foods developed through traditional cross-breeding techniques.

Vermont, Connecticut and Maine have already passed mandatory GMO labeling laws, though they have yet to take effect, while GMO labeling laws are being considered in a few other states. The House bill would prevent them all.

Democrats and Republicans alike supported the GMO labeling ban. The Minneapolis Star Tribune notes that Democratic Rep. Tim Walz of Minnesota supported the House bill, saying, “hundreds of scientific, peer reviewed studies have found [genetically engineered] foods are just as safe and nutritious as non-[genetically engineered] foods.” Another Minnesotan who voted in favor of the bill, Republican Rep. Tom Emmer, argued, “Minnesota farmers already deal with heavy compliance regulations to ensure that genetically engineered crops are safe to eat.”

The vote came on the heels of another in a long list of literature reviews and analyses demonstrating the safety of biotech foods, published in Salon Magazine on July 15, 2015. In it the author, William Saletan, notes that organizations lobbying against GMOs routinely lie and have been consistently anti-scientific in their claims about biotech foods, contributing to public misunderstanding and, in some cases, hysteria.

Much of the food industry was thrilled with the House vote and hopes the Senate will move quickly to pass the bill as well.

Ag-giants General Mills and Cargill, and the nation’s largest farmer-owned cooperative, CHS Inc., each lobbied for the bill. In statements post-passage, they praised the House vote. CHS Inc.’s statement said “CHS applauds the House of Representatives for passing the Safe and Accurate Food Labeling Act.”

Sometimes sound science wins despite environmental fear-mongering. Three cheers for the House of Representatives.

SOURCE




Hillary sways Democrats with her impressive ignorance on Global Warming

Hillary Clinton was in Iowa, talking about the subject foremost on everyone's mind. No, not the tremendous national debt, or illegal immigration, or Iran getting nuclear weapons, but global warming.

She praised Iowa for its success with wind energy, which she said was an example of good environmental and economic policy. She said she favored a wind-production tax credit.

How is it good economic policy when the championed energy source requires taxpayer subsidies? How is it good energy policy when the championed energy source stops working when the wind dies down?

Clinton called for harnessing the power of the sun to generate enough renewable energy to run every home in the country within the next decade, as part of a climate-change initiative announced Sunday.

And what happens at night when there is no sun? Do we have to go back to the Middle Ages and live in darkness without power for 12 hours a day?

In a campaign video, Clinton says, “It’s hard to believe that people running for president refuse to believe the settled science of climate change.”

It's hard to believe that Hillary believes the science is settled. Thousands of scientists would disagree. I imagine, in the same vein, that she believes that the matter of her using a personal email server for State Department emails without compromising national security is also settled, as well as the matter of accepting foreign donations to her slush fund/foundation while she was secretary of state.

“This is not complicated, folks,” she said. “The people on the other side will answer any question about climate change by saying, ‘Well, I’m not a scientist.’ Well, I’m not a scientist either. I’m just a grandmother with two eyes and a brain.”

I agree she has two eyes, but what's behind them could simply be jello. I'll bet she can't answer the simplest questions, such as "What causes global warming? Did you know that CO2 is mostly produced naturally, and man-made sources are insignificant? Did you know that CO2 is a tiny percentage of the upper atmosphere? How does that trap heat?"

The climate-change initiative announced on Clinton’s Web site calls for having more than 500 million solar panels installed by the end of her first term and generating enough renewable energy to power every home in the country within 10 years of Clinton taking office.

How much land will that take up? How much will it cost? And again, what will we do at night and on cloudy days?
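The land and output questions can at least be bounded with arithmetic. Every figure below is an illustrative assumption on my part (panel wattage, panel area, ground-packing factor, capacity factor), not anything taken from the Clinton plan:

```python
# Rough bounding of the 500-million-panel pledge. All inputs are assumptions.
PANELS = 500_000_000
WATTS_PER_PANEL = 300      # a typical 2015-era panel
M2_PER_PANEL = 2.0         # panel area
PACKING = 2.5              # extra ground area for spacing, access, inverters
CAPACITY_FACTOR = 0.20     # the sun doesn't shine at night or on cloudy days

capacity_gw = PANELS * WATTS_PER_PANEL / 1e9
land_km2 = PANELS * M2_PER_PANEL * PACKING / 1e6
output_twh = capacity_gw * 8760 * CAPACITY_FACTOR / 1000

print(f"~{capacity_gw:.0f} GW nameplate, ~{land_km2:.0f} km^2 of land, "
      f"~{output_twh:.0f} TWh/yr")
# -> ~150 GW, ~2,500 km^2 and ~263 TWh/yr, against roughly 1,400 TWh/yr
#    of U.S. residential electricity consumption
```

Under these assumptions the pledge would cover only a fraction of household electricity use, which is precisely why the night-and-cloud question matters.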

It's just like a liberal to acquire a superficial understanding of a subject and then deem themselves an expert.

SOURCE




Warming predictions increasingly detumescent

At the European Institute for Climate and Energy (EIKE), retired climate scientist Hans-Joachim Lüdecke and two colleagues have responded to the Senate testimony given by Pat Michaels.

Lüdecke and his colleagues agree with Dr. Michaels’ assertion that the projected increase in the earth’s temperature from CO2 is getting smaller and smaller.

F. Gervais, C.O. Weiss and H.J. Lüdecke write at EIKE:

“Anyone who has been tracking the scientific journals on climate science has observed over many years that the supposedly expected temperature increase from CO2 has steadily been decreasing over the years.”

This means that all the assumptions and claims made by the IPCC in the past were based on hype and totally inaccurate results.

Gervais, Weiss and Lüdecke conclude in their EIKE piece:

“We can now tell politicians that they can call off the warnings. There’s no chance of a global warming of more than 2°C.

The decrease in the projected temperature rise from CO2 will continue on its present trend. By 2025 the warming by CO2 will be close to zero. We can thus expect that the quality of the forecasts will increase to the point where they will actually reflect reality.”

SOURCE





How Safe is Your Drinking Water and Can the EPA Really be Trusted?

“What’s in my Water?” by David De John

David De John in his book, “What’s in my Water?”, sets forth in fourteen chapters the risks present in some drinking water. Featured in De John’s book are chapters dealing with the following subjects: “About EPA contaminant levels; About Water Supply Filtration Systems; Skin Absorption/Inhalation of Contaminants; Is Bottled Water the Answer?; Filtration Devices and Equipment; and EPA Violation Information and Reports.”

De John has been called upon by the media as an expert to review Department of Health investigation reports on water quality. He has been a keynote speaker at medical conferences and has likewise spoken at numerous public venues. Furthermore, his book has been used as training material by some of the largest water filtration companies in the country, and has been distributed by those companies to consumers as educational information.

De John owned multiple water filtration locations in the Midwest. During that period, he came upon a report – referenced as originating from the Centers for Disease Control (CDC) – indicating that approximately one million people become ill every year, and an estimated 100,000 people die, due to infections from drinking water.

This CDC report spearheaded De John’s quest to find the truth about our nation’s drinking water quality, leading him to conduct extensive research through thousands of pages of information and data from the Environmental Protection Agency (EPA), the CDC, the American Journal of Public Health, the Department of Public Health and many more sources.

As such, De John’s book is not based on opinions or theories, nor is it intended to impart health or medical advice. Instead, “What’s in my Water?” is a compilation of the important elements of De John’s research obtained from government and professional organizations, written in large part at the request of his numerous business associates, some of whom were at one time competitors.

EPA and Contaminants

In speaking recently to De John by phone, he stressed how important it is for readers to be briefed on what the Environmental Protection Agency has to say about contaminants in our drinking water, as printed here on page 2 of “Drinking Water Quality Consumer Confidence Reports”:

“Some people may be more vulnerable to contaminants in drinking water than the general population. Children and infants, pregnant women and their fetuses, the frail elderly, people undergoing chemotherapy or living with HIV/AIDS, and transplant patients can be particularly at risk for infections… If you have special health care needs, consider taking additional precautions with your drinking water…”

As the standards set by the EPA are relied upon and trusted throughout this nation for drinking water quality, it was most shocking to learn from De John how the Environmental Protection Agency defines contaminant levels. Additionally, the list of drinking water contaminants and their health effects can be further reviewed here.

“Maximum Contaminant Level Goal (MCLG): The level of a contaminant in drinking water below which there is no known or expected risk to health. MCLGs allow for a margin of safety and are non-enforceable public health goals.”
“Maximum Contaminant Level (MCL): The highest level of a contaminant that is allowed in drinking water. MCLs are set as close to MCLGs as feasible using the best available treatment technology and taking cost into consideration. MCLs are enforceable standards.”
It becomes extremely important that the wording of the above EPA definitions be examined.

The first definition concerns the MCLG (Maximum Contaminant Level Goal). Notice the wording: “below which there is NO KNOWN OR EXPECTED RISK TO HEALTH,” and that MCLGs are “NON-ENFORCEABLE public health goals.”

The second defines the MCL (Maximum Contaminant Level). Again notice the wording: “using the BEST AVAILABLE TREATMENT TECHNOLOGY AND TAKING COST INTO CONSIDERATION,” and that “MCLs are enforceable standards.”

Dichotomy Between the EPA’s Standards for Contaminants and What Is Allowed

How can it be that the EPA regulates 90 contaminants in our drinking water, allows 33 of those contaminants to exceed the lower level (MCLG), and then regulates those 33 only at the higher level (MCL)? Just because the MCLG allows for a margin of safety does not mean that drinking water is safe with contaminants at the higher regulated level (MCL). Even the EPA is at odds with its own MCLG and MCL definitions as to what constitutes safe contaminant levels in the water we drink. How is this so?

Radium 226 and 228, a cancer-causing contaminant, can be present in drinking water around the country. The EPA set the Maximum Contaminant Level Goal (MCLG) for Radium 226 and 228 at ZERO – the level at which there is no known or expected risk to health. Nevertheless, the EPA allows Water Supply Systems to provide drinking water with a Radium 226 & 228 level of 5 pCi/L.

Is it safe to drink water containing Radium 226 & 228 at 5 pCi/L over a long period of time? No! Increased levels of Radium 226 & 228 are linked by the EPA itself to an increased risk of cancer, including fatal cancer.

Following is what the Maryland Department of the Environment, the U.S. EPA and the Anne Arundel County Health Department state about long-term ingestion of Radium 226 & 228, as found under “What are the health risks of radium ingestion?”:

“For radium 226 and 228, the U.S. EPA estimates that the additional lifetime risks associated with drinking water containing 5pCi/l is about 1 in 10,000. This means that if 10,000 people were to consume two liters of this water per day for 50 years, one additional fatal cancer would be estimated among the 10,000 exposed individuals. According to the EPA model, as the level of radium increases, so does the risk. For example, increasing the concentration of radium from 5 to 10 pCi/l would increase the lifetime risk from approximately one to two additional deaths per 10,000 individuals.”

More about Radium 226 & 228

What they are talking about is death from drinking tap water from your kitchen sink with Radium 226 & 228 at the allowed Maximum Contaminant Level.

As David De John related to me, in his initial research for the book he found over 45 Water Supply Systems violating the “higher allowed level” (Maximum Contaminant Level) for Radium 226 & 228 in Illinois alone. Some Water Supply Systems had levels of over 24 pCi/L – nearly five times the allowed 5 pCi/L level.

Based on the Maryland Department of the Environment and U.S. EPA calculations in the quote above, that would mean potentially five additional lifetime cancer deaths per 10,000 people. Although De John did assure me that some of the Water Supply Systems have taken steps to fix the contaminant level, many still remain with contaminant levels far exceeding the “higher allowed” Maximum Contaminant Level. Nor is this a new problem: Radium 226 & 228 forms in rock beds deep in the earth and seeps through cracks in the rock into the aquifers from which Water Supply Systems may be pumping water.
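The quoted EPA model is linear, so De John's Illinois figures can be checked in a few lines. This is a sketch of the arithmetic only, using the 1-in-10,000 lifetime risk at 5 pCi/L stated in the quote (two liters a day for 50 years); it is not a health model of my own:

```python
# Linear scaling of the EPA's stated radium 226 & 228 risk:
# ~1 additional lifetime cancer death per 10,000 people at 5 pCi/L.
RISK_AT_MCL = 1 / 10_000   # lifetime risk at the 5 pCi/L MCL
MCL_PCI_L = 5.0

def excess_lifetime_deaths(level_pci_l, population=10_000):
    """Expected additional lifetime cancer deaths in `population` people
    drinking two liters a day for 50 years at `level_pci_l` pCi/L."""
    return population * RISK_AT_MCL * (level_pci_l / MCL_PCI_L)

for level in (5, 10, 24):
    print(f"{level:2d} pCi/L -> ~{excess_lifetime_deaths(level):.1f} "
          f"additional lifetime deaths per 10,000 people")
# -> 1.0 at the MCL, 2.0 at 10 pCi/L (matching the quote), ~4.8 at 24 pCi/L
```

Note these are lifetime (50-year) figures, not annual ones, which is how the EPA quote frames the risk.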

Radium 226 & 228 is but one of the 33 contaminants in your drinking water that the EPA allows to exceed the lower level stated by the MCLG, and then deems acceptable at the higher MCL (Maximum Contaminant Level). However, a dichotomy exists between what the EPA has to say about drinking water with contaminants that exceed the higher Maximum Contaminant Level and how violations are issued.

“A health-based violation means that either a system has exposed their users to what EPA has judged as an unreasonable risk of illness, or a system has failed to treat their water to the extent EPA has judged necessary to protect their users from an unreasonable risk of illness in the event that the regulated contaminant is present in source water.”

To be noted is that for the 33 of the 90 contaminants regulated by the EPA that are allowed to exceed the lower MCLG level, no violations were issued by the EPA until they reached the higher MCL. Noted below are violations reported by the EPA in 2010:

8,522 violations of health-based standards were reported by Water Supply Systems.
17,519 Water Supply Systems were in violation for failure to monitor or submit a report on contaminants in their water.
Contaminants added to water with EPA approval

Contaminants in your drinking water should not be your only concern. Regulated and potentially dangerous contaminants approved by the EPA, such as chlorine, are actually added to the water by the Water Supply Systems themselves.

Chlorine came into use back in 1908 because of illnesses like cholera and typhoid. For reference, at that time it was estimated that typhoid fever killed about 25 people out of every 100,000. That was serious enough for the government to take action to stop any more deaths. That death rate works out to 2½ per 10,000 people. As a reference point, the anticipated death rate from cancer stated by the EPA for Radium 226 & 228 in drinking water at the allowed Maximum Contaminant Level is 1 per 10,000 people.

Consider what the EPA has to say about chlorine:

“Disinfectants, while effective in controlling many microorganisms, react with matter in water to form DBPs [disinfection byproducts]. Unchlorinated private well water is unlikely to contain any DBPs… While health effects from exposure to disinfectants and DBPs vary by contaminant, some epidemiological studies have shown a link between bladder, rectal and colon cancers and DBP exposure.”

What about lead?  Can lead be in your drinking water?  Absolutely YES!

“Evidence also suggests that for children with BLLs [blood lead levels] of 5–9 µg/dL [a lead poisoning threshold], no single source of exposure predominates. For these children, the contribution of multiple sources, including drinking water, seems likely, particularly for children who do not have well-established risk factors such as living in old housing or having a parent who is exposed to lead at work (38). CDC and its Advisory Committee on Childhood Lead Poisoning Prevention concur that primary prevention of lead exposure is essential to reducing high BLLs in children and that reducing water lead levels is an important step in achieving this goal. …”

What about fluoride? The Maximum Contaminant Level set by the EPA for fluoride was 1.2 ppm until a few years ago, when it was raised to 4.0 ppm. Why did this happen?

“Political appointees at the Environmental Protection Agency (EPA) raised the acceptable level of fluoride in drinking water from 1.2 ppm to 4 ppm, over objections from their agency scientists… 7,000 *EPA union employees and the unions jumped into the debate.”

*The 7,000 union employees argued against increasing the level of fluoride in drinking water.

According to this report put out by the U.S. National Library of Medicine and the National Institutes of Health: “The PTD, 5.0 mg F/kg, is defined as the dose of ingested fluoride that should trigger immediate therapeutic intervention and hospitalization because of the likelihood of serious toxic consequences.”

Take a look at a tube of toothpaste with Fluoride as an ingredient and you will find a warning:

“…If you accidentally swallow more than used for brushing, seek professional assistance or contact a Poison Control Center immediately.”

Did you know that the recommended amount of toothpaste to use is only the size of a pea?
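To put the PTD figure in household terms, here is a worked example. The toothpaste concentration (1,450 ppm fluoride, a common over-the-counter strength) and the 10 kg toddler are my own illustrative assumptions:

```python
# How much toothpaste would reach the probably-toxic dose (PTD)?
PTD_MG_PER_KG = 5.0         # mg of fluoride per kg body weight (NLM/NIH figure)
MG_F_PER_G_PASTE = 1.45     # 1,450 ppm fluoride toothpaste (assumed)

def grams_of_paste_to_ptd(body_kg):
    """Grams of toothpaste a child weighing `body_kg` kg would have
    to swallow to reach the PTD."""
    return PTD_MG_PER_KG * body_kg / MG_F_PER_G_PASTE

print(f"10 kg toddler: ~{grams_of_paste_to_ptd(10):.0f} g of toothpaste")
# -> ~34 g, about a third of a 100 g tube, versus the pea-sized
#    (~0.25 g) amount recommended for brushing
```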

Highlighted information in David De John’s book

“What’s in my Water?” is filled with vital information to help you understand what is taking place with our nation’s water quality and what you can do about it. There are action points in almost every chapter. There are chapters on home water filtration systems and bottled water; the complete EPA contaminant list, with levels and effects; a list of every state’s drinking water quality office contact information; and more.

Also included in De John’s book is a complete list of the EPA Regulated Drinking Water Contaminants, their MCLGs and MCLs, the EPA’s stated potential health effects from long-term exposure above the MCL, and a list of the 33 contaminants that are allowed to exceed the MCLG. There is also a sample of the Drinking Water Consumer Confidence Report (which is available to all consumers from their Water Supply System) with an explanation of how to understand it, and a list of every state’s Drinking Water Protection Program Offices with phone numbers and addresses.

The embedded links are special, in that additional information can be seen that relates to the issue at hand.

I recommend that if you read only one book this year, it be “What’s in my Water?” – while recognizing that the information in De John’s book is not intended as health or medical advice. Any medical questions or concerns should be discussed with a qualified medical practitioner.

How does your drinking water measure up in your community or city? It is up to you to request the latest analysis of the water being supplied by your local water treatment plant. If it is not acceptable, demand that action be taken.

After all, it is your health and the health of those in your city or community who might be at risk.

SOURCE

***************************************


*****************************************

Monday, July 27, 2015



"Salon" discusses using courts to punish climate deniers.

Bring it on! Skeptics should be looking forward to this. A court case would be a great opportunity to expose the hollowness of the global warming scare. You can see why "Salon" is very tentative about the idea. For political reasons, the Dutch government could not mount a defense on the basis of the science, but other individuals and bodies would not be under that constraint. Al Gore's movie was declared inaccurate by a British court, but it would have much more impact if the science behind the whole hoax were declared inconclusive by a court.

Last month a court at The Hague ordered the Dutch government to cut its emissions by at least 25% compared to 1990 levels by 2020, the first ruling of its kind anywhere in the world. The victory was the result of a class action lawsuit brought by an NGO called the Urgenda Foundation (short for “Urgent Agenda”), which charged the Dutch government with “hazardous state negligence” in the face of climate change. Along with the rest of the EU, the Netherlands is taking a promise to cut 40% against 1990 by 2030 to the Paris climate talks in December, but they are off track, looking to achieve only a 17% cut by 2020. The court extracted a confession from the government’s lawyers that more could be done, and therefore ruled that not doing more was negligent.

The case has excited activists around the world. This week Marjan Minnesma, Urgenda’s co-founder and director, was in Australia, advising groups looking to emulate her success. “It’s the kind of action we’d love to run and we’re investigating”, environmental lawyer Sean Ryan told the Guardian. Australia is of course headed by the government of Tony Abbott, an aggressive climate change denier. In the face of such apathy, courts may be the best option. The speculative Australian attempt is one of five cases found by RTCC.org that might benefit from the Urgenda example, including one almost identical in its goal and reasoning brought (and recently won) by eight teenagers in Washington State.

Historically, courts seem to have backed away from climate change, preferring to leave it up to legislators and diplomats. In 2008, for example, the tiny Alaskan village of Kivalina sued several major oil corporations, including ExxonMobil, BP and Shell, for putting it under threat of rising sea levels and erosion. Its handful of citizens wanted compensation to move the entire community to a different location. All courts up to the U.S. Supreme Court dismissed the case as an issue for the executive and legislative branches. This refusal to play a role may be about to change. Ceri Warnock, analyzing the Urgenda decision for the Journal of Environmental Law, believes that the case and a handful like it may indicate that courts are moving, in the face of an extreme danger such as climate change, to close a constitutional gap between the duty of governments to protect their citizens and their means of doing so. Climate change makes the unthinkable – that courts might be called upon to “re-balance” the constitution – thinkable. Such an internal conflict was on display as far back as 2007’s landmark Massachusetts vs Environmental Protection Agency, where the justices of the Supreme Court clashed over the threat posed by climate change and the causal link with emissions. Declaring that emissions caused climate change and climate change was a threat to the plaintiffs, the majority ordered the EPA to reconsider its refusal to treat carbon dioxide and other greenhouse emissions as pollutants.

With the nations of the world lining up to promise vague or inadequate emissions cuts, and with no mechanism yet in place to enforce them, is it time to call in the lawyers? Do courts have a duty to push governments to act on the threat of climate change, or is this better (and perhaps more legitimately) left to governments?

SOURCE





Have three climate change scientists been ASSASSINATED? The astonishing claim made by a Cambridge professor

Because they don't actually understand what is going on, Leftists are very prone to conspiracy theories.  Climate skeptics can even control lightning, would you believe?

A Cambridge professor has claimed that three scientists investigating climate change in the Arctic may have been assassinated.

Professor Peter Wadhams insists Seymour Laxon, Katharine Giles and Tim Boyd could have been murdered by someone possibly working for the oil industry or within government forces.

The trio had been studying the polar ice caps - with a focus on sea ice - when they died within a few months of each other in 2013.

Professor Laxon, 49, a director of the Centre for Polar Observation at University College London, was at a New Year's Eve party in Essex when he fell down a flight of stairs and died.

Meanwhile oceanographer Dr Boyd, 54, was out walking his dogs near his home in Port Appin, Argyll, western Scotland, in January 2013 when he was struck by lightning and killed instantly.

Just months later, in April, Dr Giles, 35, was cycling to work at UCL, where she lectured, when she was hit by a tipper truck in Victoria, central London, and died.

Professor Wadhams, head of Cambridge University's Polar Ocean Physics Group, claims that in the weeks after Professor Laxon's death, he was targeted by a lorry trying to force him off the road.

He reported the incident to the police but did not express his concerns about the scientists over fears he would be labelled a 'looney', he told The Telegraph.

'It's just very odd coincidence that something like that should happen in such a brief period of time,' he said.

'They [the deaths] were accidents as far as anybody was able to tell but the fact they were clustered like that looked so weird.'

He added: 'I thought if it was somebody assassinating them could it be one of our people doing it and that would be even more frightening. I thought it would be better not to touch this with a barge pole.'

But his comments have left Professor Laxon's partner, Fiona Strawbridge - also a close friend of Dr Giles - furious and she has labelled it 'outrageous and very distressing'.

SOURCE





Petroleum power: an eco-revolution

Laura Ingalls Wilder’s "The Long Winter" is generally regarded as the most historically accurate book of her semi-autobiographical "Little House on the Prairie" series. The Long Winter tells the story of how the inhabitants of De Smet (present-day South Dakota) narrowly avoided starvation during the severe winter of 1880-81, when a series of blizzards dumped nearly three and a half metres of snow on the northern plains – immobilising trains and cutting off the settlers from the rest of the world. Faced with an imminent food shortage, Laura and her neighbours learned that a sizeable amount of wheat was available within 20 miles of their snow-covered houses. Her future husband, Almanzo Wilder, and a friend of his risked their lives in order to bring back enough food to sustain the townspeople through the rest of the winter. With the spring thaw, the railroad service was re-established and the Ingalls family enjoyed a long-delayed Christmas celebration in May.

The Long Winter is a valuable reminder of how lethal crop failures and geographical isolation could be before the advent of modern farming and transportation technologies. Not too long ago, subsistence farmers across the West had to cope with the ‘lean season’ – the period of greatest scarcity before the first availability of new crops. As some readers may know, in England the late spring (and especially the month of May) was once referred to as the ‘hungry gap’ and the ‘starving time’. One problem was the cost and difficulty of moving heavy things over often muddy and impracticable dirt roads; three centuries ago, moving a ton of goods over 50 kilometres on land between, say, Liverpool and Manchester was as expensive as shipping them across the north Atlantic.

The development of coal-powered railroads and steamships revolutionised the lives of our ancestors. Among other positive developments, landlocked farmers could now specialise in what they did best and rely on other farmers and producers for their remaining needs. The result was not only more abundant food at ever-cheaper prices, but the end of widespread famine and starvation, as the surplus from regions with good harvests could now be shipped to those that had experienced mediocre ones. (Of course, a region that experienced a bumper crop one year might have a mediocre one the next.)

In time, petroleum-derived products such as diesel, gasoline, kerosene (jet fuel) and bunker fuels (used in container ships) displaced coal because of their higher energy density, cleaner combustion and greater ease of extraction, handling, transport and storage. Nearly two thirds of the world’s refined petroleum products are now used in land, water and air transportation, accounting for nearly 95 per cent of all energy consumed in this sector. Despite much wishful thinking, there are simply no better alternatives to petroleum-powered transport at the moment. For instance, despite very generous governmental subsidies, battery electric, hybrid electric and plug-in hybrid vehicles have repeatedly failed to gain any meaningful market share against gasoline-powered cars. This is because of their limited range and power, long charging time, bad performance in cold weather, safety concerns (especially in collisions), and inadequate electricity production and delivery infrastructure.

While the convenience of cars is obvious, few people grasp their historical significance in terms of public health and environmental benefits. The best historical anecdote on the topic goes something like this. In 1898, delegates from across the globe gathered in New York City for the world’s first international urban-planning conference. The topic that dominated discussions was not infrastructure, housing or even land use, but horse manure. The problem was that just as a large number of people had moved to cities from the countryside, so had powerful workhorses, each one of them producing between 15 and 30 pounds of manure and one quart of urine every day. For New York, this meant well over four million pounds of manure each day, prompting claims that by 1930 it would rise to Manhattan’s third storey. At about the same time, a contributor to The Times in London estimated that by 1950 every street in London would be buried nine feet deep in horse manure. Unable to think of any solution, the New York delegates called it quits after three days, as they concluded that urban living was inherently unsustainable.

Paradoxically, much of the urban-manure problem had been created by the advent of the railroad, and other technologies such as canning and refrigeration. On the one hand, it had cut into the profitability of manure-consuming farms, located near cities, by delivering cheaper perishable goods (fruits, vegetables, meat and dairy products) from locations that benefited from better soil and climate. On the other, because rail transport was not flexible enough to handle final deliveries, railroad companies often owned the largest fleets of urban horses.

Apart from their stench, urban stables and the manure piles that filled practically every vacant lot were prime breeding grounds for house flies, perhaps three billion of which hatched each day in American cities at the turn of the twentieth century. With flies came outbreaks of deadly infectious diseases, such as typhoid and yellow fever, cholera and diphtheria. Workhorses’ skittishness in heavy traffic also meant that they stampeded, kicked, bit and trampled a number of bystanders. According to one estimate, the fatality rate per capita in urban traffic was roughly 75 per cent higher in the horse era than today. The clatter of horseshoes and wagon wheels on cobblestone pavement was also incredibly noisy. They also created significant traffic congestion, because a horse and wagon occupied more street space than a modern truck, while a badly injured horse would typically be shot on the spot or abandoned to die on the road, creating a major obstruction that was difficult to remove in an age without tow trucks. (Indeed, street cleaners often waited for the corpses to putrefy so they could be sawed into pieces and carted off with greater ease.)

The impact of urban workhorses was also felt in the countryside. First, workhorses ate a lot of oats and hay. One contemporary British farmer calculated that one workhorse would consume the produce of five acres of land, which could have fed six to eight human beings. In the words of transportation historian Eric Morris, ‘directly or indirectly, feeding the horse meant placing new land under cultivation, clearing it of its natural animal life and vegetation, and sometimes diverting water to irrigate it, with considerable negative effects on the natural ecosystem’.

So, while early twentieth-century cars were noisy and polluting by today’s standards, they were a significant improvement on the alternatives. In later decades, advances such as the removal of lead from gasoline and the development of catalytic converters would essentially eliminate their more problematic features. Although not completely green, today’s petroleum-powered cars remain one of humanity’s most underappreciated environmental successes.

Railroads, ships and trucks also delivered significant environmental benefits. One longstanding problem, as the Marxist theorist Karl Kautsky observed in his 1899 classic The Agrarian Question, was that as ‘long as any rural economy is self-sufficient it has to produce everything which it needs, irrespective of whether the soil is suitable or not. Grain has to be cultivated on infertile, stony and steeply sloping ground as well as on rich soils.’ (1) In many locations without much prime agricultural land, primitive technologies ensured not only that at least 40 acres and a mule were required to sustain a household, but also that much environmental damage, primarily in the form of soil erosion, was done in the process. Fortunately, Kautsky observed, modern transportation had made possible the development of regions like the Canadian prairies and brought much relief to poorer soils in Europe, where more suitable forms of food production, such as cultivating orchards, rearing beef cattle and dairy farming, could now be practiced sustainably.

Over time, the concentration of food production in the world’s best locations allowed a lot of marginal agricultural land to revert to a wild state. For instance, France saw its forest area expand by one third between 1830 and 1960, and by a further quarter since 1960. This so-called ‘forest transition’ occurred in the context of a doubling of the French population and a dramatic increase in standards of living. Reforestation – or an improvement in the quality of the forest cover in countries such as Japan where it has no room to grow – has similarly occurred in all major temperate and boreal forests. Every country with a per-capita GDP now exceeding $4,600 – roughly equal to that of Chile – has experienced this, as well as some developing economies ranging from China and India to Bangladesh and Vietnam. (Of course, the replacement of firewood and charcoal with coal, kerosene, heavy oil and natural gas was also significant.)

The modern-logistics industry further allowed the production and export of food from locations where water is abundant to consumers living in regions where it isn’t, thus preventing the depletion of surface waters and aquifers in drier parts of the world. It also made possible a drastic increase in the size of our cities. In the words of economist Ed Glaeser: ‘Residing in a forest might seem to be a good way of showing one’s love of nature, but living in a concrete jungle is actually far more ecologically friendly… If you love nature, stay away from it.’ (2) To put things in perspective, cities now occupy between two and three per cent of the Earth’s surface, an area that is expected to double in the next half century. And in roughly half of the world today, far more agricultural land has been reverting to wilderness than has been converted to suburbia. (3)

Unfortunately, activists are often blind to the environmental benefits of petroleum-powered transportation. Countless local-food activists have embraced the notion of ‘food miles’ – the distance food items travel from farms to consumers – as the be all and end all of sustainable development. However, as has been repeatedly and rigorously documented in numerous studies, the distance travelled by food is unimportant. For one thing, producing food typically requires much more energy than moving it around, especially when significant amounts of heating and/or cold-protection technologies, irrigation water, fertilisers, pesticides and other inputs are required to grow things in one region, but not in another. Reducing food miles in such circumstances actually means a greater environmental impact. The distance travelled by food also matters less than the mode of transportation used. For instance, shipping food halfway around the world on a container ship often has a smaller footprint per item carried than a short trip by car to a grocery store to buy a small quantity of these items. (4)
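The container-ship comparison is easy to sanity-check. The emissions factors below are round-number assumptions of mine (container shipping is commonly put in the low tens of grams of CO2 per tonne-km, a petrol car at a couple of hundred grams per km):

```python
# Food miles vs mode of transport: a rough per-item comparison.
SHIP_G_CO2_PER_TONNE_KM = 15   # assumed container-ship factor
CAR_G_CO2_PER_KM = 200         # assumed petrol-car factor

def shipped_item_g(item_kg, distance_km):
    """CO2 attributable to one item of `item_kg` shipped `distance_km`."""
    return (item_kg / 1000.0) * distance_km * SHIP_G_CO2_PER_TONNE_KM

def driven_item_g(trip_km, items_bought):
    """CO2 per item for a car trip to the shop."""
    return trip_km * CAR_G_CO2_PER_KM / items_bought

print(f"1 kg item shipped 20,000 km: ~{shipped_item_g(1, 20_000):.0f} g CO2")
print(f"5 km drive for 2 items:      ~{driven_item_g(5, 2):.0f} g CO2 each")
# -> ~300 g for the item shipped halfway around the world,
#    ~500 g for each item fetched by car
```

Under these assumptions the short drive to the shop dominates the sea voyage, which is the point the studies cited above keep making.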

To most of us, the notion that we can have our cake and eat it too is mind-boggling. Yet, in many respects, this is what petroleum products in general and modern transportation technologies in particular have actually delivered. Until something truly better comes along, they remain essential for the creation of a wealthier, cleaner and more sustainable world.

SOURCE





Green energy policies are costing us the future

When it comes to energy prices, UK homeowners are being seriously ripped off. But it’s not the much demonised energy providers who are to blame. The real culprit, according to a report from think-tank Policy Exchange, is the spiralling cost of green subsidies, which has added £60 to the average energy bill over the past five years. The report suggests that energy suppliers are responsible for only approximately 19 per cent of the total cost of household energy; meanwhile, the government has direct control over more than a third of your energy bill, meaning that energy customers foot the bill for the UK’s drive towards renewables.

This green revolution – demanded by the quinoa-munching class and paid for by everyone else – has cost the UK more than enough already, with far too little to show for it. People are quick to forget that coal – much maligned by green-energy fanatics – was the fuel on which modern Britain was built. As the fossil fuels burned, families were lifted out of poverty, and life expectancy rose. Now, as developing nations emulate us, burning their own abundant fuel reserves in the process, the developed West has the nerve to condemn them for it.

In doing this, we are cruelly pulling up the drawbridge to cheap industrialisation, from the warmth and comfort of our own developed countries. Are we so blinded by green politics that we ignore how much we owe to our own, environmentally unfriendly, Industrial Revolution? The millions of people in China lifted out of poverty over the past decade were helped on their way by vast quantities of coal. When have wind turbines or solar panels ever lifted anyone out of poverty? As the global energy mix is forced towards a greater reliance on renewables, the opportunities for development in the poorest parts of the world are stifled.

In Britain, it’s time to rethink our own energy mix, follow America’s lead and turn to shale gas for our energy needs. Fracking would increase our available supply substantially, and bolster our energy security with it. Nuclear power presents another opportunity. Uranium is as clean as it is plentiful, and is a tried-and-tested winner in the countries that have embraced it.

If we don’t abandon unreliable wind and expensive solar, we’ll end up paying even more for our energy – with a smaller output to show for it. Britain’s misguided and expensive green adventure has served only to run up an enormous subsidy bill. As the taxpayer forks out, UK politicians pat themselves on the back for their efforts in mitigating the supposed threat of global warming, when, in reality, politicians’ impact on global emissions is negligible.

It’s time we put the wellbeing of fellow humans before green dogma. If more lives can be improved by burning fossil fuels, fracking or pursuing nuclear power, then surely it’s time to tear down the turbines and fire up the power plants.

SOURCE





Mr. President: The 1970s Called, They Want Their Crude Oil Export Ban Back

Not long ago, during a presidential campaign debate with Mitt Romney, President Barack Obama suggested Romney had, at one point, stated Russia was the number one geopolitical threat to the United States. The president then quipped—with his usual glibness—“The 1980s are now calling to ask for their foreign policy back.”

Well, Mr. President, you have a phone call, too. It’s the 1970s calling, and they want their crude oil export ban back.

The crude oil export ban was signed into law in 1975 in the wake of the Arab oil embargo that brought long lines for gasoline and high oil prices. Today, by contrast, hydraulic fracturing, also known as fracking, has made the United States the world’s largest producer of crude oil. The outdated export ban puts U.S. oil producers at a competitive disadvantage with other countries, and may actually serve to increase gas prices at the pump.

Imagine what would happen if we didn’t allow our farmers to export their crops. A farmer has just harvested a bumper crop, but he doesn’t have enough room to store it all, so he decides to sell it. There is a problem, though: all of his neighbors have bumper crops too, and that has driven domestic prices so low that the farmer will lose money if he sells at home, while the export ban prevents him from selling to other countries at a price that would turn a profit.

In the short term, this might sound like a great deal for people in this country who want to buy the farmers’ crops, because they will get lower prices. But that effect is only temporary, because the low prices cause some farmers to go out of business. Other farmers are forced to plant fewer crops the next year because they can’t afford to buy the seeds or fertilizer to grow more. As a result, we produce less food in this country, and we are forced to import food from other countries, making us more reliant on other countries to meet our most basic needs, often at a higher price than before. This is exactly what our crude oil export ban does to American energy producers and consumers.

West Texas Intermediate (WTI) is the price paid for American oil, and last month the WTI price was about eight dollars lower than the price for Brent oil, the price the rest of the world pays, putting U.S. energy companies at a distinct disadvantage vis-à-vis countries such as Russia, Saudi Arabia, and Venezuela. The price of oil produced in North Dakota is even lower than WTI, because oil refineries in the United States are not set up to process the light, sweet crude oil produced in this area.

Oil refineries in the United States are geared to run on heavy, sour crude oil, not the light, sweet crude that comes from North Dakota. A report by IHS, an energy consulting firm, states the United States is nearing “gridlock”, given the mismatch between the rapid growth of light sweet oil and the inability of the U.S. refining system to economically process these growing volumes.

Additionally, the report suggested the assumption that allowing crude oil exports would result in higher gasoline prices is inaccurate, because oil refineries are already allowed to export gasoline, meaning the price of gas at the pump already reflects global prices. The report also estimates lifting the ban could lower gas prices by an annual average of 8 cents per gallon, adding to the $675 the average American household is already saving from lower gas prices.
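For scale, the 8-cents-a-gallon estimate works out as follows; the household consumption figure is my assumption (roughly the U.S. average of a thousand-odd gallons a year), not part of the IHS report:

```python
# What an 8 cent/gallon saving means per household.
GALLONS_PER_HOUSEHOLD_YR = 1_000   # assumed typical U.S. household
SAVING_PER_GALLON = 0.08           # IHS estimate quoted above

saving = GALLONS_PER_HOUSEHOLD_YR * SAVING_PER_GALLON
print(f"~${saving:.0f} per household per year")  # -> ~$80, on top of the $675
```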

The U.S. Environmental Protection Agency recently released its long-awaited report on hydraulic fracturing and found it has “not led to widespread, systemic pollution of drinking water.” This is great news for our energy future. Now, Mr. President, let’s lift that export ban. It’s time to get frackin’.

SOURCE




The 'Hour of Power': Hybrid Motors come to ships

Ships to run off batteries for one hour -- enough to get them out of port and out from under local regulations

In 2015 two significant developments are going to make many operators, owners and builders of professional vessels consider hybrid marine power: firstly, new emissions laws in ports; secondly, there is now an incentive for high-technology manufacturers to invest in developing highly efficient batteries.

Hybrid is ‘here and now’ technology that is being used by many industries globally. The marine industry is now recognizing the potential of utilizing hybrid power and innovative propulsion systems for vessels in the sub IMO / sub 24 meter professional sector.

‘The Hour Of Power’ has been well received by the marine industry worldwide. This simple concept enables vessels to run in and out of port for an hour on battery-electric power, then carry out their open-sea work on diesel power. The aim of this innovative hybrid solution is to enhance conventional propulsion systems: vessels can reduce emissions and improve fuel consumption whilst extending engine maintenance periods and engine life.

This is not just green energy for the sake of it – ‘The Hour Of Power’ focuses on hybrid solutions linked to viable business cases. For commercial and professional organisations, the concept of running vessels with zero emissions at up to 10 knots for one hour will shape decisions that lead to improvements to in-service systems and the procurement of next-generation vessels. The overall objective is fuel saving and improved efficiency by all means.
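A back-of-envelope sizing shows why an hour at harbour speeds has become practical. The cruise-power figures are my assumptions for a pilot-boat-sized vessel at 10 knots, not manufacturer data:

```python
# Battery pack needed to hold a given cruise power for one hour.
def pack_kwh(cruise_kw, hours=1.0, usable_fraction=0.8, motor_eff=0.95):
    """Nominal pack size, allowing for depth-of-discharge limits and
    motor/drive losses (both figures assumed)."""
    return cruise_kw * hours / (usable_fraction * motor_eff)

for kw in (50, 75, 100):
    print(f"{kw:3d} kW for 1 h -> ~{pack_kwh(kw):.0f} kWh pack")
# -> roughly 66-132 kWh, i.e. on the order of one or two electric-car
#    packs, which is why automotive lithium ion progress matters here
```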

For the marine industry to move forward it needs to use expertise from aviation and other sectors to drive this innovation and support relevant safety standards. Automotive manufacturers in Europe, the Far East and the U.S. have recognised that hybrid technologies such as PHEV (Plug-in Hybrid Electric Vehicle) using lithium ion batteries will be dominant for the next decade. Reducing emissions from buses and trucks in the world’s major cities has been a major driver for lithium ion battery power storage. The need for self-sufficient land-based grid applications has further extended the capabilities of next-generation battery and hybrid technology.

There are two main types of hybrid system. A serial hybrid is where the engine only powers a generator, and is not mechanically connected to the propeller shaft. A parallel hybrid is where the engine is mechanically connected along with an electric ‘machine’ that can operate as both propulsion motor and generator.
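To make the serial/parallel distinction concrete, here is a toy power-flow sketch. The class names and efficiency figures are mine, purely illustrative assumptions, not any vendor's data:

```python
# Toy power-flow models of the two hybrid layouts described above.
class SerialHybrid:
    """Engine drives only a generator; the propeller is always motor-driven."""
    GEN_EFF, MOTOR_EFF = 0.94, 0.95  # assumed conversion efficiencies

    def shaft_kw(self, engine_kw, battery_kw):
        return (engine_kw * self.GEN_EFF + battery_kw) * self.MOTOR_EFF

class ParallelHybrid:
    """Engine is mechanically coupled to the shaft; the electric machine
    adds or absorbs power alongside it."""
    MOTOR_EFF = 0.95

    def shaft_kw(self, engine_kw, battery_kw):
        return engine_kw + battery_kw * self.MOTOR_EFF

print(SerialHybrid().shaft_kw(300, 0))    # ~268 kW: serial pays conversion losses
print(ParallelHybrid().shaft_kw(0, 60))   # 57 kW: the all-electric 'Hour of Power'
print(ParallelHybrid().shaft_kw(300, 0))  # 300 kW: diesel couples directly
```

The trade-off the sketch illustrates is that a serial layout always pays two conversion losses but decouples the engine from the propeller, while a parallel layout is more efficient on diesel and can still run all-electric in port.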

Certain sectors are potentially well suited to hybrid diesel / electric systems. These include wind farm service vessels and pilot boats that have relatively consistent duty cycles. We are entering a period of rapid change and commercial opportunity in the hybrid marine market. End-user organizations, boat builders, engine manufacturers and naval architects are now investigating systems for survey vessels, superyacht tenders, patrol vessels and unmanned craft.

SOURCE

***************************************


*****************************************