Friday, May 31, 2019

Japan finds a huge cache of scarce rare-earth minerals

Essential to much "Green" technology

Japan looks to replace China as the primary source of critical metals. Enough rare-earth minerals have been found off Japan to last centuries. Rare earths are important materials for green technology, as well as medicine and manufacturing. Where would we be without all of our rare-earth magnets?

Rare earth elements are a set of 17 metals that are integral to our modern lifestyle and efforts to produce ever-greener technologies. The "rare" designation is a bit of a misnomer: they are actually fairly plentiful, but they occur in small concentrations and are especially difficult to extract because they blend in with and resemble other minerals in the ground.

China currently produces over 90% of the world's supply of rare earth metals, with seven other countries mining the rest. So though they're not precisely "rare," they are scarce. In 2010, the U.S. Department of Energy issued a report warning of a critical shortage of five of the elements. Now, however, Japan has found a massive deposit of rare earths sufficient to supply the world's needs for hundreds of years.

The rare earth metals are mostly found in the second row from the bottom of the periodic table. According to the Rare Earth Technology Alliance, due to their "unique magnetic, luminescent, and electrochemical properties, these elements help make many technologies perform with reduced weight, reduced emissions, and energy consumption; or give them greater efficiency, performance, miniaturization, speed, durability, and thermal stability."

Japan located the rare earths near Minamitori Island, about 1,850 kilometers southeast of Tokyo. Engineers found the minerals in 10-meter-deep cores taken from sea-floor sediment. Mapping the cores revealed an area of approximately 2,500 square kilometers containing rare earths.

Japan's engineers estimate there are 16 million tons of rare earths down there, five times the amount of rare earth elements mined worldwide since 1900. According to Business Insider, there's "enough yttrium to meet the global demand for 780 years, dysprosium for 730 years, europium for 620 years, and terbium for 420 years."
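Those "years of supply" figures come from simple division: the estimated recoverable reserve of an element over annual global demand. A minimal sketch, in which the function name and the tonnage and demand numbers are made-up placeholders rather than figures from the article or any survey:

```python
# "Years of supply" is just reserve divided by annual demand,
# assuming demand stays constant. All numbers here are hypothetical.

def years_of_supply(reserve_tons: float, annual_demand_tons: float) -> float:
    """Years a reserve lasts at a constant annual rate of consumption."""
    return reserve_tons / annual_demand_tons

# A hypothetical 7,800-ton recoverable reserve against a
# 10-ton/year global demand would last 780 years.
print(years_of_supply(7_800, 10))  # 780.0
```

The assumption of constant demand is the weak point of any such estimate: if demand for an element grows, the headline number shrinks accordingly.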

The bad news, of course, is that Japan has to figure out how to extract the minerals from 6-12 feet under the seabed four miles beneath the ocean surface — that's the next step for the country's engineers. The good news is that the location sits squarely within Japan's Exclusive Economic Zone, so their rights to the lucrative discovery will be undisputed.


Trump to Replace Alarmist Climate Models With Sound Science

The president taps a Princeton scientist to head his new climate-review panel. 

Given the fact that extremist climate-change-prognostication models have been wildly inaccurate, it would be wise to avoid basing any serious environmental policy on them. In fact, anyone interested in following the sound and time-tested scientific method would demand nothing less, and yet the Leftmedia is up in arms over President Donald Trump’s recent decision to do just that. The New York Times blows its climate alarmist’s dog whistle with the headline, “Trump Administration Hardens Its Attack on Climate Science.” Hardly.

The Times fallaciously asserts that “the attack on science is underway,” supposedly evidenced by Trump’s appointment of geologist and former astronaut James Reilly as the director of the United States Geological Survey. And how is Reilly “attacking science”? By insisting upon the practice of sound science. The Times huffs, “Reilly … has ordered that scientific assessments produced by that office use only computer-generated climate models that project impact of climate change through 2040, rather than through the end of the century, as had been done previously.”

In reality, Trump is pushing for the government to return to adhering to sound scientific practice for informing policy decisions, rather than agenda-driven hysterics. James Hewitt, a spokesman for the Environmental Protection Agency, explained, “The previous use of inaccurate modeling that focuses on worst-case emissions scenarios [and] that does not reflect real-world conditions needs to be thoroughly reexamined and tested if such information is going to serve as the scientific foundation of nationwide decision-making now and in the future.”

The Trump administration is also creating a new climate-review panel to be headed by respected Princeton University scientist William Happer. Long an outspoken critic of the alarmism surrounding rising CO2 levels, Happer has argued, “The public in general doesn’t realize that from the point of view of geological history, we are in a CO2 famine. … There is no problem from CO2. The world has lots and lots of problems, but increasing CO2 is not one of the problems. So [the Paris accord] dignifies it by getting all these yahoos who don’t know a damn thing about climate saying, ‘This is a problem, and we’re going to solve it.’ All this virtue signaling.”

No more Chicken Little climate alarmism dictating policy. It’s time to return to sound, verifiable scientific practices that don’t elevate worst-case predictions as a means of pushing for ever-more government regulation.


Gasoline tax proposals versus a carbon tax

There are an increasing number of reports about the business community’s growing support for a carbon tax.  Americans for Carbon Dividends, CERES, and the CEO Climate Dialogue along with several environmental groups are urging Congress to pass legislation that would put a price on carbon—CO2 emissions.

According to Axios, 69% of Republicans are concerned that "their party's position on climate will hurt them with younger voters and 43% … said that their concern about climate change has increased in the past year." David Doniger at the Natural Resources Defense Council told Axios, "They see a rising public demand for action and they're smart enough to know this extreme denial of the Trump era will not last and may be coming to a halt in 2020."

Maybe they are right and a turning point has been reached, but don't bet on it.  The recent election in Australia is one piece of evidence.  Voters there had a clear choice between pro-growth policies and policies of higher taxes and income redistribution.  Growth won.  One way to test the public's receptiveness to a carbon tax is for Congress to propose and try to pass something like a 25- or 50-cent increase in the gasoline tax, since that would apply directly to CO2 emissions.

The reason for skepticism is the reality of how voters are reacting to proposals to raise state gasoline taxes.  Proposals in Ohio, Missouri, and Minnesota, for example, have run into fierce resistance from voters.

A study by two European economists captured the reality well.  Given concerns about climate change, “the option of raising gasoline taxes has received greater consideration in the American public policy debate. …Gasoline tax increases remain nevertheless highly unpopular. Public resistance to them is at least partly explained by their adverse distributional effects. In developed economies …gasoline is generally a necessity good in household consumption. Therefore, gasoline price increases tend to affect the poor more than the wealthy in relative terms. That is, they tend to be regressive.”

While the rhetoric surrounding a carbon tax might initially produce public support, especially given promises of returning some proceeds to individuals in the form of cash dividends, the realities of carbon taxes can only be hidden for so long.  The public would soon realize that the taxing mechanism would give Congress another way to feed its spending appetite, would not be as simple and straightforward as promised, and would be gamed by crony capitalists.

So, if Congress can get the public to buy into a gasoline tax increase, that would set the predicate for moving on to a carbon tax, in spite of the fact that it is a scam.  The impact on global CO2 emissions would be trivial, since emissions are increasing as a result of coal-fired power units being built by China, India, and other countries, while US emissions peaked in 2005.


Why predictions go wrong

Credentialed authorities are comically bad at predicting the future. But reliable forecasting is possible -- from non-experts

The idea for the most important study ever conducted of expert predictions was sparked in 1984, at a meeting of a National Research Council committee on American-Soviet relations. The psychologist and political scientist Philip E. Tetlock was 30 years old, by far the most junior committee member. He listened intently as other members discussed Soviet intentions and American policies. Renowned experts delivered authoritative predictions, and Tetlock was struck by how many perfectly contradicted one another and were impervious to counterarguments.

Tetlock decided to put expert political and economic predictions to the test. With the Cold War in full swing, he collected forecasts from 284 highly educated experts who averaged more than 12 years of experience in their specialties. To ensure that the predictions were concrete, experts had to give specific probabilities of future events. Tetlock had to collect enough predictions that he could separate lucky and unlucky streaks from true skill. The project lasted 20 years, and comprised 82,361 probability estimates about the future.

The result: The experts were, by and large, horrific forecasters. Their areas of specialty, years of experience, and (for some) access to classified information made no difference. They were bad at short-term forecasting and bad at long-term forecasting. They were bad at forecasting in every domain. When experts declared that future events were impossible or nearly impossible, 15 percent of them occurred nonetheless. When they declared events to be a sure thing, more than one-quarter of them failed to transpire. As the Danish proverb warns, “It is difficult to make predictions, especially about the future.”
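The article doesn't say how Tetlock graded those 82,361 probability estimates, but a standard metric for scoring probability forecasts against yes/no outcomes is the Brier score: the mean squared difference between the stated probability and what actually happened. A rough sketch with invented forecasts, not Tetlock's data:

```python
# Brier score: mean squared error between probability forecasts and
# 0/1 outcomes. Lower is better; 0.0 is a perfect forecaster.
# The forecasts and outcomes below are invented for illustration.

def brier_score(forecasts, outcomes):
    """Mean squared error of probability forecasts against 0/1 outcomes."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# An "expert" who called an event impossible (p = 0.0) that then
# occurred is penalized maximally on that item; a hedged 0.5 forecast
# is penalized moderately whatever happens.
print(round(brier_score([0.0, 0.5, 0.9], [1, 0, 1]), 3))  # 0.42
```

Scoring rules like this are what make "separating lucky streaks from true skill" possible: over tens of thousands of forecasts, confident misses accumulate into a measurably worse average than honest hedging.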

Even faced with their results, many experts never admitted systematic flaws in their judgment. When they missed wildly, it was a near miss; if just one little thing had gone differently, they would have nailed it. “There is often a curiously inverse relationship,” Tetlock concluded, “between how well forecasters thought they were doing and how well they did.”

Early predictions in Tetlock’s research pertained to the future of the Soviet Union. Some experts (usually liberals) saw Mikhail Gorbachev as an earnest reformer who would be able to change the Soviet Union and keep it intact for a while, and other experts (usually conservatives) felt that the Soviet Union was immune to reform and losing legitimacy. Both sides were partly right and partly wrong. Gorbachev did bring real reform, opening the Soviet Union to the world and empowering citizens. But those reforms unleashed pent-up forces in the republics outside Russia, where the system had lost legitimacy. The forces blew the Soviet Union apart. Both camps of experts were blindsided by the swift demise of the U.S.S.R.

One subgroup of scholars, however, did manage to see more of what was coming. Unlike Ehrlich and Simon, they were not vested in a single discipline. They took from each argument and integrated apparently contradictory worldviews. They agreed that Gorbachev was a real reformer and that the Soviet Union had lost legitimacy outside Russia. A few of those integrators saw that the end of the Soviet Union was close at hand and that real reforms would be the catalyst.

The integrators outperformed their colleagues in pretty much every way, but especially trounced them on long-term predictions. Eventually, Tetlock bestowed nicknames (borrowed from the philosopher Isaiah Berlin) on the experts he’d observed: The highly specialized hedgehogs knew “one big thing,” while the integrator foxes knew “many little things.”

Hedgehogs are deeply and tightly focused. Some have spent their career studying one problem. Like Ehrlich and Simon, they fashion tidy theories of how the world works based on observations through the single lens of their specialty. Foxes, meanwhile, “draw from an eclectic array of traditions, and accept ambiguity and contradiction,” Tetlock wrote. Where hedgehogs represent narrowness, foxes embody breadth.

Incredibly, the hedgehogs performed especially poorly on long-term predictions within their specialty. They got worse as they accumulated experience and credentials in their field. The more information they had to work with, the more easily they could fit any story into their worldview.

Unfortunately, the world’s most prominent specialists are rarely held accountable for their predictions, so we continue to rely on them even when their track records make clear that we should not. One study compiled a decade of annual dollar-to-euro exchange-rate predictions made by 22 international banks: Barclays, Citigroup, JPMorgan Chase, and others. Each year, every bank predicted the end-of-year exchange rate. The banks missed every single change of direction in the exchange rate. In six of the 10 years, the true exchange rate fell outside the entire range of all 22 bank forecasts.
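The bank-forecast evaluation described above boils down to a yearly range check: did the realized year-end rate land outside the span of all 22 forecasts? A sketch with hypothetical numbers (the study's actual data are not reproduced here, and the function name is my own):

```python
# For one year: True if the realized exchange rate missed the entire
# range of bank forecasts. All figures below are hypothetical.

def outside_forecast_range(forecasts, actual):
    """True if the realized value falls outside [min, max] of forecasts."""
    return actual < min(forecasts) or actual > max(forecasts)

# Hypothetical year: every bank forecast between 1.10 and 1.25 USD/EUR,
# but the rate ended the year at 1.05, below the whole range.
print(outside_forecast_range([1.10, 1.18, 1.25], 1.05))  # True
```

By this test, the study found the truth escaped the entire 22-bank range in six of ten years, a striking failure given how wide a 22-forecast span should be.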

In 2005, Tetlock published his results, and they caught the attention of the Intelligence Advanced Research Projects Activity, or IARPA, a government organization that supports research on the U.S. intelligence community’s most difficult challenges. In 2011, IARPA launched a four-year prediction tournament in which five researcher-led teams competed. Each team could recruit, train, and experiment however it saw fit. Predictions were due at 9 a.m. every day. The questions were hard: Will a European Union member withdraw by a target date? Will the Nikkei close above 9,500?

Tetlock, along with his wife and collaborator, the psychologist Barbara Mellers, ran a team named the Good Judgment Project. Rather than recruit decorated experts, they issued an open call for volunteers. After a simple screening, they invited 3,200 people to start forecasting. Among those, they identified a small group of the foxiest forecasters—bright people with extremely wide-ranging interests and unusually expansive reading habits, but no particular relevant background—and weighted team forecasts toward their predictions. They destroyed the competition.

Tetlock and Mellers found that not only were the best forecasters foxy as individuals, but they tended to have qualities that made them particularly effective collaborators. They were “curious about, well, really everything,” as one of the top forecasters told me. They crossed disciplines, and viewed their teammates as sources for learning, rather than peers to be convinced. When those foxes were later grouped into much smaller teams—12 members each—they became even more accurate. They outperformed—by a lot—a group of experienced intelligence analysts with access to classified data.

In Tetlock’s 20-year study, both the broad foxes and the narrow hedgehogs were quick to let a successful prediction reinforce their beliefs. But when an outcome took them by surprise, foxes were much more likely to adjust their ideas. Hedgehogs barely budged. Some made authoritative predictions that turned out to be wildly wrong—then updated their theories in the wrong direction. They became even more convinced of the original beliefs that had led them astray. The best forecasters, by contrast, view their own ideas as hypotheses in need of testing. If they make a bet and lose, they embrace the logic of a loss just as they would the reinforcement of a win. This is called, in a word, learning.


Australia: Adani coalmine in path of more than one endangered species in Queensland

Dilemma:  Approving the mine would cost the state Labor government its deputy leader at the next election and give the seat to the Greens. Premier Annastacia Palaszczuk with deputy Jackie Trad (in pink), above

Well fancy that. After months of dithering on Adani’s proposed Carmichael mine in the Galilee Basin and refusing to intervene in an approval process that saw the company frustrated at every turn, Queensland Premier Annastacia Palaszczuk has suddenly found her voice again. And here we were thinking Annastacia had a case of aphasia.

Saying she was “fed up” with the delays, Palaszczuk announced deadlines of May 31 and June 13 for the mine’s bird protection and groundwater management plans respectively. You might say when it comes to the demands of anti-Adani activists, Palaszczuk is desperately trying to give the appearance of not giving a flying finch.

“Well I think everyone’s had a gutful, so that’s why we have moved — why I have moved quickly — to resolve this issue,” she stated.

Notice the nuance? Actually, it was more an extended middle finger to Deputy Premier and Treasurer Jackie Trad, the leader of the dominant left faction which controls Cabinet and has constantly hindered Adani. At a press conference last weekend at the Gold Coast’s Sea World, the pair attempted to portray party unity.

With an election just over a year away, it is not just the government’s future at stake. In 2017, Trad’s formerly safe Labor inner-city seat of South Brisbane suffered an 11.7 per cent swing to the Greens. She holds the seat by only a 3.6 per cent margin thanks to the Liberal National Party’s preferencing Labor over the Greens in 2017 — a decision the LNP has announced it will not be repeating at the next election. Worse still for Trad, the national election showed this Greens incursion has increased, with some federal booths within her seat registering a swing as high as 15 per cent.

In what can only be described as a case of chronic denialism, both Palaszczuk and Trad have denied the delays in the Carmichael mine approval process had anything to do with Labor’s federal election rout in Queensland. “I think the Carmichael mine … was part of that message, but it wasn’t the entire message,” Trad told ABC radio last week.

To reiterate: Adani had planned to begin construction of the mine prior to Christmas last year, but this was delayed when the government ordered an independent review into the company’s environmental management plans for the black-throated finch. “We are now seeing more processes and actions coming in at the eleventh hour when we have been working on this for the best part of 18 months,” said an exasperated Adani mining chief executive, Lucas Dow, in December.

As if intentionally exacerbating this situation, the government rejected Adani’s management plan at the beginning of this month, the Queensland Department of Environment and Science claiming it “did not meet requirements”. This is the same department that last July appointed anti-Adani activist Dr Tim Seelig as an adviser, along with Greens candidates Kirsten Lovejoy and Gary Kane.

But all this has nothing to do with federal Labor’s primary vote in Queensland dropping to 27.3 per cent, right? Wrong. Palaszczuk and Trad have dug a hole for themselves so big it would have inspired Jules Verne, had he still been alive, to write a sequel to Journey to the Centre of the Earth. This bureaucratic and political farce is about protecting an endangered species alright, but it is the squawking Member for South Brisbane the government is concerned about, not the black-throated finch.

The party charade of trying to appease anti-Adani voters in the inner-city while attempting to convince those in the regions it is pro-mining intensified on the eve of the 2017 election. At that time, polling revealed that the Greens led Labor 51 per cent to 49 per cent on a two-party preferred basis in Trad’s seat.

Around the same time, Palaszczuk, to the disbelief of many, announced she had exercised a “veto” not to support an application by Adani for a $1 billion Northern Australia Infrastructure Facility loan. The ostensible basis for this was that she wanted to remove any perception of a conflict of interest, as her then partner, Shaun Drabsch, worked on the application to the NAIF with his employer, PricewaterhouseCoopers (PwC), which acted for Adani.

In attempting to defend her arbitrary decision, Palaszczuk cited that she had relied on advice from the Queensland Integrity Commissioner, Dr Nikola Stepanov. But as Jamie Walker of The Australian revealed, the commissioner’s advice was merely that Palaszczuk should exclude herself from Cabinet deliberations concerning the NAIF loan. In fact, ministers had, during a crisis Cabinet meeting five months before the Premier’s announcement, resolved not to support the NAIF loan bid.

Undoubtedly, Palaszczuk had succumbed to pressure from Trad, who later had the chutzpah to criticise the federal government’s NAIF program, saying it “has not yet seen a single dollar go to a Queensland project”. Earlier that year Trad had intervened to scotch Palaszczuk and then Treasurer Curtis Pitt’s agreement with Adani which would have seen royalties limited to $2m annually for the first seven years of the mine’s operation.

“I have never been anti-coal,” Trad told the Australian Financial Review in 2015. “I actually think it’s ridiculous to think we don’t use our natural resources — it’s one of our strengths”. Yet in February this year she told parliament “markets are moving away from thermal coal, communities are moving away from thermal coal, nation states are moving away from thermal coal”.

Translation: inner-city seats are moving away from Labor to the Greens.

“What we need to do as a coal exporter is understand that, and equip our communities with the best possible chance of re-skilling, and that’s why we’re focused on other materials,” she said. Contrast this West End insouciance with the urgency of a group of Labor regional MPs that, as The Courier-Mail reported this week, is threatening to form a sub-caucus.

One wonders how long, were it not for the federal election forcing its hand, the Queensland government was prepared to prolong this debacle. Put simply, it cannot have had a viable exit strategy, for Adani has committed far too much to abandon the project. In the event the government refuses approval, it has, through its intransigence and decisions based on ulterior motives, left itself open to a compensation claim amounting to hundreds of millions, perhaps even billions.

It would be a legal battle that would take years to finalise. Of course, that will affect neither Palaszczuk nor Trad. By then they will be enjoying retirement and a generous taxpayer-funded pension.

Palaszczuk appears destined for Opposition unless she can clear the way for Adani quickly. Of course that would mean Trad would lose her seat, but the Deputy Premier need not despair. After all, there is always re-skilling.



For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  



Thursday, May 30, 2019

Newest drought scare just another climate scam

Federal bureaucrats are propagating another climate scare this week, claiming their new study shows global warming is causing drought and will soon result in “unprecedented drying.” The underlying data, however, show the bureaucrats are misrepresenting the results of their study.

Led by researchers at NASA’s Goddard Institute and the Lawrence Livermore National Laboratory, the authors acknowledge, “The most recent Intergovernmental Panel on Climate Change (IPCC) report indicates only low confidence in attributing changes in drought” to global warming. The authors then set about to try to change that assessment.

The authors utilized soil moisture measurements and computer models in an attempt to discern connection and causation between global warming and drought. The authors reported drought increased during the early 20th century, some 100 years of global warming ago.

Noting, however, that “a negative trend indicates that the data and [warming] fingerprint are increasingly dissimilar,” the authors acknowledged that “In the middle of the twentieth century, these trends become negative.”

From 1981 through the present, the authors reported, “the signal of greenhouse gas forcing is present but not yet detectable at high confidence.” The signals were so small, the authors acknowledged, that they “are not detectable at the likely level over background noise.”

In summary, the authors found that between 1950 and 1980, the signal was the opposite of what one would expect if global warming causes drought. From 1981 through the present, there was a signal so small that it was indistinguishable from background noise. The only detectable signal connecting warming temperatures and drought was during a period 100 years ago, when temperatures were cooler than today. The warming since then has not caused any detectable drought.

The findings strike a hammer blow against the notion that global warming causes alarming levels of drought, or even any detectable drought at all. People who work at climate change departments for the federal government, however, must keep the notion of a climate crisis alive to preserve their apparent importance and their jobs. So here are a few snippets of how the authors spun the story, according to USA Today:

“’The big thing we learned is that climate change started affecting global patterns of drought in the early 20th century,’” said study co-author Benjamin Cook of the NASA Goddard Institute for Space Studies and Columbia University’s Lamont-Doherty Earth Observatory. ‘We expect this pattern to keep emerging as climate change continues.’”

“Lead author Kate Marvel, a climate modeler at Goddard and Columbia University, said, ‘It’s mind boggling. There is a really clear signal of the effects of human greenhouse gases on the hydroclimate.’”

“‘All the models are projecting that you should see unprecedented drying soon, in a lot of places,’” Marvel said.

Actually, we are only seeing unprecedented alarmism, everywhere taxpayer dollars are to be had.


White House hardens attack on climate hoax

President Trump has rolled back environmental regulations, pulled the United States out of the Paris climate accord, brushed aside dire predictions about the effects of climate change, and turned the term "global warming" into a punch line rather than a prognosis.

Now, after two years spent unraveling the policies of his predecessors, Trump and his political appointees are launching a new assault.

In the next few months, the White House will complete the rollback of the most significant federal effort to curb greenhouse-gas emissions, initiated during the Obama administration. It will expand its efforts to impose Trump's hard-line views on other nations, building on his retreat from the Paris accord and his recent refusal to sign a communiqué to protect the rapidly melting Arctic region unless it was stripped of any references to climate change.

And, in what could be Trump's most consequential action yet, his administration will seek to undermine the  science on which climate change policy rests.

As a result, parts of the federal government will no longer fulfill what scientists say is one of the most urgent jobs of climate science studies: reporting on the future effects of a rapidly warming planet and presenting a picture of what the Earth could look like by the end of the century if the global economy continues to emit heat-trapping carbon dioxide pollution from burning fossil fuels.

The administration's prime target has been the National Climate Assessment, produced by an interagency task force roughly every four years since 2000. Government scientists used computer-generated models in their most recent report to project that if fossil fuel emissions continue unchecked, the Earth's atmosphere could warm by as much as 8 degrees Fahrenheit by the end of the century. That would lead to drastically higher sea levels, more devastating storms and droughts, crop failures, food losses, and severe health consequences.

Work on the next report, which is expected to be released in 2021 or 2022, has already begun. But from now on, officials said, such worst-case scenario projections will not automatically be included in the National Climate Assessment or in some other scientific reports produced by the government.

"What we have here is a pretty blatant attempt to politicize the science - to push the science in a direction that's consistent with their politics," said Philip Duffy, president of the Woods Hole Research Center, who served on a National Academy of Sciences panel that reviewed the government's most recent National Climate Assessment. "It reminds me of the Soviet Union."

In an e-mail, James Hewitt, spokesman for the Environmental Protection Agency, defended the proposed changes.

"The previous use of inaccurate modeling that focuses on worst-case emissions scenarios, that does not reflect real-world conditions, needs to be thoroughly reexamined and tested if such information is going to serve as the scientific foundation of nationwide decision-making now and in the future," Hewitt said.

However, the goal of political appointees in the Trump administration is not just to change the climate assessment's methodology, which has broad scientific consensus, but also to question its conclusions by creating a new climate review panel. That effort is led by William Happer, a 79-year-old physicist who had a respected career at Princeton but has become better known in recent years for attacking the science of man-made climate change and for defending the virtues of carbon dioxide.

“The demonization of carbon dioxide is just like the demonization of the poor Jews under Hitler,” the physicist, William Happer, who serves on the National Security Council as the president’s deputy assistant for emerging technologies, said in 2014 in an interview with CNBC.

Mr. Happer’s proposed panel is backed by John R. Bolton, the president’s national security adviser, who brought Mr. Happer into the N.S.C. after an earlier effort to recruit him during the transition.

Mr. Happer and Mr. Bolton are both beneficiaries of Robert and Rebekah Mercer, the far-right billionaire and his daughter who have funded efforts to debunk climate science. The Mercers gave money to a super PAC affiliated with Mr. Bolton before he entered government and to an advocacy group headed by Mr. Happer.

Climate scientists are dismissive of Mr. Happer; his former colleagues at Princeton are chagrined. And several White House officials — including Larry Kudlow, the president’s chief economic adviser — have urged Mr. Trump not to adopt Mr. Happer’s proposal, on the grounds that it would be perceived as a White House attack on science.

Even Stephen K. Bannon, the former White House strategist who views Mr. Happer as “the climate hustler’s worst nightmare — a world-class physicist from the nation’s leading institution of advanced learning, who does not suffer fools gladly,” is apprehensive about what Mr. Happer is trying to do.

“The very idea will start a holy war on cable before 2020,” he said. “Better to win now and introduce the study in the second inaugural address.”

But at a White House meeting on May 1, at which the skeptical advisers made their case, Mr. Trump appeared unpersuaded, people familiar with the meeting said. Mr. Happer, they said, is optimistic that the panel will go forward.

More HERE 

NY’s green new dud

New York produced less electricity from renewable sources in 2018 than it did the year before despite significant intervention by state government.

Governor Andrew Cuomo’s Public Service Commission (PSC) in August 2016 ordered utilities and large electricity customers to subsidize new renewable projects by purchasing credits, with an eye toward having half the state’s electricity come from renewables by 2030. Cuomo has recently touted the policy as part of his version of a “Green New Deal,” and hiked the 2030 renewable target to 70 percent.

The PSC doesn’t calculate the total cost of its renewable mandate, and utilities are prohibited from showing ratepayers the impact on their bills. But based on the number of renewable energy credits (RECs) utilities must buy and the price at which they were being sold by the New York State Energy Research & Development Authority (NYSERDA), the mandate will cost about $50 million this year and rise to more than $100 million by 2021.
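
The arithmetic behind that estimate is just volume times price. A minimal sketch, with illustrative placeholder inputs (the actual REC volumes and NYSERDA prices are not given in the piece):

```python
# Back-of-envelope REC mandate cost: credits purchased times per-credit
# price. The inputs below are illustrative placeholders, not actual
# NYSERDA figures.

def mandate_cost(recs_purchased, price_per_rec):
    """Total ratepayer cost of the mandate in dollars."""
    return recs_purchased * price_per_rec

# e.g., 2.27 million RECs at $22 apiece is about $50 million
print(mandate_cost(2_270_000, 22.0))  # → 49940000.0
```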

The New York Independent System Operator (NYISO), which oversees the wholesale electricity market, detailed 2018 generation in its annual Power Trends report this month, and the report calls into question what New Yorkers are getting for their generosity.

The data show renewable energy generators, including hydroelectric, wind, solar and others, together produced 35,808 gigawatt hours of electricity last year. That amounted to a 2.5 percent drop from 2017. All told, 26.4 percent of generation last year came from renewables, down from 28.0 percent the year before. The numbers don’t reflect the state’s efforts promoting behind-the-meter renewables, such as most solar panel deployments, which reduce demand for power from the grid and aren’t easily quantified. That said, New York customers used more electricity from the grid in 2018 than in 2016 or 2017.

As shown below, the state does not appear to be on the path to reaching Cuomo’s initial 50 percent goal, let alone his more recent 70 percent target. In fact, the state has yet to hit the 30 percent target that state regulators set in 2010, which they had hoped to reach by 2015.

Most of the renewable energy came from the state Power Authority’s Niagara Falls and Massena dams, where annual output fluctuates based on weather and operational decisions. But excluding all hydroelectric power reveals that less renewable energy (6,763 GWh) was sold on the grid in 2018 than in 2015 (7,064 GWh), before the Clean Energy Standard was adopted.

How could that be? For one thing, the electric grid hasn’t been able to deliver it. NYISO warned state regulators in 2010 that transmission would be an issue if the state wanted to promote wind development, and roughly 70 GWh of wind energy last year had to be “curtailed” because the grid couldn’t move it from its upstate source to where demand was higher.

Still other projects have been stuck on the drawing board because proposals to build solar panel and wind turbine installations in rural upstate regions have met local opposition. The state has taken the unconventional step of using state resources to help renewable-friendly local officials change municipal codes and smooth the way for private developers.

The state’s ham-handed approach to subsidies initially failed to account for the many smaller renewable generators receiving state support under a Pataki-era program. One biomass generator went out of business at the end of 2017 because state officials couldn’t, in the intervening 16 months, fold it into the new subsidy regime.

Even when issues with transmission and existing facilities are worked out, the state faces additional headwinds.

New York’s program has relied on a pair of lucrative federal incentives that are now being phased out. Wind projects that break ground after December 31 will no longer be eligible for the Production Tax Credit (PTC), which pays owners a subsidy for each kilowatt-hour generated. The Investment Tax Credit (ITC) that benefits larger solar projects will pay out 30 percent of capital costs on projects built this year before unwinding down to 10 percent in 2021.

Meanwhile, close to two-thirds of the current wind energy capacity (1,162 MW of 1,739 MW) came online prior to 2009. Assuming a 20-year project life expectancy, some facilities will likely need to be replaced before 2030.
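
A quick check of the figures above, using the stated capacity numbers and the article's own 20-year life assumption:

```python
# Share of wind capacity that came online before 2009, and the implied
# retirement year under a 20-year project life.

pre_2009_mw, total_mw = 1162, 1739
share = pre_2009_mw / total_mw
print(round(share, 3))  # → 0.668, i.e. roughly two-thirds

latest_online, project_life = 2008, 20  # "prior to 2009" means 2008 at the latest
print(latest_online + project_life)     # → 2028, before the 2030 target
```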

All told, New York’s renewable energy policies have never been poised for success.

First, the Cuomo administration has been more interested in tactics, such as building more renewables, than in the actual goal of lowering carbon dioxide emissions.

And even that tactic was over-constrained: the PSC explicitly disqualified most hydroelectric power from state subsidies in no small part because it might make greater economic sense to finance a new dam in Quebec than to deploy solar panels in places like snowy Oswego County.

To cite one hypothetical alternative: if the state had imposed a broad carbon tax, people and businesses (and perhaps even governments) would have found efficiencies and reduced emissions at the lowest cost. But Cuomo’s “Green New Deal” has never been about environmental outcomes. That became clear when the Cuomo administration last year took steps to steer construction work on renewable energy projects to building trade unions.

The anemic growth of land-based wind and other renewables will increase pressure on the administration to go big and further subsidize construction of larger wind turbines off Long Island. Unfortunately for ratepayers, offshore wind is the single most expensive type of renewable energy, and the Cuomo administration has already signaled most of the necessary billions will come from electricity customers north of New York City.


Rejecting Wind and Solar: Deep Green Resistance

Yesterday’s post shared with readers the scary premises and methods of Deep Green Resistance, now the Progressive/Left alternative to the Green New Deal.

Today’s post shares the DGR’s views on renewables, which this group correctly sees as invasive to the natural world. One wishes that mainstream, Washington, DC-centric environmentalists would wake up to the fact that wind power and solar panels are very invasive to the natural world relative to dense, mineral energies.

Here are the DGR’s views, verbatim.

Will Green Technology Save the Planet?

No. Wind turbines, solar PV panels, and the grid itself are all manufactured using cheap energy from fossil fuels. When fossil fuel costs begin to rise, such highly manufactured items will simply cease to be feasible.

Solar panels and wind turbines aren’t made out of nothing. They are made out of metals, plastics, and chemicals. These products have been mined out of the ground, transported, processed, manufactured. Each stage leaves behind a trail of devastation: habitat destruction, water contamination, colonization, toxic waste, slave labor, greenhouse gas emissions, wars, and corporate profits.

The basic ingredients for renewables are the same materials that are ubiquitous in industrial products, like cement and aluminum. No one is going to make cement in any quantity without using the energy of fossil fuels. And aluminum? The mining itself is a destructive and toxic nightmare from which riparian communities will not awaken in anything but geologic time.

From beginning to end, so called “renewable energy” and other “green technologies” lead to the destruction of the planet. These technologies are rooted in the same industrial extraction and production processes that have rampaged across the world for the last 150 years.

We are not concerned with slightly reducing the harm caused by industrial civilization; we are interested in stopping that harm completely. Doing so will require dismantling the global industrial economy, which will render impossible the creation of these technologies.

Aren’t renewable energies like solar, wind, and geothermal good for the environment?

No. The majority of electricity that is generated by renewables is used in manufacturing, mining, and other industries that are destroying the planet. Even if the generation of electricity were harmless, the consumption certainly isn’t. Every electrical device, in the process of production, leaves behind the same trail of devastation. Living communities — forests, rivers, oceans — become dead commodities.

The emissions reductions that renewables are intended to achieve could easily be accomplished by improving the efficiency of existing coal plants, businesses, and homes, at a much lower cost. Within the context of industrial civilization, this approach makes more sense both economically and environmentally.

That this approach is not being taken shows that the whole renewables industry is nothing but profiteering. It benefits no one other than the investors.

OK, renewable technologies have some impacts, but they’re still better than fossil fuels, right?

Renewable energy technologies are better than fossil fuels in the same sense that a single bullet wound is “better” than two bullet wounds. Both are grievous injuries.

Do you want to shoot the planet once or twice?

The only way out of a double bind is to smash it: to refuse both choices and craft a completely different path. We support neither fossil fuels nor renewable tech.

Even this bullet analogy isn’t completely accurate, since renewable technologies, in some cases, have a worse environmental impact than fossil fuels.

More renewables doesn’t mean less fossil fuel power, or less carbon emissions. The amount of energy generated by renewables has been increasing, but so has the amount generated by fossil fuels. No coal or gas plants have been taken offline as a result of renewables.

Only about 25% of global energy use is in the form of electricity that flows through wires or batteries.  The rest is oil, gas, and other fossil fuel derivatives. Even if all the world’s electricity could be produced without carbon emissions, it would only reduce total emissions by about 25%. And even that would have little meaning, as the amount of energy being used is increasing rapidly.
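
The bound in the paragraph above is simple proportionality; a minimal sketch (the 25% electricity share is the passage's figure, and treating emissions as proportional to energy use is a simplification):

```python
# If electricity is only a quarter of final energy use, then fully
# decarbonizing it can cut total energy emissions by at most that share.
# Assumes emissions scale proportionally with energy use.

def max_reduction(electricity_share, fraction_decarbonized=1.0):
    """Upper bound on the total emissions cut from decarbonizing electricity."""
    return electricity_share * fraction_decarbonized

print(max_reduction(0.25))  # → 0.25, i.e. at most a 25% cut
```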

It’s debatable whether some “renewables” even produce net energy.  The amount of energy used in the mining, manufacturing, research and development, transport, installation, maintenance, grid connection, and disposal of wind turbines and solar panels may be more than they ever produce; claims to the contrary often do not take all the energy inputs into account.  Renewables have been described as a laundering scheme: dirty energy goes in, clean energy comes out.


Is the Long Renewables Honeymoon Over?

The European renewables industry press, which is usually unequivocally upbeat in its assessments, is currently reporting a broad spectrum of substantial problems in the sector, ranging from bankruptcies and technical problems to tepid policy support and increasing public resistance.

In a fundamentally viable energy generation sector such stories could be regarded as minor perturbations, but in one that has for decades been all but completely insulated from risk by subsidy and other non-market support, they suggest deep-seated structural and physical weakness.

The German wind turbine manufacturer Senvion S.A., formerly trading under the name RePower, is currently in financial difficulties. The Hamburg-based firm, which has installed over 1,000 wind turbines in the UK alone, applied to commence self-administered insolvency proceedings in mid-April this year, and is at present sustained by a EUR 100m loan agreement with its lenders and main bondholders. Senvion has delayed both its AGM, which was due to take place on 23 May, and the publication of its recent financial results. At the time of writing the company had not yet announced a new timetable.

For nearly eight years, from 2007 to 2015, Senvion was owned by the Indian wind turbine manufacturer Suzlon, and it is now the property of the private equity firm Centerbridge Partners. It is currently rumoured in the industry press that Centerbridge may be compelled to cut its losses by making a distressed sale to Asian, probably Chinese, companies seeking a cheap way of acquiring a toehold in the European wind power market. Western companies are thought unlikely to have the appetite for such a purchase, and their reluctance is entirely understandable: as Ed Hoskyns shows in a recent note for GWPF using EurObservER data, annual installation rates for wind and solar in the EU28 have halved since 2010. Senvion may be the first major company to feel the effects of this downturn, and it is certainly large enough for its difficulties to have wide ramifications: two of its suppliers, FrancEole, which makes towers, and the US company TPI Composites, which makes blades, are both being hurt by reduced revenues. Indeed, FrancEole was already in a poor way, and is now reported to be on the verge of liquidation.

Projects that were being supplied by Senvion are also affected, with the building of one, Borkum West 2.2, a 200 MW offshore wind farm, being suspended mid-construction since components due from Senvion have not been delivered on schedule. This delay, which has been front-page news in some circles, must be causing considerable headaches for Borkum West’s developer, Trianel GmbH, which is apparently now seeking to establish direct links with Senvion’s suppliers so that they can complete the project.

Elsewhere in the offshore wind universe, two large and relatively new projects are in the midst of what must be costly repairs involving significant downtime. Having received regulatory approval, the Danish mega-developer Orsted is about to start removing and renovating all 324 blades on the 108-turbine, 389 MW Duddon Sands wind farm in the UK part of the Irish Sea, a year after problems first became apparent. The machines used, the Siemens 3.6–120, have suffered leading-edge erosion, a problem that affects perhaps some 500 turbines in Europe (see “Type Failure or Wear and Tear in European Offshore Wind?”) and requires the application of a remedial covering to each blade.

Less can be read in the public domain about the repairs about to restart at the gigantic, EU-funded Bard Offshore 1, which is owned by Ocean Breeze Energy GmbH & Co. KG. The project, which was commissioned in 2013, has eighty 5 MW turbines, for a total capacity of 400 MW. Bard had already suffered a well-known series of cable failures, and it now transpires that both nacelles and rotors have been undergoing replacement for about two years, though Ocean Breeze is, according to industry press reports, declining to confirm how many turbines are affected. The company’s website gives no information, in either German or English, that I could find.

There would, then, appear to be a great deal of work in servicing offshore wind installations, but this has not been enough to prevent Offshore Marine Management Ltd (OMM), a UK-based offshore wind contractor, from entering voluntary liquidation after several years of losses. Interestingly, OMM, a relatively small company though prominent in the UK, cited the increasingly “competitive nature” of the sector as a factor underlying its failure, and it seems likely that it was unable to survive the efforts of developers determined to pare capital, operational, and maintenance costs to the bone (and, judging from the failures reported, perhaps into the bone itself). With margins pared thin, costly local suppliers may quite simply be forced out of the market, regardless of their other merits. Related evidence of this phenomenon, which is clearly global, can be found in the fact that the Danish mega-developer Orsted is now grumbling that the Taiwanese government’s insistence on a high level of local content for its projected 900 MW Changhua 1 & 2a offshore wind farms will double the capital cost from approximately £1.6m/MW to about £3m/MW.

One wonders whether this underlying reality was discussed at the recent and apparently robust meeting between the Scottish Government and the offshore wind industry, convened because the Scottish metal manufacturing firm BiFab had not been commissioned to make equipment for the 950 MW Moray East wind farm, which holds one of the much over-hyped Contracts for Difference at £57.50/MWh. The supply deals had instead been awarded to Lamprell, which is based in the UAE. The Scottish Energy Minister, Paul Wheelhouse, MSP, used the meeting to express “significant frustration” that local firms had hitherto been involved to such a small degree, in spite of repeated promises.

Did Benji Sykes of the Offshore Wind Industry Council, present at the meeting, cite the Taiwanese case and explain to Mr Wheelhouse that something very similar would apply in Scotland, and that if local content was insisted upon, then construction costs would increase substantially and subsidies would also have to be increased to pay for it? Did he explain that there is genuine doubt whether Moray East can be viable at £57.50/MWh, even with low-cost international suppliers, and that local content would certainly not improve that situation? It would seem not. However, he did promise to “work closely” with the Scottish government to “ensure that communities up and down the country reap the economic benefits offshore wind offers”. Mr Wheelhouse has probably heard that before. How much longer will he go on believing it?

So much for the action in the foreground. The backdrop is also sombre. The Crown Estate, which in effect controls offshore wind development in UK territorial waters, has delayed pre-qualification for Round 4 projects until after the summer of 2019, and the German maritime agency, the BSH, has disappointed developers by not assigning new development zones as had been requested. In delay is danger, and the offshore wind industry in general will be deeply concerned at the loss of momentum that may result from these decisions.

Onshore wind is doing no better. The most recent auction for wind contracts in Germany took place in February and was radically undersubscribed, with only 476 MW of a possible 700 MW awarded; the underlying causes are reported to be less favourable planning-consent regulations and less generous price support. Senvion itself is described in some reports as one of the supply-chain casualties, alongside the German tower and foundation maker Ambau GmbH, which has already filed for bankruptcy.

One wonders why these companies were not better prepared. Reductions in subsidy in Germany were inevitable, and the tightening of planning regulations is long overdue and unsurprising. Indeed, it is remarkable that the German public has tolerated for so long such intense development in close proximity to domestic housing. However, some German states are now considering an exclusion zone of 1 km from the nearest turbine, which is still extremely close for structures in excess of 100m, now heading, believe it or not, to over 200m in overall height. The German people have been patient, but the mood is clearly changing; indeed, the premier manufacturer and developer Enercon has recently been compelled by court order to suspend construction of its 30 MW Wulfershausen wind farm because it had, apparently, breached the local authority’s requirement that no dwelling should be within a distance of ten times tip height.

This less favourable atmosphere is contributing to a general sense that existing onshore wind farms in Germany will not be repowered in great numbers at the end of their lives. About 15 GW of Germany’s onshore wind capacity is now over fifteen years old, and the end of its economic lifetime is in sight. But industry sources quoted in the subscription-only press suggest that less than a third of this will actually be repowered, much less than was expected only a few years back. The reasons given for this sudden change in prospects include declining public acceptance, reflected in tougher planning conditions, and falling subsidies.

Meanwhile, in its home territory of Norway and in Sweden, Statkraft, Europe’s largest generator of renewable energy, has suspended further onshore wind construction because it would be “very challenging” to develop profitable projects in these areas. It is concentrating instead on less resistant markets, such as the United Kingdom, where it has acquired a 250 MW portfolio of projects from Element Power.

But as it happens, things in the UK may prove no more promising. It has just dawned on the wind industry that the government is actually acting on Amber Rudd’s landmark energy “reset” speech of November 2015, delivered when she was Secretary of State for Energy and Climate Change. In that speech Rudd remarked that “we also want intermittent generators to be responsible for the pressures they add to the system”. That was only right, but perhaps the industry hoped the intention would never materialise. If so, it was gravely mistaken. Aurora Energy Research has now released analysis of the regulator Ofgem’s proposal to reform network charges, the “Targeted Charging Review”, and believes that the proposed changes “could set back subsidy-free renewables by up to five years”. When “unspun”, what this actually means is that if the regulator removes the hidden subsidy of avoided system costs, imposed by renewables but socialised over all generators, then more of the true cost of renewables will be revealed to the market, making it much less likely that even the most greenwash-thirsty corporate, NGO, or governmental body will sign an extravagant long-term Power Purchase Agreement (PPA) with a wind or solar farm. In other words, far from hindering the emergence of subsidy-free renewables, Ofgem’s reforms threaten to give the lie to the subsidy-free claim and show that it was never anything more than an empty PR gambit.

In spite of all this, it is doubtless too soon to say that the game is up for renewables. The industries concerned will fight back, and beg further direct and indirect public assistance while threatening politicians and civil servants with missed climate targets if that support is not forthcoming. In all likelihood they will be to some degree successful. But this will only delay the inevitable. As the depressing news stories summarised above suggest, after decades of public support and de-risking there are still fundamental weaknesses in the renewables industry that go well beyond teething troubles and localised management failure. One explanation, the sole necessary one in my view, is that the physics is against this industry, and that the physics is beginning to tell. It remains only to say that this blog is not licensed to give investment or financial advice.



For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here


Wednesday, May 29, 2019

Sea Level Rise Touted In New UN IPCC Report Is Mega Scary!

And is totally contradicted by history

For the upcoming Sixth Assessment Report (AR6), the rate of global mean sea level rise over 1980-2000 is touted as faster than during any preceding 20-year period since at least 1000 BCE.

The IPCC witch-doctors are rewriting their sea level narrative, see below for their wording and our comments:

Sea level change over recent decades is unprecedented over the last several millennia (medium confidence) and the rate of global mean sea level rise has increased in recent decades (high confidence).

This is incorrect, as now-emerged lands are full of marine history, and many ancient ports now lie several kilometers inland.

Ostia Antica was the harbor city of ancient Rome 2,000 years ago.  It now lies 3 kilometers from the sea.

8,000 years ago, when the sea levels were about 4.5 m above present levels, the shoreline of the South China Sea almost reached Phnom Penh and the Tonle Sap Great Lake. They are now far from the sea.

Over the 20th century, tide gauge-based reconstructions show that global mean sea level has risen by 0.15-0.22 m between 1901 and 2015 (high confidence), and this increase was faster than that of any century since at least 1000 BCE (medium confidence).

This is also incorrect, as the tide gauges that have recorded since the start of the last century show a completely different story from what is told in subjective reconstructions based on cherry-picking.

If mathematics is not an opinion, a 0.15-0.22 m sea level rise from 1901 to 2015 translates to a rate of 1.32 to 1.93 mm/yr. Statistics from long-term-trend (LTT) tide gauges tell us that the naïve average relative rate of rise (a much better measure than cherry-picked reconstructions that also vary over time) is much less than that, at about 0.33 mm/yr.
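
The rate conversion above is easy to check; a minimal sketch (the 114-year span between 1901 and 2015 is the only input besides the quoted totals):

```python
# Convert a total sea level rise in metres over 1901-2015 into an
# average rate in mm/yr, as in the passage above.

def avg_rate_mm_per_yr(total_rise_m, start_year=1901, end_year=2015):
    years = end_year - start_year       # 114 years
    return total_rise_m * 1000 / years  # metres to millimetres

print(round(avg_rate_mm_per_yr(0.15), 2))  # → 1.32
print(round(avg_rate_mm_per_yr(0.22), 2))  # → 1.93
```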

Coupling the relative sea level rise information with GNSS monitoring of domes near the tide gauges, the so-called thermosteric component, or absolute rate of sea level rise, is also about 0.33 mm/yr. This is compatible with a gentle recovery of temperatures since the end of the Little Ice Age.

The rate of global mean sea level rise over 1980-2000 was faster than during any preceding 20-year period since at least 1000 BCE (low confidence). Global mean sea level very likely rose on average by 1.2 [0.9-1.7] mm yr-1 over 1901-1990 and 1.7 [1.3-1.9] mm yr-1 over 1901-2015 and 3.1±0.3 mm yr-1 over 1993-2017 (high confidence).

This other wrong statement oversells their alleged satellite global mean sea level (GMSL) measurement and mixes apples with cherries, comparing subjective interpretations of tide gauge results with the engineered GMSL product, created to show what is not there.

No matter what the IPCC witch-doctors say, there is no such thing as an instantaneous measure of the volume of the ocean waters with nanometric precision.

The truth is that a noisy, almost detrended satellite altimeter signal has been manipulated in successive rounds of administrative corrections to represent whatever was needed, with many pathetic excuses. This engineered product should not replace good tide gauge observations. To be reliable, the GMSL product should match the readings of tide gauges corrected for land subsidence.

In the LTT tide gauges (Figures 1 and 2 show two examples), sea levels have oscillated about the same trend line before 1980, between 1980 and 1990, and since 1990. There is no difference in behavior.

Sea level and energy budgets can be consistently closed within uncertainties for the period 1971-2018 (high confidence).

While there is no doubt that products engineered by the same “pals”, for sea levels and energy budgets, may fit the same narrative, the result at the long-term-trend tide gauges is confirmed by other experimental results. Direct observations of mean sea levels at the LTT tide gauges, spanning more than 100 years in the different ocean basins and seas of the world, suggest negligible acceleration, and rising and falling seas with a much weaker average rate of rise.

While there are no reliable measurements of the mass of ice on land, the more direct measurements of lower-troposphere temperatures and ocean temperatures at 0-1900 m suggest that the average rate of rise is much less than what is claimed by the IPCC witch-doctors, based only on a circular logic of carefully engineered computational products supporting other carefully engineered computational products, never taking into account what is going on in the real world.

The lack of any significant sea level acceleration and the small average relative rate of rise have been evidenced in many works, such as Beenstock, Reingewertz and Paldor (2012); Beenstock, Felsenstein, Frank, and Reingewertz, (2015); Boretti, (2012a,b); Boretti and Watson (2012); Dean and Houston (2013); Douglas (1992); Douglas and Peltier (2002); Holgate (2007); Houston and Dean (2011); Japan Meteorological Agency (2018); Jevrejeva, Grinsted, Moore and Holgate (2006); Jevrejeva, Moore, Grinsted, and Woodworth (2008); Mörner, (2004); Mörner (2007); Mörner (2010a,b,c); Mörner, (2011a,b); Mörner (2013); Mörner (2016); Okunaka and Hirahara (2016); Parker (2013a,b,c,d,e); Parker, (2014a,b); Parker and Ollier (2015); Parker (2016a,b,c,d,e); Parker and Ollier (2017a,b); Parker (2018a,b,c); Parker and Ollier (2018); Parker, Mörner, and Matlack-Klein (2018); Parker (2019); Scafetta (2014); Schmith, Johansen, and Thejll (2012); Wenzel and Schröter (2010); and finally Wunsch, Ponte and Heimbach (2007), just to name a few. These works should not be ignored.


How humans create as well as destroy species

The effect of human activity on the natural world is profound, and if we want to gain a complete understanding of how it is altering the biosphere, then examining speciation is important.

We know that speciation does exist, and so does human-induced speciation. If we want to use biodiversity as a measure of our impact on the biosphere, then surely speciation needs to be considered.

Speciation can occur rapidly, and is not necessarily slower than extinction, so it is certainly relevant.

It is often said that we are living through one of our planet’s great mass extinction events, and that the cause is humanity. This loss of biodiversity is tragic not only for how it can and will affect our physical well-being, but also for how it seems to make the world a poorer place to live in aesthetically and emotionally.

But while human activity can lead to the decline and extinction of species, it can also lead to the emergence of new species. From domestication to the creation of new ecosystems, human activity has proven an effective driver of speciation. But there is little data to quantify this phenomenon, and it is largely overlooked when discussing humanity’s impact on the natural environment.

What separates similar populations into distinct species is, of course, not always clear, but the road to speciation can be understood well enough. When a species becomes divided into different populations that cannot interbreed, and when new selection pressures are apparent, separate populations can begin to develop new traits and make steps towards speciation. Human activity has done much to create barriers to breeding, and to create new selection pressures.

Creating new species

Many of the ways in which humans can drive speciation are the same ways that humans drive extinction. The introduction of species to new habitats is one example. Invasive species can out-compete natives and drive them to extinction. But the new environment in which animals and plants find themselves, and their isolation from other populations, can encourage morphological changes to develop, as well. Data from an Australian study found that 70 percent of introduced plants had developed a new morphological trait over 150 years. On top of that, invasive species introduce new pressures on native species, which can also encourage them to change.

Domestication is perhaps the most obvious way in which humans have promoted genetic diversity. Wolves have been bred into over 400 varieties of domestic dog, and the range of crops bred by humans includes many that can be regarded as totally separate species.

Anthropogenic climate change is altering environments across the globe and creating new selection pressures. There is even evidence to show it has increased biodiversity on mountaintops. Rates of genetic change in populations hunted by humans have been shown to be greater than for populations that are not hunted.

In the future, the possible recreation, or de-extinction, of animals such as the woolly mammoth, and even the movement of organisms to extra-terrestrial bodies such as Mars, could create further opportunities for speciation. There seems no end to humanity’s power as a force for evolution.

So what, if anything, does this mean for conservation?

The effect of human activity on the natural world is profound, and if we want a complete understanding of how it is altering the biosphere, then examining speciation is important. We know that speciation occurs, and that human-induced speciation occurs alongside human-induced extinction. If we want to use biodiversity as a measure of our impact on the biosphere, then surely speciation needs to be considered. Speciation can occur rapidly, and is not necessarily slower than extinction, so it is certainly relevant.

Considering speciation leads us to a number of questions. Should we consider only species loss, or net species loss, when thinking about biodiversity? Can human-induced speciation compensate for human-induced extinctions? If we are creating as many new species as we are destroying, then should we be content? The answer most people would give to this last question is almost certainly ‘no.’

The one property of a species that is not quantifiable in a simple number is the meaning it has for people. When we look at an animal, it is not just its physical properties that are important, but the impression it makes upon us. The very idea of biodiversity has emotional meaning to people, such that any loss of species, even if countered by the introduction of new species, is usually seen as tragic.

What this says about the value we place on a species, and the reasons we value biodiversity, is perhaps something that ought to be discussed.


Climate And The Fate Of America’s Corn Belt

COLD is the most likely problem

It is a remarkable thing that the U.K. and Irish parliaments were able to hypnotize themselves and pass climate emergency legislation when the southern half of the planet has not warmed at all in 120 years.

For example, this record of Cape Leeuwin (courtesy of Erl Happ), on the southwest corner of the Australian landmass, shows recent January mean maximum temperature back below the 120-year average:

Figure 1: Cape Leeuwin January Mean Maximum Temperature 1897–2019.

The U.K. and Irish parliaments were able to work themselves up into a lather over climate even though parts of the northern hemisphere set new cold records this last winter.

A spike in food prices due to cold weather might get them to see the world as it really is. What is happening in the Corn Belt this season may be enough to burn through the global warming groupthink.

It has been a very wet and cold start to the 2019 growing season in the Corn Belt, with the consequence that a lot of farmers have not been able to get into their fields to plant.

In a normal year, most of the crop would be planted by now. It will now be delayed by a month if it does get planted.

Projections of likely corn production from here rely upon near-perfect conditions for the rest of the season.

But as a return to 19th-century level solar activity will mean a return to 19th-century growing conditions, then the other end of the growing season will be shortened as well.

Seed-producers have tuned their product to the longer and warmer growing conditions of the second half of the 20th century, with corn that requires 2,500 growing degree days (GDD) to reach maturity.
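For readers unfamiliar with the measure: growing degree days accumulate daily heat above a base temperature, so a 2,500-GDD hybrid needs a long run of warm days to mature. A minimal sketch of the common U.S. corn convention (base 50°F, cap 86°F; that convention is my assumption, not something stated in the article):

```python
def corn_gdd(t_max_f, t_min_f, base=50.0, cap=86.0):
    """Daily growing degree days for corn, U.S. convention:
    clamp the daily high and low to [base, cap], average, subtract base."""
    hi = min(max(t_max_f, base), cap)
    lo = min(max(t_min_f, base), cap)
    return (hi + lo) / 2.0 - base

# A day with a high of 84F and a low of 60F contributes 22 GDD.
per_day = corn_gdd(84, 60)
# At that rate, a 2,500-GDD hybrid needs about 114 such days --
# which is why a shortened season pushes farmers to earlier-maturing seed.
days_needed = 2500 / per_day
```

A cooler season cuts the daily accumulation, not just the season length, so the shortfall compounds from both ends.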

If the season looks as if it is going to be short, then farmers might switch to early maturity corn. Another alternative is to switch to soybeans.

Growing conditions last decade were warmer, longer, and safer than a century before.

The chance of a crop being killed off by an early frost before maturity is not insignificant now.

Corn as a source of food for humans in the U.S. has a buffer in the 30% of the crop that goes to the ethanol mandate.

The focus on climate may also go from being a way to thrash the economy with carbon taxes to its impact on food prices. The biblical “years of lean” may be upon us.

More HERE  (See the original for links, graphics etc.)

New Paper: Arctic Sea ‘Ice-Free’ During Early Holocene

By historical standards the present climate is unusually COLD

Biomarker evidence for Arctic-region sea ice coverage in the northern Barents Sea indicates the most extensive sea ice conditions of the last 9,500 years occurred during the 20th century (0 cal yr BP).

In contrast, this region was ice-free with open water conditions during the Early Holocene (9,500-5,800 years ago).

The early Holocene (ca. 9500 – 5800 cal yr BP) … Relatively low IP25 concentrations [a proxy for sea ice presence] with increased brassicasterol abundances indicate reduced seasonal (spring) sea ice cover and longer (warmer) summers with open water conditions suitable for phytoplankton production.

The occurrence of reduced sea ice cover and longer summers is consistent with increased planktic foraminiferal concentrations (reported here and Carstens et al., 1997) and with longer ice-free seasons and a retreated ice margin in the northern Barents Sea (Duplessy et al., 2001) as well as increased phytoplankton production in the northern Fram Strait (Müller et al., 2009).

Reduced spring sea ice cover also indicates the HTM recorded at the sea surface between ca. 9300 and 6500 cal yr BP, which probably results from maximum summer insolation at 78° N.

Our proposed sea ice scenario suggests that water masses south of the study area were ice free, which agrees with open water conditions observed in the western Barents Sea (Berben et al., 2014) and the West Svalbard margin (Müller et al., 2012) during the early Holocene.

For the West Svalbard margin, Werner et al. (2013) associated high planktic foraminiferal fluxes ca. 8000 cal yr BP to ice-free or seasonally fluctuating sea ice margin conditions.

The PBIP25 index shows the lowest values of the record (0.16 – 0.40) suggesting a period characterized by low or variable seasonal sea ice cover and influenced substantially by open water conditions (Müller et al., 2011).

The late Holocene (ca. 2200 – 0 cal yr BP) is characterized by the highest abundances of IP25 (0.35 µg/g OC) and relatively low (but stable) brassicasterol (12.5 µg/g OC) (Figure 7A-B).

Consistent with the opposing trends in the IP25 and brassicasterol records, the PBIP25 values reach their highest value (0.87) of the record at ca. 0 cal yr BP. An increase in PBIP25 suggests a further extension in sea ice cover, reflecting Arctic Front conditions (Müller et al., 2011), most similar to modern conditions.

The Early Holocene was about 6-7°C warmer than today in this region (NW Barents Sea).

Another recent reconstruction for this region also indicated the Early Holocene was sea ice free and that modern sea ice conditions are among the most extensive of the last 9,500 years.

More HERE  (See the original for links, graphics etc.)

New Australian Leftist leader still evasive on planned coalmine

Says it is not for him to decide

Anthony Albanese has continued to question the economics around the Adani mine, but says a climate change convoy which enraged Queensland communities was “very unproductive.”

The incoming Opposition Leader today fielded multiple questions about his repeated refusal to back the Adani mine, despite the issue costing Labor votes in north and central Queensland.

Mr Albanese, who is making his first trip to the Sunshine State today, said this morning the markets would ultimately decide the economic case for Adani and pointed to its history of missing deadlines.

“It’s not up to government to determine that, it’s up to markets themselves,” he told ABC radio.

“One of the things that has occurred over a period of time is that the company has not met a range of timelines that they’ve put forward.

“But we will see what decisions the company make once the approvals are made or not made.”

Climate change and the Adani mine have been labelled key reasons behind Bill Shorten’s disastrous performance in Queensland at the federal election, where Labor managed a primary vote of just over 27 per cent.

One of the key issues was a “climate change convoy” of activists led by former Greens leader Bob Brown which travelled through north and central Queensland protesting Adani.

Several Labor MPs have pointed to the convoy as a factor working against them in the campaign and Mr Albanese poured scorn on the activists this morning. “The truth is that was incredibly provocative and did nothing to advance, in my view, a genuine debate about climate change,” he said. “To reduce it to a debate about a single mine is very unproductive, it does nothing to advance the debate.

“Good policy is about jobs as well as clean energy, as well as making sure we take the community with us … people could do with less yelling and more genuine debate.”

Mr Albanese will be confirmed as Labor leader by his parliamentary colleagues on Thursday, as will his presumptive deputy, Richard Marles.

Energy Minister Angus Taylor said this morning that Mr Albanese had to be clearer if he supported the coal export industry. “Is he going to support them? He seems to be pretty unclear on that,” Mr Taylor told Sky News.

“I’m pleased that he is not saying he’s going to get in the way (of Adani) ... we want to see these industries succeed.”

Mine craft doesn’t add up

Yesterday Mr Albanese questioned the “economics” of opening up the Galilee Basin to coalmining and refused to publicly support Adani’s $2 billion Carmichael mine, ahead of his visit to Queensland today to win back blue-collar workers.

The inner-Sydney left-wing powerbroker, who previously called into question the future of thermal coal and the feasibility of the Adani project, is facing internal pressure to further distance Labor from the coal industry.

Asked yesterday whether he supported the Adani coalmine, Mr Albanese, who will today visit the northern Brisbane electorate of Longman which Labor lost to the Coalition, said he would “respect the process” but did not endorse jobs for central Queensland.

“There is the other issue with regard to Adani, and indeed to the whole issue of the Galilee coal basin, the issue of the economics of it, the basic cost-benefit ratios,” Mr Albanese said, after being confirmed as the ALP’s 21st leader.

“One of the things, for example, that was put forward, was that it should receive a subsidised railway line. No, I didn’t support subsidising a railway line for a private-sector operation.”

Labor MPs and candidates in the central and north Queensland seats of Flynn, Capricornia, Dawson and Herbert signed petitions before the election calling for the development of the Galilee, a 247,000sq km thermal coal basin in central Queensland with an estimated 27 billion tonnes of untapped coal.

Six coalmines in the Galilee Basin have been approved by the state government, which could generate 16,000 jobs and nearly double Australia’s thermal coal production. Mr Albanese faces the task of reversing massive swings in Queensland against Labor at the May 18 election and the loss of two seats, including the Townsville seat of Herbert, which relies on mining to generate jobs and business.

The party’s election failure prompted Queensland’s Labor premier Annastacia Palaszczuk to immediately intervene to end the delays to the approval process of the Adani mine project.

Mr Marles also refused yesterday to throw his support behind the Adani mine but backtracked on comments he made before the election suggesting it would be a “good thing” if global demand for Australian coal collapsed.

“The comments I made earlier this year were tone-deaf and I regret­ them and I was apologising for them within a couple of days of making them,” Mr Marles said. “It failed to acknowledge the significance of every person’s job.”

Resources Minister Matt Canavan lashed Mr Albanese and Mr Marles for refusing to say they supported the Adani coalmine.

“The Labor Party have heard nothing and learned nothing from the election result,” Senator Canavan said. “People voted last week to protect their jobs, protect their futures, but the Labor Party are showing again that they are no longer the party of workers.”

Queensland Resources Council chief executive Ian Macfarlane, a former Coalition resources minister, said Mr Albanese should throw his support behind jobs in central Queensland.

“It doesn’t really matter what Anthony Albanese thinks about viability — that is a decision for the company and its shareholders,” Mr Macfarlane said. “The project will proceed or not on the basis of its commercial viability and that will be assessed by the company and its shareholders.”

Senator Canavan said he used Mr Marles’s comments — when he said the collapse of coal exports would be a “good thing” — against Labor during the campaign.

The coal and Adani issues helped the Liberal National Party win Herbert and retain Dawson, Capricornia and Flynn, with swings to the government.

The result, which included a statewide primary vote of just 27 per cent, stunned senior Labor figures and prompted the Palaszczuk state government to demand a fast-tracking of its Adani approvals process, with a decision on the future of the mine to be made within weeks.



For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here


Tuesday, May 28, 2019

How climate change can fuel wars

The article excerpted below is very shifty. It counts crop losses from COOLING as being due to global warming, for instance. Its main focus, though, is on disruptions caused by drought. Yet drought is NOT going to be made worse by warming: warming warms the seas, and warmer seas give off more water vapor, which comes down as rain.

Drought is more likely an effect of cooling, and they admit that a cooler climate some decades back originated the Sahelian drought. The Sahel has in fact been recovering in recent decades, as it should, given the greatly increased CO2 in the air. High levels of CO2 allow plants to get by with less water. Look up "stomata" if you don't believe it.

In summary, the actual facts about climate that they produce lead us to the conclusion that the travails in the region are NOT caused by global warming.

Fifty years ago the Dar es Salaam camp would have been under several metres of water. In the 1960s Lake Chad was the sixth-largest freshwater lake in the world, an oasis and commercial hub in the arid Sahel. Water and fertile lands were shared by farmers, herders and fisherfolk alike.

The vast lake has shrunk from 25,000 square km to half that area today. In the camp, which the UNHCR (the UN’s refugee agency) helps run, over 12,000 men, women and children huddle in any shade they can find from heat that often reaches 45°C. The camp has no guard towers or walls. Boko Haram fighters are only a few miles away. A tangle of torn tarpaulins and human debris is scattered across the desert. For miles around, baked white sand is dotted with sparse, scraggy trees bristling with inch-long thorns. The sole signs of life are camels pecking at the dry vegetation.

Mr Ibrahim remembers when the lake stretched over the horizon. “Before the lake began to shrink everything was going normally,” he says. “And now, nothing. We cannot get food to eat.” As the lake receded, people moved towards it, plagued by swarms of tsetse flies. Herdspeople, farmers and fisherfolk competed for access to the shrunken supply of water. Mr Ibrahim had to walk farther and farther to get to the fishing grounds.

Green campaigners and eager headline-writers sometimes oversimplify the link between global warming and war. It is never the sole cause. But several studies suggest that, by increasing the frequency and intensity of extreme weather events, including floods and droughts, it makes conflict likelier than it would otherwise be. In a meta-analysis carried out in the early 2010s, Solomon Hsiang, then at Princeton University, and Marshall Burke, then at the University of California, Berkeley, found “strong support” for a causal link between climate change and conflict (encompassing everything from interpersonal to large-scale violence). They even tried to quantify the relationship, claiming that each rise in temperature or extreme rainfall by one standard deviation increased the frequency of interpersonal violence by 4% and intergroup conflict by 14%.

History offers several examples of climate change appearing to foment mayhem. An examination of Chinese records spanning a millennium found that the vast majority of violent eras were preceded by bouts of COOLER weather. The team behind the study argues that lower temperatures reduced agricultural production, provoking fights over land and food.

Consider Syria. Between 2012 and 2015 three academic papers argued that climate change had been a catalyst or even a primary driver of the civil war. Headlines blamed it for the waves of refugees reaching Europe. The argument was that human emissions had caused or exacerbated a severe drought in Syria in the late 2000s that triggered mass migration from farmland into cities, contributing to tensions which ultimately led to war.

The headlines were too simplistic, as headlines often are. Climate modelling led by Colin Kelley, then at the University of California in Santa Barbara, estimated that greenhouse-gas emissions made the drought twice as likely. That is significant, but need not mean that in the absence of climate change, there would have been no drought and no war. Syrians had many reasons to revolt against their ruler, Bashar al-Assad, a despot from a religious minority who enforced his rule with mass torture.

The conflict around Lake Chad is also a tangled tale. Its roots can be traced back to a deadly drought in the 1970s and 1980s. Many have blamed that drought on industrial emissions of greenhouse gases. But climate models suggest they did not in fact play a big role in the drought. The recurrent failure of monsoon rains was caused by COOLER temperatures in the north Atlantic, which pushed the rains too far south. The cooling was itself caused by a mixture of natural and human factors, notably air pollution above the ocean—a striking reminder that greenhouse-gas emissions are not the only way in which human activity may alter the climate.

A report published this month by Adelphi, a Berlin-based think-tank, shows that *Lake Chad is no longer shrinking*. Its authors examined 20 years of satellite data and found that the southern pool was stable for the duration. The northern pool is still shrinking slightly, but total water storage in the area is increasing, as 80% of the water is held in a subterranean aquifer, which is being replenished, as is moisture in the soil, as the rains have returned.

Despite all these caveats, climate change clearly can play a part in fostering conflict. The Sahel is warming 1.5 times faster than the global average, owing to greenhouse-gas emissions. In future, most models suggest, it will experience more extreme and less predictable rains over shorter seasons. In a region where most people still grow or rear their own food, that could make millions desperate and restless.

Climate models predict that, as global average temperatures rise, dry regions will get drier and wet regions will get wetter, with more extremes and greater variability. Poverty makes it harder for farmers to adapt. Trying something new is always risky—and potentially catastrophic for those with no savings to fall back on. In conflict zones, farmers who once had the means to plant several different crops may only be able to plant one. They end up with all their seeds in one basket. On the shores of Lake Chad, violent clashes between government forces and armed opposition groups have created zones that are off-limits to civilians, says Chitra Nagarajan, a researcher for the Adelphi report, who spent two years conducting surveys in all four littoral countries.

Conflict itself makes the poor even poorer, and more vulnerable to the vagaries of a changing climate. Fearing murder, pastoralists cannot take their herds to places with water and vegetation. The UNHCR’s Mr Condé says that fishermen can no longer go into the deep lake to fish. Government troops block them, and Boko Haram is still on the prowl. Fighters steal farmers’ crops. All the farmers can harvest is wood, which they sell as fuel. In a bitter twist, doing so accelerates desertification, further degrading the land.


Solar power is NOT the solution for making fertilizer

The ‘next big thing’ in the environmental movement appears to be ammonia, or rather a more efficient way to make it. Australian computer expert Geoff Russell crunches the numbers, and the results are prohibitive

The world produced 200 million tonnes of ammonia last year; more than three times the world's output of cattle meat.

This might surprise you if you think of ammonia as some kind of old-fashioned cleaning product your grandmother used to use. But ammonia is at the heart of most fertilisers; so it’s at the start of any modern food chain, including the cattle meat food chain.

Many of Australia’s 77 million hectares of managed pastures will be fertilised with an ammonia-based fertiliser; particularly if used by dairy cattle.

How much land do you have to cover with solar panels to supply electricity to ammonia production lines to make 200 million tonnes of ammonia? That’s the multi-billion dollar question we’ll answer shortly.

Ammonia’s chemical formula is NH3. The N is for nitrogen. How do you measure the amount of protein in any food? Measure the nitrogen and multiply by 6.25 because nitrogen is about 16 percent of any protein.
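That 6.25 multiplier is just the reciprocal of the 16 percent nitrogen content. A trivial sketch (function name is mine):

```python
# Standard nitrogen-to-protein conversion: protein is ~16% nitrogen,
# so measured nitrogen is scaled by 1 / 0.16 = 6.25.
def protein_from_nitrogen(nitrogen_g):
    return nitrogen_g * 6.25

# 2 g of measured nitrogen implies 12.5 g of protein.
protein = protein_from_nitrogen(2.0)
```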

During the manufacture of ammonia, the nitrogen is plucked from the air and bonded with hydrogen; this is then mixed with other stuff to become fertiliser, which plants use to make protein, among other things.

Without that 200 million tonnes of fertiliser, our global protein supply would be seriously limited because very few plants can pull nitrogen from the air. Legumes can do it, using special bacteria bound to their roots. But other plants have to rely on getting it from the soil. And once it’s gone, you have to put it back. Hence the need for fertiliser. Alternatively, you can plant some legumes and wait for them to work their magic.

The processes that produce ammonia using electricity and water typically take about 11 megawatt-hours of electricity to produce a tonne of ammonia. Some people reckon this is a terrific application for solar power. I’ll get to the CSIRO breakthrough shortly.

Nyngan is one of Australia’s largest solar plants; covering about 250 hectares and producing about 233 gigawatt-hours of energy annually; a gigawatt is a billion watts.

To make 200 million tonnes of ammonia annually using a bunch of Nyngan-like solar plants, you’d need to build 9,442 Nyngan solar farms covering 2.36 million hectares.

One down, 9,441 to go.

We can of course divide the job up and have 100 countries each building 94 Nyngans. That’s much more manageable and only gives us 93 more Nyngans to build; or another 23,500 hectares to cover in solar panels. Assuming everybody else pulls their weight and covers the rest of the 2.36 million hectares.

But there’s a catch. Did anybody else notice the IPBES report on Biodiversity and ecosystem services?

What was the number one cause of our loss of biodiversity and ecosystem services? Habitat loss and degradation. Think about 2.36 million hectares. Our intensive land use in Australia – our cities – cover about 1.4 million hectares. So 2.36 million is rather a lot… just to make fertiliser.

The CSIRO breakthrough in synthesising ammonia, assuming it can be scaled up, will require just 1.8 million hectares instead of 2.36. They are aiming at 8.5 megawatt-hours per tonne of ammonia rather than 11.
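The arithmetic behind those figures is easy to check, using only the numbers quoted above (the variable names are mine):

```python
# Reproducing the article's land-area arithmetic.
AMMONIA_T = 200e6      # tonnes of ammonia per year, global
MWH_PER_T = 11.0       # electricity per tonne, conventional electrolytic route
NYNGAN_GWH = 233.0     # Nyngan's annual output, GWh
NYNGAN_HA = 250.0      # Nyngan's footprint, hectares

demand_gwh = AMMONIA_T * MWH_PER_T / 1000.0   # 2.2 million GWh per year
farms = demand_gwh / NYNGAN_GWH               # about 9,442 Nyngans
hectares = farms * NYNGAN_HA                  # about 2.36 million hectares

# The CSIRO target of 8.5 MWh/t scales the land requirement by 8.5/11:
hectares_csiro = hectares * 8.5 / 11.0        # about 1.8 million hectares
```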

Replacing oil

But of course, the boffins working on ammonia aren’t just interested in ammonia and fertiliser, they have the entire global energy supply in their sights.

The plan is to make ammonia using renewable energy and ship it globally. That’s what the renewable energy superpower chant is all about. You can then use the ammonia to power vehicles, either directly or by cracking the NH3 to extract the hydrogen for use in fuel cells.

Ammonia has well under half the energy density of oil, so you need to produce two tonnes of ammonia to replace one tonne of oil. The amount of land required to produce oil is tiny, because oil comes in three-dimensional deposits: it flows out of holes in the ground, and you can think of the power per square metre as the energy flowing from the wells in a given area.

Energy expert Vaclav Smil did this kind of calculation and found that power from oil typically achieves rates between 125 and 40,000 watts per square metre (w/m2) of the size of the field. If we calculate the power per square metre (averaged over 24 hours) of Nyngan we get a figure of just over 10 (w/m2), and this drops to about 6 (w/m2) when we use the solar power to make ammonia using the new super-efficient CSIRO method.
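Smil's comparison can be reproduced from the same Nyngan figures. The lower heating value of ammonia, roughly 5.2 MWh per tonne, is my assumption for the conversion step; the article does not state the figure it used:

```python
NYNGAN_GWH = 233.0            # annual output, GWh
NYNGAN_M2 = 250.0 * 10_000    # 250 hectares in square metres
HOURS_PER_YEAR = 8760

# Average electrical power output per square metre of the solar farm.
avg_watts = NYNGAN_GWH * 1e9 / HOURS_PER_YEAR
solar_w_m2 = avg_watts / NYNGAN_M2            # just over 10 W/m2

# Converting that electricity to ammonia at the CSIRO rate of 8.5 MWh/t,
# with ammonia's heating value taken as ~5.2 MWh/t (assumed), keeps
# only ~61% of the energy, dropping the effective density to about 6 W/m2.
ammonia_w_m2 = solar_w_m2 * 5.2 / 8.5
```

Against Smil's 125 to 40,000 W/m2 for oil fields, the gap is one to four orders of magnitude, which is the article's point about land.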

Clearly wildlife habitat will take a hammering if this kind of technology really is scaled up to take on oil.


The single-use plastics ban is a load of rubbish

Faced with an unprecedented crisis of legitimacy, our politicians are clutching at straws.

This week, the UK government announced that from April next year, the sale of plastic straws, drink-stirrers and cotton buds will be banned.

The plastic ban follows several years of high-profile, emotive and misleading campaigns on the problem of plastic waste ending up in the oceans. Despite the obvious fact that these little plastic things are quite useful, for environmentalists it seems that there is no problem, real or imagined, that cannot be solved by banning something.

It is extraordinary that as the UK faces perhaps its deepest political and democratic crisis for centuries, politicians are preoccupied with something as petty as the use and disposal of plastic. The ban is the last gasp of a useless, desperate administration.

The policy is all the more striking considering how little plastic waste from Britain actually finds its way into the ocean directly. But thanks to green policies promoting recycling, millions of tonnes of waste are sent for ‘recycling’ overseas. The countries receiving our waste often become overwhelmed and local waste goes unprocessed. As there are fewer environmental regulations in these countries, excess waste can just be burnt or dumped. Three years ago, two-thirds of the UK’s plastic waste was sent to China. But China has since banned imports of foreign waste, while other countries, including Indonesia, Vietnam and Taiwan, have introduced heavy restrictions. This has left exporters of waste, like the UK, with a problem on their hands.

Faced with vast mountains of rubbish of their own making, politicians prefer to pin the blame on the public and their apparently excessive plastic use. But there is no need to resort to a ban on plastic when there are perfectly safe and clean ways to dispose of it.

The simplest method is incineration. The heat can even be used to generate electricity. But green types are the first to whinge the moment an incineration plant is proposed, despite the fact that modern incinerators produce almost no toxic emissions at all. In contrast, recycling – the greens’ preferred method of waste management – is nowhere near as clean or safe. Uncontrolled, accidental fires break out in recycling collection centres around 300 times per year in the UK, spewing thousands of tonnes of thick black smoke into Gaia’s precious skies.

Water-treatment plants can also be upgraded to allow them to better capture plastic and other items. London’s new 25km-long, £4.2 billion Thames Tideway mega-sewer scheme will prevent the discharge of millions of tons of unprocessed sewage into Thames tributaries each year.

A similar investment could be used to create an effective infrastructure for the processing and incineration of plastic waste. But greens object to burning waste because it is incompatible with the so-called ‘circular economy’ – a utopian ambition of environmentalists in which all resources are endlessly recycled and all waste is eliminated. This means that no green billionaire-backed NGOs, no quangos, no UN or EU committees and no weepy BBC documentaries narrated by David Attenborough are going to make the case for incineration, despite its obvious benefits.

More importantly, for the establishment actors engaged in the war on plastic, finding a technical solution to a problem as simple as waste disposal would rob them of their last vestiges of legitimacy. They want to be seen as planet-saving superheroes, as politicians with visions and purpose. Without the crusade against plastic, our politicians would be exposed as pointless, petty-minded bureaucrats.


The truth about Chernobyl

The shocking truth about the Chernobyl disaster is how FEW people it killed -- despite Greenie panic

In 1995, nine years after the Chernobyl nuclear disaster in Ukraine, I spent six months working at the heart of it all. At that time, I was the only Westerner permanently based at the site. The scale of the fallout, which displaced hundreds of thousands of people and affected millions living in designated contamination zones, was massive.

In 1986, following the disaster, the rescue effort was courageous and inspirational. But there was also an inexcusable, criminal cover-up by the Soviet authorities, led by the then leader of the Soviet Union, Mikhail Gorbachev. The West was also slow to expose the magnitude of the disaster. Undoubtedly, there was an initial hesitation to criticise Gorbachev, as Western leaders were courting him at the time. Following the nuclear disaster of Three Mile Island, Pennsylvania in 1979, Chernobyl was also a hammer blow to the global atomic industry – not least because industry figures had repeatedly and misleadingly claimed that a full nuclear core meltdown could never happen. This downplaying of the disaster led many to distrust the official accounts.

Kate Brown, a professor at MIT, specialising in environmental and nuclear history as well as the Soviet Union, is the latest to cast doubt on the official version of events. Her new book, "Manual for Survival: A Chernobyl Guide to the Future", sets out to expose a 33-year cover-up, in which the United Nations was in cahoots with the KGB and Western intelligence agents. This cover-up, she argues, was designed to downplay the horrendous health consequences of the disaster, both to protect the reputation of the Soviet Union and to prevent lawsuits arising in the West against the nuclear industry.

The UN estimates of fatalities were first published in 2005 in the landmark "Chernobyl Forum Report". Its research and publication was led by the International Atomic Energy Agency (IAEA) on behalf of – and endorsed by – eight UN agencies and the governments of Russia, Belarus and Ukraine. The report assessed all the epidemiological evidence in a collaborative effort involving hundreds of experts. It found that there had been fewer than 50 deaths directly attributable to radiation from the disaster – there have been a further four deaths directly caused by Chernobyl since the report came out. Almost all of those who died were highly exposed rescue workers, many of whom died within months of the accident. The UN also predicted that there could be up to 9,000 Chernobyl-related deaths from cancer.

In response to the UN’s 2005 report, Greenpeace claimed that the death toll could be as high as 200,000.

"Manual for Survival" sets out to explain this ‘Grand Canyon-sized gap between the UN and Greenpeace estimates of fatalities’. Brown argues that decades after Chernobyl exploded there is still a need for a large-scale, long-term epidemiological study of the consequences of low-level radiation on human health in the affected areas. That may be so. But this does not itself prove a cover-up.

Brown also wants us to believe that after Fukushima blew its top in 2011, scientists told the public that they had no certain knowledge of the damage that could be caused by exposure to low doses of radiation as a result of the accident. Yet she provides no evidence of them claiming ignorance on a subject that is well understood by experts. She also argues that because of our failure to learn the lessons of Chernobyl, we are stuck in an ‘eternal video loop’, with the same scene of nuclear disaster playing over and over again, from Three Mile Island and Chernobyl to Fukushima. She accuses Japan’s scientists in the wake of Fukushima of ‘reproducing the playbook of Soviet officials 25 years before them’ by also downplaying the consequences of the disaster.

But Brown seems to inhabit a parallel universe. If anything, there is unnecessary panic about the supposed impact of low-level radiation. Far from downplaying the consequences of Fukushima, the official response has been overly precautious. An exclusion zone was created around the site, which largely remains in place today, though the nearby town of Okuma has since been declared safe for residents to return. Professor Geraldine Thomas of Imperial College, London, one of Britain’s leading researchers on the effects of radiation on the human body, told the BBC in 2016 that the radiation levels in the exclusion zone pose little risk to human health: ‘There are plenty of places in the world where you would live with background radiation of at least this level’.

A key case study in Brown’s Chernobyl cover-up theory is the story of Keith Baverstock. Baverstock was one of many courageous scientists and doctors who battled to convince the World Health Organisation (WHO), his employer, that the Chernobyl disaster had resulted in an unexpected outbreak of thyroid cancers among children. They discovered and publicised the fact that the radioactive isotopes released by the accident (iodine-131 and caesium-137) were more carcinogenic than previously thought.

But Brown understates the fact that Baverstock and his colleagues won their battle. The final Chernobyl Forum report of 2005 incorporated their findings. What’s more, Baverstock repeatedly told the media that the biggest cause of health problems among people living in territories affected by low-level radiation was increased anxiety and stress levels, fanned by scaremongering.

Baverstock continued to criticise the Chernobyl Forum after he left WHO. He offers the most considered interrogation of the Chernobyl Forum findings that I have read. Along with his colleague, Dillwyn Williams, Baverstock criticised the conflicted politics behind the Forum and the impact this had on its research. He has pointed out the limits of the existing body of knowledge and called for more intensive research into the long-term effects of Chernobyl. He correctly said that we won’t know for sure the full outcome on human health for decades – the real death toll might still emerge because low-level radiation impacts are difficult to detect. But while Baverstock has advanced important criticisms of the UN’s figures, he has never endorsed Greenpeace’s speculative inflation of the evidence.

In contrast, Brown misrepresents much of what was uncovered by scientists in the wake of Chernobyl. She does so, presumably, in order to back up her claim that the Chernobyl exclusion zones will have to remain abandoned for much longer than anybody expected. (They are being repopulated gradually already.) Her claim is based on a misunderstanding of the science. Brown says it will take between ‘180 and 320 years’ for caesium-137 to disappear from Chernobyl’s forests. But the half-life of caesium-137 is 30 years.

Her misunderstanding is based on an article in Wired, which has since been updated to reflect the science of half-lives more accurately. In other words, Brown is basing her conclusions on secondary sources written by people who failed initially to understand what they had been briefed by scientists. Meanwhile, in the real world, according to Professor Jim Smith of the University of Portsmouth, while the Chernobyl exclusion zone is still contaminated, ‘if we put it on a map of radiation dose worldwide, only the small “hotspots” would stand out’.
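For context, half-life arithmetic is straightforward: the amount of an isotope halves with every elapsed half-life, so the fraction remaining after t years is 0.5^(t/30) for caesium-137. A minimal sketch of that calculation (the 30-year half-life is the figure cited above; the sample time points are purely illustrative):

```python
# Fraction of a radioactive isotope remaining after a given number of years.
HALF_LIFE_YEARS = 30.0  # approximate half-life of caesium-137

def fraction_remaining(years: float, half_life: float = HALF_LIFE_YEARS) -> float:
    """Each elapsed half-life halves the amount remaining."""
    return 0.5 ** (years / half_life)

if __name__ == "__main__":
    for years in (30, 90, 180, 320):
        print(f"after {years} years: {fraction_remaining(years):.3%} remains")
```

By this arithmetic, roughly 1.6% of the original caesium-137 remains after 180 years (six half-lives) and under 0.1% after 320 years.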

In the final chapter of "Manual for Survival", we finally learn what Brown considers to be a more credible estimate of the total number of existing and expected fatalities from the Chernobyl accident. And once again we see there isn’t any real evidence or credible sources to support her thesis: ‘Off the record, a scientist at the Kyiv All-Union Center for Radiation Medicine put the number of fatalities at 150,000 in Ukraine alone. An official at the Chernobyl plant gave the same number. That range of 35,000 to 150,000 Chernobyl fatalities – not 54 – is the minimum.’ In reality, if even half or one-third of those deaths actually occurred, this would be impossible to hide.

In the end, the shocking truth about Chernobyl is how few people were killed or made ill by the radiation.


Thanks, Bob Brown, You Helped the Australian Labor Party Lose The Unloseable Election

Greens leader Bob Brown

For an alleged Labor party to put Greenie causes ahead of worker welfare was epic folly.  Coal miners are workers too and they make a good dollar.  Labor now need to divorce themselves from their happy marriage to the Greens.

Still trying to figure out how Labor lost another unloseable election? The pollsters got it wrong, the bookies got it wrong, the punters got it wrong, and the ABC and most of the mainstream media got it wrong. And obviously Bill Shorten got it very wrong.

Bob Hawke got it right when he said, “Never underestimate the intelligence of The Australian voters”. He probably should have added, “Especially in Queensland”, where Labor lost two seats and the LNP shored up their margins even in Peter Dutton’s Dickson, where Labor and GetUp put in a huge effort.

We even saw the spectacle of another ex-Labor PM, Paul Keating, shakily urging voters to “drive a stake through his dark political heart”.

Why did they all get it so far off the mark? Well, Queenslanders don’t take kindly to a bunch of ratbags from the south telling them how to run their economy and create jobs. So Bob Brown’s Anti-Adani Convoy couldn’t have come at a better time for the LNP. Its banners shouting “Coal Kills” and “Block Adani” went down like a lead balloon in a State which reaps billions from coal exports.

This folly was compounded by Shorten’s fence-sitting and the Palaszczuk Government’s stalling over issues such as the numbers of a common bush bird, the black-throated finch. Annastacia must be worried she’ll be next.

The LNP increased its vote substantially in the previously very marginal seat of Flynn, which was high on the Labor wish list. Centred on the major coal port of Gladstone and held by Ken O’Dowd since 2010, it also takes in an extensive agriculture and beef area including the North Burnett region.

Rockhampton’s Michelle Landry increased her LNP winning margin in neighbouring Capricornia and in Dawson, centred on Mackay, the so-called Member for Manila, George Christensen, gained another big unexpected win. Further north in Townsville, Labor’s Cathy O’Toole was out-gunned by war veteran LNP candidate, Phillip Thompson. In all these centres, jobs and the economy were major factors.

Combine all that with Labor’s big taxing agenda, its hit at self-funded retirees, negative gearing, Capital Gains Tax, the blank cheque it sought for an un-costed, over-ambitious climate policy (including a controversial push for 50 percent electric vehicle sales by 2030), and the result in Queensland and most other States is not surprising.

Add to that the arrogant advice to retirees and investors from Labor’s Treasury spokesman and candidate for the top job, Chris Bowen: “If you don’t like it, don’t vote Labor”.

Good advice. So the voters said it’s not time to risk Shorten, we’ll stick with Scott Morrison and a stable economy.

Now it looks likely Morrison will gain an absolute majority and enjoy a major opportunity to grow his influence over the coming term.



For more postings from me, see  DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are   here or   here or   here.  Email me (John Ray) here.  

Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here