Tuesday, March 06, 2018

Why what we eat is crucial to the climate change question (?)

Ruth Khasaya Oniang'o is a bit confused below. Global warming will reduce food availability?  Canadian farmers would love a couple of degrees of warming.  It would open up countless hectares of mid-Northern Canada to grain cropping -- unleashing a FLOOD of food onto the world.  And the rest of Ruth's generalizations are equally tendentious.  Greenies can only regurgitate their talking points.  They are incapable of critical thought.

Did you know that what’s on your plate plays a larger role in contributing to climate change than the car you drive? When most wealthy people think about their carbon footprint, or their contributions to climate change, they’ll think about where their electricity and heat come from or what they drive. They’ll think about fossil fuels and miles per gallon, about LED lights and mass transit – but not so much about combine harvesters or processed meals or food waste. Few consider the impacts of the food they eat, despite the fact that globally, food systems account for roughly one quarter of all manmade greenhouse gas emissions. That’s more than the entire transportation sector, more than all industrial practices, and roughly the same as the production of electricity and heat.

Meanwhile, the most immediate threat of climate change for most of the global population will be at the dinner table, as our ability to grow critical staple crops is being affected by the warming we’ve already experienced. Between 1980 and 2008, for instance, wheat yields dropped 5.5% and maize yields fell 3.8% due to rising temperatures. Climate change threatens the food security of millions of poor people around the world. Young people are increasingly keen to protect the environment by shifting to animal-product-free diets. They seek plant proteins that taste like meat, while insects are also growing popular as an alternative.

What these inverse challenges – that food and agriculture are both enormous contributors to climate change, and massively impacted by it – really tell us is that our food systems, as currently structured, are facing major challenges.

There is a much larger problem that compels us to look beyond farm and agricultural practices. We need to open our eyes to solutions that address the full scope of the challenge to create more sustainable and equitable food systems. That way, we can provide healthy food for all people while we protect our planet’s resources at the same time.

So what are food systems? Everything from seed and soil to the supermarket to the plate to the landfill. Food systems include the growing, harvesting, processing, packaging, transporting, marketing, consumption, and disposal of food and food-related items.

While farming alone accounts for 10-12% of global greenhouse gas emissions, when we look at entire food systems the contributions to climate change more than double. A recent report published by the Meridian Institute lays out the many factors throughout food systems that spell trouble for the climate, and also explains why a broad systems-wide perspective is necessary for implementing effective changes.

Consider deforestation and soil. A narrow view of agriculture alone would neglect the fact that a full 80% of the forests that are clear-cut or destroyed are removed to create farmland. Forests are massive carbon sinks. So is soil, locking away two to three times as much carbon as is present in the atmosphere. But farmers can help restore ecosystem functions and build resilient communities by producing crops and livestock in productive ways that sequester carbon and protect forests.

Or consider food waste. Not just the scraps that you throw away, but throughout the entire food system. A staggering 30-40% of the food produced in the world is never eaten. Some never gets harvested, some spoils before it reaches consumers, and a lot is tossed away by retailers, restaurants, and at home. For the sake of comparing emissions, if food waste were its own country it would be the third largest greenhouse gas emitter in the world, after only China and the United States.

This says nothing of the gross injustice of wasting so much food while so many in the world go hungry. In the developing world, improving infrastructure along the food chain – including cold storage – would prevent much good food being lost. In the developed world, retailers can prevent large amounts of waste by finding outlets for slightly blemished goods and consumers can limit waste by buying food in amounts they actually want and need.

There are countless more examples of challenges and solutions all throughout the food system — from production of fertiliser to distribution systems to the production of dried and purified foods that make up processed meals to the diets and lifestyles of the public. Everyone has a role to play; these challenges cannot be solved in a vacuum.

The complex, dynamic, and widely diverse forms of the world’s many food systems yield some wildly divergent outcomes in terms of nutrition, health, and environmental and climate impacts. It is critical that we start to better examine what works in some systems and what must be improved in others, in order to produce more equitable, just, and sustainable outcomes around the world.

Just as there’s no universal crop that grows everywhere, there’s no “one size fits all” model food system to implement across the world. A broader systems-wide perspective is necessary if there is any hope for truly transformative change. It’s time to look beyond farming and agriculture and to see the whole picture, to create systems that cause less harm to the climate and are more resilient to the impacts we’re already suffering from global warming.

Food is a fundamental human need and to eat is a basic human right. Our food systems must deliver that need, fairly and equitably, without worsening the impacts of climate change.


Neonicotinoids: EFSA’s coded plea for help

What a sad, sorry state we see for science in the European Union!

We are now in a situation where a European scientific advisory authority (EFSA) is being forced to base its advice on a scientifically worthless document and has to resort to sending out coded messages pleading to the scientific community for help (or forgiveness).

In a recent press release on neonicotinoids, EFSA had to publish the advice they were told to produce: that there was not enough evidence to declare with certainty that neonicotinoids are not harmful to bees. This conclusion was baked into the European Commission’s original request for advice and EFSA chose its press release to acknowledge, in a coded manner, why they were not too pleased with the process thrust upon them.

Paragraph 3 of EFSA’s press release: letting the scientific community know their advice was useless.

Why would EFSA choose to mention (in the third paragraph of their press release) just one tool among many in their review methodology?

The draft Bee Guidance Document (BGD) has not been accepted into law by the Standing Committee, meaning that any decisions based on its use would not be legitimate.

Why, after five years, has the draft bee guidance document not been adopted?

The BGD set parameters that were impossible for field tests to comply with. It demanded an acceptable test mortality rate of 7%, far below the average 15% bee mortality rate under normal conditions. The BGD also set a minimum contiguous field test size – at least 168 km² – far larger than any area over which bees could feasibly be tracked… No field test data could comply, and without data, there was no way to be certain neonicotinoids were safe.

Furthermore, the BGD working group that had set these impossible parameters was infiltrated by anti-pesticide activist scientists with an agenda – with Poudelet in Sanco directing the process, the BGD was nothing but corrupted and unscientific. EFSA is fully aware of the hidden conflicts of interest but seems unable to use this as a basis to review the entire BGD process. Perhaps if Arnold had worked for Monsanto rather than a bee NGO, things would have been different.

As readers of this blog may recall, I have shown repeatedly how the bee guidance document was not scientific and misused by anti-pesticides activists. It was designed to reject all credible field test data, not only for neonicotinoids, but for any pesticides, including those approved for organic farming. And with this fabricated block to any available field data, a precautionary approach was presented as the only possible conclusion.

The head of EFSA, Bernhard Url, has tried to wash his hands of the bee guidance document, but it seems stickier than that mythical insecticide-laden honey. See the clip from the European Parliament where Url exonerates EFSA and admits that he has no choice but to use the flawed bee guidance document. He admits someone in the Commission forced him to. Note that Dr Url did not defend the validity of the bee guidance document. I think it would be very hard for anyone to mount a legitimate defence of that miserable piece of activist science.

So EFSA was stuck in a frustrating position, where they had to answer an inappropriate question. Agencies and authorities work to serve the European policy process so when they receive a question that is politically-driven, they are not in a position to send it back and request a better question. So after wasting much time, EFSA’s press release answered the inappropriate question using the useless guidance document as instructed, but at least they could send out their coded plea for help.

The irony is that EFSA was very clear that their decision on the safety of glyphosate was based on all available data (while IARC restricted themselves), but with neonicotinoids, they have let their hands be tied by a politically-motivated guidance document that arbitrarily rejected any data that would show the insecticide’s safety. Wouldn’t it be nice if Dr Url stood up for scientific integrity in all cases and demanded a review of the rejected DRAFT bee guidance document? That would have to come as a request from the Commission though.

Junck-Science and Regulatory Failure

I wrote a paper last year showing how Juncker has imposed his interests on the policy process for both glyphosate and neonicotinoid legislation, arguing that it reflected a casual disregard for evidence. I called it Junck-science. What that blog did not cover was which policy tools have been quietly abandoned by the present Commission.

The anticipated banning of neonicotinoids will set a new high-water mark in European regulatory failure. I cut my teeth on responsible policymaking during the Delors period, when certain regulatory process standards were put in place to ensure that legitimate policies were produced. Juncker has abandoned many of these accountability tools and has turned Brussels into a cesspool of influence, king-making and special interests. In the case of neonicotinoids, the following responsible regulatory tools were not used, and we have ended up with yet another policy disaster that has destroyed trust in Brussels and fostered economic and trade uncertainty.

We no longer rely on evidence-based policy, impact assessments or inter-service consultations. Instead, Brussels now has its own Bismarck, pulling strings from behind the sunken shadows of a weak, aging leader.


Health savings outweigh costs of limiting global warming: study

The usual rubbish.  A warmer climate would REDUCE illness overall.  Winter is the season of death.

The estimated cost of measures to limit Earth-warming greenhouse gas emissions can be more than offset by reductions in deaths and disease from air pollution, researchers said on Saturday.

It would cost $22.1 trillion (17.9 trillion euros) to $41.6 trillion between 2020 and 2050 for the world to hold average global warming under two degrees Celsius (3.6 degrees Fahrenheit), a team projected in The Lancet Planetary Health journal.

For the lower, aspirational limit of 1.5 C, the cost would be between $39.7 trillion and $56.1 trillion, they estimated.

But air pollution deaths could be reduced by 21-27 percent to about 100 million between 2020 and 2050 under the 2 C scenario, the team estimated, and by 28-32 percent to about 90 million at 1.5 C.

"Depending on the strategy used to mitigate climate change, estimates suggest that the health savings from reduced air pollution could be between 1.4-2.5 times greater than the costs of climate change mitigation, globally," they wrote.
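Taken at face value, the quoted ratio can be combined with the cost range above for a rough sense of scale. This is a sketch only: pairing the low cost with the low ratio (and high with high) is my assumption, not the study’s own accounting.

```python
# Rough arithmetic on the figures quoted above (2 C scenario, 2020-2050):
# mitigation costs of $22.1-41.6 trillion, health savings 1.4-2.5x the costs.
cost_low, cost_high = 22.1, 41.6    # trillions of US dollars
ratio_low, ratio_high = 1.4, 2.5    # health savings / mitigation cost

savings_low = cost_low * ratio_low      # lower bound, ~$30.9 trillion
savings_high = cost_high * ratio_high   # upper bound, ~$104 trillion
print(f"Implied health savings: ${savings_low:.1f}-${savings_high:.0f} trillion")
```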

Health costs from air pollution include medical treatment, patient care, and lost productivity.

The countries likely to see the biggest health savings were air pollution-ridden India and China, said the researchers, who used computer models to project future emissions, the costs of different scenarios for curbing them, and the tally in pollution-related deaths.

"The health savings are exclusively those related to curbing air pollution," study co-author Anil Markandya of the Basque Centre for Climate Change in Spain told AFP.

"Other health benefits are not included, which of course makes our figures underestimates of the total benefits."

The costs of limiting warming, Markandya explained, included higher taxes on fossil fuels like oil and coal, which in turn raise the costs of production.

The world's nations agreed on the 2 C limit in Paris in 2015, and undertook voluntary greenhouse gas emissions reduction targets.

These pledges, even if they are met, place the world on a 3 C trajectory, scientists say.

To date, the average global temperature is thought to have increased by 1 C since the Industrial Revolution.

"We hope that the large health co-benefits we have estimated... might help policymakers move towards adopting more ambitious climate policies and measures to reduce air pollution," said Markandya.

Air pollution from fossil fuel emissions, particularly fine particulate matter and ozone, has been linked to lung and heart disease, strokes, and cancer.


Time to Cool It: The U.N.’s Moribund High-End Global Warming Emissions Scenario

The amount of future warming is predicated on the amount of emitted greenhouse gases and the sensitivity of earth’s surface temperature to changes in their concentrations. Here we take a look at the emissions component.

The U.N. currently entertains four emissions scenarios, all expressed as the change in downwelling radiation (in watts per square meter, W/m², nominal year 2100) towards the surface that results from an increase in the atmospheric concentration of certain greenhouse gases. They are called “representative concentration pathways,” or RCPs.

As can be seen in Figure 1, there are four, given as 2.6, 4.5, 6(.0) and 8.5. The ranges of associated warming for over 1000 total scenarios are given on the right axis.

Figure 1.  Approximately 1000 scenario runs for four RCPs. From Fuss et al., 2014.

It’s not surprising that those making the case for climate action most frequently reference the highest (RCP8.5), embedding it in most climate scenarios, assessments, and international agreements (the Paris Agreement being a prime example). Here is a summary of Google Scholar citations for the different RCPs, published on February 9 by Eric Roston in Bloomberg:

Figure 2. Although increasingly untenable, RCP8.5 draws the most attention. 

RCP8.5 is obsolete. It was obsolete when it was first published in the journal Climatic Change by Riahi et al. in 2011. By then the shale gas revolution was underway, as can be seen from the plot below of shale gas production. By 2011, abundant shale gas had begun a wholesale displacement of coal for electrical generation, increasing natural gas’s portion of our energy portfolio and decreasing that of coal.

Figure 3. U.S. shale gas production, 2007-2016, according to the U.S. Energy Information Administration.

The Riahi et al. RCP 8.5 continues to be the favorite for analysts. It completely dominates the draft of the upcoming fourth “National Assessment” of climate change, created by our U.S. Global Change Research Program. Here is the fanciful “wedge chart” for various energy sources in RCP8.5:

Figure 4. Energy contributions (in joules × 10¹⁸, or exajoules, EJ) in RCP 8.5.

There are at least two notable errors in RCP8.5, which both serve to exaggerate its radiative forcing. The first is an incorrectly modest growth in natural gas use, and the second is the massive growth in coal combustion. According to the International Energy Agency (2017):

The global natural gas market is undergoing a major transformation driven by new supplies coming from the United States to meet growing demand in developing countries and industry surpasses the power sector as the largest source of gas demand growth…[emphasis added]

The evolution of the role of natural gas in the global energy mix has far-reaching consequences on energy trade, air quality and carbon emissions…

Global gas demand is expected to grow by 1.6% a year…China will account for 40% of this growth.

British Petroleum (BP) recently estimated the global fuel mix through 2040 in its 2018 Energy Outlook. Under their “Evolving Transition” assumption, natural gas usage passes coal worldwide around 2030, and oil use levels off at the same time. A comparison to RCP 8.5 (above) shows how wrong it is, even in the near future.

Figure 5. British Petroleum’s fuel outlook from its most recent (2018) Energy Outlook. Note the color scheme is somewhat different than in Figure 4, with natural gas now red, instead of blue.

The substitution of shale gas for coal continues to drive down the “carbon (dioxide) intensity” of developing and developed economies. This is the amount of carbon (dioxide) emitted per unit of GDP, usually normalized to 2010 dollars adjusted for their purchasing power in a given economy. In the United States, in the quarter-century beginning in 1990, the drop was remarkable: from 0.9 kg of carbon dioxide per dollar to 0.35 kg, a fall of over 60%.
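The “over 60%” figure is straightforward arithmetic on the two intensity values just quoted:

```python
# U.S. carbon-dioxide intensity of GDP, per the figures in the text
intensity_1990 = 0.90   # kg CO2 per 2010 PPP dollar
intensity_2015 = 0.35   # kg CO2 per dollar, a quarter-century later

drop = (intensity_1990 - intensity_2015) / intensity_1990
print(f"Decline in carbon intensity: {drop:.0%}")  # -> 61%, i.e. "over 60%"
```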

The imminent dethroning of King Coal is obvious in the BP data, which leads to another problem: Justin Ritchie and Hadi Dowlatabadi from University of British Columbia recently found there simply isn’t enough coal to support RCP8.5. Nor were they conservatively looking at so-called “recoverable” reserves; instead, they toted up all geologically identified coal around the planet.

They then adjusted RCP8.5 for the twin errors – the rising carbon (dioxide) intensity implied by a huge growth in coal use over natural gas (recall that the IEA indicates a large-scale industrial as well as electrical switchover), and the fact that there isn’t enough coal – and modified RCP8.5 accordingly; the revised chart appears in the original post.

By comparing the contributions of oil, coal, and natural gas (the greenhouse gas sources) between RCP8.5 and what is likely to happen, we can estimate the change in total downwelling radiation: it drops from 8.5 W/m² to roughly 5.1 W/m². (Recognizing there is a lot of fine print, this is certainly a ball-park number.)

It is the nature of climate models to scale global warming with percentage changes in emissions; i.e. a quadrupling of emissions has almost exactly the effect of doubling the prospective warming forecast from an initial doubling of the concentration. Reducing emissions by 40%, which is the difference between Riahi’s RCP8.5 and Ritchie’s modification, similarly reduces total warming.
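The quadrupling-doubles-the-warming rule of thumb follows from the roughly logarithmic dependence of CO2 forcing on concentration. A minimal check, using the standard simplified forcing approximation F = 5.35·ln(C/C0) (supplied by me here, not taken from the article):

```python
import math

def co2_forcing(concentration_ratio):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. approximation)."""
    return 5.35 * math.log(concentration_ratio)

forcing_2x = co2_forcing(2.0)  # doubling of CO2 concentration
forcing_4x = co2_forcing(4.0)  # quadrupling

# Quadrupling yields exactly twice the forcing of a doubling, and hence
# (to first order) twice the warming -- the scaling the text describes.
print(forcing_4x / forcing_2x)  # -> 2.0 (to machine precision)
```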

There’s the further problem of model overprediction of warming, which we recently documented in our public comments on the upcoming Fourth National Assessment of Climate Change. Generally speaking, we find the data-based sensitivity of temperature to be about 56% of the average of the 105 climate models in the UN’s most recent (2013) science summary.

Multiplying everything through, we take the mean 20th- and 21st-century RCP8.5 warming of 4.3°C, reduce it by 40% (the difference between Riahi’s RCP8.5 and Ritchie’s modification), then adjust for the 56% sensitivity, and we find a 21st-century warming a teense under 1.5°C – very, very close to the sensitivity just calculated by University of Alabama-Huntsville’s Roy Spencer.
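For anyone wanting to check that arithmetic, the multiplying-through amounts to three factors (inputs as given in the text above):

```python
# Back-of-envelope reproduction of the warming estimate in the text
rcp85_warming = 4.3        # deg C: mean 20th+21st century RCP8.5 warming
emissions_factor = 0.60    # Ritchie & Dowlatabadi's revision cuts forcing ~40%
sensitivity_factor = 0.56  # observed temperature sensitivity vs. model average

estimate = rcp85_warming * emissions_factor * sensitivity_factor
print(f"Adjusted 21st-century warming: {estimate:.2f} deg C")  # -> 1.44 deg C
```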

SOURCE (See the original for links and graphics)

Will Congress finally get tough on junk science?

House hearing investigates a UN cancer agency accused of misusing US taxpayer funds

Paul Driessen

A growing problem for modern industrialized Western societies is the legion of government agencies and unelected bureaucrats and allied nongovernmental organizations that seem impervious to transparency, accountability or reform. Their expansive power often controls public perceptions and public policies.

Prominent among them are those involved in climate change research and energy policy. In recent years, they have adjusted data to fit the dangerous manmade climate chaos narrative, while doling out billions of taxpayer dollars for research that supports this perspective, and basing dire predictions and policy demands primarily on climate models that assume carbon dioxide now drives climate and weather (and the sun, water vapor, ocean currents and other powerful natural forces have been relegated to minor roles).

Reform is essential. Meanwhile, another troubling example underscores the scope of the problem and the difficulties Congress and other government administrators face when they try to rein in rogue agencies.

In November 2017, the U.S. House of Representatives Committee on Science, Space and Technology sent the UN’s International Agency for Research on Cancer (IARC) a letter raising questions about scientific bias, secrecy and corruption at the agency. When IARC obfuscated the issues, the committee sent a second letter, seeking answers within a week.

Otherwise, the Committee said, it would consider “whether the values of scientific integrity and transparency are reflected in IARC monographs and if future expenditures of federal taxpayer dollars need to continue.” The United States is the IARC monograph program’s biggest contributor, having given it nearly $50 million to date.

Agency director Dr. Christopher Wild bided his time for four weeks before replying (many would say rather testily and condescendingly) and concluding: “IARC would be grateful if the House Science Committee would take all necessary measures to ensure that the immunity of the Organization, its officials and experts, as well as the inviolability of its archives and documents, are fully respected.” [emphasis added]

Refusing to be cowed, on February 6 the committee held a hearing, “In Defense of Scientific Integrity: Examining the IARC Monograph Programme and Glyphosate Review.” Evidence presented revealed that the monograph program is an antiquated approach that simply tries to determine from laboratory studies whether a particular chemical might cause cancer in test animals, even if only at ridiculously high levels that no human would or could ever be exposed to in the real world.

IARC performs no actual risk assessments that examine the potency of a substance to humans or the level of exposure at which the substance might actually have an adverse effect on people. It thus places bacon, sausage, plutonium and sunlight together in Group 1, its highest risk category: “definitely carcinogenic.” This provides no useful information from a public health perspective, but does give ammunition to activists who want to stoke fear and get chemicals they dislike banned.

IARC’s Group 2B carcinogens include caffeic acid, which is found in coffee, tea, and numerous healthy, must-eat fruits and vegetables, including apples, blueberries, broccoli, kale and onions. This group also includes acetaldehyde, which is found in bread, ginkgo biloba and aloe vera, lead Science Committee witness Dr. Timothy Pastoor noted in his testimony.

As Pastoor also pointed out during the hearing, countless chemicals could theoretically cause cancer in humans at extremely high doses – but are completely harmless at levels encountered in our daily lives.

But it’s not just IARC’s overall approach that raises questions. As investigative journalists David Zaruk and Kate Kelland discovered, serious allegations have also been raised regarding the integrity of IARC’s review process. These include evidence that IARC deleted or manipulated data – and covered up major conflicts of interest by agency panel members who were employed by environmental activists and mass tort plaintiff attorneys who are targeting the very chemicals the panelists were reviewing and judging.

IARC’s latest quarry is glyphosate, the world’s most widely used herbicide. The principal ingredient in the weed killer RoundUp, glyphosate is vital in modern agriculture, especially no-till farming.

The European Food Safety Authority, European Chemicals Agency, German Institute for Risk Assessment, US Environmental Protection Agency and other experts all found that glyphosate is safe and non-carcinogenic. So did the 25-year, multi-agency US Agricultural Health Study (AHS), which analyzed data on more than 89,000 farmers, commercial applicators, other glyphosate users and their spouses.

IARC alone says glyphosate is likely a cancer-causing agent – contradicting every other regulatory and reputable scientific body around the world. How could it possibly reach such a different conclusion?

According to Zaruk, Kelland and committee members, IARC deliberately ignored the AHS analysis. The chairman of the IARC working group on glyphosate later admitted in a sworn deposition that this study would have “altered IARC’s analysis.”

When an animal pathology report clearly said researchers “unanimously” agreed glyphosate had not caused abnormal growths in mice they had studied, IARC deleted the problematical sentence.

In other cases, IARC panelists inserted new statistical analyses that effectively reversed a study’s original finding, or quietly changed critical language exonerating the herbicide.

Meanwhile, Dr. Christopher Portier, the “consulting expert” for the working group that labeled glyphosate as “probably” cancer-causing, admitted in his own sworn testimony that – just a few days after IARC announced its guilty verdict – he signed a contract to serve as consultant to a law firm that is suing the chemical’s manufacturer (Monsanto) based on that verdict. Portier collected at least $160,000 just for his initial preparatory work.

Adding to the confusion and collusion, say Committee members, Linda Birnbaum’s $690-million-per-year National Institute for Environmental Health Sciences (in the National Institutes of Health) has been collaborating with the same government agencies, pressure groups, trial lawyers and yet another anti-chemical activist organization, the Ramazzini Institute in Italy.

This is not science. It is corruption, distortion and fraud – supported by our tax dollars and used to get important chemicals off the market.

The end result, if not the goal, is to undermine public confidence in science-based risk assessments, lend credibility to activist campaigns claiming numerous chemicals contaminate our foods and poison our bodies, and enable predatory tort lawyers to get rich suing manufacturers and driving them into bankruptcy.

Dr. Wild’s letters clearly suggest that IARC views the Science Committee’s concerns about the agency’s lack of scientific integrity and transparency as irrelevant – as a mere irritant, a minor threat to his agency’s unbridled power … and something the US government will ultimately do nothing to correct.

We will soon find out whether IARC is right – or if Congress is finally ready to play hardball with this unethical UN agency.

It’s also an important test for congressional oversight, spine and intestinal fortitude on holding other deep state agencies accountable for how they spend our money, what kind of science or pseudo-science they support and conduct, and how they will affect or even determine the public policies that in so many ways are the foundation of our economy, livelihoods and living standards.

PS: The Science Committee has also discovered that Vladimir Putin’s Internet Research Agency engaged in significant hacking, to inflame social media and instigate discord over US energy development and climate change policies – while Putin cronies laundered millions to fund radical green organizations. That too must be addressed by Congress and administrative agencies, including the Justice Department.

Via email



Preserving the graphics:  Most graphics on this site are hotlinked from elsewhere.  But hotlinked graphics sometimes have only a short life -- as little as a week in some cases.  After that they no longer come up.  From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site.  See  here or here


1 comment:

Anonymous said...

Ruth Khasaya Oniang'o fails to recognize a simple fact: before farming changed the nature of many areas, they were already populated by ungulates and grasses.

Farming has only changed the mix of which ungulates are in those areas and the particular grasses that area is growing.

That we intelligently harvest the seeds of the grasses as food instead of letting them merely fall to the ground and then replant some of those seeds for the next season only means that grasses with edible seeds have found a symbiotic relationship.

Likewise, the ungulates we raise for food are the most successful ungulates, as there are more of them than ever before, though their cousins who are less successful as food have had to pay the price of being displaced.

Overall we have not increased the amount of grasses the ground is growing nor the number of ungulates grazing; we have only changed which ones are successful, feeding ourselves and making us more successful as well.

Nature loves success.