Thursday, July 13, 2023



Georgia Nuclear Reactors: Firing Up Vogtle 3 And 4

On May 29, the Vogtle 3 nuclear reactor was brought to 100% power for the first time, putting it a step closer to adding roughly 1,100 MW to the grid, with its sister unit, Vogtle 4, not far behind. This is significant because the Vogtle reactors are the first nuclear power plants built from scratch in the U.S. since the 1979 Three Mile Island meltdown.

The Vogtle plants will be the first of the advanced AP1000 reactor designs to be built in the US (another two – Turkey Point 6 and 7 – are still in the planning stage, and four AP1000 reactors have come online in China since 2018). This reactor design represents a huge step forward, leapfrogging at least three decades of reactor development. You can catch up on how to design a nuclear reactor here, but it’s worth looking at what’s different about the AP1000 plants.

The very first reactors were built to demonstrate that a self-sustaining nuclear reaction could be initiated and controlled, and to produce plutonium for nuclear weapons during the Manhattan Project. The first generation of commercial nuclear reactors were prototypes and research reactors, intended largely to develop the technology and to learn how to build reliable nuclear power reactors. These reactors served as the basis for the plants that would be put in the first nuclear submarines and surface ships.

Even as the Generation I designs were being tweaked, refined, and made larger, the Generation II reactors were being designed to take advantage of what had been learned – most currently operating reactors are Generation II plants, designed and built between about 1965 and 1996. The main distinctions are that the Generation II plants were designed to operate reliably and safely for at least 30 years (roughly the time needed to pay off the construction loan), and many have had their lifespans extended to more than a half-century. They also tended to be huge and complex. The Watts Bar reactor in Tennessee is likely to be the last of this generation to be built in the US. And that brings us to the AP1000 – a Generation III+ reactor plant.

Generation III

The Generation III and III+ reactors are not a revolutionary improvement over the Generation II designs – that distinction belongs to the Generation IV reactors still on the drawing board. Generation III is characterized as an evolutionary design that offers significant improvements without straying too far beyond the boundaries of Generation II technology. Generation III and III+ reactors take advantage of over three decades of operational experience, an increased understanding of materials science and reactor physics, and vastly improved computer design techniques. As a result, Gen III and III+ reactors are

simpler in design

designed to last for at least 60 years

equipped with safety features that incorporate lessons learned from the Three Mile Island and Fukushima accidents.

This last point – enhanced reactor safety features – warrants additional discussion.

Acting Less, Doing More – Passive Safety Systems

Reactors generate heat, and that heat must be removed – even from a shutdown reactor – to prevent the core from melting. Gen II reactors accomplish this by running reactor coolant pumps powered by the grid, emergency diesel generators, or other sources. Fukushima demonstrated that all of those power sources can be lost at once when things go badly wrong; Gen III and III+ reactors add passive safety features that keep working even when the power doesn’t.

For example, when water warms, it becomes less dense and rises, allowing denser, cooler water to flow in and replace it. The hot water rises into a heat exchanger, where it transfers its heat into the atmosphere or into the water of a cooling pond (the heat sink), after which it is dense enough to sink back into the core – this is called natural circulation. Another form of passive cooling is to place a large water tank at a higher level than the reactor core; in the event of a leak or loss of power, the water in the tank (or pond) simply flows through the core under gravity.
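For a rough sense of the physics, the buoyancy "pump" in a natural-circulation loop comes from the density difference between the hot and cold water columns acting over the height of the loop. Here is a minimal sketch of that estimate in Python; the water temperatures, densities, and loop height are illustrative assumptions, not AP1000 design figures.

# Rough estimate of the driving pressure in a natural-circulation loop.
# All numbers are illustrative assumptions, not actual AP1000 parameters.

G = 9.81  # gravitational acceleration, m/s^2

def natural_circulation_head(rho_cold, rho_hot, height_m):
    # Driving pressure (Pa) = (rho_cold - rho_hot) * g * loop height
    return (rho_cold - rho_hot) * G * height_m

rho_cold = 996.0  # kg/m^3, water at roughly 30 C returning from the heat sink
rho_hot = 965.0   # kg/m^3, water at roughly 90 C leaving the core
height = 10.0     # m, assumed elevation of the heat exchanger above the core

dp = natural_circulation_head(rho_cold, rho_hot, height)
print(f"Driving pressure: {dp:.0f} Pa (about {dp / 1000:.1f} kPa)")

A few kilopascals is tiny compared with what a reactor coolant pump delivers, but it is enough to keep water slowly turning over through the core indefinitely, with no electricity and no operator action required.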

Another safety feature of many Gen III and III+ reactors is a “core catcher” that’s there in case the emergency cooling system runs out of water or otherwise fails to operate correctly. Suppose a loss of cooling causes the core to heat to the point of melting. In that case, the molten materials (called “corium” by some) will fall or flow into the core catcher instead of melting through the bottom of the reactor vessel and onto the floor of the containment building. This helps to protect the integrity of the containment structure, reducing the risk that large quantities of radioactivity will be released into the environment. The idea is to not rely on the operators to take the correct action (or, indeed, to take any actions). According to nuclear engineer James Mahaffey in his book Atomic Accidents, a number of nuclear reactor accidents would have been less severe or averted altogether had the operators simply sat on their hands and relied on the automatic systems – this suggests that including passive cooling systems that require neither electricity nor actions by reactor operators is a good thing!

Together, the two new Vogtle reactors will add over 2,200 MW of carbon-free electricity to the grid when online. Generating that energy with natural gas would pump more than 7.7 million tons of CO2 into the atmosphere yearly; fuel oil and coal are roughly twice as bad. Over the half-century they are expected to be operational, that’s nearly 400 million tons of carbon dioxide that will not be produced because these two reactors went online. That’s not bad.
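Those figures are easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes a 90% capacity factor and an emission rate of about 0.45 tonnes of CO2 per MWh for combined-cycle gas generation; both are typical values chosen for illustration, not numbers from Georgia Power.

# Back-of-the-envelope estimate of CO2 displaced by Vogtle 3 and 4.
# Capacity factor and gas emission intensity are assumed typical values.

capacity_mw = 2200        # combined output of the two new units
capacity_factor = 0.90    # assumed; U.S. nuclear plants typically exceed this
gas_co2_per_mwh = 0.45    # tonnes CO2 per MWh, typical combined-cycle gas plant
years = 50                # assumed operating life

annual_mwh = capacity_mw * capacity_factor * 8760
annual_co2 = annual_mwh * gas_co2_per_mwh
lifetime_co2 = annual_co2 * years

print(f"Electricity per year: {annual_mwh / 1e6:.1f} TWh")
print(f"CO2 avoided per year: {annual_co2 / 1e6:.1f} million tonnes")
print(f"CO2 avoided over {years} years: {lifetime_co2 / 1e6:.0f} million tonnes")

Under those assumptions the result lands at roughly 7-8 million tonnes per year and close to 400 million tonnes over a half-century of operation.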

***********************************************

Wrong, USA Today and Other Media, U.S. States Haven’t Set New High Temperature Records This Summer

USA Today followed the lead of several other mainstream media outlets in claiming that, during a recent heatwave in early July, the world and several U.S. states set new all-time high temperature records. This is false. Although it has been hot, had the outlets bothered to check the historical record they would have found that none of the high temperatures reached in the states mentioned in recent days approached their historic highs.

The USA Today story, “Heat record after heat record will be broken in 2023. Here’s how to make sense of it all,” claims that all-time high temperature records were broken in Florida, Arizona, and Texas. These claims mirror similar claims made by other media outlets, for instance The Guardian and Axios.

“The Earth’s unofficial average temperature broke records last week,” wrote USA Today. “Daily high temperatures broke records in South Florida and Arizona. A Texas heat dome broke records in June and it was the planet’s warmest June on record.”

As has been discussed by numerous other analysts, here, here, and at Climate Realism, here, for instance, claims that the Earth set all-time temperature records last week were based on flawed computer model “reanalyses,” not hard data recorded by satellites, weather balloons, and surface stations. Computer models consistently project much hotter temperatures than those actually recorded.

Indeed, despite the headline-grabbing attention given to the asserted record-setting global temperatures, when USA Today asked the National Oceanic and Atmospheric Administration about the claims, the agency said surface temperature data did not confirm that global high temperature records were broken in early July.

Although daily high temperature records may have been broken in some cities in the states discussed in the last week, there is no evidence that long-term climate change is to blame for a single date’s or week’s spike in temperatures, especially when there are more likely causes for the recorded temperatures.

The Earth has recently entered an El Niño phase, a natural weather pattern that, as the National Weather Service points out, causes a widespread increase in surface temperatures. In addition, the Atlantic Ocean is simultaneously experiencing a period of an extremely low North Atlantic Oscillation index, which also drives warmer sea surface temperatures.

Also, each of the cities or areas where it is claimed records were broken have temperature stations badly compromised by the urban heat island effect (UHI). Tremendous population growth has occurred in the locations discussed in the various stories since the previous records were set. For example, San Angelo, Texas was among the areas that the media is reporting broke its all-time high for a date in early July, with a temperature of 114℉, breaking the previous record of 111℉, set first in 1933 and tied repeatedly since then, in 1943, 1944, and 1960 (the latter three years during a time when the Earth was cooling modestly). What the stories neglect to mention, however, is that since the prior record was set and tied, San Angelo has grown substantially, with more attendant artificial heat sources. From the early 1930s to the present San Angelo’s population has grown approximately 294 percent, and since 1960, the last time the previous temperature record was tied, San Angelo’s population has grown more than 69 percent.
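The growth percentages quoted above follow directly from census counts. The short sketch below uses approximate census figures for San Angelo; the population numbers are my rounded assumptions based on U.S. Census data, not figures given in the article.

# Percent population growth for San Angelo, TX between record-setting years.
# Census figures are approximate, rounded values used for illustration only.

population = {
    1930: 25_300,   # approximate count near the original 1933 record
    1960: 58_800,   # approximate count, last year the old record was tied
    2020: 99_900,   # approximate recent census count
}

def pct_growth(old, new):
    # Percent increase from old to new
    return (new - old) / old * 100

print(f"Growth since the early 1930s: {pct_growth(population[1930], population[2020]):.0f}%")
print(f"Growth since 1960: {pct_growth(population[1960], population[2020]):.0f}%")

The output is roughly 295% and 70%, in line with the approximately 294 percent and 69 percent figures cited above.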

It is worth remembering that Texas’s all-time record high temperature of 120℉ was recorded in two locations, both with populations then and now of fewer than 10,000 people: Seymour in 1936 and Monahans in 1994. Both locations have a fraction of the population of San Angelo, and in that context San Angelo’s recent high doesn’t seem so dramatic.

Population has grown even faster and is much larger in Florida and Arizona, two of the nation’s fastest growing states. For example, whereas USA Today vaguely references daily high temperature records being set in Arizona, The Guardian gives some details, pointing out that Phoenix may have tied its second hottest day on record of 121℉, a record set in 1995. Yet since 1990 Phoenix’s population has increased more than 67 percent. For Arizona as a whole, the current high temperature record of 128℉ was set in 1994 at Lake Havasu City. In 1990, Lake Havasu City’s population was just 2.5 percent of Phoenix’s, and despite more than doubling in size since then it is still only 3.4 percent of Phoenix’s size. In sparsely populated Lake Havasu City, it is exceedingly unlikely that the UHI compromised Arizona’s all-time high temperature record. By contrast, the UHI almost certainly factored into Phoenix’s recent high, but not record-setting, temperature readings.

And as for Florida, its all-time record high of 109℉ was set more than 90 years of global warming ago, on June 29, 1931, in Monticello, a town of fewer than 2,000 people at the time. That’s 19 degrees above the temperatures in Daytona Beach in early June that USA Today expressed such concern about.

Have some high temperature records been set this summer, and will more be set? Almost certainly. It’s almost always the case that during the summer new temperature records are set somewhere, on some date, and this is especially likely this year with the oceanic circulation patterns currently in place. However, because there is no long-term trend of increasing heatwaves or consistently record-setting temperatures, climate change cannot be blamed for any new records.

*********************************************

We Must Remove All Barriers to Oil and Gas Pipelines

The world liquefied natural gas (LNG) market has changed significantly since the first LNG ship carried cargoes from Louisiana to the UK in 1959, proving the feasibility of transoceanic LNG transport. The change has been even more significant in the U.S. Over the years, we have been both an importer and an exporter of LNG, though mostly a net importer until recently. Today, the U.S. is a major exporter in global markets thanks to an abundance of natural gas supply brought about by advances in fracking technologies.

If these changes are evaluated in isolation, it is a success story for the U.S. energy industry and the nation. After all, being energy independent – or at least a net energy exporter – is one of the most important things for national security, as we have seen recently with the impacts of the Russia-Ukraine war. And that independence can have bigger implications: according to CSIS experts, “Energy security in Europe – and globally – now rests on U.S. natural gas exports.”

Despite the strategic importance of U.S. natural gas in global markets, the domestic discourse around the issue has been less than celebratory. The discussion among policymakers revolves around two big issues: potential price impacts on U.S. consumers of increased LNG exports and the impact of increased production on climate change.

Price impacts have been a big concern since the onset of the Russia-Ukraine war disrupted global energy markets, creating short-term price spikes. Opponents of U.S. LNG exports claim that any increase in U.S. supply to global markets could limit the amount available for domestic consumption and could have negative impacts on both residential and industrial users of natural gas. This argument is not new, and over the years research – including research conducted or commissioned by the U.S. Department of Energy – has shown that the U.S. has ample supply and that price impacts would be small, since the export volumes would come from increased domestic production.

But given the significant change in the dynamics of global energy markets, we asked NERA Economic Consulting, which conducted similar studies in 2012 and 2018, to revisit the potential implications of increased U.S. LNG exports on prices. NERA considered various increased-demand scenarios, such as higher U.S. demand or European supply diversification. The results of the study showed that price impacts were small, ranging from 5 to 10 cents per MMBtu in 2025, validating the results of prior studies.

We went one step further and asked NERA to evaluate how access to natural gas, in the form of expanded pipelines, might change these price dynamics, and that is where the results got interesting: building adequate pipeline infrastructure lowers prices by about 10%, or 25 cents per MMBtu, in 2025 under all scenarios, and by between 25 and 40 cents per MMBtu in 2035. These results also underline the importance of recent calls to reform the permitting process, which is key to the expansion of energy infrastructure in the U.S. The conclusion is that U.S. policymakers should be rooting for energy infrastructure development rather than blocking energy exports if their main concern is U.S. consumers.

Assuming existing studies answer the price question, what about climate change? According to the International Energy Agency, fossil fuel operations “generate over one third of all methane emissions from human activity.” The Environmental Protection Agency shows that gas production was responsible for 41% of methane emissions in 2020. Even the industry’s main association lists methane emission reduction as one of its top priorities and voices its support for cost-effective policies and direct regulation of methane for new and existing sources across the supply chain. Global consumers and investors are also demanding cleaner gas.

However, recent global developments have highlighted the integral role of natural gas in energy markets, and simply replacing or blocking its development is not in the cards in the short and medium term. During this past winter, many European and Asian countries resorted to coal, which has a significantly more damaging environmental impact. Given the growing role of natural gas globally, the industry needs a public-private partnership to address methane emissions – the Achilles heel of natural gas. Increased efforts to address the problem are already underway in the U.S. In addition to private sector efforts to deal with methane emissions, the U.S. Department of Energy recently announced nearly $47 million in funding for 22 research projects to develop new and innovative measurement, monitoring, and mitigation technologies. Government funding of new technologies, as it did with carbon capture and storage, could help significantly reduce methane emissions from natural gas production.

The U.S. has an ample supply of natural gas that could help both domestic and international markets. We should work on providing access to this low-cost energy to U.S. consumers through expanded pipelines and create a system that encourages cleaner development of this resource.

*************************************************

Australian trial of seaweed cow feed fails to achieve hoped-for methane cuts

One of the world’s longest commercial trials of a seaweed supplement that the global meat industry hopes could slash methane from beef cattle has recorded much lower reductions in the potent greenhouse gas than previous studies.

Putting the supplement into the diets of 40 wagyu cattle in an Australian feedlot for 300 days cut the methane they produced by 28%.

The supplement was derived from the red seaweed species Asparagopsis, which has been widely promoted as being able to cut methane by more than 80%, with some experiments suggesting reductions as high as 96%.

Globally, the UN’s Food and Agriculture Organization estimates, methane from burping cattle – known as enteric emissions – releases about 2.1bn tonnes of CO2-equivalent a year, compared with the 37.5bn tonnes of CO2 from burning fossil fuels.

But because methane is about 80 times more potent than CO2 at warming the planet over a 20-year period, cutting methane is seen as a way to slow global heating faster.
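How large methane looks next to CO2 depends entirely on the time horizon chosen for its global warming potential (GWP). The sketch below shows the conversion; the GWP values are commonly cited IPCC-style figures and the methane tonnage is an illustrative assumption, not a number taken from the article.

# Converting methane emissions to CO2-equivalent under different time horizons.
# GWP values are approximate IPCC-style figures; the methane tonnage is assumed.

GWP_100 = 28   # approximate potency of CH4 relative to CO2 over 100 years
GWP_20 = 80    # approximate potency over 20 years, as cited in the article

def co2_equivalent(ch4_tonnes, gwp):
    # CO2-equivalent tonnage for a given mass of methane
    return ch4_tonnes * gwp

enteric_ch4 = 75e6  # assumed ~75 million tonnes of enteric methane per year

print(f"GWP-100 basis: {co2_equivalent(enteric_ch4, GWP_100) / 1e9:.1f}bn tonnes CO2-eq")
print(f"GWP-20 basis:  {co2_equivalent(enteric_ch4, GWP_20) / 1e9:.1f}bn tonnes CO2-eq")

On a 100-year basis the assumed tonnage works out to about 2.1bn tonnes CO2-equivalent, close to the FAO figure above; on a 20-year basis the same methane counts for roughly three times as much, which is why cutting it is seen as a fast lever on near-term warming.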

The trial, reported by the red meat industry’s marketing and research group Meat and Livestock Australia (MLA), also found animals given the supplement ate less food and weighed 15kg less by the time they were sent for slaughter.

Dr Fran Cowley, a livestock scientist at the University of New England who led the trial, said it was the longest run so far using the red seaweed.

She said more research was needed to understand why the wagyu in the trial had not delivered the same level of emissions reductions as other experiments.

One factor could be the way the methane was measured in the trial, which used an open-air system in a feedlot compared with animals measured in dedicated indoor chambers.

But the trial report noted that other experiments over shorter timeframes using the same open-air measurement technique had recorded higher methane reductions.

“This was the biggest and longest trial so far and [the supplement] has not performed to the levels seen in the headlines people might have picked up. But that doesn’t mean it can’t,” Cowley said.

Cowley said she thought cuts of 90% “in the real world” were possible, but there would also be economic factors that commercial producers would have to weigh, such as the cost of the supplement against the market benefits of methane reductions.

The seaweed was mixed in canola oil and added to the animals’ feed. In this trial it was given to the animals at slightly lower concentrations than other experiments that showed much higher methane reductions.

Cowley said it was also not clear why the animals on the supplement ate less food and put on weight more slowly.

Accounting for the extra 35 days the animals would have taken to reach the same weight would theoretically have cut the emissions savings from 28% to 19%, as the animals would have been alive for longer, emitting methane all the while.
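That adjustment can be reproduced with simple arithmetic: the treated animals emit 28% less methane per day, but to reach the same slaughter weight they would spend roughly 35 extra days on feed, emitting the whole time. A minimal sketch, taking the 300-day feeding period from the trial description above:

# Adjusting the headline methane reduction for the longer time on feed.
# Daily emissions are normalised to 1.0 for the untreated animals.

baseline_days = 300        # days on feed for untreated animals (from the trial)
extra_days = 35            # extra days needed to reach the same weight
daily_reduction = 0.28     # headline per-day methane cut

baseline_methane = 1.0 * baseline_days
treated_methane = (1.0 - daily_reduction) * (baseline_days + extra_days)

adjusted_reduction = 1 - treated_methane / baseline_methane
print(f"Adjusted lifetime methane reduction: {adjusted_reduction:.1%}")

The result is about 19.6%, consistent with the roughly 19% figure reported.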

Wagyu is considered a higher-end and more expensive beef. The trial found the seaweed supplement had no effect on the meat’s properties, including flavour.

Dr Rob Kinley is a pioneer of the Asparagopsis supplement and the chief scientist at FutureFeed – the Australian company that holds the intellectual property for its use globally as a livestock feed supplement.

He said it was not surprising the trial had seen lower results given the differences across breeds, measuring techniques, diets of the animals and the amount of supplement given to the animals.

But Kinley said it should be celebrated that the supplement was able to cut methane over such a long period, and said he was confident other trials would deliver far higher reductions.

“The golden lining is even though it was just under 30% emissions reduction, it stayed that way for 275 days – it hardly faltered at all and I was impressed by that,” he said.

The Australian government funds a $29m research program to test different methane-reducing livestock supplements, including red seaweed.

The latest trial was financially backed by the country’s biggest beef producer, the Australian Agricultural Company (AACo), which helped run the trial and provided the animals.

The AACo chief executive, David Harris, said the company had anticipated bigger methane cuts but “reducing emissions by almost 30% is still significant”.

“There is no silver bullet to eliminating enteric methane emissions, but we’ll keep trying and we’ll discover how to make it work in our environment,” he said. “The important thing is that we are determined to get there.”

Most trials of methane-reducing supplements report emissions reductions only while animals are in the feedlot. Only 12%-15% of AACo’s emissions occur while the animals are in a feedlot.

An MLA spokesperson said: “Each time a new research project concludes, it places another piece into the puzzle, helping us understand the various products that might incorporate Asparagopsis and also helps us to understand further questions that need to be answered.”

***************************************

My other blogs. Main ones below

http://dissectleft.blogspot.com (DISSECTING LEFTISM )

http://edwatch.blogspot.com (EDUCATION WATCH)

http://pcwatch.blogspot.com (POLITICAL CORRECTNESS WATCH)

http://australian-politics.blogspot.com (AUSTRALIAN POLITICS)

http://snorphty.blogspot.com/ (TONGUE-TIED)

http://jonjayray.com/blogall.html More blogs

*****************************************
