Thursday, July 24, 2014
Mega pesky: Deep Oceans Have Been Cooling for the Past 20 Years
In polite scientific language, this study demolishes the Warmist explanation for "missing heat". At every point of the Warmist explanation, the data show the opposite of what that explanation requires. In addition, as even I have repeatedly pointed out, the authors note that there is no known mechanism that would cause ocean heat to move in the paradoxical way that Warmists theorize. It's all BS, to put it in layman's terms.
Two of the world’s premier ocean scientists, from Harvard and MIT, have addressed the data limitations that currently prevent the oceanographic community from resolving the differences among various estimates of changing ocean heat content. They point out where future data are most needed so these ambiguities do not persist into the next several decades of change.
As a by-product of that analysis they 1) determined the deepest oceans are cooling, 2) estimated a much slower rate of ocean warming, 3) highlighted where the greatest uncertainties existed due to the ever changing locations of heating and cooling, and 4) specified concerns with previous methods used to construct changes in ocean heat content, such as Balmaseda and Trenberth’s re-analysis (see below). They concluded, “Direct determination of changes in oceanic heat content over the last 20 years are not in conflict with estimates of the radiative forcing, but the uncertainties remain too large to rationalize e.g., the apparent “pause" in warming.”
Wunsch and Heimbach (2014) humbly admit that their “results differ in detail and in numerical values from other estimates, but determining whether any are “correct" is probably not possible with the existing data sets.”
They estimate the changing states of the ocean by synthesizing diverse data sets using models developed by the consortium for Estimating the Circulation and Climate of the Ocean, ECCO. The ECCO “state estimates” have eliminated deficiencies of previous models and they claim, “unlike most “data assimilation" products, [ECCO] satisfies the model equations without any artificial sources or sinks or forces. The state estimate is from the free running, but adjusted, model and hence satisfies all of the governing model equations, including those for basic conservation of mass, heat, momentum, vorticity, etc. up to numerical accuracy.”
Their results (Figure 18, below) suggest a flattening or slight cooling in the upper 100 meters since 2004, in agreement with the -0.04 Watts/m2 cooling reported by Lyman (2014).6 The consensus of previous researchers has been that temperatures in the upper 300 meters have flattened or cooled since 2003,4 while Wunsch and Heimbach (2014) found the upper 700 meters still warmed up to 2009.
The deep layers contain twice as much heat as the upper 100 meters, and overall exhibit a clear cooling trend for the past two decades. Unlike the upper layers, which are dominated by the annual cycle of heating and cooling, they argue that deep ocean trends must be viewed as part of the ocean’s long term memory, which is still responding to “meteorological forcing of decades to thousands of years ago”. If Balmaseda and Trenberth’s model of deep ocean warming were correct, any increase in ocean heat content must have occurred between 700 and 2000 meters, but the mechanism that would warm that “middle layer” remains elusive.
The detected cooling of the deepest oceans is quite remarkable given geothermal warming from the ocean floor. Wunsch and Heimbach (2014) note, “As with other extant estimates, the present state estimate does not yet account for the geothermal flux at the sea floor whose mean values (Pollack et al., 1993) are of order 0.1 W/m2,” which is small but “not negligible compared to any vertical heat transfer into the abyss.”3 (A note of interest: an increase in heat from the ocean floor has recently been associated with increased basal melt of Antarctica’s Thwaites Glacier.) Since heated waters rise, I find it reasonable to assume that, at least in part, any heating of the “middle layers” likely comes from heat that was stored in the deepest ocean decades to thousands of years ago.
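To put that geothermal number in perspective, a back-of-the-envelope calculation helps. This little sketch is purely illustrative; the ocean surface area is my own assumed round figure, not a value taken from Wunsch and Heimbach (2014):

```python
# Back-of-the-envelope check (illustrative only; the ocean area is an
# assumed round number, not a figure from the paper).
OCEAN_AREA_M2 = 3.6e14     # approximate global ocean surface area (assumed)
GEOTHERMAL_FLUX = 0.1      # W/m^2, mean sea-floor flux per Pollack et al. (1993)
CLAIMED_ACCURACY = 0.01    # W/m^2, abyssal warming accuracy claimed by
                           # Balmaseda et al. (2013)

# Total geothermal power delivered to the abyss across the whole ocean
total_geothermal_w = GEOTHERMAL_FLUX * OCEAN_AREA_M2

# How the flux compares with the claimed measurement accuracy
ratio = GEOTHERMAL_FLUX / CLAIMED_ACCURACY

print(f"Total geothermal input: {total_geothermal_w:.1e} W")
print(f"Geothermal flux is {ratio:.0f}x the claimed 0.01 W/m^2 accuracy")
```

Even at a tenth of a watt per square meter, the sea floor supplies tens of terawatts to the abyss, a flux ten times larger than the accuracy claimed for the abyssal warming estimates being questioned here.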
Wunsch and Heimbach (2014) emphasize the many uncertainties involved in attributing the cause of changes in the overall heat content concluding, “As with many climate-related records, the unanswerable question here is whether these changes are truly secular, and/or a response to anthropogenic forcing, or whether they are instead fragments of a general red noise behavior seen over durations much too short to depict the long time-scales of Fig. 6, 7, or the result of sampling and measurement biases, or changes in the temporal data density.”
Given those uncertainties, they concluded that much less heat is being added to the oceans compared to claims in previous studies (seen in the table below). It is interesting to note that compared to Hansen’s study that ended in 2003 before the observed warming pause, subsequent studies also suggest less heat is entering the oceans. Whether those declining trends are a result of improved methodologies, or due to a cooler sun, or both requires more observations.
No climate model had predicted the dramatically rising temperatures in the deep oceans calculated by the Balmaseda/Trenberth re-analysis,13 and oceanographers suggest such a sharp rise is more likely an artifact of shifting measuring systems. Indeed the unusual warming correlates with the switch to the Argo observing system. Wunsch and Heimbach (2013)2 wrote, “clear warnings have appeared in the literature—that spurious trends and values are artifacts of changing observation systems (see, e.g., Elliott and Gaffen, 1991; Marshall et al., 2002; Thompson et al., 2008)—the reanalyses are rarely used appropriately, meaning with the recognition that they are subject to large errors.”3
More specifically Wunsch and Heimbach (2014) warned, “Data assimilation schemes running over decades are usually labeled “reanalyses.” Unfortunately, these cannot be used for heat or other budgeting purposes because of their violation of the fundamental conservation laws; see Wunsch and Heimbach (2013) for discussion of this important point. The problem necessitates close examination of claimed abyssal warming accuracies of 0.01 W/m2 based on such methods (e.g., Balmaseda et al., 2013).” 3
So who to believe?
Because ocean heat is stored asymmetrically and that heat is shifting 24/7, any limited sampling scheme will be riddled with large biases and uncertainties. In Figure 12 below Wunsch and Heimbach (2014) map the uneven densities of regionally stored heat. Apparently associated with its greater salinity, most of the central North Atlantic stores twice as much heat as any part of the Pacific and Indian Oceans. Regions where there are steep heat gradients require a greater sampling effort to avoid misleading results. They warned, “The relatively large heat content of the Atlantic Ocean could, if redistributed, produce large changes elsewhere in the system and which, if not uniformly observed, show artificial changes in the global average.” 3
Furthermore, due to the constant time-varying heat transport, regions of warming are usually compensated by regions of cooling, as illustrated in their Figure 15. It offers a wonderful visualization of the current state of those natural ocean oscillations by comparing changes in heat content between 1992 and 2011. Those patterns of heat re-distribution involve enormous amounts of heat, which makes detection of changes in heat content that are orders of magnitude smaller extremely difficult. Again, any uneven sampling regime in time or space would result in “artificial changes in the global average”.
Figure 15 shows the most recent effects of La Nina and the negative Pacific Decadal Oscillation. The eastern Pacific has cooled, while simultaneously the intensifying trade winds have swept more warm water into the western Pacific, causing it to warm. Likewise heat stored in the mid-Atlantic has likely been transported northward, as that region has cooled while simultaneously the sub-polar seas have warmed. This northward change in heat content is in agreement with earlier discussions about cycles of warm water intrusions that affect Arctic sea ice, confound climate models of the Arctic, and control the distribution of marine organisms.
Most interesting is the observed cooling throughout the upper 700 meters of the Arctic. There have been two competing explanations for the unusually warm Arctic air temperature that heavily weights the global average. CO2-driven hypotheses argue that global warming has reduced polar sea ice that previously reflected sunlight, and that the now-exposed dark waters are absorbing more heat and raising water and air temperatures. But clearly a cooling upper Arctic Ocean suggests any absorbed heat is insignificant. Despite greater inflows of warm Atlantic water, the declining heat content of the upper 700 meters supports the competing hypothesis that warmer Arctic air temperatures are, at least in part, the result of increased ventilation of heat that was previously trapped by a thick insulating ice cover.7 That second hypothesis is also in agreement with extensive observations that Arctic air temperatures had been cooling in the 80s and 90s. Warming occurred after subfreezing winds, re-directed by the Arctic Oscillation, drove thick multi-year ice out from the Arctic.11
Regional cooling is also detected along the storm track from the Caribbean and along eastern USA. This evidence contradicts speculation that hurricanes in the Atlantic will or have become more severe due to increasing ocean temperatures. This also confirms earlier analyses of blogger Bob Tisdale and others that Superstorm Sandy was not caused by warmer oceans.
In order to support their contention that the deep ocean has been dramatically absorbing heat, Balmaseda/Trenberth must provide a mechanism and the regional observations showing where heat has been carried from the surface to those depths. But few are to be found. Warming at great depths and simultaneous cooling of the surface is antithetical to climate model predictions. Models had predicted global warming would store heat first in the upper layer and stratify that layer. Diffusion would require hundreds to thousands of years, so it is not the mechanism. Trenberth, Rahmstorf, and others have argued the winds could drive heat below the surface. Indeed winds can drive heat downward in a layer that oceanographers call the “mixed layer,” but the depth where wind mixing occurs is restricted to a layer roughly 10-200 meters thick over most of the tropical and mid-latitude belts. And those depths have been cooling slightly.
The only other possible mechanism that could reasonably explain heat transfer to the deep ocean is that the winds could tilt the thermocline. The thermocline delineates a rapid transition between the ocean’s warm upper layer and cold lower layer. As illustrated above in Figure 15, during a La Nina warm waters pile up in the western Pacific and deepen the thermocline. But the tilting Pacific thermocline typically does not dip below 700 meters, if ever.8
Unfortunately the analysis by Wunsch and Heimbach (2014) does not report on changes in the layer between 700 meters and 2000 meters. However based on changes in heat content below 2000 meters (their Figure 16 below), deeper layers of the Pacific are practically devoid of any deep warming.
The one region transporting the greatest amount of heat into the deep oceans is the ice forming regions around Antarctica, especially the eastern Weddell Sea where annually sea ice has been expanding.12 Unlike the Arctic, the Antarctic is relatively insulated from intruding subtropical waters (discussed here) so any deep warming is mostly from heat descending from above with a small contribution from geothermal.
Counter-intuitively, greater sea ice production can deliver relatively warmer subsurface water to the ocean abyss. When oceans freeze, the salt is ejected to form a dense brine with a temperature that always hovers at the freezing point. Typically this unmodified water is called shelf water. Dense shelf water readily sinks to the bottom of the polar seas. However, in transit to the bottom, shelf water must pass through layers of variously modified Warm Deep Water or Antarctic Circumpolar Water, and turbulent mixing entrains some of that warmer water down to the abyss. Warm Deep Water typically comprises 62% of the mixed water that finally reaches the bottom. Any altered dynamic (such as increasing sea ice production, or circulation effects that entrain a greater proportion of Warm Deep Water) can redistribute more heat to the abyss.14 Due to the Antarctic Oscillation, the warmer waters carried by the Antarctic Circumpolar Current have been observed to undulate southward, bringing those waters closer to ice forming regions. Shelf waters have generally cooled and there has been no detectable warming of the Warm Deep Water core, so this region’s deep ocean warming is likely just re-distributing heat, not adding to the ocean heat content.
So it remains unclear if and how Trenberth’s “missing heat” has sunk to the deep ocean. The depiction of a dramatic rise in deep ocean heat is highly questionable, even though alarmists have flaunted it as proof of CO2’s power. As Dr. Wunsch had warned earlier, “Convenient assumptions should not be turned prematurely into ‘facts,’ nor uncertainties and ambiguities suppressed.” … “Anyone can write a model: the challenge is to demonstrate its accuracy and precision... Otherwise, the scientific debate is controlled by the most articulate, colorful, or adamant players.”
To reiterate, “the uncertainties remain too large to rationalize e.g., the apparent “pause" in warming.”
More HERE (See the original for links, graphics etc.)
“WELL-ESTIMATED GLOBAL SURFACE WARMING”
Warmist paper was just being wise after the event
Dr David Whitehouse
This new paper allows great headlines to proclaim that the warming “pause” in global surface temperature is explainable by climate models. As is often the case in climate reporting the details do not back up the headline.
Risbey et al (2014) in Nature Climate Change is yet another paper suggesting that the global surface temperature hiatus of the last 15 years or so is due to changes in the character of the ENSO. But they go a little further and say that once the observational timing of ENSO changes is included, climate models do a good job. Unfortunately, whilst an interesting and thought-provoking paper, it does not support its own conclusion that “climate models have provided good estimates of the 15-year trends for recent periods.”
Climate models have many uses and are essential tools to discover what is going on and, with major caveats, suggest future possibilities. It is well known that as a whole the CMIP5 ensemble of models does not represent reality that well, with only two models coming anywhere near reflecting the hiatus in global surface temperature seen in the last 15 years or so.
With a climate model ensemble that is mainly unrepresentative of reality there are several possibilities for further action. One is to have faith that, over longer timescales, reality’s departure from the models is temporary. Another is to select those models that best simulate reality and concentrate on them, and a third is to refine the models. Risbey et al (2014) pursue both of the latter options.
They selected 18 out of 32 CMIP5 models, choosing the ones that had sea surface temperature as a model output. In itself this introduces a selection effect whose influence on subsequent selections of “suitable” models is unknown. Out of those 18 they selected the four best and four worst. The best included ENSO parameters that are in phase with observations. They argue that when the phase of ENSO is captured correctly, climate models do represent reality. Unfortunately the evidence they provide for this is not convincing.
If the ENSO with La Nina dominant is having the effect of flattening the global surface temperature of the past 15 years or so, then the converse must also be true: ENSO with El Nino dominant would have contributed to the warming seen since about 1980. [Pesky!]
Our lack of understanding of the ENSO process also affects the stated conclusions of this paper. We cannot predict these events with any certainty and we cannot simulate them to any degree of great accuracy. So while there are ENSO components in a climate model, to say that those in the right phase do better could mean nothing. In addition there are other semi-regular changes such as the Atlantic oscillation that might, or might not, be in phase with the observations.
Supplementary information would have helped in understanding this paper, especially the selection of the models, but unfortunately there is none. This means that, given the information in this paper alone, it would not be possible to retrace the authors’ footsteps.
This paper allows great headlines to be written proclaiming that the “pause” in global surface temperature is explainable by climate models. As is often the case in climate reporting the details do not back up the headline.
What this paper has really done is to draw attention to the limitations of the climate models. One can select subsets of them and argue that they are better than others but the real test is if the Risbey et al (2014) paper has predictive power. In science looking forward is always more powerful than looking back and adjusting models to fit the data.
Risbey et al (2014) say they expect the observed trend to bounce back. So do many others for different reasons. If it does how will we know who is right?
SOURCE
Deficient Chicago infrastructure blamed on climate change
Since there has been no climate change for 17 years, we can KNOW that to be false
Sewage gushed up Lori Burns’s toilet. It swept the floor. It wrecked the water heater, the deep freezer, her mother’s wedding veil.
This basement invasion was the third in five years. Burns, 40, could no longer afford to pay a cleanup crew. So she slipped on polka dotted rain boots, waded into the muck, wrenched out the stand-pipe and watched the brown water drain.
The South Side native, a marketing specialist, estimated damages at $17,000. And that did not include what she could not replace: the family heirlooms, the oriental rugs, her cashmere sweaters. The bungalow had flooded four times from 1985 to 2006, when her parents owned it. Lately, it flooded every other year. Burns felt nature was working against her. In a way, it was.
As Washington still fights over whether or not climate change is real, people across the country are already paying costs scientists ascribe to it — sometimes in unexpected places. You might think about climate change in terms of rising sea levels threatening coastal cities. But all over the Midwest, from Chicago to Indianapolis and Milwaukee, residents face just as many difficult issues as changing weather patterns collide with aging infrastructure. The costs — for governments, insurance companies and homeowners — are measured not only in dollars, but in quality of life.
In Chicago over the past century, downpours that force human waste up pipes and into homes — storms that dump at least 1.5 inches of rain in a single day — have struck the city more often. Annual precipitation in the Midwest grew about 20 percent during the past century. Rains of more than 2.5 inches a day are expected to increase another 50 percent in the next 20 years. That means more flooding — and more clean-up costs for people like Burns.
As the April rain poured, she texted her brother: How much bleach do you have?
On came the snowsuits, goggles and face masks. They dumped bleach on the floor, mopped and reminisced about what they had survived in this basement: a midnight home intruder, the occasional pop-pop of neighborhood gunfire, their parents’ divorce. Here they played Monopoly and watched “The Cosby Show” and learned the truth about Santa Claus.
Soon the silt, as Burns euphemistically called it, was gone. Fans would dry the dampness. The worst was over, it seemed.
In May, a year after sewage swamped Burns’s basement, an insurance giant took to an Illinois courtroom for what might have been a publicity stunt, or what might be a preview of a nationwide battle over who foots the bill for extreme weather events linked to climate change. Farmers Insurance Co. sued the city of Chicago for failing to prepare for the effects of global warming.
The city “should have known,” the lawsuit alleged, “that climate change in Cook County has resulted in greater rainfall volume … than pre-1970 rainfall history evidenced.” The storms are not an act of God, the suit claimed, but a carbon-driven reality outlined in Chicago’s own Climate Action Plan, published in 2010.
Last April, sewage water flooded roughly 600 Chicago buildings, according to the lawsuit: “Geysers of sewer water shot out from the floor drains, toilets, showers. … Elderly men and women and young children were forced to evacuate.” That could have been prevented, the company claimed, if Chicago had remedied an underground storm-water storage system that has become, over time, “obsolete.”
“Farmers has taken what we believe is the necessary action to recover payments made on behalf of our customers,” spokesman Trent Frager said in a statement, “for damages caused by what we believe to be a completely preventable issue.”
Two months later, the company dropped the suit — “We hoped that by filing … we would encourage cities and counties to take preventative steps,” Frager said — but not before raising issues that are sure to return to the courts if current climate trends persist.
“The debate we have entered now is: Why does it seem more and more disasters are happening?” said Erwann Michel-Kerjan, executive director of the Wharton Risk Management and Decision Processes Center at the University of Pennsylvania. “And, as a nation, who’s supposed to pay for them?”
The National Climate Assessment, released by the Obama administration in May, predicts that the “frequency and intensity” of the Midwest’s heaviest downpours will more than double over the next hundred years. A handful of heavy spring and summer storms, the kind that flood homes, can supply 40 percent of the region’s annual rainfall, according to the Environmental Protection Agency.
If weather patterns follow projections, that means trouble for aging urban infrastructures and the cities, like Chicago, that rely on them: “Designs are based upon historical patterns of precipitation and stream flow,” the climate assessment says, “which are no longer appropriate guides.”
The link between climate and flooding in Chicago, however, can’t be summarized as “it’s warmer out, so this storm happened.” Inherent uncertainties in science make it difficult to disentangle just what forces play Rainmaker.
The American Association for the Advancement of Science, which calls itself the world’s largest non-governmental science advocacy group, released a report this year called “What We Know,” which offers a nuanced look at climate change and its effects. The report concludes that natural disasters, like floods, are striking harder and more often. But, beyond anecdotes and weather projections, it adds, it’s hard to link any one specific flood to carbon emissions.
Increased storm frequency is particularly problematic in Chicago, where the sewer system was designed to absorb rain nearly 120 years ago. The city’s storm water systems were built on the assumption that the biggest storms happen only once each decade, at a time when the population was much smaller, said Robert Moore, who leads a climate preparation team at the Natural Resources Defense Council in downtown Chicago. “Climate change will only amplify an existing issue.”
The combined sewer system overflows when an inch of rain soaks the city, directing waste into the Chicago River. If more than 1.5 inches of rain fall city-wide in a day, Moore said, it floods basements across town, disrupting lives and bank accounts.
District engineers agree the problem is serious, and they’re building heavily to address it. They’ve seen the data and the changing weather patterns, but don’t think it suggests any particular cause. They don’t blame a man-made Apocalypse.
“Climate change is a political term,” said David St. Pierre, head of Chicago’s Metropolitan Water Reclamation District. “But you can’t ignore that our weather has changed drastically in the past five years.”
The city’s underground storm and wastewater storage can now hold about 2.7 billion gallons of overflow. By 2015, storage should total 7.5 billion gallons, St. Pierre said. By 2029, 17.5 billion gallons.
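For a sense of scale, here is a rough, purely illustrative volume check of what a basement-flooding storm delivers. The service area used is my own assumption (roughly the area of the city proper; the Reclamation District actually serves most of Cook County), so treat the result as a ballpark only:

```python
# Rough volume check (illustrative; the service area is an assumption,
# not a figure from the article).
SERVICE_AREA_SQMI = 234    # assumed: roughly the land area of Chicago
RAIN_INCHES = 1.5          # the basement-flooding threshold cited above

SQFT_PER_SQMI = 5280 ** 2  # square feet in a square mile
GALLONS_PER_CUFT = 7.48052 # US gallons per cubic foot

area_sqft = SERVICE_AREA_SQMI * SQFT_PER_SQMI
rain_cuft = area_sqft * (RAIN_INCHES / 12)   # inches -> feet of depth
rain_gallons = rain_cuft * GALLONS_PER_CUFT

print(f"1.5 inches over {SERVICE_AREA_SQMI} sq mi "
      f"~= {rain_gallons / 1e9:.1f} billion gallons")
```

On those assumptions, a single 1.5-inch storm drops on the order of six billion gallons on the city, which is why overflow storage measured in billions of gallons is the relevant yardstick.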
“I don’t see any overflows happening when that’s done,” he said. “We’re getting this under control, maybe more than any other city in the U.S.”
SOURCE
Britain Won’t Sign New Climate Treaty Unless China, India Agree CO2 Caps
Britain will not sign a global deal on climate change unless it includes commitments from China and India on reducing emissions, the energy and climate change secretary said on the eve of visiting the two countries.
China is the world’s highest emitter of greenhouse gases and India the third. Neither has agreed to any cap on emissions. In an interview with The Times, Ed Davey said that there was little point in Britain making great efforts to cut emissions if other countries did not. “If I looked around the world and no one was doing anything I would have to ask myself the question: is it worth us doing anything if no one else is?” he said.
Speaking before meetings in Beijing and Delhi this week to discuss contributions to a global climate deal due to be signed in Paris next year, Mr Davey said: “We won’t do a deal unless these countries come on board. We need a deal that’s applicable to all — that’s what we didn’t get at Kyoto [the 1997 conference in Japan at which binding targets were set for the emissions of industrialised nations].” Mr Davey said that developing countries should be allowed to carry on increasing their emissions for a few years but at a lower rate and with clear targets for when the level should peak and start declining.
“We expect the rich, developed countries to cut aggressively, emerging economies to peak and then decline and the developing countries and the poorest to increase but hopefully at low rates and have a more sustainable development model than we had.”
On China, he said: “The key for them and the world is when they will peak. The earlier the better. I would like it to be 2025 or earlier. If the Chinese were to say ‘we are not going to commit to a peaking point’, I’m not sure you would get a deal.
More HERE
A Great Plan to Replace the EPA
By Alan Caruba
For years now I have been saying that the Environmental Protection Agency (EPA) must be eliminated and its powers given to the fifty states, all of which have their own departments of environmental protection. Until now, however, there has been no plan put forth to do so.
Dr. Jay Lehr has done just that, and his plan no doubt will be sent to the members of Congress and the state governors. Titled “Replacing the Environmental Protection Agency,” it should be read by everyone who, like Dr. Lehr, has concluded that the EPA was a good idea when it was created in 1970, but has since evolved into a rogue agency threatening the U.S. economy, attacking the fundamental concept of private property, and intruding on the lives of all Americans in countless and costly ways.
Dr. Lehr is the Science Director and Senior Fellow of The Heartland Institute, for whom I am a policy advisor. He is a leading authority on groundwater hydrology and the author of more than 500 magazine and journal articles, and 30 books. He has testified before Congress on more than three dozen occasions on environmental issues and consulted with nearly every agency of the federal government and with many foreign countries. The Institute is a national nonprofit research and education organization supported by voluntary contributions.
Ironically, he was among the scientists who called for the creation of the EPA and served on many of the then-new agency’s advisory councils. Over the course of its first ten years, he helped write a significant number of legislative bills to create a safety net for the environment.
As he notes in his plan, “Beginning around 1981, liberal activist groups recognized EPA could be used to advance their political agenda by regulating virtually all human activities regardless of their impact on the environment. Politicians recognized they could win votes by posing as protectors of the public health and wildlife. Industries saw a way to use regulations to handicap competitors or help themselves to public subsidies. Since that time, not a single environmental law or regulation has passed that benefited either the environment or society.”
“The takeover of EPA and all of its activities by liberal activists was slow and methodical over the past 30 years. Today, EPA is all but a wholly owned subsidiary of liberal activist groups. Its rules account for about half of the nearly $2 trillion a year cost of complying with all national regulations in the U.S. President Barack Obama is using it to circumvent Congress to impose regulations on the energy sector that will cause prices to ‘skyrocket.’ It is a rogue agency.”
Dr. Lehr says that “Incremental reform of EPA is simply not an option.” He's right.
“I have come to believe that the national EPA must be systematically dismantled and replaced by a Committee of the Whole of the 50 state environmental protection agencies. Those agencies in nearly all cases long ago took over primary responsibility for the implementation of environmental laws passed by Congress (or simply handed down by EPA as fiat rulings without congressional vote or oversight).”
Looking back over the years, Dr. Lehr notes that “The initial laws I helped write have become increasingly draconian, yet they have not benefited our environment or the health of our citizens. Instead they suppress our economy and the right of our citizens to make an honest living. It seems to me, and to others, that this is actually the intention of those in EPA and in Congress who want to see government power expanded without regard to whether it is needed to protect the environment or public health.”
Eliminating the EPA would provide major savings, cutting roughly 80% of its budget. The remaining 20% could be used to run its research labs and administer the Committee of the Whole of the 50 state environmental agencies. “The Committee would determine which regulations are actually mandated in law by Congress and which were established by EPA without congressional approval.”
Dr. Lehr estimates the EPA’s federal budget would be reduced from $8.2 billion to $2 billion. Staffing would be reduced from more than 15,000 to 300, and that staff would serve in a new national EPA headquarters he recommends be “located centrally in Topeka, Kansas, to allow the closest contact with the individual states.” The staff would consist of six delegate-employees from each of the 50 states.
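As a quick sanity check on the figures quoted above (the numbers are the plan's; the arithmetic is mine):

```python
# Consistency check of Dr. Lehr's figures as quoted in this article.
current_budget = 8.2e9    # dollars, current federal EPA budget
proposed_budget = 2.0e9   # dollars, under the proposed plan
current_staff = 15000     # "more than 15,000" employees today
proposed_staff = 300      # six delegate-employees from each of 50 states

budget_cut = 1 - proposed_budget / current_budget
staff_cut = 1 - proposed_staff / current_staff

print(f"Budget reduction: {budget_cut:.0%}")
print(f"Staffing reduction: {staff_cut:.0%}")
```

The budget figures work out to roughly a 76% cut, broadly consistent with the "80% of its budget" claim earlier in the article, while the staffing reduction is far deeper, at 98%.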
“Most states,” says Dr. Lehr, “will enthusiastically embrace this plan, as their opposition to EPA’s ‘regulatory train wreck’ grows and since it gives them the autonomy and authority they were promised when EPA was first created and the funding to carry it out.”
The EPA was a good idea when it was created; the nation’s air and water needed to be cleaned, and by now they largely have been. Since then, the utterly bogus “global warming”, now called “climate change”, has been used to justify a torrent of EPA regulations. The science the EPA cites as justification is equally tainted and often kept secret from the public.
“It’s time for the national EPA to go,” says Dr. Lehr, and I most emphatically agree. “All that is missing is the political will.”
SOURCE
The EPA takes aim at Tesla, electric cars
The cornerstone of personal independence and commerce in the modern world is motorized mobility — the car. Ever since Henry Ford’s Model T revolutionized travel in the United States over a hundred years ago, people have relied on the automobile for virtually every personal interaction and business expenditure. Today, the car may very well be at the precipice of its evolutionary leap into the 21st century, and Obama’s regulatory state could kill it on arrival.
Elon Musk, founder and CEO of Tesla Motors, has been a pioneer in the development of electric cars that are as practical as they are attractive. Tesla cars are inherently American: efficient, sleek, fast, and, well, sexy. Everything we look for in the vehicles that represent such an enormous part of the American experience.
Recent stories have revealed Musk’s plan to release a $35,000 Tesla model capable of traveling more than 200 miles per charge, about double what the unattractive, euro-like Nissan Leaf can manage, and said to possess the amenities and attractiveness of the current, far more expensive Tesla models. A top-end electric car for the everyman. If achieved, this may mark the beginning of the commonly used, exhaust-free electric automobile. What a glorious achievement for the environmentalist left! … Right?
Well, not quite.
As one can easily deduce, the electric car requires electricity. For electricity to be a more efficient way to power said electric car than, say, petrol fuels, it needs to be available in inexpensive abundance. That’s the non-starter for the EPA and the environmental extremist allies of the Obama administration.
Most American energy is generated by coal and natural gas. Coal is already on its way out. Despite the resource’s ability to power the nation for over 500 years at current energy usage rates, the EPA has recently laid down a regulation forcing all plants to reduce emissions by 30 percent, a crippling blow to an already suffering industry. The regulations may actually work far better, and worse, than expected. They may very well reduce emissions from coal-generated power by 100 percent once the industry is unable to afford the staggering cost of retrofitting plants with new government-mandated technology. They may also, ironically, kill an industry that actually lures the American public away from the gasoline-fired automobile that the same regulatory clear-cutters want to do away with.
If energy prices skyrocket, as Obama said would be an inevitable outcome of his environmental policies, there is no practical purpose to investing in an electric car at any price point.
The free market may be ready to be rid of the carbon-puffing car, and the alarmist, reactionary left may have already killed it upon arrival.
What exactly does the Obama administration want for the future of American energy? The market knows what it wants, and the people know what they want. But it seems the environmentalist radicals behind the Obama administration’s energy and environment policies have an indiscriminate taste for destruction rather than building for the future.
Progress is just over the horizon; only the self-styled “progressives” stand in the way.
SOURCE
***************************************
For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are here or here or here. Email me (John Ray) here.
Preserving the graphics: Most graphics on this site are hotlinked from elsewhere. But hotlinked graphics sometimes have only a short life -- as little as a week in some cases. After that they no longer come up. From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site. See here or here
*****************************************