Tuesday, June 23, 2015
Schellnhuber's Papers that were presented at the Vatican
The papers are here.
Don Easterbrook comments:
Cleverly written nonsense. Virtually everything he says is contrary to real data, but a non-scientist reading this might think the graphs and rhetoric are real. Most of his claims are completely ludicrous. For example:
“As the latest IPCC Assessment Report demonstrates, the global mean surface temperature could rise above pre-industrial values by more than 4°C by 2100.”
Even the IPCC has had to admit that computer modeling predictions have been a total failure, not even close to reality, so this negates all his conclusions that follow.
“Using both computer simulations and sediment data, one can expect sea level rises by at least 2 meters per degree of warming.” 2 meters ≈ 6.6 feet, so 4 degrees gives about 26 feet by 2100! That’s roughly 3 feet per decade! (Sea level has been rising at about 7 inches per century.)
The real question is: where is all that water going to come from? Even during the rapid warming (20°F in 40-100 years) and drastic melting of the gigantic ice sheets covering huge areas at the end of the Pleistocene, the rate of sea level rise was only about 3 feet per century.
Those melting ice sheets, the source of that water, are now gone, and Antarctic ice is growing, not melting; yet he is predicting 10 times the late-Pleistocene rate of sea level rise with no source of meltwater!
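Easterbrook's arithmetic can be sketched quickly. This is only a back-of-envelope check; every input below is taken from the passage above (2 m per degree, 4°C of warming by 2100, ~3 feet per century during deglaciation), not from any dataset:

```python
# Back-of-envelope check of the sea-level figures discussed above.
M_TO_FT = 3.28084

rise_ft = 2 * 4 * M_TO_FT        # 2 m/degree x 4 degrees = 8 m, in feet
print(round(rise_ft, 1))         # -> 26.2

per_decade_ft = rise_ft / (2100 - 2015) * 10
print(round(per_decade_ft, 1))   # -> 3.1 (roughly "3 feet per decade")

# Compare with the ~3 ft/century late-Pleistocene rate cited above
print(round(per_decade_ft * 10 / 3))  # -> 10, the "10 times" figure
```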
“The East Antarctic ice sheet – so far believed to be utterly stable – might “tip”, once a critical ice plug near the coast melts away and thereby “uncorks” the basin upstream which would lead to additional 3-4 meters of global sea-level rise.”
This pathetically uninformed view, pushed by some engineers who know very little about glaciers (e.g. Rignot et al., Joughin), treats glaciers as if they were held up by ‘dams’ at their mouths, so that removing the dam would cause the entire glacier to slide into the sea. It totally ignores the physical dynamics of glacial systems, whose termini are controlled by the equilibrium of snow accumulation and melting.
EPA Causing This Summer's Rising Electricity Rates
Due to EPA regulations, you’ll notice that your electricity bill is higher this summer. The Energy Information Administration predicted that Americans will spend 4.8% more on electricity this year than last year.
Is global warming causing air conditioners to run more? Nah, it’s just EPA regulations that are causing power plants to close because they are not deemed green enough. “Consumers are receiving the dim news as utilities take tens of thousands of megawatts of coal-generated power offline to comply with a host of EPA regulations and because of the sharp increase in cheap, domestic natural gas,” The Washington Times reports.
“Regulations such as the EPA’s mercury and air toxic standards already are having an effect on the power sector, utilities and analysts say, and the impact will be greater after the agency releases further limits on carbon emissions from power plants this summer.”
The EPA argues the effects will be minuscule: people will hardly notice the bill. It’s all part of a fight against the bogeyman of “global warming,” and old power plants must shutter to make way for politically correct plants, an expensive undertaking. But when has the EPA ever acted in the best interests of U.S. citizens?
Scientists are creating ‘low carbon cows’ to try and reduce greenhouse gases generated by herds
Scientists have created three herds of 'eco cows' as part of an experiment to reduce the amount of greenhouse gases generated by the production of beef.
The researchers have 90 cows at North Wyke farm near Okehampton in Devon for the project, where they will closely examine every aspect of the animals' environment - particularly what they eat - in an attempt to reduce the amount of harmful gases they produce by up to 50 per cent.
As well as measuring the amount of fertilizer used and rain that falls on the fields, scientists will weigh each of the animals and the size of their cowpats as part of the tests.
The scientists at Rothamsted Research said they hope to show beef can be produced with less damage to the environment and a carbon footprint smaller than that of growing cucumbers.
In addition, specially-fitted gas analysers will monitor how much methane and nitrous oxide each of the herds is producing.
One of the fields has been sown with a 'novel' grass, Professor John Crawford told The Sunday Times, because it is packed with sugar and easily digested, and so produces less methane.
A second herd will be fed plants that produce lots of protein.
'It is inefficient to grow cows using grains that humans could eat; but keeping them on grassland where crops cannot grow creates a valuable source of food', Professor Crawford said.
Cows produce a lot of methane, as well as the less harmful carbon dioxide, because of the way they process food.
Because cows digest food in their stomachs rather than in their intestines, they regurgitate their food before eating it again; bacteria in their stomachs produce methane, which is released into the air. Flatulence by cattle also contributes to the amount of gas released into the atmosphere.
A tenth of the 570 million tonnes of carbon dioxide produced by Britain every year comes from farming, with 28 million tonnes attributable to cattle and livestock.
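Those shares are easy to check; a minimal sketch, using only the figures in the paragraph above (no independent sourcing):

```python
# Sanity check of the emissions shares quoted above.
uk_total_mt = 570               # Mt CO2 per year, whole of Britain
farming_mt = uk_total_mt / 10   # "a tenth ... comes from farming"
cattle_mt = 28                  # "28 million tonnes attributable to cattle"

print(farming_mt)                               # -> 57.0
print(round(cattle_mt / farming_mt, 2))         # -> 0.49 (about half of farming's share)
print(round(100 * cattle_mt / uk_total_mt, 1))  # -> 4.9 (% of the national total)
```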
Why We Don’t Have Battery Breakthroughs
Electric cars are quick and quiet, with a range more than long enough for most commutes. If you want a car with extremely fast acceleration, the Tesla Model S is hard to beat. And, of course, electric vehicles avoid the pollution associated with conventional cars, including emissions of carbon dioxide from burning gasoline. Yet they account for a tiny fraction of automotive sales, mainly because the batteries that propel them are expensive and need to be recharged frequently.
A better battery could change everything. But while countless breakthroughs have been announced over the last decade, time and again these advances have failed to translate into commercial batteries with anything like the promised improvements in cost and energy storage. Some well-funded startups, most notably A123 Systems, began with bold claims but failed to deliver (see “What Happened to A123?”).
The Powerhouse, a new book by journalist Steve LeVine, chronicles the story behind one of the most dramatic battery announcements of recent years and explains how it came to nothing (see “The Sad Story of the Battery Breakthrough that Proved Too Good to Be True”).
The announcement was made in February 2012, at a conference in Washington, D.C., where a crowd of researchers, entrepreneurs, and investors had come to hear the likes of Bill Gates and Bill Clinton expound on the importance of new energy technology—and also to tap into one of the newest funding sources in Washington, the Advanced Research Projects Agency for Energy, or ARPA-E.
Founded in 2009, ARPA-E had been tasked with identifying potentially transformational research. The head of that agency, Arun Majumdar, was ready to unveil one of its first major successes: a battery cell, developed by the startup Envia, that could store twice as much energy as a conventional one. The cost of a battery that could take a car from Washington to New York without recharging, Majumdar said, would fall from $30,000 to $15,000. Electric cars would become far more affordable and practical (see “A Big Jump in Battery Capacity”).
Within months, GM licensed the technology and signed an agreement to support its development, gaining the right to use any resulting batteries. The deal was potentially worth hundreds of millions of dollars to Envia, LeVine writes. But soon Envia was getting frustrated messages from GM engineers who couldn’t reproduce the startup’s results. The year after the announcement, the deal was scuttled. Envia’s impressive battery had been a fluke.
LeVine’s account of Envia’s work shows why major progress in batteries is so hard to achieve and why startups that promise world-changing breakthroughs have struggled. Over the last decade we’ve seen remarkable improvements in this industry, but they’ve come largely from established companies steadily making small advances.
Envia’s cell was a new type of lithium-ion battery. Invented in the late 1970s and early 1980s and commercialized in the 1990s, these batteries generate electrical current when lithium ions shuttle between two electrodes. Light but powerful, they have transformed portable electronics. Their use in electric cars, however, is recent. In the 1990s, GM used cheaper lead-acid batteries for its electric EV-1; each battery weighed a bulky 600 kilograms and delivered only 55 to 95 miles before it needed to be recharged. When Tesla Motors introduced one of the first lithium-ion-powered electric cars in 2008, it could go 250 miles on a charge, roughly three times farther than the EV-1.
But the vehicle cost over $100,000, in large part because the batteries were so expensive. To cut costs, the lithium-ion-powered electric cars made today by companies such as Nissan and GM use small battery packs with a range of less than 100 miles.
One difficult thing about developing better batteries is that the technology is still poorly understood. Changing one part of a battery—say, by introducing a new electrode—can produce unforeseen problems, some of which can’t be detected without years of testing. To achieve the kinds of advances venture capitalists and ARPA-E look for, Envia incorporated not just one but two experimental electrode materials.
LeVine describes what went wrong. In 2006 Envia had licensed a promising material developed by researchers at Argonne National Laboratory. Subsequently, a major problem was discovered. The problem—which one battery company executive called a “doom factor”—was that over time, the voltage at which the battery operated changed in ways that made it unusable. Argonne researchers investigated the problem and found no ready answer. They didn’t understand the basic chemistry and physics of the material well enough to grasp precisely what was going wrong, let alone fix it, LeVine writes.
With its experimental material for the opposite electrode, this one based on silicon, Envia faced another challenge. Researchers had seemingly solved the major problem with silicon electrodes—their tendency to fall apart. But the solution required impractical manufacturing techniques.
When Envia made its announcement in 2012, it seemed to have figured out how to make both these experimental materials work. It developed a version of the silicon electrode that could be manufactured more cheaply.
And through trial and error it had stumbled upon a combination of coatings that stabilized the voltage of the Argonne material. Envia cofounder Sujeet Kumar “understood that the answer was a composite of coatings,” LeVine writes. “But he still didn’t know what the composite was arresting or why it succeeded in doing so.” Since Envia was a startup with limited funds, he “didn’t have the instruments that could figure it out.” But once it became obvious that the results Envia had reported for its battery couldn’t be reproduced, understanding the problem became crucial. Even tiny changes to the composition of a material can have a significant impact on performance, so for all Envia knew, its record-setting battery worked because of a contaminant in a batch of material from one of its suppliers.
The story of Envia stands in sharp contrast to what’s turned out to be the most successful recent effort to cut the price of batteries and improve their performance. This success hasn’t come from a breakthrough but from the close partnership between Tesla Motors and the major battery cell supplier Panasonic. Since 2008, the cost of Tesla’s battery packs has been cut approximately in half, while the storage capacity has increased by about 60 percent. Tesla didn’t attempt to radically change the chemistry or materials in lithium-ion batteries; rather, it made incremental engineering and manufacturing improvements. It also worked closely with Panasonic to tweak the chemistry of existing battery materials according to the precise needs of its cars.
Tesla claims that it is on track to produce a $35,000 electric car with a roughly 200-mile range by 2017—a feat that’s equivalent to what GM hoped to achieve with Envia’s new battery. The company anticipates selling hundreds of thousands of these electric cars a year, which would be a big leap from the tens of thousands it sells now. Yet for electric cars to account for a significant portion of the roughly 60 million cars sold each year around the world, batteries will probably need to get considerably better. After all, 200 miles is far short of the 350-plus miles people are used to driving on a tank of gasoline, and $35,000 is still quite a bit more than the $15,000 price of many small gas-powered cars.
How will we close the gap? There is probably still plenty of room to improve lithium-ion batteries, though it’s hard to imagine that Tesla’s success with minor changes to battery chemistry will continue indefinitely. At some point, radical changes such as the ones Envia envisioned may be needed. But the lesson from the Envia fiasco is that such changes must be closely integrated with manufacturing and engineering expertise.
That approach is already yielding promising results with the Argonne material that Envia licensed. Envia’s battery operated at high voltages to achieve high levels of energy storage. Now battery manufacturers are finding that using more modest voltage levels can significantly increase energy storage without the problems that troubled Envia. Meanwhile, battery researchers are publishing papers that show how trace amounts of additives change the behavior of the materials, making it possible to edge up the voltage and energy storage. The key is to combine research that illuminates details about the chemistry and physics of batteries with the expertise that battery manufacturers have gained in making practical products.
It’s an industry in which it’s very difficult for a startup, however enticing its technology, to go it alone. Andy Chu, a former executive at A123 Systems, which went bankrupt in 2012, recently told me why large companies dominate the battery industry. “Energy storage is a game played by big players because there are so many things that can go wrong in a battery,” he said. “I hope startups are successful. But you can look at the history over the past few years, and it’s not been good.”
Not so Green
Do as I say, not as I do?
His environmental credentials are well known – but then so is his taste for some of the finer things in life.
So Prince Charles will be braced for scrutiny when figures released this week reveal he has spent more than £1 million of public money on air travel in the past year, thanks in part to his use of a luxury French jet named Head of State.
Boasting a double bed, shower room, ‘presidential’ area and an array of plush, clubman-style recliner seats, the converted Airbus A320-232 has been hired for three of the Prince’s foreign trips. One of these alone – to America – cost almost a quarter of a million pounds.
Charles and Camilla used the lavish jet, based at Le Bourget airport outside Paris, for their tours of Mexico and Colombia in October last year, the Middle East in February and America in March.
For their four-day trip to the US, the aircraft was flown to RAF Brize Norton in Oxfordshire before travelling to Andrews Air Force Base in Maryland.
The Sovereign Grant accounts to be published on Tuesday, which calculate the Royal Family’s cost to taxpayers, will show this is the first time the Prince has broken through the £1 million barrier for flights.
It is also expected to reveal a rise in his private income from the Duchy of Cornwall.
But a Palace spokesman said the total cost of flights had increased because, unlike the previous year, Prince Charles and the Duchess of Cornwall have undertaken three transatlantic trips, as well as trips to the Middle East and to France to mark the D-Day landings.
‘They have been on official visits to Colombia and Mexico, as well as the US and a visit to Canada at the invitation of the Canadian government,’ the spokesman added.
Last year, Charles and Camilla spent £906,662 on air travel, including two Foreign and Commonwealth trips to the Middle East and India and a much criticised £246,160 flight to attend Nelson Mandela’s funeral.
That’s not to mention £96,649 on staff reconnaissance trips and £123,083 on the Royal Train.
This year’s figure is also higher because Charles is increasingly standing in for the Queen on foreign tours, as she gradually hands over her more arduous duties in what the Palace calls a ‘gentle succession’.
The Head of State Airbus is owned by Masterjet and, according to its website, a flight from London to Maryland, returning four days later, would cost £205,187. With VAT added at 20 per cent, the cost increases to £246,224.
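The VAT arithmetic checks out; a quick verification using only the quoted Masterjet figures:

```python
# Verifying the quoted charter cost: 205,187 GBP plus 20% VAT.
base_gbp = 205_187
total_gbp = base_gbp * 1.20
print(round(total_gbp))  # -> 246224, matching the figure in the text
```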
Specially adapted, the plane seats 26 instead of the usual 180, has wi-fi and satellite phones, a ten-channel TV system with 200 movies on demand, and even Nespresso coffee machines.
Sources within the Palace, however, were keen to downplay the Prince’s use of the Airbus.
‘A number of factors are taken into consideration when deciding which form of travel to use, including security, availability, punctuality and logistics,’ one said.
‘To allow Their Royal Highnesses to fulfil a busy programme and to meet the inevitable security requirements, using scheduled services is not always possible although always carefully considered.
‘In the case of all tours, after careful consideration, charter flights are often the only practical option.’
In the past, courtiers have privately admitted the Prince needed to ‘substantially curb’ his use of gas-guzzling jets. One explained it as ‘a generational thing’, admitting the younger Royals were more inclined to board scheduled flights.
The Prince’s spending on air fares is in direct contrast to his son William, who often takes scheduled flights. Last December, he travelled to New York with his wife Kate on a scheduled British Airways flight, while the pair also flew to Mustique for a family holiday in February aboard a commercial aircraft.
In May last year, William chose to fly economy class on an American Airlines flight from Memphis to Dallas as he returned home from a friend’s wedding.
A passenger tweeted: ‘I am still amazed. Prince William flies [economy]. That’s pretty humble/awesome. Who knew?’
GREENIE ROUNDUP FROM AUSTRALIA
Three current articles below
Response to BOM whitewash from good ol' boy, Ron Sandland
Remarks below by Dr Jennifer Marohasy
FOLLOWING are my initial comments in response to the release of the report by the Technical Advisory Forum on the Australian Climate Observations Reference Network (ACORN-SAT):
Dr Sandland chairs 'The Forum' that has so far refused to hold an open forum.
I NOTE that The Forum, chaired by Dr Ron Sandland formerly of the CSIRO, concurs with the Bureau that:
“There is a need to adjust the historical temperature record to account for site changes, changes in measurement practices and identifiable errors in measurement… To this end, the Forum supports the need for the Bureau’s homogenisation process to incorporate both metadata-based adjustments and adjustments based on the statistical detection of atypical observations. In the opinion of the Forum members, unsolicited submissions received from the public did not offer a justification for contesting the overall need for homogenisation or the scientific integrity of the Bureau’s climate records.”
As a member of the public who made an unsolicited submission, I would like to clarify that at no time did I suggest there was no need for adjustments. Rather, I have queried why adjustments are made even when there are no documented site changes, no changes in measurement practices, and no identifiable errors.
The Forum appears to have overlooked many examples of this provided in the public submissions, and published by The Australian newspaper late in 2014. For example, the Forum has completely ignored the notorious example of Rutherglen, where a slight cooling trend was converted into a warming trend, despite an absence of any metadata providing justification.
The Forum has also made no comment on the actual choice of stations for inclusion in ACORN-SAT, nor on how the selection of stations has changed in recent years. For example, in his submission to the panel, retired chartered accountant Merrick Thomson showed how the choice of ACORN-SAT stations changed from 2012 to 2013, and how this could generate a large increase in the calculated warming trend.
The Forum has suggested that the Bureau consider pre-1910 data in its analysis of climatic trends.
“Recommendation 5: Further, the possible availability of pre-1910 data at south-eastern sites may allow for a comparative analysis to be performed for south-eastern Australia to assess whether the inclusion of pre-1910 data is worthwhile in attempting to understand current temperature patterns.”
This is currently listed as a low priority by The Forum, but its inclusion is nevertheless welcome, and it was a key recommendation in my submission. I also recommended that all temperature series start at the same date: for example, my submission noted that the Bureau adds in the very hot town of Wilcannia only from 1957, even though data are available from the late 1800s.
I also welcome the recommendation that the Bureau:
“Address two key aspects of ACORN-SAT, namely: a) improving the clarity and accessibility of information provision—in particular, explaining the uncertainty that is inherent to both raw and homogenised datasets, and b) refining some of the Bureau’s data handling and statistical methods through appropriate statistical standardisation procedures, sensitivity analyses, and alternative data fitting approaches.”
I note that The Forum states in its report that: “It is not currently possible to determine whether the improvements recommended by the Forum will result in an increased or decreased warming trend as reflected in the ACORN-SAT dataset.”
I would suggest that if the committee’s recommendations were properly implemented, and the Bureau abandoned some of its more creative accounting practices (e.g. adding in particularly hot locations for later years in the time series), then it would become apparent that there was an overall cooling trend over much of central and eastern Australia from 1880 to 1960, more dramatic warming than previously documented from 1960 to about 2002, and a plateau more recently, with some evidence of a cooling trend establishing in north-eastern Australia since 2002.
I note The Forum intends to operate for another two years, and urge them to be honest to their title of “The Forum” and actually meet with some of those who have so far provided unsolicited public submissions. Indeed, I urge Dr Sandland to immediately set up an open and transparent Forum process whereby these submissions can be presented allowing any accusations of scientific misconduct by the Bureau to be both defended and contested before the Australian public, and media.
The committee makes five recommendations, but puts emphasis on the importance of the first two components of the first recommendation.
I applaud the first component of the first recommendation of the committee that in full states:
“Expediting the Bureau’s current work on developing uncertainty measures in closer consultation with the statistical community. The Forum recommends the Bureau seek to better understand the sources of uncertainty and to include estimates of statistical variation such as standard errors in reporting estimated and predicted outcomes, including: quantifying the uncertainty for both raw and adjusted data; prioritising the provision of explicit standard errors or confidence intervals, which should further inform the Bureau’s understanding and reporting of trends in all temperature series maintained by the Bureau; examining the robustness of analyses to spatial variation; and articulating the effect of correcting for systematic errors on the standard error of resulting estimates.”
Of course, that such basic statistical information is not currently available is impossible to reconcile with the overall conclusion in the report that, “the analyses conducted by the Bureau reflect good practice in addressing the problem of how to adjust the raw temperature series for systematic errors.” Then again, the executive summary of The Forum’s report appears to have been written by someone straight out of the BBC television series ‘Yes Minister’.
The second component of the first recommendation is also applauded, which reads in full:
“Developing a clearer articulation of the purpose for the ACORN-SAT exercise to enhance public understanding of the program, and communicating processes for developing and using ACORN-SAT in a way that is appropriately clear, broad and supported by graphics and data summaries. In particular, the central focus on the Australian annual mean temperature anomaly as the primary end point of the ACORN-SAT exercise should be reconsidered and a broader narrative around including regional effects should be developed.”
Indeed, it has become apparent over the years that the entire focus of the work of the small ACORN-SAT unit is not the provision of higher-quality individual temperature series, but the remodelling of the raw data, and the compilation of a select few stations, to suggest that it is getting hotter and hotter across the Australian landmass, with such announcements made with great fanfare by the Bureau’s David Jones at the beginning of each year.
Recommendation No. 2 has several components, including the comment that:
“Releasing the Python computer code for ACORN-SAT as a downloadable link along with all supporting documentation and listing of the technical requirements for the software. The Bureau should also monitor and gather download statistics to gauge demand for this software.”
Of course, without access to this software it has been impossible to reproduce any of the adjustments made by the Bureau; yet if the method is scientific, it should be reproducible. For many years, the Bureau has erroneously claimed its methods are transparent. It should be noted, however, that even with the provision of this software, it will be impossible to justify ACORN-SAT, because it is unclear why the Bureau chooses some stations above others for its comparisons. For example, despite endless requests for clarification, the Bureau has never explained why it uses the distant location of Hillston to make comparisons with, and then changes to, the raw temperature data for Rutherglen in north-eastern Victoria.
Recommendation 2 also includes comment that: “Publishing a brief, plain-language (as far as possible) description of the criteria for adjustment and the basis for adjustment itself.” Of course this should have been available since the very first adjustment was made in the development of ACORN-SAT. That such a document still does not exist is evidence that ACORN-SAT is poorly documented. So, how could The Forum endorse the Bureau’s claims that it represents world’s best practice?
Pope's climate adviser lambasts Australia
If you think he looks like something that has recently emerged from the anus of a zoo animal, I will not contradict you, "ad hominem" though that is. Apologies but the pompous fraud has certainly succeeded in irritating me. More temperately, exactly what qualifies a theoretical physicist to pontificate on the Australian economy? Also see above for a comment on his "science"
A leading German climate change authority and adviser to the Pope on the effects of global warming has lambasted Australia over what he perceives as its failure to address an inevitable process of de-carbonisation.
Professor Hans Schellnhuber, head of the highly-regarded Potsdam Institute for Climate Impact Research outside Berlin, told reporters Australia's reliance on coal exports to China was a "suicide strategy".
"I don't think Australia can be sustained based simply on raw materials," he says. "Just pursuing the carbon path is a red herring."
Professor Schellnhuber will be in Rome on Thursday for the release of an eagerly awaited papal encyclical on the effects of climate change.
An adviser to both the Pope and German Chancellor Angela Merkel, Prof Schellnhuber is one of Europe's leading climate change scientists in his capacity as Professor of Theoretical Physics at the University of Potsdam.
He was interviewed in his study where Albert Einstein developed his Theory of Relativity.
In good-natured remarks about the challenges facing a country like Australia, Prof Schellnhuber said it was "not responsible to run a country like a lottery".
He compared Australia unfavorably with resource-rich Norway which is being run almost completely on renewable energy [mostly hydro, which Greenies hate] and was making use of its vast sovereign wealth fund to build new and innovative industries.
Australia, he says, was excellently placed to make the most of its renewable potential in solar, wind power and other forms of renewable energy.
Asked why Germany experienced a low level of climate skepticism compared with countries like Australia and the United States, Prof Schellnhuber says "Anglo-American" societies tended to be dominated by ideas of entrepreneurship and free market impulses.
The Anglo-American world believed technology and innovation would help it to overcome its challenges. Germany, with its "different history", was more "cautious." [Germany has a cautious history? You could have fooled me!]
"Australia and Canada suffered from the curse of bounty," he says. "We will be fine forever: why should we change?"
"In the end," he adds, "it [the curse of bounty] makes you complacent. Unfortunately paradise doesn't last forever."
Africa and South America also have bountiful natural resources, so how come they are not in "paradise"? Schellnhuber hasn't even asked himself that question. His economics and sociology are on a par with his climatology
Electric cars in Australia
Tesla may have ambitious plans for battery technology for the home but it is also looking to upgrade its electric vehicle batteries, which will allow them to travel twice the distance they currently do. So what will be the implications for Australia?
While Australia has generally been an early adopter of new technology, electric vehicles pose more of a problem. Anybody who has grown up in regional Australia knows that being the family taxi at weekends for children’s sporting events can regularly mean a round trip of more than 200km.
The current range of an electric vehicle is around 160km – the Nissan Leaf quotes an even lower average of 135km – so they are still not an option as the primary vehicle for even the most die-hard regional environmentalist.
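The gap described above is easy to quantify; a small sketch using only the figures quoted in the article (the 200 km round trip is the article's illustrative regional scenario):

```python
# Quantifying the range gap described above; all figures are the article's.
round_trip_km = 200
ranges_km = {"typical EV": 160, "Nissan Leaf (quoted average)": 135}

for name, rng in ranges_km.items():
    shortfall = round_trip_km - rng
    print(f"{name}: {rng} km range, {shortfall} km short of the round trip")
```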
There has been some take-up of hybrid vehicles – and they are more suitable to Australian conditions – but what is needed for those who would love to move to a fully electric vehicle?
Electric vehicles are better suited to the major cities, where they can be used for the daily commute to work (and may provide an alternative as the second family vehicle).
But the uptake of new electric vehicles is slow according to one recent report, with limited sales in the first few months of the year, although BMW claimed the most with 70 of its i3 model. (It’s a similar story in other countries where sales are far less than predicted.)
One of the reasons for the slow take-up in Australia has been identified as a lack of infrastructure to keep electric vehicles powered, especially on the longer journeys that are typical here.
For more postings from me, see DISSECTING LEFTISM, TONGUE-TIED, EDUCATION WATCH INTERNATIONAL, POLITICAL CORRECTNESS WATCH, FOOD & HEALTH SKEPTIC and AUSTRALIAN POLITICS. Home Pages are here or here or here. Email me (John Ray) here.
Preserving the graphics: Most graphics on this site are hotlinked from elsewhere. But hotlinked graphics sometimes have only a short life -- as little as a week in some cases. After that they no longer come up. From January 2011 on, therefore, I have posted a monthly copy of everything on this blog to a separate site where I can host text and graphics together -- which should make the graphics available even if they are no longer coming up on this site. See here or here
Posted by JR at 12:35 AM