Member States of the European Union have agreed on targets aimed at reducing greenhouse gas emissions by cutting energy consumption by 20% and increasing the share of renewables in the energy mix to 20% by 2020. The ‘Europe’s Energy’ project gives users a set of visual tools to put these targets into context and to understand and compare how progress is being made towards them in different countries.
The survey asked 106 utility executives – the people who arguably know more about the energy supply and demand challenges our nation faces than anyone else – a range of questions on the smart grid, energy efficiency and related topics.
We issued a press release today with some of the highlights, but to help put this week’s news into context, we also wanted to share a full breakdown of the results. Nothing earth-shattering, but worth keeping in mind as the week progresses…
The annual smart grid event Distributech kicked off in San Diego on Tuesday morning and — as expected — unleashed a whole series of news from smart-grid-focused firms. From new home energy management products to plug-in car software to distribution automation gear, here is a list of trends and news from the show.
US venture capital (VC) investment in cleantech companies increased by 8% to $3.98 billion in 2010 from $3.7 billion in 2009, and the deal total increased by 7% to 278, according to an Ernst & Young LLP analysis based on data from Dow Jones VentureSource. VC investment in cleantech in Q4 2010 reached $979 million across 72 financing rounds, flat in terms of deals and down 14% in terms of capital invested compared with Q4 2009…
Temperatures in Tibet rose last year to the highest level since records began for the remote Himalayan region, which scientists say is particularly vulnerable to global warming, state media reported on Friday.
The Associated Press recently reported that at least 27 of 104 nuclear reactors across the United States are leaking potentially dangerous levels of tritium into the groundwater around the plants.
The scope of the problem surfaced after the recent discovery of a leak at the Vermont Yankee nuclear plant. According to the AP, new tests have shown that the levels of tritium in the wells at the Vernon, Vermont site are more than three-and-a-half times the federal safety standard.
We’ve heard this one before: Obama threatening to sever subsidies to the fossil fuel industry. He called for ending the subsidies during his SOTU, and was doing so even before that. Now he’s proposing that $36 billion worth of those subsidies for oil and $2.3 billion for coal (both get $70 billion a year in total) get stripped from the budget, which would be great. Too bad special interests will almost certainly keep this from happening.
According to NASA, 2007 was tied with 1998 as the second-warmest year in a century.
According to the NASA Goddard Institute for Space Studies (GISS), the eight warmest years in the GISS record have all occurred since 1998, and the 14 warmest years in the record have all occurred since 1990.
The article goes on to say:
The greatest warming in 2007 occurred in the Arctic, and neighboring high latitude regions. Global warming has a larger effect in polar areas, as the loss of snow and ice leads to more open water, which absorbs more sunlight and warmth. Snow and ice reflect sunlight; when they disappear, so too does their ability to deflect warming rays. The large Arctic warm anomaly of 2007 is consistent with observations of record low geographic extent of Arctic sea ice in September 2007.
“As we predicted last year, 2007 was warmer than 2006, continuing the strong warming trend of the past 30 years that has been confidently attributed to the effect of increasing human-made greenhouse gases,” said James Hansen, director of NASA GISS.
The data in the graph above are pretty conclusive. The planet’s climate is changing. Now what are we going to do about it?
The breakthrough comes from using capacitors as batteries. Until now this has not been feasible because no insulator was strong enough to make the approach compelling. However, EEStor, the company that made the breakthrough, has applied for a patent for a highly insulated capacitor.
Its patent application suggests that:
the charge storage is much higher than anything achieved in an academic lab: 52 kilowatt-hours in a 2,000 cubic inch capacitor array. A rough conversion calculation suggests that this is over 10 times the energy density of standard lead-acid batteries.
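That conversion is easy to sanity-check. A minimal sketch, assuming a typical textbook figure for the volumetric energy density of lead-acid batteries (the lead-acid number below is my assumption, not from the article):

```python
# Rough check of EEStor's claimed energy density vs. lead-acid batteries.
# Claimed: 52 kWh stored in a 2,000 cubic-inch capacitor array.
CLAIMED_KWH = 52
VOLUME_CUBIC_INCHES = 2000

LITRES_PER_CUBIC_INCH = 0.0163871
volume_litres = VOLUME_CUBIC_INCHES * LITRES_PER_CUBIC_INCH  # ~32.8 L

eestor_wh_per_litre = CLAIMED_KWH * 1000 / volume_litres  # ~1,587 Wh/L

# Assumed typical volumetric energy density of a lead-acid battery.
LEAD_ACID_WH_PER_LITRE = 80

ratio = eestor_wh_per_litre / LEAD_ACID_WH_PER_LITRE
print(f"EEStor claim: {eestor_wh_per_litre:.0f} Wh/L, ~{ratio:.0f}x lead-acid")
```

On those assumptions the claim works out to roughly 1,600 Wh/L, comfortably over ten times lead-acid by volume, so the "over 10 times" figure is at least internally consistent.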
The Ars Technica article goes on to note that:
the Associated Press is reporting that the ZENN Motor Company, which makes compact electric cars, plans to start using the capacitors before the year is out. The company has invested in EEStor, contingent on production goals being met, and so is in a position to know how realistic its claims are
If this has any basis in fact, it could have incredible consequences for reducing carbon emissions from transport, and for the environment generally, by cutting the use of the particularly nasty chemicals that currently go into batteries.
The problem with wind power is that its production is variable and difficult to predict. From the perspective of a power supply company, such a supplier is unreliable and likely to destabilise the power network.
For instance, at 2am in Ireland, when the demand for electricity is near its lowest, if a 40mph wind is blowing across the country, wind can be supplying up to 30% of the demand. However, if the wind picks up to 50mph, the wind farms shut down to protect their mechanisms and suddenly you lose 30% of your supply! The electricity supply companies have to scramble to bring power stations online to meet the sudden shortfall.
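The cliff-edge behaviour described above can be sketched as a simplified turbine power curve. The cut-in and rated speeds here are illustrative assumptions (only the 50mph cut-out comes from the example above):

```python
def wind_output_fraction(wind_mph, cut_in=8, rated=30, cut_out=50):
    """Fraction of rated output for a simplified turbine power curve.

    Below cut_in there is no generation; between cut_in and rated the
    output rises roughly with the cube of wind speed; at rated speed and
    above the turbine runs at full output, until cut_out, where it shuts
    down to protect its mechanism. Thresholds (mph) are illustrative.
    """
    if wind_mph < cut_in or wind_mph >= cut_out:
        return 0.0
    if wind_mph >= rated:
        return 1.0
    return ((wind_mph - cut_in) / (rated - cut_in)) ** 3

# A 40mph wind gives full output; at 50mph the farm drops straight to zero.
print(wind_output_fraction(40))  # 1.0
print(wind_output_fraction(50))  # 0.0
```

The discontinuity at the cut-out speed is exactly what forces the supply companies to keep backup generation ready.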
In CIX, we have come up with a strategy for data centres to act as a flywheel for electricity supply companies. This will allow the supply companies to greatly increase the amount of green energy they buy. And if the data centres are burning biodiesel then you are in a win-win situation.
It seems we are not alone in our thinking – Google, no less, has come up with a similar strategy using cars! Yes, cars. You’d think that with all their data centres they’d use them in the way we propose, but they have decided to go the ‘vehicle to grid’ route for now.
Google’s strategy is to modify hybrid cars so that they can consume power from the grid. These new ‘plug-in hybrids’ achieve 70–100mpg.
These plug-in hybrids take power from the grid overnight at times of low demand, say. Then the batteries in these cars can ‘sell’ electricity back to the grid at times of high demand.
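A back-of-the-envelope sketch of the economics per car. All the figures below (battery size, prices, round-trip efficiency) are my own illustrative assumptions, not Google's:

```python
# Simple vehicle-to-grid arbitrage estimate for one plug-in hybrid.
# All figures are illustrative assumptions.
BATTERY_KWH = 9.0           # usable capacity cycled per day
OFF_PEAK_PRICE = 0.05       # $/kWh paid to charge overnight
PEAK_PRICE = 0.20           # $/kWh earned selling back at peak demand
ROUND_TRIP_EFFICIENCY = 0.80

def daily_arbitrage_profit(kwh=BATTERY_KWH):
    cost = kwh * OFF_PEAK_PRICE
    revenue = kwh * ROUND_TRIP_EFFICIENCY * PEAK_PRICE
    return revenue - cost

profit = daily_arbitrage_profit()
print(f"${profit:.2f} per car per day")
print(f"${profit * 365:.0f} per car per year")
```

On those assumptions each car nets on the order of a dollar a day, which is why the idea needs a very large fleet before it matters to a grid operator.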
Check out the Google video on this to see what I mean:
A cute idea but one which would have to achieve massive scale before making a difference, I suspect.
I was speaking to a sales rep yesterday who was driving a company car. He told me about the Irish government’s scheme to tax people for receipt of company cars. It is called Benefit in Kind (BiK).
Basically, if your employer gives you a company car, you are liable to pay 30% of the original market value of the car in tax (the original market value includes the amount the government already collects in VRT!).
However, if you do more than 15,000 miles per annum, the amount of BiK you have to pay drops. The more mileage you do, the less BiK you have to pay (up to a ceiling at 30,000 miles).
Sounds fair, you might say. These people are using the cars the company gave them.
Possibly, until you realise that what this law does is incentivise company car owners to use their cars more, to drive to meetings (for example) where they might otherwise have taken a more carbon-friendly alternative (telecon anyone?). The rep I was talking to said he will preferentially drive anywhere to get his mileage up!
If you want to tax company cars, why not do it on the basis of their carbon footprint (or engine size, if that rating isn’t easy to come by)? Something like €500 for cars 1.6L and less; €2,500 for 1.6L to 2L; €6,000 for 2L to 3L; and €12,000 for 3L and above, index-linked.
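Those proposed bands can be written down as a simple lookup (the boundaries and amounts are exactly the ones suggested above; the index-linking is left out):

```python
def proposed_company_car_tax(engine_litres):
    """Annual tax (EUR) under the engine-size bands proposed above."""
    if engine_litres <= 1.6:
        return 500
    if engine_litres <= 2.0:
        return 2500
    if engine_litres <= 3.0:
        return 6000
    return 12000

print(proposed_company_car_tax(1.4))  # 500
print(proposed_company_car_tax(2.8))  # 6000
```

Unlike the mileage-based BiK bands, the amount never drops the more you drive, so the perverse incentive disappears.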
I see Yahoo! has announced that it is going to follow our lead in CIX *cough* and aim for carbon neutrality!
In the announcement David Filo, co-founder of Yahoo! said:
we’re going to invest in greenhouse gas reduction projects around the world to neutralize Yahoo!’s impact on the environment. While doing our homework on this, we measured our carbon footprint and discovered that Yahoo! going carbon neutral is equivalent to shutting off the electricity in all San Francisco homes for a month. Or, pulling nearly 25,000 cars off the road for a year.
While buying carbon credits isn’t the ideal way to go carbon neutral (I can think of a couple of better ways – David, come along to my talk at Barcamp Dublin on Saturday if you want to know more!), it is certainly a step in the right direction and puts a financial imperative on the company to “clean up its act”, from a carbon point of view, at least!
Kudos to Yahoo! for taking this stance and hopefully we’ll see more companies going down this route sooner than later (though I don’t see Halliburton coming on board any time soon).
The summary report is a 21-page document summarising a four-volume report yet to be released. It is the work of over 1,200 scientific authors and over 2,500 scientific reviewers from over 130 countries.
The Intergovernmental Panel on Climate Change (IPCC) was established by the World Meteorological Organization (WMO) and the United Nations Environment Programme (UNEP) in 1988 to
assess on a comprehensive, objective, open and transparent basis the scientific, technical and socio-economic information relevant to understanding the scientific basis of risk of human-induced climate change, its potential impacts and options for adaptation and mitigation. The IPCC does not carry out research nor does it monitor climate related data or other relevant parameters. It bases its assessment mainly on peer reviewed and published scientific/technical literature.
The numbers and data in the report are horrifying.
Eleven of the last twelve years (1995–2006) rank among the 12 warmest years in the instrumental record of global surface temperature
Observations since 1961 show that the average temperature of the global ocean has increased to depths of at least 3000 m and that the ocean has been absorbing more than 80% of the heat added to the climate system. Such warming causes seawater to expand, contributing to sea level rise
Global average sea level rose at an average rate of 1.8 [1.3 to 2.3] mm per year over 1961 to 2003. The rate was faster over 1993 to 2003, about 3.1 [2.4 to 3.8] mm per year.
For the next two decades a warming of about 0.2°C per decade is projected for a range of SRES emission scenarios. Even if the concentrations of all greenhouse gases and aerosols had been kept constant at year 2000 levels, a further warming of about 0.1°C per decade would be expected.
Continued greenhouse gas emissions at or above current rates would cause further warming and induce many changes in the global climate system during the 21st century that would very likely be larger than those observed during the 20th century.
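Taking the sea-level rates quoted above at face value, they compound to a noticeable rise. This is a crude linear extrapolation of the quoted rates, not an IPCC projection:

```python
# Cumulative sea-level rise implied by the IPCC rates quoted above,
# assuming each rate simply stays constant (crude linear extrapolation).
RATE_1961_2003 = 1.8   # mm per year, average over 1961-2003
RATE_1993_2003 = 3.1   # mm per year, average over 1993-2003

rise_1961_2003 = RATE_1961_2003 * (2003 - 1961)   # ~76 mm over 42 years
century_at_recent_rate = RATE_1993_2003 * 100     # ~310 mm per century

print(f"{rise_1961_2003:.0f} mm of rise over 1961-2003")
print(f"{century_at_recent_rate:.0f} mm per century at the 1993-2003 rate")
```

Even before any acceleration, the faster 1993–2003 rate alone works out to roughly 31 cm per century.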
Some of the graphs say it all:
What scared me even more was hearing one of the report’s lead authors, Dr Andrew Weaver, on NewsTalk 106 on Friday afternoon: he said that the report was conservative in many of its estimates and findings. Not good.
UPDATE: Expect to see more stories like the flooding in Jakarta as the effects of climate change become more and more pronounced.