The event is the premier annual utilities event in Europe, with 12,000 attendees and 600 exhibitors. I was honoured to be asked and, of course, accepted without hesitation.
The talk wasn’t videoed, but you can check out the slides I used above. In slides 3-29 I outline why utilities need to adopt new business models (revenues are falling due to factors like the falling cost of generation, the rising popularity of renewables, climate change, etc.). In slides 33-40 I discuss some of the evolutionary business models open to utilities, while slides 41-60 outline some of the more revolutionary opportunities, many of them enabled by the Internet of Things and utilities’ digital transformation.
With all the changes occurring, utilities need to disrupt, or they themselves will be disrupted.
The utilities industry has typically been change-averse, and often for good reasons, but with the technological advances of the past few years, the low-carbon imperative, and pressure from customers, utilities are going to have to figure out how to disrupt their business, or they will themselves be disrupted.
I gave the opening keynote at this year’s SAP for Utilities event in Huntington Beach on the topic of the Convergence of IoT and Energy (see the video above). Interestingly, with no coordination beforehand, all the main speakers referred to the turmoil coming to the utilities sector, and each independently referenced Tesla and Uber as examples of tumultuous changes happening in other industries.
What are the main challenges facing the utilities industry?
As noted here previously, due to the Swanson effect, the cost of solar is falling all the time, with no end in sight. The result will be more and more distributed generation being added to the grid, which utilities will have to manage, and, added to that, utilities will have reduced income from electricity sales as more and more people generate their own.
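To make the Swanson effect concrete, here is a minimal sketch of its usual formulation – module prices fall roughly 20% for every doubling of cumulative shipped volume. The starting price and the number of doublings are assumptions for illustration only:

```python
# Illustrative projection of solar module prices under Swanson's law:
# prices fall ~20% for every doubling of cumulative shipped volume.

def swanson_price(start_price, doublings, learning_rate=0.20):
    """Module price after `doublings` doublings of cumulative volume."""
    return start_price * (1 - learning_rate) ** doublings

price_per_watt = 1.00  # assumed starting module price, $/W
for d in range(5):
    print(f"after {d} doublings: ${swanson_price(price_per_watt, d):.2f}/W")
```

Four doublings at that learning rate cut the module price by roughly 60%, which is why “no end in sight” is a fair description for as long as shipped volumes keep doubling.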
On top of that, with the recent launch of their Powerwall product, Tesla ensured that in-home energy storage is set to become a thing.
Battery technology is advancing at a dizzying pace, and as a consequence:
1) the cost of lithium-ion batteries is dropping constantly, and
2) the energy density of the batteries is increasing all the time.
(Charts courtesy of Prof Maarten Steinbuch, Director Graduate Program Automotive Systems, Eindhoven University of Technology)
With battery prices falling, solar prices falling, and battery energy density increasing, there is a very real likelihood that many people will opt to go “off-grid” or drastically reduce their electricity needs.
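As a rough illustration of why, here is a back-of-envelope payback calculation for a home solar-plus-battery system. Every figure in it (system cost, household usage, tariff, self-supply fraction) is an assumption for the example, not data from the charts above:

```python
# Back-of-envelope payback for a home solar + battery system versus
# buying from the grid. Every number here is an assumption.

system_cost = 15000.0       # assumed installed cost of solar + battery, $
annual_usage_kwh = 6000.0   # assumed household consumption, kWh/year
grid_price = 0.25           # assumed grid tariff, $/kWh
self_supply_fraction = 0.9  # assumed share of usage the system covers

annual_saving = annual_usage_kwh * self_supply_fraction * grid_price
print(f"annual saving: ${annual_saving:.0f}")                      # $1350
print(f"simple payback: {system_cost / annual_saving:.1f} years")  # ~11 years
```

Every fall in module and battery prices shrinks the system cost input and shortens that payback, which is exactly the dynamic pushing people towards going off-grid.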
How will utility companies deal with this?
There are many possibilities but, as we have noted here previously, an increased focus by utilities on energy services seems like an obvious one. This is especially true now, given the vast quantities of data that smart meters are providing utility companies, and the fact that the Internet of Things (IoT) is ensuring that a growing number of our devices are smart and connected.
Further, with the cost of (solar) generation falling, I can foresee a time when utility companies move to the landline model. You pay a set amount per month for the connection, and your electricity is free after that. Given that, it is all the more imperative that utility companies figure out how to disrupt their own business, if only to find alternative revenue streams to ensure their survival.
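A minimal sketch of what that landline model might look like next to today’s volumetric billing; all the tariffs and usage figures here are hypothetical:

```python
# Hypothetical comparison of volumetric billing with a landline-style
# flat connection fee. All tariffs and usage figures are made up.

usage_kwh = 500.0            # assumed monthly consumption, kWh
volumetric_rate = 0.20       # assumed energy charge, $/kWh
standing_charge = 10.0       # assumed standing charge, $/month
flat_connection_fee = 95.0   # assumed landline-style fee, $/month

volumetric_bill = standing_charge + usage_kwh * volumetric_rate
print(f"volumetric bill: ${volumetric_bill:.2f}/month")  # $110.00
print(f"flat-fee bill:   ${flat_connection_fee:.2f}/month")
```

The point of the flat fee is that the utility’s revenue no longer erodes as customers generate more of their own electricity; the connection itself becomes the product.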
Mobile industry consortium GreenTouch today released tools and technologies which, they claim, have the potential to reduce the energy consumption of communication networks by 98%.
The world is now awash with mobile phones.
According to Ericsson’s June 2015 mobility report [PDF warning], the total number of mobile subscriptions globally in Q1 2015 was 7.2 billion. By 2020, that number is predicted to increase by another 2 billion, to 9.2 billion subscriptions.
Of those 7.2 billion subscriptions, around 40% are associated with smartphones, and this number is increasing daily. In fact, the report predicts that by 2016 the number of smartphone subscriptions will surpass those of basic phones, and smartphone numbers will reach 6.1 billion by 2020.
When you add to that the number of connected devices now on mobile networks (M2M, consumer electronics, laptops/tablets/wearables), we are looking at roughly 25 billion connected devices by 2020.
That’s a lot of data being moved around the networks. And, as you would expect, that volume is increasing at an enormous rate as well. There was 55% growth in data traffic between Q1 2014 and Q1 2015, and there is expected to be a 10x growth in smartphone traffic between 2014 and 2020.
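A quick sanity check shows those two figures are consistent – 10x growth over six years implies an annual rate in the same ballpark as the measured 55%:

```python
# Sanity check: what annual growth rate does 10x over six years imply?
growth_factor = 10       # 10x smartphone traffic growth, 2014-2020
years = 2020 - 2014
implied_cagr = growth_factor ** (1 / years) - 1
print(f"implied annual growth: {implied_cagr:.0%}")  # ~47% per year
```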
Fortunately, five years ago an industry organisation called GreenTouch was created by Bell Labs and other stakeholders in the space, with the objective of reducing mobile networking’s footprint. In fact, the goal of GreenTouch when it was created was to come up with technologies to reduce the energy consumption of mobile networks 1,000x by 2015.
Today, June 18th, in New York, they are announcing the results of their last five years’ work: they have come up with ways for mobile companies to reduce their consumption not by the 1,000x they were aiming for, but by 10,000x!
The consortium also announced:

“research that will enable significant improvements in other areas of communications networks, including core networks and fixed (wired) residential and enterprise networks. With these energy-efficiency improvements, the net energy consumption of communication networks could be reduced by 98%.”
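To see how a 10,000x improvement in mobile access can net out to a 98% overall figure, here is an illustrative reconciliation. Only the mobile access factor comes from the announcement; the other factors, the segment energy shares, and the traffic growth multiplier are assumptions on my part:

```python
# How a 10,000x gain in mobile access can net out to ~98% overall.
# Only the mobile access factor is from the announcement; everything
# else below is an assumption for illustration.

segments = {
    # name: (assumed share of network energy, efficiency improvement)
    "mobile access": (0.50, 10000),  # announced improvement
    "core network":  (0.25, 316),    # assumed
    "fixed access":  (0.25, 254),    # assumed
}
traffic_growth = 10  # assumed overall traffic growth over the period

relative_energy = traffic_growth * sum(s / f for s, f in segments.values())
print(f"net energy reduction: {1 - relative_energy:.0%}")  # ~98%
```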
And today GreenTouch also released two tools for organisations and stakeholders interested in creating more efficient networks: GWATT and the Flexible Power Model.
They went on to announce some of the innovations which led to this potentially huge reduction in mobile energy consumption:
Beyond Cellular Green Generation (BCG2) – This architecture uses densely deployed small cells with intelligent sleep modes and completely separates the signaling and data functions in a cellular network to dramatically improve energy efficiency over current LTE networks.
Large-Scale Antenna System (LSAS) – This system replaces today’s cellular macro base stations with a large number of physically smaller, low-power and individually-controlled antennas delivering many user-selective data beams intended to maximize the energy efficiency of the system, taking into account the RF transmit power and the power consumption required for internal electronics and signal processing.
Distributed Energy-Efficient Clouds – This architecture introduces a new analytic optimization framework to minimize the power consumption of content distribution networks (the delivery of video, photo, music and other larger files – which constitutes over 90% of the traffic on core networks) resulting in a new architecture of distributed “mini clouds” closer to the end users instead of large data centers.
Green Transmission Technologies (GTT) – This set of technologies focuses on the optimal tradeoff between spectral efficiency and energy efficiency in wireless networks, optimizing different technologies, such as single user and multi-user MIMO, coordinated multi-point transmissions and interference alignment, for energy efficiency.
Cascaded Bit Interleaving Passive Optical Networks (CBI-PON) – This advancement extends the previously announced Bit Interleaving Passive Optical Network (BiPON) technology to a Cascaded Bi-PON architecture that allows any network node in the access, edge and metro networks to efficiently process only the portion of the traffic that is relevant to that node, thereby significantly reducing the total power consumption across the entire network.
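To give a feel for the first of these, here is a toy model of the BCG2 idea – many small cells with aggressive sleep modes replacing one always-on macro base station. All the power figures and duty cycles are made-up illustrative numbers, not GreenTouch’s:

```python
# Toy model of the BCG2 idea: many small cells with aggressive sleep
# modes replacing one always-on macro base station. All power figures
# and duty cycles below are made-up illustrative numbers.

MACRO_POWER_W = 1500.0   # assumed always-on macro base station
SMALL_ACTIVE_W = 10.0    # assumed small cell power while transmitting
SMALL_SLEEP_W = 0.5      # assumed small cell power in deep sleep
N_SMALL_CELLS = 100      # assumed small cells covering the same area
ACTIVE_FRACTION = 0.10   # assumed fraction of time a cell has traffic

avg_small_w = (ACTIVE_FRACTION * SMALL_ACTIVE_W
               + (1 - ACTIVE_FRACTION) * SMALL_SLEEP_W)
total_small_w = N_SMALL_CELLS * avg_small_w
print(f"macro: {MACRO_POWER_W:.0f} W, small cells: {total_small_w:.0f} W")
print(f"improvement: {MACRO_POWER_W / total_small_w:.1f}x")
```

The gain comes almost entirely from the sleep mode: the lower the active fraction, the closer each small cell sits to its near-zero sleep power.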
Now that these innovations are released, mobile operators hoping to reduce their energy costs will be looking closely at how they can integrate these new tools and technologies into their networks. For many, realistically, the first opportunity to architect them in will be with the rollout of 5G networks post-2020.
Having met (and exceeded) its five year goal, what’s next for GreenTouch?
I put this question to GreenTouch chairman Thierry Van Landegem on the phone earlier in the week. He replied that the organisation is now looking to set a new, bold goal. They are looking at the energy efficiency of areas such as cloud, network virtualisation, and the Internet of Things, and will likely announce their next objective early next year.
So I ventured to the conference with high hopes of what I was going to learn there, and for the most part I wasn’t disappointed. IBM had some very interesting announcements, more on which later.
However, there is one area where IBM has dropped the ball badly: their cloud services division, SoftLayer.
IBM have traditionally been a model corporate citizen when it comes to reporting and transparency. They publish annual Corporate Responsibility reports with environmental, energy and emissions data going all the way back to 2002.
However, as noted here previously, when it comes to cloud computing, IBM appear to be pursuing the Amazon model of radical opaqueness. They refuse to publish any data about the energy or emissions associated with their cloud computing platform. This is a retrograde step, and one they may come to regret.
Instead of blindly copying Amazon’s strategy of non-reporting, shouldn’t IBM be embracing the approach of their new best buddies at Apple? Apple, fed up with being Greenpeace’d, and seemingly genuinely wanting to leave the world a better place, hired the former head of the EPA, Lisa Jackson, to head up its environmental initiatives, and hasn’t looked back.
This was made more stark for me because, while at InterConnect, I read IBM’s latest cloud announcement about their having spent $1.2bn in the last four months to develop 5 new SoftLayer data centres. While I was reading that, I saw Apple’s announcement that they were spending €1.7bn to develop two fully renewably powered data centres in Europe, and I realised there was no mention whatsoever of renewables anywhere in the IBM announcement.
Even better than Apple, though, is the Icelandic cloud computing company GreenQloud. GreenQloud host most of their infrastructure out of Iceland (Iceland’s electricity is generated 100% from renewable sources – 70% hydro and 30% geothermal), and the remainder out of the Digital Fortress data center in Seattle, which runs on 95% renewable energy. Better again, GreenQloud gives each customer a dashboard showing the total energy that customer has consumed and the amount of CO2 they have saved.
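The arithmetic behind such a dashboard is straightforward – the CO2 “saved” is simply the emissions the same workload would have caused on an average grid. A minimal sketch, with both emission factors assumed for illustration:

```python
# The arithmetic behind a GreenQloud-style dashboard: CO2 "saved" is
# the emissions the same workload would have caused on an average grid.
# Both emission factors below are assumptions for illustration.

energy_used_kwh = 1200.0    # customer's metered consumption
avg_grid_kg_per_kwh = 0.45  # assumed average grid carbon intensity
green_kg_per_kwh = 0.0      # hydro/geothermal treated as zero-carbon

co2_saved_kg = energy_used_kwh * (avg_grid_kg_per_kwh - green_kg_per_kwh)
print(f"energy used: {energy_used_kwh:.0f} kWh")
print(f"CO2 saved:   {co2_saved_kg:.0f} kg")
```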
This is the kind of cloud leadership you expect from a company with a long tradition of openness and the big data and analytics chops that IBM has. Now this would be A New Way to Think for IBM.
But, it’s not all bad news, as I mentioned at the outset.
As you’d expect, there was a lot of talk at InterConnect about the Internet of Things (IoT). Chris O’Connor, general manager of IBM’s new IoT division, was keen to emphasise that despite the wild hype surrounding IoT at the moment, there’s a lot of business value to be had there too. There was a lot of talk about IBM’s Predictive Maintenance and Quality solutions, for example, which are a natural outcome of IBM’s IoT initiatives. IBM has been doing IoT for years; it just hasn’t always called it that.
In fact, IoT plays right into the instrumented, interconnected and intelligent Smarter Planet mantra that IBM has been talking about for some time now, so I’m excited to see where IBM go with this.
Fun times ahead.
(Disclosure – IBM paid my travel and accommodation for me to attend InterConnect.)
I was asked to speak at the recent SAP TechEd && d-code (yes, two ampersands, that’s the branding, not a typo) on the topic of the Internet of Things and Energy.
This is a curious space because, while the Internet of Things is all the rage now in the consumer space – the New Black, as it were – it is relatively old hat in the utilities sector. Because utilities have expensive, critical infrastructure in the field (think large wind turbines, for example), they need to be able to monitor it remotely. These devices use Internet of Things technologies to report back to base, and this is quite common on the high voltage part of the electrical grid.
On the medium voltage section of the grid, Internet of Things technologies aren’t as commonly deployed currently (no pun intended), but medium voltage equipment suppliers are increasingly adding sensors to their equipment so that it too can report back. In a recent meeting at Schneider Electric’s North American headquarters, CTO Pascal Brosset announced that Schneider were able to produce a System on a Chip (SoC) for $2 and, as a consequence, were going to add one to all their equipment.
And then on the low voltage network, there are lots of innovations happening behind the smart meter. Nest thermostats, Smappee energy meters, and SmartThings energy apps are just a few of the many new IoT products released recently.
Now if only we could connect them all up, then we could have a really smart grid.
The slides for this talk are available on SlideShare.
GE’s Digital Energy business produced this infographic recently, based on the results of its Grid Resiliency Survey measuring the U.S. public’s current perception of the power grid. The survey was conducted by Harris Poll on behalf of GE from May 2-6, 2014, among 2,049 adults aged 18 and older, and from June 3-5, 2014, among 2,028 adults aged 18 and older.
Given that hurricane Sandy was still reasonably fresh in people’s minds, and that polar vortices meant early 2014 saw particularly harsh weather, it is perhaps not surprising that 41% of respondents East of the Mississippi were willing to pay $10 extra a month to ensure the grid is more reliable. Meanwhile, 34% of those living West of the Mississippi would be willing to pay more for a more reliable grid.
What is most surprising, to be honest, is that the numbers are so low – especially the 41% figure, given that energy consumers East of the Mississippi had three times as many power outages as those living West of the Mississippi.
What’s the alternative to paying more? Home generation? Solar power is dropping in price, but it is still a very long-term investment. And the cost of a decent generator can be $800 or more – and that’s just to buy it. Then there’s fuel and maintenance on top of that, as well as the inconvenience an outage brings.
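A rough annualised comparison, using the generator price above plus some assumed running costs, shows why the surcharge could well be the cheaper option:

```python
# Rough annualised comparison: a $10/month reliability surcharge versus
# owning a backup generator. Service life and running costs are assumed.

surcharge_per_year = 10.0 * 12  # $120/year extra on the bill
generator_cost = 800.0          # purchase price cited above
generator_life_years = 10       # assumed service life
fuel_and_maintenance = 100.0    # assumed $/year

generator_per_year = generator_cost / generator_life_years + fuel_and_maintenance
print(f"grid surcharge: ${surcharge_per_year:.0f}/year")
print(f"generator:      ${generator_per_year:.0f}/year")
```

On those assumptions the surcharge works out cheaper, before even counting the outage inconvenience a generator only papers over.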
Here in Europe, because most of the lines are underground, outages are very rare. The last electricity outage I remember was Dec 24th 1997, after a particularly severe storm in Ireland, for example.
The really heartening number to take away from this survey is that 81% of utility customers expect their energy company to use higher levels of renewables in the generation mix. If that expectation can be turned into reality, we’ll all be a lot better off.
Welcome to episode thirty-four of the Technology for Good hangout. In this week’s episode our guest was Salesforce SVP of Strategy John Taschek. John and I are both longtime members of the Enterprise Irregulars, but this was the first time we had had a conversation outside of email!
I had a really interesting, wide-ranging conversation with Salesforce’s VP for Strategic Research, Peter Coffee, the other day.
A lot of our conversation revolved around how recent changes in the Internet of Things space, in ubiquitous computing, and in Big Data and analytics are having profound effects on how we interact with the world.
Peter had a superb analogy – that of sound travelling through air. When sound is generated, it is transmitted from the source to the surrounding air particles, which vibrate or collide and pass the sound energy along to our ears. Without any air particles to vibrate, we wouldn’t hear the sound (hence there is no sound in space).
As you enter our planet’s atmosphere from space you start to encounter molecules of air. The more molecules there are, the better they can interact and the more likely they are to transmit sound.
If you hadn’t experienced air before, you might not be aware of the existence of sound. It is unlikely you would even predict that there would be such a thing as sound.
In a similar way, in the late eighties, when very few people had mobile phones, it would have been nigh on impossible to predict the emergence of the mobile computing platforms we’re seeing now, and the advances they’ve brought to things like health, education and access to markets (and cat videos!).
And, we are just at the beginning of another period when massive change will be enabled. This time by pervasive connectivity. And not just the universal connectivity of people which mobile phones have enabled, but the connectivity of literally everything that is being created by low-cost sensors and the Internet of Things.
We are already seeing massive data streams now coming from expensive pieces of equipment such as commercial jets, trains, and even wind turbines.
But with the drastic fall in the price of these technologies, devices such as cars, light bulbs, even toothbrushes that were never previously connected are now being instrumented and connected to the Internet.
This proliferation of (typically cloud) connected devices will allow for massive shifts in our ability to generate, analyse, and act on data sets that we just didn’t have before now.
Look at the concept of the connected home, for example. Back in 2009, when we at GreenMonk were espousing the Electricity 2.0 vision, many of the technologies to make it happen hadn’t even been invented. Now, however, not only are our devices at home increasingly becoming connected, but technology providers like Apple, Google, and Samsung are creating platforms to allow us to better manage all our connected devices. The GreenMonk Electricity 2.0 vision is now a lot closer to becoming reality.
We are also starting to see the beginnings of what will be seismic upheavals in the areas of health, education, and transportation.
No-one knows for sure what the next few years will bring, but it sure is going to be an exciting ride as we metaphorically discover sound, again and again, and again.
We at GreenMonk have been researching and writing about the smart grid space for over six years now. It has long been a sector which resisted significant change, but no more.
Several factors have come into play which have ensured that the smart grid we envisioned all those years ago is now starting to come into being. Some of those factors involve necessary practical first steps, such as the rollout of smart meters to homes; others include the huge advances in mobile, big data, and analytics technologies which have taken place in the last couple of years.
Then there’s the issue of budgets. More money is definitely starting to be freed up for smart grid investments, with revenue from asset management and condition monitoring systems for the power grid projected to grow from $2.1 billion annually in 2014 to $6.9 billion by 2023.
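For context, that projection implies roughly 14% annual growth:

```python
# Implied annual growth rate of the asset management and condition
# monitoring market projection cited above.
start_bn, end_bn = 2.1, 6.9   # $bn annually, 2014 and 2023
years = 2023 - 2014
cagr = (end_bn / start_bn) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # ~14.1% per year
```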
I attended GE’s recent Digital Energy conference in Rotterdam as a speaker, and at this event GE showcased their new PowerOn product set. This is a combined outage management system (OMS) and distribution management system (DMS) in a single modular platform. Combining OMS and DMS seems to be a new direction for the industry. It remains to be seen whether it will become the norm, but it should bring advantages in process efficiency and, consequently, in productivity.
The application uses modern screens (see above), with a more intuitive user interface and a single system database. This combining of systems into a single platform should simplify operations for system operators, leading to reduced outage times and a more reliable grid for customers. Repair crews out in the field have access to the system as well, and can update the status of any ongoing repairs. This data can be fed directly into the IVR (interactive voice response) system, so customers who are still using telephones can get the latest updates.
In time, as utilities embrace next generation customer service, this information will be fed into customers social channels of choice as well. Then we’ll really start to see the grid get smarter.