Tag: cloud computing

Digital Supply Chain, Industry 4.0, and IoT/Edge Computing – a chat with Elvira Wallis (aka @ElviraWallis)

On this second Digital Supply Chain podcast on the theme of Industry 4.0, I had a great chat with Elvira Wallis (@ElviraWallis on Twitter and Elvira Wallis on LinkedIn). Elvira is the Global Head of IoT at SAP, so obviously I was keen to find out her take on how Digital Supply Chain, IoT and Industry 4.0 intersect.

We had a great conversation covering Supply Chain, Internet of Things, Edge Computing, Cloud – their use cases, challenges and opportunities.

Read the full transcript of our conversation below, or listen to it using the player above.

Elvira Wallis [00:00:00] The Internet of Things is a key enabler for Industry 4.0, and it is required to make industrial IoT, to make Industry 4.0 possible, because you need to connect to sensors, you need to connect to autonomous systems, you need to connect to cobots, you need to connect to big data lakes and so forth.

 

Tom Raftery [00:00:21] Good morning, good afternoon or good evening, wherever you are in the world. This is the Digital Supply Chain podcast, and I’m your host, Tom Raftery. Hi, everyone, welcome to the supply chain podcast. This is another of the Industry 4.0-themed episodes of the Digital Supply Chain podcast. And my very special guest on the show today is Elvira Wallis. Elvira, would you like to introduce yourself?

 

Elvira Wallis [00:00:48] Sure Tom. Thanks for having me on the podcast. So hello, everyone. My name is Elvira Wallis and I am running Internet of Things here at SAP.

 

Tom Raftery [00:00:58] Super. Well, that’s a great role. Can you tell me, Elvira, we’re on the obviously Industry 4.0-themed podcast today, so how are we connecting Industry 4.0 and the Internet of Things? Cause, you know, a lot of people who think about Industry 4.0 might think about, maybe, you know, improvements in manufacturing and things like that. But is it just that? Is it more than that? How do you see Industry 4.0 and the connection to IoT?

 

Elvira Wallis [00:01:27] Yeah. So, let me maybe start with some, you know, regional flavour here. In Europe we often like to call things Industry 4.0. If you look into North America, the same phenomenon, namely the phenomenon of an industrial transformation using new digital technologies such as the Internet of Things or edge and cloud computing, big data lakes and so forth, is termed industrial IoT. So, dependent on the region of the world, the terms Industry 4.0 and industrial IoT are used interchangeably, referring to an industrial transformation using new digital technologies. And if you then go to Asia, it’s called ABC Country 2025 or DEF Country 2030. In other words, we’re all talking about a phenomenon of industrial transformation, which we often call Industry 4.0 in Europe. And it requires new digital technologies such as the Internet of Things, edge and cloud computing, big data lakes. So, in other words, the Internet of Things is a key enabler for Industry 4.0. And it is required to make industrial IoT, to make Industry 4.0 possible, because you need to connect to sensors, you need to connect to autonomous systems, you need to connect to cobots, you need to connect to big data lakes and so forth. So, you need an enabler. And the key here is, all of that data in and by itself is relatively uninteresting. Where SAP comes in, and that has to do with our rich history and also our hopefully very rich future, is bringing this type of data, with our technologies, into the context of business processes.

 

Tom Raftery [00:03:21] OK, OK. Now, for people who may be unfamiliar… We’re obviously not a hardware company. We’re a software company. And IoT is very much a mix of hardware and software. So, where do we fall into that kind of ecosystem?

 

Elvira Wallis [00:03:37] It’s a very, very good notion that you bring up. Clearly, Industry 4.0, as well as the Internet of Things, is not a one-person island. Whoever sets out with the idea of “me, myself, and I” shall fail miserably. It is an ecosystem play that requires the OT players, it requires the hardware players, and it clearly requires various software companies. And even in the software realm, it’s not SAP alone, it’s us and our esteemed ecosystem. Where SAP is playing is clearly solely in the realm of software, right? Not hardware. Of course, we have a lot of hardware partners that we work very closely with, so we can recommend to our customers, in specific situations, specific types of hardware.

 

Elvira Wallis [00:04:23] So we’re not ignorant, we’re just not owning that space. Yet, to your question of where we’re playing: we’re playing in two places, if we cut it very broadly. One is the cloud, where we have, of course, the applications that run in the cloud, as well as the underlying technology for the Internet of Things that works in conjunction with the applications. And the second realm where we’re playing is edge computing. The world is moving more and more towards distributed computing. And when SAP says edge computing, we’re of course again referring to software, and our software runs on various types of hardware, very close to the source of data. And as to the hardware we run on, we’re agnostic; we play with many of the key industry leaders here.

 

Tom Raftery [00:05:17] OK. OK. So, for anyone who is unfamiliar with the concept of edge computing, could you just give us a 101 on that?

 

Elvira Wallis [00:05:25] Oh, definitely. And it’s one of my favourite topics. So, let’s not start with, you know, with SAP. Let’s start with the trends in the market, right? If we put it very, very generically, then edge computing is a new form of distributed computing, meaning not all data will be processed in the cloud. Some data will be processed at the edge. So, what is the edge? Basically, edge computing means running data, applications and business processes near the source of that generated data. The source of the generated data could be a factory, a plant, a mine. It refers to the concept of running the data, the application, the business process near to the source of the data. And people might now say, oh, isn’t that very far away, and do we need to deal with that today?

 

Elvira Wallis [00:06:19] Maybe some data points, Tom. If we’re looking at edge computing, it has been growing steadily, and if you listen to the analysts, Gartner, for example, predicts that by 2025, 50% of enterprise-generated data will be created and processed outside a traditional centralized cloud data centre. Now, 50%, is that a lot or not? Well, that would be up from 10% in 2019. So that’s quite a big growth in the ability to, you know, extend and run business processes at the edge, meaning in the plant, in the factory, close to the source of data. That enables customers to automate and run their operations independently, and that’s what a lot of people want in the world of Industry 4.0, in the world of industrial IoT, in order to endorse the digital transformation. They say, hey, my plant, my factory needs to run independently of the cloud. So, in order to endorse the cloud, we see a new form of distributed computing, namely the edge. And the edge addresses customer concerns around latency. Right? Very often we hear, I need to run with low latency, low bandwidth. And then let’s not forget, in many places of the world there are specific security and regulatory requirements which say, hey, the data must be processed locally instead of in a centralized cloud. So, it can also be regulatory reasons why edge computing starts to prevail. And if you listen to some more data points, then IDC, for example, predicts that by 2023, 70 percent of IoT deployments will include edge-based decision-making, right? So, the decisions will be made decentrally, supporting the organization’s agenda. Meaning we can do industrial IoT, we can do Industry 4.0, without some central cloud-based system taking over. Local autonomy can happen if edge computing is involved. And if we look at what IDC is saying, they’re saying, OK, 70 percent of all enterprises will run varying levels of data processing at the edge.
And that also means organizations will have to spend a lot on IoT edge infrastructure in that timeframe.

 

Elvira Wallis [00:08:53] So I think edge is here to increase in prominence and in relevance for our customers, and it’s a good idea to get prepared. I mean, we at SAP are very well positioned to run data-driven business processes at the edge. We can run manufacturing processes at the edge, orchestrated from the cloud, and we provide our customers the option to run applications in a hybrid approach, meaning at the edge and in the cloud. And this hybrid cloud-edge offering helps customers accelerate the transition to the cloud by addressing their needs around data privacy, security, latency and regulatory requirements.

 

Elvira Wallis [00:09:37] Now, going back to “no person is an island”: it’s, of course, clear that in the realm of edge computing, too, we need to be committed to a strong ecosystem. No one can do it alone. You need the hardware providers, and we have announced strategic partnerships with the hyperscalers, and also in some cases regional, industry-specific players in IoT and edge, where we leverage the strengths of all the players in the ecosystem to help our customers be successful. It’s a joint digital transformation in which SAP participates together with our customers and our partners.

 

Tom Raftery [00:10:14] OK, super, super. For any of our customers, potential customers, or just anyone who’s listening who is interested in embarking on some kind of Industry 4.0 project, how do you start something like that? Where do you kick off?

 

Elvira Wallis [00:10:35] So, it’s a very good point to raise. My first perspective would be: there is no one-size-fits-all, right? Customers are, by and large, all increasingly challenged to adapt to ever-changing conditions. Now, mind you, which of these conditions is the most prevalent, and in which line of business? Is it the trade wars? Is it managing the global supply chain? Is it skills shortages? Successful customers need to embrace the digital transformation to discover new ways to solve their business problems and to keep their customers engaged, because this also has to do with customer experience and customer loyalty. Now, customers might start in different areas. They all centre on their customers. But whether they start with reinventing production to centre on their customers, or whether it is connecting various departments in their company to overcome a segregation of duties that is hindering success, that is something that will really vary from customer to customer. In other words, SAP can help make Industry 4.0 an everyday reality. Now, where customers start, whether it’s with the intelligent asset and managing overall equipment effectiveness, or whether it’s the intelligent product, where customers want to understand the business impact of design and engineering changes in products, or whether it’s the intelligent factory, where IoT helps enterprises to be agile and deal with varying production volumes and new manufacturing technologies, or whether it is with empowering people so that people can fulfil complex tasks with fast workarounds, that is really dependent on the customer need. We need to understand that it’s important to centre on the customers and connect the entire company, but it doesn’t mean you need to start everywhere at the same time with the same urgency.
Our clear perspective is that customers have a choice of where they start, and we recommend starting somewhere where, of course, there is an immediate need and the project can be time-boxed, because nothing is more convincing than initial positive results. Then you can widen the exercise.

 

Tom Raftery [00:13:02] Okay, very good. What kind of challenges are companies likely to face on a journey like this? I mean, you mentioned, you know, having skilled staff. Is it staffing, or is it technology, or is it a combination, or is it something else entirely? And, having then identified a couple of the challenges, what would be ways of overcoming them?

 

Elvira Wallis [00:13:28] It’s a very good question. And there are some interesting studies out there in the market that I enjoyed. One is by McKinsey, and that study showed clearly that the success rate of these digital transformation projects is not necessarily tied to the area within which they are started. So, you couldn’t say, oh, let’s start it in production, or let’s start around the asset, as that is more successful than production, or vice versa, right? What they showed is that it is other factors that correlate with success. In other words, the more initiatives a customer ran, so, in other words, if they addressed digital transformation in more lines of business, the more likely they were to be successful than if they were just doing what I would call an island exercise in one area. So, spreading wide clearly helps with the ROI. The other thing that some of the studies showed is that time-boxing is key, and having a line-of-business sponsor is key. In other words, it doesn’t work if you have just some little IT exercise, or if it’s just some innovation centre not connected to the line of business. So, sponsorship, time-boxing, clear KPIs as to what we want to achieve and which problem we want to solve. All that is more successful than what I would call analysis paralysis and looking for the perfect case, or what I would call the research approach, where let’s take some sensors, collect data and produce a dashboard. So, you need a clear, proven business problem to solve, a business sponsor, time-boxing, clear KPIs, and ideally more than one initiative: spreading it out, seeing which are the successful front-runners and building on those. Those are clearly some of what I would call the non-technology challenges. In a way they are common sense, learned from various studies, but also from working with our customers.

 

Tom Raftery [00:15:30] OK. OK. Very good. We’re coming up towards the end of the show, now Elvira. Is there any question that I haven’t asked you that you think I should have?

 

Elvira Wallis [00:15:44] It’s a very good question. I would say, when we look at the types of use cases, “what kind of typical use cases do we see?” is one question that I very often get asked, and I mentioned before, yes, we have the areas of intelligent asset, intelligent product, intelligent factory and empower people. Now, another dimension to look at would be: what type of goals are people pursuing? Is it about new business models? Is it about efficiency? Is it about customer experience? In other words, what type of goal do people look at? And one thing I’d point out is that we see people increasingly looking at product-as-a-service offerings. Now, that doesn’t work for all types of offerings, but we see a shift to product-as-a-service in the construction, transportation, hospitality and insurance industries. And then there is the new customer experience; in other words, “does my digital transformation help me create a better customer experience?” is clearly something we see, where people look at their customers, but also at their customers’ customers. And I would encourage people to take that line of sight, to look, in addition to the productivity gains and the overall production, really at the focus on the customers, and to put that at the forefront and the centre of a digital transformation.

 

Tom Raftery [00:17:17] Superb. Elvira, if people want to know more about Elvira, or IoT, or Industry 4.0, or any and all of the above, where would you have me direct them? And feel free to give multiple links; I’ll put them into the description of the show notes when I publish this.

 

Elvira Wallis [00:17:35] Oh, definitely join me on Twitter. Join me on LinkedIn. And of course, we have our flabbergastingly great website, SAP.com/IoT. And not to forget, we’re going to run an openSAP IoT course in the near future, and I would really appreciate you joining us in that openSAP course.

 

Tom Raftery [00:17:56] Fantastic. I’ll have links to all of those in the show notes. OK, that’s been great, Elvira. Thanks a million for joining us on the show today.

Elvira Wallis [00:18:01] Thank you, Tom. It’s always great to be one of your interviewees.

And if you want to know more about any of SAP’s Digital Supply Chain solutions, head on over to www.sap.com/digitalsupplychain and if you liked this show, please don’t forget to rate and/or review it. It makes a big difference to help new people discover it. Thanks.

The Internet of Things – trends for the telecoms, data centre, and utility industries

I gave the closing keynote at an event in Orlando last week on the topic of The Impact of the Internet of Things on Telcos, Data Centres, and Utilities.

The slides by themselves can be a little hard to grok, so I’ll go through them below. I should note at the outset that while many of my slide decks can be over 90, or even 100 slides, I kept this one to a more terse 66 😉

And so, here is my explanation of the slides

  1. Title slide
  2. A little about me
  3. The IoT section start
  4. IoT has been around for a while, but the recent explosion in interest in it is down to the massive price drops for sensors, combined with near ubiquitous connectivity – we’re heading to a world where everything is smart and connected
  5. According to the June 2016 Ericsson Mobility Report [PDF], the Internet of Things (IoT) is set to surpass mobile phones as the largest category of connected devices in 2018
  6. Depending on who you believe, Cisco reckons we will have 50bn connected devices by 2020
  7. While IDC puts the number at 212bn connected devices. Whatever the number is, it is going to mean many devices will be creating and transmitting data on the Internet
  8. What kinds of things will be connected? Well, everything from wind turbines (this is an image from GE’s website – they have a suite of IoT apps which can “improve wind turbine efficiency up to 5%” which in a large wind farm is a big deal)
  9. Rio Tinto has rolled out fully autonomous trucks at two of its mines in Australia. They developed the trucks in conjunction with Komatsu. The trucks, which are supervised from a control room 1,000km away in Perth, outperform manned trucks by 12%
  10. A nod to one of my favourite comedy movies (“See the bears game last week? Great game”), while also introducing the next three slides…
  11. Planes – according to Bill Ruh, CEO of GE Digital, GE’s jet engines produce 1TB of data per flight. With a typical plane flying 5-10 flights per day, that’s in the region of 10TB per plane per day, and there are 20,000 planes – that’s a lot of data. Plus, GE is currently analysing 50m variables from 10m sensors
  12. Trains – New York Air Brake has rolled out a sensor solution for trains, which it says is saving its customers $1bn per year
  13. And automobiles – in the 18 months since Tesla started collecting telemetry data from its customers’ cars, it has collected 780m miles of driving data. It is now collecting another 1 million miles every 10 hours. And the number of miles increases with each new Tesla sold
    And since 2009 Google has collected 1.5m miles of data. This may not sound like much in comparison, but given its data comes from Lidar, amongst other sensors, it is likely a far richer data set
  14. With the rollout of smart meters, UK utility Centrica recently announced that it will be going from 75m meter reads a year, to 120bn meter reads per annum
  15. Wearables, like the Fitbit now record our steps, our heartbeat, and even our sleep
  16. This was my heartbeat last November when I presented at the SAP TechEd event in Barcelona – notice the peak at 2:30pm when I went onstage
  17. Lots of in-home devices too, such as smoke alarms, thermostats, lightbulbs, and even security cameras and door locks are becoming smart
  18. Even toy maker Atari has announced that it is getting into the Internet of Things business
  19. Which is leading to an enormous data explosion
  20. In 2012 analyst firm IDC predicted that we will have created 40ZB of data by 2020
  21. In 2015 it updated that prediction to 75ZB
  22. Where will this data be created?
  23. Well, according to the 2016 Ericsson Mobility Report, most of the IoT devices will be in Asia Pacific, Western Europe, and North America
  24. When?
  25. That depends, different devices have different data profiles for creation and consumption of data, depending on geography, time of day, and day of year
  26. And why?
  27. Because, as Mary Meeker pointed out in her 2016 Internet Trends report, global data growth has had a +50% CAGR since 2010, while data storage infrastructure costs have had a -20% CAGR in the same timeframe
  28. In 2011 EU Commissioner Neelie Kroes famously said that Data is the new gold
  29. And if that’s true, as is the case with any gold rush, the real money is to be made supplying the prospectors
  30. Now, let’s look at some of the trends and impacts in the telecoms industry
  31. From Ericsson’s 2016 Mobility Report we can see that the big growth for the telecoms is in data traffic
  32. And not content to be merely infrastructure providers, telcos are looking to climb the value chain
  33. To facilitate this data explosion, telecom companies are building fatter pipes with LTE growing significantly in numbers between 2015 and 2021, while 2019 will see 5G kicking off
  34. Telcos are now offering cloud solutions. Their USP being that their cloud is fast, reliable, and end-to-end secure
  35. There are huge opportunities for telcos in this space
  36. In the next few slides I did a bit of a case study of AT&T, and some of the ways it is leveraging the Internet of Things. First off AT&T has partnered with solar company SunPower to connect residential solar panels for remote monitoring of the panels’ performance
  37. In its connected vehicle portfolio, AT&T manage the connections for Tesla, Audi, GM, and Uber. They have 8m connected cars at the moment, and expect to grow that to 10m by the end of 2017
  38. And, an interesting data point to back that up – in the first quarter of 2016, in the US, 32% of all new cellular connections were for cars. The largest percentage of any segment
  39. 243,000 refrigerated shipping containers connected through AT&T

  40. AT&T have a partnership with GE for intelligent lighting solutions for cities and public roadways
  41. In the equipment and heavy machinery space, nearly half of all tractors and harvesters in the US are connected through AT&T
  42. While in healthcare, AT&T predicts that wellness tracking and virtual care solutions will reach 60m homes & 74m users by 2019
  43. Then there’s outdoor advertising. AT&T knows data analysis. For years they owned the largest telemarketing organisation in the US. Now, with cellular data, they can completely transform outdoor advertising. Previously for advertising hoardings, the amount of footfall, or vehicular traffic passing a sign could be guesstimated, but no more info than that was available. But now, because AT&T knows where everyone is, their gender, age, and approximate income, they can transform this business.
    Recently they carried out a study with a customer who wanted to advertise to women in the Dallas area who earned over $75,000 per year. They queried the data and found that the customer only needed to buy two billboards in all of Dallas, to adequately cover the target demographic. Needless to say the customer was impressed
  44. Because they don’t have a monopoly on ideas, AT&T have opened up their M2X Internet of Things developer platform to allow outside developers to create solutions using AT&T’s infrastructure
  45. They’re far from being alone in this – Verizon have an Internet of Things platform as well called ThingSpace Develop
  46. While T-Mobile has announced that it is teaming up with Twilio for its Internet of Things play
  47. And it is not just cellular technologies they are using – there are also other low bandwidth radio protocols such as Lora and Sigfox which the telcos are looking at to broaden their reach
  48. I spoke to a senior exec at a telecom firm recently (who for obvious reasons preferred to remain unnamed) and he told me:
    “Telcos want to own everything, everywhere. The Internet of Things is certainly one way for them to get there”
  49. How is all this impacting the data centre industry?
  50. Well, in the next four years data centre capacity will need to increase 750% according to IDC. Also required will be significant ramp-ups in analytics, security and privacy
  51. As Jim Gray pointed out in The Fourth Paradigm (the book based on his vision of data-intensive science):

    “As datasets grow ever larger, the most efficient way to perform most of these computations is clearly to move the analysis functions as close to the data as possible”

    In other words, instead of bringing all the data back to the data centre to be processed, more and more of the analysis will need to be performed at the edge

  52. As a graduate biologist, this reminds me of the reflex arc – this arc allows reflex actions to occur relatively quickly by activating spinal motor neurons, without the delay of routing signals through the brain
  53. So there will be a greater need for event stream processing outside the data centre – this will bring about faster responsiveness, and reduce storage requirements
  54. This also explains the rise of companies such as EdgeConnex – companies who provide proximity, and lower latency
  55. And the rise of new designs of racks for hyperscale computing, such as the 150kW Vapor.io Vapor Chamber which, according to a study conducted by Romonet is $3m cheaper per MW and reclaims 25% of floor space
  56. Other initiatives in the industry include Google’s attempt to create a new standard for HDDs to make them taller, adding more platters, and thus increasing IOPS
  57. Microsoft and Facebook are getting together with Telefonica to build a 160Tbps transatlantic fibre cable (the largest to-date) to handle the vast streams of data they see coming
  58. While Intel are warning that organisations need to become more security aware, as more devices become connected
  59. I also decided to address a trend in data centres to require renewable energy from their utility providers, and did so by referencing this excellent letter from Microsoft General Counsel Brad Smith on the topic (recommended reading)
  60. Finally, what about the utilities sector…
  61. Well, there are many ways the Internet of Things will impact the utilities vertical, but one of the least obvious, yet most impactful, will be the ability to move energy demand to more closely match supply. If you’re curious about this, I’ve given 45-minute keynotes on this topic alone
  62. Another way the Internet of Things will help utilities is renewables management (such as the GE example referenced earlier), and preventative maintenance applications
  63. And finally, energy information services will be a big deal, for everything from remote monitoring for seniors, through to device maintenance, and home management
  64. The conclusions
  65. Thanks
  66. Any questions?
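Incidentally, a few of the figures in the slides lend themselves to a quick back-of-envelope sanity check. The sketch below uses only the numbers quoted in the talk (GE's per-flight data, Centrica's meter reads, Mary Meeker's CAGRs), which I have not independently verified:

```python
# Back-of-envelope checks on the data-volume figures quoted in the slides above.
# All inputs are the numbers cited in the talk, not independently verified.

# Slide 11: GE jet engines produce ~1TB of data per flight, 5-10 flights/day
per_flight_tb = 1
per_plane_per_day_tb = [per_flight_tb * flights for flights in (5, 10)]
print(per_plane_per_day_tb)  # consistent with "in the region of 10TB per plane per day"

# Slide 14: Centrica going from 75m meter reads a year to 120bn per annum
increase = 120e9 / 75e6
print(f"Meter reads increase by a factor of {increase:,.0f}")  # 1,600

# Slide 27: +50% CAGR for data creation vs -20% CAGR for storage cost, 2010-2016
years = 6
data_multiple = 1.5 ** years   # ~11.4x more data created per year
cost_multiple = 0.8 ** years   # unit storage cost down to ~26% of its 2010 level
print(f"Data up ~{data_multiple:.1f}x, storage cost down to ~{cost_multiple:.0%}")
```

The last pair of numbers is the whole story of the "data is the new gold" slide in miniature: data volumes compounding up while the cost of keeping it compounds down.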

I received extremely positive feedback on the talk from the attendees. If you have any comments/questions, feel free to leave them in the comments, email me (tom@tomraftery.com), or hit me up on Twitter, Facebook, or LinkedIn.

IBM acquires Weather.com for Cloud, AI (aaS), and IoT


IBM has announced the completion of its acquisition of The Weather Company’s B2B, mobile and cloud-based web properties: weather.com, Weather Underground, The Weather Company brand, and WSI, its global business-to-business brand.
At first blush this may not seem like an obvious pairing, but the Weather Company’s products are not just their free apps for your smartphone, they have specialised products for the media industry, the insurance industry, energy and utilities, government, and even retail. All of these verticals would be traditional IBM customers.

Then when you factor in that The Weather Company’s cloud platform takes in over 100 Gbytes per day of information from 2.2 billion weather forecast locations, and produces over 300 Gbytes of added products for its customers, it quickly becomes obvious that The Weather Company’s platform is highly optimised for Big Data and the Internet of Things.

This platform will now serve as a backbone for IBM’s Watson IoT.

Watson, you will remember, is IBM’s natural language processing and machine learning platform, which famously took on and beat two former champions on the quiz show Jeopardy. Since then, IBM have opened up APIs to Watson to allow developers to add cognitive computing features to their apps, and more recently IBM announced the Watson IoT Cloud “to extend the power of cognitive computing to the billions of connected devices, sensors and systems that comprise the IoT”.

Given Watson’s relentless moves to cloud and IoT, this acquisition starts to make a lot of sense.

IBM further announced that it will use its network of cloud data centres to expand Weather.com into five new markets including China, India, Brazil, Mexico and Japan, “with the goal of increasing its global user base by hundreds of millions over the next three years”.

With Watson’s deep learning abilities, and all that weather data, one wonders if IBM will be in a position to help scientists researching climate change. At the very least it will help the rest of us be prepared for its consequences.

New developments in AI and deep learning are being announced virtually weekly now by Microsoft, Google and Facebook, amongst others. This is a space which it is safe to say, will completely transform how we interact with computers and data.
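As an aside, the throughput figures quoted above for The Weather Company's platform imply a strikingly small per-location number. A rough sketch, using only the figures already cited (not independently verified):

```python
# Rough per-location scale of The Weather Company's daily output, using only
# the figures quoted above (~300GB of products/day, 2.2bn forecast locations).
daily_output_bytes = 300e9
forecast_locations = 2.2e9

bytes_per_location = daily_output_bytes / forecast_locations
print(f"~{bytes_per_location:.0f} bytes of forecast product per location per day")
```

Roughly 136 bytes per location per day: the value is not in any single forecast, but in the aggregate coverage, which is exactly what makes the platform interesting as an IoT backbone.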

Equinix rolls out 1MW fuel cell for Silicon Valley data center


Equinix is powering one of its Silicon Valley data centers with a 1MW Bloom Energy fuel cell

As we have pointed out here many times, the main cloud providers (particularly Amazon and IBM) are doing a very poor job either powering their data centers with renewable energy, or reporting on the emissions associated with their cloud computing infrastructure.

Given the significantly increasing use of cloud computing by larger organisations, and the growing economic costs of climate change, the sources of the electricity used by these power-hungry data centers are now more relevant than ever.

Against this background, it is impressive to see Equinix, a global provider of carrier-neutral data centers (with a fleet of over 100 data centers) and internet exchanges, announce a 1MW Bloom Energy biogas fuel cell project at its SV5 data center in Silicon Valley. Biogas is methane captured from decomposing organic matter, such as that from landfills or animal waste.

Why would Equinix do this?

Well, the first phase of California’s cap-and-trade program for CO2 emissions commenced in January 2013, and this could, in time, lead to increased costs for electricity. Indeed, in their 2014 SEC filing [PDF], Equinix note that:

The effect on the price we pay for electricity cannot yet be determined, but the increase could exceed 5% of our costs of electricity at our California locations. In 2015, a second phase of the program will begin, imposing allowance obligations upon suppliers of most forms of fossil fuels, which will increase the costs of our petroleum fuels used for transportation and emergency generators.

We do not anticipate that the climate change-related laws and regulations will force us to modify our operations to limit the emissions of GHG. We could, however, be directly subject to taxes, fees or costs, or could indirectly be required to reimburse electricity providers for such costs representing the GHG attributable to our electricity or fossil fuel consumption. These cost increases could materially increase our costs of operation or limit the availability of electricity or emergency generator fuels.

In light of this, self-generation using fuel cells looks very attractive, both from the point of view of energy cost stability, and reduced exposure to increasing carbon related costs.

On the other hand, according to today’s announcement, Equinix already gets approximately 30% of its electricity from renewable sources, and it plans to increase this to 100% “over time”.

Even better than that, Equinix is 100% renewably powered in Europe despite its growth. So Equinix is walking the walk in Europe, at least, and has a stated aim to go all the way to 100% renewable power.

What more could Equinix do?

Well, two things come to mind immediately:

  1. Set a hard target date for reaching 100% renewable power, and
  2. Start reporting all emissions to the CDP (and the SEC)

Given how important a player Equinix is in global internet infrastructure, the sooner we see them hit their 100% target, the better for all.

Lack of emissions reporting from (some) cloud providers is a supply chain risk

Pollution

We at GreenMonk spoke to Robert Francisco, President North America of FirstCarbon Solutions, last week. FirstCarbon Solutions is an environmental sustainability company and the exclusive scoring partner of the CDP’s (formerly the Carbon Disclosure Project) supply chain program.

Robert pointed out on the call that there is a sea change happening and that interest in disclosure is on the rise. He noted that carbon scores are now not only showing up at board level, but are also being reported to insurance companies, and are appearing on Bloomberg and Google Finance. He put this down to a shift away from traditional regulation-led reporting to a situation where organisations are responding to pressure from investors, as well as a requirement to manage shareholder risk.

In other words, the drivers for sustainability reporting now are the insurance companies and Wall Street. Organisations are realising that buildings collapsing in Bangladesh can have an adverse effect on their brand, and ultimately their bottom line.

So transparency in business is the new black.

Unfortunately, not everyone has received the memo.

We’ve written previously about this lack of transparency, even ranking some cloud computing providers, and about the supply chain risk that results from that lack of reporting. Amazon and SoftLayer are two prime examples of cloud computing platforms that fail to report on their emissions.

However, SoftLayer was purchased by IBM in 2013, and IBM has a reasonably good record on corporate reporting (although, as of July 2014, it has yet to publish its 2013 Corporate Responsibility report). Hopefully this means that SoftLayer will soon start publishing its energy and emissions data.

Amazon, on the other hand, has no history of any kind of environmental, energy, or emissions reporting. That lack of transparency has to be a concern for its investors, a risk for its shareholders, and a worry for its customers, who don’t know what is in their supply chain.

Image credit Roger

(Cross-posted @ GreenMonk: the blog)

Cloud computing meets supply chain transparency and risk

Supply chains? Yawn, right?

While supply chains may seem boring, they are of vital importance to organisations, and their proper management can make, or break companies.

Some recent examples of where poorly managed supply chains caused, at best, serious reputational damage for companies include Apple’s child labour and worker suicide debacle; the Tesco horse meat scandal; and Nestlé’s palm oil problems.

What does this have to do with Cloud computing?

Well, last week, here in GreenMonk we published a ranking of cloud computing companies and their use of renewables. Greenqloud, Windows Azure, Google, SAP and Rackspace all come out of it quite well.

On the other hand, IBM and Oracle didn’t fare well in the study due to their poor commitment to renewables. But, at least they are reasonably transparent about it. Both organisations produce quite detailed corporate responsibility reports, and both report their emissions to the Carbon Disclosure Project. So if you are sourcing your cloud infrastructure from Oracle or IBM, you can at least find out quite easily where the dirty energy powering your cloud is coming from.

Amazon, however, does neither. It doesn’t produce any corporate responsibility reports and it doesn’t publish its emissions to the Carbon Disclosure Project. This is particularly egregious given that Amazon is by far the largest player in this market.

Amazon’s customers are taking a leap of faith by choosing Amazon to host their cloud. They have no idea where Amazon is sourcing the power to run its servers. Amazon could easily be powering its server farms using coal mined by Massey Energy, for example; Massey Energy, as well as having an appalling environmental record, is the company responsible for the 2010 West Virginia mining disaster which killed 29 miners. Or Amazon could be using oil extracted from tar sands. Or there could be worse in Amazon’s supply chain. We just don’t know, because Amazon won’t tell us.

This has got to be worrisome for Amazon’s significant customer base which includes names like Unilever, Nokia and Adobe, amongst many others. Imagine what could happen if Greenpeace found out… oh wait.

Just a couple of weeks ago US enterprise software company Infor announced at Amazon’s Summit that it plans to build its CloudSuite offerings entirely on Amazon’s AWS. As I tweeted last week, this is a very courageous move on Infor’s part.

All the more brave given that Infor will be using Amazon to host the infrastructure of Infor’s own customer base. “Danger, Will Robinson!”

This lack of supply chain transparency is not sustainable. Amazon’s customers won’t tolerate the potential risk to their reputations, and if Amazon are unwilling to be more transparent, there are plenty of other cloud providers who are.

This post was originally published by Tom Raftery on GreenMonk.

Image credits failing_angel

Cloud computing companies ranked by their use of renewable energy


Cloud provider Renewables percentage

Cloud computing is booming. Cloud providers are investing billions in infrastructure to build out their data centers, but just how clean is cloud?

Given that this is the week that the IPCC’s 5th assessment report was released, I decided to do some research of my own into cloud providers. The table above is a list of the cloud computing providers I looked into, and what I found.

It is a real mixed bag, but from the table you can see that Icelandic cloud provider Greenqloud comes out on top, because it powers its infrastructure from Iceland’s 100% renewable electricity grid.

On the Windows Azure front, Microsoft announced in May 2012 that it was going to go carbon neutral for its facilities and travel. Microsoft are now, according to the EPA, the second largest purchaser of renewable energy in the US. In 2013 they purchased 2,300m kWh, which accounted for 80% of their electricity consumption; they made up the other 20% with Renewable Energy Certificates (RECs). And according to Microsoft’s TJ DiCaprio, they plan to increase their renewable energy purchases from 80% to 100% in the 2014 financial year.

Google claim to have been carbon neutral since 2007. Of Google’s electricity, 32% came from renewables, while the other 68% came from the purchase of RECs.

SAP purchased 391m kWh of renewable energy in 2013, making up 43% of its total electricity consumption. SAP has since announced that it will power 100% of its facilities from renewable energy in 2014.

The most recent data from IBM dates from 2012, when they purchased 764m kWh of renewable energy, accounting for just 15% of their total consumption. In the meantime IBM have purchased cloud company SoftLayer, for which no data is available, so it is unclear how this will have affected IBM’s position in these rankings.

The most up-to-date data on Oracle’s website is from 2011, but more recent data about their renewable energy is to be found in their 2012 disclosure to the Carbon Disclosure Project (registration required). This shows that Oracle purchased 5.4m kWh of renewable energy making up a mere 0.7% of their total consumption of 746.9m kWh in 2012.
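The percentages in this ranking follow directly from the purchased-kWh figures. As a quick sanity check, here is a small Python sketch using only the numbers reported above (the SAP total is an implied figure, derived from the reported 43% share, not a published number):

```python
# Sanity-check the renewable-share percentages quoted in this ranking,
# using only the figures reported in the post.

def renewable_share(purchased_m_kwh: float, total_m_kwh: float) -> float:
    """Renewable electricity as a percentage of total consumption."""
    return purchased_m_kwh / total_m_kwh * 100

# Oracle (2012): 5.4m kWh renewable out of 746.9m kWh total
oracle = renewable_share(5.4, 746.9)
print(f"Oracle: {oracle:.1f}%")  # → Oracle: 0.7%

# SAP (2013): 391m kWh renewable at a reported 43% share implies a
# total consumption of roughly 391 / 0.43 ≈ 909m kWh
sap_total = 391 / 0.43
print(f"SAP implied total: {sap_total:.0f}m kWh")  # → SAP implied total: 909m kWh
```

The same arithmetic explains why Rackspace’s refusal to give a kWh figure matters: a percentage alone, with no total, tells you nothing about the absolute scale of the dirty energy involved.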

Rackspace have no data available on their site, but in email communications with me yesterday they claim that 35% of their electricity globally is from renewable sources. They declined to say exactly how much that was (in kWh).

Amazon discloses no information whatsoever about its infrastructure apart from a claim that its Oregon and GovCloud regions are using 100% carbon free power. However, they don’t back up this claim with any evidence, they don’t disclose to the Carbon Disclosure Project, nor do they produce an annual Corporate Responsibility report.

The other three cloud providers in the list, Softlayer, GoGrid, and Bluelock have no information on their websites (that I could find), and they didn’t respond to written inquiries.

I’ll be writing a follow-up post to this in the next few days where I look into the supply chain risks of utilising cloud platforms where there is no transparency around power sourcing.

(Cross-posted @ GreenMonk: the blog)


SAP to power its cloud computing infrastructure from 100% renewable energy

Wind turbine

Cloud computing is often incorrectly touted as being a green, more environmentally-friendly computing option. This confusion occurs because people forget that while cloud computing may be more energy efficient (may be), its environmental friendliness is determined by how much carbon is produced in generating that energy. If a data centre is primarily powered by coal, it doesn’t matter how energy efficient it is, it will never be green.

We have mentioned that very often here on GreenMonk, as well as regularly bringing it up with cloud providers when talking to them.

One such cloud provider is SAP. Like most other cloud vendors, they’re constantly increasing their portfolio of cloud products. This has presented them with some challenges when it comes to their carbon footprint. In its recently released 2013 Annual Report, SAP admits:

Energy usage in our data centers contributed to 6% of our total emissions in 2013, compared with 5% in 2012

This is going in the wrong direction for a company whose stated aim is to reduce the greenhouse gas emissions from its operations to year-2000 levels by 2020.

To counter this, SAP have just announced

that it will power all its data centers and facilities globally with 100 percent renewable electricity starting in 2014

This is good for SAP, obviously, as it reduces their environmental footprint, and also good for customers of SAP’s cloud solutions, who get the benefit of SAP’s green investments. How is SAP achieving this goal of 100 per cent renewable energy for its data centers and facilities? Through a combination of generating its own electricity using solar panels in Germany and Palo Alto (<1%), purchasing renewable energy and high-quality renewable energy certificates, and a €3m investment in the Livelihoods Fund.

So, how do SAP’s green credentials stack up against some of its rivals in the cloud computing space?

Well, since yesterday’s pricing announcements from Google, they definitely have to be considered a contender in this space. And what are their green credentials like? Well, Google have been carbon neutral since 2007, and they have invested over $1bn in renewable energy projects. So Google are definitely out in front on this one.

Who else is there?

Well, Microsoft, with its recently rebranded Microsoft Azure cloud offerings, is also a contender, so how do they fare? Quite well, actually. In May 2012, Microsoft made a commitment

to make our operations carbon neutral: to achieve net zero emissions for our data centers, software development labs, offices, and employee business air travel in over 100 countries around the world.

So, having done this two years ahead of SAP, and having included employee air travel as well as facilities, you’d have to say that Microsoft come out ahead of SAP.

However, SAP does come in well ahead of other cloud companies such as IBM, who reported that renewable electricity made up a mere 15% of its consumption in 2012. IBM reported emissions of 2.2m tons of CO2 in 2012.

But, at least that’s better than Oracle. In Oracle’s 2012 report (reporting on the year 2011 – the most recent report available on their site), Oracle state that they don’t even account for their scope 3 emissions:

Scope 3 GHG emissions are typically defined as indirect emissions from operations outside the direct control of the company, such as employee commutes, business travel, and supply chain operations. Oracle does not report on Scope 3 emissions

And then there’s Amazon. Amazon doesn’t release any kind of information about the carbon footprint of its facilities. None.

So kudos to SAP for taking this step to green its cloud computing fleet. Looking at the competition, I’d have to say SAP comes in around middle-of-the-road in terms of its green cloud credentials. If it wants to improve its ranking, it may be time to revisit that 2020 goal.

(Cross-posted @ GreenMonk: the blog)


Why are Salesforce hiding the emissions of their cloud?

Salesforce incorrect carbon data
The lack of transparency from Cloud computing providers is something we have discussed many times on this blog – today we thought we’d highlight an example.

Salesforce dedicates a significant portion of its site to Sustainability and to “Using cloud computing to benefit our environment”. They even have nice calculators and graphs showing how green they are. This all sounds very promising, especially the part where they mention that you can “Reduce your IT emissions by 95%”. So where is the data to back up these claims? Unfortunately, the data is either inaccurate or missing altogether.

For example, Salesforce’s carbon calculator (screen shot above) tells us that if an organisation based in Europe moves its existing IT platform (with 10,000+ users) to the Salesforce cloud, it will reduce its carbon emissions by 87%.

This is highly suspect. Salesforce’s data centers are in the US (where over 42% of electricity generated comes from coal) and in Singapore, where all but 2.6% of electricity comes from petroleum and natural gas [PDF].

On the other hand, if an organisation’s on-premise IT platform in Europe is based in France, it is powered roughly 80% by nuclear power, which has a very low carbon footprint; if it is based in Spain, almost 40% of its power comes from renewables [PDF]. Any move from there to the Salesforce cloud will almost certainly lead to a significant increase in carbon emissions, not a reduction, and certainly not the 87% reduction Salesforce’s calculator claims above.
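A rough back-of-the-envelope check illustrates why. The grid carbon-intensity figures below are illustrative assumptions (round numbers, not official data and not Salesforce’s), but the shape of the arithmetic holds: moving a workload to a dirtier grid means the energy savings have to be enormous before emissions fall at all.

```python
# Illustrative assumption: approximate grid carbon intensities in gCO2/kWh.
# These are round numbers for the sake of the arithmetic, not official data.
I_ONPREM_FRANCE = 60   # largely nuclear, very low carbon
I_CLOUD_US = 500       # a coal-heavy US grid

# Salesforce's calculator claims an 87% emissions reduction, i.e. the
# cloud deployment would emit only 13% of the on-premise emissions.
claimed_fraction = 0.13

# emissions_cloud / emissions_onprem
#     = (energy_cloud / energy_onprem) * (I_cloud / I_onprem)
# so the energy fraction the cloud platform would need to hit the claim is:
energy_fraction = claimed_fraction * I_ONPREM_FRANCE / I_CLOUD_US
print(f"Cloud would need to use {energy_fraction:.1%} of the on-premise energy")
# → Cloud would need to use 1.6% of the on-premise energy
```

In other words, under these assumptions the cloud platform would need an energy cut of well over 98%, not 95%, before the claimed emissions figure could even be possible for a French customer.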

Salesforce incorrect carbon data

Salesforce also has a Daily Carbon Savings page. Where to start?

To begin with, the first time we took a screen shot of this page was on October 1st, for slide 26 of this slide deck. The screen shot on the right was taken this morning. As you can see, the “Daily Carbon Savings” data hasn’t updated a single day in the meantime. It is now over two months out of date. But that’s probably just a glitch, far down Salesforce’s bug list.

The bigger issue here is that Salesforce is reporting on carbon savings, not on its carbon emissions. Why? We’ve already seen (above) that their calculations around carbon savings are shaky, at best. Why are they not reporting the much more useful metric of carbon emissions? Is it because their calculations of emissions are equally shaky? Or, is it that Salesforce are ashamed of the amount of carbon they are emitting given they have sited their data centers in carbon intensive areas?

We won’t know the answer to these questions until Salesforce finally do start reporting the carbon emissions of its cloud infrastructure. In a meaningful way.

Is that likely to happen? Yes, absolutely.

When? That’s up to Salesforce. They can choose to be a leader in this space, or they can choose to continue to hide behind data obfuscation until they are forced by either regulation, or competitive pressure to publish their emissions.

If we were Salesforce, we’d be looking to lead.

Image credits Tom Raftery


(Cross-posted @ GreenMonk: the blog)

Cloud Computing: Google Apps cloud has a relatively high carbon intensity

Cloud
I have been researching and publishing on Cloud Computing for quite some time here. Specifically, I’ve been highlighting that it is not possible to know whether Cloud computing is truly sustainable, because none of the significant Cloud providers are publishing sufficient data about their energy consumption, carbon emissions and water use. It is not enough to simply state total power consumed, because different power sources can be more, or less, sustainable – a data center run primarily on renewables is far less carbon intensive than one that relies on coal-burning power stations.

At Greenmonk we believe it’s important to get behind the headline numbers to work out what’s really going on. We feel it’s unacceptable to simply state that Cloud is green and leave it at that, which is why we’ve been somewhat disappointed by recent work in the field by the Carbon Disclosure Project. We would like to see more rigour applied by CDP in its carbon analytics.

Carbon intensity should be a key measure, and we need to start buying power from the right source, not just the cheapest source.
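Carbon intensity is simply emissions per unit of electricity, and a grid’s overall intensity is the generation-weighted average of its fuel mix. A minimal sketch of that calculation (the per-fuel figures here are illustrative assumptions, not measured values):

```python
# Illustrative, assumed per-fuel intensities in gCO2/kWh - not measured values.
FUEL_INTENSITY = {"coal": 1000, "gas": 450, "nuclear": 12, "renewables": 30}

def grid_intensity(mix: dict) -> float:
    """Generation-weighted average carbon intensity of a fuel mix.

    `mix` maps fuel name -> share of generation (shares sum to 1).
    """
    return sum(share * FUEL_INTENSITY[fuel] for fuel, share in mix.items())

coal_heavy = {"coal": 0.6, "gas": 0.3, "renewables": 0.1}
mostly_renewable = {"renewables": 0.9, "gas": 0.1}

print(f"{grid_intensity(coal_heavy):.1f} gCO2/kWh")        # → 738.0 gCO2/kWh
print(f"{grid_intensity(mostly_renewable):.1f} gCO2/kWh")  # → 72.0 gCO2/kWh
```

A roughly tenfold gap between the two hypothetical grids is exactly why headline energy figures, without the source mix behind them, tell us almost nothing.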

I was pleasantly surprised, then, when I heard yesterday that Google had published a case study ostensibly proving that Cloud had reduced the carbon footprint of at least one major account.

However, it is never that straightforward, is it?

The Google announcement came in the form of a blog post titled Energy Efficiency in the Cloud, written by Google’s SVP for Technical Infrastructure, Urs Hölzle. I know Urs, I’ve met him a couple of times, he’s a good guy.

Unfortunately, in his post he heavily references the Carbon Disclosure Project’s flawed report on Cloud Computing, somewhat lessening the impact of his argument.

Urs claims that in a rollout of Google Apps for Government for the US General Services Administration,

the GSA was able to reduce server energy consumption by nearly 90% and carbon emissions by 85%.

An 85% reduction in carbon emissions sounds very impressive – but how does Google calculate that figure?