Category: Technology / Software

IBM’s InterConnect 2015, the good and the not so good

IBM InterConnect 2015

IBM invited me to attend their Cloud and Mobile Conference InterConnect 2015 last week.

Because of what IBM has done globally to help people get access to safe water, to help with solar forecasting, and to help deliver better outcomes in healthcare, for example, I tend to have a very positive attitude towards IBM.

So I ventured to the conference with high hopes of what I was going to learn there, and for the most part I wasn’t disappointed. IBM had some very interesting announcements, more on which later.

However, there is one area where IBM has dropped the ball badly – their Cloud Services Division, SoftLayer.

IBM have traditionally been a model corporate citizen when it comes to reporting and transparency. They publish annual Corporate Responsibility reports with environmental, energy and emissions data going all the way back to 2002.

However, as noted here previously, when it comes to cloud computing, IBM appear to be pursuing the Amazon model of radical opaqueness. They refuse to publish any data about the energy or emissions associated with their cloud computing platform. This is a retrograde step, and one they may come to regret.

Instead of blindly copying Amazon’s strategy of non-reporting, shouldn’t IBM be embracing the approach of their new best buddies Apple? Apple, fed up with being Greenpeace’d, and seemingly genuinely wanting to leave the world a better place, hired the former head of the EPA, Lisa Jackson, to head up its environmental initiatives, and hasn’t looked back.

Apple’s reporting on its cloud infrastructure energy and emissions, on its supply chain [PDF], and on its products’ complete life cycle analyses, is second to none.

This was made more stark for me because, while at InterConnect, I read IBM’s latest cloud announcement about their spending $1.2bn to develop five new SoftLayer data centres in the last four months. While I was reading that, I saw Apple’s announcement that they were spending €1.7bn to develop two fully renewably powered data centres in Europe, and I realised there was no mention whatsoever of renewables anywhere in the IBM announcement.

GreenQloud Dashboard

Even better than Apple, though, is the Icelandic cloud computing company GreenQloud. GreenQloud host most of their infrastructure out of Iceland (Iceland’s electricity is generated 100% from renewable sources – 70% hydro and 30% geothermal), and the remainder out of the Digital Fortress data center in Seattle, which runs on 95% renewable energy. Better again though, GreenQloud gives each customer a dashboard showing the total energy that customer has consumed and the amount of CO2 they have saved.
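For a rough sense of how a dashboard figure like that could be derived, here is a back-of-the-envelope sketch. This is not GreenQloud’s actual methodology, and the grid emissions factor is an assumed, rounded average:

```python
# Back-of-the-envelope sketch only: not GreenQloud's methodology.
# The grid emissions factor below is an assumed, rounded average.

AVERAGE_GRID_FACTOR_KG_PER_KWH = 0.5   # assumed kg CO2 per kWh for a fossil-heavy grid mix
RENEWABLE_FACTOR_KG_PER_KWH = 0.0      # treat hydro/geothermal generation as near-zero emissions

def co2_saved_kg(kwh_consumed):
    """Estimate the CO2 avoided by running a workload on renewables
    rather than on an average grid mix."""
    return kwh_consumed * (AVERAGE_GRID_FACTOR_KG_PER_KWH - RENEWABLE_FACTOR_KG_PER_KWH)

print(co2_saved_kg(1200))  # a workload consuming 1,200 kWh avoids roughly 600 kg of CO2
```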

This is the kind of cloud leadership you expect from a company with a long tradition of openness, and the big data and analytics chops that IBM has. Now this would be A New Way to Think for IBM.

But, it’s not all bad news, as I mentioned at the outset.

IBM Predictive Maintenance

As you’d expect, there was a lot of talk at InterConnect about the Internet of Things (IoT). Chris O’Connor, IBM’s general manager of IoT, in IBM’s new IoT division, was keen to emphasise that despite the wild hype surrounding IoT at the moment, there’s a lot of business value to be had there too. There was a lot of talk about IBM’s Predictive Maintenance and Quality solutions, for example, which are a natural outcome of IBM’s IoT initiatives. IBM has been doing IoT for years; it just hasn’t always called it that.

And when you combine IBM’s deep expertise in Energy and Utilities, with its knowledge of IoT, you have an opportunity to create truly Smart Grids, not to mention the opportunities around connected cities.

In fact, IoT plays right into the instrumented, interconnected and intelligent Smarter Planet mantra that IBM has been talking about for some time now, so I’m excited to see where IBM go with this.

Fun times ahead.

(Disclosure – IBM paid my travel and accommodation for me to attend InterConnect.)

Apple, cloud computing, and enterprise supply chain management

Solar power

Apple’s recent announcements around renewables and supply chain transparency put the major cloud providers to shame.

Apple had a couple of interesting announcements last week. The first was that they were investing $848m in a 130MW solar farm being built by First Solar in California. With this investment, Apple enters into a 25 year power purchase agreement with the solar farm, guaranteeing income for the solar farm and locking in Apple’s energy costs in California for the next 25 years. According to First Solar this is the largest agreement in the industry to provide clean energy to a commercial end user, and it will provide enough energy for Apple to fully power its headquarters, operations and retail stores in California with renewable energy.

For its data centres, which host Apple’s iCloud, App Store, and iTunes content, Apple uses 100% locally generated, renewable energy. Its Maiden, North Carolina data centre, for example, uses a combination of biogas fuel cells and two 20-megawatt solar arrays – the largest privately owned renewable energy installation in the US, according to Apple. And it is now investing another $55 million in a third, 100-acre 17.5MW plant for the facility. You can find details of Apple’s other data centre facilities, and how they are powered by renewables, here.

Apple's Maiden Data Center Solar Array

The second announcement from Apple was the publication of its 2015 Supplier Responsibility Progress Report (highlights here, full PDF here). Apple has been criticised in the past for workers’ rights violations in its supply chain, so it is good to see Apple taking very real, positive steps to address this. The amount of detail, the steps taken, and the levels of transparency in the report are impressive.

On underage labour, for instance, Apple’s policy requires that

any supplier found hiring underage workers fund the worker’s safe return home. Suppliers also have to fully finance the worker’s education at a school chosen by the worker and his or her family, continue to pay the worker’s wages, and offer the worker a job when he or she reaches the legal age. Of more than 1.6 million workers covered in 633 audits in 2014, 16 cases of underage labor were discovered at six facilities — and all were successfully remediated.

Apple also has strict policies around work week hours, health and safety, sourcing of conflict minerals, and the environment. In order to increase its transparency, Apple publishes its Supplier Code of Conduct, its Supplier Responsibility Standards, its Conflict Minerals Standard, as well as a list of its smelter suppliers and its top 200 suppliers, amongst other documents. And Apple’s comprehensive list of environmental reports is published here.

What does this have to do with cloud computing and enterprise supply chain management?

Well, Apple recently partnered with IBM in order to expand its userbase into the enterprise space. And it has opened its iWork office suite to anyone with an Apple ID, no Apple device required – though this was long overdue.

Comparing Apple’s cloud offerings to actual enterprise cloud players (or any cloud players, for that matter), you see there’s a yawning chasm in terms of transparency, reporting, and commitment to renewables.

Of the main enterprise cloud players:

  • Microsoft publishes its Citizenship Report here [PDF]. And while it is a decent enough report, it doesn’t go into anything like the level of detail that Apple does. On page 53 of the report, Microsoft mentions that 47% of the energy it purchases is renewable. It does purchase renewable energy certificates for the other 53%, so it can report that it is carbon neutral.
  • Google doesn’t produce a corporate sustainability report. Instead it has this page, which outlines some of the work it does in the community. Information on Google’s energy breakdown is sparse. What is published is found on the Google Green site, where we learn that although Google has many investments in renewable energy, and has been carbon neutral since 2007, its actual percentage of renewables is only 35%.
  • IBM has a good history of producing corporate reports (though it still hasn’t published its report for 2014). In the energy conservation section of IBM’s corporate report, IBM reports that 17% of its electricity came from renewable sources in 2013. However, it goes on to note that this figure does not include the energy data of SoftLayer – IBM’s cloud platform.

Cloud Providers Energy and Transparency

  • And finally, Amazon, which arguably has the largest cloud computing footprint of any of the providers, is the worst performer in terms of reporting, and likely in terms of emissions. The only page where Amazon mentions emissions claims that it has three carbon neutral regions, but fails to say how they have achieved this status (or whether they are third-party audited as such). The same page also claims that “AWS has a long-term commitment to achieve 100% renewable energy usage for our global infrastructure footprint”, but it fails to give any time frame for this commitment, or any other details on how it plans to get there.

Take into account last November’s historic deal between the US and China on carbon reductions, and the upcoming Paris Climate Change Conference in December of this year (2015), where there are very likely to be binding international agreements on carbon reductions, and it becomes clear that this will inevitably lead to increased requirements for CO2 reporting across the supply chain.

With that in mind, including the percentage of renewable energy as one of the factors when choosing a cloud provider would be a very wise move.
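For illustration only, one way to do that is to fold the renewables figure into a weighted scorecard alongside the usual selection criteria. The providers, weights, and scores in this sketch are entirely made up and do not rate any real vendor:

```python
# Toy weighted scorecard, purely illustrative: the providers, weights,
# and scores are made up and do not rate any real vendor.

WEIGHTS = {"price": 0.4, "reliability": 0.4, "renewable_pct": 0.2}

providers = {
    "provider_a": {"price": 0.9, "reliability": 0.8, "renewable_pct": 0.35},
    "provider_b": {"price": 0.8, "reliability": 0.9, "renewable_pct": 1.00},
}

def score(factors):
    """Weighted sum of normalised (0-1) factor scores."""
    return sum(WEIGHTS[name] * factors[name] for name in WEIGHTS)

for name, factors in sorted(providers.items(), key=lambda item: score(item[1]), reverse=True):
    print(f"{name}: {score(factors):.2f}")
```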

UPDATE:
As pointed out to me on Twitter:

In that case, you could always go with GreenQloud. GreenQloud bill themselves as a drop-in AWS replacement and, being based in Iceland, their electricity is 100% renewable.

(Cross-posted @ GreenMonk: the blog)

EPRI releases open, standards based software, to connect smart homes to the smart grid

Smart Appliance Screen

Automated Demand Response (ADR) is something we’ve talked about here on GreenMonk for quite a while now, and in other fora going back at least as far as 2007.

What is Automated Demand Response? Well, demand response is the process whereby electricity consumers (typically commercial) reduce their usage in response to a signal from the utility that they are in a period of peak demand. The signal often takes the form of a phone call.

Automated demand response, as you would imagine, is when this procedure is automated using software signals (often signals of price fluctuation). The development of ADR technologies received a big boost with the development of the OpenADR standard, and in 2010 the subsequent formation of the OpenADR Alliance to promote its use.
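To make the idea a little more concrete, the sketch below shows, in very simplified form, what an automated response to a price signal might look like. It is not an OpenADR implementation; the event fields, the price threshold, and the load names are all illustrative assumptions:

```python
# Simplified illustration of automated demand response logic.
# Not an OpenADR client: event fields, threshold, and load names are assumed.

HIGH_PRICE_THRESHOLD = 0.30  # $/kWh above which discretionary load is shed

def plan_response(event):
    """Decide how a site should respond to a demand response price signal.

    `event` is assumed to be a simple dict, e.g.
    {"start": "2015-07-01T14:00", "duration_minutes": 120, "price_per_kwh": 0.42}
    """
    if event["price_per_kwh"] >= HIGH_PRICE_THRESHOLD:
        # Peak pricing in effect: relax the HVAC setpoint and defer non-critical loads
        return {
            "hvac_setpoint_offset_c": 2,
            "defer_loads": ["dishwasher", "ev_charger"],
            "defer_minutes": event["duration_minutes"],
        }
    # Normal pricing: run everything as usual
    return {"hvac_setpoint_offset_c": 0, "defer_loads": [], "defer_minutes": 0}

peak_event = {"start": "2015-07-01T14:00", "duration_minutes": 120, "price_per_kwh": 0.42}
print(plan_response(peak_event))
```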

Consequently, EPRI‘s recent announcement that it has developed automated demand response software is to be welcomed.

In their announcement EPRI say the new software will:

provide a common way for devices and appliances on the electric grid to respond automatically to changes in price, weather, and demand for power, a process called automated demand response (ADR).

ADR makes it possible to translate changes in wholesale markets to corresponding changes in retail rates. It helps system operators reduce the operating costs of demand response (DR) programs while increasing its resource reliability. For customers, ADR can reduce the cost of electricity by eliminating the resources and effort required to achieve successful results from DR programs.

The EPRI ADR software was certified by the OpenADR Alliance. “Making this software freely available to the industry will accelerate the adoption of standards-based demand response” said Mark McGranaghan, vice president of Power Delivery and Utilization at EPRI.

This software has the potential to finally bring the smart grid into the home, allowing smart appliances to adjust their behaviour depending on the state of the grid. Some manufacturers have been fudging this functionality already with a combination of internet connected devices and cloud computing resources (see the Whirlpool 6th Sense device above). And others, like GE, are planning to bring older appliances into the connected fold by sending out wifi modules that add new sensor capabilities.

(Cross-posted @ GreenMonk: the blog)

The coming together of the Internet of Things and Smart Grids

I was asked to speak at the recent SAP TechEd && d-code (yes, two ampersands, that’s the branding, not a typo) on the topic of the Internet of Things and Energy.

This is a curious space because, while the Internet of Things is all the rage now in the consumer space – the New Black, as it were – it is relatively old hat in the utilities sector. Because utilities have expensive, critical infrastructure in the field (think large wind turbines, for example), they need to be able to monitor it remotely. These devices use Internet of Things technologies to report back to base, and this is quite common on the high voltage part of the electrical grid.

On the medium voltage section, Internet of Things technologies aren’t as commonly deployed currently (no pun intended), but MV equipment suppliers are increasingly adding sensors to their equipment so that it too can report back. In a recent meeting at Schneider Electric’s North American headquarters, CTO Pascal Brosset announced that Schneider were able to produce a System on a Chip (SoC) for $2, and as a consequence, Schneider were going to add one to all their equipment.

And then on the low voltage network, there are lots of innovations happening behind the smart meter. Nest thermostats, Smappee energy meters, and SmartThings energy apps are just a few of the many new IoT things being released recently.

Now if only we could connect them all up, then we could have a really smart grid.

The slides for this talk are available on SlideShare.

(Cross-posted @ GreenMonk: the blog)

GE publishes Grid Resiliency survey

GE Grid Survey Infographic

GE’s Digital Energy business produced this infographic recently, based on the results of its Grid Resiliency Survey measuring the U.S. public’s current perception of the power grid. The survey was conducted by Harris Poll on behalf of GE from May 2-6, 2014 among 2,049 adults ages 18 and older, and from June 3-5, 2014 among 2,028 adults ages 18 and older.

Given the fact that hurricane Sandy was still reasonably fresh in people’s minds, and that polar vortices meant that early 2014 saw particularly harsh weather, it is perhaps not surprising that 41% of the respondents East of the Mississippi were willing to pay $10 extra a month to ensure the grid is more reliable. A further 34% of those living West of the Mississippi would be willing to pay more for a more reliable grid.

What is most surprising is that the numbers are so low, to be honest. Especially the 41% figure, given that energy consumers East of the Mississippi had three times as many power outages as those living West of the Mississippi.

What’s the alternative to paying more? Home generation? Solar power is dropping in price, but it is still a very long term investment. And the cost of a decent generator can be $800 or more. And that’s just to buy it. Then there’s fuel and maintenance on top of that. As well as the inconvenience an outage brings.

Here in Europe, because most of the lines are underground, outages are very rare. The last electricity outage I remember was Dec 24th 1997, after a particularly severe storm in Ireland, for example.

The really heartening number to take away from this survey is that 81% of utility customers expect their energy company to use higher levels of renewables in the generation mix. If that expectation can be turned into reality, we’ll all be a lot better off.

(Cross-posted @ GreenMonk: the blog)

Technology for Good – episode thirty four with Salesforce’s John Taschek

Welcome to episode thirty four of the Technology for Good hangout. In this week’s episode our guest was Salesforce SVP of Strategy, John Taschek. John and I are both longtime members of the Enterprise Irregulars, but this was the first time John and I had had a conversation outside of email!

Some of the more fascinating stories we looked at on the show, included a very successful Kickstarter campaign for a small router which can completely anonymise your internet activity, Lockheed Martin announcing that they’ve made a breakthrough on nuclear fusion technology, and Satya Nadella’s response to his gaffe last week about women seeking a raise.

The stories we covered in this week’s show fell under the following categories: Climate, Energy, Hardware, Internet of Things, Wearables, Mobility, Comms, Privacy, Open Source, and Sustainability.

(Cross-posted @ GreenMonk: the blog)

Customer service, in-memory computing, and cloud? The utility industry is changing.

SAP For Utilities 2014 Exec Panel

I attended this year’s North American SAP for Utilities event and I was pleasantly surprised by some of the things I found there.

The utilities industry (electricity, gas, and water) is a regulated industry whose services can’t go down (or at least, shouldn’t go down). Because of this, the industry is very slow to change (the old “if it ain’t broke…” mindset). However, with technology relentlessly enabling more and more efficiencies at the infrastructure level, utilities need to learn how to be agile without affecting their service.

This is challenging, sure. But, on the other hand, organisations like Google, Facebook, and Microsoft are incredibly nimble, updating their technologies all the time, and yet they have far better uptime figures than most utilities, I suspect (when was the last time Google was down for you, versus when did your electricity last go out?).

Having said all that, at this year’s event I saw glimmers of hope.

There were a number of areas where change is being embraced:

  1. Customer Service – utility companies have traditionally not been very consumer friendly. This is the industry which refers to its customers as rate payers and end-points. However, that is starting to break down. This breakdown has been hastened in some regions by market liberalisation, and in all areas by the huge adoption of social media by utility customers.
    SAP for Utilities agenda
    Utility companies are now starting to adopt social media and utilise some of the strategies we have spoken about and written about so often here.
    What was really encouraging though, was to see that one of the four parallel tracks on the first day of the conference was dedicated to usability (which admittedly is more geared to the usability of apps for utility employees, but there’s a knock-on benefit for customers too), and even better, on the second day of the conference, one of the four parallel tracks was dedicated to customer engagement!
  2. In-memory computing – SAP has been pushing its SAP HANA in-memory computing platform to all its customers since it was announced in 2010. As mentioned previously, utility companies are slow to change, so it was interesting to listen to Snohomish County PUD CIO Benjamin Beberness, in the conference’s closing keynote, talking about his organisation’s decision to go all-in on SAP’s HANA in-memory platform. I shot an interview with Benjamin which I’ll be publishing here in the next few days where he talks about some of the advantages for Snohomish PUD of in-memory computing.
  3. Cloud computing – and finally, there was some serious talk of the move to cloud computing by utilities. In the Utility Executive Panel (pictured above), Xcel Energy‘s CIO and VP, David Harkness, said that before he retires his organisation will have closed its data center and moved its IT infrastructure entirely to the cloud. And he then added a rider that his retirement is not that far off.
    Given that this was the week after the celebrity photo leaks, there was also, understandably, some discussion about the requirement for cybersecurity, but there was broad acceptance of the inevitability of the move to cloud computing.

I have been attending (and occasionally keynoting) this SAP for Utilities event since 2008, so it has been very interesting to see these changes occurring over time. A year and a half ago I had a conversation with an SAP executive where I said it was too early to discuss cloud computing with utilities. And it was. Then. But now, cloud is seen by utilities as a logical addition to their IT roadmap. I wouldn’t have predicted that change coming about so soon.

Disclosure – SAP paid my travel and accommodation to attend the event.

(Cross-posted @ GreenMonk: the blog)

Big Data and analysis tools are facilitating huge advances in healthcare


SAP's Genomic Analyzer
As we noted recently here on GreenMonk, technology is revolutionising the healthcare industry, and the pace of change is astounding with new products and services being announced daily.

We were recently given a demonstration of two products currently being developed by SAP (Genomic Analyzer, and Medical Insights), and they are very impressive products.

The Genomic Analyzer (pictured above) can take large numbers of genomes and interrogate them for various traits. This may sound trivial, but it is a serious Big Data problem. In a talk at SAP’s Sapphirenow conference in June, Stanford’s Carlos Bustamante outlined the scale of the issue when he noted that a sample size of 2,534 genomes takes up 1.2TB of RAM and consists of over 20bn records.

The industry standard for storing genomic data is the variant call format (VCF) text file. This is then interrogated using either open source or specialised commercial software to analyse the genomic data. Researchers frequently have to write their own scripts to parse the data, and the parsing takes a considerable amount of time.
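As a rough illustration of the kind of script involved, the sketch below counts how many samples carry a variant at a given position. It is deliberately simplified; real VCFs carry far richer metadata, multi-allelic sites, and per-sample fields, and the file name and coordinates here are hypothetical:

```python
# Deliberately simplified VCF parsing sketch, for illustration only.
# Real VCFs have richer metadata, multi-allelic sites, and FORMAT fields
# that this ignores. The file name and coordinates below are hypothetical.

def count_carriers(vcf_path, chrom, pos):
    """Count samples carrying at least one alternate allele at (chrom, pos)."""
    samples = []
    with open(vcf_path) as vcf:
        for line in vcf:
            if line.startswith("##"):            # metadata lines
                continue
            fields = line.rstrip("\n").split("\t")
            if line.startswith("#CHROM"):        # header line listing sample names
                samples = fields[9:]
                continue
            if fields[0] == chrom and fields[1] == str(pos):
                carriers = 0
                for sample_field in fields[9:]:
                    gt = sample_field.split(":")[0]          # genotype, e.g. "0/1" or "1|1"
                    if "1" in gt.replace("|", "/").split("/"):
                        carriers += 1
                return carriers, len(samples)
    return 0, len(samples)

# carriers, total = count_carriers("cohort.vcf", "7", 117199644)
# print(f"{carriers} of {total} samples carry the alternate allele")
```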

SAP's Genomic Analyzer results

On the other hand, SAP’s Genomic Analyzer, because it is based on SAP’s in-memory database technology, can take record sets of 2,500 genomes in its stride, returning multi-variant results in seconds. This will allow previously impossible tests to be run on genomic datasets, which opens up the potential for disease biomarker identification, population genetics studies, and personalised medicine.

SAP are actively looking for research partners to work with them on the development of the Genomic Analyzer. Partners would typically be research institutions, and they would receive login access to the analyzer (it is cloud delivered), and the ability to create and run as many query sets as they wish.

SAP’s Medical Insights application again takes advantage of SAP’s HANA in-memory database, this time to take in the vast swathes of medical data which would typically be housed in siloed data warehouses (EMRs, scans, pathology reports, chemo info, radio info, biobank systems, and so on). It can be used to quickly identify patients suitable for drug trials, for example, or to surface new research when relevant to patients.

The Medical Insights solution is currently being developed as part of a co-innovation project with a large cancer institute in Germany, but it will ultimately be applicable to any hospital or medical institution with large disparate data banks it needs to consolidate and query.

SAP are far from alone in this field. As well as developing innovative medical applications themselves, many in their Startup Focus program are also furiously innovating in this field, as previously noted.

Outside of the SAP ecosystem, IBM’s Watson cognitive computing engine is also tackling important healthcare issues. And like SAP, IBM have turned Watson into a platform, opening it up to external developers, crowdsourcing the innovation to see what they will develop.

The main difference between IBM’s cognitive computing approach and SAP’s HANA in-memory database is that Watson analyses and interprets the results on behalf of the researchers, whereas HANA delivers just the data, leaving the evaluation in the hands of the doctors.

And news out today shows that Google is launching its Google X project, Baseline Study, so as not to be left out of the running in this space.

There’s still a lot of work to be done, but the advances these technologies are starting to unlock will change the healthcare industry irreversibly for the better.

(Cross-posted @ GreenMonk: the blog)

Lack of emissions reporting from (some) cloud providers is a supply chain risk

Pollution

We at GreenMonk spoke to Robert Francisco, President North America of FirstCarbon Solutions, last week. FirstCarbon Solutions is an environmental sustainability company and the exclusive scoring partner of CDP‘s (formerly the Carbon Disclosure Project) supply chain program.

Robert pointed out on the call that there is a sea change happening and that interest in disclosure is on the rise. He noted that carbon scores are now not only showing up at board level, but are also being reported to insurance companies, and are appearing on Bloomberg and Google Finance. He put this down to a shift away from the traditional regulation-led reporting, to a situation now where organisations are responding to pressure from investors, as well as a requirement to manage shareholder risk.

In other words the drivers for sustainability reporting now are the insurance companies, and Wall Street. Organisations are realising that buildings collapsing in Bangladesh can have an adverse effect on their brand, and ultimately their bottom line.

So transparency in business is the new black.

Unfortunately, not everyone has received the memo.

We’ve written previously about this lack of transparency, even ranking some cloud computing providers, and about the supply chain risk that results from that lack of reporting. Amazon and SoftLayer are two prime examples of cloud computing platforms that fail to report on their emissions.

However, SoftLayer was purchased by IBM in 2013, and IBM has a reasonably good record on corporate reporting (although, as of July 2014, it has yet to publish its 2013 Corporate Responsibility report). Hopefully this means that SoftLayer will soon start publishing its energy and emissions data.

Amazon, on the other hand, has no history of any kind of environmental energy or emissions reporting. That lack of transparency has to be a concern for its investors, a risk for its shareholders, and a worry for its customers who don’t know what is in their supply chain.

Image credit Roger

(Cross-posted @ GreenMonk: the blog)

Ubiquitous computing, the Internet of Things, and the discovery of sound

Sounds of East Lansing photo

I had a really interesting, wide-ranging conversation with Salesforce’s VP for Strategic Research, Peter Coffee, the other day.

A lot of our conversation revolved around how recent changes in the Internet of Things space, in ubiquitous computing, and in the Big Data and analytics area are enabling profound changes in how we interact with the world.

Peter had a superb analogy – that of sound travelling through air. When sound is generated, it is transmitted from the source to the surrounding air particles, which vibrate or collide and pass the sound energy along to our ears. Without any air particles to vibrate, we wouldn’t hear the sound (hence there is no sound in space).

As you enter our planet’s atmosphere from space you start to encounter molecules of air. The more molecules there are, the better they can interact and the more likely they are to transmit sound.

If you hadn’t experienced air before, you might not be aware of the existence of sound. It is unlikely you would even predict that there would be such a thing as sound.

In a similar way, in the late eighties, when very few people had mobile phones, it would have been nigh on impossible to predict the emergence of the mobile computing platforms we’re seeing now, and the advances they’ve brought to things like health, education and access to markets (and cat videos!).

And, we are just at the beginning of another period when massive change will be enabled, this time by pervasive connectivity. And not just the universal connectivity of people which mobile phones have enabled, but the connectivity of literally everything that is being created by low cost sensors and the Internet of Things.

We are already seeing massive data streams now coming from expensive pieces of equipment such as commercial jets, trains, and even wind turbines.

But with the drastic fall in the price of the technologies, devices such as cars, light bulbs, and even toothbrushes that were never previously connected are now being instrumented and connected to the Internet.

This proliferation of (typically cloud) connected devices will allow for massive shifts in our ability to generate, analyse, and act on data sets that we just didn’t have before now.

Take the concept of the connected home, for example. Back in 2009, when we at GreenMonk were espousing the Electricity 2.0 vision, many of the technologies to make it happen hadn’t even been invented. Now, however, not only are our devices at home increasingly becoming connected, but technology providers like Apple, Google, and Samsung are creating platforms to allow us to better manage all our connected devices. The GreenMonk Electricity 2.0 vision is now a lot closer to becoming reality.

We are also starting to see the beginnings of what will be seismic upheavals in the areas of health, education, and transportation.

No-one knows for sure what the next few years will bring, but it is sure going to be an exciting ride as we metaphorically discover sound, again and again, and again.

Photo credit Matt Katzenberger

(Cross-posted @ GreenMonk: the blog)