Tag: Big Data

The Role of AI in Making Shipping Safer, Smarter, and More Sustainable

I’m excited to share the latest episode of the Digital Supply Chain podcast with you. This week, I had the pleasure of chatting with Ami Daniel, the co-founder and CEO of Windward, a company that provides maritime data and analytics to the supply chain industry.

During the episode, Ami shared some fascinating insights into how the company’s technology is being used to help various stakeholders in the supply chain ecosystem, from regulators to shippers to freight forwarders. We discussed Windward’s journey as a company, their plans for the future, and the challenges they’ve faced along the way.

One of the main topics we explored was the importance of data in the supply chain industry. Ami explained how Windward’s data is being used to increase transparency, reduce friction, and drive efficiency in the shipping industry. We also talked about the challenges of working with data at such a large scale and how Windward is using AI and machine learning to make sense of the vast amounts of information they collect.

Another interesting area we delved into was the impact of the COVID-19 pandemic on the supply chain industry. Ami shared his perspective on how the pandemic has accelerated the adoption of technology in the industry, as well as the challenges it has posed to various stakeholders in the ecosystem.

Ami also shared how data can be used to tackle illegal fishing and labor abuse in the global shipping industry, as well as to support compliance with the Jones Act.

If you’re interested in the supply chain industry or the role of data in driving efficiency and transparency, I highly recommend you check out this episode of the Digital Supply Chain podcast. You can listen to it here or click the player above.

If you enjoy this episode, please consider following the podcast and sharing it with others who may be interested. And as always, if you find the podcast of value, and you’d like to help me continue to make episodes like this one, you can go to the podcast’s Support page and become a Digital Supply Chain podcast Supporter for less than the cost of a cup of coffee!

And if you're interested in having your brand associated with the leading Supply Chain podcast, check out these sponsorship packages to see how I can help your company gain exposure and establish itself as a thought leader in the supply chain industry, and please don't hesitate to get in touch.

Thank you!

Photo credit – Torsten Sobanski on Flickr

Ubiquitous computing, the Internet of Things, and the discovery of sound

Sounds of East Lansing photo

I had a really interesting, wide-ranging conversation with Salesforce's VP for Strategic Research, Peter Coffee, the other day.

A lot of our conversation revolved around how recent changes in the Internet of Things space, in ubiquitous computing, and in Big Data and analytics are having profound effects on how we interact with the world.

Peter had a superb analogy – that of sound travelling through air. When sound is generated, it is transmitted from the source to the surrounding air particles, which vibrate or collide and pass the sound energy along to our ears. Without any air particles to vibrate, we wouldn’t hear the sound (hence there is no sound in space).

As you enter our planet’s atmosphere from space you start to encounter molecules of air. The more molecules there are, the better they can interact and the more likely they are to transmit sound.

If you hadn’t experienced air before, you might not be aware of the existence of sound. It is unlikely you would even predict that there would be such a thing as sound.

In a similar way, in the late eighties, when very few people had mobile phones, it would have been nigh on impossible to predict the emergence of the mobile computing platforms we’re seeing now, and the advances they’ve brought to things like health, education and access to markets (and cat videos!).

And we are just at the beginning of another period in which massive change will be enabled, this time by pervasive connectivity. Not just the universal connectivity of people that mobile phones have enabled, but the connectivity of literally everything, made possible by low-cost sensors and the Internet of Things.

We are already seeing massive data streams coming from expensive pieces of equipment such as commercial jets, trains, and even wind turbines.

But with the drastic fall in the price of these technologies, devices such as cars, light bulbs, and even toothbrushes that were never previously instrumented are now being connected to the Internet.

This proliferation of (typically cloud) connected devices will allow for massive shifts in our ability to generate, analyse, and act on, data sets that we just didn’t have before now.

Take the concept of the connected home, for example. Back in 2009, when we at GreenMonk were espousing the Electricity 2.0 vision, many of the technologies needed to make it happen hadn't even been invented. Now, however, not only are our devices at home increasingly becoming connected, but technology providers like Apple, Google, and Samsung are creating platforms to let us better manage all our connected devices. The GreenMonk Electricity 2.0 vision is now a lot closer to becoming reality.

We are also starting to see the beginnings of what will be seismic upheavals in the areas of health, education, and transportation.

No-one knows for sure what the next few years will bring, but it is sure going to be an exciting ride as we metaphorically discover sound, again and again, and again.

Photo credit Matt Katzenberger

(Cross-posted @ GreenMonk: the blog)

Here comes the sun… IBM and solar forecasting


Concentrating solar power array

For decades now, electricity grids have been architected in the same way: large centralised generation facilities pumping out electricity to large numbers of distributed consumers. Generation has been controlled and predictable. This model is breaking down fast.

In the last decade we have seen a massive upsurge in the amount of renewable generation making its way onto the grid. Most of this new renewable generation is coming from wind and solar. Just last year (2013), almost a third of all newly added electricity generation in the US came from solar. That’s an unprecedented number which points to a rapid move away from the old order.

This raises big challenges for the grid operators and utilities. Now they are moving to a situation where generation is variable and not very predictable. And demand is also variable and only somewhat predictable. In a situation where supply and demand are both variable, grid stability can be an issue.

To counter this, a number of strategies are being looked at including demand response (managing the demand so it more closely mirrors the supply), storage (where excess generation is stored as heat, or potential energy, and released once generation drops and/or demand increases), and better forecasting of the generation from variable suppliers.

Some of the more successful work being done on forecasting generation from renewables is being undertaken by Dr Hendrik Hamann at IBM’s TJ Watson Research Center, in New York. Specifically Dr Hamann is looking at improving the accuracy of forecasting solar power generation. Solar is extremely complex to forecast because factors such as cloud cover, cloud opacity and wind have to be taken into account.
IBM Solar Forecaster
Dr Hamann uses a deep machine learning approach to tackle the many petabytes of data generated by satellite images, ground observations, and solar databases. The results have apparently been impressive: according to Dr Hamann, solar forecasts using this approach are 50% more accurate than those of the next best forecasting model. And the same approach can be used to predict rainfall, surface temperature, and wind. In the case of wind, the forecast accuracy is 35% better than the next best model.
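IBM hasn't published the internals of the forecaster, but to give a flavour of the general idea, here is a minimal, purely illustrative Python sketch: a regression model trained on weather-style features (cloud cover, cloud opacity, wind, sun elevation) to predict a solar farm's output. The feature set, the synthetic data, and the choice of model are my own assumptions for illustration, not IBM's method.

```python
# Purely illustrative: a toy regression model mapping weather features to
# solar output. The data is synthetic and this is NOT IBM's forecaster.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5_000

# Hypothetical hourly observations: cloud cover (%), cloud opacity (0-1),
# wind speed (m/s), sun elevation (degrees).
X = np.column_stack([
    rng.uniform(0, 100, n),   # cloud_cover_pct
    rng.uniform(0, 1, n),     # cloud_opacity
    rng.uniform(0, 20, n),    # wind_speed_ms
    rng.uniform(0, 90, n),    # sun_elevation_deg
])

# Synthetic "actual" generation (MW): driven by sun elevation, damped by cloud.
y = (np.sin(np.radians(X[:, 3])) * 100 * (1 - (X[:, 0] / 100) * X[:, 1])
     + rng.normal(0, 3, n)).clip(min=0)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=0)
model = GradientBoostingRegressor().fit(X_train, y_train)
print("MAE (MW):", round(mean_absolute_error(y_test, model.predict(X_test)), 2))
```

In a real deployment the features would come from satellite imagery and ground stations and the model would be far more sophisticated, but the workflow is the same: train on history, then predict tomorrow's output.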

This is still very much a research project so there is no timeline yet on when (or even if) this will become a product, but if it does, I can see it being an extremely valuable tool for solar farm operators (to avoid fines for over-production, for example), for utilities to plan power purchases, and for grid management companies for grid stability purposes.

The fact that it is a cloud-delivered (pun intended, sorry) solution would mean that, if IBM brings it to market, it should have a reduced cost and time to delivery, bringing it potentially within reach of smaller operators. And with the increase in the number of solar operators on the grid (140,000 individual solar installations in the U.S. in 2013), highly accurate forecasting is becoming more important by the day.

(Cross-posted @ GreenMonk: the blog)

Microsoft, big data and smarter buildings

Smarter building dashboard

If you checked out the New York Times Snow Fall site (the story of the Avalanche at Tunnel Creek), then Microsoft’s new 88 Acres site will look familiar. If you haven’t seen the Snow Fall site then go check it out, it is a beautiful and sensitive telling of a tragic story. You won’t regret the few minutes you spend viewing it.

Microsoft’s 88 Acres is an obvious homage to that site, except that it tells a good news story, thankfully, and tells it well. It is the story of how Microsoft is turning its 125-building Redmond HQ into a smart corporate campus.

Microsoft's campus had been built over several decades with little thought given to integrating the building management systems there. When Darrell Smith, Microsoft's director of facilities and energy, joined the company in 2008, he priced a 'rip and replace' option to get the disparate systems talking to each other, but when it came in at over $60m, he decided they needed to brew their own. And that's just what they did.

Using Microsoft's own software, they built a system capable of taking in the data from the more than 30,000 sensors throughout the campus and detecting and reporting on anomalies. They first piloted the solution on 13 buildings on the campus and, as they explain on the 88 Acres site:

In one building garage, exhaust fans had been mistakenly left on for a year (to the tune of $66,000 of wasted energy). Within moments of coming online, the smart buildings solution sniffed out this fault and the problem was corrected.
In another building, the software informed engineers about a pressurization issue in a chilled water system. The problem took less than five minutes to fix, resulting in $12,000 of savings each year.
Those fixes were just the beginning.

The system balances factors like the cost of a fix, the money that will be saved by the fix, and the disruption a fix will have on employees. It then prioritises the issues it finds based on these factors.
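Microsoft haven't published the exact weighting, but conceptually this boils down to a scoring function over those three factors. Here is a small, purely hypothetical Python sketch of what such a prioritisation might look like (the field names, weights, and sample faults are mine, not Microsoft's):

```python
# Hypothetical sketch of ranking building faults by expected payback and
# disruption. Field names, weights and sample faults are illustrative only.
from dataclasses import dataclass

@dataclass
class Fault:
    description: str
    fix_cost: float        # estimated cost to repair ($)
    annual_savings: float  # energy/maintenance saved once fixed ($/year)
    disruption: float      # 0 (nobody notices) to 1 (whole building offline)

def priority(fault: Fault, disruption_weight: float = 0.5) -> float:
    # Higher savings relative to cost raise the priority; disruption lowers it.
    payback_ratio = fault.annual_savings / max(fault.fix_cost, 1.0)
    return payback_ratio * (1.0 - disruption_weight * fault.disruption)

faults = [
    Fault("Garage exhaust fans left running", 500, 66_000, 0.05),
    Fault("Chilled water loop pressurisation", 200, 12_000, 0.10),
    Fault("Replace lobby air handling unit", 40_000, 8_000, 0.60),
]

# Cheap, high-saving, low-disruption fixes bubble to the top of the work queue.
for fault in sorted(faults, key=priority, reverse=True):
    print(f"{priority(fault):8.1f}  {fault.description}")
```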

Microsoft facilities engineer Jonathan Grove sums up how the new system changes his job: "I used to spend 70 percent of my time gathering and compiling data and only about 30 percent of my time doing engineering," Grove says. "Our smart buildings work serves up data for me in easily consumable formats, so now I get to spend 95 percent of my time doing engineering, which is great."

The facilities team are now dealing with enormous quantities of data. According to Microsoft, the 125 buildings contain 2,000,000 data points outputting around 500,000,000 data transactions every 24 hours. The charts, graphics, and reports the system produces lead to about 32,300 work orders being issued per quarter, and 48% of the faults found are corrected within 60 seconds. Microsoft forecasts energy savings of 6-10% per year, with an implementation payback of 18 months.

Because Microsoft's smart building tool was built using off-the-shelf Microsoft technologies, it is now being productised and will be offered for sale. It joins a slew of other smarter building software solutions currently on the market, but given that this one is built with basic Microsoft technologies, it will be interesting to see where it comes in on pricing.

One thing is for sure: given that buildings consume around 40% of our energy, any new entrant into the smarter buildings arena is to be welcomed.

Image credit nicadlr

 

(Cross-posted @ GreenMonk: the blog)

Sustainability, social media and big data

The term Big Data is becoming the buzzword du jour in IT, popping up everywhere, and with good reason – more data is being collected, curated, and analysed today than ever before.

Dick Costolo, CEO of Twitter, announced last week that Twitter is now publishing 500 million tweets per day. Not only is Twitter publishing them, it is also organising them and storing them in perpetuity. That's a lot of storage, and 500 million tweets per day (and rising) is big data, no doubt.

And Facebook similarly announced that 2.5 billion content items are shared per day on its platform, and it records 2.7 billion Likes per day. Now that’s big data.

But for really big data, it is hard to beat the fact that CERN’s Large Hadron Collider creates 1 petabyte of information every second!

And this has what to do with Sustainability, I hear you ask.

Well, it is all about the information you can extract from that data – and there are some fascinating use cases starting to emerge.

A study published in the American Journal of Tropical Medicine and Hygiene found that Twitter was as accurate as official sources in tracking the cholera epidemic in Haiti in the wake of the deadly earthquake there. The big difference between Twitter and the official sources is that Twitter was two weeks faster. There's a lot of good that can be done in crisis situations with a two-week head start.

Another fascinating use case I came across is using social media as an early predictor of faults in automobiles. A social media monitoring tool developed by Virginia Tech's Pamplin College of Business can provide car makers with an efficient way to discover and classify vehicle defects. Although still at an early stage of development, it shows promising results, and anything which can improve the safety of automobiles can have a very large impact (no pun intended!).

GE's Grid IQ Insight social media monitoring tool

GE have come up with another fascinating way to mine big data for good. Their Grid IQ Insight tool, slated for release next year, can mine social media for mentions of electrical outages. When those posts are geotagged (as many social media posts now are), utilities using Grid IQ Insight can get early notification of an outage in their area. Clusters of mentions can help with confirmation and localisation. Photos or videos of trees down, or (as in this photo) of a fire in a substation, can help the utility decide which personnel and equipment to add to the truckroll to repair the fault. Speeding up the repair process and getting customers back onto a working electricity grid can be critical in an age when so many of our devices rely on electricity to operate.
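GE haven't said how Grid IQ Insight does the clustering, but the basic idea of grouping geotagged posts is straightforward. Here is a minimal, purely illustrative Python sketch using a standard density-based clustering algorithm – the thresholds and sample data are invented, and this is not GE's algorithm:

```python
# Purely illustrative: clustering geotagged "power is out" posts to flag a
# likely outage area. This is NOT GE's Grid IQ Insight logic.
import numpy as np
from sklearn.cluster import DBSCAN

# Invented geotagged posts mentioning an outage (latitude, longitude).
posts = np.array([
    (37.7749, -122.4194),
    (37.7751, -122.4189),
    (37.7762, -122.4201),
    (37.7755, -122.4185),
    (40.7128,  -74.0060),   # a lone mention elsewhere -> treated as noise
])

EARTH_RADIUS_KM = 6371.0
eps_km = 1.0  # posts within ~1 km of each other are grouped together

db = DBSCAN(
    eps=eps_km / EARTH_RADIUS_KM,   # haversine distances are in radians
    min_samples=3,                  # need a few corroborating posts
    metric="haversine",
    algorithm="ball_tree",
).fit(np.radians(posts))

for label in set(db.labels_):
    if label == -1:
        continue  # noise: single, unconfirmed mentions
    cluster = posts[db.labels_ == label]
    centre = cluster.mean(axis=0)
    print(f"Probable outage near {centre[0]:.4f}, {centre[1]:.4f} "
          f"({len(cluster)} corroborating posts)")
```

A cluster of several corroborating posts within a kilometre or so is a much stronger outage signal than a single isolated mention, which gets treated as noise.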

Finally, many companies are now using products like Radian6 (now re-branded as Salesforce Marketing Cloud) to actively monitor social media for mentions of their brand, so they can respond in a timely manner. Gatorade in the video above is one good example. So too are Dell. Dell have a Social Media Listening Command Centre which is staffed by 70 employees who listen for and respond to mentions of Dell products 24 hours a day in 11 languages (English, plus Japanese, Chinese, Portuguese, Spanish, French, German, Norwegian, Danish, Swedish, and Korean). The sustainability angle of this story is that Dell took their learnings from setting up this command centre and used them to help the American Red Cross set up a similar command centre. Dell also contributed funding and equipment to help get this off the ground.

No doubt the Command Centre is proving itself invaluable to the American Red Cross this week mining big data to help people in need in the aftermath of Hurricane Sandy.

(Cross-posted @ GreenMonk: the blog)

SAP’s Sustainability announcements at Sapphire Now


SAP co-CEO Jim Hagemann Snabe at Sapphire Now 2012

Technology innovation plays a major part in creating a sustainable world tomorrow

So said SAP co-CEO Jim Hagemann Snabe at this year's SAP Sapphire Now conference in Orlando. He then went on to predict three major trends in computing for the coming years – according to Jim, in the next five years everything will move to the cloud, everything will be in main memory, and everything will be mobile.

This wasn’t just some off-the-cuff remark – these three developments are core to SAP’s product roadmap – even in the Sustainability space.

In the mobile space, for example, SAP announced at Sapphire Now a new version of a mobile app for incident management. With this app, workers can now log issues from their mobile device with a photo or video, as well as an audio recording, and send them directly to an incident or safety manager for corrective action. This crowd-sourcing of safety information also has built-in tracking of the reported incident, which is hugely empowering for workers who may previously have felt their voice wasn't heard. And for the companies deploying this solution, it leads to a safer work environment and a happier workforce.
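SAP haven't published the app's data model, but conceptually each report boils down to a small structured record with attachments and a status trail. A purely hypothetical Python sketch (the field names and workflow states are mine, not SAP's):

```python
# Hypothetical sketch of a crowd-sourced incident report with status tracking.
# Field names and workflow states are illustrative, not SAP's actual schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IncidentReport:
    reporter_id: str
    location: str
    description: str
    attachments: list[str] = field(default_factory=list)  # photo/video/audio
    status: str = "submitted"  # submitted -> assigned -> resolved
    history: list[tuple[datetime, str]] = field(default_factory=list)

    def update_status(self, new_status: str) -> None:
        # Record every state change so the reporting worker can track progress.
        self.status = new_status
        self.history.append((datetime.now(timezone.utc), new_status))

report = IncidentReport(
    reporter_id="worker-1138",
    location="Plant 3, loading bay",
    description="Hydraulic fluid leaking near the forklift charging point",
    attachments=["photo_leak.jpg", "voice_note.m4a"],
)
report.update_status("assigned")   # safety manager picks it up
report.update_status("resolved")   # corrective action completed
print(report.status, "-", len(report.history), "status changes recorded")
```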

This puts me in mind of an initiative IBM rolled out with the Los Angeles Unified School District (LAUSD) where they enabled students, teachers and staff to report issues like water leaks, broken aircon/heating, exposed cables and so on, by sending text messages and photos through their mobile phones. More please.

Also in the mobile sustainability space, SAP have their Electronic Medical Record app [Silverlight warning] – an app which gives doctors instant access to a patient's electronic medical records.

In the Cloud space, SAP have made two major recent acquisitions, SuccessFactors and, more recently, Ariba, at a cost of roughly $7.7bn. This is a clear indicator that while SAP may be late to the party, it is serious about catching up…